doodlemancy
@doodlemancy

i guess Mozilla wants to know what you think of AI bullshit eventually being integrated with firefox! you can tell them there. in that thread. be polite! but firm! very fucking firm!!! 🔨



in reply to @doodlemancy's post:

We included chatbot providers like HuggingChat which says: "We endorse Privacy by Design. As such, your conversations are private to you and will not be shared with anyone, including model authors, for any purpose, including for research or model training purposes."

such a funny answer. i don't even think the mozilla employee who said this believes his own words

the one mozilla guy diligently fielding every flaming arrow tossed into that thread with a "gosh, sorry you think it's unethical, could you recommend any ethical corporate LLM services we could include alongside the unethical ones we added and shipped without asking anyone?" sort of vague toxic positivity makes me want to rip my own face off.

I've already said my piece on this in the replies, but might as well say it here too, with more... colorful language:
if Mozilla wants to save face and not shoot themselves in the foot with a 12 gauge, they need to remove this shit pronto. Maybe use some of the salvaged pieces to let users embed websites of their choice in a sidebar, but that's about all the use I can see for what's left once you remove the plagiarism engine garbage.

i work in the tech industry and these people are, unfortunately, everywhere. i have colleagues who genuinely ask chatgpt important questions, despite having a deep understanding of how LLMs work. it boggles the mind.

almost every non-tech person i talk to, on the other hand, is like "what is this AI garbage and how can i get rid of it" or simply has no idea what the hell i'm talking about. as soon as they know it can confidently lie to them then they throw it right out the window, as they should.

my favorite example is my kinda-tech-illiterate father-in-law saying "so it's just a worse version of using search that doesn't allow you to actually read its sources?" and i was so proud lol

my theory about this is... when you have a basic understanding of how the thing works you can also convince yourself, like, well i know how this works so i'll definitely be able to tell when it is and isn't telling me the truth. kinda like thinking you're immune to advertising or propaganda because you understand how it works. one of those tricky little mental traps where being kinda smart can make you kinda stupid lol

that combined with the novelty just seems to do something to people. i think in a few years a lot of tech guys are gonna be really embarrassed to talk about this