124 points · submitted 1 month ago by KarnaSubarna@lemmy.ml to c/firefox@lemmy.ml
[-] adarza@lemmy.ca 140 points 1 month ago
  • no account or login required.
  • it's an addon (and one you have to go get), not baked-in.
  • limited to queries about content you're currently looking at.
    (it's not a general 'search' or queries engine)
  • llm is hosted by mozilla, not a third party.
  • session histories are not retained or shared, not even with mistral (it's their model).
  • user interactions are not used to train.
[-] jeena@piefed.jeena.net 27 points 1 month ago

Thanks for the summary. So it still sends the data to a server, even if it's Mozilla's. Then I still can't use it for work, because the data is private and they wouldn't appreciate me sending their data to Mozilla.

[-] KarnaSubarna@lemmy.ml 21 points 1 month ago

In such a scenario you need to host your choice of LLM locally.

[-] ReversalHatchery@beehaw.org 5 points 1 month ago

does the addon support usage like that?

[-] KarnaSubarna@lemmy.ml 7 points 1 month ago

No, but the “AI” option available on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM.

I have had this setup running for a while now.

[-] cmgvd3lw@discuss.tchncs.de 4 points 1 month ago

Which model are you running? How much RAM?

[-] KarnaSubarna@lemmy.ml 4 points 1 month ago* (last edited 1 month ago)

My (docker based) configuration:

Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

Hardware: i5-13600K, Nvidia 3070 ti (8GB), 32 GB RAM

Docker: https://docs.docker.com/engine/install/

Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

Open WebUI: https://docs.openwebui.com/

Ollama: https://hub.docker.com/r/ollama/ollama
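
For anyone who wants to talk to a stack like this from a script rather than through Open WebUI, here is a minimal sketch (not from the comment above) that posts a prompt to Ollama's default REST endpoint on port 11434. It assumes the Llama 3.1 model has already been pulled in the Ollama container; the prompt text is just a placeholder.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default REST endpoint

payload = {
    "model": "llama3.1",  # assumes `ollama pull llama3.1` has been run in the container
    "prompt": "Summarise this page in two sentences.",  # placeholder prompt
    "stream": False,      # ask for one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())
    print(result["response"])  # the generated text
```

If the container maps Ollama to a different host port, adjust OLLAMA_URL accordingly.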

[-] LWD@lemm.ee 12 points 1 month ago

Technically it's a server operated by Google, leased by Mozilla. Mistral 7b could technically work locally, if Mozilla cared about doing such a thing.

I guess you can basically use the built-in AI chatbot functionality Mozilla rushed out the door, enable a secret setting, and use Mistral locally, but what a missed opportunity from the Privacy Browser Company

[-] Hamartiogonic@sopuli.xyz 1 points 1 month ago* (last edited 1 month ago)

According to Microsoft, you can safely send your work related stuff to Copilot. Besides, most companies already use a lot of their software and cloud services, so LLM queries don’t really add very much. If you happen to be working for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar etc.

If you’re working for Purism, RedHat or some other company like that, you might want to host your own LLM instead.

[-] fruitycoder@sh.itjust.works 10 points 1 month ago

That's really cool to see. A trusted, hosted open-source model has been missing from the ecosystem, in my opinion. I really like the idea of web-centric integration too.

[-] LucidBoi@lemmy.dbzer0.com 49 points 1 month ago

I want to point out that by downvoting this, you're reducing the visibility of the post for other people, therefore making fewer people aware of the change.

[-] lemmeBe@sh.itjust.works 6 points 1 month ago

My thoughts exactly.

However, I've always found upvotes and downvotes a bit confusing, because an upvote is almost synonymous with "like" and a downvote with "don't like". With an upvote, that assumption isn't that problematic, but with a downvote it is, like in this case where the post will have less chance of being seen.

[-] furzegulo@lemmy.dbzer0.com 48 points 1 month ago

no, i don't want to meet orbit, thank you very much.

[-] LWD@lemm.ee 33 points 1 month ago

I appreciate the option to not install it.

Now if only Mozilla could migrate their built-in AI stuff to this optional extension so it doesn't come pre-installed, that'd be great

[-] KarnaSubarna@lemmy.ml 11 points 1 month ago

The built-in AI stuff you referred to is nothing but an accelerator to integrate with 3rd-party or self-hosted LLMs. It's quite similar to choosing a search engine in settings. The feature itself is lightweight and can be disabled in settings if not required.

[-] LWD@lemm.ee 3 points 1 month ago

The built-in AI stuff... is... an accelerator to integrate with 3rd-party or self-hosted LLMs.

Users are only shown Big Tech "3rd-party" options. Mozilla made this choice intentionally.

Since Mozilla is clearly capable of developing an add-on that is not forcibly installed on users' devices, they should remove the built-in thing that endorses the highly unethical chatbots run by Google, OpenAI, etc.

[-] KarnaSubarna@lemmy.ml 2 points 1 month ago* (last edited 1 month ago)

Users are only shown Big Tech “3rd-party” options. Mozilla made this choice intentionally.

Well, how many users really have an LLM hosted locally?

[-] LWD@lemm.ee 2 points 1 month ago

So we agree Mozilla only chose to promote Big Tech options.

[-] swordgeek@lemmy.ca 30 points 1 month ago

No thanks, I'll pass.

[-] dumbass@leminal.space 27 points 1 month ago

Firefox, tell your creepy little friend he can get the fuck off my property!

[-] KarnaSubarna@lemmy.ml 26 points 1 month ago

It's an add-on, not something baked into the browser. It's not on your property in the first place, unless you choose to install it 🙂

[-] dumbass@leminal.space 23 points 1 month ago

For now, but one day Firefox will try to sneak him into my house and hide him in a cupboard until squatters' rights kick in.

[-] KarnaSubarna@lemmy.ml 9 points 1 month ago

Even if they choose to do so in the future, there is usually an about:config entry to disable it.

[-] possiblylinux127@lemmy.zip 2 points 1 month ago

I can't wait for Firefox to start putting product recommendations on all web pages.

[-] ramblingsteve@lemmy.world 19 points 1 month ago* (last edited 1 month ago)

I'm starting to warm up to this stuff. There is a future rapidly hurtling towards us where, if you take the time to read and think for yourself, you will become a genius. It was already happening in some STEM fields, where people used GUI tools without ever reading what the buttons did; if you took the time to read the manuals and the underlying methods, you could become vastly more competent than anybody else on your team. This "AI" bullshit just extends that lazy culture to every piece of information on the web, where the average Joe is already unable to concentrate beyond 140 characters. Those who take the time to learn the fundamentals and read deeply will have vastly superior knowledge of any subject, while the majority will be spoon-fed superficial summaries filled with errors, with no way of realising it.

[-] caseyweederman@lemmy.ca 3 points 1 month ago

That was written by an AI, wasn't it?
If anything brings me around on AI, it'll be the "kids these days and their dang quill and parchment, the chisel and stone tablet was good enough for me so it should be good enough for everyone" argument.

[-] ramblingsteve@lemmy.world 3 points 1 month ago

Thank you for illustrating my point!

[-] possiblylinux127@lemmy.zip 5 points 1 month ago

This is why I don't use Firefox directly. I use soft forks like LibreWolf, Mull, and now Fennec.

[-] KarnaSubarna@lemmy.ml 8 points 1 month ago

This is just an add-on BTW. It's completely up to you to decide if you need this.

[-] possiblylinux127@lemmy.zip 6 points 1 month ago
[-] Blisterexe@lemmy.zip 3 points 1 month ago

That's an awful argument. Are you worried that Mozilla is also gonna start censoring swear words in Firefox?
