They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

  • Eiri@lemmy.ca
    2 months ago

    I wish I had telemetry on such features.

    I really doubt a significant number of people use AI chatbots often enough that having it in a dedicated sidebar is worth it.

    • HouseWolf@lemm.ee
      2 months ago

      I switched a while back, before all the AI and “privacy preserving” telemetry stuff.

      Every update note I see for Firefox now just reinforces my decision.

  • ocassionallyaduck@lemmy.world
    2 months ago

    Thing is, for your average user with no GPU who never thinks about RAM, running a local LLM is intimidating. But it shouldn’t be. Any system with an integrated GPU, and the more RAM the better, can run simple models locally.

    The not-so-dirty secret is that ChatGPT 3 vs 4 isn’t that big a difference, and neither is leaps and bounds ahead of the publicly available models for about 99% of tasks. For that 1% people will ooh and aah over it, but 99% of use cases are only seeing marginal gains on 4o.

    And the simplified models that run “only” 95% as well? They can use 90% fewer resources and give pretty much identical answers outside of hyperspecific use cases.

    Running a “smol” model, as some are called, gets you all the bang for none of the buck, and your data stays on your system and never leaves.

    I’ve been yelling from the rooftops to some stupid corporate types that once the model is trained, it’s trained. Unless you are training models yourself, there is no need for massive AI clusters, just for the model. Run it locally on your hardware at a fraction of the cost.
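The “run it local” advice above can be tried in a couple of commands. A minimal sketch assuming Ollama as the runner (one of several local options; the model name is just an example of a small model):

```shell
# Install a local LLM runner (Ollama shown here; llama.cpp is another option)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a small ("smol") model. This runs on CPU or an
# integrated GPU, and the prompt never leaves your machine.
ollama run llama3.2:3b "Explain what a local LLM is in one sentence."
```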

    • LWD@lemm.ee
      2 months ago

      There’s the tragedy with this new feature: they fast-tracked this past more popular requests, sticking it into Release Firefox.

      But they only rushed the part that connects to third parties. There was also a “localhost” option which was originally alongside the Big Five corporate offerings, but Mozilla ultimately decided to bury that one inside of the about:config settings.

  • nu11@sh.itjust.works
    2 months ago

    I don’t understand the hate. It’s just a sidebar for the supported LLMs. Maybe I’m misunderstanding?

    Yes, I would prefer Mozilla focus on the browser, but to me, this seems like it was done in an afternoon.

    • Scrollone@feddit.it
      2 months ago

      I want my browser to be a browser. I don’t want Pocket, I don’t want AI, I don’t want bullshit. There are plugins for that.

          • ToxicWaste@lemm.ee
            2 months ago

            i know it is an unpopular opinion around here. but currently AI features open doors for sales. that is important.

            for the software i help develop, we introduced an optional AI integration. just its presence allowed us to sell the main SW multiple times. the AI plugin itself has never been sold so far.

            investment in AI: 2 weeks of glue code. i am not concerned with the finances, but that plugin is for sure net positive.

            • LWD@lemm.ee
              2 months ago

              Do users like the AI integration, or is this just something the management class wanted to see? Right now, those clothes look crazy good on that emperor…

  • ohwhatfollyisman@lemmy.world
    2 months ago

    as someone who’s never dabbled with ai bots, what does this feature do? is it only to query for information like a web search?

    • LWD@lemm.ee
      2 months ago

      It is a sidebar that sends a query from your browser directly to a server run by a giant corporation like Google or OpenAI, consumes an excessive amount of carbon/water, then sends a response back to you that may or may not be true (because AI is incapable of doing anything but generating what it thinks you want to see).

      Not only is it unethical in my opinion, it’s also ridiculously rudimentary…

      • TheMachineStops@discuss.tchncs.de
        2 months ago

        It gives you many options for what to use; you can use Llama, which runs offline. It needs to be enabled through about:config > browser.ml.chat.hideLocalhost.
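For reference, the steps the comment describes look roughly like this (the endpoint and model name below are assumptions based on Ollama's defaults; any locally running model server should work):

```shell
# In Firefox, open about:config, accept the warning, and set:
#   browser.ml.chat.hideLocalhost = false
# Then point the AI sidebar at a locally running server.
# Quick smoke test that a local model server (here: Ollama's default port)
# is answering:
curl http://localhost:11434/api/generate -d '{"model": "llama3.2", "prompt": "hi"}'
```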

        • LWD@lemm.ee
          2 months ago

          There’s a huge difference between something that is presented in an easily accessible settings menu, and something that requires you to go to an esoteric page, click through a scary warning message, and then search for esoteric settings… Before even installing a server.

          Nothing was compelling Mozilla to rush this through. In addition, nobody was asking Mozilla for remote access to AI, AFAIK. Before Mozilla pushed for it, people were praising them for resisting the temptation to follow the flock. They could have waited and provided better defaults.

          Or just wedged it into an extension, something they’re currently doing anyway.