• Midnitte@beehaw.org
    2 days ago

    There’s a great talk here where he discusses using local models in ways I could actually see being useful.

    Hopefully we get there and memory stops carrying this ridiculous 5000% markup.

    • Flax@feddit.uk
      2 days ago

      There are a bunch of useful applications. For example, I was toying with a Minecraft server where people start a country, and a local LLM can come up with a country code based on their country’s name and the ISO 3166 standard in a few minutes.
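
      The country-code idea above could be sketched roughly like this. The model call itself is stubbed out (the prompt text, the `EXISTING_CODES` sample, and the function names are all illustrative, not from the comment); in real use you would wire `build_prompt` and `validate_code` around whatever local runtime you run, and validation matters because the model's reply may collide with a real ISO 3166-1 code or not be two letters at all.

      ```python
      # Hypothetical sketch: prompt a local model for an ISO-3166-style
      # two-letter code, then validate its reply. The actual model call
      # is left out; only the prompt and validation logic are shown.
      import re

      # Illustrative subset; in practice, use the full ISO 3166-1 alpha-2 set
      # plus any codes already assigned to player countries on the server.
      EXISTING_CODES = {"US", "GB", "DE", "FR"}

      def build_prompt(country_name: str) -> str:
          """Prompt asking the model for a code in the style of ISO 3166-1."""
          return (
              f"Suggest a two-letter country code for the fictional country "
              f"'{country_name}', in the style of ISO 3166-1 alpha-2. "
              "Reply with the two letters only."
          )

      def validate_code(reply: str, taken: set[str]) -> str | None:
          """Accept the model's reply only if it is two letters and unused."""
          code = reply.strip().upper()
          if re.fullmatch(r"[A-Z]{2}", code) and code not in taken:
              return code
          return None
      ```

      A rejected reply (wrong shape, or already taken) would simply trigger a re-prompt, which is consistent with the "a few minutes" turnaround described above.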

      • Midnitte@beehaw.org
        2 days ago

        It’s actually a really good talk from someone with decades of UX experience. The focus was more on innovation in UX (with the example that Microsoft got AI in Windows very… very wrong).

    • CalcProgrammer1@lemmy.today
      2 days ago

      My only takeaway that could be seen as good news is that they at least expect consumers to have access to local computing power strong enough to run local AI, and that computing power is very likely in the form of GPUs that can also be used for PC gaming. Hopefully this means there’s still some focus on consumer GPUs somewhere out there rather than just selling them all to OpenAI.

  • artyom@piefed.social
    2 days ago

    Doesn’t really make much sense. I mean, yeah, privacy and all of that, but think of the environmental impact of 1000 inefficient PCs vs. 1 efficient PC shared by 1000 people. Maybe open-source models hosted by a community would be better.

    Or better yet just forget about it entirely.

  • Megaman_EXE@beehaw.org
    2 days ago

    The only way I would be comfortable with AI is if I could craft it myself, run it locally, prevent it from feeding me bullshit results, and keep it from phoning home, and if it were energy efficient, wasn’t built off of stolen data, and also didn’t send profit to big companies.

  • ɔiƚoxɘup@beehaw.org
    2 days ago

    Fantastic. They’ll make US pay for it. There’s no way they don’t turn this into something more evil.