• PonyOfWar@pawb.social · 9 months ago

    Honestly one of the AI applications I see real potential in. They can train the NPCs with an extensive backstory and the interactions with them could be way more dynamic than what we currently get for NPCs. Something like a more advanced version of “Starship Titanic”, if anyone remembers that.

    • MotoAsh@lemmy.world · 9 months ago

      You are imagining a supercomputer’s LLM running an NPC.

      It literally cannot be that fancy. Maybe they can fake it and fool a few rubes, but no, there will be no deep characters run by this.

      • owen@lemmy.ca · 9 months ago

        I think you could make it work by giving each of them a limited word pool and pre-set phrases to fall back on for panic/confusion.
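A minimal sketch of that fallback idea, with all names and phrases hypothetical: the NPC's reply is checked against its allowed word pool, and anything out of vocabulary is swapped for a canned "confused" line.

```python
import random
import string

# Hypothetical canned lines an NPC uses when the model drifts off-script.
FALLBACK_PHRASES = [
    "I... I don't know what you mean.",
    "Sorry, traveler, my mind wanders.",
]

def constrain_reply(reply: str, word_pool: set[str]) -> str:
    """Return the reply only if every word is in the NPC's pool, else a fallback."""
    # Strip punctuation and lowercase before checking against the pool.
    words = reply.lower().translate(
        str.maketrans("", "", string.punctuation)
    ).split()
    if all(w in word_pool for w in words):
        return reply
    return random.choice(FALLBACK_PHRASES)
```

In a real game you would likely constrain the model's decoding directly rather than filter after the fact, but a post-hoc check like this is the cheapest version of the trick.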

      • PonyOfWar@pawb.social · 9 months ago

        The way it works right now is usually over the cloud. I’ve already tried out a bit of “Convai” as a developer, which is a platform where you can create LLM NPCs and put them in Unreal Engine. It’s pretty neat, not perfect, but you can definitely give characters thousands of lines of backstory if you want and they will act in character. They will also remember any conversations a player had with them previously and can refer to them in later convos. It can still be fairly obvious that you’re talking to an LLM though, if you know what to ask and what to look for. Due to its cloud-based nature, there is also some delay between the player input and the response. But it has a lot of potential for dialog systems where you can do way more than just choose between 4 predefined sentences. Especially once running these things locally is no longer a performance issue.
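The wiring described above (backstory plus conversation memory) usually boils down to a system prompt and replayed history. A rough sketch, assuming a generic chat-style API; `call_llm` is a placeholder for whatever cloud service (Convai or otherwise) generates the reply, not a real SDK call:

```python
class LLMNpc:
    """Hypothetical NPC wrapper: backstory as system prompt, memory as history."""

    def __init__(self, name: str, backstory: str):
        self.system_prompt = f"You are {name}. Stay in character.\n{backstory}"
        # Persisted between conversations so the NPC "remembers" the player.
        self.history: list[dict[str, str]] = []

    def talk(self, player_line: str, call_llm) -> str:
        self.history.append({"role": "user", "content": player_line})
        # Backstory is prepended on every turn; prior exchanges are replayed.
        messages = [{"role": "system", "content": self.system_prompt}] + self.history
        reply = call_llm(messages)  # network round-trip: the delay mentioned above
        self.history.append({"role": "assistant", "content": reply})
        return reply
```

Running this locally instead of over the cloud only changes what `call_llm` points at, which is why the latency complaint goes away once local inference is cheap enough.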

    • fruitycoder@sh.itjust.works · 9 months ago

      There are a couple of indies and mods working on that! The trick is definitely to lower the power needed, maybe through a series of fine-tuned models (which might also reduce the number of anachronisms).