Oh Kotaku.
AI has the potential to flesh out immersive worlds in video games in ways that are completely impossible for a team to accomplish today.
If it’s used to augment scripted characters and stories, it can only turn the soulless NPCs we’re used to into much more interesting characters.
I welcome, and in fact, long for that treatment in games like the Elder Scrolls.
There’s absolutely no need for AI to replace Link from The Legend of Zelda, but hells yes it should be used to stop guards from talking about my stolen sweetroll.
This article and headline are just propaganda.
The point is that right now language models are only good at generating coherent text. They aren’t at the level where they can control an NPC’s behaviour in a game world. NPCs need to actually interact with the world around them in order to be interesting. The words that come out of their mouths are only part of the equation.
Yes, language models are good for text. That’s their sole purpose. They can’t control characters. There are other models for that, and they are obviously not language models.
>3d navigation models
>look inside
>language models
Well, they actually can, at least to an extent. All you need to do is encode the worldstate in a way the LLM can understand, then decode the LLM’s response to that worldstate (most examples I’ve seen use JSON to good effect).
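To make the encode/decode idea concrete, here’s a minimal sketch of the JSON round-trip described above. All of the state fields, action names, and helper functions are invented for illustration, and the model call is stubbed out rather than hitting any real API:

```python
import json

# Hypothetical worldstate for a tavern scene; every field name here is made up.
worldstate = {
    "npc": {"name": "bartender", "location": "tavern", "inventory": ["ale", "mead"]},
    "player": {"location": "tavern", "last_utterance": "Break out the good stuff."},
}

def build_prompt(state: dict) -> str:
    """Encode the worldstate as JSON and ask the model to answer in JSON too."""
    return (
        "You are an NPC in a game. The current worldstate is:\n"
        + json.dumps(state, indent=2)
        + "\nRespond ONLY with JSON of the form "
        '{"say": "<dialogue>", "action": "<verb>", "target": "<item>"}.'
    )

def decode_response(raw: str) -> dict:
    """Decode the model's JSON reply into something the game loop can act on."""
    reply = json.loads(raw)
    # Reject anything outside the actions the engine actually supports.
    if reply.get("action") not in {"give_item", "talk", "idle"}:
        raise ValueError(f"unsupported action: {reply.get('action')!r}")
    return reply

# A stubbed model reply, standing in for a real LLM call:
raw = '{"say": "Coming right up!", "action": "give_item", "target": "mead"}'
print(decode_response(raw))
```

The key point is the validation step: the LLM only ever proposes actions, and the game engine stays the authority on what’s actually allowed.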
That doesn’t seem to be the focus of most of these developers though, unfortunately.
That assumes the model is trained on a large training set of the worldstate encoding and understands what that worldstate means in the context of its actions and responses. That’s basically impossible with the state of language models we have now.
I disagree. Take this paper for example - keeping in mind it’s a year old already (using GPT-3.5-turbo).
The basic idea is pretty solid, honestly. Representing worldstate for an LLM is essentially the same as how you would represent it for something like a GOAP system anyway, so it’s not a new idea by any stretch.
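For what it’s worth, the GOAP comparison looks roughly like this: worldstate as flat predicate/value pairs, and actions defined by preconditions and effects over that same representation. All the predicate and action names below are invented for illustration:

```python
# Hypothetical GOAP-style worldstate: flat predicate -> value pairs.
worldstate = {
    "npc_at_tavern": True,
    "player_has_gold": True,
    "bartender_has_mead": True,
}

# A GOAP action: preconditions to check, effects to apply.
serve_drink = {
    "name": "serve_drink",
    "preconditions": {"player_has_gold": True, "bartender_has_mead": True},
    "effects": {"player_has_mead": True, "bartender_has_mead": False},
}

def applicable(action: dict, state: dict) -> bool:
    """True if every precondition holds in the current state."""
    return all(state.get(k) == v for k, v in action["preconditions"].items())

def apply_action(action: dict, state: dict) -> dict:
    """Return a new state with the action's effects applied (state is not mutated)."""
    new_state = dict(state)
    new_state.update(action["effects"])
    return new_state
```

The same dict that a GOAP planner searches over is trivially serializable to JSON, which is why feeding it to an LLM isn’t a representational leap.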
Right, there’s no possible way actions can be represented by a stream of symbols.
They’re a massive and combinatorially exploding part of the equation, though.
Imagine a world where instead of using AI to undermine writers and artists, we use it to explode their output. A writer could write the details that make a character unique, plus the key and side-quest dialogue they write now, and that could be used to customize a model for that character.
The player could then have realistic conversations with those characters, which would make everything better. You could ask an NPC for directions to something and follow up with more questions they should know the answer to, and so on.
Now inconsequential filler characters, like a ramen shop owner in the example, become potentially memorable and genuinely useful in a way that could never possibly be hand-crafted.
This article is shitting on an impressive early attempt at exactly that, taking the fact that it’s not done yet, crossing it with the author’s biased opinions, and producing Kotaku-style clickbait from it.
You do know quality beats quantity, right? Nobody likes Bethesda’s Radiant fetch quests, and this is that but with exposition-dumping NPCs.
Not even just exposition. An NPC could easily go off script and start talking about stuff that breaks immersion. Like imagine you’re sitting in a tavern in Skyrim and then some NPC comes up and is like “hey, you see any good movies lately?”
Did you watch the demo? The player literally told the bartender to break out the good stuff and he did just that…
You mention the trick yourself.
AI can augment a real actor and script. Not replace.
Skyrim has mods that add AI voices to unvoiced mod NPCs or lines. Works great. But it’s only augmenting what is already there.
AI has the potential to enshittify what would be immersive worlds in video games. Nobody wants their crafted NPC dialogue turned into ChatGPT garbage. Your comment is just propaganda.
I would love to have a game - or a genre of games - using AI NPCs.
What I absolutely don’t want is every game using AI NPCs.
Did you have an uncustomized ChatGPT session write this for you or are you actually this dense?
Guess I’m just this dense, thanks.