Inworld wants to make this kind of interaction more refined. It offers a product for AAA game studios that lets developers create the brains of an AI NPC, which can then be imported into their game. Developers use the company’s “Inworld Studio” to build the NPC: they fill in a basic description outlining the character’s personality, including likes and dislikes, motivations, and helpful backstory. Sliders let them set the levels of traits such as introversion versus extroversion or insecurity versus confidence, and free text can make the character drunk, aggressive, prone to exaggeration, or just about anything else.
Developers can also add descriptions of how their character speaks, including examples of commonly used phrases, which Inworld’s various AI models, including large language models (LLMs), turn into character-appropriate dialogue.
Game designers can also plug other information into the system: what the character knows and doesn’t know about the world (no references to Taylor Swift in a medieval combat game, ideally) and any relevant guardrails (does your character curse or not?). Narrative controls let developers make sure the NPC sticks to the story and doesn’t wander wildly off base in conversation. The characters can then be imported into video game graphics engines such as Unity or Unreal Engine, which give them a body and features. Inworld is also partnering with the text-to-speech startup ElevenLabs to add natural-sounding voices.
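Inworld hasn’t published a public schema for these character definitions, but conceptually the inputs described above amount to a structured configuration. The sketch below is purely illustrative; every field name and function here is hypothetical, not Inworld’s actual Studio format or API.

```python
# Illustrative sketch of an NPC "brain" configuration. All names here are
# hypothetical stand-ins, not Inworld's actual schema or API.

def make_npc(description, traits, speech_examples, knowledge_scope, guardrails):
    """Bundle the character inputs a designer fills in into one record."""
    # Trait sliders are bounded values, e.g. 0.0 = introverted, 1.0 = extroverted.
    for name, value in traits.items():
        if not 0.0 <= value <= 1.0:
            raise ValueError(f"trait {name!r} must be in [0, 1]")
    return {
        "description": description,          # free-text personality and backstory
        "traits": traits,                    # slider settings
        "speech_examples": speech_examples,  # sample phrases that set dialogue style
        "knowledge_scope": knowledge_scope,  # what the character can know about
        "guardrails": guardrails,            # e.g. no cursing, stay on-story
    }

blacksmith = make_npc(
    description="Gruff medieval blacksmith, proud of his craft, wary of strangers.",
    traits={"extroversion": 0.3, "confidence": 0.9},
    speech_examples=["Steel doesn't lie, stranger."],
    knowledge_scope="medieval-fantasy",      # no Taylor Swift references, ideally
    guardrails={"profanity": False, "stay_on_story": True},
)
```

In a real pipeline, a record like this would condition the dialogue model rather than be consumed directly by game code.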
Inworld’s technology hasn’t appeared in any AAA games yet, but at the Game Developers Conference (GDC) in San Francisco in March 2024, the company unveiled an early demo with Nvidia that showcased some of what will be possible. In Covert Protocol, each player acts as a private detective who must solve a case using clues gleaned from the various in-game NPCs. Also at GDC, Inworld revealed NEO NPC, a demo it worked on with Ubisoft, in which players could interact freely with NPCs using voice-to-text software and, through conversation, develop a deeper relationship with them.
“LLMs give us an opportunity to make games more dynamic,” says Jeff Orkin, founder of Bitpart, a new startup that also aims to create entire casts of LLM-powered NPCs that can be brought into games. “Because there’s such a reliance on a lot of labor-intensive scenarios, it’s hard to get characters to handle a wide variety of ways a scenario can play out, especially as games become more open-ended,” he says.
Bitpart’s approach is partly inspired by Orkin’s doctoral research at MIT’s Media Lab, where he trained AIs to play roles in social situations using game logs of humans playing out those same situations with one another in multiplayer games.
Bitpart’s cast of characters is trained using a large language model and then tuned so that in-game interactions aren’t completely open-ended and infinite. Instead, the company uses an LLM and other tools to generate a scenario covering a range of possible interactions, from which a human game designer curates a selection. Orkin describes the process as writing the Lego bricks of interaction. An in-game algorithm searches for specific bricks and combines them at the right moments.
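Bitpart hasn’t published how its runtime matching works, but the brick metaphor suggests a shape something like the following sketch: human-curated interaction fragments tagged with the situations they fit, and a matcher that assembles whichever ones apply at a given moment. All names and data here are invented for illustration.

```python
# Hedged sketch of the "Lego bricks of interaction" idea; this is a guess at
# the general shape, not Bitpart's actual algorithm or data format.

BRICKS = [
    {"tags": {"restaurant", "order"}, "speaker": "waiter",
     "line": "What can I get you?"},
    {"tags": {"restaurant", "order", "overheard"}, "speaker": "bartender",
     "line": "Try the house red. It's better than he'll admit."},
    {"tags": {"tavern", "rumor"}, "speaker": "bartender",
     "line": "Folk say the old mine is haunted."},
]

def select_bricks(situation_tags):
    """Return every brick whose required tags are satisfied by the situation."""
    return [b for b in BRICKS if b["tags"] <= situation_tags]

# A player ordering food within earshot of the bartender pulls in both
# restaurant bricks, so a second character can join the exchange.
scene = select_bricks({"restaurant", "order", "overheard", "evening"})
for brick in scene:
    print(f'{brick["speaker"]}: {brick["line"]}')
```

The point of the curation step is visible here: the matcher only ever assembles pieces a human designer has already approved, which is what keeps the interactions from being “completely open-ended and infinite.”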
Bitpart’s approach could make for some enjoyable moments in games. In a restaurant, for example, you might ask a waiter for something, only for the bartender to overhear and chime in. Bitpart is currently working with Roblox, and Orkin says the company is running tests with AAA game studios, though he won’t yet say which ones.