Scammers are using ChatGPT to infiltrate online games for a quick cash grab
Most of the furious speculation about how artificial intelligence will shape our lives centers on far-off possibilities - humanity-destroying pitfalls or world-altering solutions that will take years to manifest. But there is one place where AI is already starting to shape people's experiences in noticeable ways: online games.
In massively multiplayer online role-playing games, millions of people meet up to explore, compete, and collaborate in expansive virtual universes. MMO hype has cooled since the aughts, but the games still cater to sizable, committed audiences that have helped launch powerful franchises such as "World of Warcraft," "Guild Wars 2," and "Path of Exile." These communities have even developed in-game cultures and economies complex enough to spawn a body of academic research.
But these worlds are hounded by a growing problem: bots. Since many MMOs use simple point-and-click interfaces and reward players for completing repetitive tasks, nefarious actors can easily use automated scripts to reap in-game rewards at an inhuman pace. And those rewards can then be sold for real cash on secondary markets.
While bots have long been a scourge to gaming communities, their inability to hold a conversation with players had made them easy to spot. But a new ChatGPT plug-in that hit the market in March has made it possible for bots to mimic human players far more convincingly, leaving real players unable to reliably identify and report bad actors and developers scrambling for a solution. The AI-driven technology not only threatens to transform these decades-old games but also offers a glimpse into a strange new reality in which it is virtually impossible to distinguish humans from robots.
No game is more plagued by bots today than "Old School RuneScape," a nostalgia-rich 2013 relaunch of the genre-defining adventure game by the British developer Jagex. In "OSRS," players inhabit the fantasy world of Gielinor, where they can slay monsters, build homes, mine, farm, and complete quests. In the decade since its launch, "OSRS" has grown to nearly 2 million active players, and the MMO Populations project ranks it as one of the busiest MMOs.
"OSRS" is appealing for bot operators, or "botters," because the game is fairly simple - it rewards repetitive tasks with digital gold, and the rudimentary enemies make it easy to create a program to slay predictable monsters with valuable loot. Add in a large black market in which players resell virtual currency or powerful items for real cash, and there's a significant incentive to deploy bots. While it's hard to nail down the exact number of nonhuman accounts, according to players I've spoken with and estimates based on the number of accounts Jagex bans each week, roughly one-quarter to one-third of the players logged on at any given moment are bots. While Jagex officially prohibits both bot farms and real-world trading of in-game goods, both have persisted despite periodic clampdowns. SirPugger, a YouTuber who reports on bots, has documented bot farms that rake in hundreds of thousands of dollars a year. Based on the scams he's weeded out, he said there could be millions of dollars traded across the various websites that comprise the "RuneScape" black market each year. Since his videos cause problems for people who make money in a legal gray area and violate the game's terms of service, SirPugger declined to share his real name.
Botters generally fall into two categories, SirPugger told me. One is the average-Joe player who uses automated bot services to cheat their way to a higher skill level and skip less-interesting parts of the game. The other is organized operations that run "bot farms," or large groups of bots, to collect in-game currency en masse to sell on the secondary markets. According to SirPugger's research, people can make up to six figures a year on the game's black market.
In "OSRS," botting hotspots have overrun areas of the game that are home to profitable loot-earning activities, making them inaccessible to legitimate players. Bots also make it virtually impossible for human players to place on leaderboards that track in-game accomplishments.
These headaches have forced Jagex's anti-cheating team into an arms race with bot developers: the company rolls out increasingly sophisticated mechanisms for identifying and banning bot accounts, which are inevitably met with countermoves from botters. Given this technological dance, the company's most reliable bot-detection tool has been the community itself: Players can report accounts they suspect of being run by third-party scripts. But ChatGPT integration threatens to make this grassroots solution a thing of the past.
Almost immediately after OpenAI released an update in early March, botters started hooking their scripts up to ChatGPT. Less than two weeks later, a developer at OSBot - which bills itself as "the most popular" bot provider for "Old School RuneScape" - posted on an internal forum advertising a new capability: scripts that hold coherent conversations.
At first, these "conversational bots" were clunky, SirPugger said - if you asked a bot the same question twice, it wouldn't notice. But the bots using ChatGPT began to rapidly improve, and SirPugger realized the automated players would soon become almost indistinguishable from human players.
"Now it's able to read previous conversations, so it has context for whatever you're talking about," SirPugger told me recently about the talking bots. "Now it would be like, 'Why are you asking me that again?' Or like, 'I already told you, but here's some more supplemental information.'"
Using third-party apps such as Bot Detector, which operate similarly to AI text-generation checkers, some players have been able to document interactions with likely GPT-integrated bot scripts. In some cases, these AI-powered bots seemed to commiserate with real players, even complaining about the overabundance of bots. In one interaction SirPugger saw, a bot mimed a grouchy longtimer, ribbing and insulting nearby players. "We're quickly approaching a state where you won't be able to tell if you're talking with a bot," he told me.
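The comparison to text-generation checkers suggests the general shape of such tools: score an account on statistical regularities that humans rarely produce. Purely as a hypothetical illustration - this is not Bot Detector's actual method - a naive text-based check might flag accounts whose chat is unusually repetitive and uniform:

```python
# Hypothetical illustration only - not Bot Detector's actual method.
# Scores an account's chat for bot-like regularity: heavy reuse of
# identical messages and unusually uniform message lengths.
from collections import Counter
from statistics import mean, pstdev

def bot_likeness(messages: list[str]) -> float:
    """Score from 0 to 1; higher means more bot-like chat patterns."""
    if len(messages) < 5:
        return 0.0  # too little chat to judge
    counts = Counter(m.lower().strip() for m in messages)
    repeat_ratio = 1 - len(counts) / len(messages)  # 0.0 = all unique
    lengths = [len(m) for m in messages]
    uniformity = 1 - min(pstdev(lengths) / (mean(lengths) + 1e-9), 1.0)
    # Weights are arbitrary; a real tool would tune them on labeled data.
    return 0.7 * repeat_ratio + 0.3 * uniformity

print(bot_likeness(["selling lobsters 150gp each"] * 20))  # ~0.97, bot-like
```

The trouble, as SirPugger's examples show, is that GPT-integrated bots produce varied, human-sounding chat that sails right past these kinds of signals.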
In his 2010 novella, "The Lifecycle of Software Objects," the science-fiction writer Ted Chiang imagined a near future of digital worlds - something like a functional version of Mark Zuckerberg's disastrously underwhelming metaverse - populated by human-piloted avatars and AI beings. The story asks: If interacting with a digital "life-form" could be as complex as the real thing, does it matter whether you're interacting with a real person?
Today's chatbots are nowhere near as lifelike as the AI of Chiang's imagination, but MMO players are already beginning to face the question he posed. If bots were truly indistinguishable from human players - to the extent that they could maintain long-term friendships and partake in social activities in MMOs - would it be a bad thing? Or would they actually make the world feel more enriching?
On one hand, making bots impossible to weed out would permanently warp the economics of these games. Even without the ability to communicate and socialize, bots have come to dominate substantial portions of the largest MMO in-game economies. On the other hand, bot scripts could eventually enhance rather than detract from the social aspects of online games. Online shooters like "PUBG" and "Fortnite" already populate their multiplayer lobbies with AI bots, but the social emphasis of MMOs means they are uniquely well positioned to reap the prospective benefits of chatbot integration. Worlds would appear more populous, and though "real" players might still be cheated out of leaderboard slots, they wouldn't necessarily know it. As SirPugger put it, "I like multiplayer games because of the player interaction. So all of a sudden, you have more 'players' in the game because you can't tell that they're bots. Bots are suddenly kind of this positive externality."
Some hobbyists and industry players have already picked up on the prospective upsides of AI-scripted bots in games - not rogue bots run by nefarious players trying to make money but intentionally placed nonplayer characters, or NPCs, beefed up with an AI chat function. A fan-made ChatGPT-powered modification for the popular "Elder Scrolls V: Skyrim," for example, allows in-game characters to form memories of past interactions with the player. On the industry side, Replica Studios recently unveiled AI-powered NPCs that can hold complex conversations with players. And gamers are open to the idea of improving the feel of games with AI: In a report published by Inworld AI, a developer platform trying to bring AI to gaming, 99% of the gamers surveyed said they believed that incorporating AI NPCs would positively affect gameplay, and 78% said they would spend more time playing games that had them.
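Neither the Skyrim mod's nor Replica's internals are detailed here, but the general pattern behind NPCs that "remember" is easy to sketch: persist each exchange and feed a recent slice of it back into the next prompt. The following toy example, with entirely hypothetical names, shows the idea:

```python
# Toy sketch of the "NPC with memory" pattern. All names here are
# hypothetical; this is not how the Skyrim mod or Replica's NPCs work.
import json
import pathlib

MEMORY_FILE = pathlib.Path("npc_memory.json")

def load_memory() -> list[dict]:
    """Return all remembered exchanges, oldest first."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return []

def remember(player_line: str, npc_line: str) -> None:
    """Append one exchange to the NPC's persistent memory."""
    memory = load_memory()
    memory.append({"player": player_line, "npc": npc_line})
    MEMORY_FILE.write_text(json.dumps(memory, indent=2))

def build_prompt(player_line: str) -> str:
    """Fold the last 20 remembered exchanges into the next prompt."""
    recalled = "\n".join(
        f"Player: {m['player']}\nNPC: {m['npc']}"
        for m in load_memory()[-20:]
    )
    return (
        "You are a village blacksmith. Past conversations:\n"
        f"{recalled}\n"
        f"Player: {player_line}\nNPC:"
    )
```

Because the memory survives on disk between sessions, a character built this way can greet a returning player by referencing last week's conversation - plausibly the same basic trick, scaled up, behind the mod's persistent characters.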
"Humans are storytellers," Kylan Gibbs, a cofounder and the chief product officer of Inworld AI, said in an interview with AI Magazine. "We're drawn to narratives that help us to make sense of the world around us. The same thing is true in the gaming world - the more immersive and believable a story is, the more we want to stay inside it."
Whatever one makes of Gibbs' "Matrix"-esque idea, significant hurdles to widespread integration remain. AI language models are still prone to dispensing false and inconsistent information, which would make it tough for major video-game developers to add them to games. And while a player might feel more engaged talking to an AI player than clicking through scripted dialogue, a free-form interaction would be far less likely to give them relevant information to advance in the storyline.
AI's potential remains murky enough that people can see almost anything they want in it. But the rapid, unsanctioned integration of ChatGPT and other AI technologies into games shows some of the pitfalls - and potential - of creating virtual spaces that humans cohabit on apparently equal footing with highly responsive robots. Moving forward, online games will likely continue to serve as test cases: sites of experimentation and indicators of how to deploy the technology in helpful or harmful ways.
Evan Malmgren is a writer who covers power and infrastructure and is currently working on a book about American off-gridders.