The conspiracy theory has the best attributes of a multiplatform game, except that it can cause harm in the real world.
WHEN QANON EMERGED in 2017, the game designer Adrian Hon felt a shock of recognition.
QAnon, as you very likely know, is the right-wing conspiracy theory that revolves around a figure named Q. This supposedly high-ranking insider claims that the deep state—an alleged cabal led by Barack Obama, Hillary Clinton, and George Soros and abetted by decadent celebrities—is running a global child-sex-trafficking ring and plotting a left-wing coup. Only Donald Trump heroically stands in the way.
It's nonsense, of course. But what intrigued Hon was the style of nonsense.
It is addictively participatory. Whenever Q posts about the conspiracy, he (or she or they) leaves clues—“Q drops”—on image boards like 8kun that are cryptic and open-ended. One in 2019, for example, read: “[C] BEFORE [D]. [C]oats BEFORE [D]. The month of AUGUST is traditionally very HOT. You have more than you know.” Since the clues are oblique, it's up to the followers of QAnon to interpret them. They instantly begin Googling the phrases, then energetically share their own exegeses online about What It All Means. (August is when Trump will finally imprison Clinton!) To belong to the QAnon pack is to be part of a massive crowdsourcing project that sees itself cracking a mystery.
Which is what gave Hon the shock of recognition: QAnon was behaving precisely like an alternate-reality game, or ARG.
ARGs are designed to be clue-cracking, multiplatform scavenger hunts. They're often used as a promotion, like for a movie. A studio plants a cryptic clue in the world around us. If you notice it and Google it, it leads to hundreds more clues that the gamemaker has craftily embedded in various websites, online videos, maps, and even voice message boxes. The first big ARG—called The Beast—was created in 2001 to promote the Steven Spielberg movie A.I. Artificial Intelligence and began with a reference to a “sentient machine therapist” in the credits listed on the movie poster.
Hon was a student when The Beast was released, and he became obsessed. He even moderated a discussion forum where players shared clues. They solved the puzzle in about five months, and Hon was so inspired that he founded his own firm to make ARGs, launching Perplex City (his best-known game) in 2005. He's run several others since.
This is why he's convinced that game dynamics help explain why QAnon is such a seductive conspiracy. It plugs into the psychological lures that make ARGs so fun.
First off, QAnon poses a mystery that feels so big it can only be solved by crowdsourcing. It's thrilling to be involved with other people in something bigger than yourself. Plus, it turns one's armchair-warrior Googling into a heroic quest for truth.
“They're all saying, ‘I've done my research,’” Hon told me of Q followers. “They're looking for signals in the noise.”
There's also the thrill of creativity, of adding to a canon. QAnon followers “don't just passively receive Q drops. They create new videos and texts,” notes Marc-André Argentino, a public scholar at Concordia University who researches QAnon. Q's followers behave like religious devotees who pore over their faith's central texts, crafting interpretations that become part of the official creed.
And, like an ARG, QAnon brings social rewards. If you're the first to post a new discovery, “other people can see it, and they instantly recognize it,” notes Dan Hon, Adrian's brother, who helped create the Perplex City ARG.
In a way, ARGs and QAnon are the quintessence of internet culture. The web has always been about making willy-nilly connections: This links to that which links to this. And cyberspace facilitates the obsessive joint scrutiny of everything, from TV shows to knitting patterns to the belief that reptilians walk among us.
Once you chew on it that way, you start thinking, jeez, maybe QAnon was almost inevitable. As the scholar M. R. Sauter has pointed out, the internet is exquisitely suited to the conspiratorial style. “It's the joy of creating connections,” Sauter says, noting that previous conspiracy theories have displayed ARG-like qualities too. One was Climategate, where global warming skeptics seized on the leaked emails of atmospheric scientists and produced reams of feverish, unglued analyses.
ARG makers have long worried about this culture and its relentless, wild-eyed nature. If players solve one puzzle, they crave the fun of tackling more, more, more. But they can wind up seeing puzzles that aren't puzzles. After the online group solved The Beast, one member suggested “solving 9/11.” Hon and the other mods quashed that idea. But it showed how easily ARG culture can be hijacked toward delusional ends.
And with QAnon, the appeal has pushed the conspiracy dangerously from the fringes into the mainstream. An internal Facebook review reportedly found millions of people on various QAnon sites, and QAnon believers recently won congressional primary races in Georgia and Florida. All of which suggests that QAnon is, alas, unlikely to fade away soon. Quite apart from its ideological roots, it's fueled by one of the oldest internet urges: It's fun.