"It was almost like a drug, to be honest," Jitarth Jadeja, an ex-QAnon follower, told Newsy.
Starting with fake COVID causes, cures, and the mainstreaming of the conspiracy theory QAnon online, 2020 was a year people fell down rabbit holes and entered the realm of the surreal.
"The pandemic has offered QAnon a way to kind of worm itself into the lives of people who are home all the time now, don't have much to do, maybe aren't working or aren't working as much, are cut off from their family and their community structures," says Mike Rothschild, a conspiracy theory researcher.
The year of conspiracies ended with one that cut to the very heart of democracy — false claims of election fraud.
"Day after day, it appears to these people who exist in this disinfo-verse that something new and outrageous and shocking has happened. And, you know, in their belief, 'Why can't people see what's going on?'" says Mike Caulfield, a digital literacy expert at Washington State University Vancouver.
And it only grew more intense in January after rioters stormed the nation’s Capitol building.
Conspiracy theories and disinformation may have found their way into your own social circle. But efforts to pull people out of rabbit holes, and prevent them from crawling in in the first place, are also developing in many forms, including online games.
"We take real social media profiles, Twitter, Facebook and Instagram," says Darren Linvill, a disinformation expert at Clemson University. "Some of them are real people. Some of them are created troll accounts. And we have the user guess, 'Is it a troll or is it a real person?'"
News literacy courses are educating younger generations, teaching them about biases, propaganda, and how algorithms control what they see.
"We're growing up immersed in technology and all these articles and things. So we really need to learn how to navigate that before we get older so we're not surrounded by false news," a student named Sara Tewelde told Newsy.
Experts are breaking down some disinformation tactics.
"There's a phenomenon called 'just asking questions' where you are not yourself saying this is true, this happened," says Renee DiResta, research manager at the Stanford Internet Observatory. "You're instead insinuating, or making a dark allegation or a hint that perhaps this other thing could be true. The vague insinuations provide an opportunity for people who are inclined to try to connect the dots."
Analysts say bad actors often target people’s emotions so they post before thinking.
"The ones who are seeding the narratives know that if they're including something like a warning of violence or a safety-related issue, we are most likely to, as the recipients, be seized by that information, become afraid immediately and then want to warn our entire networks," says Cindy Otis, a former CIA analyst and vice president of analysis at Alethea Group.
That’s one reason why social network analysis firms have gotten better at exposing foreign influence operations before they gain traction.
"The way this operation worked was that they were running accounts on LinkedIn and they were running accounts on different social platforms," says Ben Nimmo, director of investigations at Graphika. "And using those accounts, they would approach bloggers and they would approach particularly novice journalists and say, hey, we're a new outlet, do you want to write for us?"
Still, national security officials say bad actors thrive on Americans with uninformed and polarizing ideologies.
"We've seen our adversaries, instead of using the trolls and the bots, taking organic American ideas and amplifying them, which has really been problematic," former Director of the National Counterintelligence and Security Center William Evanina told Newsy. "And I think it's the next step up in capability."
While the pandemic has created more opportunities for conspiracies to thrive, it has also compelled social media companies to take more responsibility for what people post.
"What's happening in an interesting way is the platforms have recognized this as an opportunity to take action against some of these communities," says Kate Starbird, a disinformation expert at the University of Washington. "They can now say, 'It's not just political, it's not just about votes. This is about lives.'"
Experts want people to think more about their roles in society — warning, after the deadly Capitol riot, that the more hate-filled our rhetoric, the deeper, darker, and more dangerous our rabbit holes become.