Doing your own research is a good way to end up being wrong
Analysis by Philip Bump
National columnist
January 17, 2024 at 7:00 a.m. EST
The internet has been a huge boon for the accessibility of information. There are very few barriers to consuming classic literature, detailed scientific analyses or catalogues of news reports. There is also an exorbitant amount of garbage information, of course, and an entire universe of people who say things they think will get others to click links that earn them money.
While confidence in American institutions has been in decline for some time, it’s not hard to imagine how the economic incentives of the internet contribute. There is an outsize appetite for derogatory, counterintuitive or anti-institutional assessments of the world around us. This is in part because alleged scandals are interesting and in part because Americans like to view themselves as independent analysts of the world around them.
The result is that there is both a supply and a demand for nonsense or appealingly framed errors. Americans who have little trust in the system can easily find something to reinforce their skepticism. They often do.
This month, Nieman Lab’s Josh Benton reported on research released last year that showed how people “doing their own research” on the internet often led them to gain more confidence in untrue information. The paper, titled “Online searches to evaluate misinformation can increase its perceived veracity,” was written by researchers from the University of Central Florida, New York University and Stanford. Their conclusions were straightforward.
“Although conventional wisdom suggests that searching online when evaluating misinformation would reduce belief in it, there is little empirical evidence to evaluate this claim,” the authors wrote. Instead, they continued: “We present consistent evidence that online search to evaluate the truthfulness of false news articles actually increases the probability of believing them.”
Later, they summarize the process: “When individuals search online about misinformation, they are more likely to be exposed to lower-quality information than when individuals search about true news” and “those who are exposed to low-quality information are more likely to believe false/misleading news stories to be true relative to those who are not.” Look for info; see bad info; accept the bad info.
The mechanism is explored at length but, in short, false claims or other rumors often generate fewer hits on Google, meaning searchers are more likely to encounter unreliable information that aligns with their assumptions. (The paper is dense; Benton’s summary is useful.)
There’s probably another factor at play, one not measured in the research: people who believe false claims often do so because those claims comport with their broader ideology or philosophy. Like a parent confronted with allegations of misbehavior by their child, those individuals would presumably be more likely to embrace dubious information supporting their belief than information that corroborates the allegations. The study presented participants with news stories to evaluate without a predisposition. In the real world, that usually doesn’t happen.
One of the ways the modern media environment has made it easier for false information to spread is that false information often adopts the veneer of reliable information. Sites like the Gateway Pundit look like news sites in broad strokes. The Gateway Pundit, despite its history of propagating nonsense, is often treated as legitimate by people of prominence — so its claims are treated as credible.
The same thing happens on cable television. Channels like One America News, which I described shortly after the 2020 election as a “pro-Trump video channel offered with a cable-news-like aesthetic,” elevated numerous baseless claims before being booted from major cable-news systems. Its programming often appeared to be the equivalent of taking Alex Jones’s “Infowars” but setting it in a local television news studio. Doing your own research might lead you to reputable-looking sources that are anything but.
Journalist Michael Hobbes recently remarked upon another emergent pattern, that one can “just adopt the aesthetics of a fact-check or a secret document leak and people will act like you’ve blown the cover off a huge scandal without assessing the underlying claim for themselves.”
This has been central to the effort by House Republicans to allege that President Biden was engaged in impeachable activity. House Oversight Chairman James Comer (R-Ky.) has repeatedly offered allegations against Biden that mirror the presentation of lengthy, detailed investigations but then collapse under scrutiny. His supporters, mostly a subset of Donald Trump supporters, say Biden acted inappropriately and are happy to accept allegations as legitimate — while conservative media largely shields them from debunking, for whatever good it would do.
Why did so many Iowa caucus-goers indicate that they thought Biden’s win was illegitimate? Partly because of Trump’s advocacy for that idea, certainly, and in part because of the impermeability of the right-wing media universe. But many, too, cited doing their own research — looking for information on the subject that ended up reinforcing their beliefs.
The most extreme recent example of Americans’ willingness to embrace nonsensical claims is the QAnon movement. More muted since the Jan. 6, 2021, riot at the U.S. Capitol, QAnon adherents say a secret cabal controls the world, with media, entertainment and political elites conspiring — perhaps (some believe) to traffic children and consume a secret chemical that children produce. Many with whom I spoke in 2018 and 2019, though, simply viewed QAnon as an organizing concept for how Trump was battling nefarious, powerful forces on their behalf.
I remember clearly standing outside a Trump rally in New Hampshire in 2019 and speaking to a guy in a QAnon T-shirt. He was very genial and matter-of-fact in his claims. And he suggested that he knew why I didn’t agree with his assumptions.
What I needed, he said, was to do some of my own research.
Link: https://www.washingtonpost.com/politics/2024/01/17/do-your-own-research-study/
their own "Research" and funded by taxpayers...
Ok, stop...wipe the sweat off your brow...take a breath and vow to do all you can to make sure that never happens.
There is strength in diversity.
Public schools are as diverse as the Ivy League.
Fortunately, our kids had a fantastic P.S. system and we took advantage of it...teachers, admins and school board are outstanding...a good neighbor and friend of mine was SB President for several years. When leadership is excellent...and funds are available...truly excellent students come out of that system...we've witnessed it firsthand.
Also, I grew up in NYC, where the schools were also well managed...outstanding students got recognized and were given the resources needed to eventually get into the very best colleges in the nation. While I didn't go through that system, the NY Regents program/exams have been well constructed to ensure quality output...no way a "Mish-Mash" of poorly supervised charter schools with their own agendas is going to do better...for the students...and our nation.
Hack
He writes for the Washington Post and the Atlantic. He is a vehement anti-G.O.P. barker. And in the writing you've pasted, his entire point isn't research methodology, but a way to paint the right as ill-informed people who need to trust people like him. Nowhere does he offer any criticism of the bad data used in leftist "research," the absurd behavior of multiple networks that drive leftist narratives, or really anything other than a Qanon guy he talked to and Republican politicians.
Partial list of lies that have eroded the public's trust, off the top of my head:
1. The laptop was Russian Disinformation, they knew it wasn't.
2. The vaccine will prevent transmission, it was never tested for that.
3. Masks will stop the spread, we've known through RCTs over the years that it wouldn't.
4. Ivermectin is horse dewormer, see Brian Stelter.
5. Stand six feet apart, there was no science to back this recommendation up.
6. We can't define man/women, this one's obvious.
7. Russian collusion, pee pee tapes and other nonsense, you'll note he brings up the politicization of the GOP house committee, but ignores the Adam Schiff era.
The lies fuel distrust, which causes people to seek alternatives. Expecting that people are going to just start trusting those who have shown themselves to be untrustworthy is naive and arrogant, which isn't a great combination.
LLM AI, like ChatGPT, will make this problem worse because it is trained on information from the internet, which means it has a positive feedback loop toward information that is popular, with no ability to fact-check. And journalism is being replaced by ChatGPT, so we're headed toward Idiocracy.
away from everyone. You know, like the "experts" told you.