
You may have heard the outlandish claim: Bill Gates is using the Covid-19 vaccine to implant microchips in everyone so he can track where they go. It’s false, of course, debunked repeatedly by journalists and fact-checking organizations. Yet the falsehood persists online — in fact, it’s one of the most popular conspiracy theories making the rounds. In May, a Yahoo/YouGov poll found that 44 percent of Republicans (and 19 percent of Democrats) said they believed it.

How online misinformation spreads

This particular example is just a small part of what the World Health Organization now calls an infodemic — an unprecedented glut of information that may be misleading or false. Misinformation — false or inaccurate information of all kinds, from honest mistakes to conspiracy theories — and its more intentional subset, disinformation, are both thriving, fueled by a once-in-a-generation pandemic, extreme political polarization and a brave new world of social media.

“The scale of reach that you have with social media, and the speed at which it spreads, is really like nothing humanity has ever seen,” says Jevin West, a disinformation researcher at the University of Washington.

One reason for this reach is that so many people are participants on social media. “Social media is the first type of mass communication that allows regular users to produce and share information,” says Ekaterina Zhuravskaya, an economist at the Paris School of Economics who coauthored an article on the political effects of the Internet and social media in the 2020 Annual Review of Economics.

Trying to stamp out online misinformation is like chasing an elusive and ever-changing quarry, researchers are learning. False tales — often intertwined with elements of truth — spread like a contagion across the Internet. They also evolve over time, mutating into more infectious strains, fanning across social media networks via constantly changing pathways and hopping from one platform to the next.

Misinformation doesn’t simply diffuse like a drop of ink in water, says Neil Johnson, a physicist at George Washington University who studies misinformation. “It’s something different. It kind of has a life of its own.”

How hate spreads online

Each dot in this diagram represents an online community hosted on one of six widely used social media networks. (Vkontakte is a largely Russian network.) Black circles indicate communities that often contain hateful posts; the others are clusters that link to those. The green square near the center is a particular Gab community that emerged in early 2020 to discuss the pandemic but quickly began to include misinformation and hate. Communities connect to one another with clickable links, and while they often form discrete groups within a platform, they can also link to different platforms. Such links can break and reconnect, creating changing pathways through which misinformation can travel. Breaking these links and preventing new ones from forming could be an effective way for society to control the spread of hate and misinformation.

The Gates fiction is a case in point. On March 18, 2020, Gates mentioned in an online forum on Reddit that electronic records of individuals’ vaccine history could be a better way to keep track of who had received the Covid-19 vaccine than paper documents, which can be lost or damaged. The very next day, a website called Biohackerinfo.com posted an article claiming that Gates wanted to implant devices into people to record their vaccination history. Another day later, a YouTube video expanded that narrative, explicitly claiming that Gates wanted to track people’s movements. That video was viewed nearly 2 million times. In April, former Donald Trump advisor Roger Stone repeated the conspiracy on a radio program, which was then covered in the New York Post. Fox News host Laura Ingraham also referred to Gates’s intent to track people in an interview with then US Attorney General William Barr.

But though it’s tempting to think from examples like this that websites like Biohackerinfo.com are the ultimate sources of most online misinformation, research suggests that’s not so. Even when such websites churn out misleading or false articles, they are often pushing what people have already been posting online, says Renee DiResta, a disinformation researcher at the Stanford Internet Observatory. Indeed, almost immediately after Gates wrote about digital certificates, Reddit users started commenting about implantable microchips, which Gates had never mentioned.

In fact, research suggests that malicious websites, bots and trolls make up a relatively small portion of the misinformation ecosystem. Instead, most misinformation emerges from regular people, and the biggest purveyors and amplifiers of misinformation are a handful of human super-spreaders. For example, a study of Twitter during the 2016 election found that in a sample of more than 16,000 users, 6 percent of those who shared political news also shared misinformation. But the vast majority — 80 percent — of the misinformation came from just 0.1 percent of users. Misinformation is amplified even more when those super-spreaders, such as media personalities and politicians like Donald Trump (until his banning by Twitter and other sites), have access to millions of people on social and traditional media.

Thanks to such super-spreaders, misinformation spreads in a way that resembles an epidemic. In a recent study, researchers analyzed the rise in the number of people engaging with Covid-19-related topics on Twitter, Reddit, YouTube, Instagram and a right-leaning network called Gab. Fitting epidemiological models to the data, they calculated R-values, which in epidemiology represent the average number of people a sick person would infect. In this case, the R-values describe the contagiousness of Covid-19-related topics on social media platforms — and though the R-value differed depending on the platform, it was always greater than one, indicating exponential growth and, possibly, an infodemic.
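For readers who want to see the arithmetic, here is a minimal sketch (not the study's actual code) of how an R-value can be read off engagement data: fit an exponential curve to the early growth in newly engaging users, then convert the growth rate to an R-value under a simple assumption about how long one "generation" of sharing takes. The daily counts and the one-day generation time below are hypothetical.

```python
# A minimal sketch, assuming hypothetical data and a simplified
# SIR-style relation R ~ 1 + r*T, where r is the exponential growth
# rate and T an assumed "generation time" (the typical delay between
# one user engaging and the next user they draw in).
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical daily counts of users newly engaging with a topic.
days = np.arange(10)
new_users = np.array([5, 8, 14, 22, 37, 60, 95, 155, 250, 400])

def exponential(t, n0, r):
    """Early-epidemic exponential growth: n(t) = n0 * exp(r * t)."""
    return n0 * np.exp(r * t)

(n0, r), _ = curve_fit(exponential, days, new_users, p0=(5.0, 0.5))

generation_time = 1.0  # assumed: one "generation" of sharing per day
R = 1 + r * generation_time
print(f"growth rate r = {r:.2f}/day, implied R = {R:.2f}")
# R > 1 means each engaged user draws in more than one new user:
# the topic is spreading exponentially.
```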


Bill Gates (left) meeting with Francis Collins (center), director of the National Institutes of Health, and Anthony Fauci (right), director of the National Institute of Allergy and Infectious Diseases, in 2017. Gates is one of many victims of online misinformation: In early 2020, he made a casual comment about better ways to keep electronic records of individuals’ vaccine history. Social-media users twisted this into a false conspiracy theory in which Gates and others would use the Covid-19 vaccine to implant microchips in everyone to remotely track their movements.

CREDIT: NATIONAL INSTITUTES OF HEALTH

Differences in how information spreads depend on features of the particular platform and its users, not on the reliability of the information itself, says Walter Quattrociocchi, a data scientist at the University of Rome. He and his colleagues analyzed posts and reactions — such as likes and comments — about content from both reliable and unreliable websites, the latter being those that show extreme bias and promote conspiracies, as determined by independent fact-checking organizations. The number of posts and reactions regarding both types of content grew at the same rate, they found.

Complicating matters more, misinformation almost always contains kernels of truth. For example, the implantable microchips in the Gates conspiracy can be traced to a Gates Foundation-funded paper published in 2019 by MIT researchers, who designed technology to record someone’s vaccination history in the skin like a tattoo. The tattoo ink would consist of tiny semiconductor particles called quantum dots, whose glow could be read with a modified smartphone. There are no microchips, and the quantum dots can’t be tracked or read remotely. Yet the notion of implanting something to track vaccination status has been discussed. “It isn’t outlandish,” Johnson says. “It’s just outlandish to say it will then be used by Gates in some sinister way.”

What happens, Johnson explains, is that people pick nuggets of fact and stitch them together into a false or misleading narrative that fits their own worldview. These narratives then become reinforced in online communities that foster trust and thus lend credibility to misinformation.

Johnson and his colleagues track online discussion topics in social media posts. Using machine learning, their software automatically infers topics — say, vaccine side effects — from patterns in how words are used together. It’s similar to eavesdropping on multiple conversations by picking out particular words that signal what people are talking about, Johnson says.
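The flavor of this technique can be shown with an off-the-shelf topic model. The snippet below is an illustration of the general approach, not Johnson's actual software: it uses latent Dirichlet allocation from scikit-learn to infer two topics from word co-occurrence in a handful of invented posts.

```python
# A hedged illustration: inferring discussion topics from patterns of
# word co-occurrence, using latent Dirichlet allocation (LDA).
# The posts are invented examples, not real social media data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "vaccine side effects worry me after the second dose",
    "side effects from the vaccine are mild says my doctor",
    "lockdown rules extended again in our city this march",
    "protest against the lockdown planned downtown",
    "dose schedule and side effects listed on the health site",
    "city officials debate ending the lockdown next week",
]

vectorizer = CountVectorizer(stop_words="english").fit(posts)
X = vectorizer.transform(posts)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

words = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-4:][::-1]]
    print(f"topic {i}: {top}")
# Words that tend to appear together ("vaccine", "side", "effects",
# "dose") cluster into one topic; "lockdown"/"city" words into another.
```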

And as in conversations, topics can evolve over time. In the past year, for example, a discussion about the March lockdowns mutated to include the US presidential election and QAnon conspiracy theories, according to Johnson. The researchers are trying to characterize such topic shifts, and what makes certain topics more evolutionarily fit and infectious.

Some broad narratives are especially tenacious. For example, Johnson says, the Gates microchip conspiracy contains enough truth to lend it credibility but also is often dismissed as absurd by mainstream voices, which feeds into believers’ distrust of the establishment. Throw in well-intentioned parents who are skeptical of vaccines, and you have a particularly persistent narrative. Details may differ, with some versions involving 5G wireless networks or radiofrequency ID tags, but the overall story — that powerful individuals want to track people with vaccines — endures.

And in online networks, these narratives can spread especially far. Johnson focuses on online groups, like public Facebook pages, some of which can include a million users. The researchers have mapped how these groups — within and across Facebook and five other platforms, Instagram, Telegram, Gab, 4Chan and a predominantly Russian-language platform called VKontakte — connect to one another with weblinks, where a user in one online group links to a page on another platform. In this way, groups form clusters that also link to other clusters. The connections can break and relink elsewhere, creating complex and changing pathways through which information can flow and spread. For example, Johnson says, earlier forms of the Gates conspiracy were brewing on Gab only to jump over to Facebook and merge with more mainstream discussions about Covid-19 vaccinations.
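A toy version of such a map is easy to build. The sketch below uses invented community names, not the researchers' dataset: communities become graph nodes tagged by platform, weblinks become edges, and the cross-platform links are picked out separately.

```python
# A toy sketch of a cross-platform community map (all names invented).
import networkx as nx

G = nx.Graph()
communities = {
    "fb_moms": "facebook", "fb_health": "facebook",
    "gab_covid": "gab", "gab_news": "gab",
    "tg_channel": "telegram",
}
for node, platform in communities.items():
    G.add_node(node, platform=platform)

G.add_edges_from([
    ("fb_moms", "fb_health"),    # within-platform link
    ("gab_covid", "gab_news"),   # within-platform link
    ("gab_covid", "fb_health"),  # cross-platform link
    ("tg_channel", "gab_news"),  # cross-platform link
])

cross = [
    (u, v) for u, v in G.edges
    if G.nodes[u]["platform"] != G.nodes[v]["platform"]
]
print("cross-platform links:", cross)
# Removing these few edges disconnects the platforms from one another,
# which is the intuition behind targeting inter-platform links.
```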

These cross-platform links mean that the efforts of social media companies to take down election- or Covid-19-related misinformation are only partly effective. “Good for them for doing that,” Johnson says. “But it’s not going to get rid of the problem.” The stricter policies of some platforms — Facebook, for example — won’t stop misinformation from spilling over to a platform where regulations are more relaxed.


Researchers compared how often users of several social networks reacted to posts (e.g. liked, shared or commented) from unreliable vs. reliable sources. Users of the predominantly right-wing network Gab reacted to unreliable information 3.9 times as often as reliable information, on average. In contrast, YouTube users mostly engaged with reliable information, while users of Reddit and Twitter fell in between. The way that misinformation spreads online seems to depend largely on the characteristics of a network and its users, the researchers say. (X axis shows the ratio of reactions to information from unreliable vs. reliable sources.)

And unless the entire social media landscape is somehow regulated, he adds, misinformation will simply congregate in more hospitable platforms. After companies like Facebook and Twitter started cracking down on election misinformation — even shutting down Trump’s accounts — many of his supporters migrated to platforms like Parler, which are more loosely policed.

To Johnson’s mind, the best way to contain misinformation may be to target these inter-platform links, instead of chasing down every article, meme, account or even page that peddles misinformation — which is ultimately a futile game of Whac-A-Mole. To show this, he and his colleagues calculated a different R-value. As before, their revised R-value describes the contagiousness of a topic, but it also incorporates the effects of dynamic connections in the underlying social networks. Their analysis isn’t yet peer-reviewed, but if it holds up, the formula can provide a mathematical way of understanding how a topic might spread — and, if that topic is rife with misinformation, how society can contain it.

For example, this new R-value suggests that by taking down cross-platform web links, social media companies or regulators can slow the transmission of misinformation so that it no longer spreads exponentially. Once regulators identify an online group brimming with misinformation, they can then remove links to other platforms. This needs to be the priority, Johnson says, even more than removing the group pages themselves.
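The paper's revised R-value isn't reproduced here, but the intuition behind cutting links can be checked with a simplified thought experiment (a schematic illustration, not the paper's model): seed a rumor in one community and ask which communities it can reach along links, before and after the lone cross-platform link is removed.

```python
# A schematic reachability experiment on an invented two-platform
# network; not the researchers' model, just the underlying intuition.
import networkx as nx

def build_graph(include_cross_link=True):
    G = nx.Graph()
    platforms = {"a1": "A", "a2": "A", "a3": "A",
                 "b1": "B", "b2": "B", "b3": "B"}
    for n, p in platforms.items():
        G.add_node(n, platform=p)
    # Chains of communities within each platform.
    G.add_edges_from([("a1", "a2"), ("a2", "a3"),
                      ("b1", "b2"), ("b2", "b3")])
    if include_cross_link:
        G.add_edge("a3", "b1")  # the single inter-platform link
    return G

for cross in (True, False):
    G = build_graph(cross)
    reached = nx.node_connected_component(G, "a1")  # seed at "a1"
    print(f"cross-link={cross}: rumor can reach {len(reached)} communities")
# With the link, all 6 communities are reachable from the seed.
# Without it, spread stays confined to the 3 communities on platform A.
```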

Fact-checking may also help, as some studies suggest it can change minds and even discourage people from sharing misinformation. But the impact of a fact-check is limited, because corrections usually don’t spread as far or as fast as the original misinformation, West says. “Once you get something rolling, it’s real hard to catch up.” And people may not even read a fact-check if it doesn’t conform to their worldview, Quattrociocchi says.

Other approaches, such as improving education and media literacy, and reforming the business model of journalism to prioritize quality over clicks, are all important for controlling misinformation — and, ideally, for preventing conspiracy theories from taking hold in the first place. But misinformation will always exist, and no single action will solve the problem, DiResta says. “It’s more of a problem to be managed like a chronic disease,” she says. “It’s not something you’re going to cure.”

[Source: This article was published in knowablemagazine.org by Marcus Woo]

A new study has shown that although they may better protect your personal data, independent search engines display far more vaccine-related misinformation than internet giants such as Google.

In 2019, the World Health Organization (Geneva, Switzerland) listed vaccine hesitancy as one of the top 10 threats to global health. The internet plays a huge role in this rise in negative attitudes towards vaccinations as misinformation continues to be published and widely spread, with many taking what they read online as fact.

Determined to fully evaluate the role of search engines in spreading this misinformation, an international research group conducted a study to monitor the amount of anti-vaccination resources returned by different search engines.

Internet companies' tracking and storing of users' personal data, and their monitoring of users' online behavior, have left many internet users wary of internet giants and turning, instead, to independent search engines. The study, published in Frontiers in Medicine, focused on how search engines' approaches to data privacy may impact the quality of scientific results.

“A recent report showed that 50% of people in the UK would not take a coronavirus vaccine if it was available. This is frightening – and this study perhaps gives some indication as to why this is happening,” remarked lead author Pietro Ghezzi (Brighton & Sussex Medical School, UK).

The researchers searched for the term “vaccines autism” in a variety of search engines in English, Spanish, Italian and French. For each search, the Chrome browser was cleared of cookies and previous search history. They then analyzed the first 30 results from each search.

The link between vaccines and autism traces back to a now discredited study, published in 1998, that tied the MMR vaccine to the development of autism. Despite the countless studies published since that disprove the theory, the flawed findings are still shared as fact by many.

The researchers discovered that alternative, independent search engines (DuckDuckGo, Ecosia, Qwant, Swisscows and Mojeek) and other commercial engines (Bing and Yahoo) displayed more anti-vaccination websites (10-53%) in the first 30 results than Google (0%).
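The scoring behind these percentages is simple to express in code. The sketch below is schematic: in the real study, pages were classified by human raters, and the labels here are invented; the function just computes the share of anti-vaccination pages in a top-30 result list.

```python
# A schematic of the scoring step (labels invented; in the study,
# pages were classified by hand as anti-vaccine, pro-vaccine, etc.).
def anti_vax_share(labels: list[str]) -> float:
    """labels: one entry per search result, e.g. 'anti', 'pro', 'neutral'."""
    top30 = labels[:30]
    return 100.0 * sum(1 for label in top30 if label == "anti") / len(top30)

# Hypothetical labelled result lists for two engines:
engine_a = ["anti"] * 12 + ["pro"] * 10 + ["neutral"] * 8  # 40% anti
engine_b = ["pro"] * 25 + ["neutral"] * 5                  # 0% anti

print(anti_vax_share(engine_a))  # 40.0 -- within the reported 10-53% range
print(anti_vax_share(engine_b))  # 0.0  -- like Google in the study
```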

Furthermore, some localized versions of Google (English-UK, Italian and Spanish) also returned up to 10% more anti-vaccination resources than google.com (English-US).

“There are two main messages here,” Ghezzi summarized. “One is to the Internet giants, who are becoming more responsible in terms of avoiding misinformation, but need to build trust with users regarding privacy because of their use of personal data; and the other is to the alternative search engines, who could be responsible for spreading misinformation on vaccines, unless they become better in their role as information gatekeepers. This suggests that quality of the information provided, not just privacy, should be regulated.”

The researchers concluded that tools should be developed to test search engines from the perspective of information quality, particularly for health-related webpages, before search engines can be deemed trustworthy providers of public health information.

[Source: This article was published in biotechniques.com]

When tragic events happen, social media quickly fills with rumors and fake stories. NPR's Lulu Garcia-Navarro talks to Kate Starbird of the University of Washington about misinformation on the web.

LOURDES GARCIA-NAVARRO, HOST:

Was the U.S. duped into striking Syria? No. The grisly deaths this week of women and children in what looks to be a chemical weapons attack were not the work of opponents of President Trump in his own government - the, quote, unquote, "deep state." But that is the rumor that's been circulating on social media. And of course, this isn't the first time rumors like that have spread. Kate Starbird studies the spread of rumors. She teaches at the University of Washington, and her research traces fake news back past this presidential race to at least the 2013 Boston Marathon bombing.

KATE STARBIRD: We found a couple of different kinds of rumors, and one of them - there was this weird little rumor. It was kind of small but very different from the other ones, and that was this theory that the Navy SEALs had perpetrated the Boston Marathon bombings. And they had been blamed on these what they called patsies, which were the suspects that were the Chechen brothers. But it was all part of this - it was a false flag that the U.S. government or some elements of the U.S. government had perpetrated this event on itself.

GARCIA-NAVARRO: What do we see happen when a tragedy occurs? Is this - did you find that this was common?

STARBIRD: We did see across all of the manmade disaster events, over and over again, these same claims go from, you know, event to event to event.

GARCIA-NAVARRO: What we've just seen now on the strike on Syria, saying that somehow these were actors staging the event to sort of dupe the United States to make them - to draw them into the war. It's the same kind of thing.

STARBIRD: Exactly. So as soon as I saw this event, I actually posted on Facebook. I said, you know, you're seeing these images. But within, you know, a couple hours or maybe a day, you're going to see claims that this didn't really happen or that it was perpetrated by someone else. And of course, that comes to fruition.

GARCIA-NAVARRO: So the question is, so if we're seeing the same thing happening over and over again, who's doing this?

STARBIRD: I think you have people that are doing it for individual reasons. They have some political motivation, or they have financial motivation. They can make money selling these ideas, selling ads on their website.

GARCIA-NAVARRO: Just trying to get eyeballs so that they can sell, you know, whatever product they're pushing.

STARBIRD: So there's that element. Then there's people that are sincere believers in this stuff. They really - they're bought in. They think about this. They're 9/11 truthers (ph). They're JFK conspiracy theorists. And then there was elements of what seemed to be purposeful disinformation strategies. So someone who doesn't believe these things, who's got a political motivation for injecting very confusing ideas that are anti-globalist, anti-corporatist, anti-mainstream media.

And they - a lot of the theories had this idea that there's a group of very powerful people that are outside of government that sort of orchestrate things. Those are the kinds of things that were actually much more problematic than the folks that were doing it for money or the sincere believers.

GARCIA-NAVARRO: What does this say to you? What have you learned through looking at this?

STARBIRD: I've learned a lot in the last few months. So, you know, I come from computer science and media studies, but I'm really sort of an engineer background. And I've ended up in this very politicized space looking for sort of a U.S. right versus left kind of spectrum, and that's not what I found.

What I found was these kinds of theories and this way of thinking about the world is appealing to both people on the left and the right - that people are going to see one theory. Like, they're anti-vaccine or they're anti-GMO. And they're getting drawn into these other theories of, you know, deep state actors that are changing world events to manipulate you. And then getting pulled into this worldview that is very potentially dangerous.

GARCIA-NAVARRO: That's Kate Starbird. She teaches at the University of Washington. And she studies the spread of misinformation online, a huge problem these days. Thanks so much for joining us.

STARBIRD: Thank you.

Source: npr.org
