The YouTube experiment reveals the potential of ‘inoculating’ millions of users against misinformation

Video: The ‘Inoculation Science’ animation covering emotional language. Emotions are powerful tools of persuasion. Research shows that using emotional words, especially those that evoke negative emotions such as fear or outrage, increases the viral potential of social media content. This use of negative emotional words to manipulate is sometimes called “scaremongering.”

Credit: Inoculation Science project

Short animations giving viewers a taste of the tactics behind misinformation can help ‘inoculate’ people against harmful content on social media when deployed in the YouTube ad slot, according to a large online experiment led by the University of Cambridge.

In collaboration with Jigsaw, a unit of Google dedicated to tackling threats to open societies, a team of psychologists from the universities of Cambridge and Bristol created 90-second clips designed to familiarize users with manipulation techniques such as scapegoating and deliberate inconsistency.

This “pre-bunking” strategy preemptively exposes people to the tropes behind malicious propaganda, so they can better identify lies online, regardless of the topic.

The researchers behind the Inoculation Science project compare the approach to a vaccine: by giving people a “micro-dose” of misinformation in advance, it helps prevent them from falling for it in the future – an idea based on what social psychologists call “inoculation theory”.

The findings, published in Science Advances, come from seven experiments involving a total of almost 30,000 participants – including the first “real-world field study” of inoculation theory on a social media platform – and show that a single viewing of a film clip raises awareness of misinformation.

The videos feature concepts from the “misinformation playbook,” illustrated with relatable examples from film and television such as Family Guy or, in the case of false dichotomies, Star Wars (“Only a Sith deals in absolutes”).

“YouTube has over 2 billion active users worldwide. Our videos could easily be embedded in the advertising space on YouTube to prevent misinformation,” said study co-author Professor Sander van der Linden, head of the Social Decision-Making Lab (SDML) in Cambridge, who led the work.

“Our research provides the necessary proof of concept that the principle of psychological inoculation can easily be extended to hundreds of millions of users worldwide.”

Lead author Dr Jon Roozenbeek of Cambridge’s SDML describes the team’s videos as “source-independent”, avoiding people’s biases about where information comes from and how it matches – or doesn’t match – what they already believe.

“Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who resents being manipulated,” he said.

“The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education and different personality types. This is the basis for a general inoculation against misinformation.”

Google – the parent company of YouTube – is already putting the findings to use. At the end of August, Jigsaw will launch a prebunking campaign across multiple platforms in Poland, Slovakia and the Czech Republic to get ahead of emerging disinformation about Ukrainian refugees. The campaign, run in partnership with local NGOs, fact-checkers, academics and disinformation experts, is designed to build resilience against harmful anti-refugee narratives.

“Harmful misinformation takes many forms, but manipulation tactics and narratives are often repeated and therefore can be predicted,” said Beth Goldberg, co-author and head of research and development for Google’s Jigsaw unit.

“Teaching people about techniques such as ad hominem attacks that set out to manipulate them can help build resilience to believing and spreading false information in the future.

“We’ve shown that video ads as a method of delivering prebunking messages can be used to reach millions of people, potentially before harmful narratives take hold,” Goldberg said.

The team argues that pre-bunking can be more effective at combating the deluge of disinformation than fact-checking each lie after it spreads – the classic ‘debunking’ – which is impossible to do at scale and can entrench conspiracy theories by feeling like a personal attack on those who believe them.

“Propaganda, lies and misdirections are almost always created from the same playbook,” said co-author Professor Stephan Lewandowsky of the University of Bristol. “We developed the videos by analyzing the rhetoric of demagogues, who deal in scapegoating and false dichotomies.

“Fact-checkers can only refute a fraction of the lies circulating online. We need to teach people to recognize the misinformation playbook, so they understand when they are being misled.”

Six initial controlled experiments involved 6,464 participants, with the sixth experiment being conducted a year after the first five to ensure that earlier results could be replicated.

Data collection for each participant was comprehensive, ranging from basic information – gender, age, education, political leanings – to numeracy levels, conspiratorial thinking, how often they check news and social media, “bullshit receptivity” and a personality inventory, among other variables.

Taking all this into account, the team found that the inoculation videos improved people’s ability to spot misinformation and boosted their confidence in being able to do so again. The clips also improved the quality of “sharing decisions”: whether or not to pass on harmful content.

Two of the animations were then tested “in the wild” in an extensive experiment on YouTube, with the clips positioned in the pre-video ad slot, which offers a skip option after five seconds.

Google Jigsaw exposed around 5.4 million US YouTube users to an inoculation video, with nearly one million watching for at least 30 seconds. The platform then randomly gave 30% of those viewers a voluntary test question within 24 hours of their initial viewing.

The clips aimed to inoculate against the misinformation tactics of hyper-emotional language and the use of false dichotomies, and the questions – based on fictional posts – tested for detection of these tropes. YouTube also gave the same test question to a “control” group of users who had not watched an inoculation video. A total of 22,632 users answered a question.

Despite the intense “noise” and distractions on YouTube, the ability to recognize the manipulation techniques at the heart of misinformation increased by 5% on average.

Google says the unprecedented nature of the experiment means there is no direct data comparison available. However, the improvement in brand awareness from advertising on YouTube – known as “brand lift” – is generally limited to around 1% in surveys of fewer than 45,000 users.
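To make the scale of that result concrete, here is a minimal Python sketch of a two-proportion comparison, treating the reported 5% as a five-point difference in correct-answer rates. The group sizes come from the notes below; the correct-answer rates themselves are hypothetical placeholders, not the published figures.

    # Illustrative two-proportion comparison; the answer rates are hypothetical.
    from statistics import NormalDist

    n_inoc, n_ctrl = 11_432, 11_200   # group sizes reported in the notes below
    p_inoc, p_ctrl = 0.70, 0.65       # hypothetical correct-answer rates

    lift = p_inoc - p_ctrl            # a five-percentage-point improvement

    # Pooled standard error and z statistic for the difference in proportions.
    p_pool = (p_inoc * n_inoc + p_ctrl * n_ctrl) / (n_inoc + n_ctrl)
    se = (p_pool * (1 - p_pool) * (1 / n_inoc + 1 / n_ctrl)) ** 0.5
    z = lift / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"lift = {lift:.1%}, z = {z:.1f}, p = {p_value:.2g}")

Under these assumed rates, a five-point gap sits far outside chance variation at samples of this size, which is why a seemingly modest 5% average improvement can be meaningful.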

“Users participated in the tests about 18 hours on average after watching the videos, so the inoculation seems to have stuck,” van der Linden said.

The researchers say such a boost in recognition could be a game-changer if scaled up across social platforms, which would be inexpensive to do: the average cost per view of significant length was just US$0.05.
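The two reported figures – roughly one million views of 30 seconds or more, and US$0.05 per significant-length view – make the cost arithmetic easy to check; the ten-million-view target below is purely illustrative.

    # Back-of-the-envelope campaign cost; the scale-up target is hypothetical.
    cost_per_view = 0.05          # US$ per view of significant length (reported)
    views_observed = 1_000_000    # approximate 30-second-plus views in the field study
    target_views = 10_000_000     # illustrative scale-up target

    print(f"observed campaign: ~${cost_per_view * views_observed:,.0f}")   # ~$50,000
    print(f"10M-view campaign: ~${cost_per_view * target_views:,.0f}")     # ~$500,000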

Roozenbeek added, “If someone wants to pay for a YouTube campaign that measurably reduces susceptibility to misinformation among millions of users, they can do it, and at a minimal cost per view.”

NOTES:

The six initial experiments:
The first six controlled experiments randomly assigned each participant either a 90-second “inoculation” video or a neutral control video. Participants were then shown ten social media posts in random order: five using deliberately manipulative techniques (though not all involved outright falsehoods) and five neutral posts. Participants were asked to rate how much they trusted the information, how manipulative they felt it was, and how likely they were to share it.
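For readers who want the protocol in concrete terms, a minimal Python sketch of one participant’s session follows; the function names and the 1-7 rating scale are illustrative assumptions, not details taken from the study materials.

    import random

    def rate(aspect):
        # Stand-in for a participant's survey response on an assumed 1-7 scale.
        return random.randint(1, 7)

    def run_participant(manipulative_posts, neutral_posts):
        # Random assignment: 90-second inoculation video or neutral control video.
        condition = random.choice(["inoculation", "control"])
        # Five manipulative and five neutral posts, shown in random order.
        posts = random.sample(manipulative_posts, 5) + random.sample(neutral_posts, 5)
        random.shuffle(posts)
        ratings = [{"post": p,
                    "trust": rate("trust"),
                    "manipulativeness": rate("manipulativeness"),
                    "share_likelihood": rate("share")}
                   for p in posts]
        return condition, ratings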

Results include the following; a sketch of how such multipliers can be read follows the list:

  • Emotional language video: “Inoculated” participants were between 1.5 and 1.67 times better than the control group at identifying this manipulation technique.
  • False dichotomies video: “inoculated” participants were 1.95 times – almost twice – as good as the control group at identifying this manipulation technique.
  • Inconsistency video: “Inoculated” participants were more than twice as good (2.14) as the control group at identifying this manipulation technique.
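Assuming the multipliers above are simple ratios of correct-identification rates – a plain reading of the text, not necessarily the paper’s exact statistic – the arithmetic looks like this; the 40% control rate is a hypothetical placeholder.

    # Reading a "1.95 times as good" multiplier as a rate ratio.
    control_rate = 0.40                      # hypothetical control-group identification rate
    ratio = 1.95                             # reported multiplier for the false dichotomies video
    inoculated_rate = control_rate * ratio   # implied inoculated-group rate

    print(f"control: {control_rate:.0%} -> inoculated: {inoculated_rate:.0%}")  # 40% -> 78%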

The YouTube experiment:
The YouTube inoculation ad campaign ran for a fortnight in [YEAR] and targeted English-speaking users in the United States aged 18 or over who had watched at least one political or current affairs video on the platform.

A total of 22,632 participants answered a test question on YouTube: 11,432 who had seen an inoculation video and 11,200 who had not.

Examples of questions as presented by YouTube:

False dichotomy:
Rate this sentence: “We either need to improve our education system or tackle street crime.”
Users had to choose whether the sentence contained: a command; scaremongering; a false dichotomy; or none of the above.

Emotional language:
Rate this sentence: “Formula linked to outbreak of terrifying new disease in helpless infants – parents despair.”
Users had to choose whether the sentence contained: a command; emotional language; a false dichotomy; or none of the above.

Video and other links:

All inoculation videos, as well as general information about the approach, can be found at: https://inoculation.science/

Or on YouTube here: https://www.youtube.com/channel/UCiov-3rtgg9Nl_ezyWyOHpQ/videos

Details of Sander van der Linden’s forthcoming book on “prebunking” and countering misinformation, Foolproof, can be found here: https://www.waterstones.com/book/foolproof/dr-sander-van-der-linden/9780008466718

About Jigsaw:
Jigsaw is a Google team that explores threats to open societies and leverages research, technology, and collaborations inside and outside of Google to develop scalable, long-term solutions. The team works to keep people safe online by tackling issues ranging from censorship and harassment to misinformation and violent extremism.

