How to know what’s fake on social media? In this video, Prof. Darren Linvill from Clemson University Media Forensics Hub explains how disinformation works, common tactics of political propaganda, the role of trolls, hashtags, and how to spot fake news.
00:00 - Introduction
00:38 - About Prof. Darren Linvill
01:09 - About political disinformation and propaganda
01:55 - Russian trolls and disinformation tactics
04:26 - Chinese trolls and disinformation strategies
07:56 - About positive online reviews
09:49 - How to fact-check the information online
14:43 - Disinformation around COVID-19
21:05 - Russian propaganda around Ukraine-Russia war
23:55 - Russian social media disinformation
31:25 - How to avoid media disinformation
More information about Prof. Darren Linvill: https://news.clemson.edu/our-experts/darren-linvill/
#fakenews #misinformation #disinformation #socialmedia #pissedconsumer #experttalkswithpissedconsumer
Subscribe to Pissed Consumer social media accounts:
Facebook: https://www.facebook.com/PissedConsumerCom/
Instagram: https://www.instagram.com/pissedconsumer/
0:00
Now let's begin with some fake news
0:02
There has never been a time in human history where information is as readily available as it is today
0:10
Do you want to know what it is? Guys, we have Darren Linvill with us, associate professor at Clemson University's Media Forensics Hub
0:25
Darren specializes in social media disinformation and misinformation. And I assume Darren will tell us a little bit about himself
0:35
Introduce yourself, Darren, please, to our viewers. As you said, I'm an associate professor here at Clemson
0:41
I'm in the Department of Communication. And I'm also lead researcher in Clemson University's Media Forensics Hub
0:49
I've been studying social media for close to 15 years now, but specifically exploring disinformation and misinformation and the tactics and strategies of state actors in particular around disinformation for the past five years
1:08
Political disinformation, political propaganda. What are the strategies that companies utilize, that state actors utilize, to disinform people about things?
1:24
Yeah, sure. It can take a lot of forms. And it really depends on the particular actor and their goals. You know, not all strategies work in every context
1:38
So let me answer that question by sort of juxtaposing two different actors in this space, the Russians and the Chinese, and what we often see from each of these groups
1:51
So Russia engages in disinformation in a very artisanal way. They will create accounts from the ground up, you know, starting from zero followers on a particular social media platform
2:07
And they will integrate that account into a particular online community, purporting to be a member of that community and gain followers slowly over time
2:19
And as they do that, they'll try to pull that community slowly in a particular direction
2:26
Historically, they've done this in English language conversations here in the United States, famously around the 2016 election
2:35
But here at Clemson, we have identified accounts that we attribute very likely to the Russian Internet Research Agency as recently as 2020
2:46
communicating in English. But we know that the Russians also do this, in fact mostly, in Russian
2:57
They create accounts that purport to be Russian nationalist pro-Putin accounts and integrate
3:04
themselves into those communities online and then pull those communities in a particular direction
3:09
Basically, pushing particular narratives, particular conversations that the community they're communicating to might already be inclined to believe. Now, this is what we normally
3:24
think of when we think of a classic Russian troll. But what classic Russian trolls don't do is they
3:32
don't really go out there looking for a fight. Mostly they're communicating to people who agree
3:39
with the things that they're saying, because that's how persuasion works. You don't go out
3:44
there and persuade a Clinton voter to become a Trump voter or vice versa, you persuade someone
3:50
who's already inclined to Trump to actually show up and vote for Trump, or someone who's inclined
3:55
to vote for Clinton to actually show up and vote for Clinton, or you pull that community in a
4:01
particular direction around a particular issue that they might have already been thinking
4:07
but you want to make them a little more extreme in that direction. Tell me, what's the difference between us and them
4:12
because we live here? Now, I want to juxtapose what the Russians have often done in the past
4:21
and continue to do with what the Chinese often do. Chinese disinformation on social media
4:28
oftentimes functions in a very different way. Now, the Russians, they're very interested
4:32
in what you think of your neighbor. They're very interested in pushing particular narratives
4:38
but the Chinese are much more interested in simply what you think of China
4:45
They're not necessarily, very often at least, engaging in, you know, American politics or EU politics or whatever the case may be
4:56
They really care about things related to China. You want to hear my impersonation of an American?
5:02
Yeah, yeah. Okay. Hey, I really, really want that. That looks good
5:08
Conversations around Uyghur atrocities or conversations around the recent Olympics,
5:17
those are the conversations that the Chinese push, and the way they function is very different. They're
5:23
not, you know, creating fake accounts that purport to be part of a particular community, but they
5:29
operate en masse, thousands of accounts that don't even try very hard to look authentic
5:37
And what they do is they make sure that certain conversations don't happen
5:42
So they'll do things like take over a hashtag. Around the Olympics, for instance, we saw the Chinese accounts using the hashtag #GenocideGames on Twitter
5:54
Thousands and thousands of these Chinese accounts using this hashtag over and over again
5:59
And that might seem counterintuitive. Why would the Chinese want to use this #GenocideGames hashtag,
6:04
a hashtag that had been used by a lot of people that were critical of China
6:09
specifically critical of China in their treatment of the Uyghur Muslim minority in the Xinjiang region of China
6:20
And they were using that hashtag to connect the Olympics to Uyghur atrocities
6:26
It's a nicely packaged hashtag. It's got alliteration, genocide games. And it certainly wasn't a hashtag that the Chinese wanted anybody using or a hashtag they wanted trending on Twitter or any other platform
6:40
But the reason they used it is they took that hashtag and they attached it to a bunch of unrelated tweets
6:48
And and so that anyone looking to engage in the conversations using that hashtag would come across unrelated content, content that was pro-China, content that was part of another conversation
7:01
and that really affected people's ability to use that hashtag to criticize China
7:10
And you could see it in the data. You could see people just starting to use that hashtag to criticize China
7:16
And the Chinese swarmed that hashtag. This is a tactic that has been used long before the Chinese by, you know
7:25
genuine activists as well, taking over, brigading a hashtag like that. But the Chinese did that so that that conversation wouldn't happen
7:34
So the Russians, they want specific conversations to happen. And there's certain tactics you would use to make conversations happen
7:42
Whereas the Chinese want to make sure the conversations don't happen. Any conversation critical of China, let's make sure that's not happening
7:49
And there's other tactics you would use to make sure the conversations don't happen
7:54
Pissed Consumer is all about consumer reviews. Pissed Consumer is about 16 years old, and that's when online reviews were starting. Companies engage in disinformation by buying reviews
8:13
You've probably heard about such a practice. A company that has only positive reviews online
8:20
Is it possible? That's an excellent question. I think it depends on your sample size
8:26
You know, if a company only has three reviews and they're all positive, sure, I would believe that
8:32
But if they have 100,000 reviews and they're all positive, that starts to get really, really questionable
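To put rough numbers on that sample-size point, here is a minimal sketch, assuming a hypothetical 90% satisfaction rate among genuine reviewers (both the rate and the review counts are illustrative assumptions, not figures from the interview):

```python
# Hypothetical illustration: probability that N independent, genuine reviews
# are all positive if each genuine reviewer is positive with probability p.
# Both p and the review counts below are assumptions for the sake of the example.
def prob_all_positive(n_reviews: int, p_positive: float) -> float:
    return p_positive ** n_reviews

if __name__ == "__main__":
    p = 0.90  # assumed share of genuine customers who leave a positive review
    for n in (3, 100, 100_000):
        print(f"{n:>7} reviews, all positive: {prob_all_positive(n, p):.3g}")
    # ~0.73 for 3 reviews (plausible), ~2.7e-05 for 100, and effectively 0
    # for 100,000 -- which is why a perfect record at scale looks suspicious.
```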
8:39
The question is a matter of numbers. Have you ever done research on the amount of negative content versus positive content at a corporate level?
8:47
We have a hire coming here in the near future with Clemson's marketing department, and we're hoping to find somebody who can do just that type of research
8:56
Because we're very interested in that. I think that's an important space for disinformation that isn't talked about enough, especially in terms of spaces that really affect people's everyday lives, and there's a lot of money in that space, without question
9:15
You know, fake reviews are big business in the same way that fake social media profiles are big business
9:23
You know, there's huge centers in South Asia, especially, that specialize just in that type of work
9:35
And we've seen those centers spread social media disinformation, but of course, they spread fake reviews as well
9:42
So, no, we haven't done that work yet, but it's a direction that I'm hoping to go in the future because I think it's important
9:47
Can you help our viewers differentiate? How do you fact-check information online?
9:56
You want to talk about politics? We can do it on the political level. It would be better for our consumers to understand it from a corporate level, company to consumer, B2C
10:05
I think that in looking for factual information, it's important to understand the concept of authenticity
10:17
to understand the process through which information came to you to begin with
10:26
Did the person that's sharing this information have some kind of agenda? Who are they? Do you
10:33
know them? Do you have a personal relationship with this individual? In general, my response
10:40
to people wondering how to be safe in the digital age, in an internet environment,
10:47
I usually tell them that they need to treat the digital world more like the real world
10:55
In the real world, people understand that when you go outside, most strangers don't want to hurt you
11:02
Most strangers are perfectly nice. In some circumstances, maybe you'd even become friends
11:10
Who knows? But they still treat strangers like strangers. You don't walk outside and, you know, trust every single person you meet
11:21
You don't walk outside and hand over all the contact information in your phone to someone you're walking by
11:28
You don't invite somebody into your home simply because they're wearing a T-shirt that you like
11:34
But in the digital world, for some reason, you know, we do that every single day
11:39
We share our followers with others. We invite people onto our platform and allow them to follow us
11:47
We engage in conversations with people that we know absolutely nothing about and don't
11:54
even know if the information that they are telling us is truthful, if it's even a real
11:58
person at all. And so I think it's really just fundamentally important to treat the digital world more
12:05
like the real world. Because you know what? Most people in the digital world are real and they mean you no harm
12:12
But you know, sometimes they do. Sometimes they're trying to steal your information, steal your
12:20
money, whatever it may be. They're trying to persuade you of something for some agenda that they may have,
12:26
they're trying to spread disinformation. And so you have to treat the digital world with
12:33
just a little bit of skepticism and understand the space that you're in. I think a lot of, you know,
12:41
younger folks, younger adults sort of have started to understand that intuitively
12:47
But for people that didn't grow up with the internet, it's a hard concept to really get
12:53
You receive a piece of information on Twitter, on Facebook. It's kind of hard to start. So on one
12:59
hand, we are saying, okay, don't share a lot of personal information online with people that you
13:05
don't know. But then, on the other hand, there is a tweet that flew in from nowhere and you need to
13:13
fact-check it. It's a little bit difficult to go back to that person and start asking them questions,
13:19
because they may not be willing to share, right? So how do you learn more about the
13:26
person when you yourself are not prepared to give out extra information about yourself? So how do you?
13:32
It's kind of difficult. Yeah, I think that it's important to look at that information that
13:38
they're sharing with you as well. When I'm engaging in the digital world, there's certain
13:47
sources that I know that I trust, and I try to stick to those sources. If I'm getting information
13:53
from a link that I don't recognize or that I've never heard of, and there's a lot of perfectly
14:00
reputable sources out there that I've never heard of. You know, I'm not perfect, but I'm still going
14:08
to go look for that same information elsewhere. You know, you don't believe something just because
14:13
you heard it once from one source. And you want to go double check that, especially before you
14:20
share it with others, before you use your own credibility to give credibility to somebody else
14:27
I think that's key because that's what a lot of bad actors are really trying to get
14:32
They're trying to use your credibility to spread that information that they're trying to spread or in order to spread that link that they're trying to get other people to click on
14:42
We just came out of a two-year lockdown for COVID. And if I remember correctly, there were a lot of people online who were vaccine deniers,
14:57
people who were saying, my body is my body, I don't want to stick anything into it, freedom of
15:05
expression. Yet we know a lot of social media platforms shut down COVID deniers.
15:14
What do you think happened there? So a lot of people were saying, hey, we need vaccines. On the other side,
15:22
not huge, but a lot of people were saying that they don't need vaccines.
15:27
They had the right not to take it. They had the right to speak, an ability to say that they don't like lockdowns,
15:36
they don't like masks. What happened there? Were there political actors behind it?
15:43
What's your take on that full story? Anytime you're talking about conversations where literally the whole world is having the same conversation
15:52
And for the past couple of years, literally everyone has been talking about the pandemic
15:58
and COVID-19. I think that you have to assume that there's a lot going on
16:05
And there's a lot of different actors and they all have different motivations. And it is true that, for instance, Russia spread some disinformation about the pandemic, especially about some of the other vaccines, because they were trying to pump up their own vaccine and make their own vaccine look more reputable
16:30
You know, they weren't necessarily trying to get people not to take any vaccine, but Russia was all about their own vaccine
16:36
Other actors out there were spreading disinformation in order to make a buck. Take Alex Jones, for instance. He was very famously spreading disinformation about COVID-19 because he was trying to sell products off of his website
16:56
He had some products that he claimed would have an effect on curing the virus, which was all hogwash
17:06
But he wanted to make a buck. And there were a lot of other individuals, just like Alex Jones, who were trying to take advantage of various conversations around the pandemic just to make money
17:17
There's been shysters in the world since well before the Internet. The Internet just helps them reach a much wider audience
17:26
But honestly, you know, the biggest problem we have with misinformation around COVID-19 and frankly continue to have is just that there's a lot of people having these conversations and they're scared and they're nervous for any number of reasons
17:45
This is especially true early in conversations around COVID-19 when there wasn't a lot of, you know, information we could rely on because it hadn't been vetted yet
17:59
We didn't know much about the virus. So people were just making things up or, you know, previous false stories would emerge and then be applied to COVID-19
18:13
I remember early, early in the first month, two months of the pandemic, I wanted to see who was the first person that connected 5G to COVID-19
18:28
I don't know if you remember, but there were these these stories circulating that 5G caused COVID-19
18:34
You know, 5G, previously 4G had been blamed for all sorts of illnesses and disease from, you know, cancer to Ebola at various times
18:45
You know, we always distrust new technologies and it's been true of 5G
18:50
And so these these stories about 5G causing COVID-19 were emerging. And I went and I looked who was the first person on Twitter that connected the idea of 5G and COVID-19
19:01
I found it was an individual in New Zealand and all their tweet said was, and 5G causes COVID-19 in 5, 4, 3, 2
19:16
It was a joke. They were saying this is a story that's about to happen. It's hogwash, but somebody's going to say it
19:22
And sure enough, the very next day on Twitter, somebody was saying 5G causes COVID-19 and these conversations were happening in Italy
19:31
So, you know, the Internet's a global place. All these conversations are interconnected
19:36
So some of the false stories about COVID-19, you could you could predict would happen because it's the same sort of thing that's happened in the past
19:45
And it's the same sort of thing that will happen in the future
19:50
You know, we've we've always liked to accuse the elite and the powerful of being responsible for horrible sins
19:58
So, you know, of course, we're going to accuse the elite and the powerful of being responsible for COVID-19
20:07
We've always wanted to accuse, you know, countries we don't trust of being responsible for things
20:13
So, of course, you know, we're going to accuse China of horrible things related to COVID-19
20:19
And it's especially difficult when these stories emerge with an element of truth to them
20:24
There is an element of truth to stories about China's responsibility for COVID-19
20:31
But then those little nuggets snowball into something that is disinformation
20:37
Guys, over here, I would like to ask you to subscribe to our channel, like this video
20:44
It is important to share this information with our viewers, with your friends
20:48
So it is important for you to actually take an action and click on that bell
20:55
Darren, let's talk about Russia-Ukraine conflict for a little bit, if you don't mind
21:00
Russia has been running a hybrid war for the past eight years, since 2014
21:05
How effective is Russian propaganda and disinformation in Ukraine and in the world overall?
21:12
What's your opinion? I think that globally there's an assumption that Ukraine is winning the global information war between Ukraine and Russia
21:24
And to a large degree, that's probably true. If you're looking at conversations in English or conversations just in the West
21:35
But Putin's main audience has always been and continues to be the Russian people
21:43
It's the Russian people who allow him to stay in power, and keeping their loyalty, keeping them in place, allows him to maintain his authority, maintain his power
21:58
And so it is true that the Russian people have always been his main target of disinformation
22:03
And that disinformation has taken all kinds of forms in the past 20 years, including disinformation that comes through traditional media
22:15
You know, there's various state media outlets in Russia, but also in new media and social media and various websites
22:25
And in the Russian language, Russia does seem to be doing much better, if not winning the information war
22:36
We worked with ProPublica near the start of the invasion to identify accounts across a number of different social media platforms, Twitter, Instagram, TikTok, VK, which is Russian Facebook, and Telegram especially, that we attributed as very likely coming from the Internet Research Agency in St. Petersburg, Russia
23:02
The Internet Research Agency, famously responsible for intervening in the 2016 U.S. election.
23:10
And some of what they were doing with these accounts was very effective, especially on TikTok.
23:17
They had hundreds of thousands of followers on TikTok, millions of likes from these accounts,
23:21
and they seem to have had a real effect on some of these conversations happening in
23:29
Russian. And it's certainly true that, you know, Putin has still maintained high levels of
23:38
positive responses and feedback from the Russian people. You know, there are inklings that maybe
23:48
there's some chinks in his armor, but he still has maintained that support
23:54
Social media is global, right? I would probably agree with you that within the Russian Federation, there is overwhelming support for Putin just because they don't see a lot of other information
24:09
They lost access to Instagram, they lost access to Facebook. Their access to information on social media has been limited by the government. But there are other Russian-speaking people around the world. The person who is exposed to pure Russian Federation propaganda and to English
24:35
news channels, who do you think is winning? I think that because of the particular types
24:42
of tactics that Putin uses, the ball is still in Putin's court
24:48
And that's because he doesn't necessarily have to win anything. He just has to give just enough doubt
25:00
For decades, Russian disinformation has centered on this idea of doubt, of doubt in believing
25:12
in mainstream sources and doubt that there is any real objective truth
25:21
So one tactic that we found them to be using quite effectively, seemingly, early in the
25:28
war was just undermining Western sources of information by suggesting that these Western
25:36
sources were lying. One video clip we saw, for instance, was a video of a German journalist standing in front of a field of body bags
25:49
And the clip said that this is a German journalist standing in Kiev reporting on the conflict in Ukraine
25:55
And then one of the body bags starts moving around and sits up and has a cigarette
26:00
Now, the video clip wasn't in Kiev at all. It was in Berlin in 2015, and this video was of a global climate change protest
26:11
But that video got a lot of traction, given a different context, in conversations happening around Ukraine
26:19
And, you know, what that video was designed to do was to spread doubt, to say, yeah, maybe the Russians lie, but so do the Ukrainians, so does the West
26:33
We saw a number of fake fact check videos. So these were videos that were purporting to be videos that the Ukrainians were spreading of destroyed Russian vehicles
26:49
And the Russians had created the whole thing from whole cloth in order to fact-check them, spreading additional doubt
27:01
I know that's complex, but the real issue there is that it's grounded in just spreading distrust
27:12
Because, you know, even if you don't think Putin is great, it's good enough for Putin if you just think, fine, Putin's not great, but nobody else is any better
27:24
As long as the grass isn't greener on the other side, Putin's still winning
27:27
Because if you don't believe in anything, then you're not going to fight for anything
27:32
And that's all Putin really needs at the end of the day: for the Russian people to not be willing to fight for anything
27:39
It's not Putin. How do you counteract it? It's very difficult in autocratic countries that maintain control of their information systems
27:50
Now, in a lot of these countries, especially Russia, you see slow change happening in the younger generations, especially the younger urban generations that have ways, that have access to Western media, that have access to information that's not Russian state propaganda
28:13
And so I think that what's going to change it is going to be a generational shift
28:21
It's not something that you can necessarily do overnight, sort of, you know, invading Russia and instituting regime change, which is probably not currently in our best interest as long as the Russians maintain a nuclear arsenal
28:40
And I think the same is true in other countries as well
28:46
China maintains incredible control over their information systems in China, but young Chinese still manage to jump the firewall and get information from the outside
28:58
And, you know, comparatively small numbers, but it still happens. The same is true in most autocratic countries, Iran as well, especially
29:05
What's the craziest fake news you've ever come across? The craziest fake news I've ever come across. Oh, gosh, that's tough
29:14
There's just so much bizarre, so many bizarre things out there. I think that
29:22
the craziest, and I'm not kidding, the craziest fake news out there is probably one that is
29:30
more commonly believed than many others, and that's the QAnon conspiracy theory.
29:38
Because when you really look under the surface at what, you know, QAnon really
29:47
purports to be about, the QAnon conspiracy theory says that a cabal of pedophile cannibal
29:57
socialists is in control of the world's governments and that the only individual who can stop
30:08
these people is Donald Trump. So somehow, you know, for the past generation or more,
30:17
cannibal pedophiles have gained control of, you know, our government. And that's just nuts.
30:26
I mean, of course not. And especially if you start looking at all the shades of various things
30:34
that QAnon claims are true. But again, like with, you know, conspiracy theories related to China and
30:43
COVID-19, these things become easy to believe when there is just a kernel of truth to them
30:52
And you know what, there are some really dirty people in the government.
30:57
And there are sex scandals related to both liberal and conservative elites
31:05
And so when you take just a nugget of that and blow it up into a much bigger story, it can become something that millions of people believe to be true
31:13
Final message to our viewers in regards to social media, consumer reviews
31:21
How would you summarize our conversation today? I would say, one, find sources that you trust and go to those sources
31:35
But then go beyond that. Find some more sources and make sure that you're always triangulating your sources
31:43
Maybe even look at a few sources that you wouldn't necessarily trust
31:48
something from the other side of the ideological aisle and take those viewpoints into account as
31:55
well. But the main thing I would say is to have healthy skepticism, but that it's still okay to
32:06
trust. You can't disbelieve everything. We're still living in an objective reality. Restate your message a little bit differently for our viewers
32:17
Guys, when you look at Facebook reviews, Google reviews, don't forget to take a look at Pissed Consumer reviews
32:22
It's an opposite point of view sometimes