
Monday, December 2, 2024

Weekly newsletter 12/02/24: Bill of Rights Day, Disinformation, Autocracy, Inc.

Markham's Slow News newsletter will be published every Monday. Please subscribe, share, and comment.

December 15th is Bill of Rights Day. 

There are some great resources here provided by Larry Ferlazzo. As the USA moves further from democracy to autocracy, a reminder about the Bill of Rights is especially relevant and timely.


Disinformation can contribute to avoidable harm and death

Disinformation is prolific in our age of digital media, when publishing can occur without any editorial review. It may seem ironic that I am writing this when I engage in that very activity myself. However, what distinguishes my publication is that I have taken the Pro-Truth Pledge and promise my readers to publish only what I have vetted and know to be true to the best of my ability.


Some people hold the values of honesty, accuracy, and validity very highly, while others, motivated by any number of incentives, are selling something based on explicit and implicit agendas. It behooves the consumer of media to insist on honesty and accuracy and to discern the motivations and incentives of the communicator. The question “Why is this person communicating this to me?” can be very illuminating. In everyday language, we should ask, “What is this communicator up to, and what are they hoping will happen as a result of communicating this message?”


Just as people can spread infectious bacteria and viruses, they can spread false information. Just as infectious diseases can harm and kill so can toxic disinformation.


For more from Hayden Godfrey’s article, “Doctors Who Put Lives At Risk With Covid Misinformation Rarely Punished,” click here.


Autocracy, Inc. by Anne Applebaum


As the US has slipped further from democracy to autocracy with the re-election of Donald Trump as president, knowledge of the signs and symptoms of autocracy as a form of government, and of the consequences of this development, has become much more important. This shift in governance has a highly significant influence on the daily events described in the media.


From Autocracy, Inc. by Anne Applebaum


"Nowadays, autocracies are run not by one bad guy but by sophisticated networks relying on kleptocratic financial structures, a complex of security services—military, paramilitary, police—and technological experts who provide surveillance, propaganda, and disinformation.  p. 1


"Unlike military or political alliances from other times and places, this group operates not like a bloc but rather like an agglomeration of companies, bound not by ideology but rather by a ruthless, single-minded determination to preserve their personal wealth and power: Autocracy, Inc.  p. 2


"Instead of ideas, the strongmen who lead Russia, China, Iran, North Korea, Venezuela, Nicaragua, Angola, Myanmar, Cuba, Syria, Zimbabwe, Mali, Belarus, Sudan, Azerbaijan, and perhaps three dozen others share a determination to preserve their personal wealth and power; Autocracy, inc.  p.3



The world is run by corporations, which the US Supreme Court has deemed persons for constitutional purposes. What's up with that?


Capitalism, in which the profit motive is the only legitimate value for corporations to pursue, furthers the power of the oligarchs, who can buy the political policies they prefer by financing elections.


What do you think Elon Musk is doing working with Trump, heading a concocted government agency called the “Department of Government Efficiency”? The billionaires run the US.


Our US motto, instead of "In God We Trust" should be "Money Talks and Bullshit Walks."


So, I think Applebaum is on to something in the introduction. And now I am very interested in reading the rest of the book to see what she has to say.


Sunday, November 24, 2024

How does disinformation go viral?

 

Across groups, social influences also produce noise. If someone starts a meeting by favoring a major change in the company’s direction, that person might initiate a discussion that leads a group unanimously to support the change. Their agreement might be a product of social pressures, not of conviction. If someone else had started the meeting by indicating a different view, or if the initial speaker had decided to be silent, the discussion might have headed in an altogether different direction—and for the same reason. Very similar groups can end up in divergent places because of social pressures.


Kahneman, Daniel; Sibony, Olivier; Sunstein, Cass R. Noise. Little, Brown and Company. Kindle Edition.


Sometimes what Kahneman et al. call “informational cascades” is called “peer pressure.” Solomon Asch and other social psychologists demonstrated this dynamic decades ago.


We have colloquial sayings like “Better to go along to get along,” “When in Rome, do as the Romans do,” “Why go against the grain?” “Don’t upset the apple cart,” and “You shouldn’t disturb the status quo.”


In keeping with the title of their book, the authors write that informational cascades are “noise.” Indeed they are. Major contributors to informational cascades are power and what are sometimes called “opinion leaders.” The first story told about an incident, event, or topic "frames" the future discussion: any subsequent offering will be compared to it. "Disinformation" often goes viral in this way, with the first story spread again and again as subsequent commenters try to rebut it.


When posts on social media go "viral," they demonstrate what Kahneman et al. call an "informational cascade."
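
To see how quickly a cascade can lock in, here is a toy simulation in the spirit of the classic Bikhchandani–Hirshleifer–Welch cascade model (a sketch of the general idea, not a model taken from Noise itself): each person receives a weak private signal about which option is better, but also hears everyone who spoke before them, and once one option leads by two votes, everyone afterward follows the crowd. The signal accuracy is an illustrative assumption.

```python
# Toy informational-cascade simulation (illustrative assumptions throughout).
import random

random.seed(7)

SIGNAL_ACCURACY = 0.6  # private signals are only weakly informative (assumed)

def run_meeting(n_people=10):
    votes = []
    for _ in range(n_people):
        # Private signal: points to the truly better option "B" 60% of the time.
        signal = "B" if random.random() < SIGNAL_ACCURACY else "A"
        lead = votes.count("A") - votes.count("B")
        if lead >= 2:
            votes.append("A")     # social proof overrides the private signal
        elif lead <= -2:
            votes.append("B")
        else:
            votes.append(signal)  # no strong consensus yet: trust one's own signal
    return votes

trials = 10_000
wrong = sum(run_meeting().count("A") > 5 for _ in range(trials))
print(f"Meetings that settled on the worse option: {wrong / trials:.0%}")
```

Even though every private signal favors the better option, a sizable fraction of simulated meetings lock in the worse one, purely because of who happened to speak first.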


To what extent are you an opinion leader in the groups you participate in? When have you been the leader, and when have you been subject to another leader and to group pressure? Have you ever participated in an organizational decision that didn’t seem right to you, but went along because you did not want to challenge the developing majority opinion of the group?


Noise is well worth reading, as it provides a deeper understanding of the disinformation so rampant in our digital age.

Sunday, October 4, 2020

The Pernicious Contagion Of Misinformation

 


False statements — about Covid-19 and so much else — spread like a virus online. Scientists should study them like one.

10.2.2020

You’ve heard the claims. Hydroxychloroquine is a miracle cure for the Covid-19 virus. Wearing a mask is bad for you. Vaccines cause autism. Left-wing arsonists are responsible for the fires in California.

The Internet is full of misinformation — that is, inaccurate statements — including the sinister, intentionally misleading subset known as disinformation. Both are spreading, a contagion that imperils society just as surely as the coronavirus itself. Those who spread it run the gamut of society; a new study by Cornell University researchers concludes that President Trump has been the leading source of Covid-19 misinformation reported by news media, who often repeat the information “without question or correction.”

Scientists can approach the social disease of misinformation the same way they address real, biological diseases, say experimental psychologist Briony Swire-Thompson and political scientist David Lazer of Northeastern University. Writing in the 2020 Annual Review of Public Health, they explore questions of exposure, susceptibility and transmission of health-related misinformation.

Swire-Thompson spoke with Knowable Magazine about what we can learn from taking an epidemiological approach to misinformation. This conversation has been edited for length and clarity.

What do you mean by an epidemiological approach to misinformation?

The epidemiological approach is really just looking at how and why information spreads between people in a network. That’s very different from the individual approach that I’m used to in the experimental world, where you manipulate various conditions and see what the outcome is. Ideally, you need both.
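
As a rough illustration of that framing, here is a minimal sketch (in Python, with entirely made-up parameters) that treats a false claim like a pathogen moving through a social network, in the style of a textbook SIR model. The network size, share probability, and loss-of-interest rate are illustrative assumptions, not values from the research discussed here.

```python
# Minimal SIR-style sketch of a claim spreading over a social network.
import random

random.seed(42)

N = 1000               # people in the network (assumed)
AVG_FRIENDS = 8        # average connections per person (assumed)
P_SHARE = 0.05         # chance an exposed friend starts sharing (assumed)
P_LOSE_INTEREST = 0.3  # chance a sharer stops sharing each step (assumed)

# Build a random friendship graph (Erdos-Renyi style).
p_edge = AVG_FRIENDS / (N - 1)
friends = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < p_edge:
            friends[i].add(j)
            friends[j].add(i)

# S = hasn't seen the claim, I = actively sharing it, R = saw it, moved on.
state = {i: "S" for i in range(N)}
state[0] = "I"  # patient zero posts the false claim

step = 0
while any(s == "I" for s in state.values()):
    step += 1
    sharers = [i for i, s in state.items() if s == "I"]
    for i in sharers:
        for j in friends[i]:
            if state[j] == "S" and random.random() < P_SHARE:
                state[j] = "I"  # exposure converts some friends into sharers
        if random.random() < P_LOSE_INTEREST:
            state[i] = "R"      # sharer loses interest
    # (newly converted sharers start spreading on the next step)

reached = sum(1 for s in state.values() if s != "S")
print(f"Claim died out after {step} steps, reaching {reached} of {N} people")
```

Varying P_SHARE or the network's density shows the threshold behavior epidemiologists talk about: below a critical point the claim dies out quickly, and above it, it reaches much of the network.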

Where does most online misinformation spread?

Much of this research is yet to be done. At the moment, a lot of the academic research on misinformation and disinformation is done using Twitter. It’s easier, because there’s more data-sharing between Twitter and the academic community than for other platforms like Facebook or Instagram. It’s also easier because of issues around privacy. When you tweet, you put it out there in public. But social media like Facebook and WhatsApp are private, so access is more complicated, as it should be with privacy concerns. It means that less research has been done on a lot of these platforms.

What do you know so far about who spreads misinformation?

One study by other researchers found that older adults — people over the age of 65 — are seven times more likely to share fake news than people 18 to 29. So older adults could potentially be spreading misinformation more often. We found the same trend in our own studies of fake news during the 2016 US election. We did see, too, that people who are very politically engaged are more likely to view and share political fake news.

But it’s a big problem that a lot of these studies have been done exclusively with political misinformation. We don’t know if that generalizes to other subjects like health. Health misinformation is very understudied.

The really surprising finding with our study, though, was the concentration. We had a sample of 16,000 people, and found that there were 16 individuals sharing 80 percent of the fake news, which is extraordinary. They’re superspreaders, just like with Covid-19. They were tweeting 71 times per day as compared to the median person, who tweeted 0.1 times per day. We assume they had automation tools that helped them retweet or reshare content.
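
For a sense of what that kind of concentration looks like, here is a small sketch: simulate a heavy-tailed distribution of sharing activity and ask what fraction of all shares the most active 16 users account for. The Pareto parameters are assumptions chosen for illustration, not values fit to the study's data.

```python
# Illustrative sketch of concentration in sharing activity (assumed parameters).
import random

random.seed(1)

# Simulate shares-per-user with a heavy tail: most people share almost
# nothing, while a few share constantly.
shares = sorted((random.paretovariate(1.1) for _ in range(16_000)), reverse=True)

total = sum(shares)
top16 = sum(shares[:16])
print(f"Top 16 of 16,000 users account for {top16 / total:.0%} of all shares")
```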

That suggests those people are very important in the misinformation ecosystem. What do we know about them?

Very little. Twelve of them were females, but that is something I don’t believe will replicate in a larger sample. This is the big problem: You can’t really say much from only 16 individuals. We’re hopefully going to look at this with a much larger sample, and then we can make some generalizations about basic demographics like their age and political orientation.

What about bots? Are they doing most of the spreading, or is it coming from real people?

In our study, we linked voter registration data to Twitter handles to be sure they are real people. But even if bots are tweeting some of the misinformation, the question is, what influence are they having on real people? It’s unlikely that they have as much impact on people’s beliefs as some think, because they’re not as embedded in human social networks. Also, the trustworthiness of the source is really important, and we’re more likely to believe information from friends and family than from unknown accounts. However, the true impact of bots is still under debate.

What does this tell us about the virality of misinformation?

We often think of virality as one person sharing information with another, who shares it with another, and so forth. That’s compared to a broadcast, where one person or a media outlet with lots of followers shares the information, and it gets very, very popular due to just one broadcast. It still gets seen by many people, but how that happens is different.

One recent paper found it was more likely that information was being broadcast than spread through true virality. That’s an interesting study, because it shows that people who have large numbers of followers probably have more responsibility to be sure that what they share is accurate. But that study was done with information in general, not misinformation.
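
The distinction can be made precise. One published metric, “structural virality” (Goel et al., 2015), is the average distance between all pairs of nodes in a share tree: a pure broadcast (a star, one account reshared directly by many followers) scores low, while a long person-to-person chain scores high. A small sketch with toy trees, not real data:

```python
# Structural virality = mean distance between all pairs of nodes in a share tree.
from itertools import combinations

def structural_virality(tree):
    """tree maps each node to its parent; the root maps to None."""
    def path_to_root(node):
        path = []
        while node is not None:
            path.append(node)
            node = tree[node]
        return path

    total, pairs = 0, 0
    for a, b in combinations(tree, 2):
        depth_a = {n: d for d, n in enumerate(path_to_root(a))}
        # Walk up from b; the first ancestor shared with a gives the distance.
        total += next(db + depth_a[n]
                      for db, n in enumerate(path_to_root(b))
                      if n in depth_a)
        pairs += 1
    return total / pairs

# Broadcast: one account (0) reshared directly by six followers.
star = {0: None, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0, 6: 0}
# Viral chain: each person reshares from the previous one.
chain = {i: (i - 1 if i else None) for i in range(7)}

print(f"broadcast (star): {structural_virality(star):.2f}")  # ~1.71, low
print(f"viral (chain):    {structural_virality(chain):.2f}") # ~2.67, high
```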

A different paper found that false information spread faster, further and deeper than true information — but this was observational, not experimental, so we still don’t know why. The authors proposed that it was due to the information being novel and inspiring fear, disgust, and surprise. It had this emotional component, they proposed, that was behind the higher spread.

But these are still early days. Surprisingly early days, in this field.

On the other side of the contagion, who is most likely to receive online misinformation?

We found there was a very similar pattern as for sharing. Again, it was older adults. I think that on both the sharing and the exposure end, they could be a group we want to check in on to see if they’re believing this information. They might be fine. People might be, like, “I know that’s false. It’s just funny.” So we should check.

People on the political right both share and view more. While conservatives and liberals share the same proportion of misinformation that they view, conservatives are sharing more because they are exposed to more misinformation in their social media ecosystem.

What can we do to reduce the spread of misinformation?

There is a lot that we all can do to reduce the spread of misinformation online. Simple measures like correcting misinformation if we see our friends and family sharing it can go a long way. Also, taking a bit more time to read an article and to consider whether it is true prior to sharing, and being more conscious about who we friend or follow online, can be considered good practice.

Does it help or hurt to correct misinformation?

Scientists used to be concerned about a phenomenon called the backfire effect, where you try to correct the information and it has the opposite effect: People increase their belief instead of reducing it. I think many people became quite afraid of correcting, and they preferred to just let the misinformation sit out there.

People thought the backfire effect was due to two things: One, if the information was connected to people’s sense of identity, when the correction occurred they were more likely to dig in their heels. Two, the effect of information repetition: I have to repeat the misinformation to correct it, and this familiarity makes people think the information is more true.

I’ve done research on both, and we have found there’s very little evidence for either. We haven’t been able to replicate the findings that showed these things, even using the exact same items. And it isn’t just my work. We came to the conclusion that the backfire effect is not widespread. I think it’s now safe to say corrections work.

Even for Covid-19-related misinformation?

It is a bit of a perfect storm these days in the pandemic. In the correction world, the best response we have is to tell people what the true information is. But for Covid-19 we can’t say what is true yet, in many areas, because it takes such a long time for science to establish the facts. It’s like we’re fighting with our hands tied behind our back.

Are there differences in how people respond when misinformation is corrected?

I’ve done only one study on this, looking at whether people in different cultures process information differently. And we found a big difference between people in the USA and Australia. We were looking at when misinformation from people’s favored politicians was corrected. In both the US and Australia, people did a great job of decreasing their belief in the misinformation. That was really positive, an encouraging finding.

But while people in the US reduced their belief in the misinformation itself, it didn’t change the way they were going to vote, on the left or the right. In Australia, we found that the correction greatly impacted people’s feelings and their voting intentions toward these political figures. That could be because Australia is less polarized. But the fact that we found a difference in something like that highlights the importance of doing misinformation studies in different locations and on different topics, because I don’t think we know where findings are going to be replicable and where they’re not.

What do we still have to learn?

A lot of these questions — who’s susceptible to misinformation, who’s spreading it and for what reasons, who’s predisposed — have very unsatisfying answers right now. My partner does vision science. I’m always super-jealous, because how people see has been studied for at least a hundred years. For misinformation, we’re still trying to establish the basics, in terms of who and when. But I think these are the exact kind of things we need to know to be able to build this model of epidemiology.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.
