The CSR Newsletters are a freely available resource generated as a dynamic complement to the textbook, Strategic Corporate Social Responsibility: Sustainable Value Creation.

To sign up to receive the CSR Newsletters regularly during the fall and spring academic semesters, e-mail author David Chandler at david.chandler@ucdenver.edu.

Tuesday, October 10, 2017

Strategic CSR - Fake news

I am in the process of thinking through a new case for the next edition of Strategic CSR – something around the role of the media in the 21st century and the growing prevalence of fake news. I have always been interested in the role of the media in society, both reflecting the news and (in their selection of which events to report) also creating it. More recently, I have been interested in how Facebook and Twitter have emerged as new-age media companies. Their role in shaping the outcome of the last presidential election here in the U.S., as well as the massive impact they have had on human relations, social etiquette, student attention spans (!!!), and the news agendas of traditional media outlets, is fascinating/frightening to watch.
 
With these evolving thoughts in mind, and in the aftermath of the devastation in Las Vegas, I noticed the article in the url below that explains how fringe groups are able to use social media to take advantage of crises to advance their agendas:
 
"When they woke up and glanced at their phones on Monday morning, Americans may have been shocked to learn that the man behind the mass shooting in Las Vegas late on Sunday was an anti-Trump liberal who liked Rachel Maddow and MoveOn.org, that the F.B.I. had already linked him to the Islamic State, and that mainstream news organizations were suppressing that he had recently converted to Islam. They were shocking, gruesome revelations. They were also entirely false — and widely spread by Google and Facebook."
 
The article reports that searches related to the shooting, in particular the name of the shooter, were generating results with false stories near the top of the list:
 
"In Google's case, trolls from 4Chan, a notoriously toxic online message board with a vocal far-right contingent, had spent the night scheming about how to pin the shooting on liberals. One of their discussion threads, in which they wrongly identified the gunman, was picked up by Google's 'top stories' module, and spent hours at the top of the site's search results for that man's name."
 
Facebook was also tricked into promoting false stories about the shooting:
 
"In Facebook's case, an official 'safety check' page for the Las Vegas shooting prominently displayed a post from a site called 'Alt-Right News.' The post incorrectly identified the shooter and described him as a Trump-hating liberal. In addition, some users saw a story on a 'trending topic' page on Facebook for the shooting that was published by Sputnik, a news agency controlled by the Russian government. The story's headline claimed, incorrectly, that the F.B.I. had linked the shooter with the 'Daesh terror group.'"
 
In both cases, the companies blamed "algorithm errors" for these mistakes. This demonstrates, of course, the extent to which algorithms now intrude into our lives, making them more convenient, but also producing unintended consequences. It also emphasizes the speed with which these groups move to take advantage of specific events, which suggests these companies are not as much in control of their product as they would like us to believe. What is more, the article argues that this is not a recent phenomenon, but something that has plagued these services for many years:
 
"But this was no one-off incident. Over the past few years, extremists, conspiracy theorists and government-backed propagandists have made a habit of swarming major news events, using search-optimized 'keyword bombs' and algorithm-friendly headlines. These organizations are skilled at reverse-engineering the ways that tech platforms parse information, and they benefit from a vast real-time amplification network that includes 4Chan and Reddit as well as Facebook, Twitter and Google. Even when these campaigns are thwarted, they often last hours or days — long enough to spread misleading information to millions of people."
 
All of this (together with the ongoing Russia-related Congressional investigations) raises two important questions – one for these companies and one for the broader CSR debate. First, what kind of companies are these? If they are merely tools that others use to communicate with each other (a media platform), then they can claim that the content they convey on their sites is not their business. All they are doing is enabling communication that would take place anyway. If, however, they are slowly taking on more of the functions of regular media companies (a media publisher), then that suddenly alters their level of responsibility for the content – editing was something they were happy to do when it helped produce a better product, but it is now something they may have to do more systematically:
 
"Facebook, for instance, previously had a team of trained news editors who chose which stories appeared in its trending topics section, a huge driver of traffic to news stories. But it disbanded the group and instituted an automated process last year, after reports surfaced that the editors were suppressing conservative news sites. The change seems to have made the problem worse — earlier this year, Facebook redesigned the trending topics section again, after complaints that hoaxes and fake news stories were showing up in users' feeds."
 
Second, an essential element of CSR is transparency and accountability, both of which are enabled by our ability to communicate clearly. The more free-flowing information is, the faster stories about sweatshops in Southeast Asia or oil spills in Nigeria can reach the stakeholders of large companies in ways that influence behavior. And it is the threat of this happening that can encourage companies to build long-lasting, trust-based stakeholder relationships. For all of this to hold true, however, it is essential that the information being circulated is both accurate and reliable. This is where attention turns back to Facebook, Twitter, and Google (among others). Clearly, they get this and are responding. Of course, more socially responsible companies would have done more to prevent this situation from spinning out of control (with such dire consequences) in the first place. The broader implications for these firms of not acting earlier are evident in the article in the second url below, which describes a discussion within the UK government as to whether Facebook and Google should be regulated as media publishers:
 
"Britain is looking at the role of Google and Facebook in the provision of news and what their wider responsibilities and legal status should be, a spokesman for Prime Minister Theresa May said on Tuesday. As more people get their news through Google and Facebook, some in the industry say the internet giants are publishers and not just platforms, meaning they should be held responsible for the content and regulated like traditional news providers."
 
Take care
David
 
 
Instructor Teaching and Student Study Site: https://study.sagepub.com/chandler4e
Strategic CSR Simulation: http://www.strategiccsrsim.com/
The library of CSR Newsletters is archived at: https://strategiccsr-sage.blogspot.com/
 
 
Capitalizing on a Mass Killing to Spread Fake News Online
By Kevin Roose
October 3, 2017
The New York Times
Late Edition – Final
Page A19
 
Britain looking at Google, Facebook role in news: PM May's spokesman
October 10, 2017
Reuters