Promoting Fake News

It’s been established that Facebook, the largest social media platform, now plays a perilous role as one of the largest sources of news, with 66% of its 1.94 billion monthly active users choosing to source their news solely from the site. With users being sold information they are predetermined to agree with, they are left in a vulnerable position. This is why, when BuzzFeed recently uncovered that 38% of content posted on far-right Facebook pages and 19% of content on far-left Facebook pages promoted false information, the implications were severe. The issue here is that Facebook’s recommender algorithm allows these fake news stories to gather traction without an editor to fact-check them. In previous news-media eras there were several gatekeepers, as previously discussed, such as journalists, editors and publishers, who fact-checked and corrected media before it was published to the public. Now, in this world of democratised news, information is published directly to the public via Facebook, bypassing the old editorial channels, and because Facebook promotes people’s engagement with content they agree with, false articles run rampant. This is where the platform itself has to step in. Facebook has become the new gatekeeper of news. Where once there were newspaper publications and television programs, Facebook now dominates as people’s one-stop shop for news. It is for this reason that Facebook needs to monitor and control the spread of Fake News, moderating falsified information so its vulnerable audience is not further misled. #FakeNewsFeed

Vladimir’s Fake News

In today’s era of Fake News, one of the primary examples of the consequences of using social media as a primary news source can be seen in the leader of the Free World. It is now retrospectively apparent that social media, and people’s dependency on it, impacted the results of the 2016 presidential election, for better or worse depending on your political views. Facebook has revealed that a Russian propaganda company spent $100,000 on advertisements pushing polarising issues like gun control, immigration and racial tension, naming both Donald Trump and Hillary Clinton directly. A Facebook spokesperson stated, “Our analysis suggests these accounts and Pages were affiliated with one another and likely operated out of Russia. We have shared our findings with U.S. authorities investigating these issues, and we will continue to work with them as necessary.” As previously discussed, Facebook promotes content to users that it deems they will agree with. Because of this, the content published by nearly 500 coordinated fraudulent accounts was able to gain traction as it targeted individuals’ opinions on particular issues. The full implications of this incident are difficult to determine, but as the content aimed to strengthen the ideals of Trump’s followers, it illustrates a serious problem for Facebook and its users.

Who is the Gatekeeper of Facebook News?

Professor David M. White discusses an interesting idea, the ‘gatekeepers’ standing between the source and the audience of news, in his paper “The ‘Gate Keeper’: A Case Study in the Selection of News.” Traditionally, news was sourced from newspapers, radio and television, and here the gatekeepers were the reporters who “make the initial judgment as to whether a story is ‘important’ or not” and, finally, the editor who “has charge of the selection of national and international news which will appear on the front and ‘jump’ pages of his newspaper” [D. M. White (1950)]. These gatekeepers play an important role in shaping what the audience sees and how they perceive world events. What they decide is important enough to be news, and the angle they decide to take, has a direct impact on what their audience reads and sees.

In this new media landscape, the gatekeepers of news have drastically shifted. With 66% of Facebook’s 1.94 billion users relying on the site as their only source of news, serious questions begin to arise: who are the new gatekeepers? The recommender algorithm employed by Facebook has led to a world in which users are their own gatekeepers. The algorithm selects information it has determined the user will positively engage with. The information we see on our news feed is shown to us because of what we like, comment on and share, which friends we associate with and what they like, comment on and share. This personalisation of the news, combined with the oversaturation of the news industry, means we have superseded reporters and editors as the gatekeepers of our own news. The implications of this are yet to be seen.
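
To make this shift concrete, here is a minimal sketch of an engagement-driven feed ranker. It is purely illustrative: the Post and User structures, the topic_affinity field and the scoring weights are all assumptions made for this example, not Facebook’s actual system. The point is that nothing in the score asks whether a story is true; the only ‘gate’ is predicted engagement.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    story_id: str
    topic: str
    likes_by_friends: int    # engagement signals from the user's own circle
    shares_by_friends: int

@dataclass
class User:
    # topics the user has previously liked, commented on or shared, with affinity scores
    topic_affinity: dict = field(default_factory=dict)

def predicted_engagement(user: User, post: Post) -> float:
    """Score a post purely on how likely the user is to engage with it.
    Note there is no factual-accuracy or editorial term in this score."""
    affinity = user.topic_affinity.get(post.topic, 0.0)
    social_signal = post.likes_by_friends + 2 * post.shares_by_friends
    return affinity * (1 + social_signal)

def build_feed(user: User, candidate_posts: list) -> list:
    """The user's own past behaviour, not an editor, decides what surfaces first."""
    return sorted(candidate_posts, key=lambda p: predicted_engagement(user, p), reverse=True)

# Usage: the story matching the reader's existing affinity ranks first,
# regardless of which story is accurate.
reader = User(topic_affinity={"gun control": 0.9, "climate": 0.1})
feed = build_feed(reader, [
    Post("a", "climate", likes_by_friends=3, shares_by_friends=1),
    Post("b", "gun control", likes_by_friends=3, shares_by_friends=1),
])
print([p.story_id for p in feed])  # ['b', 'a']
```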

Expand Your News Circle

This will shock nobody, but people are too reliant on Facebook. Yet the reason this dependency on the social media platform should be of concern is not what you’d expect. Mark Zuckerberg’s creation boasts 1.94 billion users, almost a third of the world’s population, with two-thirds of Facebook users (66%) utilising the platform as their sole news provider. Seeing as fewer than 40% use multiple sites to source information on world events, any negative implication of Facebook as a news provider will have widespread effects. The primary negative effect is that users are only shown what they want to see. As the algorithm aims to show users content the platform deems they will positively engage with, users only see perspectives they already agree with. This means users never change their view on an issue, as it is always supported by their news feed. This flows into the biologist Jakob von Uexküll’s discussion of “umwelts”: he highlights the way an individual’s selective “functional cycles” of “receptors” and “effectors” [J. von Uexküll, pg. 324] limit their “umwelt,” or world. This idea of a world of media and communications limited by our selective receptors can be extended to our interactions with Facebook’s news feed algorithm. In essence, the technological development intensifies the selective nature of our “receptors” in the “functional cycle.” Thus Facebook’s algorithm has a detrimental effect on users’ perception of world events, as they are never shown news they do not already agree with.
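
As a rough illustration of how this feedback loop can narrow a user’s “umwelt” over time, the toy simulation below repeatedly re-ranks a mixed feed by learned topic affinity. All of its numbers, topics and the reinforcement rule are assumptions made for illustration, not a model of Facebook’s real algorithm; it simply shows that when engagement feeds back into ranking, the variety of topics a user is exposed to collapses within a few rounds.

```python
import random
from collections import Counter

TOPICS = ["gun control", "immigration", "climate", "economy", "sport"]

def simulate_feed_narrowing(rounds: int = 10, feed_size: int = 20, seed: int = 0):
    """Toy feedback loop: the more a topic is shown and engaged with,
    the more strongly it is favoured in the next round's ranking."""
    rng = random.Random(seed)
    affinity = {t: 1.0 for t in TOPICS}  # the user starts with no preference
    for r in range(1, rounds + 1):
        candidates = [rng.choice(TOPICS) for _ in range(100)]
        # rank candidates by current affinity (plus a little noise) and keep the top of the feed
        feed = sorted(candidates,
                      key=lambda t: affinity[t] + rng.random() * 0.1,
                      reverse=True)[:feed_size]
        shown = Counter(feed)
        # the user engages with whatever they are shown, reinforcing those topics
        for topic, count in shown.items():
            affinity[topic] += 0.5 * count
        print(f"round {r}: distinct topics in feed = {len(shown)}, "
              f"dominant topic = {shown.most_common(1)[0][0]}")

if __name__ == "__main__":
    simulate_feed_narrowing()
```

After the first round or two the feed is dominated by a single topic: the selective “receptors” of the functional cycle, amplified by the ranking loop, shut the rest of the world out of view.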