The business of fake news: How algorithms influence our opinions

Scout Hutchinson, Columnist

The spread of information, news, personal ideas and stories through the internet has created a world where it seems like everything is at your fingertips. However, these online spaces have also become one of the biggest landscapes for business and uncontrolled capitalism. Information and knowledge have never been free, and the internet has not changed that. The Social Dilemma, a new Netflix documentary, has brought this conversation to the forefront of popular media discussion. 

Algorithms were created to cater to the person interacting with the program. However, they are now doing far more than advertising the clothing brands you click on the most. They are also molding our opinions and political ideologies. This has created an entirely new business and consumer relationship. Like any algorithm that is supposed to cater directly to your interests, social media platforms and internet search engines collect your data and then reproduce information back to you by offering similar “products”: similar opinions, groups and pages. For example, if you are someone who frequents flat earth Facebook pages, the algorithm is more likely to suggest other pages with similar conspiracy theories. 
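The mechanism described above can be sketched in a few lines. This is a toy illustration, not any platform’s actual system: it assumes pages are tagged with topics (the page names and tags here are made up) and simply recommends unvisited pages whose topics overlap most with a user’s click history.

```python
from collections import Counter

# Hypothetical catalog: page name -> set of topic tags.
PAGES = {
    "Flat Earth Society": {"flat-earth", "conspiracy"},
    "Moon Landing Hoax": {"conspiracy", "space"},
    "NASA Updates": {"space", "science"},
    "Gardening Tips": {"hobbies"},
}

def recommend(click_history, pages=PAGES):
    """Suggest unvisited pages that share the most topics with past clicks."""
    # Build an interest profile by counting the tags of every clicked page.
    interests = Counter(tag for page in click_history for tag in pages[page])
    unseen = [p for p in pages if p not in click_history]
    # Rank unseen pages by how strongly their tags match the profile.
    return sorted(unseen, key=lambda p: -sum(interests[t] for t in pages[p]))

# A user who frequents flat earth pages is shown more conspiracy content first.
print(recommend(["Flat Earth Society"]))
# → ['Moon Landing Hoax', 'NASA Updates', 'Gardening Tips']
```

Nothing here requires the user to ask for conspiracy theories; ranking by overlap with past clicks is enough to surface them.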

Not only do algorithms allow you to confirm your opinions and biases, but they also dictate what you think. By showing you only things that relate to your existing opinions, they make it almost impossible to think critically and form new or different ones. It becomes increasingly less likely that you will see something you disagree with, because the more times you click on something that confirms your political opinions, the more Facebook or Google will direct you to things that support your thinking. As Tristan Harris, one of the stars of the documentary, said in an interview, “I think that this business model of doing whatever is best for engagement will always privilege giving each person their own reality.” 
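That feedback loop can be made concrete with a small simulation. Again, this is a toy model, not a real recommender: the feed shows each topic in proportion to past clicks, every view counts as another click, and the loop reinforces whatever got clicked early.

```python
import random

def simulate_feed(rounds=50, seed=0):
    """Toy engagement loop: clicking a topic makes it more likely to be shown."""
    random.seed(seed)
    # Start with no preference among three hypothetical viewpoints.
    clicks = {"left": 1, "right": 1, "neutral": 1}
    for _ in range(rounds):
        topics, weights = zip(*clicks.items())
        # The feed samples what to show in proportion to past engagement.
        shown = random.choices(topics, weights=weights)[0]
        clicks[shown] += 1  # every view reinforces that topic's weight
    return clicks

print(simulate_feed())
```

Because each round tilts the odds toward whatever was shown before, the counts tend to run away toward one topic over many rounds; the user never chose a bubble, the loop built one.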

While social media might not be the ultimate culprit — as the production of fake news and political polarization are not new — it allows corporations to profit off of fake news at a much larger scale. A 2018 MIT study looking specifically at Twitter showed that fake news travels six times faster than truthful news on that platform. Another study from researchers at Stanford and NYU showed that about 40 percent of visits to 65 fake news sites came from social media, compared to around 10 percent of visits to 690 top US news sites. This means that fake news creates more clicks, more attention and more money. So not only are we being catered to by algorithms that produce things related to our search history, but we are also more likely to see fake news. 

The algorithms themselves may not be programmed to direct one’s thoughts and political ideologies, but the information and fake news they distribute to large audiences can do exactly that. Through the investigation of Russian interference in the 2016 election, we can see how the distribution of fake news can polarize us and influence our voting decisions. A study done by professors at Ohio University showed that fake news surrounding Hillary Clinton most likely had a substantial impact on the voting decisions of a strategically important set of voters, specifically people who had voted for Obama. 

The spread of news and information through algorithms on social media influences how we think, act and engage as political animals. More specifically, it informs our political opinions and how we vote. Overloading people with fake information and justifying it with the right to “free speech” is as bad as burning books, withholding information or censorship. Our society believes it is one’s right to choose what information to take in, without understanding which information is fake and which is true. On both sides of our two-party political spectrum, we assume that the person on the other side is seeing exactly what we are seeing. In reality, we have all been forced into bubbles that confirm our political biases.