Not everyone knows how to handle political arguments that disrupt the peace at the dinner table. Beyond this everyday hostility, there is evidence suggesting that YouTube’s recommendation algorithm, which surfaces videos with politically similar opinions, is partly to blame for increasing political polarization, particularly between the American left and right. A study from U.C. Davis supports the argument that YouTube’s algorithm, a system that uses a person’s search history to recommend similar videos and maintain viewer engagement, is a likely culprit for the gap in understanding between supporters of both parties.
Through what the U.C. Davis researchers termed the “loop effect,” the study describes how the YouTube algorithm recommends more political videos that can guide the viewer not only deeper into politics, but deeper into extremism. Despite counterarguments against the YouTube algorithm’s connection to increased political polarization, there is stronger evidence that the algorithm’s functions do contribute to this polarization.
After all, when a person opens YouTube, their homepage is already littered with content that supports their existing perspectives and beliefs.
When a person sees a homepage that only supports their political beliefs, they often come to believe their political stance is correct. However, the more political data they provide to YouTube’s algorithm, the greater their risk of being exposed to radicalized political content, which can carry negative consequences for both their physical and mental health.
For instance, a 2021 study from the Pew Research Center reported that one in five Americans who experienced online harassment, including actions that escalated to threats of physical harm or sexual harassment, attributed it to expressing their political views, regardless of their political stance. Given this evidence, politically motivated online harassment is a specific form of cyberbullying that can become a stressor contributing to increased anxiety or depression.
Additionally, a 2022 study indexed by the National Library of Medicine concludes that people whose political preferences deviated from those of the average voter also faced an increased risk of declining physical health, a risk further worsened by the YouTube algorithm’s “loop effect.” If left unchecked, the evidence suggests that the YouTube algorithm’s political recommendations pose dangers not only through the spread of radicalized information but also through poorer physical and mental health.
With that in mind, the only way a person will be exposed to different political perspectives is if they seek that information out for themselves.
The best way to be educated in politics, or any topic, is to consider all possible viewpoints. And the best way to do so is to pause one’s YouTube watch history and take the reins of how one gains information, primarily through fact-checking and embracing politically different but reliable perspectives.