Algorithmic Influence: How Social Media Feeds Turn Us Into Passive Consumers

“Filterworld,” a new book by Kyle Chayka, investigates the enormous impact that algorithmic recommendations have on user experience across the ever-evolving landscape of social media platforms.

Chayka examines how the algorithms powering Facebook, Twitter, and Instagram have shifted feeds from chronological to curated. In an interview with NPR, he argues that these algorithms shape what people consume, from music and news to food and other interests.

Chayka argues that the pervasiveness of algorithmic curation has turned consumers into passive recipients, “flattening” their preferences. Users are not the only ones affected: content creators, particularly in industries such as music, feel obliged to shape their work around the algorithms. For musicians on platforms such as Spotify or TikTok, crafting an attention-grabbing hook at the start of a song has become increasingly important for maximizing engagement.
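
The shift from chronological to engagement-driven ordering that Chayka describes can be illustrated with a minimal sketch. Everything below is hypothetical, including the post data, the predicted_engagement scores, and the function names; it is not any platform's actual ranking code, only a demonstration of how sorting by predicted engagement produces a different feed than sorting by recency.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    predicted_engagement: float  # hypothetical model score in [0, 1]

def chronological_feed(posts):
    """Older-style feed: newest posts first, no ranking model involved."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def curated_feed(posts):
    """Engagement-driven feed: posts the model expects users to interact with rise to the top."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

now = datetime.now()
posts = [
    Post("friend", "long essay on a local issue", now - timedelta(minutes=5), 0.12),
    Post("creator", "15-second clip with a catchy hook", now - timedelta(hours=6), 0.91),
    Post("news outlet", "policy explainer", now - timedelta(hours=1), 0.34),
]

print([p.author for p in chronological_feed(posts)])  # ['friend', 'news outlet', 'creator']
print([p.author for p in curated_feed(posts)])        # ['creator', 'news outlet', 'friend']
```

The same three posts yield two very different feeds: recency rewards whoever posted last, while engagement ranking rewards whatever the model predicts will hold attention, which is the incentive musicians and other creators now feel pressed to optimize for.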

Although algorithms are already overwhelmingly pervasive, Chayka believes tighter regulation of social media companies could help lessen their influence. One proposal is to spin off platforms such as Instagram or WhatsApp from Meta, Facebook's parent company, creating a more competitive environment that would give consumers a wider range of options to improve their experience.

Algorithm is King: Social Media Feeds

Reflecting on how the internet works, Chayka observes that while it enables mass distribution, creators remain subject to the influence of algorithmic ecosystems. Today, traditional gatekeepers such as magazine editors or record executives are frequently replaced by digital engagement metrics as the measure of success.

Chayka, a staff writer for The New Yorker, also expresses concern about the lack of insight into the algorithms other people see, which can create a sense of isolation within a digital experience that appears communal. Because users cannot tell what others are being shown, it becomes harder to build shared cultural experiences.

He also points out that algorithms encourage passive consumption, which limits critical engagement with culture. He worries that the pursuit of easily digestible content could lead people to neglect difficult masterpieces that do not grab them immediately.

Chayka's experiment highlights the delicate balance between convenience and depth in a world where algorithmic curation dominates cultural consumption. He said he was concerned about the “passivity of consumption that we’ve been pushed into, the ways that we’re encouraged not to think about the culture we’re consuming, to not go deeper and not follow our own inclinations.”

An article in the Harvard Business Review, on the other hand, points to a significant constraint on these algorithms: they depend on user actions, primarily clicks, views, and purchases. Learning from such “revealed preferences,” the article argues, is insufficient and often misleading, because it fails to capture users' true goals and values.

This gap between revealed preferences and “normative preferences” creates obstacles that prevent algorithms from delivering on their promise to improve lives across sectors such as social media curation, healthcare resource allocation, and insurance pricing.
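
The HBR critique can be sketched in a few lines, using invented data: a recommender trained only on revealed preferences (what the user clicked) keeps promoting whatever was clicked most, even when that diverges from what the user says they actually want. The topics, log, and function names below are assumptions made for illustration.

```python
from collections import Counter

# Hypothetical interaction log: what one user actually clicked (revealed preferences).
click_log = [
    "celebrity gossip", "celebrity gossip", "outrage clip",
    "celebrity gossip", "long-form science piece",
]

# What the same user reports wanting more of (normative preferences).
stated_interests = {"long-form science piece", "local news"}

def rank_topics_by_clicks(log):
    """Naive revealed-preference ranking: most-clicked topics first."""
    return [topic for topic, _ in Counter(log).most_common()]

recommended = rank_topics_by_clicks(click_log)
print(recommended)
# ['celebrity gossip', 'outrage clip', 'long-form science piece']

# The gap the HBR article describes: top recommendations vs. stated goals.
print([t for t in recommended[:2] if t not in stated_interests])
# ['celebrity gossip', 'outrage clip']
```

Nothing in rank_topics_by_clicks ever consults stated_interests; that missing signal is the gap between revealed and normative preferences that the article warns about.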


Government Looking Into Harmful Impacts

Researchers have long expressed concern about the algorithms that govern popular social media sites such as Facebook and Instagram and shape what appears in users' feeds.

Frances Haugen, a former Facebook employee turned whistleblower, brought additional attention to these algorithms in 2021. Haugen disclosed internal documents to lawmakers and the public and testified before Congress, stating that Facebook's algorithm was exposing teenagers to more content about anorexia and stoking ethnic strife in places such as Ethiopia.

In direct response to these disclosures, lawmakers such as Senators Amy Klobuchar (Democrat of Minnesota) and Cynthia Lummis (Republican of Wyoming) have sponsored bills intended to study these algorithms or place restrictions on them. According to The New York Times, none of these measures has so far become law.
