Beware online "filter bubbles" | Eli Pariser

Understanding the Relevance of Information in the Digital Age

The Importance of Relevance

  • Mark Zuckerberg highlighted that personal relevance can overshadow global issues, stating, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa."
  • This raises questions about how a web based on relevance might function and its implications for society.

Personal Experience with Information Flow

  • Pariser reflects on his upbringing in rural Maine, where the Internet symbolized connection and democratic potential; he notes a troubling shift in how information now flows online.
  • He observed this change firsthand on his own Facebook page, where conservative voices had quietly vanished from his feed even though he deliberately followed a range of political perspectives.

Algorithmic Editing and Its Consequences

  • Facebook's algorithm tailored the feed to each user's engagement (which links they actually clicked), silently excluding certain viewpoints without the user's knowledge or consent. The practice is not unique to Facebook; Google personalizes results in a similar way.
  • Because the algorithms weigh numerous signals (device, browser, location, search history), two users can receive very different results for the same query; there is no longer a single, standard set of search results. For example, two friends searching "Egypt" saw starkly different front pages regarding current events.
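The engagement-driven editing described above can be sketched in a few lines. This is a hypothetical toy model, not Facebook's or Google's actual algorithm, and the names and click rates below are invented for illustration: stories are ranked by how often this user has clicked each source, and only the top few survive.

```python
# Toy sketch of engagement-based feed personalization (illustrative only).
# Sources the user rarely clicks quietly disappear from the feed, with no
# explicit opt-out by the user.

def personalized_feed(stories, engagement, top_n=3):
    """Rank (source, headline) pairs by the user's past click rate per source."""
    ranked = sorted(stories, key=lambda s: engagement.get(s[0], 0.0), reverse=True)
    return ranked[:top_n]

stories = [
    ("progressive_blog", "Budget analysis"),
    ("conservative_blog", "Budget analysis"),
    ("local_interest", "Squirrel in a front yard"),
    ("world_news", "Protests in Egypt"),
]

# Hypothetical click rates: this user clicks progressive links far more often.
engagement = {
    "progressive_blog": 0.9,
    "local_interest": 0.8,
    "world_news": 0.3,
    "conservative_blog": 0.05,
}

feed = personalized_feed(stories, engagement)
# "conservative_blog" never makes the cut, yet the user never chose to hide it.
```

The point of the sketch is Pariser's: the exclusion is a side effect of optimizing for clicks, invisible to the user, rather than an editorial decision anyone made.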

The Rise of Personalization Across Platforms

  • Major news platforms like Yahoo News and Huffington Post are also adopting personalized content delivery, which can leave users seeing only what matches their preferences rather than what is important. Eric Schmidt predicted that it will become very hard for people to watch or consume content that has not in some sense been tailored for them.

The Concept of Filter Bubbles

  • The term "filter bubble" describes a personalized universe of information shaped by algorithms that filter out content based on individual behavior without transparency or user control over what gets excluded. Users remain unaware of the breadth of information omitted from their view.

Implications for Content Consumption

  • Research on Netflix queues revealed that viewers are often torn between aspirational choices (e.g., critically acclaimed films they queue "for later") and impulsive selections (e.g., familiar comedies they watch immediately), suggesting that click-driven filters skew our media consumption toward the impulsive side if left unbalanced.
  • A well-rounded information diet should include both comfortable and challenging content; reliance solely on click-based algorithms risks an environment of "information junk food" rather than meaningful insights.
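One hedged way to picture the "balanced diet" idea is a scoring rule that blends click-based relevance with an editorial importance score, so challenging stories are not always crowded out. This is a toy formula invented for illustration, not any platform's actual ranking:

```python
# Toy scoring rule: weight=1.0 reproduces a pure click-based filter;
# lower weights let important-but-rarely-clicked stories surface.

def balanced_score(click_prob, importance, weight=0.5):
    """Blend predicted click probability with an editorial importance score."""
    return weight * click_prob + (1 - weight) * importance

comfort_story = balanced_score(click_prob=0.9, importance=0.2)    # ~0.55
challenge_story = balanced_score(click_prob=0.3, importance=0.9)  # ~0.60
# Under a pure click filter (weight=1.0) the comfortable story wins (0.9 vs 0.3);
# with the blended score, the important story edges ahead.
```

The single `weight` parameter stands in for the editorial judgment Pariser argues the new gatekeepers must embed in their code.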

Rethinking Gatekeeping in Information Distribution

The Role of Responsibility in Digital Media

Historical Context of Media Responsibility

  • The speaker notes that society has faced this challenge before: in 1915, newspapers were not yet grappling seriously with their civic responsibilities.
  • It was recognized that a functioning democracy relies on citizens receiving accurate information, positioning newspapers as essential filters for news.
  • The development of journalistic ethics emerged from this realization, which helped navigate the complexities of information dissemination throughout the last century.

Current Challenges and Responsibilities

  • The speaker draws parallels between the current state of the web and the media landscape of 1915, suggesting we are at a critical juncture again.
  • Emphasis is placed on the need for new gatekeepers—like those from major tech companies—to incorporate civic responsibility into their algorithms and coding practices.

Transparency and Control in Algorithms

  • Pariser calls on the new gatekeepers for two things: transparency, so users can see the rules that determine what passes through their filters, and control, so they can decide for themselves what gets filtered out.
Source

  • Channel: TED
  • Video description: http://www.ted.com As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: we get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy. Community Q&A with Eli (featuring 10 ways to turn off the filter bubble): http://on.ted.com/PariserQA