New Senate Bill Aims To Rupture Facebook’s Algorithm

If you use social media, you’re fully aware of the influence that big tech algorithms have over your life. Algorithms used prominently by Facebook, Instagram, and others serve recommended content to users by leveraging our personal data. In these cases, Facebook wins, as does its long list of high-paying advertisers. But does the consumer win? A new Senate bill, The Filter Bubble Transparency Act, seeks to put a stop to the egregious deployment of these algorithms.
If passed, The Filter Bubble Transparency Act could dramatically change the course of social media for both consumers and advertisers. The bill would force social media companies to disclose their use of algorithms and to offer consumers an algorithm-free option.
The bill has bipartisan support. The bill’s primary sponsor, Republican Senator John Thune, says the bill would help consumers gain back control and choice. Here’s a specific example, taken from The Filter Bubble Transparency Act’s language, of how it might work if passed.
[The platform] provides notice to users of the platform that the platform uses an opaque algorithm that makes inferences based on user specific data to select the content the user sees. Such notice shall be presented in a clear, conspicuous manner on the platform whenever the user interacts with an opaque algorithm for the first time, and may be a one-time notice that can be dismissed by the user.
[The platform] makes available a version of the platform that uses an input-transparent algorithm and enables users to easily switch between the version of the platform that uses an opaque algorithm and the version of the platform that uses the input-transparent algorithm by selecting a prominently placed icon, which shall be displayed wherever the user interacts with an opaque algorithm.
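To make that distinction concrete, here is a minimal sketch, in Python, of what the bill’s “opaque” versus “input-transparent” feeds might look like in practice. The data fields, the engagement score used as a stand-in for inferred signals, and all function names are hypothetical assumptions for illustration, not language from the bill or any platform’s actual code.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable

@dataclass
class Post:
    author: str
    created_at: datetime
    text: str
    engagement_score: float  # stand-in for inferred, user-specific signals

@dataclass
class User:
    follows: set[str]
    has_seen_opaque_notice: bool = False

def opaque_feed(user: User, posts: list[Post]) -> list[Post]:
    # Personalized ranking driven by inferred signals; under the bill,
    # this is the "opaque algorithm" that requires a one-time notice.
    candidates = [p for p in posts if p.author in user.follows]
    return sorted(candidates, key=lambda p: p.engagement_score, reverse=True)

def input_transparent_feed(user: User, posts: list[Post]) -> list[Post]:
    # Ranking based only on data the user deliberately supplied
    # (who they follow), shown in reverse-chronological order.
    candidates = [p for p in posts if p.author in user.follows]
    return sorted(candidates, key=lambda p: p.created_at, reverse=True)

def serve_feed(user: User, posts: list[Post], use_opaque: bool) -> list[Post]:
    # The prominently placed toggle the bill calls for: the user picks
    # which version of the platform they interact with.
    if use_opaque and not user.has_seen_opaque_notice:
        print("Notice: this feed is selected by an opaque algorithm "
              "that makes inferences from your personal data.")
        user.has_seen_opaque_notice = True
    ranker: Callable[[User, list[Post]], list[Post]] = (
        opaque_feed if use_opaque else input_transparent_feed
    )
    return ranker(user, posts)
```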
In terms of which companies would take the brunt of the bill, Facebook stands high among its peers. Google consistently denies that it does much personalization, though critics have made the case otherwise. Facebook, however, operates almost entirely through an algorithmic filter. Even if you “like” a Page, you aren’t likely to keep seeing that Page’s content for long. In 2014, Facebook began shifting away from a more open network toward a platform that decides which content, from friends or Pages, a person should see. In 2018, another algorithm change added personalization features that further limited people’s exposure to certain content. Facebook Page administrators currently report that fewer than 2 percent of their followers see their content.
Many Facebook Page owners lashed out at the company back in 2014, pointing out that they had paid for campaigns to promote their Pages. Facebook pulled the rug out from under them, cutting their Pages’ reach to the very followers those paid campaigns had brought in. This created a cycle of continued advertising by Page owners: if you aren’t paying, people aren’t seeing your content, at least on Facebook.
Facebook claims that personalization is good for the consumer. But if The Filter Bubble Transparency Act passes, I have a hunch that most people will choose the non-algorithm option and view the content from the Pages and people they chose to follow.
The Filter Bubble Transparency Act – Not Just Facebook
Overall, the bill would be far-reaching.
It would apply to “any public-facing website, internet application, or mobile application, including a social network site, video sharing service, search engine, or content aggregation service.”
The jury remains out on how much of an effect it might have on Google. However, many Republicans contend that Google’s search results filter out or bury conservative opinion content, or front-load far-left content. Facebook has faced similar criticism, and has even tweaked its algorithm in response.
Democrats, of course, believe that nefarious characters utilize Facebook to influence elections.
So while the bill is bipartisan, it likely faces some hangups down the road.
Upworthy Founder Eli Pariser Leads Charge Against Big Tech
Pariser has long contended that “personalized content” is bad for our society. In a TED Talk, Pariser said that filters are an epidemic “sweeping the web.” In fact, Pariser is credited with coining the phrase “filter bubble.”
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out. So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So “Iron Man” zips right out, and “Waiting for Superman” can wait for a really long time.
Big tech faces many challenges from the government. Beyond the potential passage of this bill, Presidential candidate Elizabeth Warren and Mark Zuckerberg appear to be embroiled in a dispute over big tech’s position in society. And it isn’t just Zuckerberg. Some believe Bill Gates recently implied that he’d vote for Trump if the choice came down to Trump and Warren.
One thing is for certain: big tech issues will test all of the 2020 candidates.
Author: Jim Satney
PrepForThat’s Editor and lead writer for political, survival, and weather categories.