Facebook’s Algorithm Shapes Our Lives. This Hacker Wants to Find Out How.

Claudio Agosti wants to know what Facebook does with him. The programmer has developed a browser extension that collects data donations from users. He wants to decipher why we only get to see very specific political news – and what Facebook is hiding from us in their News Feed.

Claudio Agosti in Berlin CC-BY-SA 4.0 Cellarpaper / Editing: netzpolitik.org

Claudio Agosti is laughing. The hacker is giving a talk about how Facebook works at the art festival Transmediale in Berlin. He talks about how the social network weaves its algorithms in order to attract people into their web like spiders do with flies. In his melodic Italian accent, Agosti asks the crowd who would want to make a bet on whether Facebook treats their users in a fair manner.

Agosti straightaway gives the answer himself: “The truth is: nobody wins. In an oppressive system you are just subject to the decisions of someone else”. He grins mischievously. A game of algorithms, their power over the minds of their billions of users – it’s all a sinister joke to Agosti.

Claudio Agosti, 39 years old, bald and sturdy, has spent half his life exploring the impact of modern technology on us. He comes from near Milan and lives in Berlin. Whether you call him a hacker, a privacy activist or a critical researcher does not really matter. Agosti probably knows more about the way Facebook’s algorithms work than anyone who hasn’t worked on them personally.

It has been 10 years since Agosti first wondered how algorithms impact our lives. Back then, he noticed that Google’s search results had become more and more personalized. Where results had once been the same for everyone, he observed a filter bubble developing that he fears keeps us ever more encapsulated in the algorithms’ world.

“Algorithms decide for you what is important”, he says. This is what bothers Agosti, the self-taught programmer who is used to mastering technology. “To be free, an individual should have full control over this logic.” A simple, yet radical thought.

Since 2016, the year that Donald Trump became president, Agosti has been working on what might be his most ambitious hack yet: to decipher Facebook’s news feed algorithm.

Facebook controls the selection of news in their News Feed. The algorithm decides whether a post is shown to the users or not. Its priority is to keep us on Facebook for as long as possible – while showing us advertising. How exactly they achieve this is the company’s best-kept business secret. Facebook earns billions with their News Feed.

Agosti wants to understand how the social network feeds its users with content. For instance, why Facebook shows some people more right-wing news and others more left-leaning ones.

The Italian wants to show the extent to which Facebook influences our thinking. An interesting and logical proposal a year after the scandal over Cambridge Analytica. But does it work?

Scrolling to the Solution

The Italian’s project is called facebook.tracking.exposed, abbreviated fbTrex. Thousands of Facebook users have already installed a browser extension developed by Agosti.

When a participant is scrolling through Facebook, the program, which runs unobtrusively in the web browser, collects public posts while ignoring private ones. Every time someone views their Facebook timeline, Agosti’s project gains more data.

The software tells its users how often posts by a particular page or group are displayed to them. This helps them measure their own filter bubble, says Agosti.

The browser extension forwards the data of public posts to the project. With this data, Agosti intends to learn the principles behind the personalisation of Facebook’s news feed. The algorithm’s behaviour is used to draw conclusions about its method of operation.
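The kind of aggregation this enables can be sketched in a few lines of Python. This is purely illustrative: the function name and the data format are assumptions for the sake of the example, not fbTrex’s actual code or schema.

```python
from collections import Counter

def exposure_shares(observed_posts):
    """Given a list of posts a user was actually shown, return the
    fraction of the timeline each public source accounted for.
    Each post is a dict with a 'source' key naming the page or group
    it came from (a hypothetical format, not the project's real one)."""
    counts = Counter(post["source"] for post in observed_posts)
    total = sum(counts.values())
    return {source: n / total for source, n in counts.items()}

# A toy timeline of four displayed posts:
timeline = [
    {"source": "Fox News"},
    {"source": "Fox News"},
    {"source": "netzpolitik.org"},
    {"source": "Fox News"},
]
print(exposure_shares(timeline))
# Fox News accounts for 3 of 4 displayed posts, i.e. a share of 0.75
```

Tallies like this, collected over many sessions and many users, are what let a participant see how strongly a handful of sources dominates their personal feed.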

The project, of course, has clear limitations. “It is not the goal to reverse engineer the algorithm”, Agosti said at the congress of the Chaos Computer Club in Leipzig, one of the biggest hacker conventions in Europe. It would be impossible to guess the complexity of its variables, he said. Rather, his aim was to collect enough data to demonstrate how the algorithm works to direct user attention following its own goals.

For instance, in the News Feed, posts alternate between videos and pictures with plain text. The rate at which content is mixed is quite stable among the observed users, Agosti points out. “Through signs like this we noticed that Facebook […] mixes components like primary colours to create a nuanced palette.”

From time to time, Facebook makes big changes to the algorithm. In January 2018, CEO Mark Zuckerberg announced that in the future they would prioritize “meaningful interactions” with friends and family over “public content”. Zuckerberg was responding to allegations that social media were propagators of fake news and disinformation.

Agosti states that he could clearly see the change of the algorithm in his data. Facebook had adjusted the rate of pictures to videos and text. He also noticed an increasing number of different sources in the News Feeds. “In the summer of 2018 I noticed that many users were offered a wider range of content in their Facebook timelines.”

A report by analytics firm NewsWhip gives insight into which sources users get to see: in the year after the algorithm change, the right-wing US channel Fox News became the publisher with the most engagement among users world-wide. The most popular topics included celebrity deaths, abortion and Donald Trump.

Does the algorithm reinforce such content? fbTrex aims to provide an answer to that question. Agosti says the project is much the same as others he did twenty years ago, when he first began to dissect networks. “Who does the data belong to, who is under the control of knowledge? It is always the same struggle, only with different faces.”

About Dungeons and Dragons

Agosti has been struggling for a long time, albeit not always for the same side. His skills are self-taught. “I abandoned university after a few months, that was in 1999”, the 39-year-old writes in an e-mail. “I was learning much more via IRC with my hacker crew, s0ftpr0ject.” Their syllabus: writing viruses and hacking networks.

In the late 1990s, Agosti becomes part of the Italian nerd scene. People there still know him as Vecna, the name of a villain in the role-playing game Dungeons & Dragons.

Then the college drop-out joins HackingTeam in 2006. Founded in 2003, the company sells surveillance software for law enforcement. Years later, leaked e-mails reveal that HackingTeam had worked for authoritarian regimes like Saudi Arabia and Sudan.

For HackingTeam Agosti breaks into the computer systems of private Italian clients to show their vulnerabilities. He performs penetration testing of networks and analyses systems. “This exposed me to a harsh reality”, he says.

His former bosses valued his competence. “Claudio, I have never underestimated your extraordinary technical abilities and the breadth of your skills!”, co-founder David Vincenzetti writes in a leaked e-mail.

Agosti says he never had any part in the filthy deals with authoritarian regimes. He had only known about the first surveillance software prototype for the Italian market, he explains. Nevertheless, he did have ethical concerns regarding his work, he says today. He maintains he felt unease about making money from the insecurity of others. Agosti leaves HackingTeam not long after joining.

When thousands of internal e-mails from HackingTeam reach journalists and Wikileaks in 2015, among them are numerous e-mails from Agosti. But that does not bother him. “This is a leak in the public interest”, he writes in a piece for Gizmodo. “I really feel that the personal and corporate damage is smaller than the improvement our society can gain from it.”

Hacktivism for Digital Rights

Since leaving HackingTeam, Agosti has tried to position himself more clearly. The hacker, who claims to have been an activist even before HackingTeam, helps found the Hermes Center in 2008, an association dedicated to digital human rights. There he works on GlobaLeaks, an open-source project for secure and anonymous communication with whistleblowers.

Users should be in control of the software that they spend many hours of their lives with, Agosti says. He thinks that kind of control should be a digital human right.

In addition, he is a member of Diem25, the political movement founded by the former Greek Finance Minister Yanis Varoufakis. He is co-author of Diem’s policy paper “Technological Sovereignty: Democratising Technology and Innovation”.

Even after years of work, Agosti can only make a rough guess at how Facebook designs its News Feed. Nonetheless, Agosti’s results can be astonishing. For example when it comes to how many posts Facebook is hiding from its users.

What Facebook Is Hiding from Us

Even when a user has hundreds of “friends” and is following dozens of pages, they only see a small portion of their contents. Agosti estimates that up to 80 per cent of everything his friends and followed pages produce never shows up in his News Feed.

A report by the World Wide Web Foundation, based on Agosti’s data, calls this “The Invisible Curation of Content”. In a series of experiments conducted by Agosti, certain news articles were never shown to users even though they were shared by sources followed by the subjects. This can have profound political implications, Agosti argues. In their experiment, the content Facebook hid included articles about #NiUnaMenos, a protest movement started in Argentina against femicide. Agosti wonders why Facebook buried them.
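The comparison behind such estimates can be sketched as a simple set operation: take what followed sources produced, intersect it with what was actually displayed, and the remainder is the hidden share. A minimal illustration, assuming hypothetical post identifiers (this is not the project’s real methodology or code):

```python
def hidden_share(produced_ids, displayed_ids):
    """Fraction of posts produced by followed sources that a user
    never saw. Both arguments are collections of post identifiers."""
    produced = set(produced_ids)
    shown = produced & set(displayed_ids)
    return 1 - len(shown) / len(produced)

# Toy numbers: followed pages produced posts 0..9, the user
# was shown only two of them.
print(hidden_share(range(10), [0, 1]))  # 0.8, i.e. 80 per cent hidden
```

With numbers in this range, the result matches the order of magnitude Agosti estimates for his own feed.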

Meanwhile, Facebook has announced a new option called “Why am I seeing this post?”. It is supposed to give users more control over what they see and why they are seeing it. The feature is meant to provide users with the specific reasons why a post pops up in their timeline.

However, Facebook still remains highly selective about its transparency. Lobbyists in Brussels recently admitted that some of the criteria will not be included in the upcoming feature, according to meeting notes kept by the European Commission.

For Agosti this is not enough. Transparency means showing all of the selection factors whilst also disclosing why a post is not being shown in the News Feed. What Facebook announced, he said, was bad. “The tool is not made to give you more control but to make users spend even more time on the platform.”

Next Stop: YouTube

His work on the Facebook news algorithm is only the beginning for Agosti. In his “tracking.exposed manifesto”, Agosti announces plans to expand his project to YouTube’s controversial recommendation engine. There, too, he wants to unveil how users are being tracked, profiled and influenced by algorithms.

Agosti has recently become affiliated with the University of Amsterdam’s Data Activism research project, which tracking.exposed lists as an academic partner. His projects also receive funds from the European Union.

Agosti wants to continue his work as long as Facebook does not disclose their algorithms. In his opinion, the public has a right to understand how algorithms shape us and our society. Agosti maintains that as long as we do not understand the mechanisms at work there can be no political solution to the problems they create.

For him it is a matter of survival: the freedom of humanity must be defended in a web dominated by algorithms.

A German version of this story was published here.
