The Digital Betrayal: How Cambridge Analytica Compromised User Privacy 

Platforms encourage extensive data sharing with third parties, while some users advocate for stronger privacy protection. When our data is collected without our knowledge, it can cause real harm. This article presents the Cambridge Analytica case, which showed how data can be used covertly for political and commercial ends.

“We violated the profiles of 50 million people. We used Facebook to harvest millions of profiles and build models to take advantage of what we knew about the users to target their inner demons. This was the ground on which the entire business was built.” (Christopher Wylie, former Cambridge Analytica employee)

What are we talking about?

The answer is: data-driven behavior change and the mechanics of user targeting. This topic concerns the use of data to influence and change the behavior of online users. The mechanism involves collecting large amounts of data on users, which can include web browsing activity and habits, interactions on social media, purchase data, demographic information, and usage data from other apps or devices. These data are analyzed to identify patterns and build highly precise psychological profiles of users. Once the profiles are produced, users are segmented into groups based on common characteristics, which makes it easier to target them with specific, personalized messages. Finally, persuasion techniques are applied to nudge users toward changing their behavior.
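To make the segmentation step more concrete, here is a minimal, purely illustrative Python sketch; it is not Cambridge Analytica's actual method. It assumes user attributes have already been encoded as numeric features, and it uses a standard clustering algorithm to split users into segments, each of which could then receive a different tailored message.

import numpy as np
from sklearn.cluster import KMeans

# Each row is one (hypothetical) user; the columns are numeric features
# derived from collected data, e.g. interest scores or activity levels.
user_features = np.array([
    [0.9, 0.1, 0.7],
    [0.8, 0.2, 0.6],
    [0.1, 0.9, 0.3],
    [0.2, 0.8, 0.4],
    [0.5, 0.5, 0.9],
    [0.4, 0.6, 0.8],
])

# Group users with similar profiles into segments.
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(user_features)

# Each segment can then be sent a different, personalized message.
messages = {0: "message A", 1: "message B", 2: "message C"}
for user_id, segment in enumerate(segments):
    print(f"user {user_id} -> segment {segment}: {messages[segment]}")

In a real targeting pipeline the features, the number of segments, and the models would be far richer, but the basic logic – profile, segment, tailor the message – is the same.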

This is what happened with Cambridge Analytica and Meta’s Facebook, where these techniques were used to influence voting decisions in U.S. politics.

A crucial aspect of user targeting is respect for privacy and ethical practices, which often get stretched thin in the digital world. In theory, companies should obtain informed consent from users for the collection and use of their data – the so-called consent to Terms & Conditions – be transparent about their practices, and protect user data from unauthorized access and breaches.

How did the data go from Facebook profiles to Cambridge Analytica?

This is one of the best-known scandals involving data misuse and violation of privacy, and it is important to remember cases like this because they shed light on a problem that is still very relevant today.

For Cambridge Analytica, the data was collected via an app called thisisyourdigitallife – built by the academic Aleksandr Kogan of Cambridge University – which offered users quizzes about their own digital life.

Therefore, it wasn’t Meta that sold the data: it was collected through an apparently harmless app.

Here we are talking about a free service that is in fact paid for with user data: once the user agrees to the Terms & Conditions, thisisyourdigitallife gains access to the e-mail address, gender, and other information contained in the corresponding Facebook profile. From the moment the user logs into the app via Facebook, it starts collecting data, which is immediately shared. At the same time, Cambridge Analytica can harvest all the data that Meta had ever collected about the user.

Between 270,000 and 300,000 people signed up for the app via Facebook login. Meta also allowed the app to gather data on each user’s friends network without those friends’ knowledge, and before the social media giant changed its rules, the app had saved all kinds of information on 50 million profiles.
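The short Python sketch below illustrates that fan-out effect. It is purely hypothetical: fetch_profile and fetch_friend_ids stand in for the platform calls the app relied on and are not real Facebook API functions; the point is only to show how a few hundred thousand consenting users, each with hundreds of friends, can yield tens of millions of profiles.

def fetch_profile(user_id):
    # Placeholder: in the real scenario this returned e-mail, gender, likes, etc.
    return {"id": user_id, "likes": [], "demographics": {}}

def fetch_friend_ids(user_id):
    # Placeholder: under the platform rules of the time, an app could list a
    # consenting user's friends and read parts of their profiles as well.
    return []

def harvest(consenting_user_ids):
    profiles = {}
    for user_id in consenting_user_ids:
        # The quiz takers who actually agreed to the Terms & Conditions.
        profiles[user_id] = fetch_profile(user_id)
        # Their friends, who never used the app and were never asked.
        for friend_id in fetch_friend_ids(user_id):
            profiles.setdefault(friend_id, fetch_profile(friend_id))
    return profiles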

Theoretically, there was no data breach: Kogan asked for, and obtained, access to the data of the people who chose to sign up for his app, and everyone involved gave their consent. People therefore gave up their information willingly: no system was broken into and no sensitive information was hacked or stolen. Essentially, this case is about exploiting a loophole in Facebook’s platform to collect the data of users who – consciously or not – provided it.

Where’s the problem?

Kogan shared the collected data and information with Cambridge Analytica, violating Facebook’s Terms & Conditions, which prohibit app owners from sharing user data with third-party corporations.

So, the core issue is the resale of collected user data.

In 2015, Meta demanded that Kogan and Cambridge Analytica erase all the user data they had collected, trusting that they would comply. However, what Meta should have done was suspend their profiles and ensure the data was truly deleted from their databases.

Cambridge Analytica had close ties with some of Donald Trump’s key partners during the 2016 U.S. elections, and it’s believed that the harvested data was used to create voter profiles and influence voters. Additionally, there are suspicions that Cambridge Analytica assisted Russian propaganda efforts against Hillary Clinton and supported the Brexit campaign.

“Cambridge Analytica, Zuckerberg signs a letter of apology in British newspapers” (La Repubblica)

Here we see a rhetorical move: Zuckerberg does not admit that he was at fault for the sale and misuse of Facebook users’ data; instead, he claims that he was misled.

The major problem that Meta continues to face is the failure to ensure, in a concrete way, that user data cannot be used without authorization. There is a clear disregard for privacy.

Meta’s position on this issue is further complicated by the fact that, for its own marketing purposes, it employs a system for collecting and analyzing user data similar to the one Cambridge Analytica used. This is what allows Meta to target advertisements, and it also constitutes its primary source of income.

In fact, the product that Meta sells is user data, just like Cambridge Analytica. When we use Facebook, we create a digital version of ourselves that becomes attractive to third-party corporations, which are in a constant race to capture our attention through advertisements.

This is a systemic problem: it doesn’t concern just Facebook and Meta alone; it’s the entire ecosystem and business model that’s based on user data.

Katarina Kojic

Sources

Bareebe, R. (2022). The Cambridge Analytica Scandal and Its Impact on Meta. DOI: 10.13140/RG.2.2.19583.69285.

Benecchi, E. (2022). [PowerPoint slides from the fifth lecture of the course “Culture digitali”]. ICorsi3. https://www.icorsi.ch/.

Brown, A. (2020). “Should I Stay or Should I Leave?”: Exploring (Dis)continued Facebook Use After the Cambridge Analytica Scandal. Social Media + Society, 6(1). DOI: 10.1177/2056305120913884.

Cadwalladr, C. (2018, March 17). “I made Steve Bannon’s psychological warfare tool”: meet the data war whistleblower. The Guardian. https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump?CMP=twt_gu.

De Palma, G. (2018, March 23). Facebook e lo scandalo Cambridge Analytica, cosa sappiamo finora [Facebook and the Cambridge Analytica scandal: what we know so far]. Sky TG24. https://tg24.sky.it/mondo/2018/03/22/scandalo-facebook-ricostruzione.

Fink, D., & Jakee, K. (2024). Microtargeting Voters in the 2016 US Election: Was Cambridge Analytica Really Different? The Independent Review: A Journal of Political Economy.

Kranz, J. (2018, June 24). The Facebook/Cambridge Analytica data scandal, visually explained. Overthink Group. https://overthinkgroup.com/facebook-cambridge-analytica/.

La Repubblica. (2018, March 25). Cambridge Analytica, Zuckerberg firma una lettera di scuse sui giornali britannici [Cambridge Analytica, Zuckerberg signs a letter of apology in British newspapers]. https://www.repubblica.it/esteri/2018/03/25/foto/cambridge_analytica_zuckerberg_firma_un_annuncio_di_scuse_sui_giornali_britannici-192205512/1/.

Levi, O., Hamidian, S., & Hosseini, P. (2020). Automatically Identifying Political Ads on Facebook: Towards Understanding of Manipulation via User Targeting. Disinformation in Open Online Media (pp. 95-106). DOI: 10.1007/978-3-030-61841-4_7.

