Facebook is the largest social networking platform in the world, with 2.41 billion people actively using it. It also offers the largest advertising audience of any platform, including Google. More of us should take ownership of understanding how platforms such as Facebook and Google use consumer data for advertising, and how that use is shifting the dynamics of the advertising industry, because the stakes for our privacy are high.
Mark Zuckerberg recently testified before Congress for the first time since last year's Cambridge Analytica data scandal. Alongside questions about privacy, members of the House of Representatives pressed him on Libra, a blockchain digital currency that is expected to launch next year.
Many of the questions about Libra centered on its potential to become a monopoly. That concern is valid, and the development of Libra is worrisome.
Consider the possibility of a successful Libra launch. Facebook would gain even more personal user data, covering not just buying behavior but financial behavior as well. We already know that Facebook has mismanaged consumer data in the past, as the Cambridge Analytica scandal made clear.
Can we trust Facebook to get it right this time with a platform like Libra? Or will it once again mismanage the data, producing an even larger scandal given the sensitivity of financial information?
Aside from Libra, the upcoming presidential elections highlight another daunting danger for Facebook users. Thus far in 2019, candidates running for president have invested more than $48 million on Facebook advertising.
While President Donald Trump seemingly had the upper hand in the 2016 election when it came to knowledge of Facebook's advertising platform, the Democrats are heavily invested in 2020, with about $35 million in Facebook ads, more than 2½ times what the Republican Party has spent to date.
Facebook's Ads Manager is a self-serve advertising platform that anyone can use. The idea that Facebook's audience, the consumer, is also its product became more apparent following last year's Cambridge Analytica scandal.
Essentially, Facebook's business model relies on users sharing personal information so that advertisers can target them, something consumers remain largely unaware of, according to a recent Pew study. Somewhere between our pictures of newborn babies and waterfall-filled vacation spots lies a low-cost, highly targeted advertising opportunity for anyone willing to buy the spot. Often subconsciously, these messages leave users with psychological priming effects, potentially changing their mood and behavior.
Artificial intelligence and machine learning also play a role, enabling optimized targeting, language and imagery. The very same ad that one person sees may have 20 other variations, each designed to appeal to a different individual based on the machine's collection of personal information.
Earlier this year, after paying a $5 billion penalty, Facebook finalized an agreement with the Federal Trade Commission to guide accountability throughout the company, including how it develops the Ads Manager software. Facebook has now published a list of actions it has taken to "give people a voice on our platform while still keeping our community safe." It's about time Facebook said that, considering it has been on a rapid growth trajectory since its launch almost 15 years ago.
Ultimately, change is a burden upon all of us, not just lawmakers and businesses. Understanding a platform's privacy terms is one challenge. Regulation such as the General Data Protection Regulation (GDPR) has encouraged businesses to use easy-to-understand language in their privacy terms. However, there's still a long way to go until society grasps the implications of technology use, especially on social platforms.
So, what can we do? People could stop using the platform, but that's unlikely. Even in light of the #DeleteFacebook movement, Facebook has seen little threat to its company value or its overall user base. Instead, users can be more conscious of what they share and how they use platforms like Facebook. By understanding the power and value of their data, people can change their behavior and better protect their personal information online.
Laura Bright is an associate professor of media analytics in the Moody College of Communication at The University of Texas at Austin.
Kristen Sussman is a Ph.D. candidate in the Moody College of Communication at The University of Texas at Austin.
A version of this op-ed appeared in the Houston Chronicle.