Winner of the Writing Competition: "Privacy in the Eyes of the Data Broker" by Jasmine Bae
Jasmine Bae is a sophomore from UC Berkeley and is the winner of the 2020 Writing Competition for the 46th International Conference, which focused on themes related to The Decisive Decade. Below is her article on how consumers are realizing the issues with businesses being in charge of their personal and private data.
Privacy is in the Eyes of the Beholder, or Rather, the Data Broker
As our lives become increasingly digitized in the 21st century, our personal choices, habits, and mannerisms have become intertwined with our online identities. Companies across all industries are increasingly driven to collect and manage data on their clients (and potential clients) in order to compete and strategize effectively in the new global economy. At the same time, consumers are not only more informed about this phenomenon but are also recognizing the problems with businesses being in charge of their personal and private data.
In fact, Pew Research found in 2019 that 79% of Americans are “very or somewhat concerned” about how companies use their data, and that 70% of Americans believe their data is much less secure than it was five years ago. Only very recently, prompted by new regulations from major governing bodies, have businesses begun to treat privacy as an ethical and social responsibility. With the ongoing coronavirus pandemic, the introduction of contact-tracing applications and the spread of misinformation have only heightened fears of data collection and the continued divergence of a public “digital personality” from a private “real-world identity.”
In early July 2020, Apple’s iOS 14 beta program released new privacy features that alert iPhone users when downloaded applications access phone data, most infamously clipboard data. As a result of the new update, beta users found at least 54 popular apps that read clipboard data with every keystroke, including LinkedIn, Reddit, and TikTok (all three of which have since been patched). Other infamous privacy violations include 2018’s Facebook-Cambridge Analytica scandal, which revealed that “psychographic” data on citizens from around the world could be used to sway election campaigns in many countries. The U.S. Securities and Exchange Commission (SEC) subsequently investigated Facebook and other tech giants for making “misleading disclosures regarding its handling of user data.” In addition, with stay-at-home orders forcing many organizations to operate remotely through video conferencing, privacy concerns have flourished once more: the popular conferencing software Zoom had to rewrite its privacy policy to reflect changes regarding the release of customer data and video transcripts to third-party advertisers for targeted marketing purposes.
With mass data collection programs run by big businesses throughout the world, it can be unclear why so many industries have bought into the data analytics realm. Data collected through applications, services, terms-of-service agreements, and other business operations is sold to data brokers: companies that categorize users and other clients for purposes such as targeted marketing and individual consumer credit scoring. Revenue from the global big data market (which includes collection, brokerage, and analytics) exceeds $50 billion as of 2020 and is projected to reach $100 billion by 2027. Across industries, big data has given consumers more personalized services, marketing strategies, and targeted offers, but incidents of unethical and illegal misuse of personal data have made the public wary of business self-regulation and of promises to keep identities and online behaviors private.
Pressure from the public and government officials over privacy concerns has made way for monumental legal change. Recent enforcement of the United States’ 1998 Children’s Online Privacy Protection Act (COPPA) forced YouTube, through a $170 million settlement, to change its kids’ content system so that data cannot be collected from children under the age of 13. In 2018, the European Union’s General Data Protection Regulation (GDPR) became a historic first in regulating personal data online. The legal framework requires businesses to honor E.U. citizens’ rights to erase their personal data online and to opt out of certain data collection programs. Similarly, in January 2020 California began enforcing a new privacy law, the California Consumer Privacy Act (CCPA), which allows residents of the state to request any data that businesses have collected about their online identity and behaviors, as well as any predictions made about them.
Responses to the GDPR have pushed most websites to restructure how browser cookies are accepted, most noticeably by offering users opt-in and opt-out choices. However, some websites restrict user access unless visitors accept some, or all, cookies, a loophole that lets companies use their clients’ data much as before. With the implementation of the CCPA, many tech companies, ranging from Amazon and Apple to General Motors and Target, allow users to request the data collected under their name, account, or identity. While this is a good first step toward transparency in personal data collection, most companies restrict such requests to California residents, showing that regulators and privacy-conscious people have a long road ahead before opaque data analytics practices become widely and publicly visible.
It is clear that most businesses have realized that public and legal eyes have turned toward transparency and away from potentially unsafe and unethical privacy practices in the upcoming decade. As consumers become more knowledgeable about how online services and data collection and analysis systems operate, industries must shift their norms to respect new values of digital privacy. Whether businesses clearly explain their data privacy practices to all consumers, provide more methods of data control, find new avenues for targeted advertising that do not violate privacy, or develop a new protocol supported by the masses, public opinion favors openness over secrecy. Only time will tell whether these changes will be sparked by more scandals that reshape modern perceptions of big data, by public lobbying and pressure, by government regulation, or by companies’ own moral choices. One thing is certain, however: until an acceptable level of transparency is reached between businesses and their clients, consumers will readily continue to uncover more about the use of their personal data, for better or for worse.