
Introducing Fairness to the Data Marketplace: Privacy Regulation & Consumer Empowerment

July 2, 2019

Author:

Charlie White

As the use of personal electronics has grown, tech companies have monetized the data they collect from individuals using their platforms, selling it into data markets that provide consumers’ personal information to businesses, helping them make more informed decisions at great profit. With access to information like Google Search histories, online purchases, and Fitbit steps, marketers can target their online advertisements more finely, banks are better informed about who they are lending money to, and insurers can base their rates more accurately on the behaviors of their customers.

The data driving these informed decisions has revolutionized the economy in the last decade, but it has also created a deeply troubling market structure in which individuals have essentially no knowledge or control over the personal data that companies are using and sharing.

The data marketplace has incentivized companies to deceptively intrude on personal privacy to collect and process personal data so that businesses can know more about the lives of individuals. A conflict has emerged between unchecked data collection and sharing for companies on one hand and consumer autonomy and privacy for individuals on the other. The boundaries and standards that governments deem appropriate in this conflict will therefore have an enormous influence on how society looks in the future.

The first major government policy response to this dilemma came in April 2016, when the European Union (EU) Parliament approved and adopted the General Data Protection Regulation (GDPR), which became enforceable on May 25, 2018, to regulate the data handling practices of any company operating in the EU. The GDPR enumerates the rights that individuals have over their data, establishes stricter regulatory oversight of data handling companies, and sets substantial fines for failing to comply. Companies have spent millions of dollars working towards GDPR compliance and will continue spending to fulfill the regulation’s administrative requirements. The costs that the GDPR imposes will likely disadvantage European companies, with potentially severe consequences for the EU economy.

The economic implications of regulation need to be considered and weighed against the overarching policy goals of protecting the data privacy and security of individuals. There are other possible models for strict privacy regulation besides the GDPR’s. Policies that establish consumer autonomy in the data market through increased transparency and control over data collection and sharing are the most effective solution to privacy and security concerns. This strategy is a more direct remedy to consumers’ concerns while better limiting compliance costs for companies. User autonomy in the data market would allow consumers to pursue their own privacy and security interests, which would impel companies to take these features seriously in order to earn consumers’ trust and use, with less costly regulatory burdens.

What does the GDPR require?

The GDPR pursues its principles of privacy and protection in essentially two ways: the first enumerates the rights that individuals have over their data, and the second establishes more protective oversight of data handling practices by authorities. Of the two, the rights provisions do more to empower individuals with increased leverage against companies, while the increased regulatory oversight mainly creates administrative requirements and demands bureaucratic cooperation from companies.

The GDPR applies not only to European companies but to any company operating within the EU and handling EU citizens’ data. This extraterritoriality is significant because many of the largest companies that the GDPR impacts most are based outside of Europe. Of the 20 largest internet companies worldwide, 12 are American and none are European; Google controls 86% of the EU search engine market, and 72% of EU cloud storage providers store their data in the US (Ciriani, 2015).

Failure to comply results in heavy-handed enforcement of up to 4% of a company’s global annual revenue: for Facebook, $1.6 billion, and for Google, $4.4 billion (Tiku, 2018). It is unusual for a regional law to have such large international implications, but in the relatively borderless world where the data market exists, any effective data protection regulation must continue to apply as citizens’ data leaves that region. Between the extensive new responsibilities for companies, the extraterritoriality of the law, and the heavy-handed enforcement it enables, the GDPR has substantial weight behind it to interrupt how the data economy currently operates.

A data subject’s rights fall into four categories: transparency, access, correction, and objection. Put together, these enable individuals to better know and control what is happening with their data. Digital rights are not absolute, meaning they have to be weighed against the claims and interests of companies. Nevertheless, empowering individuals with these means of autonomy is a significant first step toward humanizing technology and leveling the playing field between individuals and data handling companies.

Transparency

A data subject’s right to transparency is crucial in the interactions between an individual and a data holding company. For someone to have any meaningful control over their data, a prerequisite is that they understand what information has been collected on them and the rights they have in controlling that information.

To facilitate consumer understanding, the GDPR demands that communication from the company to the individual be in “concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed to a child” (Intersoft, Transparency, n.d.). With the countless data collectors and processors who interact in data markets, transparency with a data subject is likely much harder than the GDPR’s language makes it seem. Transparency is a critical principle to pursue, but it is also far from being realized. Without transparency and the awareness that comes with it, it would be much more difficult to exercise any of the following rights.

Access

Where personal data is being collected from a data subject, that individual is guaranteed the right to know the intentions and whereabouts of that data. Personal data means any data that can be used to identify a specific person, “such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person” (Intersoft, Definitions, n.d.).

When such data is collected, the individual has the right to learn the categories of personal data collected, the purposes for which that data will be used, its recipients, how long it is intended to be held, and their right to correct, erase, or restrict it (Intersoft, Rights, n.d.). Access to personal data is one step beyond transparency toward individual control. While the aim of transparency is to let subjects know of their rights, access is the first action a data subject takes to exercise their own autonomy within the data market. Once they have learned where they stand, individuals can pursue further control.

Correction

A data subject has the right to request that personal data about them be corrected, made complete, or erased. If data are inaccurate or incomplete, the company is unconditionally required to rectify its mistake (Intersoft, Right to Rectification, n.d.). Erasure, however, is more contested. A company must erase personal data upon a subject’s request for a handful of reasons, including if the data are no longer necessary for the purposes for which they were collected and processed, if consent is withdrawn and there is no other legal basis for holding them, or if the data were collected unlawfully (Intersoft, Right to Erasure, n.d.).

Upon a subject’s erasure request, the subject’s interests are balanced against counter-interests such as the information’s necessity for freedom of expression and speech, compliance with a contractual obligation, benefits to public health, or legitimate research purposes. If the subject’s request wins out after this balancing, a burden then falls on the relevant data collector to take reasonable steps to inform the other data handlers with which it has shared the personal data of the subject’s erasure request. Data erasure across the whole data marketplace is the much discussed “right to be forgotten.” It derives from the logic that erasure is essentially meaningless if the information has already been shared: if a subject has legitimate reasons for that information to be gone, it must be gone from the entire marketplace, not just the place it came from. Some experts doubt this is even possible, while more optimistic experts admit it will be very difficult for companies to accomplish. At intersections like this, between valid entitlements for data subjects and ambitious demands on data-holding companies, there is still much ground to cover to reconcile the divergent interests of industry and individuals.

Objection

Finally, data subjects have the right to object to the processing of their data on an individual basis. This applies if an individual feels that their particular circumstances should exclude them from certain data processing, at which point the company holding the data must stop until it proves that its interests override those of the individual (Intersoft, Right to Object, n.d.). A caveat for companies is that if an individual objects to their personal data being processed for targeted marketing purposes, including the building of subject profiles, the company must stop without the balancing of interests that usually takes place.

In addition to adhering to subjects’ rights, companies have many administrative obligations so that government authorities can maintain oversight of company data handling practices. Among these are the appointment of a qualified and independent Data Protection Officer, required data breach notifications within 72 hours, Data Protection Impact Assessments, and the recording of data processing activities (Intersoft Consulting, Chapter 4, n.d.). The essence of these provisions is increased communication and cooperation between industry and enforcement authorities. For instance, Data Protection Impact Assessments are reports sent to the supervisory authority on the protection and privacy implications of new data processing operations. Likewise, collectors send their data breach notifications to their supervisory authority with details on how many people are likely to be affected, the categories of personal data breached, the likely consequences of the breach, and the steps taken by the company to mitigate future mistakes.

These administrative obligations help ensure responsibility from data handlers, but they will also considerably increase the costs of operating within the EU. These increased costs have quantifiable, although uncertain, economic consequences for the EU. Predictions from the UK Ministry of Justice put the increase in annual administrative costs at around £190 million a year for all affected businesses in the UK (Proposal for EU DPR, 2012, p. 12). Its best estimates are that submitting a Data Protection Impact Assessment will cost around £27,000 for large companies and £11,200 for smaller ones, and that investigating and reporting a data breach will cost between £1,000 and £2,000; multiplied across every company affected by the GDPR (the listed predictions cover only British companies), these tasks add up (Proposal for EU DPR, 2012, p. 15-23). Because of the drastic fines for non-compliance, companies are likely to err on the side of caution and overspend to meet GDPR compliance. A survey from PricewaterhouseCoopers leading up to the GDPR revealed that, of the companies which had finished preparations, 88% spent at least $1 million and 40% spent at least $10 million (PricewaterhouseCoopers, 2017). When operating a business becomes more expensive in one region, that region’s economy suffers.

What are the economic implications of the GDPR?

Imposing costs on data handlers operating within the EU ripples through interconnected industries, harming the competitiveness of EU-based companies and the European economy. The consequences of these policies are severe enough to demand serious consideration of what ends GDPR compliance costs actually serve.

The EU service industry is expected to be particularly affected by the GDPR because of how heavily these businesses rely on data, which will become more expensive as data handling companies pass on their compliance costs (European Centre for International Political Economy, 2013). Service companies are those which exist by providing an intangible service to their customers. The range of business services is enormous, and the GDPR’s effect will vary depending on how important data is to providing a given service and how widely a business operates. To see this effect, consider a company like Allianz SE, a German-based multinational insurance company. It provides insurance as a service, which relies on making risk assessments about its customers. These predictive assessments are far better informed when the insurer has access to personal data about those customers.

Insurers buy the data guiding their decisions from controllers and processors like Google, but because gathering data in the EU and transferring it across borders is now more complicated and expensive, Google will likely charge more for it. Allianz pays the higher price but, as a result, is less competitive internationally than a similar insurer that did not have to. Service trading, especially between the US and EU countries, stagnates and, with it, the European economies that rely on it. In the EU, services are a vital and highly integrated sector of the economy. The EU accounts for 24% of world trade in services, making it the largest service trader in the world. The UN Conference on Trade and Development estimates that about half of service trade is enabled by the cross-border flow of data in the ICT sector. Half of the EU’s service trade adds up to about €465 billion (roughly $600 billion), equal to nearly six times the total EU export of cars (European Centre for Political Economy, 2013, p. 5-6). A regulation that increases data holding costs strikes at the competitiveness of EU service industries, with widespread and crippling economic consequences.

The US Chamber of Commerce published a study attempting to predict the economic costs that the GDPR will have on transatlantic trade. While the authors admit their predictions are uncertain because there are still many unknowns surrounding the GDPR, the predicted costs are nevertheless severe. By modeling how the estimated internal costs of GDPR compliance and fines restrict production and cross-border trade, the economists predict the trade impact across industry sectors. Their predicted consequence is a decrease in service exports from the EU of between 4.4% and 18.6%, which amounts to a 1.5%-3.9% drop in EU GDP, or up to $1,142 per European (European Centre for Political Economy, 2013, p. 20). The only other times EU GDP experienced equivalent drops were in 1975 and 2009, so if the US Chamber of Commerce is right in its predictions, the benefits of data protection and privacy come at a cost that should be weighed seriously. Again, however, nobody knows exactly what GDPR enforcement will demand. Many of the detrimental effects that the Chamber of Commerce calculated came from factoring “the right to be forgotten” into the equation. Although this right is included in the GDPR, there is not enough precedent to know how authorities are going to balance it against the interests of companies. As the severity of the estimated impacts shows, the authorities’ enforcement decisions are going to matter for the EU economy, and they should hopefully approach those decisions aware of the consequences.

Recently decreasing investment in EU tech companies provides some insight into the fear and uncertainty with which investors are approaching the GDPR. Researchers tracked venture investment statistics on US and EU tech companies before and after the rollout of the GDPR. Overall, they found that the number of venture investment deals in European-based tech companies decreased 17.6%, while the average amount per deal decreased 39.6% (Jia et al., 2018, p. 4). When the researchers categorized their findings by the age of the companies, they found that the effects were sharpest in young firms (0-3 years) because a business is more reliant on investment during these early and transitional years. By then calculating the investment funding that is now absent relative to the number of employees young firms typically have, the researchers roughly estimate between 3,604 and 29,189 jobs lost, or 4.09%-11.2% of the individuals employed by young firms (Jia et al., 2018, p. 19). Again, economic predictions should be taken with a grain of salt because they cannot account for the adaptive strategies businesses are likely to take. Also, measuring investment over such a short time frame captures only the initial, and likely panicked, response. As time passes there will be more certainty regarding the GDPR, and businesses will learn to adapt under it, likely steadying investment. Still, the research demonstrates the worry and uncertainty that people in the EU have towards the GDPR.

The economic consequences of the GDPR, however uncertain, stem from the threat of European companies being less competitive in a global market as a result of heightened compliance costs. The tradeoff of financial wellbeing for privacy is often overlooked but deserves more consideration from the public.

Maximizing subject control while minimizing cost

Innovative policy solutions oriented towards consumer autonomy rather than costly bureaucratic oversight would reduce companies’ administrative costs while remedying the power imbalance between subjects and data markets. That 86% of survey respondents wanted more online control and transparency reveals an internet landscape in which users are unable to interact with the internet while protecting their own interests (Data & Marketing Association, 2019). With meaningful subject autonomy in the market, industry competitors would improve their privacy and security standards if only to gain users’ trust and use. There is admittedly a long road ahead to realizing subject autonomy in the data market, but if privacy proponents and policymakers work towards innovative technical solutions, these problems are not insurmountable. Advances like machine learning have fueled corporate data collection and processing by helping to make sense of endless data; the same capabilities should be put to work improving the position and understanding of the consumer in the data marketplace.

Given how complex data collection and sharing have become, transparency with subjects is essentially impossible to achieve manually because there is so much information to grasp. Automating aspects of communication between companies and individuals would reduce the demand on individuals to understand what is happening with their data. Relying on individuals alone to understand privacy policies is simply not reasonable, even with the GDPR requirements for transparency (Litman-Navarro, 2019). Researchers estimate that reading every privacy policy we come across in a year would take roughly 30 eight-hour workdays (McDonald & Cranor, 2008). Relying on automation to simplify privacy policies for consumers could go a long way toward providing transparency. Technical solutions in this mold exist:

P3P 1.0 creates the framework for standardized, machine-readable privacy policies, and consumer products that read these policies. Web sites express their privacy policies in a simple standardized format that can be downloaded automatically and read by web browsers and other end-user software tools. These tools can display information about a site’s privacy policy to end users, and take actions based on a user’s preferences. Such tools might provide positive feedback to users when the sites they visit have privacy policies matching their preferences and provide warnings when a mismatch occurs. They may also notify users when a site’s privacy policy changes (W3C, n.d.).

Realizing user transparency is more achievable than we think with the help of automated solutions. Policymakers should consider building on tools like this to provide transparency to users, as the sketch below illustrates.
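As a rough illustration of the P3P idea, the following Python sketch compares a site’s machine-readable policy against a user’s stated preferences and prints a warning on any mismatch. The policy fields and preference names are invented for the example and are not actual P3P vocabulary; this is a minimal sketch of the matching step, not an implementation of the standard.

    # Minimal sketch of P3P-style policy matching (invented fields, not real P3P syntax).
    # A site publishes a machine-readable summary of its data practices; a user agent
    # compares it against the user's preferences and warns when they conflict.

    SITE_POLICY = {
        "purposes": {"analytics", "targeted_advertising"},     # what the data is used for
        "recipients": {"first_party", "third_party_brokers"},  # who receives it
        "retention_days": 365,                                  # how long it is kept
    }

    USER_PREFERENCES = {
        "blocked_purposes": {"targeted_advertising"},
        "blocked_recipients": {"third_party_brokers"},
        "max_retention_days": 90,
    }

    def check_policy(policy, prefs):
        """Return human-readable warnings wherever the policy conflicts with the preferences."""
        warnings = []
        for purpose in policy["purposes"] & prefs["blocked_purposes"]:
            warnings.append(f"site uses data for a blocked purpose: {purpose}")
        for recipient in policy["recipients"] & prefs["blocked_recipients"]:
            warnings.append(f"site shares data with a blocked recipient: {recipient}")
        if policy["retention_days"] > prefs["max_retention_days"]:
            warnings.append(
                f"site retains data for {policy['retention_days']} days, "
                f"longer than the user's limit of {prefs['max_retention_days']}"
            )
        return warnings

    for warning in check_policy(SITE_POLICY, USER_PREFERENCES):
        print("WARNING:", warning)

In a real deployment, the policy would be published by the site in a standardized format and fetched automatically by the browser, so the user never has to read the underlying legal text themselves.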

Other automated solutions such as Sticky Policies could similarly serve the ends of user control over their data. Sticky policies attach a data subject’s privacy preferences as machine-readable code to the data they produce, so that as their data travels across multiple parties, the control preferences they set travel with it (Pearson & Casassa-Mont, 2011). If someone does not want their data shared with political profiling companies, for example, they could say so from the beginning, and those firms would be blocked from seeing that individual’s shared online data. 93% of Americans believe it is important to control who can get information about them, yet the realities of the data market do not reflect these beliefs at all (Madden et al., 2016). Acxiom, one of the world’s largest data brokers, with claimed data on 700 million individuals, offers its clients the ability to “upload their consumer data…, combine it with data from more than 100 third-party data providers… and then utilize it on more than 500 marketing technology platforms” (Christl, 2017, p. 56). Users have their own data sharing preferences, and third-party access is viewed more negatively than most other forms of sharing (Spiekermann, 2015, p. 12). This data sharing happens almost completely removed from any consumer autonomy. Sticky Policies are a step towards guaranteeing that those who do get access respect the preferences of the data subject.
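To make the mechanism concrete, here is a simplified Python sketch of the sticky-policy idea: the subject’s preferences travel with the data record, and any party requesting access is checked against them first. The recipient and purpose names are hypothetical, and this is only an illustration; real sticky-policy systems as described by Pearson and Casassa-Mont rely on encryption and trusted authorities to enforce the policy rather than on recipients voluntarily checking it.

    from dataclasses import dataclass, field

    # Simplified sticky-policy sketch: the subject's preferences are attached to the
    # data record itself, and every party handling the record is evaluated against
    # them before the payload is released. Recipient and purpose names are hypothetical.

    @dataclass
    class StickyRecord:
        subject_id: str
        payload: dict                                   # the personal data itself
        allowed_purposes: set = field(default_factory=set)
        blocked_recipients: set = field(default_factory=set)

    class PolicyViolation(Exception):
        pass

    def request_access(record, recipient, purpose):
        """Release the payload only if the attached policy permits this recipient and purpose."""
        if recipient in record.blocked_recipients:
            raise PolicyViolation(f"{recipient} is blocked by the data subject")
        if purpose not in record.allowed_purposes:
            raise PolicyViolation(f"purpose '{purpose}' is not permitted by the data subject")
        return record.payload

    record = StickyRecord(
        subject_id="user-123",
        payload={"age": 34, "postcode": "10115"},
        allowed_purposes={"fraud_prevention"},
        blocked_recipients={"political_profiler_inc"},
    )

    # Permitted: a bank running a fraud check.
    print(request_access(record, "example_bank", "fraud_prevention"))

    # Denied: a political profiling firm, matching the example above.
    try:
        request_access(record, "political_profiler_inc", "voter_targeting")
    except PolicyViolation as err:
        print("Access denied:", err)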

Automated solutions like P3P and Sticky Policies would be an extraordinarily cost-efficient way to provide subject autonomy because they relieve an enormous amount of the administrative burden of compliance. Regulatory oversight could be limited to ensuring that company data collection and sharing practices are compatible with the required consumer-protection software. The costly human element of constant correspondence between industry and authorities would be less burdensome because regulators would not dictate most industry standards. Instead, companies would have more discretion over their own privacy and security standards, while consumers would have more transparency and control over the policies they encounter. Changing the structure of the data marketplace by granting autonomy to data subjects will have its own costs, for example backlash after irresponsible behavior or limits to data sharing. These costs, however, are more organic because they come from companies improving their own practices to gain consumer trust and use rather than from corresponding with regulators.

With increased autonomy for subjects in the data marketplace, data handling companies will have to compete for consumer preferences by fostering trust and acting responsibly. In many regards data privacy and protection are not too different from consumer protection standards in other industries. For example, before buying a car a consumer likely does some research on the safety ratings of cars they are considering. Because the US National Highway Traffic Safety Administration assigns and publishes vehicle safety ratings, American consumers can easily learn and consider safety in their car purchase decision. The result is that customers buy safer cars and car manufacturers develop safer cars to gain an edge in market competition. Applying this phenomenon to the data marketplace, similar results would take place if online users had comparable easy-to-understand access to the privacy and protection practices of the platforms they interact with.

Survey results reveal that nearly all users are not entirely averse to data sharing: 99.9% of respondents are willing to share their data with organizations if the benefits and terms serve their needs (Roeber, 2015, pg. 105). They indicated that their willingness to share depends on the industry sector and the data handling practices of the company they share with. If consumers are empowered to act on their preferences of privacy and data protection, this would likely drive the industry to develop more advanced standards on their own terms in order to earn subject trust.

A recent marketing campaign by Apple emphasizing its commitment to privacy indicates that responsible data handling can be leveraged as a competitive advantage, much like makers of safe cars market the safety of their vehicles (Apple, 2019). Again, the difference is that car buyers know about the products they are buying, whereas internet users have no idea what is happening with their data and therefore no meaningful choice. If lawmakers can successfully design regulations that lift individuals to equal standing in the data marketplace with the companies they provide data to, companies will likely have to develop more responsible practices for the sake of staying competitive.

Conclusion

It is critical that fundamental changes happen to the structure of the data market so that individuals have more autonomy in deciding who has access to what information. Government regulation has only begun to address this flaw. Enumerating individuals’ rights over the control of their data, as the GDPR does, is an encouraging first step. However, effective mechanisms for individuals to exercise those rights need continued development. With meaningful consumer autonomy, there would be less need for burdensome regulatory oversight. There is promise in automated consumer-protection solutions, and policymakers should be attentive to them. Consumers and policymakers cannot resign themselves to the feelings of powerlessness that often come with these technologies.

Bibliography

Apple. (2019, March 14). Privacy on iPhone – Private Side. Retrieved from https://www.youtube.com/watch?v=A_6uV9A12ok

Christl, W. (2017). Corporate Surveillance in Everyday Life: How Companies Collect, Combine, Trade, and Use Personal Data on Billions (Rep.). Vienna: Cracked Labs.

Christl, W., & Spiekermann, S. (2016). Networks of Control: A Report on Corporate Surveillance, Digital Tracking, Big Data, & Privacy (Rep.). Facultas Verlags.

Ciriani, S. (2015). The Economic Impact of the European Reform of Data Protection. Communications & Strategies, (97), 41-58. Available at SSRN: https://ssrn.com/abstract=2674010

Data & Marketing Association. (2019, May 21). Webinar: Privacy, Regulation, and You [Web log post]. Retrieved from https://dma.org.uk/webinar/privacy-regulation-and-you-3

European Centre for International Political Economy. (2013). The Economic Importance of Getting Big Data Right: Protecting Privacy, Transmitting Data, Moving Commerce (Rep.). U.S. Chamber of Commerce.

Intersoft Consulting. (n.d.). General Data Protection Regulation (GDPR) – Final text neatly arranged. Retrieved from https://gdpr-info.eu/


Jia, J., Jin, G. Z., & Wagman, L. (2018, November). The Short-Run Effects of GDPR on Technology Venture Investment (Rep.). Retrieved from https://www.nber.org/papers/w25248

Litman-Navarro, K. (2019, June 12). We Read 150 Privacy Policies. They Were an Incomprehensible Disaster. The New York Times. Retrieved from https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html

Madden, M., & Rainie, L. (2016, March 24). Americans’ Attitudes About Privacy, Security and Surveillance. Retrieved from https://www.pewinternet.org/2015/05/20/americans-attitudes-about-privacy-security-and-surveillance/

McDonald, A. M., & Cranor, L. F. (2008). The Cost of Reading Privacy Policies. I/S: A Journal of Law and Policy for the Information Society, 4(3), 543-568.

Pearson, S., & Casassa-Mont, M. (2011). Sticky Policies: An Approach for Managing Privacy across Multiple Parties. Computer, 44(9), 60-68.

PricewaterhouseCoopers. (2017, May 28). Pulse Survey: GDPR budgets top $10 million for 40% of surveyed companies. Retrieved from https://www.pwc.com/us/en/services/consulting/library/general-data-protection-regulation-gdpr-budgets.html

Proposal for an EU Data Protection Regulation – Impact Assessment (Rep.). (2012). UK: Ministry of Justice.

Roeber, B., Rehse, O., Knorrek, R., et al. (2015). Electronic Markets, 25, 95. https://doi.org/10.1007/s12525-015-0183-0

Ryan, J. (2018, January 16). Research result: What percentage will consent to tracking for… Retrieved from https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/

Spiekermann, S., Böhme, R., Acquisti, A., & Hui, K.-L. (2015). The Challenges of Personal Data Markets and Privacy. Electronic Markets, 25(2), 161-167.

Tiku, N. (2018, March 19). How Europe’s New Privacy Law Will Change the Web, and More. Wired. Retrieved from https://www.wired.com/story/europes-new-privacy-law-will-change-the-web-and-more/

W3C. (n.d.). An Introduction to P3P. Retrieved from https://www.w3.org/P3P/introduction.html

This publication was made possible in part by a grant from Carnegie Corporation of New York. The statements made and views expressed are solely the responsibility of the author.