Internet of Things and Privacy in Public

April 12, 2019


Victoria O'Laughlin

The increasing ubiquity of Internet of Things (IoT) devices has created new risks. IoT refers to everyday internet-connected objects, devices, and sensors that can collect, store, and transmit data with little human assistance (Rose et al., 2015). Public spaces are increasingly filled with IoT devices intended to enhance the quality and convenience of life (Tietz et al., 2018).

While the embedding of IoT devices in public infrastructure is meant to improve efficiency and the general ease of life, IoT comes with security and privacy concerns (Voas et al., 2018). In large part this is because these information-collecting technologies are being built into public spaces without comprehensive and standardized laws around implementation, maintenance, monitoring, and user rights – or, more specifically, privacy rights (“Quarterly Compliance Report – Information Compliance”, 2016).

Individuals cannot expect total privacy in public spaces for obvious reasons, but the information-collection capabilities of IoT in public spaces raise serious personal privacy concerns. For example, IoT can surveil and track individuals, it is difficult for individuals to opt out of having their personal information collected, and it is often unclear how collected information is being used or even sold (Rose et al., 2015). Worryingly, while the growth and implementation of IoT in public spaces rapidly continues, privacy regulation in most places remains unaddressed. By being in public, individuals choose to be seen and therefore give up some privacy, but most people do not expect systematic collection and recording of data about them in these areas.

The existing examples of IoT in public spaces highlight the fact that privacy regulations have not been updated sufficiently to address the complexities and implications of IoT data collection. While people are generally accepting of data collection as long as the data is unidentifiable and used to enhance the quality of a space (Emami-Naeini et al., 2017), seemingly unidentifiable information can become re-identifiable (“Halfway through Sidewalk Labs year of consultation, what do we really know about Quayside?,” 2018), and the combination of corporate urban planning and mass data collection can become much more personal than many presume.

There is also the concerning fact that because IoT technologies such as sensors and cameras are embedded in infrastructure and sometimes discreetly placed, obtaining individuals’ consent is difficult (Brar, 2018). Finally, the collection and transmission of data involves numerous parties and cloud providers — and any promises of data anonymity by a particular IoT vendor often do not extend to these third parties. As IoT devices continue to be adopted within public spaces such as streets, parks, and plazas (Benfield, 2013), there are still no privacy rights around the collection of personal private information.

Smart Cities and Privacy Issues

A “smart city” is a city where IoT devices are embedded in urban spaces to maximize productivity. Sidewalk Labs, a subsidiary of Alphabet (Google’s parent company), is behind Quayside, Toronto’s proposed smart city. The futuristic, innovative project intends to test how IoT technologies, digitized infrastructure, and data collection can be used to make a city as energy- and cost-efficient as possible (Bliss, 2018a). Toronto’s smart neighborhood joins a list of cities around the world with smart city projects in progress. The cities include, but are not limited to, London, Singapore, Seoul, New York, Helsinki, Montreal, Boston, Melbourne, Barcelona, Shanghai, San Francisco, Vienna, Amsterdam, Shenzhen, Stockholm, Taipei, Chicago, Seattle, Hong Kong, Charlotte, Vancouver, Washington DC, New Delhi, Copenhagen, Columbus, Los Angeles, Surat, Tokyo, Berlin, Beijing, Sydney, Ahmedabad, Bhubaneswar, Jaipur, Atlanta, Pune, Wellington, Kansas City, Toronto, Dubai, Dublin, Tel Aviv, Philadelphia, Reykjavik, Lyon, Paris, Jakarta, Rio de Janeiro, Phuket, and Kigali (“Top 50 Smart City Governments”, n.d.).

A USENIX study, which surveyed over 1,000 individuals, concluded that most people find it acceptable and would consent to having their data collected in public spaces, as long as it is used to enhance the quality of a space (Emami-Naeini et al., 2017). Sidewalk Labs’ data-collecting sensors aim to do exactly that by alleviating road congestion, creating inexpensive housing, and reducing unnecessary emissions (Bliss, 2018a). However, the effort has invited criticism from privacy advocates and experts such as Bianca Wylie, who stress that because Alphabet is a company, not a government, decisions are made behind closed doors without consideration for citizens’ rights or voices, undermining democracy and sidelining the creation of privacy rights (Bliss, 2018b). Maintaining and updating privacy rights as IoT grows in public spaces is crucial to protect citizens’ sensitive information.

Like any corporation, Google must comply with existing law, but regulations are not yet specific enough to address the quickly developing, complex capabilities of IoT. Corporate urban planning with integrated IoT has created worrisome privacy issues for those whose information is collected by Sidewalk Labs’ digital infrastructure.

When data is not clearly identifiable to an individual, big data collection seems harmless. However, corporate data collection can become much deeper and more personal than many may imagine. For example, Canadian privacy statutes require corporations to seek the consent of individuals and to be clear about how their personal information is collected and used (“International Comparative Legal Guide,” 2018). While Sidewalk Labs claims it will comply (“Halfway through Sidewalk Labs year of consultation, what do we really know about Quayside?,” 2018), the digital city does not provide an opt-out option for data collection – and many individuals do not understand the privacy implications and vulnerabilities of having their personal information collected and stored. This leaves citizens with no choice but to trust Google when it says that their collected information will remain secure. Additionally, laws on data collection in public spaces do not mention ownership (Dawson, 2018), which leaves the decision of how to store collected citizen data up to Google instead of the government.

The company’s unique solution is to store any collected, unidentifiable information in a publicly available, non-proprietary data trust, allowing anyone to access and review the smart city’s data collection (Bliss, 2018). In a data policy draft, Sidewalk Labs assured that the data and information collected would not be sold to third parties or used for advertising (“Halfway through Sidewalk Labs year of consultation, what do we really know about Quayside?,” 2018). However, the data trust could be seen as a public asset, vulnerable to hackers or other companies with the capabilities, tools, and incentives to access, re-identify, or sell the data stored in the trust (Dawson, 2018).

Much of the data to be collected in Sidewalk Labs’ project would not be directly attributable to individuals, and Sidewalk Labs has demonstrated a commitment to removing any and all personally identifiable information from the data it collects (“’Not good enough’: Toronto privacy expert resigns from Sidewalk Labs over data concerns,” 2018). However, Andrew Clement, a University of Toronto professor and founder of the university’s Identity, Privacy and Security Institute, highlighted a serious risk of collecting seemingly unidentifiable data. Clement says that the collected information “will at least in its origin be linkable to individuals” (“Halfway through Sidewalk Labs year of consultation, what do we really know about Quayside?,” 2018). He is referring to the self-driving cars, bicycles, and digitized traffic monitors of Sidewalk Labs, which grant Alphabet access to individuals’ unique movement paths. By analyzing repeated transportation patterns, these technologies can help pinpoint where someone resides or works (Kofman, 2019). Additionally, Clement predicts that personal user devices would connect to Sidewalk Labs’ wireless internet networks, video cameras will be abundant, and the numerous sensors could, both individually and collectively, help track individuals’ behavioral or transportation patterns (“Halfway through Sidewalk Labs year of consultation, what do we really know about Quayside?,” 2018).
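Clement’s point can be illustrated with a toy sketch: even when names and device IDs are stripped, the most frequent locations in a movement trace typically reveal a person’s home and workplace. The function, grid-cell labels, and hour cutoffs below are purely hypothetical illustrations, not anything Sidewalk Labs has described.

```python
from collections import Counter

def infer_anchor_points(trace, night_hours=range(0, 6), day_hours=range(9, 17)):
    """Guess likely home/work locations from an 'anonymized' movement trace.

    trace: list of (hour_of_day, grid_cell) sightings with no name or
    device ID attached. The trace itself acts as an identifier: the
    most-visited overnight cell is usually home, and the most-visited
    working-hours cell is usually a workplace.
    """
    night = Counter(cell for hour, cell in trace if hour in night_hours)
    work = Counter(cell for hour, cell in trace if hour in day_hours)
    home_cell = night.most_common(1)[0][0] if night else None
    work_cell = work.most_common(1)[0][0] if work else None
    return home_cell, work_cell

# Hypothetical week of sensor sightings for one unnamed device.
trace = [(2, "cell-17"), (3, "cell-17"), (4, "cell-17"),
         (10, "cell-42"), (11, "cell-42"), (14, "cell-42"),
         (19, "cell-88")]
print(infer_anchor_points(trace))  # ('cell-17', 'cell-42')
```

Once home and work cells are known, linking the “anonymous” trace back to a named resident of those locations is often straightforward, which is exactly the re-identification risk Clement describes.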

Regardless of Sidewalk Labs’ intentions and efforts to erase all personally identifiable information, the ability to track individuals makes re-identification possible. Ann Cavoukian, former privacy commissioner of Ontario, resigned from her role as a privacy advisor for Sidewalk Labs because she did not have faith in the company’s proposed data use guidelines (Bliss, 2018b). The highly regarded privacy expert feared that the potential for re-identification of user data would invite other companies or hackers to access and exploit the data gathered by Sidewalk Labs (Bliss, 2018b). People can be put in danger if actors with malicious intentions choose to re-identify victims and track their movements. This introduces vulnerabilities in the collection of individuals’ data regardless of whether the data is clearly identifiable to begin with. Cavoukian also expressed concern about the privacy implications of third parties involved with Sidewalk Labs. While Sidewalk Labs’ privacy framework guarantees privacy, third parties would not have to adhere to the framework (O’Kane et al., 2019). This issue, in addition to the possibility of re-identification of collected information, ultimately undermines the legitimacy of any privacy initiative by Sidewalk Labs.

Another example of an IoT privacy issue can be seen in the developing smart city of Kansas City. IoT devices that lack visibility, such as discreet cameras, mean individuals don’t know their data is being collected (Voas et al., 2018). Video cameras embedded in Kansas City’s street lights help identify parking availability in designated areas (Grote, 2016). The ultimate goal is to create a parking app for citizens, and although the systems engineer assured that the footage would not be stored but instead overwritten every 36 hours (Grote, 2016), the interconnectedness of IoT complicates security. For instance, moving data from the video cameras to an app requires the data – even if only temporarily held – to be transmitted through a cloud provider and apps such as Waze (Brar, 2017).
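The 36-hour overwrite described above amounts to simple time-based retention. A minimal sketch of such a policy might look like the following; the function and data names are illustrative assumptions, not Kansas City’s actual system.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(hours=36)  # the stated overwrite window

def purge_expired(frames, now):
    """Drop camera frames older than the retention window.

    frames: list of (captured_at, payload) tuples. Returns only the
    frames still inside the 36-hour window; everything older is
    discarded, approximating an 'overwrite, never archive' policy.
    """
    cutoff = now - RETENTION
    return [(ts, data) for ts, data in frames if ts >= cutoff]

now = datetime(2019, 4, 12, 12, 0)
frames = [
    (now - timedelta(hours=40), "frame-a"),  # past the window: purged
    (now - timedelta(hours=10), "frame-b"),  # still retained
]
print([data for _, data in purge_expired(frames, now)])  # ['frame-b']
```

As the paragraph notes, though, purging data locally does not help once copies have already been transmitted to a cloud provider or a third-party app.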

In a similar example, now-common video doorbells also entail unexpected privacy problems. While the homeowner may consent to being recorded in front of their door, a visitor, whose data may end up stored in the cloud, has no opportunity to consent (Brar, 2018). This is a problem because footage of citizens could remain permanently stored somewhere, or at least vulnerable to hacking, without individuals’ consent. Standards and procedures for connecting sensors to cloud providers are still missing (Brar, 2017), leaving citizens without privacy rights over the data collected about them in public spaces daily.

The digital infrastructure in Sidewalk Labs’ Quayside and in Kansas City does not give individuals the option to avoid tracking, which leaves anyone vulnerable to having their information collected without clear consent. By simply existing within the boundaries of these smart cities, individuals compromise their privacy without necessarily realizing it (Bliss, 2018a). If the blueprint becomes reality and is successfully implemented, Sidewalk Labs is likely to be emulated in other urban settings internationally. Meanwhile, few regulations and standards around privacy rights have been created during the growth of Kansas City’s smart city (Brar, 2017). If privacy rights in these data-gathering public spaces are overlooked, serious privacy, safety, and legal issues will manifest.

Existing Privacy Regulation

There have been attempts to tackle the growing issue of minimal privacy regulation in the digital world – but they are distributed across different national jurisdictions. Differing national contexts and laws challenge the attainability of cross-border privacy regulation, since many IoT vendors are transnational. The Organization for Economic Co-operation and Development (OECD) recognizes this issue, asserting in its guidelines that “member countries have a common interest in protecting privacy and individual liberties” (OECD, 2013) despite countries having different policies. The OECD has been a forum for discussing international regulations, in which data privacy has been a priority (OECD, 2013).

Existing privacy regulation can be seen in the recent European Union (EU) General Data Protection Regulation (GDPR). Aside from GDPR-compliant EU countries, other nations, such as South Korea and the United States, have demonstrated strong initiatives to develop and strengthen privacy regulations.

The EU’s General Data Protection Regulation

One major step is the European Union (EU) General Data Protection Regulation (GDPR), which provides regulations for data privacy and protection (GAO, 2017). In addition to imposing high financial penalties for violating the GDPR’s standards for obtaining users’ consent and upholding their rights, initiatives such as the H2020 Privacy Flag are in place to fill the legal gaps between the EU and external systems (GAO, 2017).

However, enforcement of the GDPR becomes complicated when dealing with IoT data collection in public spaces (Brar, 2018). Since IoT is everywhere and, as mentioned above, often discreetly placed, it is difficult for corporations to obtain consent from individuals. Unlike cell phones, through which corporations can directly ask for data collection consent, IoT devices don’t provide an interface for obtaining consent (Brar, 2018). Moreover, data processing through a cloud provider, as in Kansas City’s effort to connect video cameras to a traffic app, further complicates keeping track of user data as information travels through a supply chain and multiple parties (Brar, 2018).

To address the complicated compliance issues of IoT data collection, the GDPR created a more specific process called the Data Protection Impact Assessment (DPIA) (“DPIA: Data Protection Impact Assessments under the GDPR – a guide,” 2017). The DPIA is a mandatory assessment intended for large-scale data collection, such as that of IoT in public spaces, to prevent potential privacy issues that could emerge from data processing (Vegh, 2017). The assessment must be completed proactively, before data collection and processing occur (Vegh, 2017), and one of the four required DPIA aspects under Article 35 of the GDPR states that the DPIA must contain “an assessment of the risks to the rights and freedoms of data subjects” (“DPIA: Data Protection Impact Assessments under the GDPR – a guide,” 2017). While Article 35 specifies that the DPIA must provide measures for risk mitigation, the acknowledgement of how user rights are put at risk and made vulnerable by IoT in public spaces shows that privacy cannot be guaranteed amid the growth of new IoT technologies. This again raises the issue of corporations, rather than governments, building smart cities: ambitious business-oriented blueprints become much more complex when the GDPR and DPIA must be applied to IoT infrastructure.

South Korea

Another approach to regulating IoT can be seen in South Korea. The country’s Telecommunications Strategy Council provides oversight for IoT regulations while collaborating with other organizations to ensure the regulations’ quality and relevance (GAO, 2017). South Korea has clearly identified the necessity of regularly updating and refining laws around IoT privacy: as technology rapidly advances, so must the rules that maintain IoT security and, therefore, individuals’ privacy.

Emerging US Regulation

In the US, California’s Consumer Privacy Act (CCPA), which takes effect in January 2020, is an initial effort to establish privacy best practices relevant to IoT. The CCPA aims to increase users’ awareness of how businesses use their personal information, at least keeping people informed about how their data is used in public spaces (Smith, 2019).

While the CCPA doesn’t address IoT specifically, California’s SB-327, also known as the Security of Connected Devices Act, does. This law requires IoT devices to be equipped with security features at the manufacturing stage, demonstrating vendor and manufacturer mindfulness of user privacy from the beginning. Specifically, SB-327 will “increase oversight on IoT security” (Smith, 2019), taking elements from the CCPA and pointing them more directly at IoT. While these laws are a good place to start, neither the CCPA nor SB-327 is specific enough about users’ rights in relation to IoT, but both encourage the public to begin understanding the implications of their rights amid the IoT that increasingly surrounds us.

In Washington State, S.B. 5376, the Washington Privacy Act, was introduced at the beginning of 2019 and shares fundamentals with the EU GDPR. Applying only to companies that handle the data of tens of thousands of customers, the bill grants those consumers the right to know what personal information companies are collecting and whether it is being sold (Nickelsburg, 2019). The bill requires these companies to be fully transparent with users about their data collection (Nickelsburg, 2019) – for example, by disclosing the logic and significance of personal data collected in public spaces from cameras or transportation tracking.

The detailed guidelines of the GDPR can serve as a basis for creating broader global privacy rights. American companies with global offices already comply with the GDPR and will likely comply with US state laws such as the proposed Washington Privacy Act (Nickelsburg, 2019), since those laws incorporate GDPR concepts and definitions (The Washington Privacy Act FAQs, 2019). These companies plan to extend GDPR regulations and protections to their American clients (The Washington Privacy Act FAQs, 2019).

In Washington State specifically, tech giants Amazon and Microsoft will face stricter regulation of technologies such as facial recognition, a major analyzer of biometric data (Nickelsburg, 2019). Frameworks geared towards corporations can also help alleviate privacy concerns. For example, the Cloud Security Alliance (CSA) recently created the CSA IoT Controls Framework, the first set of guidelines and context companies can refer to when building IoT systems to support digital risk management (Cloud Security Alliance, 2019). Corporations may encourage the development and implementation of such frameworks to ensure the security of user data while protecting their goal of profiting from IoT networks in public spaces.

Next Steps

Increasing transparency and accountability between the corporations that design and implement IoT in public spaces and the individuals who reside in or visit these environments is key. The unknown is what frightens most people, and transparency about how an individual’s data and information is used will build trust while also safeguarding privacy to at least some extent (Emami-Naeini et al., 2017).

Beyond transparency, proactively articulating what individuals’ privacy rights should be as IoT grows in public spaces, across national contexts, will help mitigate concerns. Regulation could require organizations to designate teams responsible for managing the privacy aspects of IoT (Smith, 2019). Every IoT organization should have an enforced set of guidelines that clarifies who manages the data, what the data will be used for, and whether it will be sold to third parties.

Better yet, governments should be wary of allowing corporations to build smart cities, and should stress detailed regulation of IoT more than ever. New IoT technologies are constantly emerging, and both governments and corporations lack regulation for them. Since data collection in public spaces directly affects individuals, perhaps it is time for more public involvement – to preserve privacy, democracy, and safety. If individuals learn the serious implications of IoT data collection conducted without privacy rights, and express their concerns, more specific privacy rights could emerge.


Article 35 | GDPR. (2017). Retrieved April 5, 2019, from General Data Protection Regulation (GDPR) website:

Benfield, K. (2013). The Important Difference Between a Public Space and a “Common.” Retrieved April 2, 2019, from CityLab website:

Bliss, L. (2018a, September 7). Behind the Backlash Over Sidewalk Labs’ Smart City. Retrieved March 22, 2019, from CityLab website:

Bliss, L. (2018b, December 21). Meet the Jane Jacobs of the 21st Century. Retrieved April 4, 2019, from CityLab website:

Brar, A. (2017, October 31). How mobile operators can bring the smart city to life – IoT Agenda. Retrieved April 5, 2019, from

Brar, A. (2018, May 21). What does the GDPR mean for IoT? – IoT Agenda. Retrieved April 5, 2019, from

Cloud Security Alliance Debuts Internet of Things (IoT) Controls Framework. (2019, March 4). Retrieved March 21, 2019, from Cloud Security Alliance website:

Dawson, A. H. (2018, October 15). An Update on Data Governance for Sidewalk Toronto. Retrieved April 4, 2019, from Sidewalk Talk website:

DPIA: Data Protection Impact Assessments under the GDPR – a guide. (2017). Retrieved April 5, 2019, from i-SCOOP website:

Emami-Naeini, P., Bhagavatula, S., Habib, H., Degeling, M., Bauer, L., Cranor, L. F., & Sadeh, N. (2017). Privacy Expectations and Preferences in an IoT World. 15.

Grote, D. (2016, May 4). KC gets “smart” lights along streetcar line. Retrieved April 5, 2019, from Kansas City Business Journal website:

Halfway through Sidewalk Labs year of consultation, what do we really know about Quayside? | CBC News. (2018, May 2). Retrieved April 4, 2019, from CBC website:

International Comparative Legal Guide: data protection. (2018). Place of publication not identified: GLOBAL LEGAL Group LTD.

Kofman, A. (2019, January 28). Google’s Sidewalk Labs Plans to Package and Sell Location Data on Millions of Cellphones. Retrieved April 4, 2019, from The Intercept website:

Nickelsburg, M. (2019, January 22). Washington state considers new privacy law to regulate data collection and facial recognition tech. Retrieved March 23, 2019, from GeekWire website:

“Not good enough”: Toronto privacy expert resigns from Sidewalk Labs over data concerns | CBC News. (2018). Retrieved April 4, 2019, from CBC website:

OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data – OECD. (2013). Retrieved March 23, 2019, from

U.S. Government Accountability Office. (2017). Internet of Things: Status and implications of an increasingly connected world (GAO-17-75). Retrieved from

O’Kane, J., McMahon, T., Bozikovic, A., & Hauen, J. (2019, February 21). Sidewalk Labs’s vision and your data privacy: A guide to the saga on Toronto’s waterfront. Retrieved from

Quarterly Compliance Report – Information Compliance. (n.d.). Retrieved from

Smith, E. (2019, March 14). If Left to Our Own Devices… What the New CCPA Regulations Mean to Risk Management | Shared Assessments. Retrieved March 21, 2019, from

Thakkar, D. (2018, May 7). U.S. States Enact Biometric Information Privacy Act. Retrieved March 23, 2019, from Bayometric website:

The Washington Privacy Act. (n.d.). Retrieved from

Tietz, C., Steinmetz, C., Rahmat, H., Bishop, K., Corkery, L., Park, M., … Thompson, S. (2018, April 9). Sensors in public spaces can help create cities that are both smart and sociable. Retrieved March 21, 2019, from The Conversation website:

Top 50 Smart City Governments. (n.d.). Retrieved April 12, 2019, from Top 50 Smart City Governments website:

Vegh, L. (2017, September 1). DPIA – What it is, When is it Needed and Why. Retrieved April 5, 2019, from EU GDPR Compliant website:

Voas, J., Kuhn, R., Laplante, P., & Applebaum, S. (2018). Internet of Things (IoT) Trust Concerns (Draft) (pp. 50–50). Retrieved from National Institute of Standards and Technology website:

This publication was made possible in part by a grant from Carnegie Corporation of New York. The statements made and views expressed are solely the responsibility of the author.