Perspective
Privacy and ethics under the gaze of eyes in the sky
Data protection, and by extension privacy, as a universal value may be attainable, not merely aspirational
|
Conceptions of privacy under the gaze of earth observation (EO) and satellite technology are now tested by the full array of other means of capturing data, from unmanned aerial vehicles through to in situ fixed CCTV cameras and sensors in indoor lighting systems. This statement is not intended to spread fear and foreboding that our privacy is being incessantly invaded by all and sundry. Whilst privacy is a fundamental human right recognised internationally, this ‘right’ has been poorly understood and the very term privacy itself cannot be precisely defined.1
Typologies of privacy
Any analysis of privacy could begin by examining different conceptual frameworks and typologies. Privacy can also be looked at from a legal point of view, from aspects of the common law in addition to civil law rights and protections. From a jurisprudential point of view, legal frameworks and theories add a further complexity to our understanding of privacy obligations. These obligations include trespass and property law; confidentiality and data protection; environmental law; and tort liability and nuisance.
Privacy comprises a series of interlocking themes: bodily privacy, concerning invasive procedures on one’s body or other covert forms; informational privacy, regarding data about the person and the privacy of secure and confidential communications; and locational privacy, in terms of intrusions into one’s domestic abode, workplace and shopping journeys.
The typology may be summarised as who we are, what we know, what we say and where we are, encompassing all thematic, temporal and spatial aspects of our existence. It also suggests a continuum of zones of privacy, of personal and public spaces, both around us and above us, that change over time and are relative to wealth, culture and ideology.
Privacy as a prism
The multidimensional prismatic nature of EO and other invasive technologies, together with the complexity of the privacy question, has been a huge impediment to the development of an appropriate and holistic privacy protection program. For instance, there is no active “personal” jurisdiction in this regard, nor is there a truly international geospatial jurisprudence for privacy rights and obligations in so far as EO is concerned.
Earth observation satellite cameras and sensors capture everything in their wake as the extraterrestrial vehicles circle the globe. Advertently or otherwise, all activities on the ground are captured on a regular and cyclical basis. When analysed, such images may show single scenes or scenes over time. When mixed in with other spatial and thematic information, and increasingly ‘big data’, the images are truly a rich trove of data.
Technology has now been developed whereby in situ CCTVs are able to capture and recognise people’s faces. In these instances, the worry at the back of many people’s minds is the loss of “privacy”, however one conceives of it. In the past, by contrast, to invade one’s privacy was to physically cross social and property boundaries.
Lemma 1: At what point does the aggregation of spatially enabled data encroach on our privacy interests and who decides this?
The inference is that the conflicts and uncertainty are made more complex when one attempts to separate the identification of the individual from the identification of private property. Identification, per se, may not be an issue, but identification coupled with extra information may raise serious privacy concerns.
An image of a person is of little value on its own, but when it is linked to places, everyday patterns and other contextual information, the picture may tell more than is apparent. This linked information network presents privacy issues in an age of automated data processing. The “new” problem for privacy in its wider information technology context is the reduced ability of an individual to control one’s own information, and it is this aspect of personal privacy that is placed at risk by high-powered computer analytics.
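The risk of such linkage can be pictured with a toy sketch (all datasets, fields and names below are invented for illustration, not drawn from any real system): an “anonymous” movement trace, joined with a public register on ordinary quasi-identifiers, may single out an individual.

```python
# Toy illustration of a linkage attack: neither dataset alone names a person,
# but joining them on quasi-identifiers (home area, workplace area) can.
# All data here are invented for illustration.

# "Anonymised" movement trace derived from imagery/sensors: no name attached.
trace = {"home_cell": "SE1-42", "work_cell": "EC2-07", "gym_cell": "SE1-40"}

# Public or purchasable register: names linked to coarse locations.
register = [
    {"name": "A. Sharma", "home_cell": "SE1-42", "work_cell": "EC2-07"},
    {"name": "B. Ngata",  "home_cell": "SE1-42", "work_cell": "WC1-03"},
    {"name": "C. Olsen",  "home_cell": "NW3-11", "work_cell": "EC2-07"},
]

# Join on the pair (home, work): each extra attribute shrinks the candidate set.
candidates = [
    r["name"] for r in register
    if r["home_cell"] == trace["home_cell"]
    and r["work_cell"] == trace["work_cell"]
]
print(candidates)  # a single match re-identifies the "anonymous" trace
```

The point is not the code but the arithmetic of aggregation: each innocuous attribute narrows the candidate set until only one person remains.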
But the privacy question is also relative in temporal terms. For instance, there may be inter-generational differences in what and how people reveal information about themselves. Gen-Y children may be quite happy to share their lives through new social media such as Twitter and Facebook, in contrast to the “baby boomers” of the sixties, who might be more reticent and “conservative”. Social networks now challenge the very fabric of what is “private” and what is secret.
Business model
The business model of social media providers such as Google, Instagram and Facebook is that users are asked to give up some personal information in exchange for free use of software and services. This model has worked very well, as witnessed by the success of these platforms, the numbers of users and the monetization of the data. The data harvested from users of these services has spawned a whole new industry in which third-party buyers and sellers of data use it for their marketing and advertising campaigns. It seems that the business model is a win-win for everyone involved.
Lemma 2: The one question that arises from such activity is how much information can anyone afford to give away, which, once released, is irreversible.2
This business model has seemingly been successful in the liberal, democratic market economies of the developed West. However, newly developed economies in the East have used such a model in their governance with arguably greater impact. The Aadhaar biometric identification system in India and the social credit system in China have taken the business model a few innovative steps further.3
The Aadhaar system is a data-driven model of governance in which citizens are given digital identities to enable them to access government services such as welfare benefits.4 The Aadhaar or unique identity number (UID) project has made it mandatory for all citizens to obtain a UID since 2012. The system has been lauded by major news outlets such as the BBC for giving the poor a presence and the ability to access welfare benefits.5 Access to social services, banking and food subsidies has now become possible for the masses, as the national online system can verify UIDs anywhere, anytime and in seconds.
Whilst there are challenges to the UID process from possible privacy and human rights violations, such as the risk of fake online identities, weak institutional safeguards and data breaches, the undocumented sections of society now have the right to an identity where previously they had none.
Perversely, some sectors of Indian society, a privileged group of land-owning peasants, now wish to be classified among the ‘other backward classes’ in order to guarantee themselves government jobs and university places.
The UID has guaranteed citizens the right to an identity, to become visible and to be active in the public and market sphere. In some ways this system has provided the means to overcome the parallel illicit economies controlled by the rice mafia, the fuel mafia and the border mafia – the latter intercepting welfare goods before these reach the shops.
The first privacy law in India was introduced only in 2017 as a solution to data protection issues. Indeed, in India the Hindi word for privacy is nijta, a term not usually used in local languages. This suggests that privacy protection may not be regarded as important, and that privacy is culturally and economically relative.
By contrast, the social credit system in China uses a citizen’s financial record, online shopping data, social media behaviour and employment history to produce a score. Such scores provide a measure of a citizen’s trustworthiness. China’s leading technological innovation with a datafied system is deployed by the government to promote social safety, financial inclusion and citizen participation. The “give and take” business model is evident here.
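As a purely hypothetical sketch of what such scoring might look like (the actual inputs and weights of the social credit system are not public; every field name and weight below is invented), a trust score can be pictured as a weighted aggregation of behavioural signals:

```python
# Hypothetical weighted trust score. Weights and field names are invented
# for illustration; this is NOT the actual social credit algorithm.
WEIGHTS = {
    "financial_record": 0.35,
    "shopping_data": 0.20,
    "social_media": 0.20,
    "employment_history": 0.25,
}

def trust_score(signals: dict) -> float:
    """Combine normalised 0-1 behavioural signals into a 0-1000 score."""
    raw = sum(WEIGHTS[k] * signals[k] for k in WEIGHTS)
    return round(1000 * raw, 1)

citizen = {
    "financial_record": 0.9,   # e.g. pays bills on time
    "shopping_data": 0.6,
    "social_media": 0.7,
    "employment_history": 0.8,
}
print(trust_score(citizen))  # 775.0
```

The privacy concern is visible even in this toy: each input alone is innocuous, yet the aggregate number can govern access to credit, travel and services.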
Ironically, the social credit system is not entirely novel. The “sesame credit” system operated by the AliBaba group, launched in 2015, has close to 520 million users.6 While that points-based credit system is differently configured, better scores gain special privileges and have even been used as a status symbol on dating and matchmaking sites.
Under the social credit system, the sharing of social, political and economic information between private enterprise and public authorities appears to work well. AliBaba data are used for designing smart cities, BeiDou GNSS data for autonomous vehicle navigation systems, and Tencent data for medical imaging. Similarly, AliPay and WeChat Wallet provide an alternative to traditional banking practices, extending access to nearly everyone in a setting where trust is paramount.7
The social credit system also assists in providing data-driven social order and citizen tracking. While citizen tracking might be anathema in the developed world, in China personal security and safety are provided in return for greater dataveillance.8 Disparities in social inclusion, representation and minority rights may fracture Chinese communities and be interpreted as the oppression of individual freedom of expression, but the prospect of payoffs in economic development, social betterment and good governance has proven to be more attractive.
Lemma 3: The business models above suggest that privacy protection should be understood and accepted to be the “process of finding appropriate balance between privacy and multiple competing interests” in light of different interpretations by the free market economy and the totalitarian state.9
Data protection
Privacy protection may be a misnomer. More correctly, we should be speaking of data protection, as this terminology lends itself to better definition and clarity and is a much broader term. The Latin root privare means “to separate”. To want privacy is to want to be separate, to be an individual; privare also means to deprive, to take, to rob or to leave something behind.10 This latter usage might be more apposite in the age of electronic tracking and surveillance, with the ‘taking’ of information advertent or otherwise, with or without consent, and at any time and any place.
In the U.S. the origins of the privacy right may be traced to a law review article by Warren and Brandeis (1890) where the right to be left alone has held sway for over a century.11 Added to this are nuisance laws linked to property rights and the freedom of expression, all ensconced within the U.S. Constitution.
Thus, with reference to EO and UAVs, U.S. property law encompasses a zone theory: a lower stratum in which landowners enjoy effective possession and use; unused airspace in which ownership is denied to landowners; the use and ownership of upward space (for navigation); and nuisance theory, under which recourse is afforded only to those who have suffered actual interference. Thus, to offer protection is to guard the use of the underlying data.
In a news report on Google’s pledge to keep secrets, it was suggested that privacy is the company’s Achilles heel.12 That statement follows the USD $22 million penalty imposed by the U.S. Federal Trade Commission in 2012 for misrepresenting its use of tracking information. In 2019 Facebook faced the prospect of a USD $5 billion fine for failing to protect users’ data in the light of the Cambridge Analytica episode.13
Facebook has more than 2.7 billion users worldwide of its family of ‘apps’ – Facebook, Messenger, Instagram and WhatsApp among others. Facebook’s founder and CEO Mark Zuckerberg has welcomed more regulation. The rationale is that there are unintended consequences when technology creates applications, in most instances without ethical considerations. In redesigning the platforms, Zuckerberg has declared that ‘the future is private’.14 By this is meant that social platforms become less like digital town squares and more like private living rooms, where users hold conversations that are unshareable and encrypted. This is radically different from the current sharing economy.
Unintended consequences demand ethical consideration. As with the trolley problem, which raises the question of whom to preserve in a potential collision, the issue with dataveillance is how to design technological ecosystems that behave responsibly.
An algorithmic solution promoted by Zuckerberg is that of an integrated identity. Whilst we may have multiple identities in our daily lives – self, parent, lecturer, sportsperson, advocate and the rest – the one identity proposed aligns with authenticity and a universal profile. But the intractable issue is how third parties may churn data from this universal profile for profit.15
The next issue is that the loss of identity may be the price one pays for living in a digital age, where a digital trail of data is left behind everywhere we venture. The problem with geospatial technology generally, and EO technology in particular, is how to balance the rights of individuals against those of the general public to the advantage of all.
On the one hand, a more lasting impact on consumer privacy would be to make it mandatory through legislation for large corporations like Facebook to curb their ability to share data with business partners. On the other, legislative and ethical tools should promote measures that inform consumers when and how their data are collected and shared. Slonecker et al. (1998) have suggested that there is a dire need for ethical guidelines to provide the moral philosophy for data and privacy protection in the absence of a comprehensive global legal and policy framework.16 Invasions of privacy and data breaches, both advertent and inadvertent, can be difficult to identify, protect against and police.
Lemma 4: Indeed, how does a person maintain “privacy” in a public place?
A cynical view is that the right to be left alone has been steamrolled by the rush of the digital revolution. In so doing, privacy may only be available to those who can afford to pay for it.17 Similarly, it has been said that privacy is not a right but a privilege, a luxury afforded by wealth and enforced by custom. However, some people are prepared to trade a little privacy for convenience, divulging personal information in return for discounts, ease of future access and savings in time and money. This is the operation of the business model par excellence. That privacy is a luxury should no longer be in contemplation. Whilst we do have a right and an entitlement to privacy in the Western liberal world, digital technology has eased the access of marginalised peoples around the world to an identity, and to participation in political and socio-economic life. Privacy is no longer culturally relative but is normative.
Further research questions
It may be that the advancement of technology and the demands of modern business and government have made traditional ideas of privacy anachronistic.18 In addition, for netizens of the Web 2.0 generation, in which things happen by the mass action of crowds, solutions may be embedded in ethical, technical and social standards and not solely in legal avenues.
That data protection, and by extension privacy, is a universal value may be attainable, not merely aspirational. Privacy being fundamental to individual dignity, the research question is whether dataveillance balanced with benign paternalism may overcome resistance to datafied governance models of the totalitarian kind, or whether to adopt the business model of large corporations as proxies for government. The aims of both governance models are the same, but the means of getting there may be different.
Footnotes
1 UDHR 1948 Universal Declaration of Human Rights, December 10, 1948, Article 12 at http://www.un.org/Overview/rights.html and Article 17 of the ICCPR 1976 International Covenant on Civil and Political Rights (New York, United Nations) at http://www.privacy.org/pi/intl_orgs/un/internatioinal_covenant_civil_political_rights.txt.
2 The EU GDPR legislation includes a ‘right to be digitally forgotten’ provision that prescribes how this irreversibility may be countered. See Commission of the European Communities 2016 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
3 Arora, P & H Stevens 2019 Data driven models of governance across borders: Datafication from the local to the global, First Monday, V.24(1), 1 April 2019; Bang Xiao 2019 Chinese surveillance app reverse-engineered by the HRW highlights monitoring of Xinjiang residents, ABC News 2 May 2019 at https://www.abc.net.au/news/2019-05-02/chinese-surveillance-app-reverse-engineered-by-rights-group/11062670.
4 See Arora, P 2019 Benign dataveillance: Examining novel data-driven governance systems in India and China, First Monday, v.24(4), 1 April 2019.
5 Arora, P 2016 The bottom of the data pyramid: Big data and the Global South, International Journal of Communication, V.10, pp. 1681-1699 and at https://ijoc.org/index.php/ijoc/article/view/4297.
6 Kostka, G 2018 China’s social credit systems are highly popular – for now, Mercator Institute for China Studies at https://www.merics.org/en/blog/chinas-social-credit-systems-are-highly-popular-now.
7 Alipay is a third-party mobile and online payment platform, established in Hangzhou, China in February 2004 by AliBaba Group and its founder Jack Ma. See https://intl.alipay.com/. WeChat Wallet is a way to manage payments with a smart phone by adding debit or credit cards, to send money and pay for goods and services. See https://www.citcon.com.
8 Dataveillance, a portmanteau of data and surveillance, refers to the practice of monitoring digital data relating to personal details or online activities on social media and mobile applications.
9 Roger Clarke, “Privacy: More wobbleboard than balance-beam”, at http://www.rogerclarke.com/DV/Wobble.html.
10 See Berkman Center for Internet and Society – “Privacy in Cyberspace” at http://eon.law.harvard.edu/privacy99/syllabus.html.
11 Warren, S & Brandeis, L (1890) “The Right to Privacy” in 4 Harvard Law Review 193.
12 Griffiths, C 2019 Google pledges to keep secrets, The Australian 9 May 2019, p. 14.
13 Isaac, M & C Kang 2019 Facebook expects to be fined up to $5 billion by FTC over privacy issues at https://www.nytimes.com/2019/04/24/technology/facebook-ftc-fine-privacy.html. Cambridge Analytica was a British political consulting firm which combined data mining, data brokerage and data analysis with strategic communication during the U.S. Presidential election in 2016. See https://www.theguardian.com/news/series/cambridge-analytica-files.
14 The Times 2019 Facebook finally discovers privacy, The Australian 2 May 2019, p. 3.
15 Brusseau, J 2019 Ethics of identity in the time of big data, First Monday, V.24(5) 6 May 2019.
16 Slonecker, ET, Shaw, DM & Lillesand, TM (1998) Emerging legal and ethical issues in advanced remote sensing technology, Photogrammetric Engineering and Remote Sensing, V.64(6), pp. 589-595.
17 Given, J (2009) “Privacy is over, get used to it”, in The Australian Literary Review, March 4, 2009, p. 10-11.
18 Philipson, G (2008) “Privacy the price of super communication. If you want instant movies, music and phone calls you have no privacy. Get over it”, Sydney Morning Herald Next, July 15 2008, p. 29.