State and Surveillance

The state has always been engaged in security-related activities, but these have changed over time and especially, in an accelerating way, in the twenty-first century. Such activities include surveillance, understood here as any personal data acquisition and analysis for management, influence or entitlement. Today, state activities cannot be considered without noting the role of data flows between private corporations and government agencies, and the part played by the new technologies themselves, which are often permitted a leading role, especially as artificial intelligence (AI) is promoted.

The upshot is that public trust is threatened as governments become preoccupied with issues that do not strike citizens as being central to their own security, and as data breaches and undemocratic practices proliferate. New methods of data analytics demand new approaches to how data is framed, analyzed and used. A duty of care regarding these matters is vital and includes attention to the sources of data and their curation, the algorithms used for analysis and the uses that are permitted for those data. Internal and external assessment and review should happen periodically and be overhauled as needed if appropriate data governance is to be achieved, in the service of the larger goals of data justice, the common good and human flourishing.


 

Back Story

In the long history of surveillance, the state has always been the key player. Some notion of security has been a central rationale. Externally, surveillance relates to geopolitical and military purposes or commercial advantage. Internally, surveillance might be pursued for the pacification and administration of the population. This includes the collection and use of data for everything from electoral rolls to health care and welfare provision.

Since the mid-twentieth century, surveillance carried out by state agencies has expanded enormously, both for geopolitical reasons — such as the Cold War and, later, anti-terrorism activities — and because new technologies were developed to enable such expansion. These very technologies, invented and refined for military use, have become the backbone not only of state surveillance but also of industrial enterprise and everyday commercial and personal activities.


The internet, invented as a Cold War communication network, was opened to the public and commercialized in the 1990s, becoming a widely used source of information. Web 2.0 followed, which not only facilitated interaction but also allowed users to provide their own content. Soon it began to morph into the Internet of Things (IoT), meaning that surveillance is embedded in all kinds of objects, from buildings and cars to vacuum cleaners and fridges. Thus, data is “skimmed off” from mundane practices including driving, shopping and sending messages (Jeffreys-Jones 2017).

Since the late twentieth century, with the rise of neoliberal policies, the relationship between state agencies and commercial corporations has become deeper and more complex. This is vital for understanding surveillance, not only because corporations supply the know-how and equipment for monitoring and tracking, but also because, today, the data desired for use in policing and intelligence, and in many other tasks, originates in ordinary online exchanges, searches and interactions, as well as in phone calls. This means that routine forms of data exchange, which allow information to flow between public and private realms, along with the many public-private partnerships that have been developing since the 1980s and 1990s, are now normalized and commonplace (Ball and Snider 2013).

Rapid Developments

In the early twenty-first century, the events of 9/11 (that is, September 11, 2001) represented a crucial shift. The rapid securitization of many aspects of government and everyday life in the name of anti-terrorism is now seen as normal. Much of this development depended on the intensified deployment of information technologies from companies that at the end of the twentieth century had feverishly been seeking new markets. Biometrics, for instance, which had been languishing as an idea without an application, suddenly appeared to offer vital and reliable support for identifying suspects (Lyon 2008).

The extent of this began to be clear early on, but the disclosures by Edward Snowden in June 2013 demonstrated beyond doubt that the global security-surveillance network was in high gear. Government agencies were making extensive use of personal telephone and internet data, and it was increasingly hard to distinguish between internal and external surveillance. Consumers and citizens were outraged to learn that, somehow, government agencies had access to their personal data. The crucial category was metadata: the details of where and when communications or transactions occurred, between whom and so on. Trust was further eroded by official denials that the metadata involved was consequential, even though it comprised the same data that a private detective would seek: highly personal in nature, just not necessarily of the older date-of-birth or street-address type (Lyon 2015; Szoldra 2016).
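To see why such denials rang hollow, a rough sketch may help; the fields and figures below are invented for illustration and are drawn from no agency’s actual holdings. Even with no content stored, a handful of records of this kind begins to speak for itself:

```python
# Hypothetical illustration only: what a single communications metadata record
# might contain. No message content appears, yet the record is revealing.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallRecord:
    caller: str            # phone number or device identifier
    callee: str
    start: datetime
    duration_seconds: int
    cell_tower: str        # approximate location of the caller

records = [
    CallRecord("555-0101", "555-0199", datetime(2013, 6, 1, 2, 14), 1260, "tower-downtown-07"),
    CallRecord("555-0101", "555-0199", datetime(2013, 6, 2, 2, 41), 980, "tower-downtown-07"),
    CallRecord("555-0101", "555-0150", datetime(2013, 6, 2, 9, 5), 62, "tower-suburb-12"),
]

# Even without any content, patterns emerge: who contacts whom, when, for how
# long and roughly where -- the very details a private detective would want.
late_night = [r for r in records if r.start.hour < 5]
print(f"{len(late_night)} late-night calls from {late_night[0].caller} to {late_night[0].callee}")
```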

Since September 11, 2001, the rapid securitization of many aspects of government and everyday life has come to be seen as normal. The intensified deployment of information technologies came primarily from companies that, at the end of the twentieth century, were seeking new markets and applications for ideas such as biometrics. (Photo: Hayk_Shalunts / Shutterstock.com)

As a site of user-generated content, the internet was hugely augmented by the growth of social media and then of platforms in general after the success of Facebook, beginning in 2004, and the so-called sharing economy of Airbnb, Uber and the like, a few years later. During these years, first the corporate sector and then government found ways to harvest, analyze and monetize this so-called data exhaust on a massive scale. The take-up rate accelerated as new data analytics — big data — was developed to utilize this trove.

The apparent possibilities for reviving older dreams of the online world as a harbinger of a new phase of democratic participation served to mask the regulation-resistant, competitive character of these mushrooming corporations. In fact, while ordinary, everyday lives became increasingly transparent to these data-greedy behemoths, their own activities became less and less transparent, something that our research team recently examined with direct reference to Canada (Bennett et al. 2014). Thus, the very bedrock of democratic involvement — trust, based on an informed citizenry — was being eroded from within.

The Snowden disclosures of 2013 showed how the shift toward big data practices was happening at the National Security Agency (NSA) in the United States, but the shift was occurring simultaneously in Canada and elsewhere among the so-called Five Eyes countries. In Canada, the Communications Security Establishment (CSE) adopted a new analytic method from about 2012, one described in “scientific revolution” terms. The switch was made from suspicion-led to data-driven approaches, heavily dependent on computing power and algorithmic analytics. Communications were to be monitored and analyzed to discover patterns producing actionable intelligence (Thompson and Lyon, forthcoming 2019).

Although agencies such as the NSA and CSE develop their own methods, they frequently work in tandem with commercial providers and university research hubs to create new surveillance tools within a network of agencies that is far more than the sum of its parts. In Canada, the Tutte Institute for Mathematics and Computing, working with the CSE, is a case in point. Reciprocal relationships are deepened within networks, both for providing expertise and software and for executing surveillance tasks. At the CSE, an education phase gave way to an exploration phase, and from there to engagement, which leans more heavily on machine learning. The agency works with private sector commercial organizations as well as universities, developing algorithms for knowledge discovery and data mining.

Big Data Surveillance

Clearly, the phrase “state and surveillance” does not do justice to recent developments in security-surveillance networks following 9/11 and the rise of platforms that generate burgeoning data resources. New relationships mean that once-distinct public and private entities now shade into each other. Government works closely with businesses and research groups, and there is also a sense in which the technological systems themselves participate, especially as AI and machine learning become more significant. Such developments challenge conventional modes of scientific and technological practice, and, of course, the time-honoured approaches to policing and security.


Today, huge amounts of data are sucked into systems that store, combine and analyze them, to discern patterns and reveal trends that can be used for security, alongside uses in health, marketing, governance and many other areas.1 This is a worldwide trend, seen in global IT companies — now often referred to as surveillance capitalism (Zuboff 2019) — and also in the programs, activities and public documents of the CSE, the Canadian Security Intelligence Service and the Royal Canadian Mounted Police (RCMP). In terms of method, this major shift from causation to correlation raises many questions, for instance about privacy and data protection regimes that sometimes seem to be sidelined.

Some key features include the use of open source intelligence and social media, the geographies of security-surveillance (cables, clouds and data centres), and the implications for international relations of physical communication conduits that are accessible to intelligence agencies (Clement 2018). The trend also raises questions about how international professional security groups, in public-private partnerships, influence policy and profits. Each of the key features mentioned above relates to further issues, such as how big data practices exploit loopholes in current privacy laws, how security is mobilized as a permanent rationale for increased surveillance and how new channels of power and influence disproportionately disadvantage certain population groups (Dwork and Mulligan 2013; Raley 2013). The latter is clear in “predictive policing,” a parallel field to national security in which scrutiny of those already under suspicion is intensified, and the influence of race, class and neighbourhood is magnified through big data practices (Brayne 2017). This, too, has deleterious effects on public trust.
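The logic of that feedback loop can be made concrete with a toy simulation; the neighbourhoods, figures and formula below are invented purely for illustration and model no actual police system. Where patrols follow past recorded incidents, and heavier patrolling is assumed to yield disproportionately more recorded incidents, an initial disparity compounds even though the underlying rates are identical:

```python
# Toy simulation (all figures invented; no real system is modelled) of the
# feedback loop critics describe in predictive policing.
underlying_rate = {"A": 10.0, "B": 10.0, "C": 10.0}   # identical true incident rates
recorded = {"A": 12.0, "B": 10.0, "C": 8.0}           # neighbourhood A starts out more heavily policed

for year in range(10):
    total = sum(recorded.values())
    # Patrols are allocated where past *recorded* incidents were highest...
    patrol_share = {n: recorded[n] / total for n in recorded}
    # ...and (assumed here) heavier patrolling yields disproportionately more
    # recorded incidents, even though the underlying rates never differ.
    recorded = {n: underlying_rate[n] * (patrol_share[n] ** 1.2) * 3 for n in recorded}

shares = {n: round(recorded[n] / sum(recorded.values()), 2) for n in recorded}
print(shares)   # neighbourhood A's share of recorded incidents keeps growing
```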

Underlying and infusing all these, however, is the question of what sorts of knowledge are sought. Time-honoured practices and patterns of research and investigation, in which causes and explanations are sought, give way to inductive analytical methods. Data, collected from disparate sources, are put together in new configurations in order to infer patterns. The result often leans toward correlations, which have a much more uneven history in the quest for reliable knowledge. How far can such new methods be trusted, especially when they carry such a heavy freight of responsibility for people’s choices, life chances and even human life itself?
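One reason for caution can be shown with a small synthetic example, using nothing but random numbers in place of any real dataset: when a great many candidate indicators are sifted against relatively few cases, an apparently strong correlation surfaces by chance alone, and no amount of computing power turns it into an explanation.

```python
# Synthetic data only: with many candidate indicators and few cases, strong
# correlations with an outcome appear by chance alone.
import random
import statistics

random.seed(1)
n_people, n_indicators = 50, 500

outcome = [random.gauss(0, 1) for _ in range(n_people)]        # the behaviour of interest
indicators = [[random.gauss(0, 1) for _ in range(n_people)]    # unrelated noise variables
              for _ in range(n_indicators)]

def corr(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sx, sy = statistics.stdev(xs), statistics.stdev(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / ((len(xs) - 1) * sx * sy)

best = max(indicators, key=lambda ind: abs(corr(ind, outcome)))
print(f"strongest 'pattern' found: r = {corr(best, outcome):.2f}")   # sizeable, yet pure noise
```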

 

Confronting New Questions

The large question to be addressed has to do with data governance. This is closely connected with questions of trust and, thus, also of ethics, in relations with both the state and corporations, in all their early twenty-first century complexity. Trust has been deeply damaged in both corporate and governmental domains, due to data breaches, surveillance overreach, unfair outcomes in policing and security, and disturbingly protective secrecy. Data governance should not be seen in only a technical or legal sense; building data justice into data governance would align it with human flourishing and the common good.

Canada and similarly aligned countries cannot expect to advance their strategic and economic interests, let alone foster human flourishing, without rebuilding trust. This, in turn, relates to the focus of security concerns. If Canadian citizens suspect that security in practice refers only to governmental, economic or technological activities and systems, then trust is once again threatened. However, if those interests are seen to sit under an umbrella of human security (Zedner 2009), where personal, communal and environmental protection are the focus rather than states or national security, this will help to recover trust. These considerations underpin the specific comments that follow.

Given the major challenges of new analytic methods in state security endeavours, trust can only be developed by paying attention to protecting the kinds of basic rights and freedoms enshrined in the Canadian Charter of Rights and Freedoms. This also requires robust safeguards against erroneous and malicious use of data, not to mention transparency about the use by government-related bodies (such as the RCMP) of private surveillance companies for monitoring dissent — at pipeline sites, for example, or at major inter-governmental summit meetings. Such safeguards would nurture human security and, with it, heightened trust.

Data breaches, surveillance overreach, unfair outcomes in policing and security, and disturbingly protective secrecy have damaged citizens’ trust in both corporate and governmental domains. Safeguards against erroneous and malicious use of data, and, importantly, transparency about government use of private surveillance companies to, for example, monitor dissent at pipeline sites, are required. (Photo: arindambanerjee / Shutterstock.com)

Turning to specific questions of the digital, and to data in particular, how these are handled is of utmost significance. As the methods of addressing security challenges are shifting fundamentally, so the questions for regulating and overseeing security-surveillance must also change. What was once thought of primarily as a question of data collection is now primarily one of analysis and use of data (Broeders et al. 2017). Along with this is a discernible shift toward data governance in terms of broad ethical frameworks, rather than of privacy alone (Bennett and Raab 2018).

As far as analysis is concerned, duties of care are required both in data collection and curation, and in the use of algorithms that are central to any analysis. Both internal audits and external reviews should be guided by the duty of care. If analysis involves profiling and/or automated decision making, or even decision support, then tight regulation is called for. Democratically organized oversight functions are vitally needed at each level.

In Canada, these matters resurface periodically in relation to the regulation of our own security and policing services. Bill C-51 (the Anti-terrorism Act, 2015), for instance, was controversial for several years because it permitted certain kinds of access to data without adequate accountability or oversight and showed scant regard for civil liberties. Bill C-59 (An Act respecting national security matters) addresses some of these concerns in a manner that is at least somewhat more satisfactory (Forcese 2018), but constant vigilance is required if trust is to be rebuilt to serve the common good and human flourishing. There is an unfortunate history of overreach and obsessive secrecy within the departments charged with security matters, and such practices do nothing to enhance trust. New modes of transparency and public responsibility are needed throughout.


 

Conclusion

The “state and surveillance” is a far more complex equation than it may at first appear. Developments in political economy — neoliberal public-private partnerships, for example — and in new data-enabled practices of analytics, machine learning and AI all complicate relationships (Pasquale 2016). This makes it hard to know what exactly transpires within the agencies — security and policing — that are early adopters of new technological and analytic styles of operation. While genuine benefits may well emerge from the CSE’s new analytic method or from predictive policing, current trends indicate a significant trust deficit and a sense of unfairness in both procedures and outcomes.

The kinds of operation that are inspired by corporate practices such as rating and ranking in credit schemes, that treat data as if they were raw and algorithms as if they were neutral, and that rely on inductive methods producing correlations rather than explanations, demand radical rethinking. Practices that intensify categorical suspicion, for example, are patently unfair. Thus, requirements for data justice (Hintz, Dencik and Wahl-Jorgensen 2019), as well as for greater transparency, accountability and oversight, need to be part of programs to ensure appropriate data governance. This, too, is only a means to other, societally more significant aims — those of seeking to deepen trust and thus human flourishing and the common good.

Author’s Note
Thanks to Colin Bennett, University of Victoria, who kindly commented on a draft of this article.

  1. “Big Data Surveillance” is the theme of the current multi-disciplinary and international project at the Surveillance Studies Centre at Queen’s University: www.sscqueens.org/projects/big-data-surveillance.

Works Cited

Ball, Kirstie and Laureen Snider, eds. 2013. The Surveillance-Industrial Complex: A Political Economy of Surveillance. Abingdon, UK: Routledge.

Bennett, Colin J., Kevin D. Haggerty, David Lyon and Valerie Steeves, eds. 2014. Transparent Lives: Surveillance in Canada. Edmonton, AB: Athabasca University Press.

Bennett, Colin and Charles Raab. 2018. “Revisiting the governance of privacy: Contemporary policy instruments in global perspective.” Regulation and Governance 12 (3).

Brayne, Sarah. 2017. “Big Data Surveillance: The Case of Policing.” American Sociological Review 82 (5): 977–1008. doi:10.1177/0003122417725865.

Broeders, Dennis, Erik Schrijvers, Bart van der Sloot, Rosamunde van Brakel, Josta de Hoog and  Ernst Hirsch Ballin. 2017. “Big Data and Security Policies: Towards a Framework for Regulating the Phases of Analytics and the Use of Big Data.” Computer Law and Security Review 33 (3): 309–23.

Clement, Andrew. 2018. "Canadian Network Sovereignty: A Strategy for Twenty-First Century National Infrastructure Building." www.cigionline.org/articles/canadian-network-sovereignty.

Dwork, Cynthia and Deirdre Mulligan. 2013. “It’s not privacy and it’s not fair!” Stanford Law Review 66.

Forcese, Craig. 2018. “C-59 and collection of all that is in the eye of the beholder?” National Security Law: Canadian Practice in Comparative Perspective. http://craigforcese.squarespace.com/national-security-law-blog/2018/1/31/c-59-and-collection-of-all-that-is-in-the-eye-of-the-beholde.html.

Hintz, Arne, Lina Dencik and Karin Wahl-Jorgensen. 2019. Digital Citizenship in a Datafied Society. Cambridge, UK: Polity Books.

Jeffreys-Jones, Rhodri. 2017. We Know All About You: The Story of Surveillance in Britain and America. Oxford, UK: Oxford University Press.

Lyon, David. 2008. Identifying Citizens: ID Cards as Surveillance. Cambridge, UK: Polity Books.

———. 2015. Surveillance after Snowden. Cambridge, UK: Polity Books.

Pasquale, Frank. 2016. The Black Box Society: The Secret Algorithms that Control Money and Information. Cambridge, MA: Harvard University Press.

Raley, Rita. 2013. “Dataveillance and Countervailance.” In “Raw Data” Is an Oxymoron, edited by Lisa Gitelman. Cambridge, MA: MIT Press.

Szoldra, Paul. 2016. “Leaked NSA document says metadata collection is one of agency’s ‘most useful tools.’” Business Insider, December. www.businessinsider.com/nsa-document-metadata-2016-12.

Thompson, Scott and David Lyon. Forthcoming 2019. “Pixies, pop-out intelligence and sand-box play: The New Analytic Model and National Security Surveillance in Canada.” In Security Intelligence and Surveillance in the Big Data Age: The Canadian Case, edited by David Lyon and David Murakami Wood. Vancouver, BC: UBC Press.

Zedner, Lucia. 2009. Security. Abingdon, UK: Routledge.

Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism. New York, NY: Public Affairs.
