

3.3 Surveillance Capitalism

A third and relatively recent strand in theorising surveillance from a post-panoptic perspective is (neo-)Marxist surveillance theory. Connecting the thought of Marx to surveillance is not entirely new (see Fuchs 2012a for an overview). In fact, Marx himself saw surveillance as a fundamental aspect of the capitalist economy and the modern nation state, understanding surveillance as both an economic and a political concept (Fuchs 2012b). Surveillance is thus a coercive and technological method for controlling and disciplining workers, but it is also a political process of domination, which includes a potential for counter-surveillance through the press. Some scholars have started to apply the stages of Marx’s cycle of capital accumulation to the concept and practices of surveillance: applicant surveillance in the stage of capital circulation; workplace, workforce and property surveillance in the stage of capital production; and consumer surveillance and surveillance of competitors again in the stage of capital circulation (Allmer 2012; Fuchs 2012a). Other scholars, such as Mathiesen, Andrejevic and Ogura, are implicitly or explicitly deploying Marxist concepts such as exploitation, class, fetishism, ideology critique or culture industry in their analysis of surveillance.

Within the strand of (neo-)Marxist approaches to surveillance, the contours of a theoretical framework are emerging that is somewhat similar to Haggerty and Ericson’s surveillant assemblage, but which goes further in conceptualising surveillance as a dominant and overarching feature of capitalist society. This framework can be called ‘surveillance capitalism’, a term possibly first used by Bellamy Foster and McChesney (2014), but more thoroughly developed and disseminated by Zuboff (2015, 2016). Although surveillance capitalism is not yet fully developed as a theory, Zuboff (2015) is laying a foundation for a new ‘all-encompassing theory’9 at a ‘civilizational scale’, attempting to explain a new type of social relations and economic-political system that produce their own conceptions and uses of authority and power. Such an overarching theory is needed, because ‘[t]oday’s surveillance complex aligned with an economic base enthralled with the prospects of metadata appears too strong for meaningful reforms without significant shifts in larger economic foundations’ (Price 2014).

Surveillance capitalism is a new subspecies of (information) capitalism that has gradually constituted itself during the last decade, in which ‘profits derive from the unilateral surveillance and modification of human behavior’ (Zuboff 2016). Capitalism is thus said to have been hijacked by a surveillance project that imposes a wholly new logic of accumulation, fitted to a networked world and aimed at predicting and modifying human behaviour as a means to produce revenue and market control (Zuboff 2015). Zuboff’s surveillance capitalism therefore signals a new era of capitalism with a new dominant logic of accumulation; she claims that the contemporary, very lucrative surveillance project subverts and corrupts the normal evolutionary mechanisms of capitalism—the unity of supply and demand that, however imperfectly, catered to the genuine needs of populations and societies and enabled the fruitful expansion of market democracy (Zuboff 2016). In contrast, surveillance-based capitalism no longer has any connection to, or interest in, the needs of populations, societies or states.

What is called big data is the foundational component of this new economic logic, which is based on prediction and its monetisation—selling access to the real-time flow of people’s daily life in order to directly influence and modify their behaviour for profit. The pervasive and ubiquitous recording of all daily transactions also means that the market is no longer unknowable, as it was assumed to be in classical liberalism; rather, it is becoming transparent and knowable in new ways (Zuboff 2015; Varian 2014). It comes as no surprise that Google is seen as the ideal type of the new economic logic and commercial model.

Zuboff identifies four key features of the emerging logic of capital accumulation, in which she explicitly follows the four new ‘uses’ emanating from computer-mediated transactions identified by Google’s Chief Economist, Hal Varian (Varian 2014). First, the insatiable appetite for data extraction and analysis (data mining), commonly known as big data, illustrates the two arguably most crucial features of surveillance capitalism: formal indifference and functional (or structural) independence. Formal indifference represents the asymmetric nature of data extraction, which occurs in the absence of dialogue or (freely given and informed) consent. Functional independence breaks with the twentieth-century corporation model, since there are no structural reciprocities between the firm and its primary populations (durable careers for employees, long-term relationships with customers; Zuboff 2015). Instead, firms like Google rely on algorithms to manage their core services and on relationships with third parties (advertisers and intermediaries) for their revenue. The infrastructure of surveillance capitalism, built on big data with its goal of prediction and behaviour modification, eliminates the need for—or possibility of—feedback loops between the firm and its users.

Second, surveillance capitalism involves real-time monitoring of contractual performance along with real-time, technology-enabled enforcement of the contract (Zuboff 2015). In a world where such a system of contractual monitoring and enforcement is the norm, ‘habitats inside and outside the human body are saturated with data and produce radically distributed opportunities for observation, interpretation, communication, influence, prediction and ultimately modification of the totality of action’, establishing a new architecture from which there is no escape and making the Panopticon seem prosaic. Where power was previously identified with ownership of the means of production, it is now constituted by ownership of the means of behavioural modification (Zuboff 2015, 82). Third, services are personalised and customised, telling you what you want or need to know (‘even before you know it yourself’). This feature produces substantial new asymmetries of knowledge and power (Zuboff 2015, 83). Fourth, the technological infrastructure allows for and requires continual experimentation and intervention into users’ lives. Since big data analysis yields only correlations, continual experimentation should attempt to tease out causality. A textbook example is Facebook’s experiment in secretly influencing its users’ moods. Thus, ‘behaviour is subjugated to commodification and monetisation’ (Zuboff 2015, 85), which is the dominant logic of twenty-first-century market dynamics.

Zuboff would be the first to admit that these contours of surveillance capitalism require further theorisation and development. Nevertheless, what she tries to show is that surveillance, as a key part of the new logic of accumulation, goes well beyond privacy considerations: it threatens democracy itself, because it does away with the political canon of the modern liberal order, which was defined by the foundational principles of self-determination—in individuals’ private life and social relations, politics and governance.

9 Stated by David Lyon at CPDP in January 2016, in the panel ‘Surveillance capitalism: a new societal condition rising’ (video available at https://www.youtube.com/watch?v=0Fr7zl9NA7s, accessed 02 April 2016); also see Zuboff’s forthcoming book, Master or Slave: The Fight for the Soul of Our Information Civilization, to be published by Eichborn in Germany and Public Affairs in the U.S.

4 Phase 3: Contemporary Conceptualisations—the Branching Out of Surveillance Theory

The previous section featured authors attempting to get away from the Panopticon model. The idea of internalisation of control via one-directional, top-down architectures of surveillance no longer seemed to fit contemporary societies, mainly because Foucault did not, and could not, include electronic layers of surveillance, which was the main focus of authors in phase 2. In this light, surveillance seems a technology-dependent concept; ideas about surveillance shift along with new technologies that challenge existing forms of societal organisation and governance. But does that imply there are no anchor points that go beyond theories running after each technological trend? Describing surveillance in relation only to the latest technological trend, and then analysing how this differs from the former situation, might not be enough to make us understand and potentially change the way we deal with different forms of surveillance in society.10

The problem of surveillance theory here is that the increase in size and complexity of surveillance practices seems to make it almost impossible to develop an overarching theory of surveillance that captures surveillance as a largely unitary concept or phenomenon, as in Foucault’s or Deleuze’s theories. Rather, surveillance theory in what we can call phase 3, which builds on but also moves beyond the Panopticon and digital surveillant networks and assemblages, branches out in different directions. Thus, the third phase is characterised not so much by comprehensive theories, but rather by particular surveillance concepts or diagrams (Elmer 2003) studied in specific contexts or case studies. In phase 3, surveillance scholars re-visit and build on concepts and theories from phases 1 and 2. The shift in analysis from disciplinary to control societies, for instance, is relatively young, and it would be cutting corners to dismiss Foucault’s analysis of centuries of development in European history merely because our socio-technical lifeworld has changed in recent decades. On the other hand, surveillance does remain closely linked to technologies, and the on-going development of technological tools implies that concepts of surveillance do need to be regularly re-visited and re-thought, to offer relevant perspectives on how our world takes shape. In this section, we give examples of the different ways in which surveillance theories and concepts of phases 1 and 2 are being used and re-visited. Whilst a comprehensive overview is impossible to give here, we think the examples chosen are illustrative of the different directions in which surveillance theory is branching out from the previous phases.


4.1 New Forms of Panopticons

Since 9/11, the surveillance industry has vastly expanded in both form and content. The Snowden revelations have shown that nation-states conduct mass surveillance of communications, of both foreign and domestic citizens. They have done so all too often in conjunction with commercial parties and service providers. Furthermore, the emergence of social media has made the roles of watcher and watched and power relations in society more diffuse—we are letting ourselves be watched collectively and (seemingly) voluntarily, and we eagerly watch each other and the watchers. The Panopticon as a metaphor can thus still be productive in explaining how surveillance works and what it does, albeit in adjusted forms. As David Lyon, a leading author in the field, states: ‘we cannot evade some interaction with the Panopticon, either historically, or in today’s analyses of surveillance’ (Lyon 2006, 4). This, he claims, is due to the ever-growing presence of ‘watching and being watched’ through all kinds of new technologies.

Where the Panopticon idea and the goal of creating docile subjects have spread from the prison to, for instance, the workplace and governments for reasons of productivity and efficiency, they have also travelled to ‘softer’ forms, such as entertainment and marketing. Through reality shows and YouTube, to be watched is even becoming an asset and a social norm (the YouTube logic is: the more views the better). Lyon (2007) calls this ‘panopticommodity’ and Whitaker (1999) frames it as a ‘participatory Panopticon’, by which they conceptually try to project ideas of watching and being watched as a form of discipline onto contemporary manifestations of what is basically Foucault’s panoptic principle.

Bruno Latour also delved into surveillance concepts by coining the Oligopticon: ‘governance has thus consisted of a set of partial vantage points from fixed positions with limited view sheds’ (Latour as cited in Dodge and Kitchin 2011, 85). Each of these vantage points has its own particular gaze and its own methods and related technologies—‘things’—through which it operates. The partial vantage points of the Oligopticon, however, are increasingly linked as databases are connected. Moreover, the number of vantage points has increased post-9/11 (think of Deleuzian points of surveillance—access or checkpoints that citizens encounter in daily life). Bigo (2006), in an attempt to conceptualise 9/11 and what it did to notions of control, freedom and security, has coined the term BANopticon. Instead of monitoring and tracking individuals or groups to capture misbehaviour, the BANopticon aims at keeping all the bad ones out; it bans all those who do not conform to the rules of entry or access in a particular society. He points out that a series of events, most prominently the 9/11 attacks, have triggered a (constant) ‘state of unease’ and an American-imposed idea of global ‘in-security’ (Bigo 2006, 49). This leads to a rhetoric of ‘better safe than sorry’ under which an increase in surveillance measures could take place. This rhetoric also paved the way for experimentation with new surveillance technologies, such as body scanners at airports, and the accelerated introduction of biometric passports and experiments with motion tracking at Schiphol Airport (van der Ploeg 2003), for instance. Most of these measures are situated in Deleuzian access points, such as airports and border controls. Some scholars point out that societies emerging after 9/11 can be termed true surveillance societies, in which every citizen is a potential threat needing to be monitored (e.g. Lyon 2001). In that sense, the Panopticon as a diagram re-emerges; the access points again create a confined and bordered space where both visitors and inmates are subject to a constant gaze. Lyon (2006) states that we do not have to dismiss the idea of the Panopticon, but that other sources of theory can also be found. This can help create more balanced and better informed analyses of current surveillance practices and/or re-frame phenomena into refined theories or concepts of surveillance.

4.2 Building on Deleuze: Dataveillance and Social Sorting

Another contemporary branch, featuring in both surveillance studies and media studies, elaborates the notion of ‘dataveillance’. As mentioned earlier, Clarke (1988) coined the term dataveillance to indicate that through computational means and digital information, it has become easier for governing actors to trace individuals or groups than was possible with the previous, often expensive and ‘heavy’ forms of architectural or institutional surveillance. Although databases existed before the computer, they involved equally ‘heavy’, analogue methods of gathering and storing information. Moreover, present forms of surveillance potentially have more impact and shaping power on citizens’ daily life than pre-Internet paper-based data entries, because of the accessibility of digital databases and the relative ease of combining and sharing different types of data (Marx 2002). This has many implications for surveillance, because the information in these databases about different aspects of citizens’ lives has vastly grown in recent decades, both in type and in amount. Recalling Deleuze’s notion of the dividual, these different aspects represent parts of an individual, interesting for different actors for different reasons. A temporal difference with Deleuze’s analysis in the 1980s is that it is becoming increasingly unclear for individuals where their data resides, what kinds of correlations and profiles are being made and who is using this data for which purposes. Besides, many current data entries are not obligatory or forced—often people provide them happily and voluntarily—which makes it difficult to analyse dataveillance in terms of a digital Panopticon or as disciplining power. Data use has become opaque and the clear connection between guard and inmate, watcher and watched, is lost. The role of the Internet and new media in society has been acknowledged and researched by (amongst others) media and surveillance scholars, who argue that the ways in which we lead our lives have changed radically. However, despite the prevalence of dataveillance and mass surveillance of communication by governments and companies alike, the question whether and to what extent dataveillance actually alters society and public life in terms of discipline or control has yet to be answered. Productive surveillance concepts to be applied in this context are social sorting and adjacent concepts such as profiling, data doubles and predictive policing (to name a few).

Social sorting stems from a fear of others (Bigo 2006; Lyon 2003). To exemplify, several studies have explored social sorting empirically in the context of CCTV (Lyon 2003; Dubbeld 2005), showing that, despite increased reliance on software and protocol, social sorting still often occurs as a result of a white, male gaze in the CCTV control room, whose particular bias leads to certain profiles of deviance. With the coming of more forms of automated surveillance, social-sorting processes might become more hidden or transposed into algorithms. The proponents of computerised and automated forms of surveillance foresee more objectivity: they argue that social sorting and arbitrary judgement, being a result of human (mis)judgements and prejudice, could be replaced by ‘objective’ software. In contrast, Galloway (2004) and others, in line with critical, Marxist11 perspectives, do not believe that technological sorting will be ‘fairer’ than human-based sorting in surveillance practices. This may be due to the fact that the so-called enforcer-class (police officers, CCTV operators, bouncers and other surveillance workers) consists mainly of a specific demographic, but also because of how the monitoring software is designed and which action possibilities or affordances (Gibson 1977) and responsibilities are inscribed in the software and hardware that the enforcer-class uses (Timan and Oudshoorn 2012). The concept of surveillant assemblages (see Section 3.2) could be used to study, in concrete practices, to what extent discipline and control are being transferred to automated, computerised processes and how this changes the normative structure of surveillance (Andrejevic 2007). Current surveillance practices show the rise of a surveillance class, and this class serves political goals and powers that go beyond an objectivation of surveillance through smarter and more efficient algorithms. Rather, against the grain of post-panoptical theories that warn against digital networks and automated surveillance as the upcoming loci of control, there are still physical and local surveillance realities that work next to, or in conjunction with, digitised and computerised networks. These surveillance realities still await proper conceptualisation.

11 See, for example, Fuchs and Mosco, Introduction: Marx is Back – The Importance of Marxist Theory and Research for Critical Communication Studies Today (2012); Fuchs, New Marxian Times! Reflections on the 4th ICTs and Society Conference ‘Critique, Democracy and Philosophy in 21st Century Information Society. Towards Critical Theories of Social Media’ (2012b).

4.3 Participation and Empowerment in Surveillance

Moving away from the often dystopian, or at least dark, types of surveillance analysis that are particularly present in the work of post-structuralist and post-Marxist scholars from Deleuze to Galloway, another branch has emerged in contemporary surveillance studies. Introduced by Haggerty, more neutral and sometimes even empowering accounts can be discerned in systems of watching and being watched. If we accept that we live in a networked and technology-saturated society, it follows that the apparatuses of surveillance, the methods, tools and technologies already mentioned by Foucault, are not solely in the hands of power-hungry institutions, companies or governments. Even if we follow Deleuze in reasoning that corporations are now the main surveilling actor, with an urge for power and control that is enhanced by their opaqueness, individuals can still, at least to some extent, resist and refuse, mainly by finding alternative ways of using technology that is increasingly accessible to them. The extent to which this is possible, however, is the subject of current debates surrounding mass surveillance. Instead of being a place where one looks at many, most social media technologies follow the logic of ‘many look at many’, where visibility is often deliberately chosen. In that vein, Albrechtslund (2008) in particular diverges from solely negative concepts of surveillance. Rather, he argues that since the emergence of ubiquitous computing, surveillance as a concept should be reconsidered: ‘The entertaining side of surveillance is a phenomenon worth studying in itself, and we expect that this type of study will contribute to an understanding of the multi-faceted nature of surveillance’ (Albrechtslund and Dubbeld 2005, 3).

Albrechtslund looks at how surveillance is often used as a design principle in, for instance, online games and sports-tracking services. Besides a fun aspect, such games and services can also inform us about how (a part of) society reflects on notions of surveillance. Albrechtslund coins the term ‘participatory surveillance’: citizens/users are actively engaged in surveillance themselves as watchers, but they also participate voluntarily and consciously in the role of the watched. Many online environments, especially social networking sites (boyd and Ellison 2007), serve as interesting places of study, since many beliefs, ideas and opinions are shared there. boyd and Ellison (2007) even state that social networking sites are dominating online activities today and, as such, constitute new arenas for surveillance. From the perspective of users and visitors of these online places, the high level of surveillance, in the form of tracking and being tracked, watching and being watched, or sharing and being shared, is not necessarily negative:

Characteristic of online social networking is the sharing of activities, preferences, beliefs, etc. to socialize. I argue that this practice of self-surveillance cannot be adequately described within the framework of a hierarchical understanding of surveillance. Rather, online social networking seems to introduce a participatory approach to surveillance, which can empower – and not necessarily violate – the user. (Albrechtslund 2008)

Participating via, for instance, sharing, responding or ‘liking’ engages users with these platforms, where the idea of being seen and ‘followed’ is a precondition rather than a setback. This is also called self-surveillance, a term that resonates more and more with recent technological trends such as mobile healthcare applications (mHealth) and wearable computing. The added value of this concept is that it allows for a user-centred perspective on surveillance, rather than a top-down or institutional analysis. Following boyd (2011) and Marwick (2012), this approach enables another type of analysis of surveillance, where tracing behaviour can reveal users’ experiences of surveillance and visibility. On the question why visibility is so important to these users, Koskela (2011), for instance, explains that exhibitionism such as that shown on social networking sites or in TV shows can work in an empowering way. By throwing everything into public arenas, ‘visibility becomes a tool of power that can be used to rebel against the shame associated with not being private about certain things. Thus, exhibitionism is liberating, because it represents a refusal to be humble’ (Koskela 2004). Similarly, in the marketing context, Dholakia and Zwick argue that ultra-exhibitionism ‘is not a negation of privacy but an attempt to reclaim some control over the externalisation of information. As such, ultra-exhibitionism is to be understood as an act of resistance against the surreptitious modes of profiling, categorization and identity definition that are being performed by others on the consumer whenever he or she enters the electronic "consumptionscape"’ (Dholakia and Zwick 2001, 13).

A counter-argument to the empowering view of (self-)surveillance, however, is that emerging forms of self-tracking, in for example mHealth or other measurement apps, combined with participation as a design principle, could be seen as a facade or illusion of self-control, behind which users are actually being tracked and traced. From a neoliberal point of view, one can also interpret self-tracking and self-surveillance apps as the ultimate model of ‘nudging’ (Sunstein and Thaler 2009), in which governments, institutions and companies push responsibilities (for health, for instance) back onto individuals (Cohen 2016). In a way, this view re-introduces the Panopticon as a fitting metaphor—we are not only internalising doing good via external influences of (partly digitised) institutions in surveillant assemblages, but we are also, through self-monitoring apps and other forms of participatory surveillance, internalising these rational models and methods in a self-induced process of self-disciplining.

4.4 Sousveillance and Other Forms of Resistance

Another conceptual axis along which surveillance in phase 3 is analysed concerns the level and type of resistance that is possible and present in certain contexts. One concept that emerged in the early 2000s is Steve Mann’s ‘sousveillance’ (Mann 2004), a mode of monitoring in which citizens watch governing bodies from below, as an opposing concept to surveillance—watching over or from above. Due to the increased availability of camera equipment and other recording devices, Mann argues that one form of resistance is (and should be) to ‘watch back’ at those who watch us. Where Mann uses the example of wearable cameras to watch back at CCTV cameras, the idea of sousveillance could also be transposed to the digital or virtual context, where citizen journalism and publishing (leaked) information about surveillance actors can be seen as forms of watching back. Also, in analysing processes of social sorting and exclusion, it can prove insightful to look into forms of resistance and sousveillance, to investigate whether and how individuals and groups resist current forms of surveillance, dataveillance and surveillance capitalism. As end-users of social media, as feeders of ‘big data’ or as ‘implicated actors’ (Clarke and Montini 1993) of CCTV or WiFi tracking, people still have some room for negotiation and resistance—for ‘anti-programs’ (Latour 1999) in use. Examples are citizens who choose to avoid CCTV cameras in cities (Brands and Schwanen 2014), wear anti-drone hoodies,12 install free software (like ‘Detekt’)13 that detects spyware on their computers or feed faulty data into algorithms.14

Although surveillance studies sometimes offers such practical cases and techniques to dodge surveillance (for example Roessler 2002), conceptualisations of resistance or ‘pushing back’ are as yet quite scarce. This might be connected to two asymmetry problems: first, an asymmetry of power, since we rarely get to choose whether or how we are monitored, what happens to information about us and what happens to us because of this information; and second, an asymmetry of knowledge, since we are often not (fully) aware of the monitoring and how it works at all (Brunton and Nissenbaum 2013). These two asymmetries can reinforce our lack of resistance: how can we resist something that we do not understand or know about, and can often hardly influence?

A notable exception is the concept of ‘obfuscation’, made particularly prominent by Brunton and Nissenbaum (2013, 2015). They provide tools and a rationale for evasion, noncompliance, refusal and even sabotage, particularly aimed at average users who are not in a position to ‘opt out’ or exert control over their data, but also offering insights to software developers (to keep their user data safe) and policy makers (to gather data without misusing it). They invoke the notion of ‘informational self-defence’ in order to define obfuscation, which they see as a method of informational resistance, disobedience, protest or covert sabotage that compensates for the absence of other protection mechanisms and aids the weak against the strong (Brunton and Nissenbaum 2013). Obfuscation in its broadest form thus ‘offers a strategy for mitigating the impact of the cycle of monitoring, aggregation, analysis and profiling, adding noise to an existing collection of data in order to make the collection more ambiguous, confusing, harder to use and, therefore, less valuable’ (Brunton and Nissenbaum 2013, 169). They offer wide-ranging examples, from ‘quote stuffing’ in high-frequency trading to the swapping of supermarket loyalty cards. Many obfuscation tools, such as Tor and proxy servers, are however still not widely known or deployed outside the relatively small circles of the privacy-aware and the technologically savvy; moreover, they come with transaction costs (e.g. Tor can be slow and is blocked by many large websites; Brunton and Nissenbaum 2013, 168). They also discuss certain ethical and political scruples (Mercer 2010; Pham et al. 2010; Brunton and Nissenbaum 2013) to indicate that obfuscation is not a panacea for the downsides of surveillance. Nevertheless, Brunton and Nissenbaum see obfuscation as both a personal and a political tactic, and therewith offer a theoretical account of resistance that can serve as a platform for further studying legitimate and problematic aspects of surveillance and its opposition in an age of ubiquitous data capture.
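
To make the noise-adding idea behind obfuscation more concrete, the following minimal sketch (written in Python purely for illustration) mimics the logic of query-obfuscating tools such as TrackMeNot (see footnote 14): genuine records are mixed with randomly chosen decoys so that the aggregate trace becomes more ambiguous and less valuable for profiling. The function names and the decoy vocabulary are invented for this example; they are not taken from Brunton and Nissenbaum or from any actual obfuscation tool.

    import random

    # Hypothetical decoy queries; a real tool would draw these from a large,
    # regularly refreshed corpus so that decoys are hard to tell apart from
    # genuine queries.
    DECOY_QUERIES = [
        "weather tomorrow", "pasta recipe", "train timetable",
        "history of jazz", "how to fix a bicycle tyre",
    ]

    def obfuscate(genuine_queries, noise_ratio=3, rng=random):
        """Mix each genuine query with `noise_ratio` decoys and shuffle the result.

        The observer still receives the genuine queries, but embedded in noise,
        which makes the aggregate record more ambiguous and less useful for
        profiling -- the core idea of obfuscation described above.
        """
        stream = list(genuine_queries)
        stream += [rng.choice(DECOY_QUERIES)
                   for _ in range(noise_ratio * len(genuine_queries))]
        rng.shuffle(stream)
        return stream

    if __name__ == "__main__":
        print(obfuscate(["flights to berlin", "symptoms of flu"]))

Such noise does not hide any individual query; it only degrades the value of the collection as a whole, which is precisely the more modest ambition that Brunton and Nissenbaum claim for obfuscation.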

12 See https://ahprojects.com/projects/stealth-wear (accessed 04 April 2016), for instance.
13 See https://resistsurveillance.org/ (accessed 04 April 2016).
14 See https://cs.nyu.edu/trackmenot/ (accessed 04 April 2016), for instance.

5 Conclusion

In this paper, we have given an overview of key theoretical frameworks and conceptualisations in surveillance theory. This mapping of the field may assist surveillance studies as it is increasingly feeding into, and fed by, a wide range of disciplines, each of which brings its own perspective, concepts and assumptions to understanding surveillance. We think it is important that surveillance scholars—not only those positioning themselves within surveillance studies but also those contributing to other disciplines—have a common ground for discussion and further development of the field. We hope that the overview in this paper can serve as such. It is particularly relevant for newcomers to the field that there is much more to understanding surveillance than Foucault’s panopticism.

We have structured surveillance theory in three roughly chronological-thematic phases. The first two attempt to theorise surveillance by offering comprehensive theoretical frameworks, whilst the third further conceptualises surveillance, without developing alternative holistic frameworks but rather building on the insights of the first two.
The first phase, featuring Bentham and Foucault, revolves around the Panopticon and panopticism. It can be characterised as offering architectural theories of surveillance, where surveillance is largely physical and spatial in character (either in concrete, closed places such as institutional buildings or more widespread in territorially based social structures) and largely involves centralised mechanisms of watching over subjects. Panoptic structures are theorised as architectures of power; through panoptic technologies, surveillance enables the exercise of power, not only directly but also, and more importantly, through (self-)disciplining of the watched subjects.

The second phase moves away from panoptic metaphors and shifts the focus from institutions to networks, from relatively ostensible forms of discipline to relatively opaque forms of control. This phase can be characterised as offering infrastructural theories of surveillance, where surveillance is networked in character and relies primarily on digital rather than physical technologies. It involves distributed forms of watching over people, with increasing distance to the watched and often dealing with data doubles rather than physical persons. A common element in the different theoretical accounts of Deleuze, Haggerty and Ericson, and Zuboff is to critically question not only the power structures in contemporary network societies and how surveillance reinforces, or sometimes undermines, these, but also how we can conceptualise this power play beyond panoptic effects of self-disciplining.

In the third phase, we see surveillance theory building on, and sometimes combining insights from, both theoretical frameworks of the first two phases, to conceptualise surveillance through concepts or lenses such as dataveillance, access control, social sorting, peer-to-peer surveillance and resistance. With the datafication of society, surveillance combines the monitoring of physical spaces with the monitoring of digital spaces. In these hybrid surveillance spaces, we find not only government and corporate surveillance, but also self-surveillance and complex forms of watching-and-being-watched through social media and their paradigm of voluntary data sharing.

The roughly chronological structure should not be interpreted as consecutive stages. Each phase continues to the present (see Fig. 1). New comprehensive theories may emerge within the architectural or, more likely given the predominance of digital infrastructures in today’s society, infrastructural strands of surveillance theory. Moreover, the conceptualisations of the third phase add to and refine the broad theoretical frameworks of the first two phases.

So, where does surveillance theory stand now? In the past two decades, many new layers have been added to real-space surveillant assemblages, with systems such as dataveillance supplementing rather than replacing classic systems of surveillance such as CCTV. In that sense, the Panopticon remains a powerful metaphor. However, the institutions that Foucault recognised as disciplining forces have altered in shape, place, visibility and dynamics. In addition, notions of self-surveillance point to new dynamics, where watching oneself via a mediated, mobile and networked gaze still raises questions of power, discipline and control, but in potentially new ways that cannot be easily captured in classic surveillance frameworks. Thus, many contemporary theoretical approaches to surveillance revolve around de-centralised forms of surveillance, with many watching many and with various permutations of machines and humans watching machines and humans. What binds many strands together are core questions of power and control, of who watches whom in which settings for what reasons; and these questions are asked in settings of technological infrastructures and tools, where technology functions as an intermediary of power or control dynamics.

These questions of power and control are approached differently, however. Gary T. Marx distinguishes three attitudes in surveillance thinking. One view emphasises historical continuity, arguing that changes in surveillance are a matter of degree; others argue that changes in surveillance are revolutionary, making surveillance a much more predominant feature of current society. The latter outlook has two variants: a completely negative view (‘you never had it so bad’) and a more relativistic view, arguing that whilst the technologies may be revolutionary, changes in surveillance largely reflect social and cultural changes (Marx 2002).

Related to this are differences in understanding the role that surveillance and surveillance technologies play in society: some theorists use surveillance as a lens to observe, understand or criticise certain phenomena or developments, whilst others approach surveillance as an intrinsic and fundamental feature of society as a whole. Whilst the former is usually more situated in concrete practices, the latter approach is more holistic and generic, attempting to explain broad developments in society as being related to the fundamental affordances of surveillance in structuring social processes.

One pitfall, both of seeing surveillance as an all-encompassing feature of society and of using surveillance as a lens to analyse certain developments, is that theoretical accounts often talk in abstract entities (‘institutions’, ‘the government’, ‘networks’, ‘the market’). These entities are described as invisible forces exercising power over subjects. This perspective often ignores any form of situatedness, context or the specificities of surveillance technologies and practices. In that respect, the surveillant assemblage can be seen as the first recognition that surveillance needs to be analysed in context. It is important to apply insights and methodologies developed by Science and Technology Studies, including for example Actor-Network Theory, to look beyond abstract theory and the frame of inevitable power exercise over passive, docile subjects. Although the technologies used in surveillance have been discussed in earlier surveillance literature, the mediation and remediation (Bolter and Grusin 1996) that occur between technology and users are still frequently overlooked. Notable exceptions are scholars such as Dubbeld (2005), Ball and Webster (2003) and Taekke (2011), who provide useful examples of how Science and Technology Studies and media studies can help find new directions for thinking about the co-evolution of technologies, practices and values associated with surveillance in the twenty-first century.

Fig. 1 Three phases of surveillance theory

Acknowledgments Research for this paper was made possible by a grant from the Netherlands Organisation for Scientific Research (NWO), project number 453-14-004. We thank the anonymous reviewers for their helpful suggestions.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.



References

Albrechtslund, A. (2008). Online social networking as participatory surveillance. First Monday, 13(3), 3. Available at: http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2142/1949 (accessed 01 April 2016).
Albrechtslund, A., & Dubbeld, L. (2005). The plays and arts of surveillance: studying surveillance as entertainment. Policy Studies, 3, 216–221.
Allmer, T. (2012). Towards a critical theory of surveillance in informational capitalism. Frankfurt am Main: Peter Lang.
Andrejevic, M. (2007). iSpy: Surveillance and power in the interactive era. Lawrence: University Press of
Kansas.
Ball, K., & Webster, F. (2003). The intensification of surveillance: Crime, terrorism and warfare in the information era. Chicago: Pluto Press.
Bellamy Foster, J., & McChesney, R. W. (2014). Surveillance capitalism: monopoly-finance capital, the military-industrial complex, and the digital age. Monthly Review, 66(3). Available at: http://monthlyreview.org/2014/07/01/surveillance-capitalism/ (accessed 01 April 2016).
Bentham, J. (2010). The panopticon writings (Ed. M. Božovič). London: Verso Books.
Bigo, D. (2006). Security, exception, ban and surveillance. In D. Lyon (Ed.), Theorising surveillance: The panopticon and beyond (pp. 46–68). Portland: Willan Publishing.
Bogard, W. (2006). Surveillance assemblages and lines of flight. In D. Lyon (Ed.), Theorising surveillance: The panopticon and beyond (pp. 97–122). Portland: Willan Publishing.
Bolter, J. D., & Grusin, R. (1996). Remediation. Configurations, 4(3), 311–358.
Boyd, D. (2011). Dear Voyeur, meet Flâneur… Sincerely, Social Media. Surveillance & Society, 8(4), 505–
507.
Boyd, D., & Ellison, N. B. (2007). Social network sites: definition, history, and scholarship. Journal of
Computer-Mediated Communication, 13(1), 210–230.
Božovič, M. (2010). Introduction: ‘An utterly dark spot’. In M. Božovič (Ed.), The panopticon writings (pp. 1–28). London: Verso Books.
Brands, J., & Schwanen, T. (2014). Experiencing and governing safety in the night-time economy: nurturing the state of being carefree. Emotion, Space and Society, 11, 67–78.
Brunon-Ernst, A. (2013a). Introduction. In A. Brunon-Ernst (Ed.), Beyond Foucault: New perspectives on
Bentham’s panopticon (pp. 1–16). Surrey: Ashgate Publishing.
Brunon-Ernst, A. (2013b). Deconstructing panopticism into the plural panopticons. In A. Brunon-Ernst (Ed.), Beyond Foucault: New perspectives on Bentham’s panopticon (pp. 17–42). Surrey: Ashgate Publishing.
Brunon-Ernst, A., & Tusseau, G. (2013). Epilogue: the panopticon as a contemporary icon? In A. Brunon-Ernst (Ed.), Beyond Foucault: New perspectives on Bentham’s panopticon (pp. 185–200). Surrey: Ashgate Publishing.
Brunton, F., & Nissenbaum, H. (2013). Political and ethical perspectives on data obfuscation. In M.
Hildebrandt & K. De Vries (Eds.), Privacy, due process and the computational turn (pp. 164–188). New York: Routledge.
Brunton, F., & Nissenbaum, H. (2015). Obfuscation: a user’s guide for privacy and protest. Massachusetts: MIT Press.
Clarke, R. (1988). Information technology and dataveillance. Communications of the ACM, 31(5), 498–512.
Clarke, A., & Montini, T. (1993). The many faces of RU486: tales of situated knowledges and technological contestations. Science, Technology & Human Values, 18(1), 42–78.
Cohen, J.E. (2016). The Surveillance-Innovation Complex: The Irony of the Participatory Turn. In D. Barney et al. (Eds.), The Participatory Condition. Minneapolis: University of Minnesota Press. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2466708 (accessed 01 April 2016).
CPDP (2016). Panel on Surveillance capitalism: a new societal condition rising (video available at https://www.youtube.com/watch?v=0Fr7zl9NA7s; accessed 02 April 2016).
Dalibert, L. (2013). Posthumanism and somatechnologies—exploring the intimate relations between humans and technologies (PhD thesis). University of Twente.
Deleuze, G. (1992). Postscript on the societies of control. October, 59, 3–7.
Deleuze, G. (1994). Désir et plaisir. Magazine littéraire, 325, 59–65. English translation available at: http://www.artdes.monash.edu.au/globe/delfou.html (accessed 02 April 2016).
Deleuze, G. (2006). Foucault. London: Continuum.
Deleuze, G., & Foucault, M. (1972). Les Intellectuels et le Pouvoir. L’Arc, 49, 3–10.
Deleuze, G., & Guattari, F. (1987). A thousand plateaus: capitalism and schizophrenia. Minneapolis: University of Minnesota Press.
Dholakia, N., & Zwick, D. (2001). Privacy and consumer agency in the information age: between prying profilers and preening webcams. Journal of Research for Consumers, 1. Available at: http://www.jrconsumers.com/academic_articles/issue_1/DholakiaZwick.pdf (accessed 01 April 2016).
Dilts, A. & Harcourt, B. (2008). Discipline, Security, and Beyond: A Brief Introduction. Carceral Notebooks,
4, 1–6. Available at: http://www.thecarceral.org/cn4_dilts-harcourt.pdf (accessed 01 April 2016).
Dodge, M., & Kitchin, R. (2011). Code/space: software and everyday life. Massachusetts: MIT Press.
Dubbeld, L. (2005). The role of technology in shaping CCTV surveillance practices. Information, Communication & Society, 8(1), 84–100.
Elmer, G. (2003). A diagram of panoptic surveillance. New Media & Society, 5(2), 231–247.
Foucault, M. (1980). Power/knowledge: selected interviews and other writings 1972–1977 (Ed. C. Gordon).
New York: Pantheon Books.
Foucault, M. (1991a). Discipline and punish: the birth of the prison. London: Penguin.
Foucault, M. (1991b). The Foucault effect: studies in governmentality (Eds. G. Burchell, C. Gordon & P.
Miller). Chicago: University of Chicago Press.
Foucault, M. (1998). The will to knowledge: the history of sexuality (Vol. 1). London: Penguin.
Foucault, M. (2002). Power: essential works of Foucault 1954–1984, Vol. 3 (Ed. J.D. Faubion). London: Penguin Books.
Foucault, M. (2006). Psychiatric power: lectures at the collège de France 1973–1974 (Ed. J. Lagrange). New
York: Palgrave Macmillan.
Foucault, M. (2007). Security, territory, population: lectures at the collège de France 1977–1978 (Ed. M.
Senellart). New York: Picador.
Fuchs, C. (2012a). Political economy and surveillance theory. Critical Sociology, 39(5), 1–17.
Fuchs, C. (2012b). New Marxian Times! Reflections on the 4th ICTs and Society Conference ‘Critique, Democracy and Philosophy in 21st Century Information Society. Towards Critical Theories of Social Media’. tripleC, 10(1), 114–121.
Fuchs, C., & Mosco, V. (2012). Introduction: Marx is back—The importance of marxist theory and research for critical communication studies today. tripleC, 10(2), 127–140.
Galloway, A. R. (2004). Protocol: how control exists after decentralization. Massachusetts: MIT Press.
Gibson, J. J. (1977). The theory of affordances. In R. Shaw & J. Bransford (Eds.), Perceiving, acting, and knowing: Toward an ecological psychology (pp. 67–82). Lawrence Erlbaum.
Haggerty, K. (2006). Tear down the walls: on demolishing the panopticon. In D. Lyon (Ed.), Theorising surveillance: The panopticon and beyond (pp. 23–45). Portland: Willan Publishing.
Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4),
605–22.
Hier, S. P. (2002). Probing the surveillant assemblage: on the dialectics of surveillance practices as processes of social control. Surveillance & Society, 1(3), 399–411.
Kaino, M. (2008). Bentham’s concept of security in a global context: the pannomion and the public opinion tribunal as a universal plan. Journal of Bentham Studies, 10, 1–29.
Koops, B.-J. (2010). Law, technology, and shifting power relations. Berkeley Technology Law Journal, 25(2),
973–1036.
Koskela, H. (2004). Webcams, TV shows and mobile phones: empowering exhibitionism. Surveillance and
Society, 2(2/3), 199–215.
Koskela, H. (2011). ‘Don’t mess with Texas!’ Texas virtual border watch program and the (botched) politics of responsibilization. Crime Media Culture, 7(1), 49–65.
Latour, B. (1999). On recalling ANT. The Sociological Review, 47(1), 15–25.
Law, J. (1992). Notes on the theory of the actor-network: ordering, strategy and heterogeneity. Systems
Practice, 5, 379–93.
Leroy, M.-L. (2002). Le panoptique inversé: Théorie du contrôle dans la pensée de Jeremy Bentham. In C.
Lazzeri (Ed.), La production des institution (pp. 155–177). Besançon: Presses Universitaires Franc- Comtoises.
Lyon, D. (2001). Surveillance society: monitoring everyday life. Buckingham: Open University Press.
Lyon, D. (Ed.). (2003). Surveillance as social sorting: privacy, risk, and digital discrimination. London: Routledge.
Lyon, D. (2006). The search for surveillance theories. In D. Lyon (Ed.), Theorising surveillance: The panopticon and beyond (pp. 3–20). Portland: Willan Publishing.
Lyon, D. (2007). Surveillance studies: an overview. Cambridge: Polity.
Lyon, D. (2008). An electronic panopticon? A sociological critique of surveillance theory. The Sociological
Review, 41(4), 653–678.
Mann, S. (2004). ‘Sousveillance’: inverse surveillance in multimedia imaging. Computer Engineering, 620–627. Available at: http://delivery.acm.org/10.1145/1030000/1027673/p620-mann.pdf?ip=137.56.133.86&id=1027673&acc=ACTIVE%20SERVICE&key=0C390721DC3021FF.8E8A7FC83EB1C6A0.4D4702B0C3E38B35.4D4702B0C3E38B35&CFID=596609361&CFTOKEN=89220847&acm=1459527091_55c715c379b235850cfc211fe4d354da (accessed 01 April 2016).
Marwick, A. (2012). The public domain: surveillance in everyday life. Surveillance & Society, 9(4), 378–393.
Marx, G. T. (2002). What’s new about the ‘new surveillance’? Classifying for change and continuity. Surveillance & Society, 1(1), 9–29.
Mercer, D. (2010). CDC uses shopper-card data to trace salmonella. Bloomberg Business Week, 10 March.
Murakami Wood, D. (2007). Beyond the panopticon? Foucault and surveillance studies. In J. Crampton & S. Elden (Eds.), Space, knowledge and power: Foucault and geography (pp. 245–263). Aldershot: Ashgate.
Murakami Wood, D. (2013). What is global surveillance? Towards a relational political economy of the global surveillant assemblage. Geoforum, 49, 317–326.
Pham, N., Ganti, R. K., Uddin, Y. S., Nath, S. & Abdelzaher, T. (2010). Privacy-preserving reconstruction of multidimensional data maps in vehicular participatory sensing. WSN 2010: 7th European Conference on Wireless Sensor Networks.
Price, D. H. (2014). The new surveillance normal: NSA and corporate surveillance in the age of global capitalism. Monthly Review, 66(3). Available at: http://monthlyreview.org/2014/07/01/the-new-surveillance-normal/ (accessed 04 April 2016).
Roessler, M. (2002). How to find hidden cameras. Available at: http://www.tentacle.franken.de/papers/hiddencams.pdf (accessed 02 April 2016).
Romein, E., & Schuilenburg, M. (2008). Are you on the fast track? The rise of surveillant assemblages in a post-industrial age. Architectural Theory Review, 13(3), 337–348.
Schofield, P. (2009). Bentham: a guide for the perplexed. London: Continuum.
Semple, J. (1987). Bentham’s haunted house. The Bentham Newsletter, 11, 35–44.
Sunstein, C. R., & Thaler, R. H. (2009). Nudge: improving decisions about health, wealth, and happiness.
London: Penguin.
Taekke, J. (2011). Digital panopticism and organizational power. Surveillance & Society, 8(4), 441–454.
Timan, T., & Oudshoorn, N. (2012). Mobile cameras as new technologies of surveillance? How citizens experience the use of mobile cameras in public nightscapes. Surveillance & Society, 10(2), 167–181.
Valverde, M. (2008). Police, sovereignty, and law: Foucaultian reflections. In M. Dubber & M. Valverde (Eds.), Police and the liberal state (pp. 15–32). Stanford: Stanford Law Books.
van der Ploeg, I. (2003). Biometrics and the body as information: normative issues of the socio-technical coding of the body. In D. Lyon (Ed.), Surveillance as social sorting: Privacy, risk and digital discrim- ination (pp. 57–73). London: Routledge.
Varian, H. R. (2014). Beyond big data. Business Economics, 49(1), 27–31.
Whitaker, R. (1999). The end of privacy: how total surveillance is becoming a reality. New York: The New
Press.
Zuboff, S. (2015). Big other: surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30, 75–89.
Zuboff, S. (2016). The secrets of surveillance capitalism. Frankfurter Allgemeine, Feuilleton. Available at: http://www.faz.net/aktuell/feuilleton/debatten/the-digital-debate/shoshana-zuboff-secrets-of-surveillance-capitalism-14103616.html (accessed 01 April 2016).