Wednesday, 19 July 2017

Work, Technology, The City, Violence


Dr John Crossan, University of Strathclyde, International Public Policy Institute (IPPI)


The bogeyman of automation thesis is once again the topic of conversation at dinner parties, cafes and bars across the developed world. In part, this latest wave of concern about the power of machines stems from numerous newspaper articles, blogs and discussion papers on the emerging 4th Industrial Revolution. Building on the earlier Digital Revolution, it represents new ways in which technology becomes ever more embedded within societies. The 4th Industrial Revolution is marked by breakthroughs in a number of fields, including robotics, artificial intelligence and autonomous vehicles. It is essentially a coming together of operational technologies (i.e. hardware and software that detects or causes a change through the direct monitoring and/or control of physical devices) and information technologies. The scale of this integration is captured in the term ‘internet of things’.
 
A key concern about the application of these technologies centres upon their impact on the labour force. The argument here is that technological advances of this kind will result in higher unemployment, workforce deskilling and subsequent social upheaval. Adding weight to the machine-human substitution position, The Boston Consulting Group, in a recent report, estimates (p8) that, for Germany, by 2025, “A greater use of robotics and computerization will reduce the number of jobs in assembly and production by approximately 610,000”. In addition, the report predicts that routine cognitive work will also be affected, with more than 20,000 jobs in production planning to be lost (ibid).
 
Another concern about the application of these technologies relates to the exponential growth of the security industry post 9/11. Using the term ‘securitization’, Minas Samatas (2011: 3348) in this journal refers to a “burgeoning industrial complex, encompassing security, surveillance [and] military technology [that] develops and promotes a global security market of new militarized monitoring technologies for civilian applications”. Again in this journal, Volker Eick (2011: 3330), in his study of policing tactics used at the 2006 FIFA World Cup, writes about the deployment of surveillance technologies that include “airborne warning and control system planes (AWACS), security robots, [and] closed circuit television surveillance (CCTV)”. Such technologies are in the main developed by the military, and their ever-increasing use in cities points towards a worrying blurring of the boundaries between war and police (Wall 2013), civilian and enemy, city and battleground.
 
A recent paper by Ian Shaw (2017) makes explicit a connection between the dual concerns of machine-human substitution and a technologically advanced security industry. With an emphasis on the use of drones to police disenfranchised urbanites, Shaw’s argument is a compelling and frightening one: “As more and more jobs are replaced by nonhuman capital, the expelled workers find themselves policed, occupied, and watched by an equally robotic security armada” (Shaw 2017: 22). In support of the first stage of this dystopian narrative (i.e. machine-human substitution), Shaw quotes David Harvey[1]: “Robots do not . . . complain, answer back, sue, get sick, go slow, lose concentration, go on strike, demand more wages, worry about conditions, want tea breaks or simply refuse to show up” (Harvey 2014: 103). Harvey’s argument, while also compelling, is flawed, although this flaw does not detract from a core message in Shaw’s work that warns of a type of terror only the state-capitalist nexus could produce and sustain.
 
Why human labour prevails despite technological disruption
 
Harvey is mistaken. Robots, metaphorically speaking, do get sick and go slow. This is because with increased complexity comes increased system vulnerability (Pfeiffer 2016). In other words, “the more we depend on technology and push it to its limits, the more we need highly-skilled, well-trained, well-practiced people […] acting as the last line of defense against the failures that will inevitably occur” (Baxter et al 2012: 65). Furthermore, as David H Autor (2015) argues, humans will always have a comparative advantage over computers when it comes to cognitive tasks that require creative thinking, good judgement and, increasingly, social intelligence. Such tasks, according to Sara de la Rica (in Dolphin (ed), 2017: 91), can be complementary to computers “and hence [she argues] computerization is likely to increase the demand for people with these skills”. In the short term, there is evidence that computerization and automation will see routine jobs lost (Coyle in Dolphin (ed) 2017). In the medium to long term, predictions about those ‘damn robots coming here and stealing our jobs’ will continue to prove inaccurate.
 
Human labour prevails, despite technological disruption, because of our ability to adopt and develop new skills via education (Goldin and Katz, 2008[2]) – although during periods of technological disruption, without the political will to push for equality of opportunity in education, there will always be winners and losers. I witnessed a fine example of skills education recently during a visit to a factory that produces high-tech goods. There I met young people at various stages of a modern apprenticeship programme. De la Rica’s comments about humans and machines complementing one another were evidenced when one apprentice showed me a series of electronic chips he had made by hand. When expensive high-tech products break down, customers send them back to the manufacturer. Predicting faults in complex machinery is not easy. As such, a bespoke, hands-on approach is needed to fix the problem when it occurs, hence the apprentice learning to make computer chips by hand.
 
I was struck by the level of craft that went into the production of these small high-tech products. I was also struck by the professionalism of the young apprentice, whose knowledge of the technology was matched by his excellent interactional skills. This apprenticeship programme was obviously of a very high standard. To the programme’s further credit, the young apprentices I met came to their jobs through a relationship between the company and the local high school. This model of a demand-driven apprenticeship, emerging out of collaboration between local businesses and high schools, is regarded by many skills policy commentators as critical to ensuring we have a future digital-ready workforce. The company has operations in cities across the UK, including Edinburgh, London, Southampton and Luton. Its educational outreach programme engages several thousand school children and college students from communities within these cities and others, “to ensure the skills and knowledge which are vital to the UK are retained” (Company Website). The company’s commitment to the local urban communities within which it operates is commendable – a good example of corporate citizenship at work.
 
The company’s name is Leonardo in the UK, part of Leonardo-Finmeccanica which, according to Campaign Against Arms Trade (CAAT), is the world’s ninth largest arms company. In addition to military helicopters, fighter aircraft, missiles, artillery and armed combat vehicles, the company makes the FALCO EVO UAV – a drone. Leonardo-Finmeccanica has sold its products to, amongst others, Algeria, Libya and Turkey. The international press reported in December 2011 that 35 villagers had been misidentified by Turkish drones and bombed, killing at least 30, 17 of whom were children. Of the same incident, the Guardian (29/12/2011) reported that those killed were not Kurdish separatist fighters but smugglers of diesel and cigarettes across the Iraqi border. This is one of an increasing number of examples of what has been termed ‘collateral damage’.
 
Shaw, describing the ascendance of dronified policing, quotes Neocleous[3] (2014: 162), who writes: “This is nothing less than a permanent police presence of the reproduction of order – air power as the everywhere police – in which the exercise of violence is an ever-present possibility”. I would like to put forward a less spectacular narrative, one which grounds the ever-present exercise of violence in a softer but no less effective form of everywhere police that hides terror behind the common aspirations of people looking for a good job so that they might pay the bills, go on holiday somewhere warm each year and live a decent life.
 
The factory I visited was in Edinburgh. The high school in this case is within twenty minutes’ walking distance. The locations, and the relationship between the school and the factory, give a banal, everyday urban form to the exercise of violence. The young apprentices wake up each working morning. I imagine they may stop by the local café for a coffee as they make their way down Pennywell Road to their place of work, where they will apply their talents and learn new skills. These young people are the lucky ones. They do not live in Libya, Algeria or the Turkish-Iraqi borderlands. Unlike others who went to the same school in this working class neighbourhood, their training at the factory makes it far less likely that they will ever be part of the surplus population “policed, occupied, and watched” [by a] “robotic security armada” (Shaw 2017: 22). Notwithstanding a few minor issues with forgotten passwords or low mobile phone batteries, technology will enhance their lives.
 
I am not a pacifist. Violent engagement can be a legitimate response to oppression and, while I have issues on both counts, there is nothing new about the military complex operating in the public sphere as a major employer or a symbol of national pride. But the military complex nowadays seems more pervasive, coupled as it is with a new security economy that has its eyes firmly fixed on the urban world. War, security and violence now pervade our everyday lives. Giorgio Agamben argued that the imperative of ‘security’ now ‘imposes itself on the basic principle of state activity’, and this now includes skills development. This discourse of military urbanism tells us that parts of the city need to be protected – e.g. strategic financial districts, respectable neighbourhoods, tourist spaces. Equally, parts of the city must be subject to pre-emptive action – e.g. BME neighbourhoods and political protests. The bogeyman of automation thesis takes on a different, more sinister hue here. We are developing the technologies and the skilled people necessary to ensure that those of us in the right parts of the city need not fear the disenfranchised others.


 



[1] Harvey, D. (2014) Seventeen Contradictions and the End of Capitalism. London: Profile Books.

[2] Goldin, C. and Katz, L.F. (2008) The Race between Education and Technology. Cambridge: Harvard University Press.

[3] Neocleous, M. (2014) War Power, Police Power. Edinburgh: Edinburgh University Press.
 
