By Jack Whitacre, a McCormack Graduate School student
Tomorrow’s disaster responses may well incorporate drones and other unmanned vehicles that deliver needed food, water, and medicine, send back images for first responders to process in real time, and even communicate in whatever languages are needed. One can argue that militaries and other organizations cannot respond to these disasters without proving that their practices are responsible and humane. Debates on morality don’t always keep pace with human-machine interaction, drone delivery, and mixed-reality portals, yet discussing value systems and what’s right helps aid recipients, providers, private companies, and governments alike. Inventing a new application for technology showed me that humanitarian “solutions” can bring their own problems and that dialogue is essential to trust.

This autumn has brought a giant convergence on virtual reality (VR), with companies like Microsoft attempting to insert VR goggles into industrial applications and turn every personal computer into a VR portal. Oftentimes the best ideas come to those who lean in. At MIT and the U.S. Navy’s Hack the Machine event, a speaker invited audience members who wanted to try a HoloLens, a mixed-reality wearable headset, to the front of the room. While the speaker’s assistant prepared the equipment for demonstration, an image of the room (including my restless legs) was projected onto a widescreen. That file was saved and stored. Without realizing it, I had participated in a data-collection event that would change my perception of what is normal and acceptable in a digital society.
In the past I had studied how disaster response can involve large crowds, long hours, and concerns about people “double dipping” in aid supplies. Humans also experience bias that limits our ability to recognize people of other ages and races[1]. With the experience of the HoloLens in the back of my mind, I had a sudden idea: what if aid distributors could wear a HoloLens that could distinguish and identify repeat visitors’ faces? It turns out that the HoloLens already has a “detection capture” capability that makes this process of recognizing, tagging, and retrieving faces easy. My intention in building upon this new technology was to empower groups like the U.S. Navy to deliver life-saving supplies quickly and equitably to coastal communities after superstorms and other disasters. However, as I subsequently learned, the situation on the ground is often more complex.
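To make the idea concrete, the sketch below shows one way such a system might flag repeat visitors. It is a minimal illustration, not a real HoloLens API: the `embed_face` stand-in, the cosine-distance matching, and the 0.6 threshold are all assumptions about what a headset’s recognition model might expose.

```python
# Hypothetical sketch: flagging repeat visitors at an aid pick-up point by
# comparing face embeddings. Assumes some upstream model (stood in for by
# the caller) converts a camera frame into a numeric embedding vector.
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed cutoff; a real system would need field calibration


def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Distance between two face embeddings (0 = identical direction)."""
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


class VisitorLog:
    """Stores embeddings of faces seen so far and flags repeats."""

    def __init__(self) -> None:
        self.seen: list[np.ndarray] = []

    def is_repeat_visitor(self, embedding: np.ndarray) -> bool:
        """Return True if this face matches one already logged; otherwise log it."""
        for known in self.seen:
            if cosine_distance(embedding, known) < MATCH_THRESHOLD:
                return True
        self.seen.append(embedding)
        return False


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    log = VisitorLog()
    face = rng.normal(size=128)  # placeholder for a model-produced embedding
    print(log.is_repeat_visitor(face))  # False: first visit is logged
    print(log.is_repeat_visitor(face))  # True: same face seen again
```

Even this toy version makes the ethical stakes visible: the `seen` list is a growing store of biometric data, and whoever holds it decides how long faces are retained and who may query them.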
I soon discovered that innovations are often operationalized without empirical evidence of their effectiveness or conversation with the recipients of security, namely the people. For example, in a piece called “The Banality of Security” (2013), the authors describe how London’s government rolled out city-wide closed-circuit television (CCTV) without empirical evidence that it reduced crime. The absence of conclusive studies about the effectiveness of CCTV has made it difficult for civil rights activists to question a widespread phenomenon that took away people’s ability to participate or give consent. Another scholar, Mark Duffield, showed in “The Digital Development-Security Nexus” (2016) that every technology has dual uses and can make people more or less powerful in relation to one another. For example, an app that helps refugees find sources of water can also be used by governments to track those refugees’ movements. This demonstrates the danger that arises when a technology originally designed to protect people is used by groups who treat the state, rather than the people, as the recipient of security. Without evidence or consent, states may forget about the people such technologies are designed to protect.
Of course, some people may object to facial recognition software or to its deployment in a humanitarian context. It is possible, for example, that some aid recipients would stay away from pick-up points if they thought states or militaries were gathering data. Beyond that, while theft may happen already, people who avoid the pick-up points might instead steal supplies from others; meanwhile, law-abiding people would be unable to return for a legitimate second visit. However well-intentioned, every new technology brings its own ethical problems, including questions about the right to our bodies and the digital information that represents them.
If communities, government institutions, and companies collaborate in deploying and refining technology in trust-building ways, we might be able to improve deliveries, product development, data management, and mutual understanding. We could also ensure that dual-use technologies are aimed at the unified goal of advancing people’s freedom from fear and want. Having my body scanned by a HoloLens helped me think of new humanitarian applications for technology while showing me their second-order problems. A system of ethics that keeps human connections human will give militaries and other disaster response groups the legitimacy to save lives now, work in tandem with machines, and be welcomed back in the future.
[1] One study found that people correctly identify others 65% of the time when they are of the same race, compared to 40% of the time when they are of a different race. Behrman, Bruce W., and Sherrie L. Davey. “Eyewitness identification in actual criminal cases: An archival analysis.” Law and Human Behavior 25 (5): 475–491.
Jack Whitacre is a National Science Foundation fellow at the University of Massachusetts Boston, where he is earning a PhD in global governance and human security at the McCormack Graduate School of Policy and Global Studies. He studies coasts and coastal communities and hopes to improve the economy and environment through his research and writing.