Pigeon Drones vs Falcon Drones

The Paul Allen-backed Zipline Corporation has, since 2016, operated a commercial drone service in Rwanda that reportedly transports 20% of the blood needed for emergency transfusions in that country.

Zipline is now working to expand its services to as many African countries as possible. Tanzania and Ghana are next in line to patronise the company’s services.

At first glance, a transport drone service shouldn’t raise any major privacy issues. Zipline’s drone-enabled life-saving missions, their uplifting character notwithstanding, are not far removed, conceptually or operationally, from the several widely discussed models of drone-enabled commerce around the world.

Amazon, Google, 7-Eleven, Walmart and many other major corporations around the world, particularly in the US, have all launched major R&D efforts to make package delivery by drones a ubiquitous and mundane reality by the close of this decade.

The implications of this trend for supply chain management, precision transportation, routing optimisation and various other major operational functions of modern business are said to be legion. Zipline, for instance, claims to have reduced delivery cycle time on blood transport trips from 4 hours to 15 minutes. In fact, one of the earliest backers of the Zipline effort in Rwanda was the UPS Foundation, the charitable arm of the American logistics giant. Government agencies, humanitarian relief organisations and civil networks are all expected to get into the game.

Unlike surveillance and anti-personnel drones in the service of state security and military entities, civilian applications in the transport, commerce and logistical domains are rarely scrutinised for their privacy impacts.

It is perhaps not too surprising, then, that despite a thorough search I have been unable to find any evidence that Zipline’s operations in Rwanda or elsewhere are governed by a definite privacy policy. But there is no point in singling out Zipline: no drone-commerce initiative that has received attention of late has a publicly accessible privacy policy, and there is virtually no serious scholarly work on the subject.

The reason is obviously that unless “surveillance” or other privacy-touching processes can be observed in the form to which our analytical tools have grown accustomed, concerns about privacy rarely surface. This is the “form-factor fixation” dimension of privacy appreciation and analysis.

And, yet, Zipline’s service is dripping with all manner of privacy ramifications, which will only grow as its operations grow in scale and sophistication, and particularly if it is to be truly successful in contributing substantially to health outcomes.

To be able to deliver the right types of blood to the right patients in the right locations, whether on a routine or emergency basis, information about the patient must be entered into Zipline’s electronic ordering, routing, and coordination platforms. To optimise results tracking, minimise errors, and enhance accountability for quality of service, delivery performance data needs to be linked to individual patient outcomes. At any rate, certain portions of a patient’s medical records would be critical in any emergency intervention.
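To make the tension concrete, here is a minimal sketch, entirely hypothetical and not based on Zipline’s actual systems, of one way delivery records could be linked to patient outcomes without the logistics platform ever holding raw identifiers: a keyed pseudonym derived inside the health system. The key name, identifier format, and record fields below are all illustrative assumptions.

```python
import hmac
import hashlib

# Assumption: this key never leaves the health facility; the drone
# operator's platform sees only the derived token, not the identifier.
SECRET_KEY = b"held-by-the-health-facility-only"

def pseudonym(patient_id: str) -> str:
    """Derive a stable, non-reversible token for a patient identifier."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

# What the logistics platform would store (illustrative fields):
delivery_record = {
    "order": "O-1042",
    "blood_type": "O-",
    "patient_token": pseudonym("MRN-889231"),
}

# The health facility, holding the key, can re-derive the same token
# and join delivery performance back to the patient's outcome.
print(delivery_record["patient_token"] == pseudonym("MRN-889231"))
```

The point of the sketch is that accountability and privacy need not be traded off wholesale: linkage can be designed so that only the party already entitled to the medical record can complete the join.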

How is this different from a normal ambulance or other legacy transport system? In a word: remote operation. Remote operation alters the personnel dimension in ways that can upset standard privacy arrangements. The persons operating the drone need not be actual health responders; they need not work from a controlled health facility; and they are not subject to the various safeguards that today are tied to the specific physical coordinates of health infrastructure units. To the extent that Zipline is a private company whose technicians and contractors operate outside the legacy health system, and, more importantly, because of the remote capabilities these contractors wield, designing information-intensive processes to enhance quality control and accountability also risks exposing sensitive personal health information to unauthorised access.

A major point to bear in mind is that “form-factor fixation” often leads to a compartmentalisation of new technologies that has no true grounding in reality. Whether a drone is a “surveillance platform” or a “transport vehicle” is purely a question of mode of behaviour, not of fixed function, and the answer to that question can change from context to context for the same system.

Hence, a drone transporting sensitive medical products into contexts of heightened vulnerability for individuals or groups, regardless of its nominal designation, metamorphoses into a creature far more concerning than a drone designed to drop crates of soda into suburban courtyards.

Privacy policy design for novel and emerging services, whether for drone deliveries or 3D printing of biotech products, should always proceed with an emphasis on behavioural dynamics, not structural form.
