Robotic and automated systems have the potential to remove humans from dangerous situations, but their intended use as aids to or replacements for human officers conducting border patrols raises ethical concerns unless regulated to ensure that this use “promot[es] the safety of the officer/agent and the public” (emphasis added). U.S. Customs and Border Protection (CBP) should update its use-of-force policy to cover robotic and other autonomous systems in CBP-specific applications, which differ from the military applications assumed in existing regulations. The most relevant existing regulation, Department of Defense Directive 3000.09, governs how semi-autonomous weapons may be used to engage enemy combatants in the context of war. This use case is quite different from mobile sentry duty, which may include interactions with civilians (whether U.S. citizens or migrants). With robotic and automated systems about to come into regular use at CBP, the agency should proactively issue regulations to forestall adverse effects—specifically, by permitting use of these systems only in ways that presume all encountered humans to be non-combatants.
Challenge and Opportunity
CBP is currently developing mobile sentry devices as a new technology to force-multiply its presence at the border. Mobile sentry devices, such as legged and flying robots, have the potential to reduce deaths at the border by making it easier to locate and provide aid to migrants in distress. According to an American Civil Liberties Union (ACLU) report, 22% of migrant deaths between 2010 and 2021 that involved an on-duty CBP agent or officer were caused by medical distress that began before the agent or officer arrived on the scene. However, the eventual use cases, rules of engagement, and functionalities of these robots are unclear. If not properly regulated, mobile sentry devices could also be used to harm or threaten people at the border—thereby contributing to the 44% of deaths that occurred as a direct result of vehicular or foot pursuit by a CBP agent. Regulations on mobile sentry device use—rather than merely acquisition—are needed because even originally unarmed devices can be weaponized after purchase, and devices that remain unarmed can still harm civilians with a limb or propeller.
Existing Department of Homeland Security (DHS) regulations governing autonomous systems seek to minimize technological bias in artificially intelligent risk-assessment systems. Existing military regulations seek to minimize risks of misused or misunderstood capabilities for autonomous systems. However, no existing federal regulations govern how uncrewed vehicles, whether remotely controlled or autonomous, can be used by CBP. The answer is not as simple as extending military regulations to the CBP. Military regulations governing autonomous systems assume that the robots in question are armed and interacting with enemy combatants. This assumption does not apply to most, if not all, possible CBP use cases.
With CBP already testing robotic dogs for deployment on the Southwestern border, the need for tailored regulation is pressing. Recent backlash over the New York Police Department's testing of similar autonomous systems makes this topic even more timely. While the robots used by CBP are currently unarmed, the same company that developed them is working with another company to mount weapons on them. The rapid innovation and manufacturing of these systems requires policies governing their use to be implemented before CBP has fully incorporated such systems into its workflows, and before the companies that build these systems have formed a powerful enough lobby to resist appropriate oversight.
Plan of Action
CBP should immediately update its Use of Force policy to include restrictions on use of force by mobile sentry devices. Specifically, CBP should add a chapter to the policy with the following language:
- A “Mobile Sentry Device” should be defined as any remotely controlled, autonomous, or semi-autonomous mobile technology used for surveillance. Examples of Mobile Sentry Devices include self-driving cars, legged robots, and quadcopter drones.
- No amount of force may be determined “reasonable” if administered by a Mobile Sentry Device, whether the Device is (i) completely controlled by an agent or officer, or (ii) operating in an autonomous or semi-autonomous mode.
- No Mobile Sentry Device may be authorized to administer Lethal Force, Less-Lethal Force, or any type of force applied directly by contact with the Device (i.e., contact equivalent to an “Empty Hand” technique). For example, a legged robot may not be used to discharge a firearm, disperse Oleoresin Capsicum spray (pepper spray), or strike a human with a limb.
- A Mobile Sentry Device may not be used as a Vehicular Immobilization Device (or used to deploy such a device), whether the Mobile Sentry Device is (i) completely controlled by an agent or officer, or (ii) operating in an autonomous or semi-autonomous mode.
- When powered on, Mobile Sentry Devices must maintain a distance of at least two feet from any humans not authorized to operate the Device. The Device and its operator are responsible for maintaining this distance.
- Mobile Sentry Devices may not be used to detain or perform arrests, nor to threaten or intimidate with the implicit threat of detainment or arrest.
- A Mobile Sentry Device may be used to administer humanitarian aid or provide a two-way visual or auditory connection to a CBP officer or agent.
- When approaching people to offer humanitarian aid, the Device must use de-escalation techniques to indicate that it is not a threat. These techniques will necessarily vary based on the specific technology. Some examples might include a flying device landing and immediately unfolding a screen playing a non-threatening video, or a legged device sitting with its legs underneath it and cycling through non-threatening audio recordings in multiple languages.
- When used for humanitarian purposes, the Device may not touch its human target(s) or request them to touch it. To transfer an item (such as food, water, or emergency medical supplies) to the target(s), the Device must drop the package with the items while maintaining at least two feet of distance from the closest person.
- When used to provide a two-way visual or auditory connection with a CBP officer or agent, the Device must indicate that such a connection is about to be formed and indicate when the connection is broken. For example, the Device could use an audio clip of a ringing phone to signal that a two-way audio connection to a CBP officer is about to commence.
These regulations should go into effect before Mobile Sentry Devices are moved from the testing phase to the deployment phase. Related new technology, whether it increases capabilities for surveillance or autonomous mobility, should undergo review by a committee that includes representatives from the National Use of Force Review Board, migrant rights groups, and citizens living along the border. This review should mirror the process laid out in the Community Control over Police Surveillance project, which has already been successfully implemented in multiple cities.
U.S. Customs and Border Protection (CBP) is developing an application for legged robots as mobile sentry devices at the Southwestern border. However, the use cases, functionality, and rules of engagement for these robots remain unclear. New regulations are needed to forestall adverse effects of autonomous robots used by the federal government for non-military applications, such as those envisioned by CBP. These regulations should specify that mobile sentry devices can only be used as humanitarian aids, and must use de-escalation methods to indicate that they are not threatening. Regulations should further mandate that mobile sentry devices maintain clear distance from human targets, that use of force by mobile sentry devices is never considered “reasonable,” and that mobile sentry devices may never be used to pursue, detain, or arrest humans. Such regulations will help ensure that the legged robots currently being tested as mobile sentry devices by CBP—as well as any future mobile sentry devices—are used ethically and in line with CBP’s goals, alleviating concerns for migrant advocates and citizens along the border.
Frequently Asked Questions
Couldn’t this issue be addressed by regulating acquisition rather than use?
Regulations on purchasing are not sufficient to prevent mobile sentry device technology from being weaponized after it is purchased. However, DHS could also consider updating its acquisition regulations to include clauses imposing fines when mobile sentry devices acquired by CBP are not used for humanitarian purposes.
How do existing military regulations for autonomous weapons fall short for CBP?
DOD Directive 3000.09 regulates the use of autonomous weapons systems in the context of war. For an autonomous, semi-autonomous, or remotely controlled system deployed as a weapon on an active battlefield, this regulation makes sense. But the applications of robotic and automated systems currently being developed by DHS are oriented toward mobile sentry duty along stretches of American land where civilians are likely to be found. This sentry duty is likely to be performed by uncrewed ground robots following GPS breadcrumb trails on predetermined patrol routes along the border. Under Directive 3000.09, using a robot to kill or harm a person during a routine border patrol would not be a violation as long as a human had “meaningful control” over the robot at the time. The upshot is that mobile sentry devices used by CBP should be subject to stricter regulations.
Don’t manufacturers already prohibit weaponizing these robots?
Most companies selling legged robots in the United States have explicit end-user policies prohibiting the use of their machines to harm or intimidate humans or animals, and some companies selling quadcopter drones have similar policies. But these policies lack any enforcement mechanism. As such, there is a regulatory gap that the federal government must fill.
Is updating CBP’s Use of Force policy the only way to regulate mobile sentry devices?
No, but it is an immediately actionable strategy. An alternative—albeit more time-consuming—option would be for CBP to form a committee comprising representatives from the National Use of Force Review Board, the military, migrant-rights activist groups, and experts on ethics to develop a directive for CBP’s use of mobile sentry devices. This directive should be modeled after DOD Directive 3000.09, which regulates the use of lethal autonomous weapons systems by the military. As the autonomous systems governed by DOD Directive 3000.09 are assumed to be interacting with enemy combatants, while CBP’s jurisdiction consists mostly of civilians, the CBP directive should be considerably more stringent than Directive 3000.09.
Will these policies remain relevant as the technology evolves?
The policies proposed in this memo govern what mobile sentry devices are and are not permitted to do, regardless of the extent of human involvement in device operation or the degree of autonomy possessed by the technology in question. The policies could therefore be applied consistently as the technology continues to be developed. AI is always changing and improving; by creating tech-agnostic policies, CBP can avoid having to update regulations each time mobile sentry device technology evolves.