Regulating Use of Mobile Sentry Devices by U.S. Customs and Border Protection

Summary

Robotic and automated systems have the potential to remove humans from dangerous situations, but their intended use as aids to or replacements for human officers conducting border patrols raises ethical concerns if not regulated to ensure that such use “promot[es] the safety of the officer/agent and the public” (emphasis added). U.S. Customs and Border Protection (CBP) should update its use-of-force policy to cover robotic and other autonomous systems in CBP-specific applications, which differ from the military applications assumed in existing regulations. The most relevant existing regulation, Department of Defense Directive 3000.09, governs how autonomous and semi-autonomous weapon systems may be used to engage enemy combatants in the context of war. That use case is quite different from mobile sentry duty, which may include interactions with civilians (whether U.S. citizens or migrants). With robotic and automated systems about to come into regular use at CBP, the agency should proactively issue regulations to forestall adverse effects—specifically, by permitting use of these systems only in ways that presume all encountered humans to be non-combatants. 

Challenge and Opportunity

CBP is currently developing mobile sentry devices as a new technology to force-multiply its presence at the border. Mobile sentry devices, such as legged and flying robots, have the potential to reduce deaths at the border by making it easier to locate and provide aid to migrants in distress. According to an American Civil Liberties Union (ACLU) report, 22% of migrant deaths between 2010 and 2021 that involved an on-duty CBP agent or officer were caused by medical distress that began before the agent or officer arrived on the scene. However, the eventual use cases, rules of engagement, and functionalities of these robots remain unclear. If not properly regulated, mobile sentry devices could also be used to harm or threaten people at the border—thereby contributing to the 44% of such deaths that occurred as a direct result of a vehicular or foot pursuit by a CBP agent. Regulations are needed on the use of mobile sentry devices—rather than merely their acquisition—because even devices that are unarmed at purchase can be weaponized afterward, and devices that remain unarmed can still harm civilians with a limb or propeller. 

Existing Department of Homeland Security (DHS) regulations governing autonomous systems seek to minimize technological bias in artificially intelligent risk-assessment systems. Existing military regulations seek to minimize the risks of misused or misunderstood capabilities in autonomous systems. However, no existing federal regulations govern how uncrewed vehicles, whether remotely controlled or autonomous, may be used by CBP. The answer is not as simple as extending military regulations to CBP. Military regulations governing autonomous systems assume that the robots in question are armed and interacting with enemy combatants. This assumption does not apply to most, if not all, possible CBP use cases.

With CBP already testing robotic dogs for deployment on the southwest border, the need for tailored regulation is pressing. Recent backlash over the New York Police Department’s testing of similar autonomous systems makes this topic even more timely. While the robots CBP is currently testing are unarmed, the same company that developed them is working with another company to mount weapons on the same platform. The rapid innovation and manufacturing of these systems means that policies governing their use should be in place before CBP has fully incorporated them into its workflows, and before the companies that build them form a lobby powerful enough to resist appropriate oversight. 

Plan of Action

CBP should immediately update its Use of Force policy to include restrictions on use of force by mobile sentry devices. Specifically, CBP should add a chapter to the policy with the following language:

These regulations should take effect before mobile sentry devices move from the testing phase to the deployment phase. Related new technologies, whether they expand capabilities for surveillance or for autonomous mobility, should undergo review by a committee that includes representatives from the National Use of Force Review Board, migrant rights groups, and citizens living along the border. This review should mirror the process laid out in the Community Control over Police Surveillance project, which has already been implemented successfully in multiple cities.

Conclusion

U.S. Customs and Border Protection (CBP) is developing an application for legged robots as mobile sentry devices at the southwest border. However, the use cases, functionality, and rules of engagement for these robots remain unclear. New regulations are needed to forestall adverse effects of autonomous robots used by the federal government for non-military applications, such as those envisioned by CBP. These regulations should specify that mobile sentry devices may only be used as humanitarian aids, and must use de-escalation methods to indicate that they are not threatening. Regulations should further mandate that mobile sentry devices maintain a clear distance from humans, that use of force by mobile sentry devices is never considered “reasonable,” and that mobile sentry devices may never be used to pursue, detain, or arrest humans. Such regulations will help ensure that the legged robots currently being tested as mobile sentry devices by CBP—as well as any future mobile sentry devices—are used ethically and in line with CBP’s goals, alleviating concerns for migrant advocates and citizens along the border.

Frequently Asked Questions
What is the purpose of regulating CBP’s use of autonomous robots as mobile sentry devices rather than the purchase of autonomous robots?

Regulations on purchasing are not sufficient because mobile sentry device technology can be weaponized after it is purchased. However, DHS could also consider updating its acquisition regulations to include clauses imposing fines when mobile sentry devices acquired by CBP are not used for humanitarian purposes.

Why is Department of Defense (DOD) Directive 3000.09 not sufficient to regulate the use of force by all government agencies?

DOD Directive 3000.09 regulates the use of autonomous weapon systems in the context of war. For an autonomous, semi-autonomous, or remotely controlled system deployed as a weapon on an active battlefield, this regulation makes sense. But the applications of robotic and automated systems currently being developed by DHS are oriented toward mobile sentry duty along stretches of U.S. territory where civilians are likely to be found. This sentry duty is likely to be performed by uncrewed ground robots following GPS breadcrumb trails on predetermined patrol routes along the border. Under Directive 3000.09, using a robot to kill or harm a person during a routine border patrol would not be a violation as long as a human exercised “appropriate levels of human judgment over the use of force” at that time. The upshot is that mobile sentry devices used by CBP should be subject to stricter regulations.

What standards do robotics companies have on the use of their technologies?

Most companies selling legged robots in the United States have explicit end-user policies prohibiting the use of their machines to harm or intimidate humans or animals. Some companies selling quadcopter drones have similar policies. But these policies lack any enforcement mechanism. As such, there is a regulatory gap that the federal government must fill.

Is updating its Use of Force policy the only way for CBP to regulate its use of mobile sentry devices?

No, but it is an immediately actionable strategy. An alternative—albeit more time-consuming—option would be for CBP to form a committee comprising representatives from the National Use of Force Review Board, the military, migrant-rights activist groups, and experts on ethics to develop a directive for CBP’s use of mobile sentry devices. This directive could be modeled after DOD Directive 3000.09, which regulates the military’s use of lethal autonomous weapon systems. But because the autonomous systems covered by Directive 3000.09 are assumed to be interacting with enemy combatants, while the people CBP encounters are mostly civilians, the CBP directive should be considerably more stringent than Directive 3000.09.

Would the policies proposed in this memo vary with the degree of autonomy possessed by the robot in question?

The policies proposed in this memo govern what mobile sentry devices are and are not permitted to do, regardless of how much humans are involved in operating a device or how much autonomy the technology possesses. The policies could therefore be applied consistently as the technology continues to be developed. Because the underlying technology is changing and improving rapidly, technology-agnostic policies will spare CBP from having to update its regulations each time mobile sentry device technology evolves.