
Coding ethics

Why is robotics hard?

This was Bill Smart's kickoff on the first (workshop) day of this year's We Robot. It makes sense: We Robot is 11 years old, and if robots were easy we'd have them by now. The basic engineering difficulties are things he's covered in previous such workshops: 2021, 2019, 2018, 2016.

More to the point for this cross-the-technicians-with-the-lawyers event: why is making robots "ethical" hard? Ultimately, because the policy has to be translated into computer code, and as Smart and others explain, the translation demands an order of precision humans don't often recognize. Wednesday's workshops explored the gap between what a policy says and what a computer can be programmed to do. For many years, Smart has liked to dramatize this gap by using four people to represent a "robot" and assigning a simple task. Just try picking up a ball with no direct visual input by asking yes/no questions of a voltage-measuring sensor.

This year, in a role-playing breakout group, we were asked to redesign a delivery robot to resolve complaints in a fictional city roughly the size of Seattle. Injuries to pedestrians have risen since delivery robots arrived; the residents of a retirement community are complaining that the robots' occupation of the sidewalks interferes with their daily walks; and one company sends its delivery robot down the street past a restaurant while playing ads for its across-the-street competitor.

It's not difficult to come up with ideas for ways to constrain these robots. Ban them from displaying ads. Limit them to human walking speed (which you'll need to specify precisely). Limit the time or space they're allowed to occupy. Eliminate cars and reallocate road space to create zones for pedestrians, cyclists, public transport, and robots. Require lights and sound to warn people of the robots' movements. Let people ride on the robots. (Actually, I'm not sure how that solves any of the problems presented, but it sounds like fun.)

As you can see from the sample, many of the solutions the group eventually proposed were only marginally about robot design. Few could be implemented without collaboration with the city, which would have to agree to and pay for infrastructure changes, or develop policies and regulations specifying robot functionality.

This reality was reinforced in a later exercise, in which Cindy Grimm, Ruth West, and Kristen Thomasen broke us into robot design teams and tasked us with designing a robot to resolve these complaints. Most of the proposals involved reorganizing public space (one group suggested sending package delivery robots through the sewer system rather than along public streets and sidewalks), sometimes at considerable expense. Our group, concerned about sustainability, wanted the eventual robot made out of 3D-printed engineered wood, but hit physical constraints when Grimm pointed out that our comprehensive array of sensors wouldn't fit on the small form factor we'd picked - and would be so energy-intensive that battery life would be negligible.

The deeper problem we raised: why use robots for this at all? Unless you're a package delivery company seeking to cut labor costs, what's the benefit over current delivery systems? We couldn't think of one. With Canadian journalist Paris Marx's recent book on autonomous vehicles, Road to Nowhere, fresh in my mind, however, the threat to public ownership of the sidewalk seemed real.

The same sort of underlying problem surfaced in discussions of a different exercise, based on Paige Tutosi's winning entry in a recent roboethics competition. This time we were given three short lists: rooms in a house, people who live in the house, and objects around the house. The idea was to come up with rules for a robot servant delivering the objects to individuals - rules that could be implemented in computer code. In an example ruleset, no one can order the robot to send a beer to the baby or chocolate to the dog.
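The flavor of the exercise can be sketched in a few lines of code. This is a minimal illustration, not the competition entry: the household members, objects, and forbidden pairs here are hypothetical stand-ins.

```python
# Illustrative sketch of a delivery ruleset like the one in the exercise.
# All names and rules are hypothetical examples, not from the competition.
PEOPLE = {"mother", "daughter", "boyfriend", "baby", "dog"}
OBJECTS = {"beer", "chocolate", "newspaper", "ball"}

# Forbidden (recipient, object) pairs; anything not listed is allowed.
FORBIDDEN = {
    ("baby", "beer"),
    ("dog", "chocolate"),
}

def may_deliver(recipient: str, obj: str) -> bool:
    """Return True unless an explicit rule forbids the delivery."""
    if recipient not in PEOPLE or obj not in OBJECTS:
        raise ValueError("unknown recipient or object")
    return (recipient, obj) not in FORBIDDEN
```

Even this toy version shows where the trouble starts: the code encodes who may receive what, but says nothing about who gets to issue orders or override the rules - exactly the power-dynamics questions our group ran into.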

My breakout group quickly got stuck in contemplating the possible power dynamics and relationships in the house. Was the "mother" the superuser who operated in God mode? Or was she an elderly dementia patient who lived with her superuser daughter, her daughter's boyfriend, and their baby? Then someone asked the killer question: "Who is paying for the robot?" People whose benefits payments arrive on prepay credit cards with government-designed constraints on their use could relate.

The summary reports from the other groups revealed a significant split between those who sought to build a set of rules that specified what was forbidden (comparable to English or American law) and those who sought to build a set of rules that specified what was permitted (more like German law).

For the English approach, you have to think ahead of time of all the things that could go wrong and create rules to prevent them. The German approach is by far the easier to code, and safer for robot manufacturers seeking to limit their liability: the robot's capabilities default to a strictly limited, "known-safe" set.

The fact of this split suggested that at heart developing "robot ethics" is recapitulating all of legal history back to first principles. Viewed that way, robots are dangerous. Not because they are likely to attack us - but because they can be the vector for making moot, in stealth, by inches, and to benefit their empowered commissioners, our entire framework of human rights and freedoms.

Illustrations: Boston Dynamics' canine robot visits We Robot.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

