" /> net.wars: August 2022 Archives


August 12, 2022

Nebraska story

This week saw the arrest of a Nebraska teenager and her mother, who are charged with multiple felonies for terminating the 17-year-old's pregnancy at 28 weeks and burying (and, apparently, trying to burn) the fetus. Allegedly, this was a home-based medication abortion...and the reason the authorities found out is that, following a tip-off, the police got a search warrant for the pair's Facebook accounts. There, the investigators found messages suggesting the mother had bought the pills and instructed her daughter how to use them.

Cue kneejerk reactions. "Abortion" is a hot button. Facebook privacy is a hot button. Result: in reporting these gruesome events most media have chosen to blame this horror story on Facebook for turning over the data.

As much as I love a good reason to bash Facebook, this isn't the right take.

Meta - Facebook's parent - has responded to the stories with a "correction" that says the company turned over the women's data in response to valid legal warrants issued by the Nebraska court *before* the Supreme Court ruling. The company adds, "The warrants did not mention abortion at all."

What the PR folks have elided is that both the Supreme Court's Dobbs decision, which overturned Roe v. Wade, and the wording of the warrants are entirely irrelevant. It doesn't *matter* that this case was about an abortion. Meta/Facebook will *always* turn over user data in compliance with a valid legal warrant issued by a court, especially in the US, its home country. So will every other major technology company.

You may dispute the justice of Nebraska's 2010 Pain-Capable Unborn Child Protection Act, under which abortion is illegal after 20 weeks from fertilization (22 weeks in normal medical parlance). But that's not Meta's concern. What Meta cares about is legal compliance and the technical validity of the warrant. Meta is a business, not a social justice organization, and while many want Mark Zuckerberg to use his personal judgment and clout to refuse to do business with oppressive regimes (by which they usually mean China or Myanmar), do you really want him and his company to obey only the laws they agree with?

There will be many much worse cases to come, because states will enact and enforce the vastly more restrictive abortion laws that Dobbs enables, and there will be many valid legal warrants forcing technology companies to hand data to police bent on prosecuting people in excruciating pregnancy-related situations - and in many more countries. Even in the UK, where (except for Northern Ireland) abortion has been mostly non-contentious for decades, lurking behind the 1967 law that legalized abortion up to 24 weeks is an 1861 statute under which abortion is criminal. That law, as Shanti Das recently wrote at the Guardian, has been used to prosecute dozens of women and a few men in the last decade. (See also Skeptical Inquirer.)

So if you're going to be mad at Facebook, be mad that the platform hadn't turned on end-to-end encryption for its messaging. That, as security engineer Alec Muffett has been pointing out on Twitter, would have protected the messages against access by both the system itself and law enforcement. At the Guardian, Johana Bhuiyan reports that the company is now testing turning on end-to-end encryption by default - doubtless soon to be followed by law enforcement and governments demanding special access.
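For the technically curious, here is a minimal sketch of what end-to-end encryption changes, written in Python with the PyNaCl library purely for illustration - Messenger's real machinery is built on the Signal protocol, not this toy. The point is that the relaying platform, and therefore anyone serving it with a warrant, only ever holds ciphertext; the keys never leave the two endpoints.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# Illustrative only: not Messenger's actual implementation.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts to Bob's public key. The relaying server only ever sees `ciphertext`.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"I can make it on Monday at two")

# A warrant served on the platform can only yield this opaque blob.
print(ciphertext.hex()[:60], "...")

# Only Bob, holding his private key, can decrypt.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext).decode())
```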

Others advocate switching to encrypted messaging platforms that, like Signal, provide a setting that makes messages automatically delete themselves after a specified number of days. Such systems retain no old data that can be turned over.

It's good advice, up to a point. For one thing, it ignores most people's preference for using the familiar services their friends use. Adopting a second service just for, say, medical contacts adds complications; getting everyone you know to switch is almost impossible.

Second, it's important to remember the power of metadata - data about data, which includes everything from email headers to search histories. "We kill people based on metadata," former NSA director Michael Hayden said in 2014 in a debate on the constitutionality of NSA surveillance. (But not, he hastened to add, metadata collected from *Americans*.)

Logs of who has connected to whom and how frequently are often more revealing than the content of the messages sent back and forth. For example: the message content may be essentially meaningless to an outsider ("I can make it on Monday at two") until the system logs tell you that the sender is a woman of childbearing age and the recipient is an abortion clinic. This is why so many governments have favored retaining Internet connection data. Governments cite the usual use cases - organized crime, drug dealers, child abusers, and terrorists - when pushing for data retention, and they are helped by the fact that most people instinctively quail at the thought of others reading the *content* of their messages but do not intuitively grasp the significance of metadata such as system logs and connection records. That blind spot has helped enable mass Internet surveillance.
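To make the distinction concrete, here is a small illustrative sketch in Python - every name and value below is invented - showing how little the content matters once the connection record around it has been logged.

```python
# Illustrative sketch of content vs. metadata; all names and values are invented.
from datetime import datetime, timezone

# The content, by itself, reveals almost nothing.
content = "I can make it on Monday at two"

# The metadata a platform (or ISP) logs alongside it is another matter.
connection_record = {
    "sender": "user_48151623",             # resolvable to a named account
    "recipient": "clinic_contact_line",    # a known clinic's account
    "timestamp": datetime(2022, 8, 1, 14, 2, tzinfo=timezone.utc),
    "sender_location": "Omaha, NE",        # coarse IP geolocation
    "prior_contacts_30_days": 4,           # frequency of contact
}

# Even with the content encrypted or deleted, the record above is enough
# to place a person of childbearing age in touch with a clinic at a given time.
for field, value in connection_record.items():
    print(f"{field}: {value}")
```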

The net result of all this is to make surveillance-capitalism-driven technology services dangerous for the 65.5 million women of childbearing age in the US (as of 2020). That's a fair chunk of these companies' most profitable users - a direct economic casualty of Dobbs.


Illustrations: Facebook.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

August 5, 2022

Painting by numbers

My camera can see better than I can. I don't mean that it can take better pictures than I can because of its automated settings, although this is also true. I mean it can capture things I can't *see*. The heron above, captured on a grey day along the Thames towpath, was pretty much invisible to me. I was walking with a friend, who pointed and said, "Look. A heron." I pointed the camera more or less where she indicated, pushed the zoom to maximum, hit the button, and when I got home there it was.

If the picture were a world-famous original, there might be a squabble about who owned the copyright. I pointed the camera and pushed the button, so in our world the copyright belongs to me. But my friend could stake a reasonable claim: without her, I wouldn't have known where or when to point the camera. The camera company (Sony) could argue, quite reasonably, that the camera and its embedded software, which took years to design and build, did all the work, while my entire contribution took but a second.

I imagine, however, that at the beginning of photography artists who made their living painting landscapes and portraits might have seen reason to be pretty scathing about the notion that photography deserved copyright at all. Instead of working for months to capture the right light and nuances...you just push a button? Where's the creative contribution in that?

This thought was inspired by a recent conversation on Twitter between two copyright experts - Lilian Edwards and Andres Guadamuz - who have been thinking for years about the allocation of intellectual property rights when an AI system creates or helps to create a new work. The proximate cause was Guadamuz's stunning experiments generating images using Midjourney.

If you try out Midjourney's image-maker via the bot on its Discord server, you quickly find that each detail you add to your prompt adds nuance and complexity to the resulting image; an expert at "prompt-craft" can come extraordinarily close to painting with the generation system. Writing prompts to control these systems and shape their output is becoming an art in its own right, and an expertise likely to become highly valuable. Guadamuz calls it "AI whispering".
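Midjourney itself is driven through Discord's /imagine command rather than a public API, but the same prompt-craft applies to any generator. As a rough, hypothetical illustration only - using OpenAI's Python client and its DALL-E image endpoint as a stand-in, with invented prompts - each added clause steers the output:

```python
# Rough illustration of prompt-craft against OpenAI's image endpoint
# (pip install openai). The prompts are invented examples, not a recipe;
# Midjourney itself is used via Discord's /imagine command instead.
import openai

openai.api_key = "sk-..."  # placeholder; supply your own key

prompts = [
    "a heron",                                          # bare prompt
    "a grey heron standing on a riverbank",             # adds subject detail
    "a grey heron on the Thames towpath on an "
    "overcast day, telephoto shot, muted colours",      # adds scene and style
]

for prompt in prompts:
    response = openai.Image.create(prompt=prompt, n=1, size="1024x1024")
    print(prompt, "->", response["data"][0]["url"])
```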

Guadamuz touches on this in a June 2022 blog posting, in which he asks about the societal impact of being able to produce sophisticated essays, artworks, melodies, or software code from a few prompts. The best human creators will still be the crucial element - I don't care how good you are at writing prompts; unless you're the human known as Vince Gilligan, you plus a generator are not going to produce Breaking Bad or Better Call Saul. However, generation systems *might*, as Guadamuz says, produce material that's good enough for many contexts, given that it's free (ish).

More recently, Guadamuz considers the subject he and Edwards were mulling on Twitter: the ownership of copyright in generated images. Guadamuz had been reading the generators' terms and conditions. OpenAI, owner of DALL-E, specifies that users assign the copyright in all "Generations" its system produces, which it then places in the public domain while granting users a permanent license to do whatever they want with the Generations their prompts inspire. Midjourney takes the opposite approach: the user owns the generated image and licenses it back to Midjourney.

What Guadamuz found notable was the trend toward assuming that generated images are subject to copyright at all, even though lawyers have argued that they can't be and instead fall into the public domain. Earlier this year, the US Copyright Office rejected a request to allow an AI to copyright a work. The UK is an outlier, awarding copyright in computer-generated works to the "person by whom the arrangements necessary for the creation of the work are undertaken". This is ambiguous: is that person the user who wrote the prompt or the programmers who trained the model and wrote the code?

Much of the discussion revolved around how that copyright might be divided up. Should it be shared between the user and the company that owns the generating tool? We don't assign copyright in the words we write to our pens or word processors; but, as Edwards suggested, the generator tool is more like an artist for hire than a pen. Of course, if you hire a human artist to create an image for you, contract terms specify who owns the copyright; if it's a work made for hire, the artist retains no further interest.

So whatever copyright lawyers say, the companies that produce and own these systems are setting the norms as part of choosing their business model. The business of selling today's most sophisticated cameras derives from an industry that grew up selling physical objects. In a more recent age, these companies might have grown up selling software add-on tools on physical media. Today, they may sell subscriptions and tiers of functionality. Nonetheless, if a company's leaders come to believe there is potential for a low-cost revenue stream of royalties on the reuse of generated images, the company will go for it. Corbis and Getty have already pioneered automated copyright enforcement.

For now, these terms and conditions aren't about developing legal theory; the companies just don't want to get sued. These are cover-your-ass exercises, like privacy policies.


Illustrations: Grey heron hanging out by the Thames in spring 2021.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.