AI, Copyright, Glaze and Nightshade

AI programs are using copyrighted images without consent

Should Alternative Photographers and Photographic Artists worry about AI? And if you are worried, what can you do? I know this is a bit of a contentious issue, one that tends to raise people's blood pressure, and let me assure you that's not my intention here. Instead, I want to talk about your options when it comes to your images being used to train AI without your consent: Glaze and Nightshade.

If you don't care about your work being used to train AI, then that's up to you. However, if you're not happy about it, what can you do? Once your image (or video) is uploaded, it is likely to end up being scraped and fed into an AI training set. Opt-out lists have been disregarded and shown not to work, so they're not the answer.

Protecting your copyrighted work

So what can you do? This is where Glaze and Nightshade come into the picture. Developed at the University of Chicago, these software-based tools are designed to modify images and provide protection against unauthorised use in AI generators. They are also free to use.

Art installation photo by Jo Howell, which has been glazed. © Jo Howell, maverickbeyond.com

What is Glaze?

Glaze was the first to be developed, with the specific aim of protecting human creators from having their work used by generative AI without permission. Systems such as Midjourney and Stable Diffusion were 'trained' on huge amounts of data scraped from the Internet, which included copyrighted and sensitive material. As a result, users can ask these systems to create images in a certain style. This could be the style of a specific artist or photographer, with the AI creating an image that may even include the artist's signature or watermark.

Alternatively, images could be generated in the style of a process such as Tintype or Cyanotype. Glaze is designed to combat this mimicry. Before posting your image, you 'glaze' it by running it through the tool, which makes tiny adjustments that fool the AI into reading the image as being in a completely different style, while it looks unchanged to the human eye. Think of it as a new layer to the artwork that only the generative AI can see.
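If it helps to see the principle in code, here is a minimal Python sketch of the core idea: a change to every pixel bounded so tightly that the eye cannot see it. To be clear, this is not Glaze's actual algorithm, which computes its perturbation by optimisation against style features; the random noise below simply stands in to show how small the adjustments are, and the file names are hypothetical.

```python
# Toy sketch of an 'imperceptible perturbation' - not Glaze's real method.
import numpy as np
from PIL import Image

EPSILON = 2  # biggest allowed change per colour channel, out of 255

def cloak(path_in: str, path_out: str, seed: int = 0) -> None:
    """Add a tiny, bounded change to every pixel and save the result."""
    rng = np.random.default_rng(seed)
    pixels = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # Glaze derives its perturbation by optimisation; random noise
    # stands in here purely to show the scale of the change.
    noise = rng.integers(-EPSILON, EPSILON + 1, size=pixels.shape)
    cloaked = np.clip(pixels + noise, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save(path_out)

cloak("original.jpg", "cloaked.png")  # hypothetical file names
```

With changes capped at 2 out of 255 per channel, the saved file looks identical to the original on screen; Glaze's trick is to choose those tiny changes deliberately, so they push the AI's reading of the style in a different direction.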

Measuring chemicals in a photographic darkroom, glazed image. © Mara Robinson, alchemiart.uk

Next came Nightshade

Glaze may be great, but it is essentially a defence. Its newer sibling Nightshade, however, is on the offensive. Nightshade is not designed to combat mimicry but to act as a poisoned chalice. If an image that has been through Nightshade is fed into a generative AI, the information acts like a poison, distorting the model's ability to create appropriate images for the prompts provided. The prompt might ask for an image of an apple, and the poison will corrupt the model so that it produces an image of a raspberry, or maybe even a horse. The idea is that the more unauthorised, Nightshaded images are fed in, the more unreliable the outputs become.
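The poisoning idea can also be sketched in toy form. This is not Nightshade's actual technique, which optimises subtle image perturbations so that the caption and the visual content no longer match; the sketch below fakes the effect with a made-up 'model' that generates the average of everything it saw captioned 'apple'. The clusters and numbers are invented purely for illustration.

```python
# Toy sketch of data poisoning - not Nightshade's real method.
import numpy as np

rng = np.random.default_rng(42)

# Invented 2-D "image features": apples cluster near (0, 0),
# raspberries cluster near (5, 5).
APPLE, RASPBERRY = np.array([0.0, 0.0]), np.array([5.0, 5.0])
apples = rng.normal(APPLE, 0.3, size=(100, 2))

def generate(images_captioned_apple):
    """Stand-in 'model': outputs the average of its 'apple' training images."""
    return images_captioned_apple.mean(axis=0)

# Clean training set: prompting "apple" lands in apple territory.
print("clean 'apple':   ", generate(apples).round(2))

# Poisoned training set: Nightshade-style images carry the caption "apple"
# but raspberry-like features, dragging the learned concept off target.
poison = rng.normal(RASPBERRY, 0.3, size=(100, 2))
print("poisoned 'apple':", generate(np.vstack([apples, poison])).round(2))
```

With equal amounts of clean and poisoned data, the learned 'apple' ends up halfway to raspberry territory, and it drifts further the more poison is added: exactly the point made above, that the more unauthorised images a model ingests, the less reliable its outputs become.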

Can humans really not tell?

Having tried them out on my own images, I can report mixed results on whether the adjustments really are imperceptible to the human eye. Small images with large areas of solid colour seem to be the most problematic. However, the development team are continuing to refine the process.

You can find out more information about both Glaze and Nightshade from the University of Chicago.

All photos shown remain the copyright of the individual artists/photographers and have been glazed using the default Glaze settings. All images are used with permission.