Post disclaimer: The only surefire way to keep your images from being used to train AI models is to never post them. Nightshade is a useful deterrent, but no technology is truly future-proof.

Alrighty, yo. A little doomsdayin' to kick this off, but I swear I'll offer a solution after creeping you out.


The other day I was on Facebook. I'm not on it much, but I'm trying to sell some camera gear on Marketplace (gently used, take me up on it, heyo!) so I have to pop on occasionally to see if I have messages. Wow. That was an unnecessary tangent. In any event, I'm in some local model and photographer groups. They connect (shocking) models and photographers in Austin so they can do fun projects together. I saw an insane post the other day. Let's see if I can find it...


Ok, I couldn't find it. Here's the gist. People are beginning to post in local groups looking for models with OnlyFans accounts for a "paid gig" to make an AI model of them. In other words, they are looking for people who are probably pretty comfortable with monetizing their sexuality on the internet, so they can create AI-based models of them for generative porn. This is no longer a postulation in a national news article; this is a concrete example of it beginning, with images made in my community by people I'm probably only one or two steps removed from.


But you're like, "ok, whatever. Why do I care? This is consensual."


Increasingly, there have been media accounts of celebrity voice actors being abused by modding communities that clone their voices and use them as voice-overs for pornographic characters. This is without consent. They're neither paid nor asked for approval.


Eek. This is where the really gross doomsdaying starts to come into play. The more these types of services expand and popularize, the more I think about my daughter's digital footprint and how her likeness could be used to non-consensually train AI models. I don't care what the application is; I'm not here for it. I'm not signing off. I don't have to think too hard about despicable ways it could be abused to be firm in my beliefs here.

Nightshade manipulates artists' images into looking, to AI, like something else entirely. Credit: Shan et al., arXiv:2310.13828

I'm totally a hypocrite


I've been mildly aware of this mounting pressure in the background, and more aware than average about digital footprint challenges for kids, but have continued to post my daughter's photos on social media. I've justified this by arguing the images aren't embarrassing. She'll never be bullied over the ones I post online or the supporting captions. However, that justification rests on a static interpretation of an image. It doesn't account for someone lifting her likeness and dynamically creating NEW scenarios for her to exist in, which are, at best, made without her approval and, at worst, compromising.


So what to do, what to do? We can't just throw our arms up and say "it is what it is!" I can maybe do that with images of me (of which there are noticeably few anyway); my life is done and baked. There's a tingle of concern that someone could go really off the rails with my likeness, but the fact is I'm not a 10. And on an internet where the absolute most beautiful humans on the planet also live, I've got a feeling someone else will be targeted well before I am.


But Eve.... Eve is a different story. She's young. Someone could potentially train a likeness of her at every phase of her life if they wanted. I don't feel comfortable throwing my hands up and being willfully ignorant when there are solutions swirling around every corner of the internet to help me tackle this unique challenge her generation faces.

Nightshade


When we were training our dog Lily, we did it in phases. We began with crate training. Then she graduated to being confined to a kitchen space where we could easily clean up any messes. She started counter surfing at this stage. Oh, it was frustrating. One time she even grabbed a book on how to train puppies and shredded it, the irony not lost on us. My husband started leaving out sacrificial paper towels covered in Frank's hot sauce when we'd leave. One day, she got it. And oh did I laughhhhhhh! Naturally, she was coughing and hacking and probably would've googled "will this ever end" had she had a cell phone. Carter, for a moment, thought he'd killed her, and we all descended into chaos, and then.... she was fine. And she also stopped counter surfing.


A University of Chicago professor has made a piece of software called Nightshade that essentially covers your photos in proverbial hot sauce, like we did for Lily, so that AI models choke when they try to train on them. It poisons their models. You post a picture of a dog chasing a ball that's been run through Nightshade, an AI model trains on it, and now when someone types in "dog chasing ball" they get a picture of a Model T Ford hitting a squirrel on the moon.
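If you're curious what that hot sauce looks like mechanically: conceptually, Nightshade computes a tiny adversarial perturbation that shifts where the image lands in a model's feature space, so it reads as a different concept entirely (that's the approach described in Shan et al., arXiv:2310.13828). Here's a minimal, hypothetical sketch of that general idea in Python. It is not Nightshade's actual code; the file names, the generic ResNet encoder, and the simple pixel bound are all my stand-ins for illustration.

```python
# Toy sketch of perturbation-based poisoning: nudge an image's pixels so
# a feature extractor "sees" a different concept, while the change stays
# nearly invisible to humans. NOT Nightshade's actual algorithm; just
# the general idea, with a generic pretrained encoder standing in for a
# text-to-image model's feature extractor.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

encoder = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()      # keep penultimate-layer features
encoder.eval().to(device)
for p in encoder.parameters():
    p.requires_grad_(False)           # we only optimize the perturbation

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def load(path):
    return preprocess(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

x = load("dog_chasing_ball.jpg")      # hypothetical source image
anchor = load("model_t_ford.jpg")     # hypothetical image of the decoy concept

with torch.no_grad():
    target_feat = encoder(anchor)

delta = torch.zeros_like(x, requires_grad=True)
eps = 8 / 255                         # max per-pixel change, hard to see
opt = torch.optim.Adam([delta], lr=1e-2)

for step in range(200):
    opt.zero_grad()
    poisoned = (x + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the decoy concept.
    loss = torch.nn.functional.mse_loss(encoder(poisoned), target_feat)
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)       # keep the perturbation imperceptible

poisoned = (x + delta).detach().clamp(0, 1)
# To a person this still looks like a dog chasing a ball; to the encoder,
# it now sits near "Model T Ford" in feature space. Train on enough of
# these and "dog" starts drifting toward the decoy.
```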


It's free. It's legal. It's irreversible. And most importantly, it's the first step in a practical deterrent against non-consensual use of your art, likeness, etc. to train AI models. Because look, shaking your index finger and telling these companies they've been bad after they've already trained a model on your likeness is not an effective way to change their behavior. Neither is creating commercial packages for them to pay to train on, because why would they pay if they can get it for free? BUT if the alternative is that they use free images they don't have rights to, and those images screw up the functionality (and subsequently the commercial model) of their product.... well, now we're cooking with gas.

Quick, gratuitous photos of my dogs (the brindle one we tricked with the hot sauce paper towel is Tiger Lily)

My vision


I was hyped when I learned about this project. I had dreams of passing all of my customers' galleries through it and potentially upselling that as a value-based add-on. Hell, I'd pay for a gallery of my photos to have poison pills in them! That's absolutely valuable in this day and age.


I gave it a shot. Truthfully, at this point in time, it's impractical to pass entire galleries through. It takes anywhere from 30 to 180 minutes per image to poison. My family galleries are typically somewhere between 50 and 120 images, and I just cannot justify devoting the time it would take to poison every image I deliver for families. I feel confident that, whether it's Nightshade or some second-to-market competitor, this functionality will improve in capability and speed over time, because there IS a need. For now, I see myself using it on images I post of Eve's face, one at a time. Because for my daughter, I have the time to go through this rigamarole on a one-off basis. And frankly, you might consider doing this to your images as well.
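To put numbers on why whole galleries are impractical right now, here's the back-of-the-envelope math using the figures above (your times will vary with hardware and settings):

```python
# Rough estimate of gallery poisoning time at today's speeds.
minutes_per_image = (30, 180)  # observed range per image
gallery_size = (50, 120)       # typical family gallery, in images

best_hours = gallery_size[0] * minutes_per_image[0] / 60
worst_hours = gallery_size[1] * minutes_per_image[1] / 60
print(f"{best_hours:.0f} to {worst_hours:.0f} hours per gallery")  # 25 to 360 hours
```

That's a full day of nonstop compute at best, and several weeks at worst, per gallery.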

Nightshade's interface

How to use Nightshade on your family's photos


At least for now, Nightshade is free. I have no idea how long this will last. But I'm gettin' while the gettin's good. Here's how to use it on your own images.


  1. Go to the Nightshade Download page
  2. Download the appropriate version of Nightshade (Windows / old Mac / new Mac)
  3. Install it
  4. Open it (it'll take 5 minutes or so to fully get set up, but it's automatic; you'll need at least 4GB of space on your hard drive)
  5. Choose an image you want to poison from your computer
  6. Set how much you want it to poison your image (too much can really create some wild results)
  7. Choose the render quality, which sets the speed of the conversion (this will translate to how spicy that hot sauce is)
  8. Pick a download folder location
  9. And whabam! Time to go!


Note from Nightshade: "We would generally not recommend marking Nightshaded images as "Nightshaded" in social media posts or on your online gallery. Nightshade is a poison attack, and marking it as poison will almost certainly ensure that it fails its purpose, because it will be easily identified and filtered out by model trainers."


Note from me about their note: The point is two-pronged: you want to make sure your images aren't used for training, AND you want to poison the AI models that train anyway. Even if you did name your image "nightshaded" and it was filtered out, you'd still likely achieve the first objective.


For my first round of images, I chose a film photo I'd scanned in with my digital camera scanning setup. The before and after are below. Running it through Nightshade produces no obvious differences in the content of the image, but comes with all the benefits of being useless to train on. Try it. Let me know what you think!

Note for photographers


Your image will be slightly, visibly adjusted. Most of the changes are invisible, but as you can see below, it's not totally unaltered. What does this mean for you? It means the images you POST can be run through Nightshade, but the ones delivered for printing, viewing, etc. should be un-shaded. If you sell images on Etsy, etc., the preview images could be shaded, but not the downloads. You get the picture. For parents hesitant about signing model releases in contract agreements, maybe you could help assuage them by letting them know anything you post will be run through something like this. Those are my suggestions for now, at least. I've emailed Nightshade for any other recommendations they may have and will update the post if they respond with anything meaningful.
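If you want to operationalize that post-versus-deliver split, here's a minimal sketch of how the staging could look. The folder names, the 2048px web resolution, and the JPEG-only assumption are mine, not anything Nightshade prescribes:

```python
# Two-track workflow sketch: full-resolution deliverables stay untouched;
# downsized copies destined for the web get staged for Nightshade.
from pathlib import Path
from PIL import Image

DELIVER = Path("gallery/deliverables")  # hypothetical: clean, full-res files
TO_SHADE = Path("gallery/to_shade")     # hypothetical: web copies to poison
TO_SHADE.mkdir(parents=True, exist_ok=True)

for src in sorted(DELIVER.glob("*.jpg")):
    img = Image.open(src)
    img.thumbnail((2048, 2048))         # web size; originals stay full-res
    img.save(TO_SHADE / src.name, quality=90)
```

Then run everything in gallery/to_shade through Nightshade, post only those outputs, and deliver the untouched originals.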

Left image is before Nightshade; right is after Nightshade. Default settings, Medium render quality. Ran on an M2 with a 12-core CPU and 64GB RAM; took ~70 minutes.