[Image: David Wadhwani, president of digital media at Adobe]
The software giant has taken a lot of flak from creatives for its focus on ‘ethical’ AI. But I’m starting to think it might actually have a point.
Last week I was in North Greenwich, London, for Adobe Max, the big conference by the design software giant. Full disclosure: as with many of the journalists here, the company paid my train fare and put me up in a nice hotel. At the same time, they did so in full knowledge that I often write quite negative things about them. And I gave them no sign that this would change this year.
In fact, as I travelled to London yesterday, my mind was firmly fixed on one thing: that Adobe is losing the trust of creatives.
Partly because Creative Cloud remains so darned expensive. And partly because all they ever seem to talk about these days is their generative AI model, Firefly. Generative AI, let’s remind ourselves, is extremely unpopular with a lot of creative professionals today.
Admittedly, Adobe do stand out for their ‘ethical’ policy towards gen AI, whereby they only train their AI models on their own Adobe Stock content. But is that really a sign that they have our backs, or just a fig leaf for more nefarious ambitions?
Until recently, I was tending towards the latter. But then, at a demo of their Content Credentials software at Adobe Max, I started to think differently.
Protecting copyright
For the uninitiated, Content Credentials is a piece of open-source software that Adobe’s developed for the world to use as it wishes, much as it invented the PDF (Portable Document Format) many moons ago.
Content Credentials, in crude terms, is a way of embedding proof that you took a picture or made an illustration, and nobody else did.
Apparently, it creates an invisible ‘watermark’ in the pixels that persists regardless of any technical trickery. So even if, for instance, someone takes a photo of your photo, the software will somehow still recognise it as your own.
[Image: Content Credentials are now live in Photoshop and other Adobe apps]
[Image: Kelly Hurlburt, senior designer, Creative Cloud Services]
[Image: Kelly Weldon, senior staff experience designer]
Clever stuff, and this could well be the future of copyright protection: like a modern version of sending yourself copyrighted material in the mail, so you have the postmark as proof.
Content Credentials is a bit more sophisticated than that, though. It can tell you a lot about how something was created: what camera or software was used, what edits were made, and so on. In a world of fake news and fake images, this could all become pretty important.
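For anyone curious what’s actually in there, the credentials are inspectable. Here’s a minimal sketch in Python that reads a file’s provenance via Adobe’s open-source c2patool command-line utility; an assumption on my part is that c2patool is installed and on your PATH, and that the JSON key names follow the tool’s documented output:

```python
import json
import subprocess

# c2patool, run against a file with no other arguments, prints the
# embedded C2PA manifest store as JSON.
result = subprocess.run(
    ["c2patool", "photo.jpg"],  # photo.jpg is a placeholder path
    capture_output=True,
    text=True,
    check=True,
)
store = json.loads(result.stdout)

# The store points at an "active manifest" for the most recent
# signing; earlier manifests record the file's edit history.
active = store["manifests"][store["active_manifest"]]
print("Claim generator:", active.get("claim_generator"))
for assertion in active.get("assertions", []):
    print("Assertion:", assertion.get("label"))
```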
In fact, I’d say that if you’re a Creative Cloud subscriber, you should be ticking that Content Credentials box every time you start a new piece of work (it’s baked into all the major Adobe tools). Admittedly, if some 12-year-old in China rips off your work, it’s unlikely to be of much use. But if a major brand does—something we depressingly hear a lot of complaints about at Creative Boom—it could be very helpful indeed.
Protection against scraping
So how does all this tie in with generative AI? Well, if you add Content Credentials to your image, you can also tick a box that says you don’t want this image scraped by AI.
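And as I understand it, that preference isn’t just a flag in Adobe’s database: it’s written into the signed manifest itself, so it travels with the file. Below is a rough sketch of the relevant assertion as described in the public C2PA specification, which Content Credentials is built on; the label and field names come from that spec, but treat the exact structure as an assumption rather than a guaranteed match for Adobe’s output:

```python
import json

# A "do not train" preference, expressed as the C2PA spec's
# training-and-data-mining assertion. Each entry records whether a
# given use of the asset is permitted.
do_not_train = {
    "label": "c2pa.training-mining",
    "data": {
        "entries": {
            "c2pa.ai_generative_training": {"use": "notAllowed"},
            "c2pa.ai_training": {"use": "notAllowed"},
            "c2pa.data_mining": {"use": "notAllowed"},
        }
    },
}
print(json.dumps(do_not_train, indent=2))
```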
Great, I thought. So my obvious question to the guy giving the demo—Andy Parsons (senior director, content authenticity at Adobe)—was: “Apart from Adobe, have any of the other AI companies said they’ll also respect my wishes in this way?”
The answer, unfortunately, was a simple, and pointed, no. Because while Adobe says it will only ever train its Firefly model on Adobe Stock, other companies like Midjourney and OpenAI have made no such commitment. Which means that, basically, we’re screwed.
Screwed, that is, as long as nothing changes. But of course, everything in this world is changing right now. And actually, only a couple of things need to happen for the picture to change quite dramatically.
The optimistic scenario
First off, assume that one of the many lawsuits seeking to protect creators’ and publishers’ copyright from the juggernaut of AI succeeds. It could be The New York Times v. OpenAI and Microsoft. It could be Andersen v. Stability AI. It could be Authors Guild v. OpenAI. There are over 30 of these going through the courts right now, and if a judge finds in favour of a single one, that will change everything.
[Image: Deepa Subramaniam, VP at Adobe]
[Image: Firefly is now available for video as well as images]
[Image: Elise Swopes, senior design evangelist]
What would happen then? Well, at the moment all these AI tech giants are in bed with Trump, so it’s likely that, just as with TikTok, the White House would attempt some compromise, some deal, some stay of execution that would keep these trillion-dollar operations in business. But again, that’s assuming things stay the same. And if Trump’s first 100 days have proved anything, it’s that nothing is certain.
The President’s tariff war will go one of two ways. And if things turn bad (the economy is weakened, his popularity drains away, and the Republicans have disastrous midterms), Trump may not be in the mood to focus his energies on protecting the AI industry. Heck, the man falls out with people at the drop of a hat, so they might well have lost his support by then anyway.
Adobe the saviour?
I can see a possible future, then, in which the current rampant, unfettered trawling of content by AI training models gets properly shut down. At which point, where do OpenAI, Facebook, Google and so forth turn? To the one company that’s been doing things relatively ethically, and so would be untouched by any new restrictions.
At this point, Adobe, which is already, let’s remember, a $150 billion company, would be in prime position to set terms over the future of AI, and maybe become the industry’s major partner, in an echo of the AOL-Time Warner-style mergers of the early 2000s.
Far-fetched? Maybe. But it’s a reason for hope nonetheless. And a reason to ponder that Adobe’s ethical model of generative AI, combined with its Content Credentials technology, might turn out to be less an annoyance to creatives, and more a benefit to us all.