Is Adobe using your photos to train AI? It’s complicated

A sharp-eyed developer at Krita recently noticed that, in their Adobe Creative Cloud account settings, the company had opted them (and everyone else) into a “content analysis” program under which Adobe “may analyze your content using techniques such as machine learning (e.g. for pattern recognition) to develop and improve our products and services.” Some have taken this to mean Adobe is ingesting your images to train AI. And… it is. In a way? But it’s not that simple.

First, a lot of software out there has some kind of “share information with the developer” option that sends telemetry: how often you use the app or certain features, why it crashed, and so on. Usually you get the option to turn this off during installation, but not always. Microsoft incurred the ire of many when it initially said that telemetry was on by default and impossible to turn off in Windows 10.

That’s gross, but it’s worse to quietly drop in a new sharing method and opt existing users into it. Adobe told PetaPixel that this content analysis “is not new and has been in place for a decade.” If the company was using machine learning for this purpose, and saying so, a decade ago, that’s pretty impressive, and apparently no one noticed the whole time. That seems unlikely. I suspect the policy has existed in one form or another but has quietly evolved.

But the wording of the setting is clear: Adobe may analyze your content using machine learning, not use it to train machine learning. As the “learn more” link puts it:

For example, we may use machine learning-enabled features to help you organize and edit your photos faster and more accurately. With object recognition in Lightroom, we can auto-label photos of your dog or cat. In Photoshop, machine learning can be used to automatically correct the perspective of an image for you.

A machine learning analysis would also allow Adobe to tell how many people used Photoshop to, say, edit images of people versus landscapes, or other high-level metadata. It can inform product decisions and priorities.

Well, you might say, that language still leaves open the possibility that the images and the analysis will be used to train AI models as part of “developing our products and services.”

(Image: the content analysis setting switched off. Make yours look like this.)

True, but Adobe clarified that “Adobe does not use any data stored in customers’ Creative Cloud accounts to train its experimental Generative AI features.” That wording is clear enough, though it also has the kind of legal precision that makes you suspect they’re carefully talking around something.

And if you take a closer look at the documentation, it actually says: “When we analyze your content for product improvement and development purposes, we first aggregate your content with other content and then use the aggregated content to train our algorithms and thereby improve our products and services.”

So it does use your content to train its algorithms. Just not, perhaps, those experimental generative AI algorithms.

In fact, Adobe has a program specifically for doing that: the Adobe Photoshop Improvement Program, which is opt-in and documented here. But it is entirely possible that your images, through one pipe or another, are being used as content to train a generative AI. There are also circumstances in which your content can be reviewed manually, which is another matter entirely.

While it’s not as if Adobe is harvesting your creativity wholesale for its models, if you value your privacy you should opt out of this program and any others like it. You can do so on the privacy page here, if you’re logged in.
