Yesterday, I wrote about the controversy surrounding Adobe and its updated terms of service. Creators were irate after receiving a pop-up forcing them to agree to the new terms: If they didn't, they couldn't access Photoshop, nor could they delete the app from their machines.
It wasn't only the mandatory nature of the terms that alarmed users, however. The new language seemed to suggest that Adobe was claiming the right to access creators' work for any number of reasons: That rubbed many the wrong way, since plenty of professionals work under NDAs in Adobe's apps. Legal obligations aside, others rejected the very idea that Adobe could access the work these creators produced, simply because Adobe made the apps they were using in the first place.
Adobe remained silent on the issue until publishing this blog post. In it, the company explains that the changes to its "Terms of Use" were actually small adjustments, meant to bring clarity to its moderation policies. The company posted a snippet of the terms to the blog post, with the changes highlighted in pink (both new additions and items deleted from the previous terms):
[Screenshot: the updated terms, with changes highlighted in pink. Credit: Adobe]
According to Adobe, what's new is that the company now says it "may" (rather than "will only") access your content through automated and manual methods, and that it reviews content to screen for illegal material, including child sexual abuse material. If an automated system thinks something is illegal, it flags the item for human review. The rest of the terms are apparently the same as they've always been, and the pop-up that appeared was a routine re-acceptance campaign asking users to agree to the small changes.
Since that "access" was at the crux of the controversy, Adobe went into more detail in the blog post about why it needs it. The company says it needs access to user content for three specific reasons: to run standard functions in apps (like opening files or creating thumbnails); for cloud-based features, like Photoshop Neural Filters and Remove Background; and, as mentioned in the terms above, to screen for illegal activity or other abusive content.
Further, the company says it does not train Firefly Gen AI models on your content, nor will Adobe ever "assume ownership" of your work. If you're wondering why the company specifically names Firefly Gen AI models rather than making a broader statement about AI training, it's because the company does use the content you store in its cloud, including images, audio, video, text, and documents, to train its AI. Any data you upload to Adobe's servers is fair game for this process, and it's aggregated with everyone else's data to train Adobe's AI to "improve [Adobe's] products and services."
This is not explicitly laid out in the blog post, but Adobe's support article says you can opt out of this training: Head to your account's privacy settings, then, under Content analysis, turn off the toggle for "Allow my content to be analyzed by Adobe for product improvement and development purposes."
What's the bottom line?
Adobe is likely not constantly scraping your work looking for insider secrets about your projects, and it flat-out says it won't claim ownership of them. However, the company can access anything you upload to Adobe's servers: That access lets Adobe scan for illegal content, but it also lets the company use your work to train its AI models.
While opting out of AI training is wise, the best way to keep using Adobe apps without worrying about Adobe's access is to keep all of your projects local on your machine. If you don't use Adobe's cloud-based services, the company can only access your work for app-related tasks, like generating thumbnails (if the terms are to be believed).
Most of these rules have also been in place for some time: The pop-up you may have seen this week asked you to agree to the small tweaks Adobe made to the terms, not to sweeping changes. You had already agreed to those policies; you just didn't know it. My recommendation? Limit your cloud-based work with Adobe going forward, unless you absolutely need it for your job. The more of your content you can keep on your machine, the better.