AI and Big Tech are a lethal combination for your privacy, and Adobe’s recent TOS updates prove users won’t tolerate it.
TL;DR
Big Tech’s Big AI Problem…
📉 Big Tech Trust Issues: Users are upset with Adobe, Google, and Microsoft over privacy concerns.
🛡️ Privacy Concerns: Adobe’s new TOS allows access to user content for moderation, sparking a boycott.
🔒 Data Use: Adobe won’t train AI on customer content but will access files to improve services.
🚫 No Opt-Out: Users must accept the new TOS to use Adobe products.
💡 Key Takeaway: Big Tech needs to address user discomfort with data collection for AI.
Does Big Tech have an image problem? The consensus appears to be that it does. In just a few short months, we’ve seen massive backlashes against Google, Microsoft, and Adobe over changes to how their core products function.
The bone of contention, as always, is data collection – specifically, the ways in which these companies collect your personal data and what they do with it.
You see, in order to create a functional LLM, you need data. Lots of data. And these Big Tech companies are now engaged in an arms race to collect and collate as much of your data as possible in order to train their models.
And herein lies the problem: a growing number of users – people like you and me – are simply not comfortable with any of this, and you know what? They’re right – it is downright creepy.
Case in point: Microsoft’s Recall, arguably the creepiest thing ever envisaged by a tech company, was quickly pulled from launch after a massive backlash online.
The fact that Recall even got off the drawing board shows just how disconnected Big Tech execs are from everyday people.
Adobe’s recent changes to its TOS, however, have the creative industry up in arms – and rightly so. The changes, which are completely unnecessary for its core product, fail to address the fact that many of its users work under NDAs.
But under the new TOS, Adobe would be allowed to access that privileged information. Whether you’re a privacy geek or not, that kind of access, especially for a paid product, is a big no-go, and the fallout from it could be spectacular.
The Adobe Boycott – What Happened
Adobe’s updated terms of service caused massive panic among its user base, with droves of users claiming they will cancel their subscriptions. Sounds like a PR disaster, right?
The boycott stemmed from the fact that, under Adobe’s new TOS, the company will have access to active projects – even those under NDA – for “content moderation” and other purposes.
Sorry, I’m not looking to discuss anything here, just making a statement. I find it disappointing that Adobe wants to bleed their customers for more money to use their products. I loved Photoshop, but I want to purchase it once and own it like I do with all my other creative software on my computer. I’ll upgrade it myself when I feel compelled to do so. This isn’t supposed to be a streaming service. This isn’t Netflix. And your new business model show that you don’t care what your customers think. So, from now on, until you give us what we want: software that we can actually purchase and own, I will NEVER rent any of your products as a loan out. PERIOD.
Adobe Forum
Inside the agreement, content access is justified not only by “content moderation” but also by the need to “improve our Services and Software” – and that means Adobe’s AI model, Firefly.
And to make matters worse, there is no opt-out for this. You HAVE to accept the new TOS to use Adobe products. As blunders go, this one is pretty large, so much so that Adobe has issued an “update” on its new TOS to “shed light” on some of its users’ concerns.
Here’s a breakdown of the “updated TOS” from Adobe:
Key Takeaways from Adobe’s Terms of Use Update
Enhanced Moderation Processes:
- Adobe has increased human moderation in its content review processes to ensure responsible innovation, particularly with the rise of Generative AI.
Limited License for Content Access:
- Adobe requires a limited license to access user content for the purposes of operating and improving its services, and to enforce its terms and comply with legal requirements.
Content Access for Functionality:
- Adobe applications need access to content to perform their designed functions, such as opening/editing files and creating previews.
- Advanced features like Photoshop Neural Filters, Liquid Mode, and Remove Background also require content access.
Content Screening on Adobe Servers:
- Adobe may use automated technologies and manual review to screen for illegal or abusive content on its servers, such as child sexual abuse material, spam, or phishing.
Commitment to Customer Content Ownership:
- Adobe does not train its Firefly Generative AI models on customer content. Instead, it uses licensed and public domain content.
- Adobe does not assume ownership of customer content; customers retain full ownership of their work.
Big Tech’s Big Problem With AI
If you’re a publicly traded company, you work for your shareholders. The job of a public company’s CEO is to deliver value – meaning growth – to those shareholders. CEOs who cannot do this are replaced.
But for companies like Adobe and Google, two of the largest monopolies in existence today, growth is starting to become a problem: when everybody already uses your product, there’s less and less scope for real, tangible growth.
This is why Google, Microsoft, and Adobe are now going all in on AI models – it’s the new “growth product” that’ll see them through the next decade. Or, at least, this appears to be the operating theory at the moment.
But what none of them seem to factor into their decisions is that people, en masse, do not want their personal data used to train these new AI models.
And herein lies Big Tech’s biggest problem right now. AI is useful and has plenty of applications, but the current way it is trained – using proprietary IP and personal data – isn’t exactly ethical, and more and more people are now waking up to this fact.