Industry News #3: More Lawsuits, New Licensing Standards, and Google’s Search Shift

Billion-dollar AI lawsuits, licensing updates, and Google’s search changes are redefining web data and proxy use.

Karolis Toleikis

Last updated - 3 min read

Billion-dollar lawsuits, new licensing frameworks, and shifting search algorithms - the fight over how data is used heats up, and proxies are at the center of it all.

These developments highlight the growing tension between innovation and regulation. AI systems, web platforms, and data tools must navigate new rules, evolving standards, and technical hurdles to operate effectively.

Here’s a look at this month’s top stories and their impact on the future of web data.

Perplexity AI Faces Another Lawsuit

Last month, we reported on Cloudflare accusing Perplexity AI of scraping websites without consent. Now, Perplexity is facing another lawsuit. Encyclopaedia Britannica and Merriam-Webster claim the AI used their copyrighted content without permission. They also allege trademark misuse, claiming Perplexity diverted web traffic by giving users direct answers instead of linking to the original sources.

This case highlights the growing legal risks for AI systems that rely on scraping live web content. Copyright is one concern, but lost publisher revenue, misleading attribution, and brand misuse are also at stake. The takeaway? Future AI models may need licensing deals, stricter attribution, and clearer guidelines on how they use scraped data.

Hollywood Hits Back: Lawsuit Targets MiniMax

The unlawful data scraping trend isn’t limited to Perplexity. Disney, NBCUniversal, and Warner Bros. Discovery have filed a lawsuit against MiniMax, a Chinese AI startup behind Hailuo AI. The studios are accusing the company of training its AI on copyrighted movies and shows without permission.

For proxy providers, this is a reminder that the infrastructure behind AI can be scrutinized, and proxies are no exception. They can mask traffic, bypass geo-restrictions, or scale data collection. With that in mind, proxy providers need to emphasize compliance, update terms of service, and position their services as tools for ethical data use rather than simply avoiding copyright restrictions.


AI Content Scraping Faces a Licensing Revolution

In response to unauthorized scraping, publishers are fighting back. Reddit, Yahoo, Medium, and wikiHow are launching the Really Simple Licensing standard, or RSL. This open licensing framework lets publishers set machine-readable terms for their content.

The goal? Making sure that AI companies pay creators fairly for using their work. Instead of simply blocking scrapers, RSL encourages licensing and royalty-based usage, which could reshape how AI interacts with online content.
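To make the idea concrete, here is a minimal Python sketch of what consuming machine-readable license terms could look like. The XML document and its attribute names are hypothetical, simplified for illustration, and not the actual RSL schema; the point is that a scraper can check a publisher's stated terms before collecting content.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified license document, loosely inspired by the idea
# of publisher-set machine-readable terms. This is NOT the real RSL schema.
LICENSE_XML = """
<license>
  <usage type="ai-training" allowed="false" royalty="per-crawl"/>
  <usage type="search-indexing" allowed="true"/>
</license>
"""

def usage_allowed(xml_text: str, usage_type: str) -> bool:
    """Return True only if the license explicitly allows the given usage."""
    root = ET.fromstring(xml_text)
    for usage in root.findall("usage"):
        if usage.get("type") == usage_type:
            return usage.get("allowed") == "true"
    # Conservative default: no explicit grant means no permission.
    return False
```

Under these terms, a compliant crawler would index the site for search but refuse to feed the content into AI training, or negotiate the listed royalty first.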

Google’s JavaScript SERPs Drive Demand for Smarter Proxies

Google is moving toward JavaScript-rendered search results and AI-powered search. This makes it difficult for eCommerce businesses and SEO tools to track rankings and visibility with traditional crawlers.

As a result, proxies are now critical. They help distribute requests, avoid detection, and enable accurate data collection at scale. This shift is increasing demand for high-performance proxy solutions with high geolocation diversity and support for headless browsing, while also emphasizing responsible and compliant use.
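Distributing requests across a pool is the core of this. Below is a minimal round-robin rotation sketch in Python; the proxy URLs are placeholders, and real scrapers would pair this with a headless browser and per-request error handling.

```python
from itertools import cycle

# Placeholder proxy endpoints; in practice these come from your provider.
PROXY_POOL = [
    "http://user:pass@proxy-us.example.com:8080",
    "http://user:pass@proxy-de.example.com:8080",
    "http://user:pass@proxy-jp.example.com:8080",
]

class ProxyRotator:
    """Cycle through a geodiverse proxy pool, one proxy per request."""

    def __init__(self, proxies):
        self._pool = cycle(proxies)

    def next_proxy(self) -> dict:
        # Returns the mapping format accepted by common HTTP clients
        # (e.g. the requests library's proxies= parameter).
        url = next(self._pool)
        return {"http": url, "https": url}
```

Each call to `next_proxy()` hands back the next endpoint in the pool, so consecutive SERP requests leave from different IPs and geolocations.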

Emerging AI Protocols Make Proxies Essential for Data Access

Finally, the rise of the Model Context Protocol, or MCP, is pushing AI developers toward agentic browsing. When APIs are limited, AI simulates user behavior to gather data. In these cases, proxies play a crucial role. MCP servers rely on rotating, geodiverse IPs to gather content at scale, overcome restrictions, and stay undetected.

This change directly drives demand for advanced proxy infrastructure. For proxy providers, it represents both an opportunity and a responsibility. Aside from offering high-performance solutions tailored to AI workflows, they also need to address the legal and ethical risks of unauthorized scraping or bypassing APIs.

Final Thoughts

From lawsuits and licensing updates to emerging AI protocols, the landscape is evolving fast, and staying ahead means having the right tools. IPRoyal offers high-performance, compliant proxy infrastructure that helps teams handle complex AI workflows safely and efficiently.
