OpenAI has updated its terms: every video generated with Sora 2 is now the user’s legal responsibility. A wave of Disney and political deepfakes was enough to ignite the legal fuse.
The launch of Sora 2 — OpenAI’s video generator — was an instant success and a predictable disaster. Within 48 hours, millions of users tested its limits, producing everything from photorealistic animations to scenes involving copyrighted icons. Chaos flooded social media before the company could react.
The answer came in the form of a contract. A new emergency terms-of-service update now requires every user to agree that they “assume full legal responsibility” for any video featuring third parties, trademarks, or recognizable characters. In other words: OpenAI washed its hands.
The move isn’t just legal — it’s strategic. By shifting liability, the company shields itself from an incoming wave of lawsuits from Hollywood studios, media conglomerates, and even governments. The clause also doubles as a filter: violators can lose ChatGPT access, API keys, and all stored data.
Behind the scenes, OpenAI executives described Sora 2’s first week as “chaotic.” The model became the most downloaded app on Earth in under two days, surpassing even TikTok’s initial record. But virality came at a cost: political deepfakes during an election year and fake celebrity videos, from Tom Cruise to Taylor Swift.
The decision to transfer risk marks an inflection point: OpenAI is positioning itself not just as a technology provider but as an architect of accountability. It’s the opposite of social media logic, where companies claim neutrality. Here, OpenAI says explicitly: “You’re responsible for what you create.”
Critics call it a trap: users remain locked inside OpenAI’s ecosystem — with their data, models, and accounts — while bearing the full legal risk. Others call it realism: the price of creative freedom in an era of unprecedented computational power.
The precedent is dangerous. If it sticks, it will redefine the frontier between creator and tool, artist and algorithm, company and platform. OpenAI is testing what “responsibility” means in the age of synthetic reality — and the courts haven’t even begun to respond.
Power shifts hands: from company to user. From protection to exposure.
In every tech bubble, profit is privatized — and risk is socialized.