How do I stop a generative AI system from training on my work?

The legal challenge: understanding the presumption of consent

Within the European Union, authors are presumed to consent to the use of their online content for the training of generative artificial intelligence unless they object: this is the opt-out mechanism. In other words, if your content can be accessed lawfully and you have not explicitly objected, generative AI systems are presumed to be allowed to train on it.

This situation leaves creators with no choice: they cannot afford to remain passive.

Legal solutions to exercise your right to opt out

1. The contractual opt-out clause

The most legally robust solution is to have a professional opt-out clause drafted. This approach, already implemented by several collective management organizations (SACEM, ADAGP, etc.), is particularly recommended for:

  • Businesses protecting their intellectual property
  • Creators who want solid contractual protection

A well-written opt-out clause should be specific, clear, and enforceable against third parties. Legal expertise is required to keep it effective as technology and regulation evolve.

2. Opt-out forms for companies that provide generative AI systems

Companies that provide generative AI systems must publish a “sufficiently detailed summary” of the data used for training and allow rights holders to exercise their right to object.

In theory, companies developing generative AI should also offer forms allowing authors to refuse to have the AI system trained on their content. In practice, these mechanisms remain rare and often insufficient.

3. The W3C Opt-Out Protocol

It is also recommended to use the W3C opt-out protocol: this means expressing your refusal to have generative AI systems train on your content by adding directives to your robots.txt file, adding HTML metadata, or following the European standard for signaling opposition to text and data mining by adding HTTP headers.
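As an illustration, the three signals mentioned above might look as follows. The crawler names shown (GPTBot, Google-Extended, CCBot, ClaudeBot) are real AI-related user agents, but the list is not exhaustive, and the `tdm-reservation` header and meta tag come from the W3C TDM Reservation Protocol (TDMRep); `example.com` and the policy URL are placeholders.

```
# 1. robots.txt — block known AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# 2. HTML metadata — placed in each page's <head>
#    <meta name="tdm-reservation" content="1">

# 3. HTTP headers — sent with each response (TDM Reservation Protocol)
#    tdm-reservation: 1
#    tdm-policy: https://example.com/tdm-policy.json
```

None of these signals physically prevents crawling; they express a machine-readable reservation of rights that compliant crawlers are expected to honor.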

These technical solutions automatically signal your refusal to crawlers; however, there is currently no guarantee that companies conducting text and data mining will comply with these instructions.
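To check whether a given site already declares such a reservation, you can inspect its HTTP response headers. The small sketch below is an illustrative helper, not an official tool; it assumes headers are available as a dictionary (as returned by common HTTP libraries) and looks for the TDMRep `tdm-reservation: 1` signal.

```python
def tdm_opt_out_declared(headers: dict) -> bool:
    """Return True if the response headers reserve text-and-data-mining
    rights per the W3C TDM Reservation Protocol (tdm-reservation: 1)."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("tdm-reservation") == "1"

# Example: headers as a library like urllib or requests might return them
print(tdm_opt_out_declared({"Content-Type": "text/html", "TDM-Reservation": "1"}))  # True
print(tdm_opt_out_declared({"Content-Type": "text/html"}))  # False
```

Such a check can help document which sites (including your own) actually emit the opt-out signal, which is useful evidence if a dispute arises later.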

Technical protection tools: Nightshade and Glaze

Nightshade: the AI “poisoner” tool

Nightshade is designed as an offensive tool to distort feature representations within image-generative AI models. Developed by the University of Chicago, this tool:

  • Invisibly alters the pixels of your images
  • Deceives AI models, which then misinterpret your works
  • Can degrade the performance of AIs trained on “poisoned” images

Fewer than 100 poisoned images can be enough to start corrupting a model's outputs for targeted prompts, making this tool particularly powerful when used collectively.
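To convey the general idea, here is a toy sketch of a bounded pixel perturbation. This is not Nightshade's actual algorithm (which computes targeted, optimized perturbations); it only illustrates the underlying principle of shifting pixel values within a budget small enough to be imperceptible to human eyes.

```python
import numpy as np

def add_bounded_perturbation(image: np.ndarray, epsilon: int = 2, seed: int = 0) -> np.ndarray:
    """Add a random perturbation of at most +/- epsilon to each 8-bit pixel.
    Toy illustration only; real tools optimize the perturbation to mislead models."""
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Clip so the result stays a valid 8-bit image.
    return np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

img = np.full((4, 4, 3), 128, dtype=np.uint8)  # a flat gray test image
poisoned = add_bounded_perturbation(img)
# Each pixel moves by at most epsilon, invisible to the eye:
print(int(np.max(np.abs(poisoned.astype(int) - img.astype(int)))))  # at most 2
```

The point of the sketch is the constraint, not the noise: whatever perturbation a real tool computes, it must stay within such a small budget to remain invisible to humans while still affecting a model's feature representations.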

Glaze: the protection of artistic style

Glaze applies a filter to artworks, modifying the pixels minimally so that they appear unchanged to human eyes but look different to AI models.

Unlike Nightshade, which is offensive, Glaze is a defensive tool that specifically protects your artistic style from being imitated.

Nightshade and Glaze limitations and practical considerations

These technical tools have some limitations:

  • Temporary effectiveness: what works today may no longer be effective tomorrow
  • Visibility on some images: Nightshade's changes are more visible on images with flat colors and smooth backgrounds
  • Complementary protection: these tools must be combined with legal protections

The evolving European regulatory framework

As of August 2, 2025, all providers of general-purpose AI models must provide technical documentation and comply with the EU copyright directive.

The penalties for non-compliance are substantial: up to 3% of global annual turnover or 15 million euros, whichever is higher.

This regulatory evolution reinforces the importance of a proactive legal strategy to protect your copyright.

Strategic recommendations

For optimal protection, it is recommended to:

✅ Have a professional opt-out clause drafted, adapted to your situation

✅ Combine technical protections (opt-out protocols via robots.txt, HTML tags, HTTP headers, etc.)

✅ Document your opt-out procedures to support possible legal actions

✅ Monitor regulatory developments and protection tools

The importance of specialized legal support

The increasing complexity of AI and intellectual property law requires expert legal support. A specialized lawyer can:

  • Draft tailor-made opt-out clauses
  • Advise you on the most appropriate protection strategy
  • Support you in your opposition procedures
  • Anticipate regulatory changes

Conclusion: act now to protect your creation

The challenge of protecting content in the face of generative AI is crucial for the future of creation. While the presumption of consent may seem unfavourable to creators, legal and technical tools exist to assert your rights.

Action must be proactive: the longer you wait, the more likely your works are to be integrated into AI training datasets. A strategy combining legal expertise and technical tools offers the best possible protection in the current regulatory context.

Do you want to protect your creations effectively? As an expert lawyer in AI and intellectual property law, I support you in drafting tailor-made opt-out clauses and implementing a comprehensive protection strategy.

The article has been written by Betty Jeulin and the English translation generated with AI.