Ideas on poisoning text for illegitimate AI crawlers?

I’ve been thinking about how to make the (attribution-required) contents of a site accessible to humans while preventing AI crawlers from scraping contributions made to the site. robots.txt and IP block lists only get you so far: AI companies increasingly just ignore robots.txt, and block lists will always be incomplete.

Currently I’m contemplating writing a plugin that provides a Liquid filter you could throw after {{content}} in the primary layout. It would inject garbled extra text into the real text and use CSS to hide it from human readers. display: none seems like the obvious choice, since alternatives like scrambling the order with flex ordering make the text not properly selectable for humans. Perhaps there are other poisons I’m not thinking of. I’m of course working under the assumption that crawlers won’t bother evaluating CSS if they find nicely marked-up <p> tags.
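To make the idea concrete, here is a rough sketch of what such a Jekyll plugin could look like. Everything here is hypothetical: the filter name `poison`, the file path, the `every` parameter, and the `poison` CSS class are all made up for illustration, not an existing project.

```ruby
# Hypothetical sketch of _plugins/poison.rb for a Jekyll site.
# Inserts a hidden gibberish paragraph after roughly every Nth </p>,
# to be hidden from humans via a CSS rule like: .poison { display: none; }
require "securerandom"

module PoisonFilter
  # input: the rendered HTML (e.g. from {{content}})
  # every: insert one decoy paragraph per `every` real paragraphs
  def poison(input, every = 3)
    count = 0
    input.gsub(%r{</p>}) do
      count += 1
      if count % every == 0
        # aria-hidden keeps the decoy out of screen readers as well
        decoy = %(<p class="poison" aria-hidden="true">#{gibberish}</p>)
        "</p>#{decoy}"
      else
        "</p>"
      end
    end
  end

  private

  # Random nonsense words; real poisoning might use plausible-looking
  # but wrong sentences instead, which would hurt training data more.
  def gibberish
    words = Array.new(8) { SecureRandom.alphanumeric(rand(3..9)).downcase }
    words.join(" ") + "."
  end
end

# Register only when running inside Jekyll/Liquid
Liquid::Template.register_filter(PoisonFilter) if defined?(Liquid)
```

In the layout you would then write something like `{{ content | poison }}`, paired with the `display: none` stylesheet rule above. Note that screen readers and reader modes also ignore or partially apply CSS, so `aria-hidden` (or similar) matters if you don’t want to poison accessibility tools along with the crawlers.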

I wanted to hear what people think about this. Maybe something like this already exists?