By now, we’ve all heard the stories: generative AI devours copyrighted content for breakfast. And while there’s been a lot of noise, little to nothing has been done about it. Photographers and videographers keep uploading their images online, easy pickings for scrapers, while lawmakers stand frozen, terrified that regulating too hard might kill the golden goose that is AI.
The U.S. Copyright Office has taken a strong stance on the outputs of generative AI, but has remained conspicuously silent on the voracious and unauthorized inputs—millions of copyrighted images that fuel these models.
This is where Overlai and its creator, Luke Neumann, step in.
He’s built a tool that gives creators control over what can and cannot be done with their content. Drawing on a smart stack of available technologies, Overlai lets users set their own rules, all via a free app.
We spoke with Luke to learn more:

- Can you share a bit about your background and what led you to develop Overlai?
I’m a filmmaker at heart and got started back in the early days of YouTube. I love the creator economy and the fact that it gives everyone the ability to find an audience. In 2016, I experienced my first “algorithm change,” and it felt like the audience I had worked to build was no longer seeing my content. So I adapted and tweaked my approach to the new algorithm. When I started to see what generative AI was capable of, I was excited but also concerned about how it would affect algorithms. I started Overlai to develop tools that help human creators navigate this new reality.
- What is Overlai, and what problem does it solve for creators?
Overlai is a suite of tools that do everything from protecting creators’ work to capturing authentic videos to helping creators monetize their unused footage for AI training, if they choose.
- How does Overlai protect photos, videos, and music from unauthorized AI training or scraping?
Our free iOS app allows creators to easily apply IPTC/C2PA/copyright protections to their work. We place this metadata on the blockchain and then use our custom invisible watermark to call back to it. Internally, we like to call it the “No Trespassing” sign for content creators. There are a lot of emerging standards out there, and we believe stacking them and putting them on the blockchain is the best way to go for creators right now.
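To make the metadata half of that “No Trespassing” sign concrete, here is a minimal sketch using ExifTool (a widely used metadata CLI) to write IPTC’s machine-readable data-mining opt-out plus a copyright notice. The tag name and PLUS vocabulary URI come from the IPTC Photo Metadata Standard; this illustrates the general technique only, not Overlai’s actual pipeline, and it covers neither the invisible watermark nor the blockchain anchoring.

```python
# Minimal sketch: embedding a machine-readable "do not train" opt-out with
# ExifTool (https://exiftool.org). Illustrates the IPTC/XMP layer only --
# not Overlai's pipeline, watermark, or blockchain step.
import subprocess

# PLUS vocabulary URI for "AI/ML training prohibited" (the IPTC Data Mining
# property added to the IPTC Photo Metadata Standard in 2023).
NO_AI_TRAINING = "http://ns.useplus.org/ldf/vocab/DMI-PROHIBITED-AIMLTRAINING"

def apply_opt_out(image_path: str, rights_holder: str) -> None:
    """Write a copyright notice and an AI-training opt-out into the file."""
    subprocess.run(
        [
            "exiftool",
            # Requires a recent ExifTool with PLUS DataMining tag support.
            f"-XMP-plus:DataMining={NO_AI_TRAINING}",
            f"-IPTC:CopyrightNotice=(c) {rights_holder}",
            f"-XMP-dc:Rights=(c) {rights_holder}, all rights reserved",
            "-overwrite_original",
            image_path,
        ],
        check=True,
    )

apply_opt_out("photo.jpg", "Jane Photographer")
```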
- You mention content poisoning as a strategy to block AI companies from unauthorized use. Can you elaborate?
Content poisoning is something we have been interested in and working on for a while. There is a project called “Glaze” that creates small distortions in the image that can interfere with model training. The issue with a system like this is that once it’s solved, it’s no longer viable, and Glaze has been solved for a while. However, new forms of image poisoning are emerging all the time. Our approach will be random. Not every photo will be poisoned. We will never use the same method. The techniques will be constantly evolving. This allows us to innovate and keep up with new AI developments to ensure our creators are constantly protected.
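As a toy illustration of that randomized strategy, a protection pipeline might skip some images entirely and pick a perturbation at random from an evolving pool. The perturbations below are trivial stand-ins for the sake of the sketch, not Glaze and not Overlai’s actual techniques:

```python
# Conceptual sketch of randomized poisoning: each image gets at most one
# randomly chosen perturbation, and the method pool is meant to rotate
# over time so no single countermeasure defeats it.
import random
import numpy as np
from PIL import Image

def _noise(arr: np.ndarray) -> np.ndarray:
    # Low-amplitude random noise, visually negligible.
    return np.clip(arr + np.random.randint(-2, 3, arr.shape), 0, 255)

def _channel_shift(arr: np.ndarray) -> np.ndarray:
    # Nudge one randomly chosen color channel by one level.
    arr = arr.copy()
    arr[..., random.randrange(arr.shape[-1])] += 1
    return np.clip(arr, 0, 255)

METHODS = [_noise, _channel_shift]  # a real pool would constantly evolve

def maybe_poison(path: str, out_path: str, rate: float = 0.5) -> None:
    """Randomly decide whether to perturb an image, and with which method."""
    img = np.asarray(Image.open(path).convert("RGB")).astype(np.int16)
    if random.random() < rate:                 # not every photo is touched
        img = random.choice(METHODS)(img)      # never a single fixed method
    Image.fromarray(img.astype(np.uint8)).save(out_path)

maybe_poison("photo.jpg", "photo_protected.jpg")
```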
- Could this approach create potential liability issues?
Not for us. This will be a feature Overlai users need to agree to participate in and it will come with a separate Service Agreement. We are simply providing a service and users can decide where they want that extra protection. After all, what good is a “No Trespassing” sign if there isn’t something tangible to back it up?
- Who is Overlai designed for—amateurs, professionals, or influencers?
Anyone. We have a wide range of users right now, including parents who want to do what they can to keep photos of their families out of AI models. It’s not strictly an IP thing; ultimately, it’s a privacy thing.
- Overlai offers protections like C2PA, IPTC metadata, and invisible watermarking. How do these work together to secure content?
We started by looking at the outcomes of some prominent cases involving AI training. Several were dropped due to “lack of a widely recognized and machine-readable opt-out”. C2PA and IPTC are becoming very widely recognized, and we wanted to make it as easy as possible for someone to assign these machine-readable opt-outs to their work. Additionally, we wanted to create a robust watermark specifically designed to survive most social media use. The Overlai watermark survives re-sharing, cropping, and other basic image tweaks.
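On the consuming side of those opt-outs, a well-behaved crawler or training pipeline could check the same machine-readable signal before ingesting an image. Here is a hedged companion sketch to the earlier example, again assuming ExifTool and the PLUS DataMining tag; metadata like this can be stripped in transit, which is exactly why the invisible watermark exists as a second layer:

```python
# Companion sketch: honoring the machine-readable opt-out before using an
# image for training. Covers the metadata layer only.
import json
import subprocess

def training_allowed(image_path: str) -> bool:
    """Return False if the image carries a PLUS data-mining prohibition."""
    out = subprocess.run(
        ["exiftool", "-json", "-XMP-plus:DataMining", image_path],
        capture_output=True, text=True, check=True,
    )
    value = json.loads(out.stdout)[0].get("DataMining", "")
    # PLUS "prohibited" values cover AI/ML and generative-AI training;
    # a production check would enumerate the full vocabulary.
    return "prohibited" not in str(value).lower()

if not training_allowed("photo.jpg"):
    print("Opt-out present: skipping this image for training.")
```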
- What’s the business model behind Overlai? Who pays for it, and what’s the path to long-term sustainability?
We received a good amount of Google Cloud credits initially and the price to protect an image is pretty low. As things scale, we’ll introduce ads to offset the costs. The goal is to make the protection app free for as long as possible. On the other side, we’re developing tools for creators to license their unused content to companies for AI training as well as a B2B/Enterprise version of our protection app. Ultimately, the B2B side drives revenue and the consumer tools drive brand.
- Overlai positions itself as part of a “human movement” against unauthorized AI use. What kind of impact do you hope to achieve?
I personally believe that creators will start to pivot towards authenticity soon. Fewer edits, more personalization, less “gloss.” As they find they can prove, and secure, the fact that their work comes from human hands, they will move towards it. I also hold out hope that distribution platforms will allow users to filter by human-generated content or feature it in their algorithms.
- If Overlai is widely adopted, do you think AI models will struggle to find enough training data? What are the broader implications of that?
I don’t think they need more data now; I think they need better data. In my opinion, you’ll get the best data by working with creators and professionals. Creators generally upload only 1% of their total work, and it’s always the best stuff. It’s all heavily skewed towards “gloss,” and so you’ll get AI that can only generate “gloss.” By digging into the archives and the hard drives, and by working with creators to make new material specifically designed for AI (and paying them to do it), we’ll get better AI and have a healthier creator ecosystem. Win-win.
- Beyond content protection, could Overlai be used proactively—for example, to track where an image appears online or enable safer content licensing?
Potentially, though I think this is still far from being a reality. I hope things get there, but we take a “one step at a time” approach.
- If you could push the boundaries of technology even further, what’s a feature or capability you would love Overlai to offer in the future?
I’m a filmmaker who loves YouTube and owes much of his success to that platform. They currently have C2PA enabled, but the ability for creators to use it is basically non-existent. I’d like to find a way to prove the authenticity of any camera, regardless of whether it has Content Credentials at the hardware level, so any creator can use these tools with the camera they have now.
Author: Paul Melcher
Paul Melcher is a highly influential and visionary leader in visual tech, with 20+ years of experience in licensing, tech innovation, and entrepreneurship. He is the Managing Director of MelcherSystem and has held executive roles at Corbis, Gamma Press, Stipple, and more. Melcher received a Digital Media Licensing Association Award and has been named among the “100 most influential individuals in American photography.”