Remember that massive comment war I mentioned in my previous post about AI copyright laws? There was one specific argument that kept popping up, and honestly, it drove me crazy. Every time someone defended the page that “borrowed” the artwork, they shouted the same three words: “It’s Open Source!”
The logic went something like this: “This image was made with Stable Diffusion. Stable Diffusion is open source. Therefore, this image belongs to everyone, and I can do whatever I want with it.”
It sounds convincing, right? It sounds like a “Gotcha!” moment. But here is the hard truth: It is completely wrong.
Confusing “Open Source Software” with “Public Domain Content” is the single biggest mistake new users make. It’s the reason why so many people get banned from communities like Civitai or Reddit. Today, we are going to move from the legal courtroom to the technical engine room. We are going to debunk the myth that “free tools” equal “free assets,” and we’ll discuss the ethical way to handle these images—including the controversial topic of when (and when not) to remove a watermark.
Example: Why Canon doesn’t own your photos
To understand why the “It’s Open Source” argument is flawed, we need to strip away the AI jargon and look at a real-world example.
Imagine the camera company, Canon, decides to release the blueprints for their newest camera for free. They say, “Here is the design, anyone can build this camera.” That is Open Source Hardware. Now, let’s say you take those blueprints, build the camera, and use it to take a breathtaking photo of a sunset.
Does Canon own your photo? No.
Does the person who downloaded the blueprints after you own your photo? Definitely not.
In the world of AI:
- The Code (Stable Diffusion): This is the “blueprint” of the camera. It is indeed open source.
- The Model (Checkpoint/LoRA): This is the “camera” itself, trained on specific data.
- The Output (The Image): This is the “photo” you took.
Just because the tool is free for everyone to use doesn't mean the product you create with it is community property. As we discussed in our deep dive into legal ownership, while the law is still catching up on the "Copyright" side, the "License" side is actually very specific.
Using “free” AI models comes with rules
Most people clicking through the installation of Automatic1111 or ComfyUI don't realize they are agreeing to legal terms. The most common license for open-source AI is the CreativeML Open RAIL-M license.
This license gives you a lot of freedom: you can use the model commercially, you can modify it, and you can share it. But it is not a "do whatever you want" card. It specifically prohibits using the model to break the law, harm others, or spread misinformation.
Furthermore, if you are downloading models from platforms like Civitai or Hugging Face, you need to look at the specific “Usage Rights” on the right-hand sidebar. Creators often attach custom conditions to their models, such as:
- “No selling images generated with this model.”
- “You must credit the creator.”
- “You cannot re-upload this model to generation services.”
You can read more about the model and its licensing here: https://en.wikipedia.org/wiki/Stable_Diffusion
If you take an image generated by a model with a “No Commercial Use” tag and use it for your Facebook ads because “it’s open source,” you aren’t just being rude—you are violating a contract. And yes, platforms can and will ban your account for that.
The ethics of “cleaning” vs. “stealing” (and how to use watermark removers)
This brings us to the most heated part of the debate: Watermarks.
In the drama I witnessed, the “thief” took an image, used a tool to wipe the original creator’s watermark, and then slapped their own logo on it. This is the definition of malice. It proves they knew the image wasn’t theirs and actively tried to hide the origin.
However, does that mean removing a watermark is always wrong? Not necessarily. As someone who works with image processing tools like Dewatermark, I see a clear distinction between “Theft” and “Restoration.”
The “Red Line” of Ethics:
Scenario A (Theft): You find an amazing piece of AI art on ArtStation. It has the artist’s signature. You use an AI remover to erase the name so you can sell the image as a print.
👉 Verdict: Unethical and potentially illegal. You are erasing the “provenance” (origin) of the work to claim false credit.
Scenario B (Restoration/Research): You are a prompt engineer. You generated an image using a public model, but the model “hallucinated” a messy, garbled text watermark in the corner (a common issue with models trained on stock photos). You use a tool to clean up that artifact to make the image usable for your project.
👉 Verdict: Completely acceptable. You are fixing a technical error in your own workflow.
Scenario C (Private Study): You are building a mood board for a client. You download a watermarked image to see how the composition fits your layout. You clean it up to present a professional draft, with the intention of licensing the real image or generating a unique one later.
👉 Verdict: Acceptable industry practice (Internal use).
Tools are neutral; it’s the intent that matters. If you are using technology to hide someone else’s effort, you are the problem. If you are using it to polish your own creative output, you are a professional.
Why Metadata makes “Stealing” risky
Here is one final warning for the “Open Source means Free” crowd: Removing a visible watermark doesn’t mean you’re safe.
Modern AI generators save a surprising amount of data in the image file itself, known as Metadata (or Generation Data). This includes the exact Prompt, the Seed number, the Sampler, the Steps, and even the Model Hash.
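You can see this for yourself in a few lines of Python. The sketch below assumes an A1111-style front-end, which stores the generation settings in a PNG text chunk under the key "parameters" (other tools may use different keys); here we write a fake chunk ourselves and then read it back with Pillow, just as a platform's detector would:

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Simulate what an A1111-style front-end does: stash the generation
# parameters in a PNG text chunk keyed "parameters".
meta = PngInfo()
meta.add_text(
    "parameters",
    "a breathtaking sunset over the ocean\n"
    "Steps: 30, Sampler: Euler a, Seed: 123456789, Model hash: abc123de",
)
Image.new("RGB", (64, 64)).save("generated.png", pnginfo=meta)

# Anyone who downloads the file can read that chunk straight back out:
img = Image.open("generated.png")
print(img.text.get("parameters"))
```

Cropping or re-encoding the image can strip these chunks, but a straight re-upload carries them along untouched.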
In the community, “Stealing a Prompt” (taking the metadata and replicating the style) is often considered just as bad as stealing the pixel data. Platforms like Civitai have detectors that can read this metadata. If you upload an image claiming it’s yours, but the metadata shows it was generated by someone else’s specialized workflow, you will be called out.
Some advanced tools are even embedding Invisible Watermarks inside the noise of the image. You might scrub the visual logo, but the digital signature remains. This is why “re-posting” is such a dangerous game for your brand’s reputation.
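To make the idea concrete, here is a deliberately simplified sketch of how an invisible watermark can hide in pixel data. This is *not* the scheme Stable Diffusion actually ships (that uses the `invisible-watermark` package with frequency-domain DWT-DCT encoding, which survives compression); it is a minimal least-significant-bit version that shows why scrubbing the visible logo leaves the signature intact:

```python
import numpy as np

def embed(pixels: np.ndarray, message: bytes) -> np.ndarray:
    """Hide `message` in the least significant bit of each pixel value."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = pixels.flatten().copy()
    # Clearing the lowest bit and OR-ing in the message bit changes each
    # pixel by at most 1/255 -- invisible to the eye.
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(pixels.shape)

def extract(pixels: np.ndarray, length: int) -> bytes:
    """Read `length` bytes back out of the least significant bits."""
    bits = pixels.flatten()[: length * 8] & 1
    return np.packbits(bits).tobytes()

# A tiny 8x8 grayscale "image" stands in for a real render.
img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
marked = embed(img, b"SDV1")
print(extract(marked, 4))  # → b'SDV1'
```

An eraser tool that only inpaints the corner logo never touches most of these bits, so the mark decodes perfectly afterwards.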
Conclusion
The world of Open Source AI is a gift. It gives us access to technology that used to cost millions of dollars. But “Open Source” refers to the freedom of the code, not the freedom to plunder the community.
Don’t be the person arguing in the comments that you have a right to take other people’s work. It makes you look uneducated about the very tech you claim to love. Instead:
- Check the License: Know what the model allows.
- Respect the Source: If you use someone else’s image, keep the credit.
- Use Tools Responsibly: Use watermark removers to clean artifacts or restore your own work, not to erase authorship.
Now that we’ve covered the legal side and the technical misconceptions, you might be wondering: “Okay, I want to do this right. How do I actually credit people properly so I don’t get yelled at?” That is exactly what we will cover in the next post.