Rethinking Open Source Responsibility (EOL)

For a while now, I’ve been playing with a thought experiment: what happens when your code is used for something you completely disagree with?

Open source is great. It encourages collaboration, innovation, accessibility. But it never really asks whether there should be any boundaries at all. Right now, if you use a permissive license, you’re saying: “Take this. Do whatever you want.” And sometimes, that “whatever” means mass surveillance, biased AI systems, or much worse.

Some people say that’s just how open source works. You release something, and then it’s out of your hands. But I started to question that. Does it really have to be?

(Fun fact: Just raising this question is apparently enough to get your post deleted in some open source circles. cough cough. Seems like the debate is already “settled.”)


What is the Ethical Open License (EOL)?

The Ethical Open License (EOL) is a made-up licensing model that asks a simple question: can we include ethical restrictions in open source?

Not to block regular users or kill innovation. Just to draw a line in the sand where software use crosses into unethical territory.

EOL would prohibit use in things like:

  • Mass surveillance: no large-scale tracking, unauthorized data collection, or government spying
  • Autonomous weapons: no military AI, targeting systems, or automated killing machines
  • Discriminatory AI: no systems that reinforce social bias or make decisions based on race, gender, or class
  • Exploitation networks: no platforms for child abuse, trafficking, or exploitation of vulnerable people

Obviously, someone has to define what’s “ethical,” and yeah, that’s a big conversation. But pretending this isn’t a problem doesn’t make it go away.


How would EOL work?

The structure is familiar. You can use, modify, fork, contribute. But there’s a catch: if you cross one of the ethical lines, you lose the right to use the software.

That would require some kind of process to handle violations. Ideally, an independent board would exist to review complaints and evidence. That brings its own headaches, obviously. But having no process at all leads to chaos and loopholes.

Enforceability is a big issue. And no, this wouldn’t stop people who already ignore laws. But licenses aren’t just about catching criminals. They help shape norms. They tell people, “We don’t want this used for that.” And sometimes, that matters more than it seems.


Who pays for this?

EOL, like most open source licenses, would be free to use. But if you want enforcement, you’ll need infrastructure. That means money.

Some possible options:

  1. Do-it-yourself (Free)
    Communities handle enforcement through public discussion. Cheap, but messy and unreliable.

  2. Independent Ethics Review Board (IERB)
    Costs would include:

    • Legal checks
    • Investigation of misuse
    • Admin work

    Funding could come from:

    • Companies that believe in ethical tech
    • Donations and crowdfunding
    • Commercial users contributing small fees to cover costs

  3. Hybrid model
    Small projects rely on the crowd. Big users fund structured oversight.

The details aren’t figured out. But if you care about ethical use, you can’t ignore the question of sustainability.


What about royalties?

Not sure this even makes sense yet. But here’s the idea: if your company makes millions directly off EOL-licensed software, maybe some of that should support the project and its values.

Proposed structure:

Annual Gross Revenue        Royalty Rate
Less than $1,000,000        0%
$1,000,000 – $5,000,000     1%
Over $5,000,000             2%

Only revenue directly tied to the licensed software or products built from it would count. This wouldn’t touch hobbyists, nonprofits, or tiny startups.
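
To make the tiers concrete, here’s a minimal sketch of the calculation in Python. It assumes the rate applies flat to all qualifying revenue rather than marginally per bracket (the draft doesn’t specify), and the function names are hypothetical:

```python
def royalty_rate(qualifying_revenue: float) -> float:
    """Return the royalty rate for annual gross revenue directly
    tied to EOL-licensed software (tiers from the table above)."""
    if qualifying_revenue < 1_000_000:
        return 0.00
    if qualifying_revenue <= 5_000_000:
        return 0.01
    return 0.02


def royalty_due(qualifying_revenue: float) -> float:
    # Assumption: the rate applies to the full qualifying amount,
    # not marginally per bracket -- the draft leaves this open.
    return qualifying_revenue * royalty_rate(qualifying_revenue)


# A company making $3M directly from an EOL-licensed product:
print(royalty_due(3_000_000))  # 30000.0
```

A marginal-bracket scheme (like income tax) would avoid the cliff at each threshold, but it’s harder to explain in a license. Either way, defining which revenue counts as “directly tied” is the hard part.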

Money from this could support:

  • Investigations and legal help
  • Security audits
  • Keeping the license alive and enforced


What people push back on (and why that’s valid)

“Open source is supposed to be neutral”

That’s the traditional line. Developers provide tools, and it’s not their fault what people do with them.

But that ignores reality. AI systems don’t just exist. They shape lives. Algorithms decide what you see, what you believe, what opportunities you get. Code has consequences. Pretending it doesn’t isn’t neutral. It’s passive.

“This isn’t open source anymore”

Yeah, maybe not. At least not under the OSI’s definition, which explicitly forbids discriminating against fields of endeavor, and that’s exactly what EOL does. If open source has to allow everything, then EOL doesn’t qualify. Still, it raises a valid point about what “freedom” in tech really means.

“You can’t license ethics. They’re subjective”

Sure. Ethics aren’t fixed forever. But neither are laws or social norms. We still write them down, argue about them, change them. This isn’t about getting it perfect. It’s about saying something is better than saying nothing.

“This would never hold up legally”

Fair. It would need serious legal review to avoid vague language and gray areas. Right now, it’s not ready for prime time. It’s a draft with a bunch of open questions.

“People doing evil won’t follow the license anyway”

Also true. But this isn’t about stopping everyone. It’s about drawing a line. And that might affect how the more cautious or image-conscious companies act.


So is EOL a good idea?

Honestly? Probably not. But ignoring this whole issue is worse.

Open source has a responsibility problem. And if asking about it is already considered taboo, maybe that’s the clearest sign that something needs to change.

The Ethical Open License (EOL) is on GitHub. It’s messy, unfinished, and full of flaws. But maybe that’s the point.

👉 github.com/timkicker/EOL

Whether anything comes of it or not, at least it starts a conversation that a lot of people are clearly avoiding.


I’m not trying to replace MIT, GPL, or anything like that. But we really need to stop pretending that software is just neutral math.

What people do with our code matters. And maybe, just maybe, we should care.

Feb 24, 2025