
Artificial intelligence is reshaping the music industry, but not always in ways that benefit artists. Many musicians see their work scraped without permission, fueling AI systems that generate playlists and even entire songs. Poison Pill, a UK startup, wants to change that by giving creators a way to fight back.
The Birth of Poison Pill
Founded by Ben Bowler, known for Chew·tv and Aux, the company launched its beta last week. It also secured a finalist spot in the 2025 Music Ally SI:X startups contest. Poison Pill uses adversarial noise algorithms to protect tracks: the perturbations are subtle enough to be inaudible to humans, yet they confuse AI models into misclassifying genres and instruments.
Bowler explained the motivation clearly: “Most musicians are pissed about the current state of AI in music: well-funded companies are scraping music without permission, creating services that claim to replace them.” Poison Pill’s first goal is to protect 20% of independent music. By doing so, Bowler hopes to force AI firms to negotiate fair licensing deals.
The technology works by exploiting the features AI models rely on as they learn. A model might, for example, identify indie music by the resonance of a guitar. Poison Pill overlays low-level noise whose patterns resemble classical music, so listeners hear no difference while the model mislabels the track. If enough artists adopt the technique, training on scraped catalogues becomes unreliable and costly.
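Poison Pill has not published its algorithm, but the general idea resembles adversarial perturbation techniques from machine-learning research. The sketch below is illustrative only: it uses a hypothetical PyTorch genre classifier and a simple FGSM-style step to show how a tiny, carefully directed change to a waveform can push a model away from the correct label. Every model, class index, and parameter here is an assumption for demonstration, not the startup's implementation.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for an audio genre classifier (not Poison Pill's model).
class GenreClassifier(nn.Module):
    def __init__(self, n_genres: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=1024, stride=512),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(16, n_genres),
        )

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        return self.net(waveform)

def adversarial_perturbation(model, waveform, true_label, epsilon=1e-3):
    """FGSM-style step: nudge the audio in the direction that increases the
    classifier's loss, keeping the change far below audible levels."""
    waveform = waveform.clone().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(waveform), true_label)
    loss.backward()
    perturbed = waveform + epsilon * waveform.grad.sign()
    return perturbed.clamp(-1.0, 1.0).detach()

# Usage: 10 seconds of mono audio at 44.1 kHz, with a made-up "indie" class index.
model = GenreClassifier()
audio = torch.randn(1, 1, 441_000) * 0.1
label = torch.tensor([3])
protected = adversarial_perturbation(model, audio, label)
print((protected - audio).abs().max())  # the perturbation stays tiny
```

In practice a system like this would need to transfer across many unknown models and survive compression, which is far harder than fooling a single known classifier; the sketch only conveys the underlying principle.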
Wider Creative Protection
Bowler believes this disruption will push companies toward fairer practices. “The system doesn’t need to be perfect to effectively change the power balance between creators and scrapers,” he said. Beyond music, the startup plans to expand protections to photographers, filmmakers, and other creators.
Poison Pill is still self-funded but has already attracted investor interest. The team is also in talks with labels and rights holders about offering API access for larger catalogs.
The message is simple: creators deserve control over their work. By embedding resistance directly into music, Poison Pill offers artists a tool to demand respect—and fair compensation—in the AI era.
