Kristoffer Rom could not believe the numbers coming in from Spotify as he stared at his computer screen.
His independent record label, Tambourhinoceros, had released the synth-pop track Hey Kids by the Danish-Chilean singer Molina in 2018, roughly a year and a half before it took off. The initial response was underwhelming. Months later, however, the song began gaining popularity on TikTok and YouTube as creators used it as a catchy, uplifting soundtrack for all kinds of videos and animations.
From there, Spotify, Apple Music, and other streaming services picked up on the momentum. By March 2022, the song was generating more than 100,000 streams per day. "It was amazing to see all that traction," Rom said.
The team's early euphoria was quickly dampened by an unnerving discovery, though. Beyond the TikTok and YouTube creators, streaming-music scammers, a more harmful if less publicly visible part of the modern media ecosystem, had also noticed the growing popularity of Hey Kids. These fraudsters have figured out how to profit from mainstream music platforms, either by uploading barely altered copycat versions of popular songs and collecting the resulting per-stream payouts, or by mislabeling uploads so that listeners unintentionally stream the scammers' own music or advertisements.
Much to Rom's growing dismay, ripped-off versions of Hey Kids, slightly modified but largely indistinguishable from the real thing, were suddenly proliferating across the streaming landscape, siphoning listeners away from Molina and pocketing the resulting royalties. Worse yet, nobody at the major services appeared to be doing anything effective to stop the knockoffs from spreading.
“You have the ecstatic joy of people doing creative, great things with the music you’ve put out, on the one hand,” Rom said. “And the total frustration and anger of witnessing people trying to exploit it.”
Much of the music industry is currently preoccupied with the latest threat, or opportunity, to emerge from Silicon Valley. As AI-generated songs of mysterious provenance go viral on streaming platforms, industry leaders, most notably Spotify Technology SA Chief Executive Officer Daniel Ek, have been quick to vow increased vigilance on behalf of labels, artists, and copyright holders. But while the platforms cautiously size up that shiny new disruptive force, labels and managers say a more mundane kind of fraud is already widespread.
At least 10% of streaming activity is fraudulent, according to estimates from Beatdapp, a company that works with streaming providers to identify and eliminate fraud. What might look like simple, everyday deceit adds up to significant larceny at the scale of digital music: by Beatdapp's estimate, streaming fraud may misallocate almost $2 billion in income annually.
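The two Beatdapp figures above imply a rough size for the overall royalty pool. The snippet below is only a back-of-the-envelope consistency check; both numbers come from the article, and the simple proportionality (fraudulent streams diverting a matching share of payouts) is an assumption made for illustration.

```python
# Back-of-the-envelope check on the Beatdapp figures cited above.
# Assumes misallocated income scales in proportion to fraudulent streams.
fraud_share = 0.10        # "at least 10% of streaming activity"
misallocated_usd = 2e9    # "almost $2 billion" per year
implied_pool = misallocated_usd / fraud_share
print(f"implied annual royalty pool: ${implied_pool / 1e9:.0f}B")
# prints "implied annual royalty pool: $20B"
```

The implied $20 billion pool is in the same ballpark as reported global streaming revenue, which is why the 10% and $2 billion figures are consistent with each other.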
According to the music-business figures who spoke with Bloomberg, the majority of the problems they confront arise on the two biggest international streaming services, Spotify and Apple Music. They say YouTube Music from Alphabet Inc. has been considerably better at policing the problem. That's in part because YouTube has for years maintained a potent Content ID system that frequently flags unauthorized content and then gives rightsholders the option to either remove it entirely or monetize it themselves. (For instance, unofficial YouTube versions of Hey Kids now send any resulting ad revenue back to Molina's team at Tambourhinoceros.)
Ben Gaffin, an artist manager and the CEO of Sound Advice, a music-services company that represents producers, artists, and media firms, says he frequently runs into one particular kind of streaming fraud: someone records a track, deliberately tags it with the name of a different, more popular artist, and releases it to the streaming platforms. The platforms' algorithms then automatically surface the mislabeled track to the real artist's followers and slot it into popular playlists, and the false metadata drives an uptick of undeserved streams.
During a recent interview with Bloomberg, Gaffin went looking for an example and instantly found one of these tracks "featuring" his client Clams Casino.
The mislabeled track had already racked up more than 55,000 Spotify plays by the time Gaffin stumbled upon it. He said he usually learns about counterfeits when Spotify for Artists notifies him that new music is about to be released when none is actually planned, or when fans begin complaining vehemently about a new song they don't like.
It's a vulnerability in the system that is being exploited, according to Gaffin.
"Every artist should have a code or security thing," Elitzer added. "It seems like a fairly easy fix. It's too late by the time you see it."
Part of the challenge is that, in the streaming era, virtually anyone can post music to popular streaming services with little oversight or review. A range of services, including DistroKid, CD Baby, and TuneCore, let users distribute their songs to the major platforms with do-it-yourself tools. Getting new music into stores, once a labor-intensive manual operation, has become mostly automated.
"Way, way, way back in the day, we had a team of people that listened to every CD that came in the door," said Christine Barnum, chief revenue officer at CD Baby. That is no longer practical at the company's current scale.
As the volume of amateur content being submitted to streaming services has skyrocketed, businesses like Spotify, originally built as platforms for professional musicians, have come to resemble user-generated-content platforms. According to Spotify, there are over 100 million songs available on its service, with some 60,000 tracks being added daily as of February 2021. Apple Music and Amazon Music have each claimed catalogs of 100 million songs.
The expanding scale of streaming services makes it much easier for mislabeled tracks to slip through, according to Vickie Nauman, founder and CEO of CrossBorderWorks, a music and technology firm.
It was simpler to monitor, she continued, “certainly in the world before we had 100,000 songs uploaded a day.”
Most of the time, rightsholders must manually submit takedown requests for every problematic track they find, a process that can be especially onerous for small, independent companies.
Executives at Tambourhinoceros still discover new uploads that rip off Hey Kids. Scammers have modified some track titles with variations of TikTok hashtags to lure unwary listeners. Others upload slightly sped-up or slowed-down versions that appear to have been altered to slip past fraud-detection software but still sound remarkably similar to the original composition.
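The cat-and-mouse game described above, speed-shifted uploads dodging automated filters, can be sketched in code. The snippet below is a toy illustration, not any platform's actual detection method: the band count, the smoothing, and the synthetic "melodies" are all invented for the demo. It builds a coarse spectral fingerprint from energy in log-spaced frequency bands, so a slightly sped-up copy (whose pitch shifts along with its tempo) still matches the original far better than an unrelated track does.

```python
import numpy as np

SR = 22_050  # sample rate in Hz (assumed for the demo)

def tone_sequence(freqs, dur=0.25):
    # Concatenated sine tones: a crude stand-in for a melody.
    t = np.arange(int(SR * dur)) / SR
    return np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])

def speed_up(x, factor):
    # Naive resampling, mimicking a scammer's sped-up upload.
    idx = np.arange(0, len(x) - 1, factor)
    return np.interp(idx, np.arange(len(x)), x)

def fingerprint(x, n_bands=32):
    # Energy in log-spaced frequency bands, lightly smoothed so a
    # small pitch/tempo shift only smears energy into nearby bands.
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / SR)
    edges = np.logspace(np.log10(50), np.log10(SR / 2), n_bands + 1)
    bands = np.array([spec[(freqs >= lo) & (freqs < hi)].sum()
                      for lo, hi in zip(edges[:-1], edges[1:])])
    bands = np.convolve(bands, [0.25, 0.5, 0.25], mode="same")
    return bands / bands.sum()

def similarity(a, b):
    # Pearson correlation between the two fingerprints.
    return float(np.corrcoef(fingerprint(a), fingerprint(b))[0, 1])

original = tone_sequence([440, 554, 659, 880])
knockoff = speed_up(original, 1.04)            # 4% faster knockoff
unrelated = tone_sequence([130, 196, 1245, 2093])

print(similarity(original, knockoff))   # high: near-duplicate
print(similarity(original, unrelated))  # low: different "song"
```

Real fingerprinting systems (YouTube's Content ID among them) are far more sophisticated, matching short time-localized hashes rather than whole-track spectra, but the principle is the same: a detector has to be invariant to exactly the kinds of small alterations the scammers apply.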
The most popular fake upload the label found had racked up more than 700,000 plays, potentially costing over $2,000 in lost income.
For anyone, but especially for an independent label from Denmark, "that's a lot of money," Rom said. "We must generate income from the work that we actually do."
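The knockoff's 700,000 plays and roughly $2,000 in lost income cited above also imply a per-stream payout, a figure worth making explicit. This is a simple division of the article's own numbers, not an official rate from any platform.

```python
# Implied per-stream payout from the article's figures for the
# most popular knockoff upload (both numbers are the article's).
plays = 700_000        # streams the fake upload racked up
lost_income = 2_000    # estimated dollars diverted
per_stream = lost_income / plays
print(f"${per_stream:.4f} per stream")  # prints "$0.0029 per stream"
```

A fraction of a cent per stream is why these schemes only pay off at volume, and why a single viral knockoff can quietly drain meaningful money from a small label.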