This week, we're talking:
RIAA, Universal, Warner and Sony's landmark lawsuit against Suno and Udio, perhaps getting one step closer to an answer RE: does model training fall under fair use?
APRA is heading to markup. Can it survive being watered down?
Changing privacy policies in the dead of night is always a bad idea; whether or not regulators will make you pay for it is another question.
Opting out of Meta's machine learning training is hard on purpose.
Gamification was all the rage. Did it hurt us more than help us?
Toys 'R' Us and that ad.
My Take:
In today's batshit example of "training my model is just fair use" insanity, I give you AI music companies Suno and Udio. Both were just hit with a lawsuit spearheaded by RIAA, Universal, Warner and Sony alleging the infringement of copyrighted music "at an almost unimaginable scale." The last time the record industry took up arms in this way, Napster and every company in its cohort were sued into oblivion and ultimately ceased to exist.
For their part, Suno and Udio aren't exactly running from the allegations. In a Rolling Stone profile of Suno published in March, investor Antonio Rodriguez all but acknowledged that they were playing fast and loose with copyrighted material. He understood when he invested that a lawsuit was likely coming but saw that as just the cost of doing business. He even went so far as to say that he wanted it that way: "Honestly, if we had deals with labels when this company got started, I probably wouldn't have invested in it. I think that they needed to make this product without the constraints." I don't think the word "constraints" has ever done quite so much heavy lifting.
I believe that music is primal and distinctly human. I'm all about putting the machines to work, but only in the service of musicians and people who love music, and not the other way around.
That's why Boombox exists. We're a team of musicians who want to use AI to streamline the monotonous, boring tasks that go into music creation and to make music collaboration cooler, faster, and better.
And that's why we've created Boombot, your AI assistant producer. We built Boombot with the belief that AI should augment musical artistry, not replace it, and certainly never steal or "borrow" from it. Boombot is trained on AI infrastructure using ethically sourced data only. It does not, nor will it ever, train on the intellectual property of Boombox users or any other musician's art.
It's just another reminder that AI, like any other tool, can be used in the service of human flourishing, or it can deplete and degrade human experience. Though with all these bad actors, musicians could easily be forgiven for losing sight of that.
Stories I'm Following:
Federal privacy law faces new hurdles ahead of markup by Tim Starks VIA CyberScoop
Tomorrow is a big day in data privacy land as the APRA (the federal privacy bill) goes to markup. 72% of US adults say they want the bill, but it is already getting watered down. The latest version of the bill ditches protections against data-driven discrimination and bias in AI. Big Tech is arguing that AI is separate from data privacy… and if you buy that, I've got some oceanfront property to sell you in Arizona.
When the Terms of Service Change to Make Way for A.I. Training by Eli Tan VIA NYTimes
I've written before about the tendency of large companies to change their privacy policies quietly (Adobe) and cross their fingers that consumers don't notice. Both this story and the one below really come down to the fact that machine learning has exhausted most publicly available data and remains hungry for more. The FTC has already informed tech companies that changing privacy policies to scrape old data "could be an unfair or deceptive" practice. It seems that Big Tech companies missed the memo. Whether or not regulators will act remains to be seen.
What I'm Reading:
How to opt out of Meta's AI training by Melissa Heikkilä VIA MIT Technology Review
I've never had a Facebook or Instagram account, and I'm increasingly grateful for that. For everybody else in the world, your privacy is under attack. Starting TODAY, Meta can use your data to train its AI models. It's a sneaky invasion of privacy, and a good reminder of why deceptive patterns are increasingly referred to as "privacy Zuckering." The cumbersome process to opt out of this violation is outlined at the link above.
How gamification took over the world by Bryan Gardiner VIA MIT Technology Review
A long read but a must-read nonetheless. Gardiner's piece poses the question: gamification was always just behaviorism dressed up in pixels and point systems, so why did we fall for it? The TL;DR is this: "Instead of liberating us, gamification turned out to be just another tool for coercion, distraction, and control."
Toys 'R' Us uses OpenAI's Sora to make a brand film about its origin story and it's horrifying by Danny Gallagher VIA Engadget
This is a PR stunt, but even stunts can be better executed. Lots more thoughts here, but they're perhaps best summed up by comedian Mike Drucker.