The FIR Podcast Network Everything Feed

Subscribe to receive every episode of every show on the FIR Podcast Network
Episodes
  • FIR #509: Does Corporate Content Need Copyright Protection?
    Apr 14 2026
    When bad actors use AI tools to clone a musician’s voice and upload synthetic versions of their songs, they can then file copyright claims against the original artist’s content — and win, at least initially. That’s because the systems platforms use to validate copyright claims are automated and configured to treat whoever files first as the rightful holder. The result: musicians like Murphy Campbell, a folk artist from North Carolina, lose both revenue and control of their own creative identity. The same mechanism works just as well against any organization that publishes audio or video content online. In this midweek episode, Shel Holtz and Neville Hobson break down how the scam works, why it matters to communicators, and what you should be doing right now — before an incident forces your hand.

    Links from this episode:

      • AI Cloned Her Voice, Then Claimed Her Songs
      • ‘This Is Not Me’: Inside the AI Scams Driving Musicians Crazy
      • A Folk Musician Became a Target for AI Fakes and a Copyright Troll
      • A traditional musician became a victim of AI imitations and a copyright aggressor
      • ‘AI slop’: Emily Portman and musicians on the mystery of fraudsters releasing songs in their name

    The next monthly, long-form episode of FIR will drop on Monday, April 27. We host a Communicators Zoom Chat most Thursdays at 1 p.m. ET. To obtain the credentials needed to participate, contact Shel or Neville directly, request them in our Facebook group, or email fircomments@gmail.com. Special thanks to Jay Moonah for the opening and closing music. You can find the stories from which Shel’s FIR content is selected at Shel’s Link Blog. You can catch up with both co-hosts on Neville’s blog and Shel’s blog.

    Disclaimer: The opinions expressed in this podcast are Shel’s and Neville’s and do not reflect the views of their employers and/or clients.

    Raw Transcript

    Neville Hobson: Hi everyone and welcome to For Immediate Release, this is episode 509. I’m Neville Hobson.

    Shel Holtz: And I’m Shel Holtz.
And today we’re going to talk about something else that communicators need to worry about. I think we need to develop a worry list for communicators. This one starts with a tale about a folk singer from the mountains of Western North Carolina. She’s named Murphy Campbell. She plays banjo and dulcimer and records old Appalachian ballads, some of them written by her own distant relatives. And she posts videos of herself performing in the woods. She has about 7,800 monthly listeners on Spotify. And she is, as Shelly Palmer put it in a recent column, exactly the kind of artist the copyright system was designed to protect.

In January, some of her fans started messaging her about songs on her Spotify profile that she had never uploaded. Someone had taken her YouTube performances, run them through AI voice cloning tools, and posted synthetic versions of her songs under her name on streaming platforms. These fake tracks, not to put too fine a point on it, were really bad. Her dulcimer sounded like — and these were her words — a warbled metallic mess. Her voice had been deepened and auto-tuned into what she called a bro country singer.

But here’s where it gets interesting for those of us in communications, because that’s not the end of the story. It didn’t stop at impersonation. Whoever uploaded the fakes through a legitimate music distributor called Vydia (V-Y-D-I-A) then filed copyright claims against Campbell’s original YouTube videos — the very videos the AI had been trained on. Because YouTube doesn’t use humans to review initial copyright claims, Campbell stopped earning revenue on her own content. That revenue started going to the person who had filed the copyright claims. She described herself as being in a weird limbo where “I’m telling robots to take down music that robots made.” Shelly Palmer called this a reverse copyright scam, and he confirmed, speaking to other content creators off the record, that this is more common than he had believed.
Now, I know what you’re thinking — music streaming platforms, artists, what does this have to do with me? And the answer is everything. Because the mechanism that elbowed Murphy Campbell out of earning royalties for her own music will work just as well against any organization that publishes content on platforms with automated enforcement systems. That is virtually every organization that has a YouTube channel, a podcast feed, or any kind of public video or audio presence. So here’s the structural problem as Palmer frames it. The copyright system we have was built on a foundational assumption that the first entity to register a claim is the rightful owner. That assumption held when human creativity was the bottleneck. It breaks completely when AI can generate a synthetic version of any content in seconds using any voice. Think about what your organization puts out there publicly — executive speeches, earnings calls, thought ...
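The structural failure described here, automated systems that treat whoever files first as the rightful owner, can be reduced to a toy model. Below is a minimal illustrative sketch in Python; all names are hypothetical, and this is in no way a representation of YouTube’s or any other platform’s actual claim-matching logic:

```python
# A deliberately naive sketch of "first filer wins" claim automation.
# ClaimRegistry and its methods are hypothetical names, invented for
# illustration only -- not any real platform's implementation.

class ClaimRegistry:
    """Toy model: no human review; the first claimant is treated as owner."""

    def __init__(self):
        self._owners = {}  # content fingerprint -> first claimant on record

    def file_claim(self, fingerprint, claimant):
        # Whoever files first becomes the recorded "rightful holder";
        # every later filer (including the real creator) is ignored.
        self._owners.setdefault(fingerprint, claimant)
        return self._owners[fingerprint]

    def monetization_goes_to(self, fingerprint):
        return self._owners.get(fingerprint)


registry = ClaimRegistry()
registry.file_claim("song-123", "scammer")  # AI clone uploaded, claim filed first
registry.file_claim("song-123", "artist")   # real creator files second: ignored
print(registry.monetization_goes_to("song-123"))  # -> scammer
```

The point of the sketch is the `setdefault` call: once the first claim lands, no later claim, not even the genuine creator’s, changes anything, which is exactly the limbo Campbell found herself in.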
    21 m
  • ALP 301: Five words every agency owner needs to understand
    Apr 13 2026

    Most agency owners spend a lot of time thinking about growth, clients, and revenue. Far fewer think carefully about the words that define how they actually operate their businesses. In this episode, Chip and Gini dig into five of those words: leadership, management, accountability, responsibility, and authority.

    Leadership and management aren’t the same thing. Leadership is about vision and getting people to follow you. Management is about making the work happen. Knowing which one you’re stronger at is the first step toward building a team that covers your gaps.

    Accountability is the wrong place to start when a team member isn’t delivering. You can’t hold someone accountable for something you never clearly assigned, and you can’t hold them accountable if you didn’t give them the authority to get it done.

    Gini offers a useful comparison: when a client hires you for your expertise and then second-guesses every decision, it’s demoralizing. That’s exactly how your team feels when you delegate the work but not the authority to do it.

    The episode closes with a simple reminder. If you want more freedom as an owner, you have to be willing to actually let go. And if your team isn’t capable of handling more responsibility, you should be asking yourself why you hired them. [read the transcript]


    21 m
  • FIR #508: Inside AI’s Human Raw Material Supply Chain
    Apr 8 2026
    When workers lose their jobs, many turn to gig work to earn income while waiting for new opportunities. Increasingly, companies that hire gig workers are shifting from delivering food or sharing rides to creating content to train AI systems. This raises various communication and ethical issues. Neville and Shel explain what’s happening and discuss the implications in this short midweek episode.

    Links from this episode:

      • The jobs AI can’t do – and the young adults doing them
      • Thousands of people are selling their identities to train AI – but at what cost?
      • The gig workers who are training humanoid robots at home
      • Gig economy becomes new AI training ground

    The next monthly, long-form episode of FIR will drop on Monday, April 27. We host a Communicators Zoom Chat most Thursdays at 1 p.m. ET. To obtain the credentials needed to participate, contact Shel or Neville directly, request them in our Facebook group, or email fircomments@gmail.com. Special thanks to Jay Moonah for the opening and closing music. You can find the stories from which Shel’s FIR content is selected at Shel’s Link Blog. You can catch up with both co-hosts on Neville’s blog and Shel’s blog.

    Disclaimer: The opinions expressed in this podcast are Shel’s and Neville’s and do not reflect the views of their employers and/or clients.

    Raw Transcript

    Shel Holtz: Hi everybody and welcome to episode number 508 of For Immediate Release. I’m Shel Holtz.

    Neville Hobson: And I’m Neville Hobson. Over the past few weeks, I’ve come across a set of stories that all point to something quite striking — not just how AI is evolving, but how it’s being built. Increasingly, the raw material behind AI isn’t just data scraped from the web. It’s us: our voices, our movements, our everyday lives, and increasingly, our identities. There’s a new layer of the gig economy emerging. We’ll explore this in just a minute.
People are being paid, typically in small amounts, to record themselves walking down the street, having conversations, folding laundry, even just going about their day. That data is then used to train AI systems because those systems need examples of how people actually speak, move, and interact in the real world. In one case, delivery drivers in the US are being redirected to film tasks for robotics training. Platforms are turning existing gig workers like delivery drivers into distributed data collectors for AI. In another example, people are selling access to their phone conversations through apps that pay contributors to upload voice and text data. And in yet another, workers are strapping phones to their heads to record household chores so humanoid robots can learn how to move. The work is global, fragmented, and often invisible, with workers spanning Nigeria, India, South Africa, the US, and far beyond. Humans are no longer just users of AI — they are raw material suppliers. In China, there are even state-run centers where workers wear virtual reality headsets and exoskeletons to teach robots how to carry out everyday physical tasks. What we’re seeing is the rise of what you might call data labor, where identity itself becomes part of the work. There’s a clear driver behind it. AI companies are running out of high-quality training data. The open web isn’t enough anymore, and synthetic data has its limits. So the industry is turning to something else: real human lived experience. Because if you want a robot to understand how to load a dishwasher, navigate a room, or interact with objects, you need to see humans doing it at scale. But there’s an interesting contrast here. One of the stories highlights a 23-year-old in the US, a guy called Cale Mouser, who earns well into six figures repairing diesel engines. It’s something he’s developed great skill in doing. 
His work depends on judgment, experience, and problem solving in the real world — things that don’t easily translate into data. So while some people are being paid small amounts to generate data for AI systems, others like Cale Mouser are building highly valuable careers precisely because their skills can’t be reduced to it. And that contrast feels important. Because on one level, this new kind of work does create opportunity. For some people, especially in lower-income regions in the Global South, this is real income — paid in dollars, flexible and accessible. But there’s another side to it. Because what people are actually selling isn’t just time, it’s identity: their voice, their behavior, their presence in the world. And often once that data is handed over, it’s gone — permanently licensed, reused, repurposed, potentially in ways the individual never sees or understands. So you have this asymmetry: individuals earning small immediate payments while companies build long-term, highly valuable AI systems. Perhaps it’s a new version of the Mechanical Turk for the AI era. And that raises a deeper question. What does it mean when ...
    21 m