Your AI Friends 'Love' You...

Welcome back to The FAIK Files! In this week's episode: OpenAI is feeling the pressure to meet demand, widening the scope of "Stargate" and eyeing debt financing for chips. Creepy AI companion toys are posing risks to student mental health, and we look at a family's unsettling week with one. Microsoft is bringing Claude to Copilot, diversifying beyond OpenAI, while xAI is suing OpenAI for allegedly poaching talent and secrets. And an AI safety tool sparked backlash after flagging art as porn and deleting emails, leading to a student lawsuit.

----> Check out Perry and Cameron Malin's "Deepfake Ops" course: https://maven.com/perception-lab/deepfake-ops

Want to leave us a voicemail? Here's the magic link to do just that: https://sayhi.chat/FAIK

You can also join our Discord server here: https://discord.gg/cU7wepaz

*** NOTES AND REFERENCES ***

OpenAI's 10+ Gigawatt Plans:
OpenAI Feeling the Pressure (Reuters): https://www.reuters.com/business/media-telecom/openai-under-pressure-meet-demand-widens-scope-stargate-eyes-debt-finance-chips-2025-09-24/
Sam Altman Blog Post: https://archive.is/iPaU9
OpenAI Partnership with NVIDIA (Fortune): https://fortune.com/2025/09/24/sam-altman-ai-empire-new-york-city-san-diego-scary/
NVIDIA letter of intent: https://nvidianews.nvidia.com/news/openai-and-nvidia-announce-strategic-partnership-to-deploy-10gw-of-nvidia-systems

Creepy AI Toys:
AI 'companions' pose risks to student mental health. What can schools do?: https://www.k12dive.com/news/ai-companions-student-mental-health-schools/761054/
'I love you too!' My family's creepy, unsettling week with an AI toy: https://www.theguardian.com/technology/2025/sep/16/i-love-you-too-my-familys-creepy-unsettling-week-with-an-ai-toy
Curio: https://heycurio.com/
Oh… and a visit from a "Friend" (https://friend.com/) – we've mentioned them before. This interview is… interesting: https://fortune.com/videos/watch/creator-of-'creepy'-ai-friend-necklace-says-it's-'like-god'/41086999-face-458f-b917-ee465c56d187

Crossing Lines: Microsoft & xAI:
Microsoft bringing Claude (and DeepSeek?) to Copilot (Reuters): https://www.reuters.com/business/microsoft-brings-anthropic-ai-models-365-copilot-diversifies-beyond-openai-2025-09-24/
xAI is suing OpenAI for poaching talent & secrets (Reuters): https://www.reuters.com/sustainability/boards-policy-regulation/musks-xai-accuses-rival-openai-stealing-trade-secrets-2025-09-25/
The official complaint: https://fingfx.thomsonreuters.com/gfx/legaldocs/byvreaygope/XAI%20OPENAI%20TRADE%20SECRETS%20LAWSUIT%20complaint2.pdf

AI safety tool sparks backlash:
AI safety tool sparks student backlash after flagging art as porn, deleting emails: https://www.washingtonpost.com/nation/2025/09/24/students-lawsuit-ai-tool-gaggle/

*** THE BOILERPLATE ***

About The FAIK Files:
The FAIK Files is an offshoot project from Perry Carpenter's most recent book, FAIK: A Practical Guide to Living in a World of Deepfakes, Disinformation, and AI-Generated Deceptions.

Get the Book: FAIK: A Practical Guide to Living in a World of Deepfakes, Disinformation, and AI-Generated Deceptions (Amazon Associates link)

Check out the website for more info: https://thisbookisfaik.com

Check out Perry & Mason's other show, the Digital Folklore Podcast:
Apple Podcasts: https://podcasts.apple.com/us/podcast/digital-folklore/id1657374458
Spotify: https://open.spotify.com/show/2v1BelkrbSRSkHEP4cYffj?si=u4XTTY4pR4qEqh5zMNSVQA
Want to connect with us? Here's how:

Connect with Perry:
Perry on LinkedIn: https://www.linkedin.com/in/perrycarpenter
Perry on X: https://x.com/perrycarpenter
Perry on BlueSky: https://bsky.app/profile/perrycarpenter.bsky.social

Connect with Mason:
Mason on LinkedIn: https://www.linkedin.com/in/mason-amadeus-a853a7242/
Mason on BlueSky: https://bsky.app/profile/wickedinterest.ing

Learn more about your ad choices. Visit megaphone.fm/adchoices