
How Do Large Language Models Learn? | AI Series Pt. 3
How does ChatGPT actually learn? In Episode 8 of Mr. Fred’s Tech Talks, we go behind the curtain to explore how Large Language Models are trained and why the process isn’t as magical as it looks.
You’ll discover:
- How Large Language Models train, step by step: billions of rounds of "guess the next word."
- Why tokens are the LEGO bricks of AI learning.
- How GPUs, servers, and warehouse-sized data centers form the backbone of AI training.
- A simple library analogy that makes servers easy to understand.
- What happens when training goes wrong: bias, hallucinations, and overfitting.
AI isn’t just software: it’s powered by huge amounts of energy, specialized hardware, and careful fine-tuning by humans. And while it’s powerful, it’s not perfect. These systems don’t “think” like we do; they predict patterns, one token at a time.
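If you want a feel for what “guess the next word” means, here is a tiny, hypothetical sketch (nothing like a real LLM in scale): it counts which word follows which in a toy corpus and predicts the most frequent follower. Real models predict tokens with neural networks over billions of examples, but the core idea is the same.

```python
from collections import Counter, defaultdict

# Toy corpus; a real model trains on trillions of tokens, not nine words.
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def guess_next(word):
    """Return the word most often seen after `word` in the corpus."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(guess_next("the"))  # "cat" follows "the" twice, "mat" once -> "cat"
```

Notice the model never “knows” anything; it only reflects the patterns in its training data, which is exactly why bias and hallucinations can creep in.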
💡 Tech Tip of the Episode: Try asking ChatGPT, Copilot, Gemini, Grok, or another chatbot a simple question, then twist it:
- “Who was the first person to walk on the moon?” → Neil Armstrong
- “Who was the first person to walk on the sun?” → Watch how it tries to answer the impossible.
This little experiment shows why AI can amaze us one moment and confuse us the next.
Join me in Episode 8 to learn how AI models are trained, what really happens inside data centers, and why understanding this matters for parents, teachers, and anyone curious about the future of technology.
CONNECT
Website: https://www.getmecoding.com
Courses: https://courses.getmecoding.com
FOLLOW
YouTube: https://www.youtube.com/@GetMeCoding
Instagram: https://www.instagram.com/getmecoding
Facebook: https://www.facebook.com/GetMeCoding
LinkedIn: https://www.linkedin.com/in/mrfred77/
Follow, rate ★★★★★, and share!
Hosted on Acast. See acast.com/privacy for more information.