Learning the Dots Podcast by Matt Williams

Learning the Dots

By: Matt Williams
Listen for free

Learning the Dots is a technology learning podcast from the hosts of Connecting the Dots. In each episode, Alex and Morgan take one technology topic and break it down clearly, calmly, and practically—no jargon, no hype, and no prior tech knowledge required.

This isn’t about chasing headlines. It’s about understanding how modern technology actually works, why it exists, and why it matters in everyday life and work. Using simple explanations, real-world analogies, and thoughtful conversation, Learning the Dots helps listeners build confidence with the tools and concepts shaping today’s world.

Whether you’re a professional who wants to sound smarter in meetings, a student trying to make sense of complex ideas, or simply someone who wants technology explained without the intimidation, Learning the Dots is here to help you learn—one dot at a time.

Snarful Solutions Group 2026
Economy
Episodes
  • DevOps in 2026 — From Automation to AI Orchestration
    Mar 3 2026

    In this episode of Learning the Dots, Alex and Morgan explore how software development and DevOps are evolving in 2026—from traditional automation to AI-driven orchestration. The discussion centers on the rise of agentic coding, where autonomous AI systems collaborate with human developers to manage large portions of the development lifecycle, troubleshoot infrastructure issues, and accelerate delivery.

    The episode explains how CI/CD pipelines have matured beyond simple build-and-deploy workflows. Modern systems now incorporate self-healing capabilities, advanced canary deployment strategies, and stronger guardrails around supply chain security to protect against vulnerabilities in dependencies and third-party integrations.

    Industry leaders such as Netflix illustrate this shift by investing in resilient, stateless architectures and building internal platforms that streamline development while improving reliability. Meanwhile, tools like GitHub Actions and Jenkins continue to evolve, with platform engineering practices reducing developer friction and increasing consistency across teams.

    Throughout the conversation, Alex and Morgan emphasize that while AI is reshaping how software is built and deployed, success depends on maintaining strong human oversight and rigorous security protocols.

    Key takeaway: The future of DevOps isn’t just faster automation—it’s intelligent orchestration, resilient architecture, and disciplined governance working together.

    Sponsors

    Support the show by using promo code SNARFUL at checkout:

    • Pins and Aces – 21% off https://pinsandaces.com/discount/SNARFUL
    • Skoni – 15% off https://skoni.com/discount/SNARFUL
    • Old Glory – 15% off https://oldglory.com/discount/SNARFUL
    • Strong Coffee https://strongcoffeecompany.com/discount/SNARFUL

    Your support helps us keep learning the dots—one topic at a time.

    24 m
  • The Modern SDLC — How Code Safely Moves to Production
    Feb 28 2026

    In this episode of Learning the Dots, Alex and Morgan break down the modern Software Development Lifecycle (SDLC) and explain how code safely moves from idea to real users. Software doesn’t jump straight into production—it progresses through structured environments like development, staging, and production to reduce risk and protect customer experience.

    The conversation covers how teams use ephemeral environments and autoscaling to control infrastructure costs in the cloud, spinning resources up only when needed. They also explain why data masking is essential during testing, ensuring sensitive information is anonymized to maintain compliance and protect privacy.
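    To make the masking idea concrete, here is a minimal Python sketch of the technique the hosts describe; the field names and the hashing scheme are illustrative assumptions, not details from the episode:

```python
import hashlib

SENSITIVE_FIELDS = {"email", "ssn"}  # assumed field names, for illustration only

def mask_record(record: dict) -> dict:
    """Replace sensitive values with a stable, non-reversible token."""
    masked = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            # Hashing keeps joins across tables consistent while removing the raw value.
            token = hashlib.sha256(str(value).encode()).hexdigest()[:10]
            masked[key] = f"masked-{token}"
        else:
            masked[key] = value
    return masked

user = {"id": 42, "email": "jane@example.com", "ssn": "123-45-6789"}
safe = mask_record(user)
```

    Because the token is derived deterministically, the same source value masks to the same token everywhere, so test queries that join on masked columns still behave realistically.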

    A key risk highlighted in the episode is configuration drift, where manual changes cause cloud environments to diverge from their intended Infrastructure as Code definitions. Left unchecked, drift can introduce instability and security vulnerabilities.
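    Drift detection boils down to comparing the declared state against the live state. A toy Python sketch of that comparison (the resource fields shown are invented for illustration):

```python
def detect_drift(desired: dict, actual: dict) -> dict:
    """Compare IaC-declared state to live state and report any differences."""
    drift = {}
    for key in desired.keys() | actual.keys():
        if desired.get(key) != actual.get(key):
            drift[key] = {"declared": desired.get(key), "live": actual.get(key)}
    return drift

# What Infrastructure as Code declares vs. what a manual change left behind.
declared = {"instance_type": "t3.medium", "open_ports": [443]}
live = {"instance_type": "t3.medium", "open_ports": [443, 22]}

drift = detect_drift(declared, live)
```

    Real tools such as `terraform plan` do this comparison against actual cloud APIs, but the principle is the same: any key where declared and live state disagree is drift to be reconciled.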

    To prevent issues before release, teams rely on multiple layers of testing—functional testing, performance testing, and User Acceptance Testing (UAT). Strategic deployment methods like blue-green and canary releases further minimize downtime and limit user impact during updates.
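    The routing logic behind a canary release can be sketched in a few lines of Python; this is an illustrative model of the idea, not any particular platform's implementation:

```python
import hashlib

def route_to_canary(user_id: str, canary_percent: int) -> bool:
    """Deterministically send a fixed slice of users to the new release."""
    # Hashing the user id means each user consistently lands in the same bucket,
    # so a given user sees either the old or the new version, never both.
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 100
    return bucket < canary_percent

users = [f"user-{i}" for i in range(1000)]
canary_users = [u for u in users if route_to_canary(u, 10)]
```

    If error rates stay healthy for the canary slice, the percentage is raised until the new release serves everyone; if not, only that small slice of users was ever affected.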

    Key takeaway: Modern software delivery succeeds at the intersection of automation, security, testing discipline, and cost management.

    Sponsors

    Support the show by using promo code SNARFUL at checkout:

    • Pins and Aces – 21% off https://pinsandaces.com/discount/SNARFUL
    • Skoni – 15% off https://skoni.com/discount/SNARFUL
    • Old Glory – 15% off https://oldglory.com/discount/SNARFUL
    • Strong Coffee https://strongcoffeecompany.com/discount/SNARFUL

    Your support helps us keep learning the dots—one topic at a time.

    21 m
  • Data Lakehouse - One Platform for AI and Analytics, Explained
    Feb 14 2026

    In this episode of Learning the Dots, Alex and Morgan explain the rise of the AI data lakehouse—a modern data architecture that combines the low-cost flexibility of data lakes with the performance and governance of data warehouses. The conversation breaks down why this evolution matters, how it supports both Artificial Intelligence and Business Intelligence on the same platform, and what foundational technologies make it possible.

    What Is a Data Lakehouse?

    A data lakehouse is a unified architecture that allows organizations to store massive amounts of raw data affordably while still enforcing structure, governance, and performance controls needed for analytics and AI. It eliminates the traditional divide between “data lake” and “data warehouse.”

    Why It Evolved

    The hosts explain that modern AI workloads demand more than cheap storage. They require:

    • ACID transactions for reliable updates
    • Schema enforcement for consistent data structure
    • Real-time processing for immediate insight

    Without these capabilities, AI and advanced analytics become unstable, slow, or inaccurate.

    The Open-Source Foundation

    Key open-source table formats power the lakehouse model:

    • Apache Iceberg
    • Delta Lake
    • Apache Hudi

    These technologies enable advanced capabilities like time travel (querying historical versions of data), metadata management, and transactional reliability—bringing warehouse-level discipline to lake-scale storage.
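    Time travel rests on one idea: writes create immutable snapshots rather than overwriting data in place. A toy Python model of that mechanism (real formats like Delta Lake or Iceberg track snapshots in metadata files, not in-memory lists):

```python
class VersionedTable:
    """Toy lakehouse table: every write appends an immutable snapshot."""

    def __init__(self):
        self._snapshots = [[]]  # version 0 is the empty table

    def write(self, rows):
        # Copy-on-write: the new version shares no mutable state with the old one.
        self._snapshots.append(list(self._snapshots[-1]) + list(rows))

    def read(self, version=None):
        if version is None:
            version = len(self._snapshots) - 1  # default to latest
        return list(self._snapshots[version])

table = VersionedTable()
table.write([{"id": 1, "status": "new"}])
table.write([{"id": 2, "status": "new"}])

latest = table.read()              # both rows
as_of_v1 = table.read(version=1)   # "time travel" back to the first write
```

    Because old snapshots are never mutated, querying version 1 after later writes still returns exactly what the table contained at that point, which is what makes audits and reproducible model training possible.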

    The Medallion Architecture

    To manage data quality progressively, organizations use the Medallion architecture, which organizes data into three refinement layers:

    • Bronze: Raw, ingested data
    • Silver: Cleaned and validated data
    • Gold: Business-ready, curated data

    This structured refinement ensures that AI models and dashboards are built on trustworthy foundations.
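    The bronze-to-silver-to-gold flow can be sketched with plain Python data structures; the field names and validation rules here are invented for illustration:

```python
# Bronze: raw, ingested rows, including a bad record.
bronze = [
    {"sku": "A1", "qty": "3"},
    {"sku": "A1", "qty": "2"},
    {"sku": None, "qty": "5"},  # missing key: fails validation
]

# Silver: validate and normalize types, dropping rows that fail checks.
silver = [
    {"sku": row["sku"], "qty": int(row["qty"])}
    for row in bronze
    if row["sku"] is not None and row["qty"].isdigit()
]

# Gold: curated aggregate, ready for a dashboard or an ML feature.
gold = {}
for row in silver:
    gold[row["sku"]] = gold.get(row["sku"], 0) + row["qty"]
```

    In production this refinement typically runs as Spark or SQL jobs over lakehouse tables, but the contract is the same at any scale: each layer only consumes data the previous layer has already validated.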

    Why It Matters

    The AI data lakehouse reduces data silos, lowers operational complexity, and enables organizations to run analytics and machine learning on a single platform. It becomes especially powerful for advanced workflows like Retrieval-Augmented Generation (RAG) and large-scale machine learning, where clean, governed, and queryable data is essential.

    Key Takeaway

    The data lakehouse is not just a storage upgrade—it is a strategic architecture that unifies governance, performance, and AI readiness into one scalable foundation.

    Sponsors

    Support the show by using promo code SNARFUL at checkout:

    • Pins and Aces – 21% off https://pinsandaces.com/discount/SNARFUL
    • Skoni – 15% off https://skoni.com/discount/SNARFUL
    • Old Glory – 15% off https://oldglory.com/discount/SNARFUL
    • Strong Coffee https://strongcoffeecompany.com/discount/SNARFUL

    17 m
No reviews yet