The New Quantum Era - innovation in quantum computing, science and technology

By: Sebastian Hassinger
Your host, Sebastian Hassinger, interviews brilliant research scientists, software developers, engineers and others actively exploring the possibilities of our new quantum era. We will cover topics in quantum computing, networking and sensing, focusing on hardware, algorithms and general theory. The show aims for accessibility - Sebastian is not a physicist - and we'll try to provide context for the terminology and glimpses at the fascinating history of this new field as it evolves in real time.
(c) Sebastian Hassinger 2025
Episodes
  • Majorana qubits with Chetan Nayak
    Jan 12 2026
    In this episode of The New Quantum Era, your host Sebastian Hassinger is joined by Chetan Nayak, Technical Fellow at Microsoft, professor of physics at the University of California Santa Barbara, and driving force behind Microsoft's quantum hardware R&D program. They discuss a qubit modality that has not been covered on the podcast before, based on Majorana fermionic behavior, which promises topological protection against the errors that are such a challenge to quantum computing.

    Guest Bio
    Chetan Nayak is a Technical Fellow at Microsoft and leads the company's topological quantum hardware program, including the Majorana‑1 processor based on Majorana‑zero‑mode qubits. He is also a professor of physics at UCSB and a leading theorist in topological phases of matter, non‑Abelian anyons, and topological quantum computation. Chetan co‑founded Microsoft's Station Q in 2005, building a bridge from theoretical proposals for topological qubits to engineered semiconductor–superconductor devices.

    What we talk about
    - Chetan's first exposure to quantum computing in Peter Shor's lectures at the Institute for Advanced Study, and how that intersected with his PhD work with Frank Wilczek on non‑Abelian topological phases and Majorana zero modes.
    - The early days of topological quantum computation: fractional quantum Hall states at filling factor ν = 5/2, emergent quasiparticles, and the realization that braiding these excitations naturally implements Clifford gates.
    - How Alexei Kitaev's toric‑code and Majorana‑chain ideas connected abstract topology to concrete condensed‑matter systems, and led to Chetan's collaboration with Michael Freedman and Sankar Das Sarma.
    - The 2005 proposal for a gallium‑arsenide quantum Hall device realizing a topological qubit, and the founding of Station Q to turn such theoretical blueprints into experimental devices in partnership with academic labs.
    - Why Microsoft pivoted from quantum Hall platforms to semiconductor–superconductor nanowires: leveraging the Fu–Kane proximity effect, spin–orbit‑coupled semiconductors, and a huge material design space, while wrestling with the challenges of interfaces and integration.
    - The evolution of the tetron architecture: two parallel topological nanowires with four Majorana zero modes, connected by a trivial superconducting wire and coupled to quantum dots that enable native Z‑ and X‑parity loop measurements (see the sketch after the resource list below).
    - How topological superconductivity allows a superconducting island to host even or odd total electron parity without a local signature, and why that nonlocal encoding provides hardware‑level protection for the qubit's logical 0 and 1.
    - Microsoft's roadmap in a 2D "quality vs. complexity" space: improving topological gap, readout signal‑to‑noise, and measurement fidelity while scaling from single tetrons to error‑corrected logical qubits and, ultimately, utility‑scale systems.
    - Error correction on top of topological qubits: using surface codes and Hastings–Haah Floquet codes with native two‑qubit parity measurements, and targeting hundreds of physical tetrons per logical qubit and thousands of logical qubits for applications like Shor's algorithm and quantum chemistry.
    - Engineering for scale: digital, on–off control of quantum‑dot couplings; cryogenic CMOS to fan out control lines inside the fridge; and why tetron size and microsecond‑scale operations sit in a sweet spot for both physics and classical feedback.
    - Where things stand today: the Majorana‑1 chiplet, recent tetron loop‑measurement experiments, DARPA's US2QC program, and how external users, starting with government and academic partners, will begin to access these devices before broader Azure Quantum integration.

    Papers and resources mentioned
    These are representative papers and resources that align with topics and allusions in the conversation; they are good entry points if you want to go deeper.
    - Non‑Abelian Anyons and Topological Quantum Computation – S. Das Sarma, M. Freedman, C. Nayak, Rev. Mod. Phys. 80, 1083 (2008)
    - Early device proposals: Sankar Das Sarma, Michael Freedman, and Chetan Nayak, "Topological quantum computation," Physics Today 59(7), 32–38 (July 2006)
    - Roadmap to fault‑tolerant quantum computation using topological qubits – C. Nayak et al., arXiv:2502.12252
    - Distinct lifetimes for X and Z loop measurements in a Majorana tetron – C. Nayak et al., arXiv:2507.08795
    - Majorana qubit codes that also correct odd-weight errors – S. Kundu and B. Reichardt, arXiv:2311.01779
    - Microsoft's Majorana 1 chip carves new path for quantum computing, Microsoft blog post
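
    To make the parity encoding described above a bit more concrete, here is a minimal NumPy sketch, not Microsoft's implementation, that writes a tetron's four Majorana zero modes as Pauli strings under one illustrative Jordan–Wigner convention and checks that the Z- and X-parity loop operators behave like logical Pauli operators while commuting with the island's fixed total fermion parity.

```python
import numpy as np

# Single-qubit Paulis
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Four Majorana operators for the tetron's two topological wires, written on
# two fermionic modes via one (illustrative) Jordan-Wigner assignment.
g1 = np.kron(X, I2)
g2 = np.kron(Y, I2)
g3 = np.kron(Z, X)
g4 = np.kron(Z, Y)
majoranas = [g1, g2, g3, g4]

# Majorana algebra: {g_i, g_j} = 2 * delta_ij
for i, a in enumerate(majoranas):
    for j, b in enumerate(majoranas):
        expected = 2 * np.eye(4) if i == j else np.zeros((4, 4))
        assert np.allclose(a @ b + b @ a, expected)

# Logical operators are fermion-parity "loops":
Z_L = -1j * g1 @ g2                            # parity of wire 1 -> Z loop measurement
X_L = -1j * g2 @ g3                            # cross-wire parity -> X loop measurement
P_total = (-1j * g1 @ g2) @ (-1j * g3 @ g4)    # total island parity, fixed by the island

assert np.allclose(Z_L @ X_L + X_L @ Z_L, 0)          # anticommute, like Pauli Z and X
assert np.allclose(P_total @ X_L - X_L @ P_total, 0)  # logical ops preserve total parity
assert np.allclose(P_total @ Z_L - Z_L @ P_total, 0)

print("Z_L =\n", Z_L.real)   # equals Z (x) I in this convention
print("X_L =\n", X_L.real)   # equals X (x) X in this convention
```

    The point mirrored here is that both logical operators are two-Majorana parity measurements rather than operators local to a single site, which is what the episode means by nonlocal, hardware-level protection of the encoded qubit.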
    1 hr 3 min
  • Peaked quantum circuits with Hrant Gharibyan
    Dec 12 2025
    In this episode of The New Quantum Era, Sebastian talks with Hrant Gharibyan, CEO and co‑founder of BlueQubit, about "peaked circuits" and the challenge of verifying quantum advantage. They unpack Scott Aaronson and Yuxuan Zhang's original peaked‑circuit proposal, BlueQubit's scalable implementation on real hardware, and a new public challenge that invites the community to attack their construction using the best classical algorithms available. Along the way, they explore how this line of work connects to cryptography, hardness assumptions, and the near‑term role of quantum devices as powerful scientific instruments.

    Topics Covered

    Why verifying quantum advantage is hard
    The core problem: if a quantum device claims to solve a task that is classically intractable, how can anyone check that it did the right thing? Random circuit sampling (as in Google's 2019 "supremacy" experiment and follow‑on work from Google and Quantinuum) is believed to be classically hard to simulate, but the verification metrics (like cross‑entropy benchmarking) are themselves classically intractable at scale.

    What are peaked circuits?
    Aaronson and Zhang's idea: construct circuits that look like random circuits in every respect, but whose output distribution secretly has one special bit string with an anomalously high probability (the "peak"). The designer knows the secret bit string, so a quantum device can be verified by checking that the measurement statistics visibly reveal the peak in a modest number of shots, while finding that same peak classically should be as hard as simulating a random circuit. (A toy numerical sketch of this idea follows the episode summary below.)

    BlueQubit's scalable construction and hardware demo
    BlueQubit extended the original 24‑qubit, simulator‑based peaked‑circuit construction to much larger sizes using new classical protocols. Hrant explains their protocol for building peaked circuits on Quantinuum's H2 processor with around 56 qubits, thousands of gates, and effectively all‑to‑all connectivity, while still hiding a single secret bit string that appears as a clear peak when run on the device.

    Obfuscation tricks and "quantum steganography"
    The team uses multiple obfuscation layers (including "swap" and "sweeping" tricks) to transform simple peaked circuits into ones that are statistically indistinguishable from generic random circuits, yet still preserve the hidden peak.

    The BlueQubit Quantum Advantage Challenge
    To stress‑test their hardness assumptions, BlueQubit has published concrete circuits and launched a public bounty (currently a quarter of a bitcoin) for anyone who can recover the secret bit string classically. The aim is to catalyze work on better classical simulation and de‑quantization techniques; either someone closes the gap (forcing the protocol to evolve) or the standing bounty helps establish public trust that the task really is classically infeasible.

    Potential cryptographic angles
    Although the main focus is verification of quantum advantage, Hrant outlines how the construction has a cryptographic flavor: the secret bit string effectively acts as a key, and only a sufficiently powerful quantum device can efficiently "decrypt" it by revealing the peak. Variants of the protocol could, in principle, yield schemes that are classically secure but only decryptable by quantum hardware, and even quantum‑plus‑key secure, though this remains speculative and secondary to the verification use case.

    From verification protocol to startup roadmap
    Hrant positions BlueQubit as an algorithm and capability company: deeply hardware‑aware, but focused on building and analyzing advantage‑style algorithms tailored to specific devices. The peaked‑circuit work is one pillar in a broader effort that includes near‑term scientific applications in condensed‑matter physics and materials (e.g., Fermi–Hubbard models and out‑of‑time‑ordered correlators) where quantum devices can already probe regimes beyond leading classical methods.

    Scientific advantage today, commercial advantage tomorrow
    Sebastian and Hrant emphasize that the first durable quantum advantages are likely to appear in scientific computing, acting as exotic lab instruments for physicists, chemists, and materials scientists, well before mass‑market "killer apps" arrive. Once robust, verifiable scientific advantage is established, scaling to larger models and more complex systems becomes a question of engineering, with clear lines of sight to industrial impact in sectors like pharmaceuticals, advanced materials, and manufacturing.

    The challenge: https://app.bluequbit.io/hackathons/
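
    As a toy illustration of what "peaked" means, the NumPy sketch below, which is emphatically not the Aaronson–Zhang or BlueQubit construction, stitches two random-looking unitary layers together so the overall circuit maps |00...0> exactly onto a secret basis state. Real peaked circuits only concentrate part of the probability on the secret string and add obfuscation layers so the circuit is statistically indistinguishable from a generic random circuit; this sketch skips all of that and just shows the peak.

```python
import numpy as np

rng = np.random.default_rng(7)

def haar_unitary(d):
    """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # fix column phases

n = 4                       # toy size; the hardware demos discussed use ~56 qubits
d = 2 ** n
secret = 0b1011             # the hidden bit string only the designer knows

U = haar_unitary(d)         # "random-looking" first half of the circuit
psi = U[:, 0]               # the state U|00...0>

# Build a unitary V that sends psi to |secret>: V = P @ W1^dagger, where W1 has
# psi as its first column and P is the permutation swapping |00...0> and |secret>.
M = np.column_stack([psi,
                     rng.standard_normal((d, d - 1)) + 1j * rng.standard_normal((d, d - 1))])
W1, R = np.linalg.qr(M)
W1[:, 0] *= R[0, 0] / abs(R[0, 0])    # make the first column exactly psi
P = np.eye(d)
P[[0, secret]] = P[[secret, 0]]       # swap basis states 0 and `secret`
V = P @ W1.conj().T

C = V @ U                             # the full "peaked" circuit

probs = np.abs(C[:, 0]) ** 2          # output distribution on input |00...0>
print(f"peak at {int(probs.argmax()):0{n}b} with probability {probs.max():.3f}")
print(f"typical random-circuit probability ~ 1/2^n = {1 / d:.3f}")
```

    Running it prints a peak probability of 1.0 on the chosen string, versus the roughly 1/2^n weight a genuinely random circuit would put on any one outcome; the hard part the episode focuses on is hiding that structure while keeping the peak detectable in a modest number of shots.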
    30 min
  • Diamond vacancies and scalable qubits with Quantum Brilliance
    Dec 6 2025

    Episode overview
    This episode of The New Quantum Era features a conversation with Quantum Brilliance co‑founder and CEO Mark Luo and independent board chair Brian Wong about diamond nitrogen vacancy (NV) centers as a platform for both quantum computing and quantum sensing. The discussion covers how NV centers work, what makes diamond‑based qubits attractive at room temperature, and how to turn a lab technology into a scalable product and business.

    What are diamond NV qubits?
    Mark explains how nitrogen vacancy centers in synthetic diamond act as stable room‑temperature qubits, with a nitrogen atom adjacent to a missing carbon atom creating a spin system that can be initialized and read out optically or electronically. The rigidity and thermal properties of diamond remove the need for cryogenics, complex laser setups, and vacuum systems, enabling compact, low‑power quantum devices that can be deployed in standard environments.
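
    For a rough sense of how that spin system is used, here is a small NumPy sketch of the textbook NV ground-state spin-1 Hamiltonian: a zero-field splitting of about 2.87 GHz plus a Zeeman term with a rounded electron gyromagnetic ratio of about 28 GHz/T. It is an illustrative model rather than anything from Quantum Brilliance, and it shows why the optically detected resonance lines shift with an applied magnetic field, which underlies the magnetometry applications discussed next.

```python
import numpy as np

# Spin-1 z operator for the NV center's electronic ground-state triplet (ms = +1, 0, -1).
Sz = np.diag([1.0, 0.0, -1.0])

D_GHZ = 2.87            # zero-field splitting of the NV ground state, ~2.87 GHz
GAMMA_GHZ_PER_T = 28.0  # electron gyromagnetic ratio, rounded to ~28 GHz/T

def odmr_lines(b_tesla):
    """Frequencies (GHz) of the ms=0 -> ms=+/-1 transitions for a field along the NV axis."""
    H = D_GHZ * (Sz @ Sz) + GAMMA_GHZ_PER_T * b_tesla * Sz   # Hamiltonian in GHz units
    levels = np.sort(np.linalg.eigvalsh(H))                  # [0, D - gB, D + gB] for small B
    ground = levels[0]
    return levels[1] - ground, levels[2] - ground

for b in (0.0, 1e-3, 5e-3):   # 0 T, 1 mT, 5 mT along the NV axis
    lo, hi = odmr_lines(b)
    print(f"B = {b * 1e3:4.1f} mT -> resonance lines at {lo:.3f} GHz and {hi:.3f} GHz")
```

    The splitting between the two lines grows with the field, so reading it out gives a magnetic-field measurement; the same spin levels, driven by microwaves and read out via fluorescence, are what serve as the qubit.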

    Quantum sensing to quantum computing
    NV centers are already enabling ultra‑sensitive sensing, from nanoscale MRI and quantum microscopy to magnetometry for GPS‑free navigation and neurotech applications using diamond chips under growing brain cells. Mark and Brian frame sensing not as a hedge but as a volume driver that builds the diamond supply chain, pushes costs down, and lays the manufacturing groundwork for future quantum computing chips.

    Fabrication, scalability, and the value chain
    A key theme is the shift from early “shotgun” vacancy placement in diamond to a semiconductor‑style, wafer‑like process with high‑purity material, lithography, characterization, and yield engineering. Brian characterizes Quantum Brilliance’s strategy as “lab to fab”: deciding where to sit in the value chain, leveraging the existing semiconductor ecosystem, and building a partner network rather than owning everything from chips to compilers.

    Devices, roadmaps, and hybrid nodes
    Quantum Brilliance has deployed room‑temperature systems with a handful of physical qubits at Oak Ridge National Laboratory, Fraunhofer IAF, and the Pawsey Supercomputing Centre. Their roadmap targets application‑specific quantum computing with useful qubit counts toward the end of this decade, and lunchbox‑scale, fault‑tolerant systems with on the order of 50–60 logical qubits in the mid‑2030s.

    Modality tradeoffs and business discipline
    Mark positions diamond NV qubits as mid‑range in both speed and coherence time compared with superconducting and trapped‑ion systems, with their differentiator being compute density, energy efficiency, and ease of deployment rather than raw gate speed. Brian brings four decades of experience in semiconductors, batteries, lidar, and optical networking to emphasize milestones, early revenue from sensing, and usability—arguing that making quantum devices easy to integrate and operate is as important as the underlying physics for attracting partners, customers, and investors.

    Partners and ecosystem
    The episode underscores how collaborations with institutions such as Oak Ridge, Fraunhofer, and Pawsey, along with industrial and defense partners, help refine real‑world requirements and ensure the technology solves concrete problems rather than just hitting abstract benchmarks. By co‑designing with end users and complementary hardware and software vendors, Quantum Brilliance aims to “democratize” access to quantum devices, moving them from specialized cryogenic labs to desks, edge systems, and embedded platforms.

    37 min