Episodes

  • #1398: Ru’s Journey from Hospice Nurse to DJ, Artist, & Kaleidosky Event Producer in VRChat
    Jun 21 2024
    Ru is a transgender woman who works as a hospice nurse in Ohio, and since 2018 she has been involved as an immersive artist, virtual DJ, and event producer holding weekly events in her psychedelic space called Kaleidosky, which is nominated for Best Music Experience at Raindance Immersive 2024. I had a chance to get a tour of her Kaleidosky 3.0 event space, where she showed me how the intuitive VJ system creates feedback-loop fractal art in a kaleidoscope that's projected onto a sphere serving as a fully immersive skybox, with a floating island in the middle that contains plenty of hang-out spots, mini games, and a dancefloor. Kaleidosky has an indie art spirit with a lot of avant-garde performances, many of them within the trance or psychedelic trance (aka psytrance) genre, though performers are not limited to it. The result is an experimental vibe, with a lot more intimate conversation and deep listening than intensive dancing. Ru doesn't even consider Kaleidosky to be a club, but more of an event space for DJs and VJ artists to experiment with their artistic practice. Ru has featured over 300 different performers since opening the first Kaleidosky in December 2022 and holding events on a weekly basis. I had a chance to catch up with Ru to get a lot more context on her story as one of the most prolific virtual music event producers in the VRChat scene. Be sure to join the Beat Syndicate group in VRChat and follow her on X (formerly Twitter) for more information on upcoming events. Also be sure to check out these YouTube documentaries about the clubbing scene in VRChat as well as explorations of gender and the thriving trans community on VRChat: I Went Clubbing in Virtual Reality: Raves of VRChat by Josef Lorenzo on PBS Voices [which features Ru and her Kaleidosky club]; Why are there SO many Trans people in VRChat? Gender, Identity, and Self Discovery by The Virtual Reality Show; and Identity, Gender, and VRChat (Why is everyone in VR an anime girl?) by Straszfilms. This is a listener-supported podcast through the Voices of VR Patreon. Music: Fatality
    1 hr and 4 mins
  • #1397: The Troll Project Aims to Create Community-Based Interventions for Trolling & Record Ethnographic Interviews
    Jun 20 2024
    The Troll Project was created by clinical psychologist Ruth Diaz to interrogate the root causes of trolling, but also to tackle the issue head on within social VR spaces with some community-driven solutions that she has experimented with. She's recorded a number of ethnographic interviews with existing trolls, former trolls, and community managers to get a better idea of the root causes that lead people to troll others online. And she's put her theories into practice when trolls have interrupted her group discussions on different social VR platforms. Diaz has developed a conflict resiliency framework called The D.O.T. Model, which stands for "Deepen. Orient. Transform." The core idea is that there is a polarity between the villain archetype of the troll who fights versus the victim archetype of the target whose response is flight, and then another polarity axis between the vicarious bystander archetype who freezes vs the hero archetype who takes action to fix the situation and become the victor. The D.O.T. framework helps people navigate between these archetypal polarities while dealing with trolls. She writes, "It is designed so we learn how to re-center without using anything on the outside changing to fit our needs. It is a re-orienting “compass” that identifies polarizing relationships patterns, the non-verbals and emotions that accompany these reactive dances, and where one fits in those polarities. Using a catchy visual and simple recipe for each polarity/archetype we embody in negative interactions; it teaches us how to get back to the humanness of heart and reconnect to those around us in a meaningful way." Diaz was teaching these conflict resiliency methods in public social VR spaces where the group would get trolled, but then they would apply these principles as a group intervention that would sometimes result in a transformative experience for the troll.
She then started doing ethnographic interviews on different social VR platforms that could potentially lead towards a more formalized community-driven intervention framework for how to deal with trolling. I had a chance to speak with Diaz a couple of weeks ago in order to get more context into some of her ideas about moving beyond the technological solutions of blocking and banning to more holistically address some of the root causes of trolling with more of a community-driven solution. She's presenting today at the Augmented World Expo in a session titled "Resilient XR Environments: Building & Navigating Conflict-Resilient Spaces", and is ultimately hoping that The Troll Project can "contribute to understanding the complex interplay between human behavior, online identity (in 2d and 3d), and social dynamics, facilitating the development of strategies to mitigate negative behaviors and enhance transformative experiences in online communities." Part of a technological solutionism mindset is to expect that technological architectures can solve human problems, but there are limits to the existing technological mitigation strategies, and The Troll Project is a welcome venture into digging deeper into this problem. Trolling is obviously a huge issue that is unlikely to ever be fully eradicated, but Diaz has seen some of the transformative potential of her process by converting trolls into former trolls, and it's worth exploring these types of community-based alternatives to get to the root of the problem. The Troll Project is looking for funding and collaborations to take it to the next level, so be sure to check out their Join & Contribute section for details on how to get involved. This is a listener-supported podcast through the Voices of VR Patreon. Music: Fatality
    1 hr and 8 mins
  • #8th Wall Releases Web-Based Game Engine “Niantic Studio” to Render WebXR Experiences
    Jun 18 2024
    Today Niantic's 8th Wall is announcing Niantic Studio, which is "a new visual interface for Niantic 8th Wall developers that offers an entirely new way to build immersive 3D and XR experiences." It's essentially a web-based game engine using three.js that can render WebXR experiences and starts to integrate a few of Niantic's Lightship APIs, but it will be launching with more integrated computer vision and geospatial mapping features soon. I had a chance to speak with 8th Wall Founder Erik Murphy-Chutorian at length to get more details, so be sure to tune into the podcast or read the information below for a lot more context on this announcement that's being made at the Augmented World Expo. You can read more about Niantic Studio in their blog announcement. This is a listener-supported podcast through the Voices of VR Patreon. Music: Fatality
    53 mins
  • #1395: Apple Vision Pro as Screen Replacement Power User Brad Lynch on Overlays & Multi-App XR
    Jun 17 2024
    YouTuber Brad Lynch (aka SadlyItsBradley) has completely replaced all of his computer screens with an Apple Vision Pro, even going as far as getting a headless MacBook Pro that does not even have a screen. He's been using the Apple Vision Pro for around 8 hours a day since launch, and I wanted to get a sense of how he's been using it. It turns out that he is mostly streaming his gaming PC via Moonlight and using social VR apps like VRChat via ALVR to hang out with friends in ambient virtual spaces. He's also using SteamVR overlays to augment his virtual reality experience with Steam apps like XSOverlay, VRHandsFrame, and OVR Advanced Settings. It feels like a very niche use case of a hardcore VR enthusiast, but one that mixes and mashes realities in a way that might be a sign of things to come. Some of the most compelling apps for Lynch are open-source ones that enable experiences driven by his high-end Windows machine. https://twitter.com/SadlyItsBradley/status/1783406980192702842 Lynch has been closely following VR hardware developments over the last number of years, but the Apple Vision Pro has satisfied most of his desires for a high-end spatial computing device. The resolution is high enough and the quality good enough that he can spend more time exploring different screen replacement use cases, augmented VR experiences via overlays, and productivity use cases of XR. Most PCVR enthusiasts are Windows users, and so Lynch's audience has traditionally focused more on the gaming use cases of VR. As a result they have not been as interested in the Apple Vision Pro due to the lack of high-fidelity input controls. But for Lynch, the basic locomotion gestures made available in ALVR are good enough for him to get around within VRChat without needing to hook up or use any external controllers.
Because of the perceived or actual gaps between his ideal spatial computing use cases and his VR gaming audience, Lynch actually scrapped his formal review video and is considering releasing clips or falling back to Q&A livestreams to field many questions about the trajectory of hardware in the XR industry. Lynch has also been enjoying the mashing up of spatial contexts in XR, mostly via the SteamVR overlays and windows, but he mentions some experiments with bringing in fully spatial objects. It reminds me of the interview that I did with the PlutoVR founders in 2020 when they were experimenting a lot with the idea of multi-app spatial computing paradigms with WebXR and apps like Aardvark by Joe Ludwig. Apple is slowly building out more and more spatial primitives across all of their operating systems, and is becoming more and more game engine-like with the new APIs announced at WWDC, where visionOS 2.0 was revealed as coming out later this Fall. We talk about some of the quality of life features, but also the role of an integrated ecosystem, and what BigScreen Beyond, Valve, and Meta might do to keep up with how Apple is pushing forward these ideas of multi-app integrations within spatial computing. At the moment Lynch's 8 hours of daily usage is likely an extreme outlier, but the ways that he's blending realities together feel like a sign of something deeper that we'll continue to see moving forward. VR typically involves a complete context shift, while AR tends to bring in modular elements of other contexts to shift your existing context. Lynch is on the bleeding edge of fully immersing himself within these virtual contexts, but modulating his experience with these SteamVR overlays in what could best be described as a sort of AR-within-VR use case.
The visionOS 2.0 Beta release (coming this Fall) allows users to overlay their Mac Virtual Display over immersive environments, but SteamVR Overlays already enable this on PCVR experiences and so Lynch's experiences with overlays in VRChat could definitely be a sign of how Apple might e...
    1 hr and 20 mins
  • #1394: Discussion about VRChat Layoffs & Paths to Profitability with Four Community Members
    Jun 15 2024
    On June 12, 2024, VRChat announced they were laying off 30% of their staff, and they posted a letter to all employees sent from CEO Graham Gaylor that said, "We’re reducing the size of our team by around 30% and saying goodbye to many talented team members in the process. This is the hardest change we’ve had to make at VRChat, and Jesse and I take full responsibility for the decisions that brought us here." VRChat listed four main reasons for the layoffs, chief among them that they had added too many individual contributors to their development team without a management layer to prioritize efforts, which meant they needed to reduce their headcount to folks who were more directly working on features that would lead to a path of profitability over the next five years. VRChat has taken a long time to launch their creator economy, which is still in closed beta, and their VRChat+ subscription model doesn't offer all that many differentiated features to drive users to join beyond a signal of patronage to support the platform. I wanted to gather some active VRChat community members to talk about the burgeoning creator economy as well as other potential pathways to profitability around streaming, avatar streamlining, events, contributor / group subscriptions, and emerging features like increased instance size caps. The VRCSpaces community on X (formerly Twitter) held a Space on the day of the announcement discussing the layoffs, and I invited participants Table, yewnyx, and Miss Stabby as well as qDot, who wrote up a really great thread breaking down some of the Silicon Valley startup dynamics. More than anything, I wanted to get their take about what VRChat is getting right, and where the most viable pathways for them might be on their road towards profitability. This is a listener-supported podcast through the Voices of VR Patreon. Music: Fatality
    1 hr and 40 mins
  • #1393: Overview of Raindance Immersive 2024 Selection Featuring Social VR Indie Artists & Virtual Culture
    Jun 6 2024
    Raindance Immersive 2024 opened last weekend in VRChat, and will run for the next four weekends featuring the latest innovations of virtual culture, with 77% of experiences happening on social VR platforms and 69% featuring VRChat. I spoke at length with co-curators Mária Rakušanová and Joe Hunting about each of the 36 experiences, which span nine different categories with four experiences each: Best Art World, Dance Experience, Game, Live Show, Music Experience, Narrative, and Out of Competition experiences, as well as the Music Video of VR and Short Film of VR. https://www.youtube.com/watch?v=PZ-VmsZ5oAU There are a number of mixed reality games that will be showing on the Apple Vision Pro and Quest 3, as well as the four narrative experiences that will be showing at a physical exhibition at Raindance, but otherwise the other 30 experiences all have some connection to a social VR platform. The best live show and best dance experience categories are new this year, and Hunting is continuing to cultivate the two film categories featuring short films and music videos, all of which were shot within VRChat. I talk with both Rakušanová and Hunting about the shift from June to November, and then dive into each of the nine categories and 36 experiences, highlighting the latest trends and innovations by indie XR artists who are primarily working on social VR platforms including VRChat, Resonite, and EngageXR.
    1 hr and 29 mins
  • #1392: The Story Behind Tribeca Immersive’s Pivot Towards Immersive Art with Curator Ana Brzezińska
    May 31 2024
    Curator Ana Brzezińska announced the Tribeca Immersive 2024 line-up on May 23, 2024 mentioning that "this year we are not presenting any VR/AR experiences," but rather that they're pivoting this year to featuring eight immersive art pieces by six different artists being shown at Mērcer Labs: Museum of Art and Technology, which first opened to the New York City public on January 4th, 2024. Tribeca Immersive opens on June 6th and runs for 12 days until June 17th, and you can get tickets to see it here. Some immersive creators discovered this surprising news by receiving a refund to their submission, which said "As the immersive landscape evolves, our category is taking new shape to explore formats outside of VR/AR this year and our decision is not a reflection of the quality of your work… Because our change in format was a recent shift and changed our consideration of the works, we will be refunding you your submission fee." Tribeca Immersive will be taking over five of the rooms at Mērcer Labs that includes "a two-story high 5-wall LED immersive hall; a 4DSOUND installation space; and two infinity rooms, including one equipped with volumetric technology that makes you feel like you’re watching a space opera." This space opera location is called the Dragon Room, which is described by Mērcer Labs on their Instagram page in the following way, "Volumetric lighting allows visitors to see voxels (volumetric pixels) of light in space. The light source is modeled as a transparent object in a container of volume. The resulting effect is of passing through a hologram. The use of mirrors and advanced technology, unlock the power of three-dimensional visualization to create a cascade of luminescent particles that appear suspended in infinite space. The individual light particles in an architectural format offer virtually endless opportunities for formal reinvention. 
The volumetric installation powered by @ledpulse, Dragon02 technology, features a unique configuration exclusive to Mercer Labs. This remarkable installation, the largest of its kind, and incorporates more than half a million LED neurons-microchips, meticulously synchronized to create an unparalleled visual experience." I had a chance to sit down with Brzezińska last week to get a bit more context on this pivot as well as on the eight immersive art pieces in the selection that will be spread across three different daily selections repeating four times over the 12-day run. Also be sure to check out her Substack post titled "In The Stillness of Synthetic Light*. Note on Tribeca Immersive 2024" to get even more context for her intentions and motivations for this year's selection, where she says, "I believe that before we start seeking hope — which I expect to become a major theme in many upcoming art events — we need space for reflection and respite that allows for a radical suspension of judgment." I unfortunately won't be able to travel to Tribeca to check it out myself due to a family medical situation that's limiting my travel this year, but I'm looking forward to hearing back from folks in the community about how it all comes together. There will not be a formal Storyscapes competition this year, and so I will also be interested to see how the Tribeca Immersive selection continues to evolve over time and whether this represents a permanent shift away from immersive stories towards immersive art, or if we'll see the inclusion of both forms in the future. This is a listener-supported podcast through the Voices of VR Patreon. Music: Fatality
    1 hr and 25 mins
  • #1391: ILM Immersive’s “What If…?” Marvel Experience Forges a New Genre of Immersive Storytelling Blending 3D Movie Cinematics with First-Person Embodiment
    May 29 2024
    ILM Immersive & Marvel Studios are releasing What If...?: An Immersive Story exclusively on Apple Vision Pro tomorrow, and I had a chance to take an early look and speak with producer and executive producer Shereif Fattouh as well as technical art director Indira Guerrieri about the process of creating this experience. It's mostly an immersive story with light interactions, but you can't actually die, and so it's more about immersing yourself into these worlds with first-person embodiment and exploring around 10 different gesture-based interactions in nearly 50 different interactive moments across the 40-to-45-minute experience. There are also some disembodied moments where you're watching 3D movie cinematics on shards of glass floating in the environment, and so they're also really leaning into how good 3D movies look on the Apple Vision Pro within this experience. Overall, it blends lots of elements together into something that feels like a new genre of immersive storytelling that builds upon existing IP, giving new opportunities to engage with these characters' stories as well as their worlds. This is also the first Unreal Engine experience releasing on the Apple Vision Pro store, and Fattouh told me that ILM Immersive has been making private modifications to get things working. There are features like mixed reality passthrough that are not yet available in Unreal Engine 5.4. Agile Lens' Alex Coulombe told me, "The Epic Games XR team is small so I would err on the side of assuming no major Vision Pro updates coming soon from Epic. To have the current features working it still requires a source build of Unreal Engine 5.4 (the Epic Games Launcher doesn’t work) and current features include fully immersive mode, hand tracking / gesture recognition, forward and deferred rendering."
So hopefully we'll get more on this soon from Epic Games, but at least we know it's possible and we'll start to see more Unreal Engine experiences on the Apple Vision Pro soon. Be sure to check out the transcript below for more information, and also don't miss my episode featuring my talk on an Elemental Theory of Presence, Experiential Design, and Immersive Storytelling, as well as my previous 480+ episodes on the topic of immersive storytelling over the past decade including previous coverage on Cycles, Myth: A Frozen Tale, Star Wars: Tales from the Galaxy’s Edge, and a conversation with John Gaeta about ILMxLab's The Holo-Cinema from Sundance New Frontier 2016. This is a listener-supported podcast through the Voices of VR Patreon. Music: Fatality
    51 mins