Envisioning the Future: Crafting Magical and Engaging Spatial Apps with ShapesXR (VRARA LA Chapter Replay)

Watch the replay of the VRARA LA Chapter session introducing ShapesXR, a groundbreaking tool that redefines spatial design and collaboration!


Guest Speaker: Inga Petryaevskaya, CEO of ShapesXR
Hosted by Vahe Karamian - VRARA LA Chapter President


XR content transcends the limitations of flat screen 3D by immersing users in a fully interactive and spatial environment. Unlike traditional 3D displays, XR technology offers users a sense of depth, scale, and presence, enhancing their ability to perceive and interact with digital content in a more intuitive and natural manner.

Agenda:
- Introduction
- How does XR content differ from flat screen 3D?
- What does the new product development workflow look like for XR?
- Selected use cases and examples
- Live demo of ShapesXR: the power of truly spatial design and collaboration
- Final remarks

Bio:
Inga Petryaevskaya is the Founder and CEO of ShapesXR. Inga's journey in XR began over seven years ago when she co-founded Tvori, a VR real-time animation software aimed at revolutionizing storytelling and 3D animation. During the development of Tvori, Inga and her team recognized the need for tools that could democratize the creation of immersive content, including VR, AR, and Mixed Reality apps, games, and experiences. Inga's firm conviction is that spatial apps need spatial creation tools: you have to be immersed in the medium of your users to build magical and delightful XR experiences.

This realization led to the creation of ShapesXR, a platform with a mission to make spatial design and collaboration accessible to everyone. As a product owner, Inga is dedicated to delivering a seamless user experience that empowers individuals, regardless of their 3D expertise, to collaborate and design spatially, directly in 3D.

Prior to starting her entrepreneurial journey, Inga held a Director role in the office of the CTO at Dell Technologies, where she was responsible for driving internal and external innovation projects, including M&A; before that, she was a researcher at Siemens Corporate Technology. Inga holds an engineering degree and an MSc in Computer Science from Hamburg University of Technology.

VRARA Members Receive Discount to TNW Conference

VRARA Members receive a 20% discount. Email info@thevrara.com to obtain it.

The 18th annual TNW Conference, happening on June 20 & 21, 2024 in Amsterdam, will bring together 10,000+ industry leaders and tech enthusiasts to reclaim the future and uncover the next big ideas in tech. Be in the same room as corporates, government officials, startups & scaleups and investors to see how they will shape the world of tomorrow.

Let us work towards a future where technology is a beacon of hope once again, all while enjoying the benefits of a conference with the energy of a festival: do business, get inspired, meet industry peers, and enjoy every minute of it.

TNW Conference offers unmatched networking opportunities through roundtable discussions, 1:1 meeting areas, workshops, and throughout the Business Hall. Meet like-minded people from 80+ countries as you build out your international network. The casual conference environment gives you every opportunity to meet the right people through onsite activities such as padel, ping pong, and much more!

Future-proof your next big idea with seven industry-leading themes, including Ren-AI-ssance, Pixels & Profit, Sustainable Futures, Venture, and more. Get excited about the speaker lineup and agenda, with powerhouse speakers such as Vinted CEO Thomas Plantenga, Booking.com CEO Glenn Fogel, OnlyFans CFO Lee Taylor, Tony's Chocolonely Chief of Global Brand and Communications Sadira Furlow, Formula E Porsche race driver Antonio Felix da Costa, and many more!

At TNW Conference we want to set startups and investors up for success through expertly designed packages. The TNW for Startups packages (Bootstrap and Scale-up) and the TNW for Investor Program are designed to maximize your time before, during, and after the event. TNW works closely with you to ensure you have all the right meetings set up before you set foot on the conference grounds.

Equal opportunity and equal access to the tech scene are vital for a better and brighter future. Although progress toward greater diversity is slow, TNW wants to ensure that the voices of the underrepresented are heard. Through its Women in Tech and Under 30s tickets, TNW hopes to facilitate access to world-class events.

Don’t miss out on the most innovative and thought-provoking tech event in Europe. Let’s come together and build towards a more sustainable, equitable, and inclusive future at TNW Conference today!

Tim Cook gives an update on the Apple Vision Pro

Tim Cook gave an update on the Vision Pro during the Apple event this week, with examples from automotive, healthcare, and entertainment. And we already know that half of the Fortune 100 companies have bought the Apple Vision Pro and are developing their own solutions. Learn more at our Enterprise Summit 2024 on July 24!

Porsche is using the Apple Vision Pro to build the “showroom of the future”. Tim also mentioned Porsche was using the Vision Pro to train service technicians and reimagine track experiences.

Sharp Healthcare is improving surgical eyecare by using the Vision Pro. Tim said this was being done through “simulation, analysis and optimization”.

Professional film editing on a theater-size screen: film director Jon M. Chu was editing his upcoming film ‘Wicked’ on an Apple Vision Pro. This was being done from the comfort of his home on a “theater-size screen”.

Learn more at our Enterprise Summit 2024 on July 24!

How Treeview Implements Spatial User Interfaces for Apple Vision Pro

Case Study from VRARA Member ShapesXR.

Background and Objectives

Treeview is a high-end boutique development studio that builds AR/VR products and solutions for enterprise clients. Treeview has been working for over 8 years in the AR/VR industry and is currently ranked as the #1 AR/VR development team in LATAM.

Treeview has started developing software for the Apple Vision Pro and faced unique development challenges in the front-end implementation process for this platform.

When developing Inviewer, a Spatial STEM Simulation library, the team set out to explore an improved workflow to minimize iterations and reduce time in the front-end implementation of spatial user interfaces for the latest AR/VR technologies, including the Apple Vision Pro and Meta Quest 3.

"ShapesXR has enabled our team to drastically reduce the time for front-end implementations of Spatial User Interfaces for the latest generation of XR devices, including the Apple Vision Pro and Meta Quest 3." - Horacio Torrendell, Founder & CEO of Treeview

Decision Process

We were having trouble with the number of iterations needed to get the sizing, positions, and rotations of the UI to feel correct on the Apple Vision Pro (AVP), due to the slow Unity→Xcode→AVP build cycles. We used ShapesXR to enable the design team to define the exact spatial layout, which was then used as the source of truth for the engineering team to implement in Unity.

We use Figma as our UI design tool, but there is a large gap in communication between a 2D representation of a spatial interface and the actual spatial interface. We discovered that ShapesXR bridges this communication gap.

Implementation and Use

We used ShapesXR to design the exact spatial layout we wanted for the Mixed Reality app. We executed design reviews with all stakeholders in ShapesXR to reach agreement on the final design. Once agreed upon, this design was used as the source of truth for the front-end implementation on the Apple Vision Pro and Meta Quest 3.

Results and Feedback

ShapesXR allowed us to drastically reduce implementation time by cutting the number of front-end implementation cycles. Before using ShapesXR, a front-end implementation could take around 5-10 iterations, requiring the design team to collaborate with development to really nail a high-quality interface implementation. Now with ShapesXR, we've reduced the iterations to around two: a first implementation that aligns exactly with the ShapesXR source of truth, and a final polish round that completes the front-end implementation.

Previously, we would use Figma as our source of truth for the user interface, which has many gaps in information regarding sizing, distance, and rotation of objects relative to the user’s starting position. Now with ShapesXR, our team can define a source of truth that aligns exactly with the expected outcome of the front-end implementation.
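
To make the idea of a spatial source of truth concrete, here is a minimal, hypothetical sketch in Python of how the attributes called out above (per-panel sizing, distance, and rotation relative to the user's starting position) might be captured in a machine-readable layout file and converted into the values an engine-side implementation consumes. The panel_layout.json file, field names, and unit conventions are illustrative assumptions, not ShapesXR's or Treeview's actual export format.

```python
import json
import math
from dataclasses import dataclass


@dataclass
class PanelTransform:
    """One UI panel's placement relative to the user's starting position (assumed convention)."""
    name: str
    position_m: tuple    # (x, y, z) offset from the user's start point, in meters
    rotation_deg: tuple  # (pitch, yaw, roll) in degrees
    size_m: tuple        # (width, height) in meters


def load_layout(path: str) -> list[PanelTransform]:
    """Parse a hypothetical JSON export of the designed layout (the schema is illustrative)."""
    with open(path) as f:
        data = json.load(f)
    return [
        PanelTransform(
            name=p["name"],
            position_m=tuple(p["position_m"]),
            rotation_deg=tuple(p["rotation_deg"]),
            size_m=tuple(p["size_m"]),
        )
        for p in data["panels"]
    ]


def to_engine_units(panel: PanelTransform) -> dict:
    """Convert design values into what an engine-side implementation might expect
    (rotations in radians; meters kept as-is)."""
    return {
        "name": panel.name,
        "position": panel.position_m,
        "rotation_rad": tuple(math.radians(a) for a in panel.rotation_deg),
        "size": panel.size_m,
    }


if __name__ == "__main__":
    for panel in load_layout("panel_layout.json"):
        print(to_engine_units(panel))
```

The point of a format like this is simply that the numbers the designer signed off on travel unchanged to the engine side, instead of being re-estimated from a 2D mockup.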

Overall Experience

We started with a self-onboarding process for ShapesXR, through which we discovered the power of this tool. We then had a team onboarding session with one of the ShapesXR team members, which was invaluable for providing an in-depth presentation and offering our team a chance to ask questions and explore how we could fully leverage the tool.

One feature we found particularly useful was the layer feature, which enables us to simulate UI logic flow and even define UI animations. This is advantageous because it allows us to add more layers of detail during the design phase, ensuring all details are defined before starting front-end implementation.

Future Plans

Our plan is to integrate ShapesXR into the design process of the AR/VR software that we build at Treeview. As a development studio that works with many clients across many projects, we have various design processes for new AR/VR applications; we intend to fold ShapesXR into that workflow, moving from Figma to ShapesXR, then to Unity, and finally to devices such as the Apple Vision Pro, Meta Quest, Magic Leap, and Microsoft HoloLens.

Recommendations

We feel that ShapesXR is becoming the primary design tool for XR development, and it should be used by all XR development teams. For us, the key benefit is that it allows the design team to cover a more extensive range of design details specific to spatial user interfaces, leaving fewer design decision gaps for the engineering team to fill during the front-end software implementation.

Revolutionize Clinical Solutions with Virtual Reality Medical Training

Guest post from VRARA Member Travancore Analytics

In the dynamic landscape of healthcare, where every moment is critical and every decision holds immense weight, the need for comprehensive and effective training is undeniable. Traditionally, medical education has relied on conventional methods, often constrained by cost, accessibility, and the ability to replicate real-world scenarios authentically. However, the emergence of virtual reality (VR) technology has brought about a transformative shift, offering unprecedented opportunities for immersive and interactive learning experiences. This case study delves into a pioneering collaboration between an esteemed Association for Oral and Maxillofacial Surgeons and a technological partner, aiming to revolutionize medical training through the development of a VR solution tailored to address the specific challenges faced in emergency situations within oral and maxillofacial surgery.

At the forefront of this collaboration was the recognition of the unique challenges posed by traditional training methods, particularly during high-stakes scenarios such as membership renewals and licensure tests. Confronted with the limitations of simulating emergencies using traditional mannequins and static scenarios, the Association sought to harness the potential of VR technology to create a more dynamic and realistic training environment. Thus, the project embarked on a journey to develop a VR simulation that would not only replicate critical medical scenarios but also provide users with a hands-on, risk-free platform to hone their skills and decision-making abilities in real-time.

Check here for the full article: 

https://www.travancoreanalytics.com/case-study/virtual-reality-medical-training/

Website: https://www.travancoreanalytics.com/

Follow us on LinkedIn for the latest updates: 

https://www.linkedin.com/company/travancore-analytics/

University of Miami – ENGAGE Case Study

ENGAGE is an enterprise SaaS spatial computing platform aimed at the professional and academic markets. It is designed for professionals, event organizers, educators, and corporations to build their own virtual worlds, provide metaverse services directly to their clients, and create new business models.

UMverse is a groundbreaking initiative at the University of Miami to integrate emerging technologies within educational and research environments. The program utilizes immersive technology to revolutionize educational approaches. It aims to move beyond conventional learning tools, such as textbooks, photographs, films, or video games, and instead incorporate immersive and multi-sensory experiences and simulations. UMverse is also advancing research about optimal ways to transform the educational landscape and redefine the campus experience while providing personalized and interactive learning and patient care opportunities. 

Recently, Engage assisted the University of Miami in organizing an innovative conference in collaboration with the Virtual Reality/Augmented Reality Association (VRARA), concentrating on the application of immersive technologies like Virtual and Augmented Reality within the Healthcare sector. This event brought together nearly 100 professionals, including practicing physicians, researchers, faculty, and tech industry leaders, to engage in person, exchange insights on their current use of VR/AR and AI, share their learnings, and discuss the anticipated impact of these technologies across the healthcare industry.

What made this conference a first of its kind was that besides allowing people to attend the conference at the University of Miami Lakeside conference center in person, attendees from around the world used Engage’s advanced spatial computing platform to attend it virtually. 

According to Kim Grinfeder, Department Chair of Interactive Media, the goal was to have as many attendees as possible at the conference to exchange knowledge on AR and VR, while recognizing that not everyone can attend in person. The university therefore chose to provide a remote participation option using virtual reality, diverging from traditional video conferencing to offer a more immersive experience, and decided to employ the same spatial computing platform already in use at the school for instructional purposes.

We take pride in the University of Miami’s role as a frontrunner in integrating immersive technologies into both undergraduate and postgraduate education. Demonstrating the application of virtual reality technology at our conference extends its use beyond classroom instruction to encompass professional events and conferences, showcasing its potential for organizational settings.

Today, the ENGAGE XR platform is used across the University of Miami, not only to teach students in over 40 different courses but also to train them. For example, medical students at the Bascom Palmer Eye Institute are using an Engage application we developed to understand how slit lamps are used for eye examinations across various eye afflictions.

While virtual reality training and event participation are gaining traction, the ENGAGE platform enhances this experience by allowing attendees to join live conferences from anywhere globally, offering a more immersive experience than standard video live streaming. On this platform, participants sit in a virtual conference room, viewing live presentations on a virtual stage as though they were at a physical event. They can interact with one another, engage in discussions with presenters, and actively participate in the conference, all from the comfort of their home or office.

Grinfeder adds, “The Engage platform enables students to learn in a virtual environment, offering a unique spatial aspect that platforms like Zoom lack and fostering enhanced collaboration opportunities. At the University of Miami, we explore its potential across various disciplines, using the VR modality for classes in Art History, Computer Engineering, Psychology, Religion, and others.”

Thomas Merrick, Associate Director of XR at the University of Miami, says a university is an ideal environment in which to try new technologies: “Using the ENGAGE platform for our academic studies has demonstrated that we are approaching a tipping point where XR technology will become ubiquitous. When we decided to host the VR/AR healthcare summit, we wanted to make it accessible to as many people as possible. Utilizing Engage allowed us to blend the virtual with the physical and provide a broader experience for the attendees.”

Engage allows instructors to use their current course materials within its platform or adapt them to leverage the technology’s immersive capabilities. Moreover, students can attend sessions from various devices, including desktop computers, tablets, and VR headsets—a particularly appealing option as VR gaming gains popularity among our students.

The future of learning will include extended reality technologies at all levels. I also see a future where physical events will have virtual attendee components as standard, so if you can’t go to that conference or that concert live, you can still get many of the benefits and much of the experience by attending in virtual reality. It’s exciting to be involved in this next evolution of the Internet, which will be real-time 3D.

Avgoustos Tsinakos and Chrysoula Lazou appointed as Co-Chairs of University and Colleges Committee

We are thrilled to welcome Prof. Avgoustos Tsinakos and Chrysoula Lazou to help lead our community and host online sessions for the University and Colleges Committee.

This committee shares and creates best practices, guidelines, and calls to action for university curricula and research relating to VR and AR.

We are excited for the opportunity given by the VR/AR Association to co-chair the University and Colleges Committee worldwide.

Our vision is to focus on revealing, collecting, and presenting best practices and innovators in universities and colleges (per country) in order to boost cooperation and expertise sharing among academics, (PhD) researchers, and teachers/instructors, and to inspire the educators of the future.

Immersive technologies can create simulations replicating real-life scenarios, allowing students to practice skills while enabling instructors to personalize instruction, track student progress, adjust the difficulty level, and provide personalized feedback based on individual needs. In addition, they offer instructors new ways to present content and engage students in a variety of topics. — Prof. Avgoustos Tsinakos and Chrysoula Lazou

XR Training lands OTA deal to build Amphibious Combat Vehicle simulators

XR Training - a fast-growing global studio of VR/XR evangelists, developers, engineers and leaders in the defense industry - has just won an OTA deal to lead the production of 81 VR simulators in 2024.

These simulators will be used to train modern Marines in the operation of a cutting-edge Amphibious Combat Vehicle (ACV). You can read more about it in the press release here.

This custom-built immersive VR training solution serves as a crucial turning point in the evolution of training for the U.S. Marine Corps. Learn more!

Recap of Scalable Student-Driven Production of XR Learning Experiences

Colleen Bielitz hosted our online session, with Prof. Robert LiKamWa giving a presentation.

Dr. Robert LiKamWa is an associate professor at Arizona State University, appointed in the School of Arts, Media and Engineering (AME) and the School of Electrical, Computer and Energy Engineering (ECEE). LiKamWa directs Meteor Studio, which explores the research and design of software and hardware for mobile Augmented Reality, Virtual Reality, Mixed Reality, and visual computing systems, and their ability to help people tell their stories. To this end, Meteor Studio’s research and design projects span three arcs: (i) advanced visual capture and processing systems, (ii) systems for hybrid virtual-physical immersion through augmentation of senses, and (iii) design frameworks for data-driven augmented reality and virtual reality storytelling and sensemaking.

The meeting discussed XR learning at ASU scale and beyond, emphasizing partnerships around AR and VR for diverse learning opportunities and immersive storytelling with Dreamscape Learn. The XR program at ASU was highlighted, focusing on MeSH system implementation for scalability and student accessibility, including studio mentorship and faculty involvement. Initiatives for immersive learning experiences for high school students, meteorology simulations, and XR environments in education were also addressed, along with ongoing research projects and collaborations. The meeting explored micro-credentialing in XR programs, VR applications in education for enhanced learning experiences, and potential collaborations on XR lesson builder platforms. The importance of partnerships, scalability, and philanthropic support was emphasized as next steps for action.

Topics & Highlights

1. Discussion on XR Learning at ASU Scale and Beyond

  • Investments in partnerships with AR and VR for different learning opportunities and the creation of learning experiences for various degree programs at ASU.

  • Partnership with Dreamscape Learn, emphasizing immersive storytelling for education and engaging students in their learning by making them characters in their own stories.

2. MeSH - Support Network and Studio Mentorship

  • The decision to have faculty mentors for guilds to provide skills and expert mentorship in their subject areas.

  • Discussion about the collaborative and co-designed approach with clients in the intake process.

3. Implementation of XR Environments in Education

  • Importance of instructional design, aligning objectives, and the creative team's role in engaging students; LiKamWa highlighted the need for assessments, time budgeting, and the involvement of primary roles for project success.

  • Interdisciplinary nature of the project, involving students from various programs, the establishment of guilds for skill development, and the role of instructional designers in course design at ASU.

4. Research Projects and Collaborations

  • Various research projects, including Hurricane Heroes and the Verizon project, led by PhD students, and their publications at different conferences.

5. Virtual Reality (VR) for Education

  • The conversation covers the application of VR in education, specifically focusing on scale perception and immersive experiences. ASU's partnership with Dreamscape for creating educational content and hiring students for various roles is discussed.

Mike Vann appointed as co-chair for VR/AR Association Metaverse Committee for XR, Web3, and AI

We are thrilled to have Mike Vann help us lead our community and host online sessions for Metaverse with guest speakers, live discussion and networking so our Members can grow their knowledge and connections!

Mike is the CEO & Co-founder of Agents of Play, a development studio creating business metaverse experiences, AI-powered avatars/agents, and experiential technology for global enterprise, agency, non-profit, and .gov partners. In 2024, AOP will launch bizVersive, a platform allowing businesses to create 3D immersive web spaces with AI characters. Mike actively promotes metaverse strategy and the benefits of the technology across all industries.

Driven by the potential of virtual human interactions, Mike is inspired by the entrepreneurs, creatives and technologists shaping the metaverse. He envisions an open metaverse that enhances and extends life and business for everyone.

Previously, Mike collaborated with the world's largest mobile and social gaming properties to develop and monetize vast player communities. His metaverse journey began at the intersection of game development and brands seeking premier branded entertainment experiences. Agents of Play was established by game and brand experts after collaborating with the founders of Activision, leading to the pioneering of Advergames. Before his venture into technology, Mike held media and sponsorship roles at the NFL and New York Times company.

I’m energized to work with a community of innovators to realize the potential and impact of a global metaverse for all. The VRARA Metaverse committee will establish best practices, drive research and enable strategic collaboration among VR/AR/XR, Metaverse, Web3, AI and industry leaders.

The Metaverse has the potential to revolutionize global interaction, economy, and creativity, creating a connected reality that transcends physical boundaries and reshapes how we live, work, and connect.

The future is here, let’s go!
— Mike Vann



VRARA Conference Spotlights Aerospace and Defense Topics

By Shane Klestinski, Associate Editor, Team Orlando News

 

The Virtual Reality/Augmented Reality Association (VRARA) Central Florida Immersive Technology Summit 2024 featured an afternoon of discussion panels themed around “The Future of Immersive Tech in Aerospace & Defense” at Full Sail University on April 18.

 

Hunter Stinson, senior technologist at Integration Innovation Inc. (i3), introduced the afternoon’s series by addressing the challenges of adopting immersive technology in the defense and aerospace industries, and he provided some insights on what he thought these industries might face in the future. Stinson also described examples of immersive applications that are currently improving military readiness, like Mass Virtual’s “Virtual Hangar” extended reality (XR) suite that trains Air Force aircraft maintenance workers. He also mentioned technology like Red 6’s augmented reality efforts that project holograms onto a pilot’s visor for live dogfighting scenarios in flight.

“Planes move pretty fast,” Stinson said. “Stabilization algorithms and putting holograms in a specific place are hard problems, especially when you’re moving hundreds of miles an hour. They’re solving some really neat challenges with this tech.”

 

Nathan Klose, senior creative director at i3, moderated the first two “Immersive Tech in Aero/Defense” panels, which he described as “call and response” discussions. The first provided a government perspective on the topic. Klose began the discussion by asking about solutions that addressed concerns and improved training readiness for operations and maintenance. John Meyers, executive director for the Naval Air Warfare Center Training Systems Division, responded by noting the Navy’s issues in pilot training.

 

“We just can’t make as many pilots fast enough, so now we’re training with T-45 [-related] devices,” Meyers said. “We started with virtual reality and now we’re getting practice with mixed reality… so now we’re saving flight time and we’re getting pilots those reps and sets so they can get their wings faster.”

The second “Aero/Defense” panel gave industry representatives a chance to share their insights and respond to topics raised in the previous panel. During the discussion, Klose mentioned sustainment, interoperability, and standards, which the previous panel had discussed. He asked the second panel about “pain points” associated with those topics, and how to resolve them.

 

“The mission doesn’t change, but the budget does, so what have we learned before and what can we bring back in?” said Waymon Armstrong, CEO and founder of Engineering & Computer Simulations, as he emphasized the importance of reuse. “[With medics,] what we do for the Army applies to Navy corpsmen for the Marines and Air Force pararescuemen. There are things that align very well for that reuse… so build once and deploy often.”

The next panel, moderated by Halldale Media CEO Andy Smith, concentrated on immersive technology, such as XR, in aerospace. Retired Air Force Col. Michael Peeler, a former pilot and current communications director at Mass Virtual, said that he benefited from immersive training near the end of his career, but emphasized that he would have been much more proficient if it had been available when he was just starting out.

 

“Growing up, the traditional thinking was that you could get something really good, really fast, or really cheap, but you couldn’t get all three – and that affected training in trade-offs, for example, in an aircraft’s fuel consumption versus its speed,” Peeler said. “We don’t have those restrictions at this inflection point in XR training. We can simultaneously get our customers better training faster than they’ve ever had before, with a return on investment that’s enormous, and they’re measuring that return in months, not years.”

Higher Prana VR app, Now Available on the PICO Business Store

The Higher Prana VR app is now available on the PICO XR Business Store!

Higher Prana is a virtual reality experience uniquely designed around ambient and electro-acoustic soundscapes. The VR meditation app uses carefully engineered spatial audio to guide you into deeper relaxation and heightened states of mindfulness, offering both beginner-friendly expert guidance and immersive 360-degree nature environments, complete with yoga mats for grounding.

Each session is expertly crafted with trauma-sensitive guidance to ensure a safe and personalized experience for all users. With Higher Prana VR, manage stress, improve pain management, optimize sleep, and enhance mental health outcomes.

VRdirect News: Boosting employee wellbeing through Virtual Reality

Generali Deutschland AG, one of the largest insurance groups in Germany, has partnered with VRdirect to create a Virtual Reality project aimed at promoting health and wellness in the workplace. The VR project features various health practices and can be accessed via VR headsets, smartphones, tablets and web browsers.

Read the full success story here!

This project is a great example of how Virtual Reality can be used for soft skill training and social matters.

Schedule a meeting if you would like to learn more about how VR can benefit your HR department! 

VRdirect will be hosting two high-profile events in May together with Meta and the VR/AR Association. For additional information, please read further below!

Optimizing Customer Experiences Through Customer Call Analytics Platform

In today's fiercely competitive business landscape, the pursuit of superior customer experience has become paramount. Understanding and promptly addressing customer needs are critical not only for survival but also for thriving amidst intense competition. However, the sheer volume of customer interactions presents a daunting challenge, with crucial details often slipping through the cracks. To overcome this obstacle, our transformative customer call analytics platform, powered by advanced generative AI technology such as GPT-4, provides a solution. By seamlessly extracting and analyzing content from customer calls, the platform ensures that no detail goes unnoticed, enabling businesses to stay attuned to customer requirements and maintain a competitive edge.

Built on cutting-edge AI capabilities, our platform offers a comprehensive suite of features tailored to enhance customer interactions and streamline business processes. From tracking the status of customer issues to providing detailed customer profiles for personalized interactions, the platform empowers businesses to deliver exceptional service. Moreover, its ability to swiftly identify specific products or services discussed during calls enhances cross-selling and upselling opportunities, maximizing revenue potential. With sentiment analysis functionality, businesses can accurately gauge customer satisfaction levels and adapt strategies accordingly, fostering stronger client relationships and long-term loyalty.
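
To illustrate the kind of extraction described above, here is a minimal sketch of how sentiment, open issues, and mentioned products might be pulled from a single call transcript using the OpenAI Python client. This is not Travancore's implementation; the prompt, the output schema, and the model choice are assumptions made for the example.

```python
import json

from openai import OpenAI  # assumes the openai package (v1+) and an API key in OPENAI_API_KEY

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a call-analytics assistant. Given a customer call transcript, "
    "return JSON with keys: sentiment (positive/neutral/negative), "
    "issues (list of open issues), products_mentioned (list of product names)."
)


def analyze_call(transcript: str) -> dict:
    """Extract sentiment, open issues, and products mentioned from one transcript."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": transcript},
        ],
        response_format={"type": "json_object"},  # request machine-readable output
    )
    return json.loads(response.choices[0].message.content)


if __name__ == "__main__":
    sample = (
        "Customer: My policy renewal failed twice and I'm getting frustrated. "
        "Also, does the premium plan cover roadside assistance?"
    )
    print(analyze_call(sample))
```

A production system would add transcription, batching, and storage around a core like this, but the structured-output step is the part that keeps details from slipping through the cracks.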

By revolutionizing how businesses interact with customers, our platform delivers impactful outcomes that drive operational efficiency and elevate customer experiences. From ensuring no valuable information is lost during interactions to eliminating the need for multiple documents by centralizing information, the platform enhances productivity and task management. With a deeper understanding of client requirements and efficient issue resolution, businesses can address customer needs effectively and proactively. Ultimately, our customer call analytics platform is not just a tool but a strategic asset that empowers businesses to make data-informed decisions and achieve unparalleled success in today's digital era.

Check here for the full article: 

https://www.travancoreanalytics.com/case-study/customer-call-analytics-platform/

Website: https://www.travancoreanalytics.com/

Follow us on LinkedIn for the latest updates: 

https://www.linkedin.com/company/travancore-analytics/


Virtual Reality Job Training for Continued Education

From VRARA Member Oberon.

As the energy industry evolves, securing qualified talent to maintain efficient operations despite structural and demographic headwinds continues to be a challenge for employers. While jobs across the industry continue to grow, they also continue to outpace the qualified labor available to support them. In addition to this supply-demand imbalance, the retirement of the industry's most experienced class of employees presents challenges in preserving their historical knowledge as efficiently as possible.

According to the World Energy Employment Report, global energy employment rose to 67 million in 2022, an increase of 3.5 million from pre-pandemic levels. According to a proprietary survey carried out by the International Energy Agency, around 36% of these jobs are in high-skilled occupations, compared with about 27% for the wider economy. As these challenges continue to grow, many companies have opted to retrain workers for higher-demand positions and increase flexibility. This raises the question: can legacy training methods sustain this level of need and be efficient enough to close the gap? In most cases, the answer is no. Many leading organizations are following the lead of innovative manufacturers and retailers by turning to virtual reality training solutions. Virtual reality training solutions like those offered by Oberon Technologies® capture tribal knowledge and use it to create a fully immersive virtual experience that provides an innovative way to reskill, upskill, and train employees with maximum efficiency.

Training Challenges in the Energy Sector

For the oil and gas sector, reskilling and upskilling new employees continues to be a major cost burden. According to the Society for Human Resource Management, the cost of training a new employee, independent of industry, is estimated at approximately $4,125. Oil and gas employers see significantly higher costs, as roles often involve working in higher-risk environments: in extreme weather conditions, with heavy machinery, at great heights, with high-voltage equipment, and at increased risk of exhaustion and fatigue as well as fire and explosion.

While much of the cost of job training comes from time spent, there are other, less obvious costs, such as those associated with inefficient methods that result in labor turnover, damaged equipment, injury, and the need for re-training. This is especially true for highly technical roles and advanced skill training. Because more experienced employees are often called on to deliver conventional training, the cost of their high-value time is an additional consideration.

In addition to those risks and challenges, many skilled energy workers will also need re-certification throughout their careers. The cost of that continued education as well as the value of their time spent is an additional point of concern for employers.

Virtual Reality Training vs. Legacy Training Techniques

A 2020 PwC study shows that employees completed VR programs four times faster than in-person training and 1.5 times faster than e-learning. In addition, VR training results in learning retention rates of up to 80% one year after training, compared to 20% just one week after traditional training.
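
As a rough, back-of-the-envelope illustration of what those multipliers can mean for a single course (the baseline course length and cohort size below are assumptions, not figures from the study):

```python
# Back-of-the-envelope illustration of the multipliers cited above.
# Assumed inputs (not from the study): an 8-hour classroom course and a 200-person cohort.
classroom_hours = 8.0
trainees = 200

vr_hours = classroom_hours / 4.0    # VR reported as ~4x faster than classroom training
elearning_hours = vr_hours * 1.5    # VR reported as ~1.5x faster than e-learning

hours_saved_vs_classroom = (classroom_hours - vr_hours) * trainees
hours_saved_vs_elearning = (elearning_hours - vr_hours) * trainees

print(f"Per trainee: VR {vr_hours:.1f} h, classroom {classroom_hours:.1f} h, e-learning {elearning_hours:.1f} h")
print(f"Cohort hours saved vs classroom:  {hours_saved_vs_classroom:.0f}")
print(f"Cohort hours saved vs e-learning: {hours_saved_vs_elearning:.0f}")
```

Under these assumed numbers, a 200-person cohort would spend 400 hours in VR instead of 1,600 hours in a classroom; actual savings depend on the course and delivery model.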

In addition to efficiency and knowledge retention, virtual reality methods allow employers to evaluate a trainee's ability to perform tasks in a very specific, and often high-risk, environment. Knowing their capabilities before they are put into a real-world situation allows for quicker recognition of any knowledge gaps and the reduction of critical errors, which are often caused by lack of experience. Increasing the efficacy of training greatly reduces the odds of costly human error and of safety and environmental hazards. The advantages of VR training include:

  • Locational Awareness: VR provides opportunities for trainees to acquaint themselves with a work environment before they ever set foot inside it, providing a level of familiarity when on-the-job work begins, without the assumption of risk.

  • Knowledge Transfer: Given the skills gap that the industry is facing, VR can help bridge the generation gap by documenting the historical knowledge of the most experienced generation of employees. In addition to knowledge conservation, VR is an enticing recruiting tool for employers to utilize in attracting top talent.

  • Recording and Playback: In-the-moment feedback is crucial to training success. Because the training takes place in a digital environment, each session can be recorded and replayed for analysis and feedback, which helps trainees learn from their mistakes while trainers evaluate performance in a way that directly reflects the work. Analytics can be run against the captured data to identify trends and areas needing more attention.

  • Hazardous Scenario Simulation: VR allows trainers to introduce simulations into the training that could be impractical or dangerous for a trainee. VR provides important hands-on experience without the risk and costs associated with real-world learning. Thus, workers can be exposed to a wider range of learning situations, and training can be adapted to unique risk factors.

  • Unlimited Use: Once a VR environment has been created, it can be used and reused on an almost limitless basis, greatly extending the potential for training and knowledge acquisition, while at the same time reducing the time and cost of off-site training.

  • Enhanced Proficiency Training: VR allows technicians to undertake proficiency training just prior to entering a work site, with details that match the actual environment they will be working in.

  • Improved Engagement: VR fully engages the senses, preventing an employee from being distracted by outside influences and thus maximizing learning engagement and retention.

Examples and Typical Use Cases

VR increases knowledge retention, improves the safety of front-line workers, and speeds up the training process by ensuring personnel are well trained and prepared for an emergency. For example, in a recent partnership with GTI Energy, a leading non-profit research and training organization focused on energy transition solutions, Oberon Technologies created a module designed to prepare trainees to “make safe” the scene of an emergency, ensuring the protection of life and property while collaborating with emergency response teams.

The module uses a fully immersive virtual reality training course, integrated into overall training to increase engagement and improve overall retention by 80% or more. Scenarios address performing emergency operations as defined by the emergency response plan and in compliance with federal regulations. The VR courses further allow trainees and instructors to interact in the same virtual environment, enabling virtual, hands-on training.

Additional Use Case Scenarios Include:

  • New employee onboarding

  • Re-certification courses

  • Facility and working environment familiarization

  • Equipment location and identification

  • Emotional evaluation (testing for fear of heights, claustrophobia, motion sickness, etc.)

  • Standard operating procedures

  • Advanced skills training

  • Hazard identification

  • Inspection procedures

  • Maintenance procedures

  • Emergency response procedures

  • Safety and compliance training

  • Skills and environment screening

Recap of 1 Billion Humans in AI & XR

Doug Hohulin hosted our online session, with Dr. Harvey Castro giving a presentation.

The meeting discussed the integration of Artificial Intelligence (AI) and Extended Reality (XR) in healthcare, emphasizing the benefits of combining AI for intelligent processing with XR for immersive visuals to enhance learning and potentially gamify experiences. The conversation also touched on the potential of AI to improve healthcare outcomes and reduce medical errors. Doug Hohulin shared insights on AI and XR integration in healthcare, focusing on responsible AI, automated connected vehicles, and healthcare solutions to reduce medical errors. The discussion also covered the impact of generative AI on efficiency gains and time savings, and its projected contribution to world GDP over the next five years. Additionally, there was exploration of how AI can complement healthcare services, reduce clinician burnout, provide second opinions, and assist in patient care management, as well as how AI can be leveraged for quality care, the aging population, and significant reductions in deaths from causes such as traffic accidents and medical errors. The conversation further covered AI in medical diagnosis and medical practice, reinforcement learning in healthcare, the integration of augmented reality in healthcare, resistance to AI in medicine, technology adoption, mental health care, and the integration of wearables and AI in healthcare. Lastly, there was a discussion on venture investing in healthcare technology startups, emphasizing the importance of domain knowledge, managing risks, dealing with regulatory bodies like the FDA, and the support needed for successful companies.

Topics & Highlights

1. Integration of AI and XR in Healthcare

  • The discussion highlighted the benefits of combining AI and XR in healthcare to create a more immersive and effective learning experience, emphasizing the potential for gamification to enhance learning and exploration.

2. Impact of Generative AI on Efficiency and GDP

  • Efficiency gains and time savings from using generative AI, citing examples from studies by Salesforce and Boston Consulting Group. Accenture's projection of generative AI's impact on world GDP was also mentioned.

  • Further discussion on the potential growth of generative AI usage, its impact on sectors like healthcare, and the projection of a billion people using immersive devices in the metaverse.

3. Role of AI in Healthcare and Patient Treatment

  • Potential of AI in healthcare, addressing issues like clinician burnout, patient treatment, second opinions, and AI's role in complementing healthcare services.

4. Utilizing AI in Healthcare for Quality Care and Wellness Improvement

  • Potential of using AI in healthcare to improve quality care, reduce deaths significantly, and address challenges like traffic deaths and medical errors.

5. AI Applications in Medical Diagnosis

  • AI assisted in diagnosing Tethered Cord Syndrome after multiple misdiagnoses, highlighting the potential of AI in medical diagnosis.

6. Role of AI in Medicine

  • Discussion on the advancements in AI for medical applications, including predictive analytics and the transition from evidence-based medicine to intelligence-based medicine.

7. Reinforcement Learning in Healthcare

  • Importance of reinforcement learning in healthcare, the need for specific language models, and the potential applications of bots and AI in personalized patient care.

  • Further discussion on medical training using VR technology for anatomy learning, procedures, and simulations in healthcare.

8. Integration of Augmented Reality in Healthcare

  • Benefits of using AR glasses in healthcare, such as accessing patient data, visualizing medical images, educating patients, and enhancing workflow; Castro mentioned the cost of the Apple Vision Pro and its impact on different healthcare departments.

  • Experiences with AI technologies like ChatGPT-4 and their applications in healthcare, including diagnosing medical conditions, analyzing images, creating podcasts, and improving efficiency.

9. Integration of AI in Healthcare

  • Concerns about resistance from doctors toward adopting AI in healthcare due to fears of obsolescence, changes in processes, and compensation issues.

  • Importance of gradual education and implementation of AI in healthcare to address doctors' concerns and ensure a positive reception, emphasizing the need for experts like Castro to guide and educate healthcare professionals.

10. Resistance to AI in Medicine

  • The participants discussed the challenges of convincing doctors to accept AI due to pride, competition, and territorial behavior. Harvey shared experiences of making doctors feel dumb to eventually gain their respect. Doug mentioned patients' fear of AI similar to vaccine hesitancy.

11. Integration of AI in Medical Practice

  • The participants emphasized the importance of making the best first opinion in patient care. They discussed the potential motivators for integrating AI in medical practice, including improved diagnostics and mitigating medical practice insurance risks.

12. Improving Healthcare Diagnosis and Technology Adoption

  • Concerns about misdiagnosis by doctors, the slow adoption of technology in healthcare, and the potential risks of over-reliance on AI in patient care.

  • Discussion on the potential risks of excessive reliance on AI in mental health care, highlighting the importance of human-to-human connections and the need for a human in the loop to prevent adverse outcomes.

  • Discussion on the importance of maintaining a balance between AI and human interaction in mental health care, emphasizing the need for human oversight in AI-driven patient interactions.

13. Integration of AI in Mental Health Care

  • Concerns about patients relying on AI tools like “Dr. Google” and “Dr. ChatGPT” for medical advice, highlighting the importance of proper medical training and supervision.

  • Agreement that AI tools should supplement, not replace, human medical care.

14. Integration of Wearables and AI in Healthcare

  • Discussion on the significance of HIPAA compliance for healthcare apps and the use of on-device AI for enhanced security and user comfort.

VRARA London: VR Storytelling, and Guided Visit to IBC Accelerators Media Innovation Programme

We are here to keep you up to date on upcoming events and news for VRARA members and partners. If you have something to share, contact us at londonmarketing@thevrara.com.

In this edition we have VR storytelling at the Open City Documentary Festival. We also have a guided visit to the 2024 IBC Accelerators Media Innovation Programme kick-off event.

Losing Home: Expanded Realities at Open City Documentary Festival

Celebrating the art of non-fiction, Open City Documentary Festival aims to challenge and expand the idea of documentary in all its forms.

Losing Home will feature work by five artists, ranging in form from interactive VR and mixed reality experiences to video games and multiple-channel video, all of which address ways in which we can become alienated from spaces that once felt familiar, be they the cities we live in, the relationships we are part of, or the bodies we live in. The exhibition will include the first UK presentation of new work by artists Aay Liparoto, Patricia Echeverria Liras, and Ben Joseph Andrews & Emma Roberts, and will also include a new version of Alice Bucknell's The Alluvials alongside the London premiere of Nick Smith's We Were Both Wrong.

The exhibition's private view will take place on Thursday 25 April from 4:30 to 7pm, and the exhibition will then be open from 12 to 7pm from Friday 26 to Monday 29 April. The exhibition is free and no booking is required; it is located at Rich Mix, the festival's hub in Shoreditch.

You can read more about the exhibition here.


IBC Accelerators Media Innovation Programme

The International Broadcasting Convention (IBC) held the kick-off day for their 2024 Accelerator programme last month.

Muki Kulhan, Innovation Lead, IBC Accelerators Media Innovation Programme, gave us this introduction to this exciting and productive initiative in the media technology sector:

“Over the last five years, the IBC Accelerators Media Innovation Programme has provided a really amazing R&D sandbox for new and progressive, emerging media projects exploring new workflows, tools and techniques in XR and Game Engine led creative production. This year will be more ambitious than ever before, with XR, VR, AR and AI-led initiatives becoming truly a part of creative productions in our media, entertainment and broadcasting industries. I can’t wait to see what happens this year and we promise to keep the VRARA community updated as the projects progress!”

You can get a taste of the day in this video: https://www.youtube.com/watch?v=wx5inTAEOhE, and read about the selected challenges here: https://show.ibc.org/accelerators-2024-challenges. We plan to follow the progress of the programme throughout the year.

Do you have a story to share?

If you have a story to contribute to the VRARA London Chapter newsletter, contact John Dowsland at londonmarketing@thevrara.com.

John has many years of experience in content management & distribution for film, music and corporate communications, and currently works in calibration technology. In 2018 he joined the board of Cinema Technology Community CIC, and he has been involved in the VRARA since 2022.

Thanks for reading! Click here to get in touch.