INTUITIVE® Joins the Virtual Reality Augmented Reality Association

Intuitive Research and Technology Corporation (INTUITIVE), a professional engineering and technology services company, has joined the Virtual Reality Augmented Reality Association (VRARA) through the Central Florida Chapter.

Over the last decade, INTUITIVE has explored and invested in emerging, interdisciplinary visual solutions for advanced control and communication. The company is focused on revolutionizing visualization methods and how data is perceived, analyzed, and experienced. Its analytics and visualization software packages allow users, physicians, analysts, engineers, patients, and stakeholders to be immersed in their data.

INTUITIVE’s team of digital and technical artists, game developers, and digital and software engineers utilizes a myriad of visualization applications—including VR and AR domains—to provide immersive solutions to its Department of Defense (DoD) and commercial customers. Some of the VR/AR applications they have created include virtual experiences for viewing and interacting with medical imagery, holding multi-analyst design reviews, and exploring to-scale replicas of aircraft. INTUITIVE’s team has experience with many of the leading-edge Extended Reality (XR) technologies on the market today.

Tim Hill, INTUITIVE’s Director of Central Florida Operations, said, “With VR & AR technologies, the sky is the limit. To keep up with an ever-evolving world and the growing amounts and types of digital data, we must advance how we survey, analyze, view, and experience complex data. We are excited to join VRARA and to collaborate with a community of creators who share our passion and vision.” In addition to joining VRARA, Hill now serves on the VRARA Board of Advisors for the Central Florida Chapter.

 

INTUITIVE’s Internal Research & Development (IR&D) investments in the area of XR have resulted in a deep portfolio of widgets, toolkits, and an extensible framework that can be used to rapidly prototype and deliver VR and AR solutions. The company holds several patents on the technology and applications it has developed in this area since 2014, many with applicability for medicine and radiology. INTUITIVE’s tools are data agnostic: the methods and interfaces developed for the medical domain can easily be re-purposed for any data-intensive engineering and aerospace discipline, such as non-destructive test and evaluation, automatic target recognition, and image processing. INTUITIVE’s Senior Vice President and Chief Technology Officer (CTO), Dr. William Marx, said, “We’ve been pioneers in developing and delivering visualization solutions for many years. Our current and future IR&D investments in this area allow us to explore linear and non-linear training solutions; multi-platform and multi-player distributed collaboration; and development of natural user interface solutions that provide intuitive, innovative methods to perform data analytics within the virtual environment.”

ABOUT INTUITIVE RESEARCH AND TECHNOLOGY CORPORATION:

INTUITIVE® is an aerospace engineering and analysis firm that provides production support, software and systems engineering, programmatic support, product development, rapid prototyping, and technology management to the Department of Defense, other State and Federal Government agencies, and commercial companies. Our approach couples the latest technology with engineering expertise, analytical proficiency, and keen managerial oversight.  From design through production to sustainment, we proudly provide management and technical solutions throughout all phases of the system’s life cycle.

For more information about this topic, please contact Arlee Holmes at 256.936.4186 or email at arlee.holmes@irtc-hq.com.


HyperTunnel is part of Techstars Industries of the Future Class of 2023

This accelerator is run in partnership with Oak Ridge National Laboratory, the Tennessee Valley Authority and the University of Tennessee System. Based in the Oak Ridge-Knoxville, Tennessee, metropolitan area, the program's promising, early-stage companies are focused on emerging technologies across industries including artificial intelligence, advanced manufacturing, climate tech, future of work, mobility and clean energy technology.

“Now more than ever we need novel, innovative technologies that have the potential to solve our most pressing problems,” said Managing Director Tricia Martinez. “Governments and enterprises are looking for deeptech that can battle extreme weather events, design new batteries, capture novel energy resources, and more. Our companies for Techstars Industries of the Future are pushing the boundaries of what is possible with emerging technologies.”

HyperTunnel's mixed reality software platform enables real-time (synchronous) and time-delayed (asynchronous) collaboration between frontline workers and domain experts, connecting remote physical worksites to centralized immersive digital twins.

Sony announced a new larger Spatial Reality Display

The new 27-inch 4K model, the ELF-SR2, provides a realistic, glasses-free 3D visualization experience for professionals. For more info, reach out to thaisa.yamamura@sony.com.

PARAMUS, N.J., April 4, 2023 /PRNewswire/ -- Sony Electronics is adding a new model to its Spatial Reality Display portfolio with the addition of the ELF-SR2. At 27 inches, the larger 4K option provides highly realistic, three-dimensional content without the use of special glasses or VR headsets. Highlights of the new offering include an upgraded high-speed vision sensor, image quality enhancing technologies and installation flexibility. Additionally, the ELF-SR2 enables more robust functionality through its support of applications and development. It is optimized for industrial design, surgical planning, architecture, engineering, construction, signage, retail, software/application development, game developers and entertainment applications.

"Our original Spatial Reality Display continues to captivate customers by bringing 3D content to life with an astonishing authenticity and sophistication that demands to be experienced hands-on," said Rich Ventura, Vice President of Professional Display Solutions, Sony Electronics. "While our users love the technology, we keep hearing the same question from professionals; 'does this display come in a larger size?' As we expand into more B2B verticals and environments, I'm happy to say that in addition to a bigger screen, we've added some powerful and commonly requested features to enhance content production. These include a wider color gamut, a newly developed engine, more advanced high-speed sensors and rich support for applications and their development – all for an extremely competitive price."

At 27 inches, the 4K Spatial Reality Display provides highly realistic, 3D content without the use of special glasses.

Immersive Spatial Imagery
The display's visual fidelity is amplified by its immersive depth of field and detailed resolution, in addition to its larger size. New generation high-speed vision sensors enable high-speed processing, as well as low latency for reduced motion blur and crosstalk. 10-bit processing supports a wide color gamut that covers Adobe RGB at approximately 100% for accurate color reproduction. The super resolution engine provides upscaling from 2K to 4K. Additionally, the new model offers color moire correction to better address fine details, patterns and lines.

The ELF-SR2's enhanced facial tracking and recognition senses a viewer's eyes to provide a natural and comfortable visual experience, while the wide viewing angle enables consistency and accuracy from numerous vantage points.

Versatile Software Development Environment
The ELF-SR2 promotes application compatibility, as well as efficient and streamlined development through software development kits (SDKs). With support for leading SDKs including Unity and Unreal Engine, it also allows for development with OpenGL, DirectX 11/12 and OpenXR (coming later this year). The Spatial Reality Display enables simplified VR and AR digital content creation.

A new Spatial Reality Display App Select website will also be available, where users can easily find applications compatible with the display, as well as relevant case studies and information. In addition, Sony will offer an intuitive Spatial Reality Display Player app, which supports various 3D file formats to easily show 3D projects on the display. 

The Spatial Reality Display also allows for the use of specialized, industry-specific applications highlighted in the partnerships below.

Partnerships:

  • 3DICOM MD, Singular Health's FDA-approved 3D software has been designed to work with Spatial Reality Display and enables highly accurate, detailed, glasses-free 3D medical visualization for diagnostic applications.

  • Digital Nation Entertainment (DNE) provides end-to-end production and creative services for mixed reality and live experiences. DNE uses Spatial Reality Display to pre-visualize content from their volumetric capture studio.

  • Developed by Arcturus, HoloSuite powers digital humans by editing, compressing and streaming volumetric video for virtual production and the metaverse, as well as AR/VR viewing on the Spatial Reality Display.

  • KiksAR's platform brings real life shopping experiences to the digital world. Ultra realistic 3D configurators and personalized digital try-on experiences for jewelry and watch stores can be achieved in 3D using the Spatial Reality Display.

  • Magnopus forges design, art, and technology to create new experiences across augmented reality, virtual reality and traditional reality. They're designing 3D entertainment content specifically for Spatial Reality Display and use it in their Virtual Production pipeline.

  • Pixomondo (PXO) delivers Virtual Production and Visual Effects expertise and solutions for Film and Episodic content. PXO is using Spatial Reality Display to pre-visualize 3D assets built in Unreal Engine before final presentation on an LED volume, replacing a head-mounted display, for creators' comfort and convenience.

  • SHoP Architects uses Spatial Reality Display for client presentations and has developed a mobile app to wirelessly control and navigate 3D models on it.

  • Sketchfab is the leading platform for 3D and AR on the web. Using the Sketchfab API, Spatial Reality Display users will be able to download and view 3D models.

  • WhiteMoon Dreams, a leading independent game studio, uses Spatial Reality Display to evaluate 3D characters, creatures, textures, lighting, colors, and effects, instead of printing the 3D characters. This provides a faster and more sustainable real-time 3D evaluation.

User Friendly
The easy-to-use ELF-SR2 Spatial Reality Display's updated sensors and high-speed hardware processing provide accommodation for a wider variety of PCs to support different use cases and needs. With a detachable stand, the portable display can be installed in multiple different environments and configurations. It also accommodates the Video Electronics Standards Association (VESA) mounting standard for further flexibility and compatibility.

The ELF-SR2 is planned to be available in May 2023 in the United States and Canada through Sony's professional channels. It is expected to have an MSRP of $5,000 USD and feature a 3-year limited product warranty.

For more information on Spatial Reality Display, please visit https://pro.sony/ue_US/products/spatial-reality-displays. For new AV products and solutions from Sony Electronics, please visit https://pro.sony/prodisplaysolutions and https://pro.sony/press or follow Sony's professional business on social media: LinkedIn, Twitter, Facebook, Instagram, and YouTube.

Banuba's Face AR SDK Helps Videoshop Reach Over 20M Downloads

HONG KONG, March 30, 2023 (Newswire.com) - Users and professional reviewers alike praise Videoshop for its huge library of content (thousands of licensed music tracks and hundreds of stickers, sounds, and fonts), as well as its high performance and quick response to user feedback. Many users have updated their testimonials with accounts of how Videoshop customer support helped them solve their issues in no time.

The augmented reality capabilities, including virtual backgrounds, face touch-ups, 3D masks, and interactive effects, were provided by the Banuba Face Filters & Effects SDK/API. It is a premade module that includes a wide array of out-of-the-box functions and can be quickly integrated into an existing app or a project in development. Its features include:

  • Virtual backgrounds

  • Virtual try-on of hats, makeup, jewelry, glasses, etc.

  • 3D masks

  • Color filters (LUTs)

  • Gesture recognition

  • Trigger effects

  • Etc.

It also comes with its own rendering and scripting engines, so no additional software is required. Integrating Face AR SDK usually takes under a week.

Videoshop developers chose to work with Banuba for three main reasons:

  • Cutting-edge virtual background technology

  • Flexibility in adjusting for their needs

  • Attention to customer requirements and quick solutions to their issues.

The cooperation has been beneficial for both parties and will continue for the foreseeable future. 

"Banuba is a flexible company, willing to work with their client's budget and needs. They have a talented team of developers capable of molding the product to my needs," Joseph Riquelme, Jajijuejo Founder, said.

About Banuba

Banuba is an augmented reality company with over seven years on the market, pioneering face tracking and virtual background technologies. Its other products include a virtual try-on SDK for jewelry and glasses; Face AR SDK, a software development kit for various AR applications; and Video Editor SDK, a compact and feature-rich mobile kit for video editing.

VRdirect Studio now supports fully immersive 3D models


Existing models of premises or products, such as CAD files, can now be easily used for presentation in VR. The digital twin of a machine, for example, can be integrated into 360° videos or photos with the VRdirect Studio without much effort and explored from all angles. This enables a truly immersive 6-DoF experience on Virtual Reality headsets.

To explore 3D models in 6-DoF, simply download the VRdirect app from your VR headset's app store and enter the code f8dd68, or check out the experience directly in your browser.

(And check out our WIKI if you're wondering what 6-DoF is.)

Using 3D models in Virtual Reality offers several advantages, including:

  • Cost savings: Traditional, physical 3D modeling and prototyping can be expensive. VR modeling can reduce costs by eliminating the need for physical materials and shortening the time it takes to create and review designs.

  • Improved visualization: VR can provide a more realistic and accurate representation of 3D models compared to a screen view, as the models are viewed in three-dimensional space.

  • Immersion: People interacting with a machine in Virtual Reality can transfer those interactions to the real world much more intuitively than if they were just passively watching a training video.

  • Better marketing: Highly complex machines and systems no longer have to be shared as abstract data sheets; they are available in seconds and can be viewed from any angle.

  • Advanced customer service: Companies can walk users through the necessary maintenance work step by step and demonstrate it clearly and interactively in the virtual environment.

Read everything about the new feature here and try the VRdirect Studio for free


TINT, a Virtual Makeup Try-on Tool by Banuba, Improves its AI Beauty Advisor

AI Beauty Advisor analyzes a person’s face and recommends the most fitting cosmetics

  • The system works with users of all skin tones

  • The new version helps sell more, increases order size, and drives down returns

TINT, the most realistic makeup virtual try-on tool, has rolled out a new version of its cutting-edge AI Beauty Advisor. It now contains more streamlined algorithms that improve the user experience, as well as performance indicators for businesses.

The demonstrated results are as follows:

  • Over 200% better conversion rate

  • Up to 60% lower return rate

  • Up to 30% higher average order value


AI Beauty Advisor is an artificial intelligence system that uses data from professional makeup artists (MUAs) to analyze people and suggest the beauty products that best fit their needs. It provides unparalleled personalization for customers and brings businesses more sales.

The unique features of virtual makeup try-on TINT are:

  • Proprietary face tracking technology with 3,308 vertices;

  • Better performance and precision; 

  • AI-based seasonal color analysis;

  • AI makeup recommendations system to help every user look their best;

  • Supports trying on several products at once (to improve average order value);

  • Rollout can be conducted in 2 weeks;

  • Adding new products takes up to 48 hours.

TINT is a ready-made application for companies in the Beauty industry developed by Banuba. It is made up of a lifelike makeup try-on, a cutting-edge AI recommendation system, and an intuitive UI. TINT is web-based, which means that a user can access it on almost any connected device without downloading anything. This allows brands to reach the largest potential audience.


WebXR.tools in full display with the VRARA Seattle Chapter

Last week, Hermes Frangoudis, CTO of Arlene.io, shared with us the now publicly available WebXR.tools suite of products. During the 30-minute online meet, we learned about the Seattle Chapter and what Hermes and his team at Arlene have put together to democratize WebXR design, development and even publishing. In a matter of 20 minutes, three XR experiences were created and made available for anyone to use.
The recording is now available on the VRARA Seattle YouTube channel.

Central Florida Immersive Technology Summit Interview

News Channel WESH2 interviewed Dr. Haifa Maamar of Full Sail University regarding their partnership with the VRARA for the upcoming Central Florida Immersive Technology Summit. Click the link below to watch the full interview and register today for the summit!


https://www.wesh.com/article/central-florida-immersive-technology-summit/43428661

Bob Cooney appointed as Co-Chair of VR/AR Association Location-Based Experiences Committee

We are thrilled to have Bob Cooney help lead our community for LBE! Join our weekly Online Meets!

For more than 35 years, Bob has accurately predicted the impact of bleeding-edge tech on entertainment business trends. He's the founder of an INC 500 company with a successful NASDAQ IPO to his credit, and he has emerged as one of the leading voices in the virtual reality and metaverse space.

Cooney is the go-to-market strategist behind the launch of some of the world's most successful location-based VR companies. He is the author of the book Real Money from Virtual Reality, and co-author of The Guardians of the Metaverse, a think-tank report from Laval Virtual.

Bob is widely considered the world's foremost expert on location-based virtual reality, having founded the VR Arcade Game Summit, the biggest dedicated location-based VR conference in the US.

I am honoured to have been invited to chair the LBE committee for the VRARA. I am excited to support this group of experts and work together to advance the VR industry’s offerings in location-based entertainment. Together, we will explore new technologies, innovative ideas, and best practices, all aimed at delivering high-quality and engaging experiences for consumers.
— Bob Cooney

Showing the true beauty of Bozeman, Montana through Virtual Reality

Outer Realm is thrilled to have been a part of Paine Group's upcoming Condo Development, Six Range, in beautiful Bozeman, Montana. We are excited to have helped bring this development to life by creating immersive virtual reality tours that allow potential buyers to experience the community from anywhere in the world.

Six Range is an exclusive mixed-living housing community that combines Scandinavian design and architecture with the stunning natural beauty of the Mountain West. With plenty of green spaces and a focus on environmental consciousness and community-centric living, Six Range is an ideal place to call home.

As we worked on this project, Outer Realm used our drone photography skills to capture breathtaking views of the Bozeman mountains. It allowed us to showcase the natural beauty of the area and create a truly immersive experience that would help potential buyers visualize their future home in Six Range.

Our team at Outer Realm is proud to have been a part of this project. We put our heart and soul into creating virtual reality tours that bring the community to life and help potential buyers envision themselves living in this beautiful development. We're excited to see that our efforts have attracted many visitors eager to experience the community before it's even built.

Internet of Behaviour (IoB) implications for XR analytics

IoB toolkit unlocks human behaviour in XR

The Internet of Behaviour (IoB) is a technology trend with significant opportunities and benefits for Extended Reality (XR). It adds a layer of psychology to the Internet of Things (IoT) to interpret ‘digital dust’ into behavioural patterns. The ultimate goal is to connect devices, analyse user behaviour and influence future behaviours. Gartner predict 40% of the world’s population will be subject to at least one IoB program in 2025 (from marketing to healthcare to manufacturing). Let’s dive into the IoB implications for AR, VR and the Metaverse on our mission to understand how humans behave in 3D experiences with Six Degrees of Freedom (6DOF).

What is the Internet of Behaviour (IoB)?

IoB is a protocol that uses sensors to collect behavioural data from devices. In our case, these are Augmented Reality and Virtual Reality devices (mobile and head-mounted displays). Data is processed using Cloud Computing, Artificial Intelligence (AI), Machine Learning (ML) and Data Science to extrapolate human behaviour. It represents the next level of Business Intelligence (BI) in Extended Reality to understand and optimise behaviour. It's therefore a critical component of an interoperable, synchronous and persistent Metaverse with shared experiences. Over time, predictive analytics on human behaviour in XR will drastically improve user experiences and business performance metrics.

How does the Internet of Behaviour work?

IoB connects AR and VR devices to cloud services like Amazon Web Services (AWS). APIs collect sensor data (gyroscope, accelerometer etc.) in a real-time data exchange. Data is stored in databases and analysed by AI algorithms to create an intelligence engine capable of understanding user behaviour patterns. 6DOF spatial data on head, hand and body movements in 3D space is a complex dataset; however, Cognitive Science (and significant R&D) interprets XYZ coordinates into insights which help XR businesses understand their audiences. Our approach to the IoB tech stack is an end-to-end data analytics platform which does all the heavy lifting so customers, regardless of data analytics expertise, can quickly see data visualisations and measure performance goals.
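To make the data flow concrete, here is a minimal Python sketch of the kind of 6DOF sample batch a pipeline like this might collect on-device and forward to a cloud ingestion endpoint. The field names, session metadata and transport are illustrative assumptions, not CORTEXR's actual schema or API.

```python
# Minimal sketch (illustrative only): the shape of a 6DOF sensor payload an
# IoB pipeline might batch on-device and forward to a cloud analytics
# endpoint. Field names and session metadata are assumptions.
import json
import time
import uuid

def make_sample(position, rotation_quat, sensor="hmd"):
    """One 6DOF sample: where the device is and how it is oriented."""
    return {
        "t": time.time(),           # capture timestamp (seconds)
        "sensor": sensor,           # e.g. "hmd", "left_hand", "right_hand"
        "position": position,       # [x, y, z] in metres
        "rotation": rotation_quat,  # quaternion [x, y, z, w]
    }

def build_batch(samples, session_id=None):
    """Wrap raw samples with the session metadata a backend would need."""
    return {
        "session_id": session_id or str(uuid.uuid4()),
        "device": {"platform": "standalone_hmd", "sdk": "engine-plugin"},
        "samples": samples,
    }

if __name__ == "__main__":
    batch = build_batch([
        make_sample([0.0, 1.6, 0.0], [0.0, 0.0, 0.0, 1.0]),
        make_sample([0.1, 1.6, -0.2], [0.0, 0.087, 0.0, 0.996]),
    ])
    payload = json.dumps(batch)
    # In a real integration this payload would be POSTed to the vendor's
    # (hypothetical) ingestion endpoint; here we just show its shape.
    print(payload[:120], "...")
```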

What are the benefits of IoB for the XR industry?

Industry research shows that 83% of XR companies think data analytics are important to grow their business, but the study also highlights a measurement gap. This is partly because AR/VR experiences generate unique and unstructured user behaviours (compared to the Internet of clicks, taps and likes) and partly due to reliance on legacy toolkits (e.g. web analytics, surveys etc.). IoB is a new approach specifically designed to a) collect real-time 6DOF data from any platform or device, b) quickly analyse behaviours in 3D experiences to improve performance and productivity and c) standardise XR metrics to make predictions about future actions based on past behaviours. From Navigation (the position, pathway, priority and order of events) to Attention (the time, sequence and priority of content viewed), there are benefits for all Extended Reality use cases.

How do you apply IoB to your XR business?

A plug-in for 3D engines and XR platforms (no code required) collects real-time sensor data from handsets and headsets. Our intelligence engine analyses 6DOF spatial data in the cloud using AI and Cognitive Science. Prebuilt dashboards present performance metrics and data visualisations across all your AR, VR and Metaverse projects. Our Unity plug-in is live (with more in the pipeline) so you can easily add IoB to your projects now. The Internet of Behaviour is a new framework to better understand your users, identify behaviours which aren’t necessarily obvious and to improve business performance. It’s a great diagnostic tool and is also proven to be predictive of behaviour. 

What are the opportunities for IoB and XR?

Whilst other sectors have been applying IoB to digital marketing, IoT, location services etc for a few years, the opportunity for the XR industry is still nascent. CORTEXR was launched to specifically address this knowledge gap. Our guiding mission is to understand human behaviour in Extended Reality through the lens of psychology and cognitive science. From healthcare and entertainment to education and training, our customers can tap into a wealth of spatial data. How do health conditions like dementia correlate with spatial diagnostics? How do e-commerce virtual try-ons correspond with purchase intent? Does training module behaviour correspond with learning attainment? Our XR data analytics platform helps unlock these opportunities. Drop us an email at contact@cortexr.com if you’d like to collaborate!

What are the data privacy concerns of IoB?


Data privacy is, quite rightly, a crucial debate for the broader tech industry. People are familiar with synchronising their devices and granting permissions. However, biometric fingerprints of faces and eyes are particularly intrusive, and tech company practices in digital marketing have required government regulation, e.g. GDPR. Extended Reality and IoB are therefore vulnerable to ethical questions about how data is collected and used at scale. CORTEXR has a policy of deliberately avoiding Personally Identifiable Information (PII). Raw data on 6DOF behaviour doesn't collect PII, basic device IDs only measure user numbers and GPS data is restricted to guarantee anonymity. We don't need (or want) PII data to deliver our XR data analytics services. Long may this continue.

Visit CortexR website

Contact: Jonathan Barrowman, Email: jonathan@cortexr.com

Pinnacle’s UH-60A Virtual Maintenance Trainer Used in Portuguese Air Force Training

Pinnacle Solutions, Inc. (Pinnacle) has partnered with Arista Aviation Services, LLC (AAS), to develop and instruct a custom maintenance course for UH-60A aircraft, system, and maintenance task training of Portuguese Air Force technicians using Pinnacle’s UH-60A Virtual Maintenance Trainer (UH-60A VMT). The first portion of an eight-week course is being held at Pinnacle’s Corporate Office facility in Huntsville, Alabama in February and March of 2023, where an electronic classroom serves as the course venue. The second and third iterations of UH-60A VMT maintenance courses will be held at the Customer’s facilities in Portugal in 2024 and 2025.

The proposed classroom instruction is designed to provide maximum use of Pinnacle’s UH-60A VMT and is complemented by customized lessons focused on Avionics maintenance tasks. Upon completion of the UH-60A VMT Maintenance Course, and in combination with several weeks of On-the-Job-Training conducted by AAS, the Portuguese Air Force technicians will be familiar with the Before Flight Inspection procedures that are focused on Preventive Maintenance Servicing and will be prepared to complete maintenance tasks connected to receiving the aircraft in Portugal.

Beyond the classroom style training, the use of the VMT in daily operations is effective in the training of advanced aircraft maintenance, greatly reduces the need for scheduling an actual aircraft to support maintenance training, and supplements hands-on training. As part of Pinnacle’s subcontract to AAS, the UH-60A VMT delivery to the Customer’s facility in Portugal is currently scheduled for the 1st quarter of 2024.

Banuba, an Industry-Leading Augmented Reality Company, Joins VR/AR Association

HONG KONG, March 15, 2023 (Newswire.com) - Banuba, a leading computer vision company, has become a member of VR/AR Association — a global organization uniting the most prominent companies in the mixed reality industry. Its members' list includes Meta, ByteDance, Siemens, Lenovo, Bosch and many other high-profile AR/VR public companies.

VR/AR Association membership will allow Banuba to participate in weekly meets and industry-specific events (e.g. Healthcare Forum and Education Forum). It will also let Banuba join a number of committees engaged in developing best practices for certain verticals.

Banuba is a pioneer in Face AR, leveraging its patented technology, AI, and effective AR digitization systems for eCommerce, Telecommunications (including video conferencing), and Social Networking.

Banuba has also made notable gains in hand tracking: detecting the precise location of a human hand in a picture or video. This technology allows users to accurately try on rings, bracelets, watches, and other hand jewelry, as well as control electronic devices with gestures.

Banuba offers three products, all of which use AR extensively.

  1. Face AR SDK. A premade module that can quickly be integrated in a web, desktop, or mobile application and perform a number of functions: add 3D masks, replace backgrounds on videos and images, allow for virtual try-on of jewelry, headwear, glasses, etc. Besides a blanket decrease in time-to-market, Face AR SDK helps increase conversion rate for eCommerce businesses, boost camera enablement and engagement for video conferencing companies, and enable unique features for other industries. Face AR SDK helps apps get over 100 million sessions with AR features every month.

  2. Video Editor SDK/API. A full-fledged mobile video editor with both core (trimming, sound editing, etc.) and advanced features, including AR filters and effects. Installing it helps release the app much quicker than building similar functions from scratch. VideoEditor SDK has one of the simplest integration processes on the market. In practice, it can cut an app's time-to-market by up to 50%. Besides, making video creation more convenient raises the app's k-factor and the amount of user-generated content.

  3. TINT. A cutting-edge virtual try-on solution for beauty products. It boasts the most realistic virtual cosmetics on the market, automated seasonal color analysis, and ultra-fast digitization of new products (up to 48 hours for an entire collection). As a result, it can massively boost sales (up to +200%) and decrease returns (up to -60%).

Banuba has more than 20 patents in AR and AI. It employs over 50 artificial intelligence and augmented reality experts and is known for offering the most advanced AR Beauty and Tech Virtual Try-on experience.  

Contacts

Email: info@banuba.com

Website: https://www.banuba.com/

Banuba on LinkedIn: https://www.linkedin.com/company/banuba-development/

Viva VOsCE, which stands for Virtual Objective Structured Clinical Examinations, has been accepted for funding by Innosuisse, the Swiss Innovation Agency.

Viva VOsCE, a revolutionary project aimed at transforming medical education by harnessing the power of Virtual Reality (VR) technology, has been accepted for funding by Innosuisse. The project is being conducted in partnership with the Geneva University Hospitals (HUG) and the University of Bern, with ORamaVR S.A. as implementation partner.

Objective Structured Clinical Examinations (OSCE) are critical for the practical examination and licensing of medical students. However, traditional OSCE exams can be expensive to administer and often require significant logistical coordination. Viva VOsCE will deliver a Virtual Reality platform to assist medical schools in delivering and assessing OSCEs while significantly lowering the cost and overhead. With VOsCE, medical students will be able to demonstrate their clinical skills in a fully immersive Virtual Reality experience that mimics real-world scenarios.

“We are delighted to take part in this innovation project with our collaborators at the University of Bern and ORamaVR. VR can have an immediate impact on medical training and assessment, and we can democratize access to such resources through the Viva VOsCE project.”
Dr. Oliver Kannape, Director of the Virtual Medicine Centre (HUG)

"We are looking forward to the exciting collaboration that will enable students to prepare for OSCE exams more efficiently and realistically and to scientifically evaluate Virtual Reality as a supplement to assessments.”
Prof. Thomas Sauter, Head of Emergency Telemedicine University of Bern and Virtual Inselspital Simulation Lab

“We are excited to support the Viva VOsCE project with our MAGES platform, which aims to revolutionize the next generation of structured clinical examinations with virtual reality”
Prof. George Papagiannakis, CEO & Co-Founder of ORamaVR


More about Geneva University Hospitals:
The Virtual Medicine Centre (VMC) is a transversal center that provides the HUG with the competencies and the technological infrastructure for bringing fundamental and applied XR to the clinical environment – for the benefit of the patient. The VMC enables technology development for research, diagnostics, and medical training in collaboration with academic and industry partners.

More about University of Bern:
The Virtual Inselspital Simulation Lab, a national and international center of excellence for Medical Extended Reality in German-speaking countries, focuses on research with and about Medical Extended Reality. VISL is involved in the education and training of all healthcare professionals. The Institute for Medical Education at the University of Bern develops, implements and evaluates examination formats to assess the competencies of medical trainees in order to ensure the best possible education.

More about ORamaVR:
ORamaVR was created to tackle a major health crisis that is currently affecting almost 5 billion people globally: the lack of access to affordable surgical care. We aim to accelerate the world’s transition to medical VR training by democratizing VR metaverse content creation and offering a low-code authoring platform (MAGES-SDK) to medical organisations, enabling the mass production of high-fidelity medical Virtual Reality simulations at 1/8th of the cost and time of current practices. These medical VR training simulations are utilized by hospitals, medical device companies, medical schools and medical training centres to train and assess their medical professionals on current and new surgical, diagnostic or therapeutic techniques.


Media Contact :
Amalia Kargopoulou
Head of Business Development, ORamaVR
Email: amalia.kargopoulou@oramavr.com

“IMT & VRARA COLLABORATION” Immersive Technologies - Innovation in Education, Training and Game Design

International MSc program

We are proud to announce the collaboration of the IMT MSc Program with the VR/AR Association!

The Immersive Technologies - Innovation in Education, Training and Game Design (IMT) MSc program of the Department of Computing Science of the International Hellenic University (IHU) is an international MSc that covers a unique thematic area in Greece, with a very high degree of innovation in Europe, focusing on cutting-edge technologies such as Augmented Reality, Virtual Reality, and Mixed Reality, with an emphasis on Education, Training, and Game Design.

The MSc program is offered worldwide as it has been structured to correspond to modern scientific and technological needs within the constantly evolving socioeconomic context, employing Distance Learning and Blended Learning educational approaches. The students and the Academic Staff of the IMT MSc Program come from different countries as the program is offered online, and all courses are taught exclusively in English.

The VR/AR Association (VRARA) is an international organization designed to foster collaboration between solution providers and end-users that accelerates growth, fosters research and education, helps develop industry best practices, connects member organizations, and promotes the services of member companies.

The collaboration between the IMT MSc Program and the VR/AR Association aims to achieve several objectives.

Firstly, it aims to bring together experts from academia and industry in the fields of Virtual Reality (VR) and Augmented Reality (AR) to promote knowledge sharing, best practices, and the latest developments in immersive technologies.

Secondly, the collaboration seeks to provide IMT MSc Program students with access to a network of VR/AR professionals, which will enable them to gain hands-on experience and practical knowledge in the development and application of immersive technologies.

Thirdly, the collaboration aims to facilitate research and innovation in the field of VR/AR, promoting joint research projects and encouraging the development of new applications and tools that can benefit both academia and industry.

Finally, the collaboration seeks to promote the use of VR/AR technologies in education, training, and game design at the Master or PhD educational level,  by highlighting best practices and success stories in these areas and by fostering the development of new educational and training solutions that incorporate immersive technologies.

           
  Professor Dr. Avgoustos Tsinakos,

Director of Advanced Educational Technologies and Mobile Applications  (AETMA) Lab

Director of MSc in Immersive Technologies - Innovation in Education, Training and Game Design

Department of Computer Science

International Hellenic University


CORTEXR data analytics research highlights measurement gap to grow industry

CORTEXR, the XR data analytics company, has published the results of a global study into Extended Reality and Metaverse measurement. Research shows the industry is maturing quickly, but the computing platform shift raises questions about the role of XR in people’s lives and how businesses demonstrate value. Specifically, how does the industry standardise Extended Reality measurement with data analytics to optimise human behaviour in a post-screen world?

83% think XR data analytics are necessary to grow their business 

There is clear appetite for standardised XR data analytics as the industry matures; however, a measurement gap exists between the understanding of user behaviour and the business impact of Extended Reality. Most people know that data analytics have not kept pace with creative and technical developments, as they're mainly based on legacy metrics designed for the computer mouse and mobile touchscreens, which don't capture the unique nature of 3D experiences.

“It’s vital for our industry to be able to report on the success of campaigns and attribute that value back to the brand.”

“We struggle most of the time to find ROI metrics linked to increased sales.”

“We want to quantify immersion but don’t know the data fields we can use.”

“We need to have more of the data that shows people that XR is more effective than other approaches.”

Current XR measurements are broad and based on legacy metrics 

This XR and Metaverse research uncovers the data analytics currently being used, or the tools people are aware of. Current practices are broad, with many measurement techniques driven by legacy measures. Interestingly, many approaches reflect those used in the consumer research sector, with metrics adopted from either consumer research or academic psychological practices, which are frequently misused or misinterpreted.

Time 

Time spent in an XR experience is the most common single metric, with people saying it indicates enjoyment and willingness to stay in the experience, or serves as a measure of experience complexity. For Gaming, the length of time spent and break points were important; however, for Education, time could be an indicator of learning speed, task difficulty or confidence in completing a task. Time therefore needs to be combined with other measures to be interpreted meaningfully.

Dwell Time on specific elements of an XR experience is considered a measure of interest in specific content, such as an area of a VR scene or a specific element of an AR experience. It's important not to confuse this with eye-tracking technology, which also uses Dwell Time metrics but bases them on eye-fixation measures. More on eye-tracking later.
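As a concrete illustration of the dwell-time idea, the Python sketch below totals how long each content element stayed in view from a timestamped log. The log format is a made-up example for this article, not any particular toolkit's output.

```python
# Illustrative only: dwell time per content element from a timestamped log
# of which element was in the user's view. The log format is an assumption.
from collections import defaultdict

def dwell_times(view_log):
    """view_log: time-ordered list of (timestamp_seconds, element_id or None).
    Returns total seconds each element was in view."""
    totals = defaultdict(float)
    for (t0, element), (t1, _next_element) in zip(view_log, view_log[1:]):
        if element is not None:
            totals[element] += t1 - t0
    return dict(totals)

log = [(0.0, "poster_a"), (2.5, None), (4.0, "poster_b"),
       (9.0, "poster_a"), (11.0, None)]
print(dwell_times(log))  # {'poster_a': 4.5, 'poster_b': 5.0}
```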

Google Analytics (or similar) 

The most common data capture method is Google Analytics (or similar), which is used by 26% of respondents in this research study to track visits/user numbers, user ID, location, dwell time, clickthrough and social media activity. Time is also tracked by Google Analytics; however, it's treated separately here as there are some XR-specific solutions, most notably in VR. This legacy approach doesn't deliver metrics unique to XR and Metaverse measurement and doesn't demonstrate the value of 3D experiences above and beyond traditional 2D digital media.

“Industry processes and KPIs just go down old paths of metrics and there’s lack of maturity in use of data collection.”

Bespoke feedback 

Specific feedback points coded into the experience (i.e. certain actions were tracked) are used to test when actions were completed and whether, for example, an action had been done within a certain time frame. They also indicate which features of an experience were being used, showing where users spent their time and the order of actions. These are unique to each XR project, so they can't be standardised across other XR experiences.

“If an experience has specific triggers we need to know how long it takes the user to notice them and then interact with them.” 

Customer survey and polling data (after the event) can be useful to measure overall experiences; however, they rely too heavily on self-report, which Behavioural Science has shown does not capture real behaviour. Self-report can also be unreliable when the influences over behaviour are ones where users are unreliable witnesses to their own actions.

“With survey data you can’t get all the answers, as there are some questions that can’t be answered.” 

Specific questions coded into the experience collect feedback at a particular time, or in response to a specific prompt; these are unique questions that correspond to specific actions within that XR experience. Pre/post XR measurement tests an ability, skill or knowledge before and after using the XR experience, demonstrating any change in that measure. These tests are tailored towards particular goals, such as learning a specific task or skill; however, this kind of feedback isn't diagnostic, so it doesn't measure how and why the XR experience changed behaviour. The main interest is in demonstrating that performance has changed.

Amongst survey questions there are some generalisable questions, but they are used infrequently and mostly focus on the specifics of what the task had been designed to do. Hence feedback is mostly bespoke: action measures and questions related to outcomes specific to that experience. The main aim for companies is demonstrating the value of their experience by focusing on the objectives the XR experience was designed to address. As standardised metrics these are limited, so such bespoke metrics can't be scaled to generate a business case for XR. Targeted data showing performance has a role, but there is little diagnostic data showing why the nature of an XR experience was superior at delivering results.

Heatmaps 

Tracking 6DOF movement in 3D space – the solution which CORTEXR delivers – is still relatively uncommon even though spatial data is reported as being informative about user behaviour. Data visualisations using heatmaps to represent the magnitude of movements provide clear interpretation of complex data, with use cases relevant to all XR sectors. Heatmaps are primarily being used on both AR and VR projects to measure user attention towards specific content in order to optimise UX and prove effectiveness. Whilst heatmaps are commonly used to measure attention on websites, applying complex AR and VR data analytics is an underdeveloped XR and Metaverse measurement approach in the industry.
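As an illustration of the kind of aggregation a movement heatmap rests on, the following Python sketch bins head positions into a floor-plan grid. The bin size, extent and choice of the X-Z plane are arbitrary assumptions for the example, not a description of CORTEXR's implementation.

```python
# Minimal sketch: turn 6DOF position samples into a 2D heatmap grid
# (floor-plan view). Bin count and extent are arbitrary illustration values.
import numpy as np

def position_heatmap(positions, bins=20, extent=(-5.0, 5.0)):
    """positions: (N, 3) array of [x, y, z] samples in metres.
    Returns a bins x bins grid of sample counts over the X-Z plane."""
    xs, zs = positions[:, 0], positions[:, 2]
    grid, _, _ = np.histogram2d(xs, zs, bins=bins, range=[extent, extent])
    return grid

rng = np.random.default_rng(0)
samples = rng.normal(loc=[0.0, 1.6, -1.0], scale=[1.0, 0.05, 1.0], size=(5000, 3))
heat = position_heatmap(samples)
print(heat.shape, int(heat.sum()))  # (20, 20) and the number of in-range samples
```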

Biometrics 

Heart Rate, Galvanic Skin Response, Respiratory Rate and Pupil Dilation are methods used by a small group of companies; however, the usage reflects the kinds of claims made in other sectors. Biometric measures are commonly used within psychological research, but the best and most comprehensive definition of what these metrics measure is usually termed arousal, which can be both positive (joy) and negative (fear). The pattern of certain kinds of feedback has been used to measure factors such as attention, but specialised equipment is generally needed for any accurate assessment. The data is commonly 'noisy', hence large data samples are needed for any clear interpretation. In the consumer research sector, these measures have been claimed to capture a wide number of cognitive and emotional processes, but little empirical evidence exists that they measure what some claim they do!

Eye tracking 

Eye-tracking is used by a small number of companies to provide a strong indicator of visual attention, and it's a successful technique across multiple sectors. Eye-tracking indicates where an experience is being visually attended, the order in which things are viewed and the dwell time on specific elements in the XR experience. Accurate eye-tracking requires specialist equipment for HMDs, so it is not readily scalable. Some solutions based on front-facing cameras also exist; however, there are no precise claims for their accuracy beyond those quoted by the manufacturers of the technology, hence verifying efficacy is difficult.
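For readers curious how fixation-based dwell time is derived from raw gaze samples, below is a generic dispersion-threshold (I-DT style) pass in Python. It is not any vendor's eye-tracking pipeline, and the dispersion and duration thresholds are placeholder assumptions.

```python
# Sketch of a standard dispersion-threshold (I-DT style) fixation pass over
# 2D gaze samples; thresholds are placeholder values, not calibrated ones.
def detect_fixations(gaze, max_dispersion=0.03, min_duration=0.1):
    """gaze: time-ordered list of (t, x, y) samples (x, y normalised).
    Returns (start_t, end_t, centroid_x, centroid_y) per fixation."""
    fixations, window = [], []
    for sample in gaze:
        window.append(sample)
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # the new sample broke the window: keep the stable part if long enough
            if len(window) > 1 and window[-2][0] - window[0][0] >= min_duration:
                kept = window[:-1]
                cx = sum(s[1] for s in kept) / len(kept)
                cy = sum(s[2] for s in kept) / len(kept)
                fixations.append((kept[0][0], kept[-1][0], cx, cy))
            window = [sample]
    if len(window) > 1 and window[-1][0] - window[0][0] >= min_duration:
        cx = sum(s[1] for s in window) / len(window)
        cy = sum(s[2] for s in window) / len(window)
        fixations.append((window[0][0], window[-1][0], cx, cy))
    return fixations

gaze = [(0.00, 0.50, 0.50), (0.02, 0.51, 0.50), (0.04, 0.50, 0.51),
        (0.12, 0.51, 0.51), (0.14, 0.90, 0.20)]
print(detect_fixations(gaze))  # one fixation around (0.505, 0.505)
```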

Facial coding 

Although this wasn’t mentioned in the study, facial coding (where facial muscle movements are tracked using AI) is used by some companies. These data points are similar to the front-facing cameras used in (non-XR) advertising testing. They detect facial responses and infer a reaction; however, Scientific American (2022) points out that these systems only detect muscular facial movements, not their cause. Crucially, there is no linear relationship between facial movement and emotional condition, so the accuracy of interpreting internal mental states is questionable. These systems may produce some useful results, but the general view is that there is not a 1:1 correlation between facial muscle movements and specific emotions.

XR community want to better understand user experiences 

The panel of experts and companies taking part in this XR and Metaverse research highlighted the measurement methods they need to improve their understanding of user behaviour. The first category is metrics which are clearly defined and readily implementable. The second is requests for metrics which aren't as easy to define and where people had varying ideas about the practical applications; this second category is therefore inconclusive in terms of practical use.

Clearly defined metrics with a clear pathway to measurement 

Heatmaps and Attention 

Attention measures were the most requested metric. This includes heatmaps and data visualisations which indicate attention over time, where user attention was directed and when people looked at specific elements. There were a number of reasons given for this. In Education and Training, it is the ability to track whether attention is being drawn to the correct place, or whether attention is being directed towards an event that is critical for learning to take place. In other sectors, attention is important to indicate which parts of an experience need the most consideration in terms of aesthetics and detail, and which parts are less important because they are commonly overlooked by users. Attention therefore correlates with awareness of elements within an experience as well as being a strong indicator of user interest.

“You can get heatmaps for website use, but it would be helpful to have the same capability within an XR experience.” 

“Where they look at the most, gaze tracking, is important, especially in marketing, and you want to understand attention and feed back to what they experience.”

Telemetry and Navigation 

Measuring the movement of the device (handset or headset) to track user behaviour in 3D space is important. Understanding yaw and rotation, the direction or orientation, is considered essential to measure a user's primary visual field. Additionally, metrics on navigation around an object or location, as well as direction of travel, are seen as valuable. The key objective is to understand whether people can intuitively navigate the experience. Telemetry can also capture reactions, interactions and how users move around and through XR experiences.
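The sketch below shows two basic telemetry derivations of this kind: heading (yaw) from an orientation quaternion and total path length from position samples. The quaternion convention (x, y, z, w with a Y-up vertical axis) is an assumption for the example, not a specific SDK's API.

```python
# Hedged example of basic telemetry derivations: yaw (heading) from a device
# orientation quaternion and total path length from position samples.
# Quaternion convention assumed: (x, y, z, w), Y axis vertical.
import math

def yaw_degrees(qx, qy, qz, qw):
    """Heading around the vertical (Y) axis, in degrees."""
    siny_cosp = 2.0 * (qw * qy + qx * qz)
    cosy_cosp = 1.0 - 2.0 * (qy * qy + qx * qx)
    return math.degrees(math.atan2(siny_cosp, cosy_cosp))

def path_length(positions):
    """positions: list of (x, y, z) samples; returns metres travelled."""
    total = 0.0
    for p0, p1 in zip(positions, positions[1:]):
        total += math.dist(p0, p1)
    return total

print(round(yaw_degrees(0.0, 0.7071, 0.0, 0.7071)))          # ~90 degrees
print(path_length([(0, 1.6, 0), (1, 1.6, 0), (1, 1.6, 2)]))  # 3.0 metres
```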

Tracking responses 

Bespoke feedback to assess specific actions within experiences, with reactions measured via an integrated questionnaire, supports current qualitative feedback approaches. As mentioned earlier, whilst this measurement technique is valuable on custom projects, it doesn’t deliver standardised metrics for the industry.

Time, View Count and User ID

Whilst the standardised metrics available in Google Analytics are commonly used, they rank low among the priority metrics companies need to advance their understanding of user behaviour. This is symptomatic of the data analytics available to AR/VR creators and highlights the importance of progressive data analytics like CORTEXR, which go beyond legacy tools.

Not clearly defined metrics with unclear measurement method 

Engagement 

The definition of engagement differs depending on who you speak to, ranging from enjoyment through to interest and emotional reaction. This demonstrates the breadth of XR practitioner backgrounds and the need for the industry to agree a definition which doesn't go down the same path as legacy definitions in digital marketing. Engagement, for example, could be measured by the amount of movement in an XR experience, based on the volume or distance travelled in a VR environment and the magnitude of movement around an AR experience, as in the sketch below. We have live tests to assess how people engage with XR experiences to see if high movement levels can be associated with high engagement.
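As one illustration of such a movement-based proxy (and only that, since no industry definition exists yet), the following sketch approximates the volume of space a user explores by counting the distinct voxels their head position passes through. The 0.5 m voxel size is an arbitrary assumption.

```python
# Illustrative engagement proxy: the volume of space a user actually
# explores, approximated by counting distinct voxels visited. The 0.5 m
# voxel size is an arbitrary assumption, not an agreed industry definition.
def explored_volume_m3(positions, voxel=0.5):
    """positions: iterable of (x, y, z) in metres; returns explored volume in m^3."""
    visited = {
        (int(x // voxel), int(y // voxel), int(z // voxel))
        for x, y, z in positions
    }
    return len(visited) * voxel ** 3

path = [(0.0, 1.6, 0.0), (0.3, 1.6, 0.1), (1.2, 1.6, 0.4), (2.4, 1.5, 1.9)]
print(explored_volume_m3(path))  # 0.375 m^3: three distinct 0.5 m voxels
```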

Assessing Behaviour 

There’s a clear desire to assess overall patterns of behaviour in XR experiences; however, the exact behaviours are not defined and are likely to differ by sector and by experience. It is, from a cognitive science perspective, possible to assess behaviour with navigation and attention measures.

Immersion 

Interestingly, immersion was rarely mentioned in this study, but we think it's an essential benchmark metric for the immersive tech industry to quantify. The unique nature of AR, VR and Metaverse experiences requires these phenomenological experiences to be fully understood, which is why CORTEXR launched our Immersion Index after three years of R&D and live tests with beta customers.

Ease of use 

This measure was prioritised by a small number of XR creators, probably because its definition is subjective; however, the suite of metrics provided by CORTEXR allows you to understand ease of use and optimise experiences.

XR metrics most likely to drive future business value 

What is clear from this XR and Metaverse research is that – whilst the industry have lots of ideas about the metrics which will advance AR, VR and Metaverse measurement – the definitions and methodologies aren't always consistent. A set of closed multiple-choice questions to surface the metrics most important to the XR community did, however, highlight attention and navigation as standardised metrics.

Attention 

Measuring attention to specific content elements and assessing which content areas are viewed the most (i.e. what people look at) is important to the XR community. This reinforces the fact that attention measures are also the highest priority in terms of perceived value. It is seen as the best way to assess the level of interest given to content in the experience and to optimise content development, as it shows which elements are the most engaging. Attention is also highly valuable as a metric to demonstrate business value, e.g. brand activity in the experience.

Navigation 

Analysing navigation around the experience, the user journey and the sequence of user events (i.e. how people move in 3D environments) is also a highly rated metric. This supports telemetry as a measurement technique with a clear benefit to the XR industry. Measuring user position and pathways is essential to understanding how people travel through VR environments and navigate AR objects or portals. Heatmaps are considered the most valuable type of data visualisation to interpret and understand user journeys.

Immersion 

Understanding immersion involves measuring an internal mental state. Immersion and sense of presence are very similar: presence has a subjective–phenomenal interpretation (i.e. people ‘feel’ like they are in the virtual environment rather than the real world) and an objective–functional interpretation (the ability to interact). Presence in VR is telepresence (i.e. people feeling they are transported into a virtual place), while the sense of presence in AR is the effect of content overlaid onto the real world. Our Immersion Index algorithms compare movement within a particular project to baseline databases to score the overall level of immersion.
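For illustration only, here is a generic version of that comparison idea: scoring a session's movement features against a baseline distribution with z-scores and squashing the result to a 0-100 index. This is not CORTEXR's actual Immersion Index; the features and weighting are assumptions made for the sketch.

```python
# Generic sketch of a baseline-comparison score (NOT CORTEXR's Immersion
# Index): z-score a session's movement features against baseline sessions,
# then squash the average to a 0-100 index. Feature names are assumptions.
import math
import statistics

def immersion_score(session_features, baseline_sessions):
    """session_features / baseline_sessions: dicts of feature -> value,
    e.g. {"path_length_m": 42.0, "head_rotation_deg": 3100.0}."""
    zs = []
    for name, value in session_features.items():
        history = [b[name] for b in baseline_sessions]
        mu = statistics.mean(history)
        sigma = statistics.pstdev(history) or 1.0   # avoid divide-by-zero
        zs.append((value - mu) / sigma)
    mean_z = sum(zs) / len(zs)
    return 100.0 / (1.0 + math.exp(-mean_z))        # logistic squash to 0-100

baseline = [{"path_length_m": 30.0, "head_rotation_deg": 2000.0},
            {"path_length_m": 50.0, "head_rotation_deg": 2600.0},
            {"path_length_m": 40.0, "head_rotation_deg": 2300.0}]
session = {"path_length_m": 55.0, "head_rotation_deg": 3000.0}
print(round(immersion_score(session, baseline), 1))  # high score, above baseline
```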

Surveys 

Questionnaires (bespoke feedback) in AR and VR experiences are considered the best way to address specific answers to questions about custom brand or user objectives. 

Standardised metrics are required for XR industry to grow 


This global XR and Metaverse research demonstrates a consistent pattern. Companies are creating a broad range of creative solutions for clients and end users; however, beyond human interpretation, there is little consensus on the definition of a good AR or VR experience. Critically, there isn't a consistent data analytics approach or agreement on metrics which can be applied across companies, sectors and different types of XR experiences.

This is an essential challenge if the industry is to compete with traditional forms of ‘media’, which have established measurement standards to demonstrate ROI. There is therefore a significant gap in the ability of XR creators to demonstrate business results and benchmark one experience against another. For the XR industry to flourish, standard measures across different experiences are essential for companies to demonstrate the effectiveness of XR as a solution for multiple business challenges.

Attention is highlighted as a strong candidate for the industry, especially as this metric is commonly used in other industries, most notably Media and Advertising. It's a strong measure of interest and evidence of cognitive processing, and the ‘Attention Economy’ is associated with more positive results in the Media and Advertising industry. There are clear benefits to the XR community in understanding attention better, specifically time spent viewing content, what gets attention first and the priority and order of viewed content. Attention is seen as an important metric across multiple XR sectors, as well as AR and VR formats, as it measures interest in the content, helps optimise user experiences and can be attributed to brand or company objectives. Importantly, this metric can be scaled through current technologies using CORTEXR.

Navigation is also important across both AR and VR formats to track position, pathway and user journeys in 3D environments. This represents a significant divergence from legacy measures designed for 2D screens as ease of navigation and ergonomics are entirely different behaviours in 6DOF experiences. The relationship between headset or handset devices and the AR or VR content can be tracked using device sensors so this metric is also scalable across current platforms with a CORTEXR plug-in. 

Extended Reality measurement study spotlights the metrics that matter

A state-of-the-art research study with 65 companies creating XR solutions summarises how AR and VR measurement has been approached, how companies demonstrate business results and what type of data metrics the industry needs to grow. The future of XR data analytics is the core mission of CORTEXR; it is a long-overdue debate within the XR community, and the results of this XR and Metaverse research point towards a solution for measuring user behaviour in Extended Reality.

The research was conducted in three stages: qualitative interviews with XR experts, desktop research to establish the context of current market data, and quantitative surveys to ensure a broad perspective. The XR community taking part included brands (e.g. Jaguar Land Rover), agencies (e.g. Ogilvy), platforms (e.g. Unity), manufacturers (e.g. HTC), developers (e.g. nDreams) and media owners (e.g. Yahoo), so it is a credible snapshot of the industry. 

The start of a discussion on the future of XR data analytics 

The aim of this XR and Metaverse research was to assess what data analytics are needed – in the context of current methodologies – and to highlight the metrics which the industry thinks are the best way to communicate the benefits of Extended Reality to existing clients as well as to those who are not yet engaged. This is especially relevant to developments in the Metaverse – where a persistent simulated world hosts collective experiences with shared goals – as measuring human behaviour in these experiences is essential to grow the industry. 


The XR industry is both creative and resourceful in trying to advance the understanding of AR, VR and Metaverse technologies in a post-screen world. As the industry approaches maturity, communicating the benefits of XR and demonstrating business value requires standardised methods of measurement. Visit CORTEXR to find out more and get in touch at contact@cortexr.com to join our community of XR companies advancing the metrics of human behaviour in XR.

The 10 most innovative companies in augmented and virtual reality of 2023

Post originally published on fastcompany.com by Mark Sullivan.

The metaverse—and by extension the mixed-reality headgear we might use to access it—had a moment in the sun in 2021, in part because of the social isolation of the pandemic, and in part because Mark Zuckerberg seemed to bet the future of Meta (née Facebook) on it. But much of the hype around the idea of an immersive virtual public space has vaporized as it has become clear that the hardware, software, and standards needed to create the experience are simply not ready.

As the public’s gaze moved on, consumer interest in augmented and virtual reality seemed to flag. In the U.S., for example, sales of VR headsets sagged 2% year-over-year (as of early December 2022), after doubling year-over-year the year prior, according to research from NPD Group. Yet tech companies big and small continue investing big money and top-tier talent in spatial computing devices and experiences.

Despite losing $10 billion on its Reality Labs business in 2021, Meta has said it expects to continue spending at that same rate to advance its VR, AR, and metaverse ambitions. The company defined the state of the art in VR headsets with its new Quest Pro, which featured much-improved pass-through imaging, new eye-tracking technology, and better hand controllers.

Other companies on our list tackled specific aspects of the AR/VR ecosystem. In content, Archer’s Mark, Innersloth, PatchXR, and Rendever leveraged spatial computing to breathe new life into formerly 2D experiences. Unity and Varjo found new ways of making AR/VR work for enterprises. Niantic reached impressive scale with its AR mapping layer, which will allow developers to anchor virtual objects at fixed points in the real world. And Coca-Cola found a mix of social and AR that helped it connect with tech-forward customers.

The wildcard in mixed reality’s journey toward the mainstream is Apple, which reportedly has hundreds of engineers working in secret on its own mixed-reality platform. It may be years before all the moving parts needed for a mainstream metaverse are ready to go. In the meantime, we can look forward to steady improvement in both the hardware and software needed for cool—if smaller-scale—spatial computing experiences.

1. UNITY

For integrating data with digital twins

Unity, known for its dominant 3D gaming engine, has been working with organizations that manage several large airports to develop digital twins of their facilities. One of them is the Vancouver Airport Authority, which launched its digital twin in March 2022. The technology, which ingests data from sensors placed around the airport as well as historical data, allows airport personnel to visualize many aspects of real-time operations, and can be used for training, optimization, future planning, simulations, and testing. This allows administrators and planners to make data-driven decisions to respond to situations that could affect passenger experience or safety. For instance, the airport might anticipate an increase in auto traffic around the airport, model its likely effects (such as increased security wait times), then plan accordingly. In September 2022, Unity unveiled its most ambitious digital twin effort, working with the Orlando Economic Partnership to produce a virtual twin of the 800-square-mile region to help the metropolis do a wide array of urban planning work, including climate change, construction and utility projects, and transportation. Unity doesn’t break out revenue from its digital twin work, but the company grew 25% year over year in 2022, with revenue of $1.4 billion.

2. COCA-COLA

For adding fizz to the cultural conversation with mysterious new flavors

Plenty of companies leveraged social media in an effort to stand out from the competition in 2022, but few (if any) kept people talking like Coke did with Coca-Cola Creations. Beginning in February 2022, the company released mysterious new flavors with elaborate (but comparatively low-cost) campaigns designed to generate social conversation. The first soda to roll out was Starlight, which the company claimed was space-flavored. It included an augmented-reality concert by singer-songwriter Ava Max that could be accessed only by scanning a QR code on a Starlight can. When a user did so, they could see the performance, which appeared to be on a translucent stage on a space station.

In April, Coke followed up with Zero Sugar Byte, which it swore tasted like pixels. This flavor targeted gamers, launching in Fortnite with a game accessible within its realm. Like Starlight, the actual soda was sold in very limited quantities, first in Latin America and later in the United States. The exclusive nature of the product further amped up chatter on social media.

An exclusive collaboration with the DJ Marshmello followed in June 2022. Coca-Cola did a Twitch takeover in July, meaning that the first ad Twitch users saw was for this promotion. Scanning the QR code on these cans took viewers to a colorful, morphing video that resembled something between a mood ring and a lava lamp while a new Marshmello song played. Another new flavor, Dreamworld, arrived in August, with a digital clothing collab for users’ avatars and a shareable AR mural.

Although Coke has not released data on the impact of the campaign, CEO James Quincey told investors several times during the year that Creations exceeded expectations and had “tremendous traction” and engagement. Overall, Coca-Cola increased global case volume 5% during 2022.

3. META

For making a “real” mixed-reality headset

At $1,499, Meta’s new Quest Pro VR headset is far more expensive—$1,100 more—than its popular Quest 2 VR headset. But from a technical point of view, the Pro advances some of the technologies that will be needed in a true mixed-reality device and that were simply missing in the Quest 2. The Pro’s vastly improved “pass-through” image, for example, lets wearers see the real world in front of them far more clearly (and in color), so that graphics can be integrated into that real world more believably. The Quest Pro also adds eye-tracking, which, among other things, lets the wearer’s avatar look around and make eye contact with the avatars of others as they’re speaking. The Meta Quest Pro is hardly perfect: It’s a bit ungainly, could be more comfortable, and its battery life is only two hours. That said, it’s a harbinger of what’s to come in the world of mixed-reality headsets. Meta has been criticized for its spending to create a new platform where it could control both the hardware and software; it spent $5 billion on this effort in the fourth quarter of 2022 and an estimated $49 billion since 2012. Although the company has recently said that “efficiency” is a key focus for 2023, it already has what may be the most advanced standalone VR and mixed-reality device on the market today—and is still devoting vast resources to further development.

4. GOOGLE

For using 3D imaging to bring people together

Project Starline is Google’s response to a post-pandemic world where remote work has changed the way we communicate. Since remote meetings such as 2D Zoom calls have their obvious drawbacks, Google sought to build an advanced 3D teleconferencing booth that makes remote meetings seem a lot more like real, in-person communication. The company says that the experience is made possible by a convergence of breakthroughs in 3D imaging of people, compression of the 3D video signal for efficient transmission, and the 3D displays needed to render people in a life-size and lifelike way.

In 2022, Google began providing some large companies—including WeWork and Salesforce—with Project Starline booths so that workers could begin testing and providing feedback on the experience. So far Google has spoken only about the performance of the technology, not about its actual hardware components and how much it all might cost, assuming it decides to commercialize it. It’s possible that Google sees the Starline tech as a new, headset-free approach to mixed reality that it can develop as a foundational technology to be used in future AR/VR products: Both headset-based AR/VR efforts and Project Starline are part of a group called Google Labs.

5. INNERSLOTH

For bringing Among Us to VR

Few games have benefited as much from being ported to VR as Innersloth’s Among Us. As in the earlier 2D versions of the game, which remain among the best-selling titles according to the NPD data, players (represented as little armless cartoon astronauts) work together to repair a spaceship—knowing all the while that one or more of the “crew members” is secretly an “imposter” bent on sabotaging the ship and killing everybody on board. Among Us VR, which Innersloth created with help from Schell Games and released in November 2022, amps up the tension by putting game play in a first-person point of view within the immersive 3D environs of the ship. Players can see what’s in front of them, but danger may lurk behind a wall or around the next corner. Among Us VR makes compelling use of spatial audio: Players might hear the squish and thunk of a crew member being killed in some other part of the ship, or footsteps ominously approaching, or the close voices of other crew members frantically trying to deduce the identity of the imposters before it’s too late. Among Us was already a suspenseful experience in 2D, but the $10 VR version takes it to another level—which some players have termed, in the best possible way, “horrifying.”

6. NIANTIC

For growing an AR map of the world

At its developer conference in May 2022, Niantic formally launched its “Lightship Virtual Positioning System,” a virtual map of the world that allows AR developers to anchor 3D graphics to physical places. For example, a developer might hide a digital prize near a well-known statue as part of a scavenger hunt game. These objects are persistent—that is, users can find them tethered to the same real-world place when they leave and return. Niantic’s map is important because developers need it to create Pokémon Go–style games that keep players out in the real world rather than closed inside a VR headset. The map is growing and spreading rapidly. When it launched last spring, there were 30,000 VPS-activated public locations, mostly in San Francisco, London, Tokyo, Los Angeles, New York City, and Seattle. As of December 2022, the map has more than 140,000 VPS-activated public locations in 125 cities around the world.

7. RENDEVER

For using VR to help seniors

Rendever is using VR to promote engagement and mental fitness among senior citizens. The Somerville, Massachusetts–based VR content company operates a platform that delivers customized 3D immersive experiences to nursing homes and other senior living facilities. The content lets people relive moments from the past (weddings, for example) and virtually visit bucket-list destinations. Rendever’s experiences are designed to fight off feelings of loneliness and isolation, which research shows are common among its target audience, and to stave off dementia by challenging users’ minds. In 2022, the company launched RendeverFit, a VR program that combines physical fitness with cognitive stimulation and socialization. It consists of three different modules—Cycle, Paddle, and Paint—each designed to let seniors “gain the benefits of physical activity without feeling like they’re working out.” The five-year-old company says that it has now delivered more than one million VR experiences, and it picked up 3,000 new users in 2022. As touching videos of old folks immersed in 3D memories for the first time attest, it’s a deft use of the new technology for good.

8. ARCHER’S MARK

For reliving history through VR

Archer’s Mark’s “On the Morning You Wake (to the End of the World): Take Cover” debuted at the Sundance Film Festival’s New Frontiers program in January 2022. The production studio’s narrative VR work lets participants relive the horrifying morning in Hawaii on January 13, 2018, when residents began getting text messages saying that a nuclear attack on the island was underway. The five-minute VR story may be the first to incorporate real-life emergency alerts within an immersive 3D environment, preserving the memory of the terror and chaos of a real historical event. “Take Cover” leaves behind a powerful question: Why does nuclear weaponry’s Sword of Damocles still swing so close overhead?

9. VARJO

For easing VR cloud streaming for enterprises

Varjo started out making high-end mixed-reality (XR) headsets for enterprises, but it has more recently expanded to help those businesses overcome the challenge of administering XR content to their teams (such as designers). Doing this one seat at a time is complicated, data-intensive, expensive, and requires hardware with lots of processing power, so in April 2022, the Finnish company introduced what it calls the “Varjo Reality Cloud” to stream high-resolution, mixed-reality content down to less powerful PCs and headsets across an organization. The onboarding and security of new users happens in the cloud. The end result is that joining an XR collaboration session is more like joining a Zoom call, and it’s attracted a number of automotive industry customers such as Kia, Rivian, and Volvo. Varjo announced in November 2022 that its cloud service can now stream high-quality XR content powered by Unity’s gaming engine and Epic Games’s Unreal Engine.

10. PATCHXR

For bringing virtual instruments to VR

Playing around with virtual musical instruments within a digital audio app such as Apple’s GarageBand is lots of fun, but it’s a 2D experience that can be visually confusing, especially as more instruments come into play. PatchXR’s Patchworld experience for Meta Quest 2, which was released in July 2022, surrounds the player with virtual musical instruments in VR. It’s surprisingly robust as a creation tool—if only for electronic music. Users can select (or make their own) musical instruments and play them within bizarre virtual spaces, or jam in the same space with friends. They can also record the performance, sing, and add cool effects. Best of all, the expansive 3D space makes it all simple. It’s possible to jump right into one of the ready-to-play worlds and start creating and remixing. The $30 app has earned five-star ratings from 87% of reviewers in the Quest app store.