Lockheed Martin's Prepar3D adds support for Varjo's mixed reality: "VR and XR are going to be critical for our next generation of trainers"

Lockheed Martin is a global security and aerospace company that develops Prepar3D, a visual simulation platform for creating training scenarios across aviation, maritime, and ground domains. Varjo is proud to announce that it is working with Lockheed Martin and the Prepar3D team to enable the most demanding training and simulation scenarios that require the highest-fidelity immersive displays.

Prepar3D v5, released this week, brings essential new capabilities and features to meet today’s emerging virtual and mixed reality training needs. What’s most exciting about it? The new Prepar3D adds support for Varjo’s mixed reality headset, the XR-1 Developer Edition. This opens up an incredible range of new training applications in which the real and synthetically generated worlds blend seamlessly to create compelling training solutions. Prepar3D is also compatible with the VR-2 and VR-2 Pro.

“With the advent of virtual reality technologies, we are always trying to find ways to integrate the latest and greatest tech into our software. Virtual reality and mixed reality are going to be critical for our next generation of trainers,” said Kyle Myers from Prepar3D.

“If the visual acuity of the virtual environment is not good enough, that impacts the training and ultimately is a negative experience. When we use headsets such as Varjo, where the resolution of the headset is much sharper and much closer to human-eye resolution, the experience is much more natural. Using technologies such as Varjo allows us to create a training experience with a wide field of view, without having the overhead of a large dome or multiple LCD screens.”

CONTACT

Name: Annaleena Kuronen

Email Address: annaleena@varjo.com


Shape the Future: How AR Will Transform Retail In The Next Decade (Free Online Event)


Over the next decade, augmented reality technology will permeate our daily lives in ways we could not have imagined. In retail specifically, consumers are demanding a better shopping experience and have started using augmented reality applications like the Ikea Place app to help them make better purchasing decisions. Houzz also reported that over two million people have used AR when buying products in the Houzz app, and that people who engaged with the tool were 11 times more likely to purchase.

AR offers exciting ways to bridge the online and offline shopping experience. For example, AR can allow customers to virtually try on products like shoes, clothes, sunglasses or cosmetics so they can shop in the comfort of their own home. In other cases, AR can make in-store displays more interactive and engaging, which builds more brand loyalty and awareness.

If you're interested in retail, e-commerce, or marketing/branding, this event is for you!

Friday, April 24th, 2020, 10 AM PST / 1 PM EST. Featuring speakers Amy Peck (HTC), Tony Parisi (Unity), Elliott Round (M-XR) and Raffaella Camera (Accenture). Hosted by James Basnett (Answer Intelligence).

Limited spots available.

RSVP here

Looking Glass Holographic Displays Integrate with Unreal Engine

Brooklyn, NY - April 9, 2020 - The holographic display company, Looking Glass Factory, announced today that they have released a plugin for Epic Games’ Unreal® Engine. Now content creators working in Unreal Engine 4 can instantly view and share their 3D content holographically in any one of Looking Glass Factory’s line of holographic displays. The UE4 plugin can be downloaded for free at https://lookingglassfactory.com/unreal.

In the past, content creators have been limited to visualizing their 3D content on a flat 2D screen. With the Unreal plugin, content creators can now see what their models, worlds, and experiences will look like in 3D in real time. This is especially crucial for industries such as automotive, architecture, mapping/GIS, medical imaging, and entertainment, which use Unreal Engine to view complex 3D environments.

“Every day since we launched the Looking Glass in 2018, more and more engineers and designers would reach out and ask when we would support Unreal Engine,” said Shawn Frayne, CEO & co-founder of Looking Glass Factory. “That's why we're so excited to announce the UE4 plugin for the Looking Glass today. Now studios around the world can make holographic experiences that go beyond anything ever seen before.”

“Having access to a glasses-free holographic display is a massive breakthrough, and presents an exciting prospect for teams working in immersive computer graphics, visualization and content creation,” explained Kim Libreri, CTO, Epic Games. “The Looking Glass holographic display provides a stunning level of realism, and we look forward to seeing the innovations that emerge with the support of Unreal Engine generated content.”  

Powered by Looking Glass Factory’s proprietary light field technology, Looking Glass holographic displays are true holographic windows into another world that can be experienced by groups of up to a dozen people. The company created the Looking Glass to meet the needs of enterprises that were looking for a deeply visceral, genuinely three-dimensional experience. Now the thousands of developers around the world already using the Looking Glass developer kits and industries using the new large-format Looking Glass 8K can create and deploy a new class of holographic experiences powered by Unreal Engine.

Unreal Engine Plugin Feature List:

  • Real-time 3D view of content in Unreal’s Game View

  • Holographic 3D visuals in the editor and in builds

  • Support for buttons on Looking Glass displays

  • One-build deployment for 8.9", 15.6", and 8K units

  • Adjustable camera for clipping planes and FoV

  • Support for default image effects from Unreal, or customizable effects

  • Windows only (Linux/Ubuntu coming soon)

  • Leap Motion Controller support

Learn more about the Looking Glass holographic displays and arrange to see one for yourself at https://lookingglassfactory.com/unreal.

About Looking Glass Factory

Looking Glass Factory Inc., headquartered in Greenpoint, Brooklyn, with additional operations in Hong Kong, is the global leader in group-viewable holographic interfaces with its flagship product line, The Looking Glass. This unique combination of advanced hardware and software is based on the company’s patented light field technology and in 2019 became the most widely adopted holographic display in history. Today, the company serves the holographic needs of both developers and enterprises and is committed to building the headset-free, hologram-powered future we were all promised in science fiction growing up. For more information, visit www.lookingglassfactory.com or contact:

For press inquiries: press@lookingglassfactory.com 

For all business development inquiries: future@lookingglassfactory.com 

About Unreal Engine

Epic Games’ Unreal Engine is the world’s most open and advanced real-time 3D tool. Creators across games, film, television, architecture, automotive and transportation, advertising, live events, and training and simulation choose Unreal to deliver cutting-edge content, interactive experiences, and immersive virtual worlds. Follow @UnrealEngine and download Unreal for free at unrealengine.com

CONTACT

Eric Schumacher
Neology Marketing Communications
erics@neologyconcepts.com | 310.403.8456

Live Online Workshop: Immersive Interaction Design Principles on 25th of April

Get 10% off your registration here.

For more information on this first-of-its-kind online workshop, click here.


Some of the key takeaways from the workshop will be…

Kinematic vs Physics

Understand the main differences between kinematic (“passive”) versus physics-based (“dynamic”) interaction.

Physics in VR

Improve interaction experiences using inverse kinematics to simulate physical presence.

Bring your own project (BYOP)

You can also apply to get feedback on your VR/AR project’s interaction design.

Watch the latest in hand-tracking capabilities.

On the 25th of April, you will learn to design a fully functional, responsive interface with physics-based interactions and hand-tracking input.

XR Bootcamp is the first academy offering on-site and online full-stack VR/AR programs for designers and developers, with career opportunities.

Together with its advisory board from Accenture, Volkswagen, HTC, ExxonMobil, BNP Paribas, Bosch, Deutsche Telekom, KLM, Verizon and others, XR Bootcamp co-developed a VR/AR curriculum focused on the high-demand skills most needed at today’s biggest companies.

Trainers are successful VR/AR professionals who teach the latest and most relevant skills in a rapidly changing environment of tools.

Online workshops are project-focused and teach students to create their own portfolio projects, while the on-site bootcamp teaches the full stack of VR/AR development and design tools needed to grab that next great VR/AR industry job.

LAST CALL FOR SPEAKERS! VR/AR Global Summit ONLINE Conference+Expo. Deadline is this week April 15th

If you want to speak at our VR/AR Global Summit ONLINE Conference+Expo, this is your last chance: the deadline is this week, April 15th. Go to www.vrarglobalsummit.com and click "I want to Participate."

Our online event is 3 days, covering 3 time zones; it will be mega! We have already invited 50,000 professionals and expect 10K+ to attend live. Topics include Enterprise, Automotive, Education, Healthcare, Fitness, and so much more!


Also,

  • Get tickets

  • Become an Exhibitor

  • Become a Sponsor

  • Go here and click "I want to Participate"

The Federal “CARES” Act Stimulus Legislation Benefits For Media/Tech Companies. $2 trillion COVID-19 package

Learn more at our VR/AR Global Summit ONLINE Conference + Expo on June 1-3

As of April 6, 2020
Subject to Change as Program is Developed and Implemented

The $2 trillion COVID-19 package includes federally-guaranteed loans and grants for qualifying medium-size and small companies.

For Medium-Size Companies (generally 500 to 10,000 employees), the CARES Act provides loans at a two-percent interest rate, with no payments due for the first six months.

  • Companies receiving the benefit cannot “outsource or offshore” jobs for two years after the loan has been repaid.

Small Companies (generally less than 500 employees, though some larger companies may also qualify)[1], may take advantage of the “Paycheck Protection Program” (PPP) and the “Economic Injury Disaster Loan” (EIDL). Small Companies can apply for both loans but not to cover the same expenses.

          Paycheck Protection Program. Under the PPP, a Small Company can get a loan of up to $10 million, which is forgiven (doesn’t need to be repaid) if the Small Company can show that the amount was spent on business expenses such as payroll, rent or utilities during a specified eight-week period between February 15 and June 30, 2020. The interest rate is 1%. The amount of the loan depends on the Small Company’s payroll, with each employee’s compensation counted only up to $100,000.

  • For full eligibility of forgiveness, the Small Company must show that it has not recently laid off employees.

  • Any forgiven amount would be reduced if employees are laid off during the eight-week period after the loan is originated.

  • Forgiveness for payroll amounts will not apply to compensation of over $100,000.
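As a rough illustration (not legal or financial advice), the statute sizes a PPP loan at 2.5 times average monthly payroll costs, counting each employee's annualized compensation only up to $100,000 and capping the total loan at $10 million. A minimal sketch of that arithmetic, with the function name and salary inputs purely hypothetical:

```python
PPP_MAX_LOAN = 10_000_000   # statutory cap on the loan amount
COMP_CAP_ANNUAL = 100_000   # per-employee annualized compensation cap

def ppp_loan_amount(annual_salaries):
    """Sketch of the PPP loan-size formula: 2.5x average monthly
    payroll costs, counting each employee's annualized compensation
    only up to $100,000, with the total loan capped at $10 million."""
    capped_payroll = sum(min(s, COMP_CAP_ANNUAL) for s in annual_salaries)
    avg_monthly_payroll = capped_payroll / 12
    return min(2.5 * avg_monthly_payroll, PPP_MAX_LOAN)
```

For example, a company whose only employee earns $60,000 per year would qualify for 2.5 × ($60,000 / 12) = $12,500; compensation above $100,000 per employee adds nothing to the loan size.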

          Limitation for Venture-Backed Companies. Currently the PPP does not apply to Small Companies with VC investment if the VC investor is an “affiliate” of the Small Company under SBA affiliation rules. However, there is an effort underway to remove or revise this restriction.

          Economic Injury Disaster Loan. Under the EIDL, a Small Company can borrow up to $2 million for working capital needs such as fixed debt and payroll. A Small Company applicant may be able to obtain a $10,000 grant paid up front as part of the loan arrangement. The interest rate is 3.75% and the term can be up to thirty years. No payment will be due for a year, although interest will accrue during this period.

Employee Retention Tax Credit. The CARES Act provides for a one-year employee retention tax credit, allowing companies to take a tax credit against employment taxes equal to 50 percent of qualified wages for each employee, up to $10,000 total per employee, if the company can show that it was adversely affected by COVID-19. “Qualified wages” means wages for employees who are unable to work due to “shelter in place” or similar restrictions, or whom the employer has furloughed due to a decline in revenue because of COVID-19. For companies with fewer than 100 employees, all wages are eligible for the tax credit, not just “qualified” wages, even if the employees continue to work.
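As a rough arithmetic sketch (illustrative only, not tax advice), reading the $10,000 cap as applying to the wages counted per employee (as the statute does), the per-employee credit maxes out at $5,000. The function name below is purely illustrative:

```python
WAGE_CAP = 10_000  # per-employee cap on the wages counted toward the credit

def retention_credit(qualified_wages):
    """Employee retention credit: 50% of qualified wages, counting
    at most $10,000 of wages per employee, so the per-employee
    credit maxes out at $5,000."""
    return 0.5 * min(qualified_wages, WAGE_CAP)
```

So an employee with $8,000 of qualified wages yields a $4,000 credit, while any employee with $10,000 or more yields the $5,000 maximum.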

Other Programs. Other programs for small and medium-size companies may be available at the state and city level.

  • As examples, in San Francisco, the Office of Economic and Workforce Development (OEWD) will issue emergency grants of up to $10,000 for affected small businesses. In New York City, zero-interest loans are offered with a funding limit of $75,000.

  • Facebook announced a $100 million grant program for small businesses. Google announced a program of over $800 million, some of which is supposed to go to support small and medium-size businesses.

Local Chambers of Commerce are a good place to learn what local programs are being implemented and connect with other business owners in the area.

Gamma Law is a San Francisco-based firm supporting select clients in cutting-edge business sectors. We provide our clients with the support required to succeed in complex and dynamic business environments, push the boundaries of innovation, and achieve their business objectives, both in the U.S. and internationally. Contact us today to discuss your business needs.

[1] – Small Companies includes select types of companies with fewer than 1,500 employees if they meet a “small business size” standard. The SBA offers a tool for calculating whether a business qualifies as small, and thereby for these loans. See https://www.sba.gov/size-standards/.

What's It Really Like to Host a Live Event in VR?


TL;DR - On April 1, 2020, VR/AR Chicago hosted our monthly event in VR. The three biggest takeaways: 1. prep and plan just like you would for an in-person event; 2. be ready to handle technical issues; 3. prepare your audience for the new environment.

First a Little Background

VR/AR Chicago hosts a monthly event that normally takes place at 1871, Chicago's tech incubator. In an average month we draw between 75 and 150 attendees, with the fluctuation generally related to the industry topic, the weather, and timing relative to other events. The stage portion of the event is live streamed to the VR/AR Chicago Facebook page, where it is also archived. I've been the primary coordinator of the event since its founding in 2017, but for the last couple of years a lot of help has come from the VRARA Chicago Chapter leadership, handling speaker sourcing and coordination, social media marketing, graphic design, sponsorship generation, and on-site registration and check-in.

The purpose of VR/AR Chicago is to help our audience members take their next step forward with immersive technology. We provide technology demonstrations, networking, industry expert panels and live Q&A. This is done specifically so that members of our audience can obtain whatever it is they need to move forward with their use of augmented reality and virtual reality, whether that be an industry connection, information, inspiration, direct experience, or anything else. It is a free event that is open to the public, and generally takes place on the first Wednesday of each month.

Structure of the Event

In order to understand what makes the event successful, the structure needs to be addressed. There are four main components that make the VR/AR Chicago event series valuable. These are networking, demonstration, presentation, and live Q&A. Knowing that it was impossible to provide live, in person technology demonstrations - though attending the event itself was in one way a live demonstration of VR technology - that left the other three components. We had to be present in an environment that allowed for open networking between the attendees prior to and after the main stage event. This factor eliminated the possibility of simply hosting a webinar. At the same time, we did want to provide a valuable stage presentation with produced content, and facilitate a live question and answer format discussion. In other words, crowd control and presentation ability were vital. The one other factor that makes VR/AR Chicago successful is its regularity. We don't cancel, and we don't reschedule. The shelter in place order hit Illinois on March 20, just 12 days prior to our April 1 event. We immediately started planning to take the event virtual.

Venue Selection and Event Planning

Like any event, the first step was choosing a venue. We evaluated several options very quickly on a number of important criteria. The systems we looked at included VRChat, RecRoom, SecondLife, AltSpace, and Mozilla Hubs. The main evaluation criteria were:

  • Capacity (knowing we were going to draw an audience of at least 75)

  • Moderation (to present in the space, we had to be able to mute the crowd)

  • Presentation (had to be able to run a slide deck with embedded video)

  • Reliability (the tools in the environment had to work without exception)

  • Accessibility (not everyone has VR; we needed to be accessible to non-VR users)

  • Streaming (we've streamed our events for months and we're not stopping now)

In the end, only AltSpace met all of the necessary criteria. We were concerned about the inability of people with Apple devices to access the AltSpace environment directly, but a choice had to be made and ultimately the other factors were more important. Additionally, it was very easy to get in touch with AltSpace staff for support, which was a welcome surprise.

Once the venue was chosen, the next tasks included evaluating all the capabilities of the space and preparing the content. We quickly learned how to control our live presentation using slides.com, which allowed us to embed YouTube videos. Jason from AltSpace event staff helped us mirror the room, and spent over an hour walking the VR/AR Chicago team through the event moderation tools, as well as tips for troubleshooting issues during the event. Finally, we tested the ability to place a virtual camera operator in the space to stream the event to our Facebook page, with a minimal 10 second delay, where we monitored the chat during the event so that Apple users had a way to participate in the live discussion.

Be Ready to Handle Technical Difficulties

Anybody who's spent any amount of time working in live events knows that you need to appease the "live demo gods" with an ample sacrifice of your time prior to the actual event, or you will incur their wrath. We tested slides, videos, muting and un-muting microphones, resetting the room, and broadcasting the live stream several times in the actual AltSpace environment in the days leading up to the event. We practiced so many times that troubleshooting became a matter of muscle memory. During the actual event, at the first video roll, the audio didn't play. It took just under 2 minutes to diagnose the issue, reset the room, and then reset the live stream operator in the room so that the event could continue for both the attendees in VR and the people watching the live stream. Three days prior to the event we learned that one of our scheduled presenters was unable to access AltSpace. As a result, we scheduled time for Devon Rutkowski of Gilbane to record her presentation using Zoom before the event, and embedded that video into the event slide presentation. During the event, some people, particularly those watching the live stream, had trouble hearing Meg Miller of Gensler because her headset microphone wasn't picking up the audio well. We boosted her volume as high as it would go.

In the end, the event worked as well as could have been expected for an audience of around 60 people in VR and an additional 25 watching the live stream.

Prepare the Audience for the New Medium

We knew that over half our audience was not going to have experience with AltSpace, so in addition to marketing the event, we had to provide detailed instructions on how to access it. VR/AR Chicago coordinator Tom Packard not only provided detailed instructions, with pictures, for creating an account on AltSpace and setting up an avatar; he also crafted messaging and suggested that we send notice to everyone who registered for the event, urging them to set up the necessary software in advance, as it took around 30 minutes to create an account and complete the on-boarding process. He also spent 2 full days in AltSpace, in an environment similar to the presentation environment, greeting people who had read the instructions and wanted to take AltSpace for a test run. We met with Meg Miller a day prior to the event in the practice room so she could get used to the controls and practice presenting. Ultimately we treated the event like a normal live event in a physical venue. Even the AltSpace events team members who stopped by the space to assist us commented that our level of preparation exceeded anything they had seen before.

One Final Thought

During the last hour prior to the start of our event, one final thought occurred to me. Hosting an event in VR is something a lot of people and companies are going to be trying right now. Many of them won't have the same level of experience with live event production and VR technology as I do. With 30 minutes to go before the start of the event, I set up two cameras and three screen captures to record how I produced and managed the event in real time. This was recorded using OBS, later streamed to Twitch, and posted on the VRARA Chicago Chapter's YouTube page. It is also included below.

I do need to again thank our presenters, our sponsors, 1871, VRARA and BUNDLAR, the AltSpace Team, and my team at the VRARA Chicago Chapter and VR/AR Chicago.

So Real Partners with AnotherWorld VR to Expedite Full Automation of World-First Digital Twin Production

Berlin-based studio collaborates with bleeding-edge Swiss tech firm to fast-track full automation of SO REAL’s game/XR-ready 3D asset production

Bern, Switzerland, April 6th, 2020 – SO REAL Digital Twins AG today announced their collaboration with anotherworld VR GmbH. Berlin-based anotherworld will operate as an extension of the Swiss firm’s development team, aiding delivery of their aggressive plan to fully automate CT2VR digital twin production by 2021.

SO REAL’s world-first, patented CT scanning and conversion tech delivers fully engine-ready 3D objects, complete with fully functioning physics and cool features like “Superman X-ray vision mode”. anotherworld – the leading German VR studio led by Max Sacker and Ioulia Isserlis – will lend their expertise to core aspects of the process, including physics and VR capability, pushing the bleeding edge of XR.

 Ian Ravenshaw Bland, SO REAL co-founder and CEO, commented, 

“We have quite a challenge ahead of us to automate the production of cinematic-quality digital twins. To get this done, we must partner with the technology leaders in various fields; those whose DNA compels them to push further into the frontier. This describes AWVR perfectly. We have been collaborating with AWVR on a project basis since before SO REAL was even founded and always discussed working more closely together as a team. Now that we are funded and set up, we’re doing it. Stay tuned for more world’s firsts over the course of 2020.” 

Max Sacker, Creative Director, anotherworld Gmbh, commented, 

“One of the key aspects of virtual reality production is conveying a sense of realism. Virtual reality is about deep immersion and transporting the user into a believable simulated reality. The VR headset is the ultimate magnifying glass through which the user scrutinises the illusion they are being asked to believe. So creating 3D assets to the highest standards of realism is of paramount importance to VR studios, as well as to visual effects and animation studios creating AAA-quality visuals. We are very excited to team up with SO REAL to push the boundaries of visual realism using cutting-edge CT scan technology, building on the years of experience gained from optimising 3D visualisation and asset creation."

PRESS CONTACTS 

Jackie Knivett 

Jackie@virtualcomms.co.uk 

T: 07739 170663 

Lu Digweed 

Lu@virtualcomms.co.uk 

T: 07949 485588 

Sam Brace 

Sam.Brace@virtualcomms.co.uk 

T: 07813 988908 

About SO REAL DIGITAL TWINS AG 

This isn’t real, this is SO REAL. 

SO REAL DIGITAL TWINS AG have created the fastest 3D model tech tool in the business. Using patented scanning and conversion technology, we’ve automated the production of digital twin 3D objects from real objects. This unique twinning technology allows the mass-production of cinematic-quality, game/VR-ready 3D assets for use in games, films, and VR/AR/MR. 

SO REAL deliver one of the fastest, most technically polished and complete solutions for game tech, art and exact requirement design. From physical objects to playable assets, SO REAL’s precision focus brings the real world closer to virtuality. 

About anotherworld VR 

Well-known in the industry as the gold standard of photorealism, anotherworld VR lives at the convergence of art and technology. They are constantly experimenting with the latest developments, blending narrative storytelling, cinematic aesthetics, and immersive gaming interactivity, making them an ideal partner.

AWVR balance and push the boundaries of innovative technology through competition selections at film festivals (Venice VR at VIFF) and work with high-profile clients including Deutsche Bahn and Microsoft.

www.anotherworldvr.com 

Pico Interactive added to the Sponsor lineup for the VR/AR Global Summit ONLINE Conference+Expo!

Pico Interactive Sponsors the VR/AR Global Summit ONLINE Conference+Expo

The online virtual event is taking place June 1-3

April 2, 2020. San Francisco

With operations across the globe, Pico Interactive develops innovative VR and AR hardware solutions which enable business clients to provide the best virtual reality experience in Medical, Education, Marketing, and Location-Based Entertainment industries. More info at www.pico-interactive.com

The VR/AR Global Summit Online is an online conference, brought to you by the VR/AR Association,  connecting the best virtual reality and augmented reality solution providers with enterprise and media entertainment companies.

“From the Fortune 500 to startups” might sound like a really general pitch, but at the Global Summit we really do see it all, and we love it. Our goal is to help the industry grow and cross-pollinate.

We’ve invited people from aerospace, enterprise, health, science, real estate, advertising, manufacturing, telecom, 5G, etc.

More info at www.vrarglobalsummit.com

CONTACT

Kris Kolo

VR/AR Association

kris@thevrara.com

TG0 announces etee SteamVR: the button-free VR controller

etee - the first VR controller offering full finger control with proximity, touch, gesture and pressure sensing - is available now on Kickstarter



LONDON – Tactile control specialist TG0 has unveiled etee, a groundbreaking new VR controller delivering more mesmerising VR experiences through patented finger sensing technology.

With etee, VR gamers can immerse themselves in new worlds using all of their fingers simultaneously. Unlike other controllers, etee allows gamers to control their VR experiences via a full spectrum of movements such as proximity, touch, gesture and pressure.

etee controllers are wireless, lightweight, durable and designed to be worn on the hand for long periods. An ergonomic form, patented technology and a long-lasting battery make it the ideal controller for scenarios including VR, AR and conventional gaming, drone flying, home device activation and multiple Internet of Things (IoT) applications.

TG0 will be selling etee controllers for the first time during a one-month Kickstarter campaign launching on April 2nd. The page is now live at:

https://www.kickstarter.com/projects/tg0/etee-complete-control-in-3d

“VR games need a controller that moves beyond the stale, TV remote-like devices that are prevalent today,” said etee designer Mick Lin.

“etee takes us away from binary button commands into a new world of gesture, grip and touch-based controls. It delivers richer, more mesmerising interactions: the experience gamers deserve.”

While VR and AR gaming and training have advanced massively in a short period of time, existing controllers have retained an old-fashioned, button-led design that has barely evolved from the games controllers of yesteryear.

Unlike legacy controllers, etee delivers on:

● Full finger sensing and gestural control
● Lightweight - at 120g, the SteamVR version is the lightest VR controller
● Easy to put on and take off
● Sweat, dust and water-proof for improved hygiene and reliability
● Long battery life (16 hours standby, up to 6h active)
● Thumb trackpad to simulate joystick
● Ergonomic, intuitive design
● Proximity, touch, gesture and pressure sensing
● No buttons for a more natural experience
● Simplified product user interface

Designed to be as cosy and comfortable as a cardigan, etee hugs the hands and lets users forget they’re even wearing a controller. etee slips on and off in seconds, and its simple cylindrical shape couples freeform design with robust sensing. The end result is a controller users can wear rather than hold, and a more engaging, fully realised experience no matter what the end application.

Jakub Kamecki, etee’s Head of Business Development, said: “I’ve never seen someone pick up an etee controller and fail to be wowed.

“We want to share this technology and to build a community, not only of gamers but of developers, innovators and inventors.”

etee controllers are equipped with:

● Thumb trackpad
● Free hand design
● Open palm gesture sensing for all fingers
● RGB colour LED
● State-of-the-art haptic feedback
● USB Type C ports
● Bluetooth and WiFi communication

Full technical specifications are available on request. Interested parties can stay updated by following the etee Kickstarter campaign at https://www.kickstarter.com/projects/tg0/etee-complete-control-in-3d

Jakub said: “We’ve felt for a long time that etee is the future of VR control. Now we’re going to make it so.”


About etee

etee controllers simplify and enhance VR and AR experiences by bringing human intuition to the fore. etee's buttonless, built-in sensor surface captures finger proximity, touch, pressure and combined gestures. This allows users to immerse themselves in the digital world and forget about controllers altogether. With a lightweight and sweat-proof body, etee is durable and comfortable, and can be slipped on and off in seconds. etee is the controller of choice for those seeking mesmerising, seamless, and simple experiences.


About TG0

TG0 is the team behind etee. TG0 is an IP-focused start-up commercialising a platform technology to create novel tactile control products. TG0 works with partners and clients, including leading brands and manufacturers in the consumer and automotive sectors. TG0 is a diverse team passionate about creating products that people have never seen before, with effortless and elegant manufacturing. To find out more, visit tg0.co.uk.

Message from Ming Kong, CEO and TG0 Co-founder

TG0 was founded on a shared vision that we would one day create our own branded product powered by the technology we have developed over the years. For more than two years, our team worked tirelessly on identifying a problem, developing a new solution, and designing and communicating a product, which has brought us to where we are today with etee. Due to COVID-19, we considered delaying or even cancelling the campaign. However, our team has again shown incredible passion and perseverance, powering through the difficulties of working in isolation to deliver a full and appealing campaign. At this time, we believe it is best that we do our job without concerns and fears, and hopefully bring our customers a sense of normalcy.

CONTACT:

Website URL: http://tg0.eu/pledge

Name: Jakub Kamecki

Email Address: kuba@tg0.co.uk

#StayAtHome license: VRdirect supports companies during the coronavirus crisis in presenting their products with Virtual Reality

Learn more at our VR/AR Global Summit ONLINE Conference + Expo on June 1-3

In response to the current coronavirus crisis, the Munich-based start-up VRdirect is offering small and medium-sized companies the opportunity to use its software for creating their own virtual reality content completely free of charge for six months.

The offer is valid until 30 September and is aimed primarily at sectors that currently lack the opportunity to present products on their own sales floors or at trade fairs and events because of security regulations. Virtual Reality is a great opportunity to present one's own products or services in a visually appealing way and to integrate direct-purchase options within a self-designed application. In addition, VRdirect offers companies of all sizes support in implementing virtual reality projects by providing the necessary hardware (e.g. 360° cameras) and consulting on the development of a virtual reality strategy.

Interested parties can find the registration link and further information on the #stayathome license at https://lp.vrdirect.com/corona/stay-at-home

CONTACT

Dominik Möcklin

Managing Director

dominik.moecklin@vrdirect.com

To celebrate their 50th anniversary Superstar shoe, Adidas and WSS teamed up with Genius Ventures Inc to create an in-store AR activation

To celebrate their 50th anniversary Superstar shoe, Adidas and WSS teamed up with Genius Ventures Inc to create an in-store Augmented Reality Snapchat activation.

The in-store activation with a Snapchat code was so successful that the partners launched a second phase, adding a coupon code for a free hat with any in-store purchase.

Benefits of Augmented Reality marketing include:
- Valuable data using AI technology
- Increased brand awareness
- Unique way to engage with your consumers
- Limitless possibilities
- And so much more!

Sounds cool, right?

Try it out now at https://lnkd.in/gQZu4y2

How to use:

- Open the Snapchat filter (link in bio).
- Tap on the floor or any surface to activate.
- Use your fingers to make it bigger or smaller.
- Tap the floor again to relaunch.

Have you considered Augmented Reality as part of your marketing strategy?

To get in touch, please contact Genius Ventures Inc at www.geniusventuresinc.com

CONTACT

Name: Marah Berezowsky

Email Address: marah@geniusventuresinc.com

Your Website URL: https://geniusventuresinc.com

The Benefits of Eye Tracking in VR for Research

By Ville Leppälä, vertical lead for Varjo’s research and healthcare business


Virtual reality and eye tracking have each been used in research for decades, but until recently the research community hasn't been able to fully benefit from merging the two technologies.

Read this post to learn why and how researchers can benefit from eye tracking in VR.

What is eye tracking and what is it used for?


Eye tracking makes it possible to record eye positions and movements and analyze gaze positions in both 2D and 3D environments. Researchers use eye tracking to understand the human processing of visual information and to gather insights into the human mind. 

Eye movements reveal more than people can imagine. They can help us understand how respondents feel, act and think about certain subjects. The quantitative data from eye tracking can also complement studies based on questionnaires and interviews. Researchers can play back events, analyse them in detail and talk about the experience with subjects.
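As a minimal illustration of that kind of quantitative analysis (a sketch, not any vendor's API), recorded 2D gaze samples can be aggregated into a fixation heatmap with a few lines of Python; the function name and normalized-coordinate convention here are assumptions for the example:

```python
import numpy as np

def gaze_heatmap(gaze_points, width, height, bins=32):
    """Aggregate normalized (x, y) gaze samples into a 2D histogram.

    gaze_points: iterable of (x, y) pairs in [0, 1] screen coordinates.
    Returns a (bins, bins) array of sample counts per cell,
    with rows indexing the vertical axis and columns the horizontal.
    """
    pts = np.asarray(gaze_points, dtype=float)
    heat, _, _ = np.histogram2d(
        pts[:, 1] * height,  # rows follow the y coordinate
        pts[:, 0] * width,   # columns follow the x coordinate
        bins=bins,
        range=[[0, height], [0, width]],
    )
    return heat

# Example: three samples clustered at the screen centre, one outlier
samples = [(0.5, 0.5), (0.52, 0.48), (0.51, 0.5), (0.1, 0.9)]
heat = gaze_heatmap(samples, width=1920, height=1080, bins=4)
```

Dwell time per region falls out of the same structure: with a fixed sampling rate, each cell's count is proportional to the time spent looking there.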

Traditionally, eye movements have been captured with either remote or mobile eye trackers. Remote eye trackers are mounted close to a computer screen and require respondents to sit in front of a monitor and interact with screen-based content. Mobile devices, on the other hand, are fitted near the eyes (usually mounted onto eyeglass frames) to track the wearer's eye movements.

 

Limitations of remote eye trackers

Running a study with remote eye trackers means that you’re tied to a single laboratory, as changing the surrounding environment would cause bias. Remote eye trackers track the eyes only within certain limits, meaning that considerable head movements during a study will break the eye tracker calibration and the measurement will fail. For example, studying children can be challenging or nearly impossible with this kind of equipment.

Typically this isn’t a problem if the measurement time is short, but respondents tend to have a harder time concentrating for longer periods. One good thing is that gaze interactions can be directly mapped onto the screen content because the gaze vector and content position on the screen is known at every moment. However, the restrictions above make the usefulness of this setup very limited.

 

Why mobile eye trackers aren’t the solution either

Mobile eye trackers allow respondents to move freely. This is an advantage since you can plan your study in a natural environment. The surrounding scene is captured on video, and eye movements are overlaid onto the recording.

Analyzing this, however, is challenging: the scene video is essentially a sequence of 2D photos, each captured from a slightly different angle. Identifying stimuli in each frame requires either a vast amount of manual work or advanced computer vision algorithms that attempt to map gaze interactions from all angles onto a single picture.

For instance, when a respondent looks at a statue in a museum scene from one side, it is relatively easy for a computer vision algorithm to track the gaze interaction with the statue. But when the respondent moves to the other side of the statue, it’s not easy to identify it as the same statue, as it might look quite different from the back.

 

VR eye trackers enable full control of the study environment

A VR-based eye tracker combines features of both mobile and remote setups. The eye tracker is mounted on the user's head, allowing the respondent to move freely without the risk of losing calibration. This helps increase data quality and improve the success rate of measurements.

In addition, VR headsets immerse respondents in a virtual environment that’s completely isolated from the surrounding real one. This means that studies can be performed in various locations without the environment change causing a bias. The conditions are always repeatable, and researchers can easily try out different scenarios. The whole research setup is portable and fits into an aircraft cabin-sized bag, which can save huge amounts of time and money.

Plus, having full control of the environment enables new ways to gamify studies, making them more appealing and immersive to respondents.

 

Get started with Varjo’s integrated 20/20 Eye Tracker

Varjo’s integrated eye tracking allows you to gain deep, reliable, and immediate insight into what your subjects see and experience. Using Varjo’s human-eye resolution VR/XR headsets, you can track what’s behind users’ reactions and behaviors as they interact with photorealistic objects, environments and virtually any stimuli – all with the full control and flexibility of virtual and mixed reality.

Varjo’s integrated 20/20 Eye Tracker lets you gain real insights into human behavior and understand cognitive engagement – all in real time.

All of our headsets (VR-2, VR-2 Pro and XR-1 Developer Edition) come with built-in eye tracking.

Learn more here

ThirdEye and TROIA Bring Remote Assistance via MR Glasses to Energy, Utilities, Oil & Gas Sectors

ThirdEye, a leader in augmented reality (AR) and mixed reality (MR) enterprise solutions, has partnered with TROIA, an innovative IT company focused on quality asset management solutions and development of AR solutions. With the new partnership, ThirdEye’s X2 MR Glasses will now allow technicians, field workers and remote experts the ability to repair and maintain assets with greater safety, efficiency and cost-effectiveness.

By combining ThirdEye’s X2 MR Glasses and TROIA AR, enterprise organizations within energy, utilities, oil and gas sectors will have access to a suite of applications that prioritize safety and more efficient workflows. Technicians working in the field can expect to see dramatic improvements to the way work orders, job plans and other real-time mission-critical information are delivered, along with improvements in productivity, effectiveness, accuracy and safety. Customers can also display real-time measurement data visualization on ThirdEye's X2 MR Glasses through TROIA's TAGMANAGER application, and place asset information tables and digital meters into scene environments on demand.

Here’s a link to the press release.

MEDIA CONTACT

Elsa Anschuetz | Uproar PR

P 321.236.0102 x233 | M 321.960.4062

E eanschuetz@uproarpr.com

Lessons from AR Leaders, Part III: The Field (New Report)

ARtillery Intelligence’s latest report, Lessons from AR Leaders, Part III: The Field examines AR usage and revenue leaders, and the strategic implications for AR success. Subscribe for the full report. VRARA members get a discount.


The consumer AR sector still lingers in early stages. Among other things, this means the playbook is being written mid-flight. There’s a great deal of experimentation underway as companies test and iterate rapidly to discover winning formulas and business models.

This goes for consumer AR product strategies. Though a common sentiment in 2016’s hype cycle was that AR applies to everything, it’s become clear that it’s not a silver bullet. It will have native and natural applicability to some aspects of our lives and work… but not all.

Beyond macro categories and use cases where AR should or shouldn’t be developed, there are more granular strategies around user experience (UX). What types of AR interactions resonate with consumers? And what best practices are being standardized for experience and interface design?

Equally important is the question of AR monetization and revenue models. Just as user experience is being refined, questions over what consumers will and won’t pay for are likewise being discovered. The same goes for brand spending in cases of sponsored AR experiences or ads.

These lingering questions compel close attention to quantifiable AR market successes and best practices. Not only do the sector's early stages mean that these questions are prevalent… but also that their answers are scarce. That includes evidence of successful execution, as well as transferable lessons.

With that backdrop, ARtillery Intelligence ventures to find, aggregate and draw meaning from finite AR successes in today’s environment. When examining consumer AR engagement and revenue leaders, what product attributes and tactics are driving their performance?

This started in Part I of the report series with Snapchat. Its social lenses have the greatest consumer AR active usage, and it holds the leading share of AR ad revenue. Among other things, this is propelled by product-market fit, ease of use, distribution and fulfilling key goals for brand advertisers.

Also on the list is Pokémon Go, which we examined in Part II of the series. Though the tech press has moved on to other shiny things, 2019 marks its best revenue performance to date. This is attributed to innovation cycles that breed ongoing novelty and replayability, as well as its sparing use of AR as a game element.

After examining these proven leaders, we now turn attention in the third and final installment of this series to emerging players that show signs of potential. Though earlier and unproven, they show promise and adherence to best practices examined in parts I & II. And they show new best practices worth noting.

These upstarts include 8th Wall, Ubiquity6, and Tilt Five. They also include established brands entering AR, such as Houzz, Instagram and Pinterest. This seemingly random sample shows signs of product and business model traction, which we’ll examine in the coming pages. The goal, as always, is to triangulate best practices and extract tactics and takeaways for AR players today.

Subscribe for the full report. VRARA members get a discount.

Rail Cargo Carrier powered by SPACE1, the AR+AI product supplied by OverIT

Augmented Reality combined with Artificial Intelligence reshapes and optimizes field processes in logistics, helping workers develop their skills and enhance their productivity while streamlining debriefing and the sharing of know-how.

Business challenge

Rail Cargo Carrier has provided cross-border rail transportation since 2006, making a major contribution to the international logistics chain in cooperation with the group's production units based in Bulgaria, Germany, Croatia, Romania, Slovakia, Slovenia, Czechia and Hungary. Interaction among all involved parties is essential to ensure safety standards on cargo trains.

The goal for Rail Cargo Carrier is to enable its field operators, in charge of logistics tasks, to safely eliminate risks when carrying out routine inspections on cargo trains. This translates into collecting and processing multimedia information in the form of dynamic reports, so as to cut down costs caused by gaps in cargo quality standards.

Solution

To this end, Rail Cargo chose SPACE1, which combines Augmented Reality and Artificial Intelligence to reshape and optimize field processes. In this case, the product supplied by OverIT allows technicians to perform inspections hands-free, ensuring the highest safety and production standards regardless of working conditions, while supporting skill development and simplifying the management and sharing of know-how.



Benefits

  • Standardized inspection procedures based on preset checklists and digital workflows, giving inspectors step-by-step support in data collection

  • Hands-free operation under the highest safety standards and in any weather conditions, thanks to Augmented Reality and a voice-interaction device

  • Real-time compilation of digital reports in the field

  • Automated data acquisition through OCR readers

  • Background video recording of inspections, automatically saved to the cloud with metadata tagging, to handle possible complaints and train new staff

  • Real-time remote support from experts

Contact:

Name: Francesco Benvenuto

Email Address: francesco.benvenuto@overit.it

Website URL: http://www.overit.it/en

AVATOUR Launches Commercially: Remote Presence is Now a Reality

AVATOUR Remote Presence lets you visit a real place in real time, providing a new substitute for travel -- especially useful in these challenging times. AVATOUR has been in development for two years, and in beta since Fall 2019; today marks the first commercial release of the platform. Avatour use cases include remote inspections, sales, recruiting and training - just to name a few. Any industry that needs to share a place or property can reduce or eliminate travel by getting onsite fast with AVATOUR.

“As more companies and countries limit or halt travel, businesses worldwide are being forced to find remote working solutions. Videoconferencing platforms like Zoom and Skype can replace a conference room, but until now, there was no substitute for location-dependent meetings like facility walkthroughs, site inspections, and site-specific training,” said Imeve CEO Devon Copley. “Using AVATOUR to visit a location alongside someone thousands of miles away feels more like an in-person visit than a videoconference.”

The AVATOUR platform is now generally available to qualified companies in the United States, EU, UK, Australia, New Zealand, Japan, and China. They are currently offering a FREE 360 Camera & Starter Kit for the hardware/software package starting at US$1190 (€1090, £990, ¥150,000, CNY8,800) for 60 days of service. Interested company representatives can visit http://avatour.co/get-started to apply.

Contact

Name: Jenna Clark

Email Address: jenna@imeve.com

Website URL: http://avatour.co