Startup Positron Hopes to Change Way We Watch VR

The first Virtual Reality chair designed for communal theater settings is here.

Created by Los Angeles-based startup Positron, the Voyager cinematic VR chair features a computer-controlled, full-motion base that tilts and spins the chair in concert with what’s happening on screen.

At first glance, Voyager looks like a modern pod chair you might find in any high-end furniture showroom, and it requires little setup – and that’s exactly the point.

“You start with something familiar [the chair],” explains Positron CEO Jeffrey Travis, a software engineer with a film degree.

The egg-shaped chair, dubbed “Voyager,” is designed for seated VR without the hassle of setting up bulky goggles. The white pod, lined in red velvet, uses audio embedded in the chair for an immersive VR experience, and its relatively quiet motors handle the rotation and reclining.

With Voyager, Travis says, the goal is to provide a cinematic experience and a familiar gateway to VR.

A key feature is Voyager’s ability to provide 360 degrees of unlimited yaw motion and 35 degrees of pitch motion.

“One of the things that we’re addressing is the motion sickness,” explains Travis. “Twenty percent of people get motion sickness watching VR. This has to do with the brain seeing visual images that indicate that you’re moving, but you’re not.”

“We’ve found by adding a little bit of movement, the motion sickness is two percent.”

Voyager also features scent technology and specialized seating for motion-synchronized theatres, and it works with either the Samsung Odyssey or Oculus Rift headset.

Tom Cruise in Zero-G
In January 2017, Voyager premiered at the Sundance Film Festival in Park City, Utah. Positron’s team of engineers created a virtual cinema theatre consisting of twenty motion-synchronized Voyager chairs.

Viewers were immersed in a twenty-minute VR experience, narrated by Tom Cruise, that took them behind the scenes of Universal Studios’ blockbuster film The Mummy.

During the VR experience, Tom Cruise and co-star Annabelle Wallis are sent tumbling around in “zero gravity” inside an air freighter as it drops out of the sky. As the plane falls, the chair tilts up and down to give audience members the feeling of floating in zero-G.

Will VR Create Social Isolation?
Looking ahead to the future of VR, Travis acknowledges the concern that VR will be just one more screen addiction in our future.

“Today people grapple with screen addiction and feeling tethered to their smartphones. And I think about that,” he says.

But at a Positron event held at the Ace Hotel in downtown Los Angeles, he and his team observed another side to VR. Attendees were taking their headsets off, gathering at the bar, and wanting to talk about their VR experience afterwards.

“One of the things we found was that having the chairs synchronized,” says Travis, “and having the experience start at the same time creates community and a social experience.”

VR and Treating PTSD
Asked about potential applications of VR in mental health, he has high hopes for the future.

“One of the anticipated uses of this is treating PTSD and anxiety disorders, and I think having biometrics and that real-time feedback will be exciting,” explains Travis, who says that Positron is looking into integrating biometrics into the chair’s built-in PC. “I’m excited to see what happens with VR in mental health.”

As for opportunities for VR applications beyond movies, Stanford University professor and VR expert Jeremy Bailenson seems to agree.

In his book Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do, Professor Bailenson says lab studies have shown that VR is an experience that feels real to the brain. Because it feels so immersive, early studies indicate that VR can improve our ability to recover from trauma, to communicate, and to learn.

Explaining how VR is fundamentally different, Bailenson says, “Virtual Reality is not a media experience, it’s an actual experience.”

According to Bailenson, Hollywood’s excitement over VR aside, the “killer app” or best uses of VR may not be those that leap to mind immediately. Training athletes, treating mental health issues, and even using VR to create empathy for societal issues like climate change are some of the VR studies his lab is diving into.

Positron’s CEO admits that perhaps the “holy grail” of VR content that inspires has yet to be created. But he sees VR as a new technology that has evolved since the 1990s when bulky VR headsets and graphics with very low polygon count were the norm.

“When you put the headsets on [in the next two to three years] you may not be able to tell the difference between Virtual Reality and reality,” according to Travis.

“It is still the dawn and in four or five years, this will be a huge industry for telling VR stories that matter.”

Positron closed a $1.4 million seed funding deal in January 2018, provided by Lazar Ventures. The company says that Voyager VR chairs will be coming to cinemas, VR centers, hotels, museums, and airports later this year.

Caltech Expert Explains What’s Next in Robots

In the blockbuster sci-fi movie Elysium, the main character, Max (Matt Damon), is fitted with an exoskeleton, a sort of body armor that augments his strength.

“Things like the Elysium exoskeleton are closer than we think,” says Aaron D. Ames, PhD, Bren Professor of Mechanical Engineering at the California Institute of Technology (Caltech) in Pasadena, California, just outside Los Angeles.

“Robotic assistive devices that help you walk,” according to Dr. Ames, Caltech’s resident robot authority, “or give you a workout, or help you do your job better might be something in the future.”

Dr. Ames is part of the team at Caltech’s new Center for Autonomous Systems and Technologies (CAST). Launched in October 2017, CAST will unite over a dozen engineers and scientists from many disciplines to advance research on robotics, drones, driverless cars, and machine learning.

Ames also heads the Advanced Mechanical Bipedal Experimental Robotics (AMBER) Lab at Caltech, in the Department of Computing and Mathematical Sciences. He has spent twenty years designing algorithms and efficiency equations to teach robots to generate their own walking gaits.

Ames specializes in research on humanoid robots – robots that walk on two legs, like a human, using bipedal locomotion. He developed DURUS, an ultra-efficient walking humanoid robot. The SRI DURUS prototype walks on a treadmill with a heel-toe strike much like a human’s, and even wears Adidas sneakers. Currently the AMBER Lab works with the custom-built robot AMBER 3M and with Cassie, built by Agility Robotics.

Professor Ames’s list of degrees – including a PhD in electrical engineering and computer sciences from UC Berkeley – and postdoctoral awards takes up a lengthy paragraph on his Caltech bio page.

But in person he doesn’t exactly fit the mold of the shy mathematician. He speaks to his students in a booming voice that hardly needs a microphone to project from the stage. Standing over six feet tall, with dark brown hair, a beard, and muscular biceps, he could pass for someone you might run into at the gym.

He is a theoretical mathematician who lectures on Lyapunov functions and unified control frameworks. But Ames is also a science fiction fan at heart, so speaking to a small group of students and alumni at Caltech recently, he referenced popular sci-fi movies to frame what the future might hold.

“The idea behind I, Robot starring Will Smith, that we’ll all be living with assistive robots that take out the garbage or have feelings,” says Ames, “is not something I see near term.”

Instead, according to Ames, the most immediate changes we can expect from robots and artificial intelligence in the next twenty years will appear in how we drive, how we work, and even how we recover from injury.

Robots and artificial intelligence are both growing more complex. So while Ames’s research focuses on the robot platform, he says the real potential lies in understanding the connection between robots and AI.

Industry research seems to agree: consulting firm Boston Consulting Group (BCG) conservatively projects that the robotics industry will reach $87 billion by 2025.

According to Ames, the sci-fi movie Minority Report is a closer Hollywood depiction of reality. The film portrays Washington, D.C. as a city of the future, with autonomous self-driving vehicles whizzing around its streets. In the film’s jaw-dropping chase scene, Tom Cruise’s self-driving vehicle constantly negotiates a pathway to its target destination while avoiding traffic jams.

The swarm technology and collision control depicted in Minority Report is, in fact, something that Caltech’s new CAST center will be working on for years to come.

“In terms of the Google car,” he says, “we’re not quite there yet.”

According to Ames, for the immediate future there is still much work to do in building algorithms, designing efficient hardware, and integrating AI into existing robots and drones.

Today at the new CAST center, Professor Ames is working with artificial intelligence and machine learning experts on this problem.

To illustrate the current obstacles facing autonomous robots and cars, he cues up a YouTube clip of Boston Dynamics’s Atlas robot striding through the snow. The clip shows the roughly six-foot-tall, 320-pound humanoid robot, named Atlas, walking in the snow and using its upper limbs to negotiate the rough outdoor terrain.

“What the YouTube clip didn’t show,” says Ames, “was the person in the background using a joystick to drive the robot.”

Ames points to a spot in the video where the joystick operator casts a shadow in the snow, drawing a few chuckles from the audience.

“Is that robot learning?”

Ames reminds his students that the Atlas robot is being fed commands from the joystick operator and is following functions of dynamics and control. The robot certainly looks cool, but it is not a display of artificial intelligence, machine learning, or autonomy on a robot platform.

The hope is that CAST will create an environment that makes the needed interdisciplinary collaboration possible – bringing together the state of the art in robotics, machine learning, and artificial intelligence.

The new CAST center will focus on the study of mechanical engineering, AI and machine learning that could be applied to different platforms – exoskeletons, robots, or self-driving cars.

Ames says that it has been rare for systems designers like himself to interact with AI designers.

Could the next-generation humanoid robot developed at CAST one day hold the key to self-driving cars?

That just might be the million-dollar question.