Imagine watching the All Blacks' No 10 lining up to take the winning penalty in the 2019 World Cup and being able to experience the emotions he is going through.
If he has nerves of steel, you can hear his heartbeat stay steady and sensors show he is hardly sweating.
Or, if you are not into sport, you could share the joy of a wedding, hearing the thumping of the bridegroom's heart as his bride-to-be walks down the aisle.
This sharing of experiences is what a wearable technology product being developed at the University of Canterbury hopes to do.
It works by sharing video and data captured from a device worn by someone and then displaying it for another person to experience. It is taking the concept of sharing to a whole new level.
So, if a friend wants to share the thrill he gets from skydiving, you would sit at home in front of your monitor seeing what he sees. If he's wearing a 360-degree, live-streaming camera, you could also see what is around him.
As he prepares to jump, you would hear his heart rate increase, and maybe the screen would become brighter to convey the happiness he is experiencing (or darker if he is scared).
To have an even more immersive experience you could wear virtual reality goggles like the Oculus Rift. All this technology exists, and Professor Mark Billinghurst and his team at the Human Interface Technology Laboratory (Hit Lab NZ) in Christchurch have built a prototype, though it doesn't yet perform at the level described above.
Billinghurst describes the product as a form of teleportation.
“One person at one location can share their experience with another person at a different location,” he said.
It is part of a wearable technology trend that is predicted to be a billion-dollar industry in the next few years.
With products like Google Glass available, more experiences are being captured in new and exciting ways.
“Our research aims to add another layer to that, by not only allowing people to see through your eyes, but also know what you’re feeling.”
Billinghurst said there was a convergence of technology at the moment that made it possible to push the boundaries of what is being done.
“We’ve already got a Google Glass prototype and the emotion-recognition sensors working. We can use physiological sensors that might measure levels of excitement, fear or nervousness and share that with people as well.
“We are now working on joining the two together, and creating a powerful user experience.”
This includes improving computers' interpretation of facial expressions, so that what a person is feeling can be measured. While this technology is not new, the Hit Lab is one of the first to convey that information to another person.
They are doing this in several ways, from something as basic as a chart showing increases and decreases in certain feelings, to showing colours (eg red for anger), playing music and adding filters to video (eg brighter for happiness).
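To make that mapping concrete, here is a minimal sketch of how sensor readings might be turned into the cues the article describes. The function name, the valence/arousal inputs and all thresholds are illustrative assumptions, not the Hit Lab's actual implementation.

```python
# Hypothetical sketch: map a normalized emotion reading to a display
# colour and a video brightness multiplier, in the spirit of the cues
# described above (red for anger, brighter video for happiness).
# All names and thresholds are assumed for illustration.

def emotion_to_cues(valence, arousal):
    """valence: -1 (sad) .. +1 (happy); arousal: 0 (calm) .. 1 (excited)."""
    # Red for anger-like states: negative valence with high arousal
    if valence < 0 and arousal > 0.6:
        colour = "red"
    elif valence >= 0 and arousal > 0.6:
        colour = "yellow"   # excitement or joy
    else:
        colour = "blue"     # calm or neutral
    # Brighter video for happiness, darker for fear or sadness
    brightness = 1.0 + 0.5 * valence
    return colour, brightness

# e.g. a skydiver just before the jump: happy but highly aroused
print(emotion_to_cues(0.8, 0.9))
```

A real system would drive these values from heart-rate and facial-expression classifiers rather than hand-set numbers, but the core idea, a simple mapping from emotion estimates to audiovisual cues, is the same.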
“There’s been a lot of research done to recognise people’s emotions but very little research about how we can use that research to share that emotion with someone else,” Billinghurst said.
“What we’re looking at here is really a new type of communication experience.”
The prototype would be on show at the TEDxChristchurch event next month, Billinghurst said.
After that, funding from Samsung runs out, so the Hit Lab will be seeking more money to progress the project.
The project is part of Samsung’s Think Tank Team, which helps promote research that creates new ways of capturing, displaying, sensing, feeling, producing and interacting.