Professor Adrian David Cheok was awarded a Distinguished Alumni Award by the University of Adelaide in recognition of his achievements and contributions in the fields of computing, engineering and multisensory communication. The Distinguished Alumni Awards recognise the outstanding contribution and significant impact made by alumni of the University.
Professor Adrian Cheok obtained a Bachelor of Engineering (Electronic) in 1994 and a PhD in Engineering in 1999. Professor Cheok is a pioneer in mixed reality and multisensory communication; his innovation and leadership have been recognised internationally through multiple awards.
Some of his pioneering works in mixed reality include innovative and interactive games such as ‘3dlive’, ‘Human Pacman’ and ‘Huggy Pajama’. Professor Cheok is also the inventor of the world’s first electric and thermal taste machine, which produces virtual tastes with electric current and thermal energy.
Past winners include Julia Gillard.
You can now send tastes, smells and even kisses virtually. A new age of virtual reality involving all five senses is here, with Adrian Cheok of the Imagineering Institute at its forefront.
Daniel Soo | August 1, 2016 | Editorials
AsianScientist (Aug. 1, 2016) – Once the stuff of science fiction, virtual reality (VR) technologies are becoming ever more present in our daily lives. From documentaries that bring you to the Great Barrier Reef, to games that let you soar through the air as an eagle, this technology looks set to change the world as we know it. As director of the Imagineering Institute, Malaysia, and founder and director of the Mixed Reality Lab, Singapore, engineer and inventor Dr. Adrian Cheok believes that the future of mixed reality—the integration of the virtual and physical world—belongs to smell, taste and touch. Cheok has played a key role in the innovation of various mixed reality devices and applications, some of which have already been released commercially. His Huggy Pajamas comforts anxious children by allowing them to receive virtual hugs sent from their parents. More intriguingly, his Scentee smart phone attachment allows one to send and receive various scents on command. Asian Scientist Magazine recently chatted with Cheok, who was in Singapore to deliver the keynote speech for the Singapore Science Festival’s Visual SG event, to find out what has been keeping him busy.
In your own words, what is mixed reality?
Mixed reality is the merging of our physical reality with virtual reality, which can be done at the level of all five senses. Science has shown that we communicate with all of our five senses. In fact, non-verbal communication makes up more than half of human communication, and that’s why having a meeting with someone over a video call is still very different from meeting them in person. Something is missing when you communicate only through video or the internet. I believe that in the future, we will be able to communicate with every one of our senses through the internet, and move from the age of information that we are in today to the age of experience.
What are some applications of mixed reality?
Telepresence will of course be a big example of how mixed reality can be applied. If we can transmit all five senses, telepresence would allow people to really feel like they are together. They could touch each other, or even share a dinner together, even though they may be on totally different sides of the world. Mixed reality devices can also create new kinds of communication. If you can digitally taste and smell, then you can have an app on your smartphone and virtually taste and smell a dish at a famous restaurant. Another big benefit is collaboration. For example, you could collaborate with one or many people, like cooking a dish together through the internet. Mixed reality will also lead to new kinds of learning. For example, instead of reading a book or watching a movie about ancient Rome, you could feel what it’s like to be there and even taste and smell what it’s like to live in an ancient city. This would create a totally new kind of learning because we humans learn very much experientially.
What first drew you to work on virtual reality?
I first began by looking at augmented reality systems, which allow people to see virtual 3D objects in the physical world. I noticed that the first thing people did was to try to touch the objects. That’s when I realized that we have to extend augmented reality beyond the 3D graphics that we see on our video games and movies. We need to use touch, taste and smell to really create a sense of presence in the virtual world. That’s what I call experience communication: not just sharing information, but sharing your experience.
What are some of the limitations to engaging our five senses using virtual reality?
Right now, we are concentrating on making the technology that allows for the virtual communication of touch, taste and smell by digitizing these senses. It’s a very difficult problem. Fundamentally, sound and light are frequencies: they are wave-based, and can be encoded into digital bits and transmitted over the internet. The fundamental problem with touch, taste and smell is that they are a different kind of sense. For smell, we sense molecules, which trigger electrical pulses to the brain. It’s still very much in the early stages of research, but we have successfully been able to produce virtual taste using electrical signals only, without any chemicals. We are now working on smell, which is an even more difficult problem. Unlike taste, where we have only five different taste receptors, most scientists estimate that we have a few thousand different kinds of individual smell receptors. How do we stimulate those individual receptors? It’s going to be a very big problem, but so far we have had some success by using small electrodes on the inside of the nose to stimulate the olfactory sense.
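The contrast Cheok draws between wave-based and chemical senses can be made concrete: audio digitises naturally because sound is just a pressure wave whose amplitude can be sampled at regular intervals and quantized into bits. Here is a minimal illustrative sketch in Python (the sample rate, bit depth and tone are arbitrary choices for demonstration, not anything specific to Cheok’s devices):

```python
import math

SAMPLE_RATE = 8000   # samples per second (illustrative choice)
FREQ = 440.0         # tone frequency in Hz (concert A)
DURATION = 0.01      # seconds of sound to digitise

def digitize_tone(freq, duration, rate=SAMPLE_RATE):
    """Sample a pure tone and quantize each sample to 8-bit PCM.

    Because sound is a wave, recording its amplitude at regular
    intervals captures everything needed to reproduce it (up to
    rate/2 Hz, per the Nyquist sampling theorem) -- which is why
    wave-based senses are easy to send over the internet.
    """
    n = int(duration * rate)
    samples = []
    for i in range(n):
        amplitude = math.sin(2 * math.pi * freq * i / rate)  # in [-1, 1]
        pcm = int((amplitude + 1) / 2 * 255)                 # quantize to 0..255
        samples.append(pcm)
    return bytes(samples)

data = digitize_tone(FREQ, DURATION)
print(len(data))  # 80 bytes: 0.01 s at 8000 Hz, one byte per sample
```

No equivalent sampling scheme exists for smell or taste, because there is no single wave to sample: each of the thousands of receptor types responds to particular molecules, which is exactly the difficulty Cheok describes.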
How do you think virtual reality technology will evolve in the next 20 years?
Technology is increasing at such an exponential rate that we can’t imagine exactly what the world will be like, but we can imagine that it’ll be incredibly different. I think that whatever doesn’t break the laws of physics can be invented by humans. We can’t change the fact that we’re born with five different types of taste receptors, or that there’s a specific range of frequencies we can see and hear, but I believe that we can somehow alter our perceptions with technology. For example, the spectrum of light is much wider than what we can see. We can’t see infrared with our naked eyes but we can now visualize it with infrared glasses. In some way, we already live in our own virtual reality: an analog biological virtual reality, because we are just seeing the world we are naturally designed to see. We think that this is the reality but it’s not—that’s why it’ll be so different when we have virtual reality, because we already don’t see any kind of ‘objective reality.’
Adrian Cheok, professor of pervasive computing at City University London and director of the Mixed Reality Lab at the National University of Singapore, is on a mission to transform cyberspace into a multi-sensory world. He wants to tear through the audiovisual paradigm of the internet by developing devices able to transmit smells, tastes, and tactile sensations over the web.
Lying on the desk in Cheok’s lab is one of his inventions: a device that connects to a smartphone and shoots out a given person’s scent when they send you a message or post on your Facebook wall. Then there’s a plexiglass cubic box you can stick your tongue in to taste internet-delivered flavours. Finally, a small plastic and silicone gadget with a pressure sensor and a moveable peg in the middle. It’s a long-distance-kissing machine: You make out with it, and your tongue and lip movements travel over the internet to your partner’s identical device—and vice versa.
“It’s still a prototype but we’ll be able to tweak it and make it transmit a person’s odour, and create the feeling of human body temperature coming from it,” Cheok says, grinning as he points at the twin make-out machines. Just about the only thing Cheok’s device can’t do is ooze digital saliva.
I caught up with Cheok to find out more about his work toward a “multi-sensory internet.”
Motherboard: Can you tell us a bit more about what you’re doing here, and what this multi-sensory internet is all about?
There is a problem with the current internet technology. The problem is that, online, everything is audiovisual and behind a screen. Even when you interact with your touchscreen, you’re still touching a piece of glass. It’s like being behind a window all the time. Also, on the internet you can’t use all your senses—touch, smell and taste—like you do in the physical world.
Here we are working on new technologies that will allow people to use all their senses while communicating through the Internet. You’ve already seen the kissing machine, and the device that sends smell-messages to your smartphone. We’ve also created devices to hug people via the web: You squeeze a doll and somebody wearing a particular bodysuit feels your hug on their body.
What about tastes and smells? How complex are the scents you can convey through your devices?
We’re still at an early stage, so right now each device can just spray one simple aroma contained in a cartridge. But our long-term goal is acting directly on the brain to produce more elaborate perceptions.
We want to transmit smells without using any chemical, so what we’re going to do is use magnetic coils to stimulate the olfactory bulb [part of the brain associated with smell]. At first, our plan was to insert them through the skull, but unfortunately the olfactory part of the brain is at the bottom, and doing deep-brain stimulation is very difficult.
And having that stuff going on in your brain is quite dangerous, I suppose.
Not really—magnetic fields are very safe. Anyway, our present idea is to place the coils at the back of your mouth. There is a bone there called the palatine bone, which is very close to the region of your brain that makes you perceive smells and tastes. In that way we’ll be able to make you feel them just by means of magnetic actuation.
But why should we send smells and tastes to each other in the first place?
For example, somebody may want to send you a sweet or a bitter message to tell you how they’re feeling. Smell and taste are strongly linked with emotions and memories, so a certain smell can affect your mood; that’s a totally new way of communicating. Another use is commercial. We are working with the fourth best restaurant in the world, in Spain, to make a device people can use to smell the menu through their phones.
Can you do the same thing also when it comes to tactile sensations? I mean, can you put something in my brain to make me feel hugged?
It is possible, and there are scientists in Japan who are trying to do that. But the problem with that is that, for the brain, the boundary between touch and pain is very thin. So, if you perform such stimulation you may very easily trigger pain.
It looks like you’re particularly interested in cuddling distant people. When I used to live in Rome, I once had a relationship with a girl living in Turin and it sucked because, well, you can’t make out online. Did you start your research because of a similar episode?
Well, I have always been away from my loved ones. I was born in Australia, but I moved to Japan when I was very young, and I have relatives living in Greece and Malaysia. So maybe my motivation has been my desire to feel closer to my family, rather than to a girl. But of course I know that the internet has globalized our personal networks, so more and more people have long-distance relationships. And, even if we have internet communications, the issue of physical presence is very relevant for distant lovers. That’s why we need to change the internet itself.
So far you have worked on a long-distance-hugging device and a long-distance-kissing machine. You also have gadgets that can transmit a person’s body odour. If I connect the dots, the next step will be a device for long-distance sex.
Actually, I am currently doing some research about that. You see, the internet has produced a lot of lonely people, who only interact with each other online. Therefore, we need to create technologies that bring people physically—and sexually—together again. Then, there’s another aspect of the issue…
As you noticed, if you put all my devices together, what you’re going to have soon are sorts of “multi-sensory robots”. And I think that, within our lifetime, humans will be able to fall in love with robots and, yeah, even have sex with them.
It seems to me all the work you’re doing here may be very attractive for the internet pornography business.
Of course, one of the big industries that could be interested in our prototypes is the internet sex industry. And, frankly speaking, that being a way of bringing happiness, I think there’s nothing wrong with that. Sex is part of people’s lives. In addition, very often the sex industry has helped to spur technology.
But so far I haven’t been contacted by anybody from that sector. Apparently, there’s quite a big gap between people working in porn and academia.
In the third of a series of reports on the highlights of the Singapore Science Festival this year, we look at how scientists are making it possible to create virtual worlds real enough to smell and taste.
SINGAPORE — See animals in a zoo so real that you can smell them, while merely standing in a room with a video feed. Or send a kiss virtually, one that can be felt.
These are the “realities” Professor Adrian Cheok wants to create, in his mission to make communicating digitally more realistic — by making it possible to send touch, smells and tastes over the Internet.
Prof Cheok, who is from the Imagineering Institute, is among the scientists working in the field of simulation technology to change the way people experience not just communications, but also science.
For example, children in Spain with certain diseases, who typically spend many hours in the hospital, can now “visit” the zoo virtually through a live 3D video feed, as part of an ongoing study by the Malaysia-based Institute and the University of Valencia to see if virtual zoos have a positive effect on the children.
Unlike sound and light, which are frequency-based and can be sent digitally, taste and smell are chemical-based, so the challenge is in using electrical signals to stimulate these senses, said Prof Cheok, who is working on a device that sends kisses virtually — with the help of a silicone device attached to a mobile phone — and another which artificially produces taste sensations.
Taste and smell are the “most difficult senses” to digitise, said Prof Cheok, and he is working on a project that has electrodes implanted inside the nose to stimulate olfactory receptor neurons to produce a smell.
Another device connects to the tongue to change its temperature, producing a sweet taste through thermal energy.
“In the future it may not be silver electrodes, it could be cutlery which has electrical signals … (with) a tiny wireless electrode in your nose connected to your smartphone,” he said.
Currently, simulation technology on the market is focused on augmented reality and is most widely seen in advertising and entertainment, he said. But researchers focus on inventing new technology and then work with businesses to figure out how it can be made into useful commercial products.
Simulation technology can recreate novel experiences without causing harm, such as allowing people to “unwrap” a mummy — via a visualisation table at the Science Centre Singapore.
The digital mummy was created using X-rays, laser scans and photos of a real mummy, Neswaiu, a wealthy Egyptian priest who lived in the third century BC. Visitors can zoom in on the touchscreen to examine Neswaiu’s sarcophagus, then peel off layers of the body to study the anatomy, internal organs and even the amulets buried together with him.
The technology behind the Mummy Explorer was originally created for visual medical images so doctors could perform virtual autopsies, but the team “quickly realised there were wider opportunities for the use of the technology”, said Prof Anders Ynnerman, the scientist behind the technology.
The explorer “is a very nice way of being able to engage people … (and let) the general public have a feeling for what scientists are doing and what the scientific exploration process is like”, since they can actively participate and freely explore the mummies themselves.
The first interactive touch table was completed in 2009, and has been used in other museums, like the British Museum and the Museum of Mediterranean and Near Eastern Antiquities in Stockholm.
Prof Ynnerman said the British Museum was initially worried that the dazzle of a digital artefact would overshadow the real mummy. But they found that people spent three times as long at the exhibit, looking at the mummy, its digital counterpart, and then at the mummy again to study it closely.
He hopes that such interactive visualisations can shorten the distance between scientific research and the general public, so that, through exploring, they can feel “part of the scientific discovery”.