Catching the whiff of success

A team led by City University London’s Mixed Reality Lab, together with academics from other universities, is a finalist in the HackingBullipedia Global Challenge, which aims to discover the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.

A combined team comprising academics from City University London’s Mixed Reality Lab, the University of Aix-Marseille (France) and Sogang University (South Korea) has reached the final of this year’s HackingBullipedia Global Challenge, which aims to discover the most inventive design and technology to support the world’s largest repository of gastronomic knowledge.

The team, led by Professor Adrian Cheok, Professor of Pervasive Computing in the School of Informatics, has titled its competition entry “Digital Olfaction and Gustation: A Novel Input and Output Method for Bullipedia”.

The team proposes novel methods of digital olfaction and gustation as input and output for internet interaction, specifically for creating and experiencing the digital representation of food, cooking and recipes on the Bullipedia. Other team members include Jordan Tewell, Olivier Oullier and Yongsoon Choi.

No stranger to digital olfaction applications in the culinary space, Professor Cheok recently gave a Digital Taste and Smell presentation to Chef Andoni Luis Aduriz, the third top chef in the world, at Mugaritz restaurant in San Sebastián, Spain.

The HackingBullipedia Global Challenge was created by the world-renowned culinary expert Chef Ferran Adrià i Acosta.

The jury, comprising some of the world’s leading culinary and digital technology experts, arrived at a shortlist of four teams after carefully sifting through 30 proposals from three continents, submitted by a mix of independent and university teams.

The other teams in the final are from Universitat Pompeu Fabra (Barcelona); the Technical University of Catalonia; and an independent (non-university) team from Madrid.

On 27 November, two representatives from each of the four finalist teams will pitch their proposal and give a demonstration to the competition’s judges, after which the winner will be decided.

Professor Cheok is very pleased that City has reached the competition final:

“I am quite delighted that we were able to make the final of this very challenging and prestigious competition. There were entries from various parts of the world covering a broad spectrum of expertise including a multidisciplinary field of scientists, chefs, designers, culinary professionals, data visualisation experts and artists. We are confident that our team has prepared an equally challenging and creative proposal which will be a game-changer in the gastronomic arena.”

[http://hackingbullipedia.org/thechallenge/overview]

The Multi-Sensory Internet Brings Smell, Taste, and Touch to the Web

By Gian Volpicelli

Interview article from Motherboard:

[Image: Adrian Cheok with his taste-transmitting device. Photos by Jonathan Shkurko]

Adrian Cheok, professor of pervasive computing at City University London and director of the Mixed Reality Lab at the National University of Singapore, is on a mission to transform cyberspace into a multi-sensory world. He wants to tear through the audiovisual paradigm of the internet by developing devices able to transmit smells, tastes, and tactile sensations over the web.

Lying on the desk in Cheok’s lab is one of his inventions: a device that connects to a smartphone and shoots out a given person’s scent when they send you a message or post on your Facebook wall. Then there’s a plexiglass cubic box you can stick your tongue in to taste internet-delivered flavours. Finally, a small plastic and silicone gadget with a pressure sensor and a moveable peg in the middle. It’s a long-distance-kissing machine: You make out with it, and your tongue and lip movements travel over the internet to your partner’s identical device—and vice versa.

“It’s still a prototype but we’ll be able to tweak it and make it transmit a person’s odour, and create the feeling of human body temperature coming from it,” Cheok says, grinning as he points at the twin make-out machines. Just about the only thing Cheok’s device can’t do is ooze digital saliva.
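As a rough, purely illustrative sketch of the pairing idea behind the kissing machine, the snippet below shows how two such devices could mirror each other over the network: each unit streams its local sensor readings (pressure, peg position) to its partner and replays whatever arrives on its own actuator. The sensor and actuator stubs, the UDP/JSON wire format and the placeholder partner address are assumptions made for illustration; they do not describe the actual firmware of Cheok’s prototype.

```python
import json
import socket
import threading
import time

# Hypothetical sketch only: both paired units run the same program, each
# streaming its local sensor state to the other and mirroring what it receives.

PORT = 9999


def read_local_sensors():
    """Stand-in for sampling the pressure sensor and peg position."""
    return {"pressure": 0.42, "peg_angle_deg": 12.5, "t": time.time()}


def drive_local_actuator(state):
    """Stand-in for moving the local peg to mirror the partner's state."""
    print(f"mirroring partner: pressure={state['pressure']:.2f} "
          f"angle={state['peg_angle_deg']:.1f}")


def receiver(sock):
    """Apply every state update the partner sends to the local actuator."""
    while True:
        data, _addr = sock.recvfrom(1024)
        drive_local_actuator(json.loads(data.decode()))


def run(partner_host, rate_hz=20):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    threading.Thread(target=receiver, args=(sock,), daemon=True).start()

    # Stream local sensor readings to the partner device at a fixed rate.
    while True:
        payload = json.dumps(read_local_sensors()).encode()
        sock.sendto(payload, (partner_host, PORT))
        time.sleep(1.0 / rate_hz)


if __name__ == "__main__":
    run("192.0.2.10")  # placeholder address for the paired device
```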

I caught up with Cheok to find out more about his work toward a “multi-sensory internet.”

[Image: The make-out device, plugged into an iPhone]

Motherboard: Can you tell us a bit more about what you’re doing here, and what this multi-sensory internet is all about?

There is a problem with the current internet technology. The problem is that, online, everything is audiovisual and behind a screen. Even when you interact with your touchscreen, you’re still touching a piece of glass. It’s like being behind a window all the time. Also, on the internet you can’t use all your senses—touch, smell and taste—like you do in the physical world.

Here we are working on new technologies that will allow people to use all their senses while communicating through the internet. You’ve already seen the kissing machine, and the device that sends smell-messages to your smartphone. We’ve also created devices to hug people via the web: You squeeze a doll and somebody wearing a particular bodysuit feels your hug on their body.

What about tastes and smells? How complex are the scents you can convey through your devices?

We’re still at an early stage, so right now each device can just spray one simple aroma contained in a cartridge. But our long-term goal is acting directly on the brain to produce more elaborate perceptions.

What do you mean?

We want to transmit smells without using any chemicals, so what we’re going to do is use magnetic coils to stimulate the olfactory bulb [the part of the brain associated with smell]. At first, our plan was to insert them through the skull, but unfortunately the olfactory part of the brain is at the bottom, and doing deep-brain stimulation is very difficult.

And having that stuff going on in your brain is quite dangerous, I suppose. 

Not much—magnetic fields are very safe. Anyway, our present idea is to place the coils at the back of your mouth. There is a bone there called the palatine bone, which is very close to the region of your brain that makes you perceive smells and tastes. In that way we’ll be able to make you feel them just by means of magnetic actuation.

[Image: Cheok demonstrates the taste-transmitter]

But why should we send smells and tastes to each other in the first place?

For example, somebody may want to send you a sweet or a bitter message to tell you how they’re feeling. Smell and taste are strongly linked with emotions and memories, so a certain smell can affect your mood; that’s a totally new way of communicating. Another use is commercial. We are working with the fourth-best restaurant in the world, in Spain, to make a device people can use to smell the menu through their phones.

Can you do the same thing also when it comes to tactile sensations? I mean, can you put something in my brain to make me feel hugged? 

It is possible, and there are scientists in Japan who are trying to do that. But the problem with that is that, for the brain, the boundary between touch and pain is very thin. So, if you perform such stimulation you may very easily trigger pain.

It looks like you’re particularly interested in cuddling distant people. When I used to live in Rome, I once had a relationship with a girl living in Turin and it sucked because, well, you can’t make out online. Did you start your research because of a similar episode?

Well, I have always been away from my loved ones. I was born in Australia, but I moved to Japan when I was very young, and I have relatives living in Greece and Malaysia. So maybe my motivation has been my desire to feel closer to my family, rather than to a girl. But of course I know that the internet has globalized our personal networks, so more and more people have long-distance relationships. And, even if we have internet communications, the issue of physical presence is very relevant for distant lovers. That’s why we need to change the internet itself.

[Image: The scent device in action]

So far you have worked on a long-distance-hugging device and a long-distance-kissing machine. You also have gadgets that can transmit a person’s body odour. If I connect the dots, the next step will be a device for long-distance sex.

Actually, I am currently doing some research about that. You see, the internet has produced a lot of lonely people, who only interact with each other online. Therefore, we need to create technologies that bring people physically—and sexually—together again. Then, there’s another aspect of the issue…

What’s that?

As you noticed, if you put all my devices together, what you’re going to have soon is a sort of “multi-sensory robot”. And I think that, within our lifetime, humans will be able to fall in love with robots and, yeah, even have sex with them.

It seems to me all the work you’re doing here could be very attractive to the internet pornography business.

Of course, one of the big industries that could be interested in our prototypes is the internet sex industry. And, frankly speaking, if it’s a way of bringing happiness, I think there’s nothing wrong with that. Sex is part of people’s lives. In addition, very often the sex industry has helped to spur technology.

But so far I haven’t been contacted by anybody from that sector. Apparently, there’s quite a big gap between people working in porn and academia.

Seminar: Multisensory Internet Communication and Virtual Love
Chaired by Sir Peter Williams CBE, with speakers Adrian David Cheok and David Levy

Love and sex with robots seminar

Seminar details:

26 November 2013

Event time: 6:00pm – 7:20pm

Drinks reception: 7:20pm – 8:00pm

Daiwa Foundation Japan House, 13/14 Cornwall Terrace, Outer Circle, London NW1 4QP

Organised by The Daiwa Anglo-Japanese Foundation

Booking form: http://www.dajf.org.uk/events/booking-form

Seminar

Multisensory Internet Communication and Virtual Love

The era of the hyperconnected internet allows for new embodied interaction between humans, animals and computers, leading to new forms of social and physical expression. The technologies being developed will in the future augment or mix the real world with the virtual world. Humans will be able to experience new types of communication environments using all of the senses, where we can see virtual objects in the real environment, virtually touch someone from a distance, and smell and taste virtual food. Our physical world will be augmented with internet-connected sensors embedded in buildings and physical spaces, cars, clothes and even our bodies. During the seminar, we will discuss several research prototype systems for interactive communication, culture, and play. This merging of computing with the physical world may lead us to develop personal feelings for computers, machines and robots.

In the second part of the seminar, we will invite the audience to join us in an exploration of the limits of artificial intelligence. What will it mean for society when artificial intelligence researchers succeed in creating sophisticated artificial personalities, artificial emotions and artificial consciousness? When robots are also endowed with the ability to recognize what we say and what we mean, will they be able to carry on interesting, amusing, intelligent and friendly, even loving conversations with us? How will humans react to this new breed of “person” that can say “I love you” and mean it? These are some of the questions that touch on the possibility of love, sex and marriage with robots.

About the contributors

Professor Adrian David Cheok
Professor Adrian David Cheok is Professor of Pervasive Computing at City University London and Founder and Director of the Mixed Reality Lab. His background is in Engineering, and he gained his PhD at the University of Adelaide in 1999. After working at the National University of Singapore and Mitsubishi Electric in Japan, he became Professor at Keio University in the Graduate School of Media Design. His research is concerned with mixed reality, human-computer interfaces, wearable computers, and pervasive and ubiquitous computing. He is a recipient of many awards and prizes, including the Hitachi Fellowship, the Microsoft Research Award in Gaming and Graphics and the SIP Distinguished Fellow Award, and was designated a Young Global Leader by the World Economic Forum in 2008. Professor Cheok often discusses his work with media outlets such as the BBC, CNN and the Discovery Channel, and is Editor-in-Chief of three academic journals, one of which is Lovotics: Academic Studies of Love and Friendship with Robots.

Dr David Levy
Dr David Levy is President of the International Computer Games Association and CEO of the London-based company Intelligent Toys Ltd. He graduated from the University of St. Andrews in 1967 and moved into the worlds of business, professional chess playing and writing. He has written more than thirty books on chess, and was awarded the International Master title by FIDE, the World Chess Federation, in 1969. In 1968, David made a bet with four Artificial Intelligence professors that he would not lose a chess match against a computer program within ten years; he won that bet. Since 1977 David has been involved in the development of many chess-playing and other programs for consumer electronic products. David’s interest in Artificial Intelligence has expanded beyond computer games into other areas of AI, including human-computer conversation, and in 1997 he led the team that won the Loebner Prize competition in New York, a prize he won again in 2009. His fiftieth book, Love and Sex with Robots, was published in November 2007, shortly after he was awarded a PhD by the University of Maastricht for his thesis entitled Intimate Relationships with Artificial Partners.

Sir Peter Williams CBE (chair)
Sir Peter Williams CBE is the Chairman of the Daiwa Anglo-Japanese Foundation, and has a PhD in Engineering from the University of Cambridge. He has previously served as Honorary Treasurer and Vice President of the Royal Society, Chairman of the National Physical Laboratory, Chancellor of the University of Leicester, Chairman and Chief Executive of Oxford Instruments plc, Deputy Chief Executive of VG Instruments Ltd., Master of St. Catherine’s College Oxford, Chairman of Trustees of the Science Museum and Chairman of the Engineering & Technology Board. He has advised the Government on issues of science and education, including the ‘Williams Review’ of primary mathematics in 2008, and in 2010 he was a member of an international review of the Intergovernmental Panel on Climate Change (IPCC) for the UN Secretary-General. He was knighted in 1998 and is a Fellow of the Royal Society and of the Royal Academy of Engineering.