The Future of the Digital Multi-Sensory Consumer Experience


14 July 2014 By Kate Nightingale

We live in an increasingly digital world. We work, shop and play digitally much of the time, or at least a digital device is involved at some point in these activities. Marketing, too, is increasingly digital: consumers all over the world now access the internet via computers or mobile devices, and that number is only going up.

Most of our daily activities are facilitated by, shared through or experienced with some type of digital device. The crucial word here is ‘EXPERIENCE’. We all search for meaningful, intriguing or shocking experiences every day of our lives, whether it’s sipping a café au lait in a romantic café in Paris, watching a chick flick with your girlfriends and running out of tissues, or meeting your new love for the first time. All these experiences have one thing in common: they are multi-sensory. The smell of that freshly brewed coffee, the warmth and complexity of that first taste, the view of the Eiffel Tower, the passion and musicality of the French language…

Feeling like jumping on the Eurostar for a quick Paris experience? Now imagine that you could have all that in the comfort of your home. I know, it probably won’t feel as romantic and extraordinary as the real thing, but it will certainly be possible in the not-too-distant future.

Scientists are hard at work developing technologies that will let you transmit smells, tastes and textures digitally, or even, at some point, create an augmented or virtual reality of a Paris café with all of those sensations available to you. They are also teaching computers how to see, smell and develop nutritious, healthy tastes, with the goal of improving our lives.

One of the better-developed areas of research is computer vision. There are already plenty of programmes that can, for example, read our emotions while we watch an advert, so that advertising executives know whether the ad they have produced will have the desired effect. One of these programmes is FaceReader, developed by VicarVision, which has recently also been made available online. Another exciting VicarVision project is Empathic Products, which uses emotion recognition to, for example, personalise digital signage and adverts in shopping centres.
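To make the idea concrete, here is a minimal sketch in Python of how per-frame emotion labels from an advert test might be aggregated into an overall profile. The labels and the aggregation are illustrative assumptions only; FaceReader's actual API and output format are not shown or implied here.

```python
from collections import Counter

def summarise_advert(frame_labels):
    """Return the share of sampled frames assigned to each emotion label.

    frame_labels: one emotion label per sampled video frame, as a hypothetical
    emotion-recognition engine might produce them while a viewer watches an ad.
    """
    counts = Counter(frame_labels)
    total = sum(counts.values())
    return {emotion: round(n / total, 2) for emotion, n in counts.items()}

if __name__ == "__main__":
    # Toy data: one label per sampled frame of a ten-second advert.
    labels = ["neutral"] * 4 + ["happy"] * 5 + ["surprised"]
    print(summarise_advert(labels))  # {'neutral': 0.4, 'happy': 0.5, 'surprised': 0.1}
```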

How about social media analytics and consumer insight? As we share more and more visual content and less text, the need to analyse our likes and dislikes based on the photos we share has become urgent. Fortunately, companies like Curalate have developed software that helps brands gain useful insight from visual content and send personalised offers based on the photos people share via Instagram.

But these are not the most exciting developments. Much more intriguing and perhaps slightly shocking technologies are being developed to help us touch, sniff and taste digitally.

We already have various vibrations on mobile devices to confirm certain actions. Notice the difference between the vibration when you press the keyboard and the one when you receive a text or tweet? This is nothing! Soon we will be able to feel the textures of fabrics and other materials via ‘microscopic’ vibrations sent to our mobile devices.

Imagine shopping online for a dress and being able to feel the texture of the fabric it is made of. Or looking at an advert for a jumper at a train station, touching it and, obviously, buying it instantly. Or think about the possibilities for the B2B market: buyers checking the texture and quality of a product virtually before ordering thousands of items to sell in their stores. And how about feeling the temperature or the climate via your phone? This will add a completely new dimension to booking travel and, who knows, maybe even to virtual travel. Virgin Holidays opened a real-life version of such an experience, a ‘sensory holiday laboratory’ as they called it, last year in Bluewater, where you can stand on a sandy beach, smell the sea and take photographs to share on your social media. Now imagine the same experience in your living room…

The area of research working on making this possible is called HAPTICS, as in haptic (touch) perception. One of the experts in the field is Katherine Kuchenbecker, who runs the Haptics Group at the University of Pennsylvania. In this short video she explains some of the research the group is working on and introduces the term haptography, photography with haptic qualities. How about Instagramming or tweeting a picture of a cat that you can actually stroke?! Oooh!

IBM Research is yet another institution developing this kind of technology. They explain that at first it will take the form of a dictionary, with, for example, silk having a specific vibration definition that a company can use to represent the fabric in its products. Eventually, however, we will be able to touch digitally in real time.
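A "haptic dictionary" of the kind IBM describes could be as simple as a lookup table from a material name to a vibration waveform. The sketch below is a hypothetical Python illustration under that assumption; the material names, pulse timings and amplitudes are invented and are not IBM's format.

```python
# Hypothetical haptic dictionary: each material maps to a waveform given as
# (duration_ms, amplitude_0_to_255) pairs that a phone's vibration motor could play.
HAPTIC_DICTIONARY = {
    "silk":   [(10, 40), (10, 20)] * 8,           # fast, faint pulses -> smooth feel
    "denim":  [(25, 180), (15, 0)] * 6,           # strong pulses with gaps -> coarse weave
    "velvet": [(40, 90), (20, 60), (20, 30)] * 4, # slowly decaying pulses -> soft pile
}

def waveform_for(material):
    """Return the vibration waveform registered for a material, if any."""
    try:
        return HAPTIC_DICTIONARY[material.lower()]
    except KeyError:
        raise ValueError(f"No haptic definition registered for {material!r}")

if __name__ == "__main__":
    pattern = waveform_for("silk")
    print(f"silk -> {len(pattern)} pulses, {sum(d for d, _ in pattern)} ms in total")
```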

Immersion Corporation, founded in 1993, is a pioneer in using haptics to enhance digital experiences. It is developing some really interesting technologies for mobile, gaming and even film and sport. For example, it has created an engine that automatically translates the audio in a game into haptic feedback, and it is working on applying the same idea to video content such as adverts, action movies and sports broadcasts. How would you like to feel as though you’re on the pitch during the World Cup Final?! Soon it will be possible.
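Immersion's engine is proprietary, but the basic idea of deriving vibration strength from the loudness of the soundtrack can be sketched as follows. The 10 ms frame length and the linear loudness-to-amplitude mapping are illustrative assumptions, not the company's algorithm.

```python
import math

def audio_to_haptics(samples, frame_len=441, max_amp=255):
    """Convert an audio signal (floats in -1..1) into per-frame vibration amplitudes.

    Each frame's RMS loudness is mapped linearly onto the 0..max_amp motor range.
    A frame of 441 samples is roughly 10 ms at 44.1 kHz (an illustrative choice).
    """
    amplitudes = []
    for start in range(0, len(samples), frame_len):
        frame = samples[start:start + frame_len]
        rms = math.sqrt(sum(s * s for s in frame) / len(frame))
        amplitudes.append(min(max_amp, int(rms * max_amp)))
    return amplitudes

if __name__ == "__main__":
    # Toy signal: a quiet passage followed by a loud 'explosion'.
    signal = [0.05 * math.sin(i / 5) for i in range(2000)] + \
             [0.9 * math.sin(i / 2) for i in range(2000)]
    haptics = audio_to_haptics(signal)
    print(haptics[:3], "...", haptics[-3:])  # weak vibration first, strong at the end
```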

It all sounds ‘haptastic’, but why would companies invest in it? Immersion Corporation actually did some research on this and found that content with haptics increased viewers’ level of arousal by 25%. From consumer psychology we know that arousal and pleasure are key motivators to purchase, so imagine the effect of haptic content on your sales figures.

They have also tested a metric commonly used in streaming video called quality of experience. Participants watched five minutes of content in one of three conditions: no haptics, haptics mirroring the subwoofer track, and haptics adding to the storytelling. Quality of experience was 10-15% higher in the subwoofer-haptics condition and 25-30% higher in the narrative-haptics condition compared with the no-haptics content. See more of their research here.

So soon we will be able to touch a dress before we buy it, but how about buying perfume or other cosmetics online? Not to worry! Digital scent messaging is already here.

A new invention called the oPhone has just been introduced to the market. It allows you to send scent messages and even create your own scent impressions. There is also an iPhone app called oSnap, which lets you create sensory oNotes to share with your friends. To actually smell your creations, however, your friends will either need an oPhone themselves or have to visit one of the hotspots, currently available only in Paris and New York. One of the founders, Dr. David Edwards, says that the scent vocabulary is for the moment limited to some food-related smells, but it’s only a matter of time before we will be able to watch a movie and smell the beach we see.
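The oPhone's actual message format is not public, but a scent note of the kind oSnap produces can be imagined as an ordered list of aroma tags drawn from a fixed vocabulary. Everything in the Python sketch below, including the aroma names, is a hypothetical illustration.

```python
from dataclasses import dataclass, field

# Hypothetical scent vocabulary, loosely echoing the food-related palette
# Dr. Edwards mentions; the real oPhone cartridge names are not public.
SCENT_VOCABULARY = {"espresso", "cocoa", "caramel", "red_berries", "fresh_bread"}

@dataclass
class ScentNote:
    """An ordered sequence of (aroma, seconds) steps, played back in order."""
    author: str
    steps: list = field(default_factory=list)

    def add(self, aroma, seconds):
        if aroma not in SCENT_VOCABULARY:
            raise ValueError(f"{aroma!r} is not in the device's scent vocabulary")
        self.steps.append((aroma, seconds))

if __name__ == "__main__":
    note = ScentNote(author="kate")
    note.add("espresso", 4.0)
    note.add("caramel", 2.5)
    print(note)
```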

Another inventor in the field is Dr. Adrian Cheok, founder of the Mixed Reality Lab in Singapore and professor of pervasive computing at City University London. He and his team invented a small device called Scentee, which you attach to your smartphone to send various smells to your friends and family. However, you need a separate cartridge for each smell, and the scent vocabulary is currently limited.

Dr. Cheok also works on digital taste, the ability to send tastes via the internet and mobile devices. He presented his work last month at an event called the Circus for the Senses, which took place at the Natural History Museum during Universities Week, and it was very well received. Who wouldn’t want to watch their favourite chef preparing a delicious raspberry Pavlova and be able to taste it immediately? I’m sure you would get up right this second and run off to buy or make one. No, by then you will be able to press a button on your TV or mobile and it will jump out of the screen onto your table! I know, maybe slightly far-fetched, but entirely possible within, I guess, about ten years.

Right now we are impressed that we can download movies and music on our mobiles or buy our groceries online. In five years we will have all this amazing gear available, allowing us to sniff, taste and touch what we see on our screens.

However, people will still want an experience and social connections. This is where augmented reality, and virtual shops and other venues, will come into play. Brands will be able to run virtual shops which people can visit from the comfort of their home. I’m not talking about using an avatar, but about being genuinely immersed in a multi-sensory virtual brand experience. You will be able to walk through the virtual shop, touch the merchandise, smell it and even try it on. Imagine the possibilities for the company to personalise this experience for each individual at the touch of a button! Oh, sorry! This will be automated with state-of-the-art software!

And how about combining this with technologies such as FaceReader, which can read our emotions, and other programmes that react to biological signals like heart rate and level of arousal, to adjust the virtual experience? For example, the computer could detect disgust or another unpleasant emotion on your face and attribute it to a smell you have just perceived, allowing the retailer to change that olfactory experience to a positive one instantly.
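As a rough sketch of that feedback loop, the controller below swaps out the active scent whenever the emotion reader reports a negative reaction. The emotion labels, scent names and both interfaces (the reader and the diffuser) are placeholders I have assumed for illustration, not any vendor's API.

```python
NEGATIVE_EMOTIONS = {"disgust", "anger", "contempt"}

def adjust_scent(current_scent, detected_emotion, fallback_order):
    """Return the scent the virtual shop should diffuse next.

    If the shopper shows a negative emotion while a scent is active, switch to
    the next alternative in the fallback list; otherwise keep the current scent.
    """
    if detected_emotion in NEGATIVE_EMOTIONS:
        remaining = [s for s in fallback_order if s != current_scent]
        return remaining[0] if remaining else "neutral"
    return current_scent

if __name__ == "__main__":
    scent = "leather"
    for emotion in ["neutral", "disgust", "happy"]:
        scent = adjust_scent(scent, emotion, ["leather", "citrus", "vanilla"])
        print(emotion, "->", scent)
```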

And how about online dating? We will be able to sniff pheromones, adding a completely different dimension to the idea of love at first sniff.

Do you know Secret Cinema? These are very secretive events where you can truly experience certain movies by being placed inside a specially created set. Now imagine that you can do it from the comfort of your couch. It will be a bit like 3D with added touch, temperature, scent and taste sensations. It will make you feel like part of the action and, who knows, maybe even let you insert yourself into the plot. That’s true co-creation!

Dr. Cheok certainly shares that view, as reflected in his comment for a CNN article: the ultimate goal is ‘a multi-sensory device unifying all five senses to create an immersive virtual reality’, which ‘could be usable within five years’.

Of course, before this technology becomes widely available and affordable, companies need to create immersive, co-creative multi-sensory consumer experiences in real life. As research in consumer psychology and marketing shows, this can have incredible effects on the consumer-brand relationship and, obviously, the bottom line. Look out for our Sense Reports (coming soon) explaining some of these effects.

See more at: http://stylepsychology.co.uk/digitalmultisensoryconsumerexperience/

Telepresence is coming, are you ready? | Prisma Studio


Merely audiovisual communication will soon be so passé. In the near future we will communicate and share on social media with all five of our senses. Soon you will be able to smell and taste cooking shows, kisses will be delivered by robots, and hugs will be transmitted through a smart pyjama.

Tricking the sense of taste

“We are now living in the information age. But we are moving from transmitting information to sharing experiences, and soon we will also be able to transmit touch, tastes and smells over the network. It will become a whole new kind of augmented reality,” explains Adrian Cheok, professor of information technology at City University London.

Adrian Cheok dreams that we could soon, for example, taste the dishes on TV cooking shows. A first step in that direction is a tongue-mounted simulator developed at the National University of Singapore, which electrically tricks the sense of taste into perceiving, for example, a sour taste:

Adrian Cheok has also helped develop a phone accessory that already lets you send scent messages over the network or wake up to a new morning with your favourite scent in your nose. How about a rose-scented birthday greeting? Or a delicious-smelling, calorie-free meal? Here is a little foretaste, or rather foresmell, of the latter:

Alone together

Future technologies will thus make it possible for us to cook and/or eat together even when we are far apart, because we will be able to share our experiences (the smells and tastes we sense) over the network.

New devices are constantly being developed to shorten physical distance and ease longing. At Osaka University, researchers have built a human-shaped, huggable pillow robot into which you can slip your phone, so you can imagine you are not talking on the phone but wrapped in a tight hug:

The Mixed Reality Lab led by Adrian Cheok, in turn, has developed a hugging pyjama that conveys, say, a travelling parent’s hugs to a child, and the Kissenger robot, which lets you kiss over the internet:

Nor have researchers forgotten pets. Adrian Cheok has helped develop equipment that lets an owner stroke their pet, even a pet rooster, over the network:

Robot marriages are coming. How can a mobile phone transmit scent and touch? Technology is pushing its way into our bodies! More on these visions in Prisma Studio on Wednesday 23 September at 20.00 on TV1.
Digging into the bold claims are futurist Elina Hiltunen, biotechnology researcher Lauri Reuter and psychologist Jukka Häkkinen. The programme is hosted by Marjo Harju.

Source: http://yle.fi/aihe/artikkeli/2015/09/18/etalasnaolo-tulee-oletko-valmis

Sex robots are definitely coming in the future

Sex robots have been examined in films like “Ex Machina” where actress Alicia Vikander played a life-like robot named Ava.

It’s Saturday night, 2050. You switch on some music, turn down the lights and flick the switch to ON. No need for dinner or even a clean shirt because tonight, you’re romancing a robot.

That’s the scenario envisaged by David Levy, author of “Love and Sex with Robots,” who predicts it won’t be long before we’re all doing it — with machines.

“It just takes one famous person to say I had fantastic sex with a robot and you’ll have people queuing up from New York to California,” the CEO of Intelligent Toys Limited told News.com.au. “If you’ve got a robot that looks like a human, feels like a human, behaves like a human, talks like a human, why shouldn’t people find it appealing?”

Pepper, a well-known Japanese humanoid robot, has been hacked for ‘sexual purposes.’

This November, Levy along with Professor Adrian Cheok will chair the second international congress on “Love and Sex with Robots” in Malaysia. The event will bring together academics from around the world to discuss the legal, ethical and moral questions on everything from “teledildonics” to “humanoids”.

 

Levy said the subject has spawned a huge amount of interest since his 2007 book and it’s only a matter of time before the currently “crude” versions available become more sophisticated and go mainstream.

“If there was a sophisticated sex robot around now, then I would be very curious to try it,” he said.

“It can’t be long before we get to the point that there are robots looking very lifelike and with appealing designs that people find appealing to look at and then it’s a question of how long it will take before the artificial intelligence is developed to the point where they can carry on interesting and entertaining conversations?”

Whether you find it horrifying or appealing, there’s no doubt the idea has taken root in popular culture, with films like “Her,” “Lars and the Real Girl” and “Ex Machina” dedicated to the relationship between humans and machines.


This week the makers of Japanese robot Pepper issued a warning, saying using it for “sexual purposes” breaks the rental agreement after people hacked its software to give it “virtual breasts.”

(Left to right) Porn stars Jessica Drake, Asa Akira and Stormy Daniels pose with their RealDolls at the 2015 AVN Adult Entertainment Expo.

Meanwhile, real-life technological advances like David Hanson’s humanlike robots or Hiroshi Ishiguro’s androids have been making robots look more lifelike by the year. Several robotic sex dolls already exist, including RealDoll, made by Californian company Abyss, whose owner David Mills once told Vanity Fair he loves women but “doesn’t really like to be around people.”

But along with advances in artificial intelligence, ethical debate is raging around the use of robots whether in the military, medicine or at home, with many questioning what the rapid advances are doing to our relationships with others and ourselves.

Levy is “absolutely convinced” sex with robots is a positive thing for the “millions and millions” of people around the world who don’t have satisfactory relationships. He thinks they could be the cure for everything from loneliness to pedophilia by helping to “wean” pedophiles off having sex with the children they’re attracted to.

“For whatever reason there are huge numbers of people who just don’t have a relationship with someone they can love and someone who can love them,” he said. “For people like that, I think that sex robots will be a real boon. It will get rid of a problem they’ve got, fill a big void in their lives and make them much happier.”

A male RealDoll is placed in a shipping container at the Abyss Creations factory in San Marcos, California.

It’s a view that has been described as a “terrifying nightmare” by robotics ethicist Dr. Kathleen Richardson. The senior research fellow at De Montfort University recently launched a Campaign Against Sex Robots with fellow researcher Dr. Erik Billing and wants to highlight the kind of inequalities sex robots can perpetuate in real life.

“We’re not for a ban on sex robots. What we’re giving people is information about whether the arguments for sex robots are justified, and we’re asking them to examine their own conscience and whether they want to contribute to this development,” she told News.com.au.

“Everyone thinks because it’s a robot prostitute then real women and children in the industry won’t be harmed. But that’s not happened because if you don’t address the core idea that it’s not OK to reduce some human beings to things then all you do is add a new layer of complexity and complication and distortion to an already distorted relationship.”

While the emerging nature of the technology means long-term effects have not been documented, Dr. Richardson fears widespread use of robots for sex will destroy human capacity for empathy and entrench notions of sex and gender already prevalent in the sex industry.

“Sex can never not be relational. You need another person. If it’s not relational you’re really masturbating,” she said.

In 2013’s “Her,” Joaquin Phoenix falls in love with a bodiless operating system voiced by Scarlett Johansson.
Photo: Warner Bros

These complexities are the kind of moral, ethical and legal quandaries Professor Cheok expects to air at the conference.

The Australian-born digital expert specializes in human-computer interfaces. He thinks robots will be integrated into our lives in the short term as friends, sex objects and care-givers before those relationships develop further, and that they could even offer different levels of compliance for the types of relationships people want to have.

“We really don’t know how human society will react. The worst-case scenario is that people begin to have a robot partner rather than a human partner,” he said, adding that this could happen to a “small percentage of the population” similar to the way people have died after being gripped by the reality of video games.

“There will be some people … that prefer robots over humans but I think that won’t be the majority. I think most people will prefer to have real human relationship.”

Source: http://nypost.com/2015/09/29/sex-robots-are-definitely-coming-in-the-future/

Adrian Cheok Talk at TUT: “Everysense Everywhere Human Communication”


Time: Friday 25.9.2015 at 10.15-12

Place: TUT, Tietotalo, TB109 (Korkeakoulunkatu 1, 33710 Tampere)

Adrian Cheok visited Tampere and gave an inspiring talk entitled “Everysense Everywhere Human Communication”. Cheok was also one of the main speakers at MindTrek 2015. The talk at TUT was organized by the UBINET doctoral network. Adrian Cheok is the Director of the recently established Imagineering Institute, Malaysia, and Professor of Pervasive Computing at City University London. His research covers mixed reality, human-computer interfaces, wearable computers, ubiquitous computing, fuzzy systems, embedded systems and power electronics.


The problem with current interaction with computers is that everything happens behind glass, through a window. In human interaction, nonverbal communication is extremely powerful: around 60% of the information exchanged face-to-face is nonverbal.

Traditionally, interacting with devices has involved visual, auditory and tactile feedback. Adrian Cheok’s group designs for multimodal and multisensory interaction, bringing all the senses into the interaction. The group’s vision is to augment reality with artificially created stimuli: visual, sound, touch, smell and taste. Smell and taste connect to the limbic system, which is linked to memories and feelings, and so hold great potential for emotional design.


Humans can develop new types of communication environments using all the senses, including touch, taste and smell, which can better support multi-person, multimodal interaction and remote presence. Cheok suggested alternative ubiquitous computing environments based on an integrated design of real and virtual worlds. He used Sensorama, the first system to include olfactory feedback, as an example of how long-standing the idea of multisensory interaction is.


Cheok also argues that the possibilities of tactile feedback are underused. In communication between humans, touch is essentially important. Emotional messaging, such as hugging and conveying the feel of a hug through haptic feedback, is one of the ideas his group has tested, along with using haptic feedback for kiss messaging in an application and device called Kissenger.


 

Cheok aims for a new kind of telepresence with all the senses included. One possibility of touch-based communication in the future is that, where touch has traditionally been one-to-one, in digitised form it can be one-to-many.

The RingU concept enables users to remotely send touches, caresses and hugs through haptic feedback in the form of a ring.
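The one-to-many point can be illustrated with a toy fan-out: one squeeze event is serialised once and delivered to every registered ring. The message fields and the transport are assumptions for the sketch, not the actual RingU protocol.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SqueezeEvent:
    """A single touch gesture: who sent it, how hard, and for how long."""
    sender: str
    pressure: float      # 0.0 (lightest) .. 1.0 (strongest)
    duration_ms: int

def broadcast(event, recipients):
    """Fan one squeeze out to many rings.

    In a real system each payload would be sent over the network to the
    recipient's device; here we just return the JSON that would be sent.
    """
    payload = json.dumps(asdict(event))
    return {ring_id: payload for ring_id in recipients}

if __name__ == "__main__":
    hug = SqueezeEvent(sender="mum", pressure=0.7, duration_ms=1500)
    for ring, msg in broadcast(hug, ["ring-ada", "ring-ben", "ring-cleo"]).items():
        print(ring, msg)
```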

Touch matters not only in human-to-human interaction but also when interacting with animals. Cheok introduced a system for conveying the feeling of touch to animals.


Digitizing the senses


Cheok claimed that our brain is living in a virtual reality even now: all perceptions are the brain’s interpretations of physical things. We see the world through the filter of our senses, and the brain’s interpretations can be tricked or digitised.

WP_20150925_11_50_39_ProWP_20150925_11_51_19_Pro

Scentee: a 60-gram device that attaches to your mobile phone and can generate different smells.

Smell can ease and relieve muscular pain, for example neck and back pain, and smells released in an office space could make people feel better. Olfactory sensations require chemical stimuli, but taste neurons can be activated with electrical signals. Adrian Cheok introduced a device that can produce a sour taste through electrical signals.
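To show what "a sour taste through electrical signals" might mean in practice, the sketch below maps a requested sourness level to stimulation parameters. The current range, frequency and linear scaling are purely illustrative assumptions and are not the calibration of Cheok's device; real systems are tuned per user.

```python
# Illustrative, assumed limits for a tongue-stimulation sketch; not measured
# values from the lecture or from any published device specification.
MIN_CURRENT_UA = 20       # microamperes
MAX_CURRENT_UA = 180
BASE_FREQUENCY_HZ = 50

def sour_stimulus(intensity):
    """Scale stimulation current linearly with the requested sourness (0..1)."""
    intensity = max(0.0, min(1.0, intensity))
    current = MIN_CURRENT_UA + intensity * (MAX_CURRENT_UA - MIN_CURRENT_UA)
    return {"current_ua": round(current, 1), "frequency_hz": BASE_FREQUENCY_HZ}

if __name__ == "__main__":
    for level in (0.2, 0.5, 0.9):
        print(level, sour_stimulus(level))
```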


At the end of the lecture, Cheok discussed the possibilities of food messaging, food printing and communication: edible messages, and in the future perhaps even 3D food printing. Children would be able to programme food dishes without heat, fire, knives and all the other dangerous or hard-to-use things. Cheok claimed that, as with every new medium, there will be downsides, but in the end the aim is to increase human happiness.