Adrian Cheok is a 2016 Distinguished Alumni Award recipient in recognition of his achievements and contributions to the fields of computing, engineering and multisensory communication.

Adrian is a pioneer in mixed reality and multisensory communication; his innovation and leadership have been recognised internationally through multiple awards.
His pioneering works in mixed reality include innovative and interactive games such as ‘3DLive’, ‘Human Pacman’ and ‘Huggy Pajama’. He is also the inventor of the world’s first electric and thermal taste machine, which produces virtual tastes with electric current and thermal energy.

Olfactometer

By Adrian David Cheok, Kasun Karunanayaka, Halimahtuss Saadiah, Hamizah Sharoom

Many of the olfactometer implementations available today are expensive and complex to use. This project aims to develop a simple, low-cost, and easily movable laboratory olfactometer that can be used as a support tool for a wide range of experiments in smell, taste, psychology, neuroscience, and fMRI. Olfactometers generally use one of two types of olfactant: solid or liquid odors. Our laboratory olfactometer (shown in Figure 1) will support liquid-based odors, and we may later extend it to handle solid odors. We also plan to develop the system into a combined olfactometer and gustometer.

In this olfactometer design, we use continuous flow (the Lorig design) for good temporal control. The Lorig design is simpler and cheaper than other designs because it uses a minimal number of parts (Lundstrom et al., 2010). Our olfactometer has eight output channels that produce aromas in a precise and controlled manner, and it also delivers a constant humidified flow of pure air. The hardware consists of an oil-less piston air compressor, a filter regulator with mist separator, a 2-color display digital flow switch, check valves, solenoid valves, a manifold, TRIVOT glass tubes, connectors, gas hose clips and PU tubing. The control system consists of an Arduino Pro Mini, a UART converter, a USB cable and a solenoid driver circuit.

The air supply from the oil-less piston air compressor plays an important part in delivering the odor to the subject's nose at a constant pressure. The filter regulator combined with the mist separator ensures that the air is clean and free of contaminants. After the filter, the air flow is metered through a 2-color display digital flow switch. Check valves are connected after the flowmeter to ensure that the air flows in only one direction. After the check valves, the tubing is connected to nine male fitting connectors, which fit directly into the eight outputs of the manifold, and eight normally closed solenoid valves are connected to the top of the manifold. The olfactometer can be controlled manually from a computer to send an odor to the nose. A second 2-color display digital flow switch is connected after the solenoids to keep the air flow between 3 LPM and 5 LPM (to avoid any discomfort to the subject's nose). When a solenoid valve is switched on by the computer, air passes through the corresponding glass bottle, which can be verified by watching the air bubble inside the bottle. Finally, the air flow carries the odor from the liquid (or solid) and passes through a check valve before reaching the nose.
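The computer-side control described above, in which the computer switches individual solenoid valves on and off, could be sketched as follows. The one-byte bitmask protocol, the function names, and the serial settings are our own illustrative assumptions, not the actual Arduino firmware interface.

```python
import time

# Hypothetical single-byte protocol: bit i = 1 opens solenoid valve i (0-7).
def valve_command(open_valves):
    """Encode a set of open valve indices (0-7) as one command byte."""
    byte = 0
    for v in open_valves:
        if not 0 <= v <= 7:
            raise ValueError(f"valve index out of range: {v}")
        byte |= 1 << v
    return bytes([byte])

def odor_pulse(port, valve, duration_s):
    """Open one valve for duration_s seconds, then close all valves."""
    port.write(valve_command([valve]))   # open the selected odor channel
    time.sleep(duration_s)               # let the odorized air flow
    port.write(valve_command([]))        # close everything (pure-air baseline)
```

With pyserial, `port` would be something like `serial.Serial("/dev/ttyUSB0", 9600)` connected through the UART converter; the port name and baud rate here are placeholders.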

The Laboratory Olfactometer

Magnetic Dining Table and Magnetic Foods

By Nur Ellyza Binti Abd Rahman*, Azhri Azhar*, Murtadha Bazli, Kevin Bielawski, Kasun Karunanayaka, Adrian David Cheok

In our daily life, we use the five basic senses to see, touch, hear, taste and smell. By engaging several of these senses concurrently, multisensory interfaces create immersive and playful experiences, and as a result they are becoming a popular topic in academic research. Virtual Food Court (Kjartan Nordbo et al., 2015), Meta Cookie (Takuji Narumi et al., 2010) and Co-dining (Jun Wei et al., 2012) are a few interesting prior works in the field. Michel et al. (2015) revealed that dynamic changes in the weight of cutlery influence the user's perception and enjoyment of food: the heavier the utensils, the more the flavour is enhanced. In line with this, we present a new multisensory dining interface called ‘Magnetic Dining Table and Magnetic Foods’.

‘Magnetic Dining Table and Magnetic Foods’ introduces new human-food interaction experiences by controlling the utensils and food on the table: modifying their weight, levitating, moving and rotating them, and dynamically changing their shapes (food only). The proposed system is divided into two parts, a controlling part and a controlled part. The controlling part consists of three components: 1) the dining table, 2) an array of electromagnets and 3) a controller circuit. The controlled part consists of two components: 1) magnetic utensils and 2) magnetic foods. The array of electromagnets is placed underneath the table, and the controller circuit controls the field produced by each electromagnet, thereby indirectly controlling the utensils and food on the table. To make edible magnetic food, ferromagnetic materials such as iron and iron oxides (Alexis Little, 2016) will be used. We expect that this interface will positively modify taste and smell sensations, food consumption behaviours, and human-food interaction experiences.
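As a rough illustration of how a controller circuit might translate a desired utensil position into drive levels for the electromagnet array, consider the sketch below. The 4x4 grid, coil spacing, and linear falloff model are assumptions of ours for illustration, not the project's actual design.

```python
# Illustrative mapping from a target position on the table to per-coil
# PWM duty cycles for a grid of electromagnets (assumed layout).
COIL_SPACING_CM = 5.0
GRID_SIZE = 4  # assumed 4 x 4 array of electromagnets

def coil_duty_cycles(target_x, target_y, radius_cm=6.0):
    """Return a GRID_SIZE x GRID_SIZE matrix of duty cycles (0.0-1.0).

    Coils near the target position are driven hardest, pulling a
    ferromagnetic utensil or food item toward (target_x, target_y).
    """
    duties = []
    for row in range(GRID_SIZE):
        row_duties = []
        for col in range(GRID_SIZE):
            cx, cy = col * COIL_SPACING_CM, row * COIL_SPACING_CM
            dist = ((cx - target_x) ** 2 + (cy - target_y) ** 2) ** 0.5
            # Linear falloff: full power at the target, zero beyond radius.
            row_duties.append(max(0.0, 1.0 - dist / radius_cm))
        duties.append(row_duties)
    return duties
```

Sweeping the target position over time with this mapping would drag an object across the table; levitation and shape change would need a more elaborate field model than this linear falloff.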

Magnetic Dining Table and Magnetic Foods

Bench of Multi-sensory Memories

By Stefania Sini, Nur Ain Mustafa, Hamizah Anuar, Adrian David Cheok

What if cities had dedicated urban interfaces in public spaces that invite people to share stories and memories of public interest, and facilitate the creation of a public narration? What if people could share and access these stories and memories while chatting with a public bench? Would the interaction with the bench provide a meaningful, memorable and playful experience of a place?

The Bench of Multi-sensory Memories is an urban interface whose objective is to investigate the role of urban media in placemaking. It mediates the creation of a public narration, and affords citizens a playful and engaging interface to access and generate the stories and memories that form this narration.

The bench was designed and fabricated in collaboration with the Malaysian artist Alvin Tan, who has experience with bamboo installations in public spaces. Its structure is robust and allows all the hardware components to be housed easily and safely. The hardware and software system consists of: a) input devices, a USB microphone and Force Sensitive Resistor (FSR) sensors; b) an Analog-Digital/Digital-Analog (AD/DA) converter module board; c) a controller, a Raspberry Pi 3; d) an output device, a speaker; and e) the software, built on the Google Speech API. The components operate as follows: the FSR sensors detect the presence of a person on the bench through physical pressure and weight; the AD/DA converter module board reads the analogue values of the FSR sensors and converts them into digital values readable by the Raspberry Pi; and the Raspberry Pi, which offers Wi-Fi, Ethernet, Bluetooth, USB, HDMI and an audio jack, connects the input and output devices. Currently, the system software implements speech applications such as text-to-speech and speech-to-text: the Google Speech API generates a voice from text, and records and transcribes speech, through the output and input devices. At the moment, therefore, the system performs a scripted sequence of text-to-speech and speech-to-text translations. In the short term, we plan to employ a custom chatbot that can conduct interactive and meaningful conversations.
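The presence-detection step described above could be sketched as follows. The 10-bit ADC range, the threshold, and the debounce scheme are our own assumptions for illustration; the real bench may tune these quite differently.

```python
# Debounced seat-occupancy detection from raw FSR readings, assuming a
# 10-bit ADC (values 0-1023) behind the AD/DA converter board.
PRESENCE_THRESHOLD = 400   # raw ADC level above which we assume someone sits
DEBOUNCE_SAMPLES = 3       # consecutive samples required to change state

class PresenceDetector:
    """Track whether someone is sitting on the bench, with debouncing."""

    def __init__(self):
        self.present = False
        self._streak = 0

    def update(self, adc_value):
        """Feed one raw ADC sample; return True while a sitter is detected."""
        above = adc_value >= PRESENCE_THRESHOLD
        if above != self.present:
            self._streak += 1
            if self._streak >= DEBOUNCE_SAMPLES:
                self.present = above   # state flips after a stable streak
                self._streak = 0
        else:
            self._streak = 0
        return self.present
```

In the full system, a transition to `present == True` would start the scripted text-to-speech greeting and begin listening for speech.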

Bench of Multi-sensory Memories

Multi-sensory Story Book For Visually Impaired Children

By Edirisinghe Chamari, Kasun Karunanayaka, Norhidayati Podari, Adrian David Cheok

The reading experience of children is enriched by visual displays. Researchers suggest that through picture-book experiences, children develop socially, intellectually, and culturally. However, the beauty of reading is an experience that sighted children enjoy naturally, and that visually impaired children struggle with. Our multi-sensory book is an attempt to create a novel reading experience specifically for visually impaired children. While a sighted person's mental images are constructed through visual experiences, a visually impaired person's mental images are a product of haptics and sounds. Our book introduces multi-sensory interactions through touch, smell, and sound. The concept also aims to address the lack of appropriately designed technologies for visually impaired children.

Our book, titled “Alice and her Friend”, folds out to reveal a story about a cat whose activities are presented with multi-sensory interactions. The book has six pages, with different sensors and actuators integrated into each page. The pages were designed with textures, braille, large-font text, sounds, and smells. With this book, we believe we have contributed a new reading experience to the efforts of visually impaired children to understand the beauty of the world.
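The per-page sensing and actuation described above could be organised as in the sketch below. The page-to-actuator mapping, the file names, and the callback API are all hypothetical, for illustration only; the actual book's wiring and content differ.

```python
# Illustrative per-page actuation table for the multi-sensory book:
# each page triggers a sound and, optionally, a smell when opened.
PAGE_ACTIONS = {
    1: {"sound": "cat_meow.wav", "smell": None},
    2: {"sound": "purring.wav", "smell": "grass"},
    # ... pages 3-6 would be defined the same way
}

def on_page_opened(page, play_sound, release_smell):
    """Trigger the actuators configured for the opened page.

    play_sound and release_smell are callbacks wired to the real
    hardware drivers (speaker, scent-release module).
    """
    actions = PAGE_ACTIONS.get(page)
    if actions is None:
        return False
    if actions["sound"]:
        play_sound(actions["sound"])
    if actions["smell"]:
        release_smell(actions["smell"])
    return True
```

A page-open sensor (for example, a light sensor or switch on each page) would call `on_page_opened` with the detected page number.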

A Picture Book for Visually Impaired Children