AI Simulating the Human Body — Part 2

By Alaa Eljatib

In part one of this series, I introduced a different approach to understanding AI: through the lens of the human body and the simulation of the five senses. I covered how AI simulates hearing and sight; now I'll discuss how it applies to smell, touch, and taste.

Smell

Imagine being able to smell, from your living room, a perfume you see in a commercial, or to send the scent of a rose over a messaging application. Currently, the main challenge is accurately replicating an odour and delivering it to another person in a different space. How can AI tell a fishy scent from a sweet one? How can we encode a scent and then decode it to regenerate the same smell, the way image processing encodes what we see and decodes it into a digital image? These are questions we still need to answer.
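
To make the encoding question concrete, here is a minimal sketch of the "encoding" half of digital smell, assuming a hypothetical electronic nose that exposes readings from an array of chemical sensors. A simple classifier maps each feature vector to a scent label; all sensor values and labels below are invented for illustration.

```python
# A sketch of scent classification from a hypothetical electronic nose:
# each scent sample is a feature vector of chemical-sensor responses,
# and a nearest-neighbour classifier maps it to a scent label.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each row: responses from 4 hypothetical chemical sensors for a known scent.
training_readings = np.array([
    [0.9, 0.1, 0.3, 0.2],   # fishy sample
    [0.8, 0.2, 0.4, 0.1],   # fishy sample
    [0.1, 0.9, 0.2, 0.7],   # sweet sample
    [0.2, 0.8, 0.1, 0.8],   # sweet sample
])
training_labels = ["fishy", "fishy", "sweet", "sweet"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(training_readings, training_labels)

# "Decoding" a new reading: the predicted label is what we would transmit
# to the receiving device, which would then synthesize the matching scent.
new_reading = np.array([[0.85, 0.15, 0.35, 0.15]])
print(model.predict(new_reading))  # -> ['fishy']
```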

Touch

Simulating this sense may be the most thoroughly explored of all, especially after the advent of the touch screens we now see on all our devices, from smartphones and tablets to refrigerator doors. The most radical and useful application of AI and touch is in healthcare: for example, a robot that reproduces a surgeon's movements and force of touch to operate on a patient. Surgeons can use AR (Augmented Reality) technology to simulate an operating room while wearing gloves containing sensors. When the surgeon moves their fingers, that information is transmitted to the robot, which mimics the exact motions on a patient in another room or on the other side of the world.
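
As a rough illustration of that data path, here is a minimal sketch, with entirely hypothetical field names and values, of how one time step of glove readings might be serialized and replayed on the robot side.

```python
# A sketch of the remote-surgery data path: glove sensor readings are
# packaged into a message and "replayed" by a remote robot. All field
# names, units, and values here are hypothetical.
import json
from dataclasses import dataclass, asdict

@dataclass
class FingerReading:
    finger: str           # which finger the sensor is on
    position: tuple       # (x, y, z) in millimetres
    force_newtons: float  # applied force measured by the glove

def encode_frame(readings):
    """Serialize one time step of glove readings for transmission."""
    return json.dumps([asdict(r) for r in readings])

def replay_on_robot(frame):
    """Stand-in for the robot side: decode and mimic each motion."""
    for r in json.loads(frame):
        print(f"robot: move {r['finger']} to {r['position']} "
              f"with {r['force_newtons']:.2f} N")

frame = encode_frame([
    FingerReading("index", (12.0, 4.5, 3.1), 0.8),
    FingerReading("thumb", (10.2, 2.0, 1.5), 1.2),
])
replay_on_robot(frame)
```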

Another AI application in healthcare is measuring a person's heart rate or blood pressure when they place a finger over a camera lens. This is another example of Camera IPA (Image Processing and Analysis). You can also measure blood oxygen saturation using a camera's flash. More precise measurements can be achieved with external sensors connected to a phone via Bluetooth, but most current mobile apps measure heart rate, blood pressure, and oxygen saturation using only the smartphone's camera or flash, without any third-party external sensors.
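
The camera-based heart-rate trick works because blood flow subtly modulates the brightness of the fingertip in each video frame (a technique known as photoplethysmography). Below is a minimal sketch of the idea; the "camera signal" is synthesized here, whereas a real app would average the red channel of each frame.

```python
# A sketch of camera-based heart-rate estimation (photoplethysmography):
# the pulse shows up as a periodic brightness change in the red channel,
# so the dominant frequency of that signal is the heart rate.
import numpy as np

FPS = 30.0                     # assumed camera frame rate
t = np.arange(0, 10, 1 / FPS)  # 10 seconds of frames

# Simulated red-channel means: a 1.2 Hz pulse (72 bpm) plus noise.
rng = np.random.default_rng(0)
signal = 0.05 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.01, t.size)

# Remove the DC offset, then find the strongest frequency in the pulse band.
signal = signal - signal.mean()
freqs = np.fft.rfftfreq(signal.size, d=1 / FPS)
spectrum = np.abs(np.fft.rfft(signal))
band = (freqs >= 0.7) & (freqs <= 3.0)   # roughly 42-180 bpm
pulse_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated heart rate: {pulse_hz * 60:.0f} bpm")
```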

All of the applications mentioned above attempt to facilitate interaction between humans and machines. However, more capable and intelligent machines are needed to take this application of AI to the next level, such as letting you poke your phone screen and having another user feel that poke on theirs.

Taste

The main problem here is identifying a taste and representing it in a mathematical model, which can then be encoded in binary. How can we identify and simulate sweetness, along with its intensity? Solving this would essentially allow us to transmit what one person is tasting from one device to another. Imagine being able to taste a dish while watching it on a cooking show. We're still a long way from achieving this.
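
As one possible shape for that mathematical model, here is a minimal sketch that represents a taste as a vector of intensities over the five basic tastes and packs it into bytes for transmission. The 0-255 scale per taste is an assumption made purely for illustration.

```python
# A sketch of a taste vector: intensities over the five basic tastes,
# packed into a compact binary payload for transmission.
import struct

BASIC_TASTES = ("sweet", "salty", "sour", "bitter", "umami")

def encode_taste(intensities):
    """Pack five 0-255 intensities into 5 bytes."""
    values = [int(intensities.get(t, 0)) for t in BASIC_TASTES]
    return struct.pack("5B", *values)

def decode_taste(payload):
    """Unpack 5 bytes back into a taste vector."""
    return dict(zip(BASIC_TASTES, struct.unpack("5B", payload)))

# A hypothetical "very sweet, slightly salty" dessert:
payload = encode_taste({"sweet": 220, "salty": 30})
print(payload.hex())         # the binary representation to send
print(decode_taste(payload)) # what the receiving device reconstructs
```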

One of the initial attempts to simulate taste is the Japanese electric fork, which uses weak electrical currents to create a salty taste. Health-conscious individuals, or those with medical conditions, can cut down on the amount of salt they consume without sacrificing the flavour of their food.

Future hardware advancements, such as chemical sensors built into smartphones, could spur software that better simulates taste. For example, we could build a machine learning application that evaluates the sweetness, saltiness, and so on of a dish and suggests ingredients to add based on your taste preferences.
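
Here is a minimal sketch of how such an app might work, assuming a phone with built-in chemical sensors: a regression model maps hypothetical sensor readings to a saltiness score, and the app compares that score against the user's preference. All readings, scores, and thresholds below are invented.

```python
# A sketch of an ML-based taste assistant: a regression model maps
# hypothetical chemical-sensor readings to a saltiness score, then the
# app suggests an adjustment based on the user's preference.
import numpy as np
from sklearn.linear_model import LinearRegression

# Training data: 3 hypothetical sensor readings per dish -> saltiness (0-10).
sensor_readings = np.array([
    [0.1, 0.3, 0.2],
    [0.5, 0.4, 0.6],
    [0.9, 0.8, 0.7],
])
saltiness_scores = np.array([2.0, 5.0, 9.0])

model = LinearRegression().fit(sensor_readings, saltiness_scores)

def suggest(reading, preferred_saltiness=6.0):
    """Score a new dish and compare it to the user's preferred saltiness."""
    score = model.predict(np.array([reading]))[0]
    if score < preferred_saltiness - 1:
        return f"saltiness {score:.1f}: consider adding a pinch of salt"
    if score > preferred_saltiness + 1:
        return f"saltiness {score:.1f}: consider diluting the dish"
    return f"saltiness {score:.1f}: right in your preferred range"

print(suggest([0.3, 0.3, 0.4]))
```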

AI is a multi-faceted field with many subfields we could dive into, including fuzzy logic, natural language processing, neural networks, expert systems, and the like. If there are any topics you'd like me to cover, leave a comment and I'll address them in my next blog post.


About the author

Alaa Eljatib was fascinated by the possibilities and capabilities of AI, and pursued a master's and Ph.D. in AI at Damascus University in Syria. After leaving his home in 2016, he joined TribalScale as an Agile Software Engineer, bringing his wealth of experience in AI and engineering.

Connect with TribalScale on Twitter, Facebook & LinkedIn!

Visit Us on Medium
