Sentiment analysis and cognitive computing: getting practical!


A few weeks ago I published an article about machines that are able to understand feelings, emotions and personality, and I promised to come back with more details, practical examples and real use cases.

On November 25th I had the honour of presenting the keynote at Codemotion Milano 2016, a unique conference created by software developers for software developers. The main focus was on artificial intelligence, machine learning, chat bots and the Go language. My goal was to introduce the over 2000 attendees to the world of cognitive computing: the possibility of interacting with machines that not only understand sentiment but, more generally, understand natural language (i.e. human language), whether spoken or written; give meaning to images; infer relations among concepts even when they are apparently unconnected; suggest insights, explanations and solutions to problems; and learn from previous interactions with humans or other machines.

I was supposed to be on stage with one of my colleagues, but she was late… Unbelievably, on the stage I found a humanoid robot playing the role of a cognitive assistant: he greeted me, asked me to show him the QR code on my badge so he could identify me, checked whether I really had a meeting with my colleague, and then texted her phone to ask her to come on stage: cool!
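Just to give a flavour of how such a check-in flow could be wired up, here is a little Python sketch: the QR decoding uses OpenCV, while the attendee directory and the send_sms helper are made-up stand-ins for whatever backend the robot actually talks to.

```python
# Sketch of the assistant's check-in flow. OpenCV does the QR decoding;
# ATTENDEES and send_sms() are hypothetical stand-ins for the real backend.
import cv2

ATTENDEES = {
    "BADGE-0042": {"name": "Rossella", "meeting_with": "+390555123456"},
}

def send_sms(number: str, text: str) -> None:
    # Stub: a real deployment would call an SMS or messaging service here.
    print(f"[sms to {number}] {text}")

def check_in(badge_image_path: str) -> None:
    image = cv2.imread(badge_image_path)
    # detectAndDecode returns the decoded text plus geometry we ignore here
    payload, _, _ = cv2.QRCodeDetector().detectAndDecode(image)
    attendee = ATTENDEES.get(payload)
    if attendee is None:
        print("Sorry, I cannot identify you.")
        return
    print(f"Welcome, {attendee['name']}! You do have a meeting scheduled.")
    send_sms(attendee["meeting_with"],
             f"{attendee['name']} is waiting for you on stage.")
```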

By the way, I refer to the robot as “he” because, when I asked “are you a boy or a girl?”, he answered that in Europe he is typically considered a boy, even though in the end he is just a robot 😉

Robots like the one on stage with me have already been employed as hotel concierges, as sales assistants, and to entertain kids and elderly people.
So we’re jumping into an era where robots can understand human language (in fact, multiple languages at the same time); can recognize people by reading QR codes or applying sophisticated face recognition algorithms; can understand emotions by interpreting facial expressions, the tone of voice and the words and sentences used; and can even provide information, directions and suggestions by leveraging unbelievably rich and complex data stores to give hints and recommendations.

Hey hey hey, hold on: a robot like the one that was on stage with me is something like 30 kg of hardware, a quad-core CPU, 4 GB of RAM and 16 GB of SSD… and it is able to do all these things? In real time?
Where does he store all the data? How can he get information from the web or from other data sources? Where does he get the computational power to deal with natural language, big data, data mining, and complex statistical and machine learning algorithms?

Well, you can easily imagine that all this cannot fit into that single robot. The trick is that these robots are typically examples of hybrid clouds: part of the data and processing stays on the robot, while the deep cognitive analysis and heavy data processing happen somewhere in the cloud, reached via a network connection (typically Wi-Fi).
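To make the split concrete, here is a minimal Python sketch of the idea: trivial exchanges are answered on board with no network round trip, and everything else is shipped over HTTP to a cloud service for the heavy lifting. The endpoint URL and the payload shape are, of course, invented for illustration.

```python
# Minimal sketch of the hybrid-cloud split. The cloud endpoint and its
# request/response format are hypothetical; only the pattern matters.
import requests

CLOUD_NLU_URL = "https://example-cognitive-cloud.com/v1/analyze"  # made up

SMALL_TALK = {"hello": "Hello! How can I help you?", "bye": "Goodbye!"}

def handle_utterance(text: str) -> str:
    # On board: trivial intents are answered locally, instantly.
    canned = SMALL_TALK.get(text.strip().lower())
    if canned:
        return canned
    # Off board: anything harder goes to the cloud over Wi-Fi.
    response = requests.post(CLOUD_NLU_URL, json={"text": text}, timeout=5)
    response.raise_for_status()
    return response.json()["reply"]
```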

Can they do something more? Have a look at one of them dancing!

Would you like more examples of how cognitive computing and sentiment analysis are applied?

Let me show you a fun one: for this year’s Met Gala, the fashion company Marchesa created a dress that changed colour according to the emotional feedback provided on Twitter by the followers of the event. The dress was a waterfall of flowers, each equipped with an LED in the middle. The LEDs changed colour according to what was tweeted: lavender for curiosity, rose for joy, aqua for excitement, coral for passion.
So while having fun enjoying the event, you could immediately get a sense of how the event itself was being received by its followers. From the technology point of view, the software behind the curtain had to process all the tweets about the event in real time; understand the natural language of each tweet and correctly interpret the sentiment expressed; aggregate the information; and deliver it to the LED controller. As you can imagine, the software was not embedded in the dress: it was running somewhere in the cloud.
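In Python, the last two steps of that pipeline (aggregation and colour selection) could look something like the sketch below; the per-tweet emotion labels are assumed to arrive from an upstream sentiment-analysis service, and the colour table mirrors the mapping described above.

```python
# Toy version of the tweet-to-LED pipeline: aggregate the emotions detected
# in the latest batch of tweets and pick the dominant emotion's colour.
from collections import Counter

EMOTION_TO_RGB = {
    "curiosity": (230, 190, 255),  # lavender
    "joy": (255, 102, 153),        # rose
    "excitement": (0, 255, 255),   # aqua
    "passion": (255, 127, 80),     # coral
}

def led_colour(recent_emotions: list[str]) -> tuple[int, int, int]:
    if not recent_emotions:
        return (255, 255, 255)  # neutral white while no tweets arrive
    dominant, _ = Counter(recent_emotions).most_common(1)[0]
    return EMOTION_TO_RGB[dominant]

# e.g. led_colour(["joy", "excitement", "joy"]) -> (255, 102, 153), i.e. rose
```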

One more example: this year the first movie trailer built by an artificial intelligence was created (see here): a cognitive engine was trained, by watching several horror movies, to understand what people perceive as scary. Once this training was done, the engine processed the given movie and extracted a six-minute-long set of scenes, sufficient to give an idea of the plot and of the anxiety and thrills you would feel watching the movie as a whole.
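You can picture the extraction step as a simple selection problem: score each scene for “scariness” and fill a six-minute budget with the highest-scoring moments. Here is a toy Python sketch of that idea; in the real system the scores would come from the trained model, not from a hand-written list.

```python
# Toy trailer assembly: greedily keep the scariest scenes that fit in the
# time budget, then play the selected moments back in chronological order.
def pick_trailer_scenes(scenes, budget_seconds=360):
    """scenes: list of (start, end, score) tuples, times in seconds."""
    chosen, used = [], 0.0
    for start, end, score in sorted(scenes, key=lambda s: s[2], reverse=True):
        length = end - start
        if used + length <= budget_seconds:
            chosen.append((start, end, score))
            used += length
    return sorted(chosen)

# e.g. pick_trailer_scenes([(0, 40, 0.2), (300, 360, 0.9), (500, 530, 0.7)])
# keeps the two scariest scenes, in film order.
```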

If you are starting to get scared that this could be the beginning of Skynet or the Matrix, it may make you feel better to know that computers are not able to reason by themselves. They may look like they think like humans, but they are just applying ever more sophisticated algorithms and programs that have to be created by humans: computers are going to work more closely with humans, not replace them!

Do you want to hear more? Stay tuned! 😉

Rossella De Gaetano