Pattern Recognition: Will Machines Replace This 21st-Century Skill?
Reading Time: 3 minutes

Since the very beginning of decision support and business intelligence systems, there has been a promise of getting the 360-degree view of the customer, of seeing the big picture. The limitations of traditional technologies and persistent data integration problems did not help set the right expectations. Yet the 360-degree view has been a concept since our earliest times. Wise people were those who had integrated all kinds of knowledge and who were skilled in reducing it to simple, universal patterns.

Our ancestors developed skills by combining past experience, intuition, and common sense. People interacted with the world around them and made sense of data from whatever they saw, heard, tasted, touched, or smelled at the moment of their experience.

If a pattern is well known to the brain, it is compared with other stored patterns in order to take action. If the information is novel, however, a mental model must be constructed to either fit this data into existing patterns or discard it. Machines are now capable of serving latency-sensitive applications and building new mental models, something challenging for humans to undertake in the age of information vomit.
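For readers who like to see the idea spelled out, the sketch below is a minimal, purely illustrative toy of that loop: compare an incoming observation against known patterns, act if it matches, otherwise build a new model or discard the data. All names here (MemoryStore, process, the similarity threshold) are hypothetical and not drawn from any particular system.

```python
# Illustrative sketch only: a toy "compare, assimilate, or discard" loop.
from dataclasses import dataclass, field


@dataclass
class MemoryStore:
    """Holds known patterns as simple feature vectors."""
    known_patterns: dict[str, list[float]] = field(default_factory=dict)

    def similarity(self, a: list[float], b: list[float]) -> float:
        # Cosine similarity between two equal-length feature vectors.
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = sum(x * x for x in a) ** 0.5
        norm_b = sum(y * y for y in b) ** 0.5
        return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

    def process(self, observation: list[float], threshold: float = 0.9) -> str:
        # 1. Compare the observation with every known pattern.
        for name, pattern in self.known_patterns.items():
            if self.similarity(observation, pattern) >= threshold:
                return f"act on known pattern '{name}'"
        # 2. Novel but informative: build a new "mental model" (store it).
        if any(observation):
            new_name = f"pattern_{len(self.known_patterns) + 1}"
            self.known_patterns[new_name] = observation
            return f"built new model '{new_name}'"
        # 3. Otherwise discard the data.
        return "discarded"


memory = MemoryStore({"campfire": [1.0, 0.2, 0.0]})
print(memory.process([0.9, 0.25, 0.05]))  # close to "campfire" -> act
print(memory.process([0.0, 0.1, 1.0]))    # novel -> build a new model
```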

Indeed, the ways we interact with our environment have changed. Take this example: an ancestor bathing in a lake near the campfire who found his towel missing would have to hope his voice echoed through the trees back to the cave so he could call for help. If a man is having a shower today and realizes his towel is missing while his wife is listening to music in another room, that is no longer a problem. With the help of Alexa or Google Home, the words “I need a towel” can be broadcast on the man’s TV. And as if by magic, the door opens, his wife’s hand appears with a towel, and she hangs it within reach on the towel rack.

So it is not only about the data. Our interaction with our environment, and the way we expect patterns to be recognized, is also changing.

Moreover, technologies introduced recently are paving the way for this new era of pattern recognition. We now have retinal implants that act as artificial eyes, tracking patterns in places people cannot go. Robotic animals fitted with such devices are tracking animal locomotion and discovering patterns in the wild, which will help capture animal emotions by mimicking the natural movement of their real-life counterparts.

Even beyond that, our big data systems can now predict what will happen next in a photo and turn it into a two-second video. Disney Research can track complex human patterns, such as an actor’s movements and changing expressions, so that the face can be painted with light rather than with physical makeup. This signals how far we have come in analyzing huge volumes of data at low latency, matching the actor’s pose with the displayed image.

Big Data is now taking over the role of our senses by being there at the “moment” of experience and then reporting back to us in real time. This raises the question of our role as humans in this new era. Will it simply remove the need for us to perform mundane tasks, or will it replace our role entirely? In my view, if machines are capable of sensing things for us and delivering a palette of patterns, then our role should be to design the mosaics from these palettes to solve the problems our businesses, and humanity in general, are facing.

How can technologies like data virtualization help us integrate these pattern palettes, produced by machines and so many different sources, at the lowest latency possible? I will address this theme in my upcoming blog post. Watch this space!

Ali Rebaie