Scientists from IBM and MIT have partnered to develop technologies that will one day allow artificial intelligence to see and hear like humans. IBM will supply technology and expertise to MIT, which will be in charge of the research itself.
IBM and MIT have set up a laboratory called the ‘Laboratory for Brain-inspired Multimedia Machine Comprehension’, or BM3C for short. The lab is led by Jim DiCarlo, head of MIT’s Department of Brain and Cognitive Sciences, with a team drawn from that department as well as from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). The team is also joined by Watson, IBM’s AI computer system.
Our ability to recognise a person crying as a sign of distress is something that computers aren’t very good at yet. Even IBM admits that these simple tasks, which our brains handle so well, are impossible for current AI to replicate. That is why the partnership between IBM and MIT can be seen as so important.
The goal of this program, roughly speaking, is to give AI the ability to function like our brain. Pattern recognition and prediction will be the toughest problems to solve when it comes to making AI smarter than it is today. Hopefully, through this partnership, AI will be able to replicate the core functions of our brain – and make our lives a little bit easier.