Machine learning is the newest thing at BYU, thanks to the work of engineer Dah-Jye Lee, who has created an algorithm that allows computers to learn without human help. According to Lee, his algorithm differs from others in that it doesn’t specify for the computer what it should or shouldn’t look for. Instead, his program simply feeds images to the computer, letting it decide on its own what is what.
Much as children intuitively learn to tell objects apart in the world around them, Lee shows the computer various images without differentiating between them; the computer is tasked with doing that on its own. According to Lee:
“It’s very comparable to other object recognition algorithms for accuracy, but we don’t need humans to be involved. You don’t have to reinvent the wheel each time. You just run it.”
Nancy Houser, writing for Digital Journal, surveys the landscape of cognitive computing and provides a lucid view of what to expect over the next five years.
A key theme is that of computers moving from glorified calculators to thinking machines that augment all five of your senses.
For example, imagine computers responding to your inquiry with richer, more valuable outputs like the texture of fabric or the aroma of fresh cut grass.
What’s making it possible to enhance and extend our senses with intelligence is the convergence of computer vision, machine learning, big data, speech recognition, natural language processing, smartphones and biometric sensors.
One hurdle to humanizing computers has been making the leap from performing predefined tasks based on programmed logic to handling new tasks autonomously, without dedicated software already in place (i.e., unsupervised machine learning).
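To make the idea of unsupervised learning concrete, here is a minimal sketch of one classic technique, k-means clustering: the program is never told which group any data point belongs to, yet it discovers the grouping on its own. The toy 2-D "feature" points below are purely illustrative and are not Lee's actual algorithm or data.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Group points into k clusters with no labels provided."""
    rng = np.random.default_rng(seed)
    # Start with k randomly chosen data points as centroids.
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        centroids = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

# Two well-separated toy clusters standing in for image features.
rng = np.random.default_rng(1)
a = rng.normal(loc=(0.0, 0.0), scale=0.1, size=(20, 2))
b = rng.normal(loc=(5.0, 5.0), scale=0.1, size=(20, 2))
points = np.vstack([a, b])

labels, centroids = kmeans(points, k=2)
```

After running, all points drawn near (0, 0) end up sharing one label and all points near (5, 5) the other, even though the code was never told there were two kinds of points, which is the essence of learning "without human help."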
Toward bridging that gap, Houser discusses the promising work of Prof. Chris Eliasmith on efforts to reverse-engineer the biology of the mammalian brain in software.
In the past, human brain structure and operation have been replicated in fine detail to accomplish tasks like handwriting recognition, adding via counting, and even interpreting patterns associated with higher-level intelligence.
Now, by adding interfaces that sense touch, pressure and heat, we’re beginning to see more human-like computers that can park cars and perform biometric security functions. IBM is already at work on cognitive computing applications for retailers.
In some respects, cognitive computing could be the near-term forerunner of the cyborg singularity speculated about in the wake of Google's hire of Ray Kurzweil.
So, how do you see, hear, smell, touch, taste cognitive computing in the future of your business?