A team of researchers has developed a computer program that can learn to decipher sounds the way a baby does.
The impetus behind the program was to better understand how people learn to talk, or more specifically, to see whether language is hard-wired in the brain.
Tests of the computer model back up a theory that babies learn to speak by sorting through different sounds until they understand the structure of a language, according to James McClelland, a psychology professor at Stanford University. McClelland wrote a paper on the subject that appeared in the Proceedings of the National Academy of Sciences, and was quoted in an article from Reuters.
McClelland's team found that the computer could track vowel sounds just as a baby does. "In the past, people have tried to argue it wasn't possible for any machine to learn these things," he said in the Reuters article.