"The ability to learn tasks in succession without forgetting is a core component of biological and artificial intelligence," the computer scientists write in the paper. Kirkpatrick says a "significant shortcoming" in neural networks and artificial intelligence has been its inability to transfer what it has learned from one task to the next.Full details at Wired.
The group says it has been able to show "continual learning" that's based on 'synaptic consolidation'. In the human brain, the process is described as "the basis of learning and memory".
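In the paper (Kirkpatrick et al., "Overcoming catastrophic forgetting in neural networks"), this synaptic consolidation takes the form of elastic weight consolidation: when the network trains on a new task, a quadratic penalty pulls each weight back toward the value it had after the old task, scaled by how important that weight was for the old task. Below is a minimal NumPy sketch of that penalty; the function name, variable names, and numbers are illustrative, not taken from DeepMind's code.

    import numpy as np

    def ewc_penalty(theta, theta_star, fisher, lam=1000.0):
        """Quadratic penalty protecting knowledge from an earlier task.

        theta      -- current parameters while training the new task
        theta_star -- parameters learned for the earlier task
        fisher     -- diagonal Fisher information estimate: how much each
                      parameter mattered for the earlier task
        lam        -- how strongly old knowledge is protected (illustrative)
        """
        return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

    # Illustrative usage: the penalty is added to the new task's loss,
    # so unimportant weights (low Fisher value) stay free to change while
    # important ones are held near their earlier-task values.
    theta      = np.array([0.2, -1.3, 0.7])
    theta_star = np.array([0.0, -1.0, 0.5])
    fisher     = np.array([0.01, 2.0, 0.5])  # second weight mattered most
    print(ewc_penalty(theta, theta_star, fisher))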
Full details at Wired.