Psychology 209 – Winter 2017, March 9, 2017
Successes, Limitations, and Future Directions for Neural Network Models of Cognition
What cool things can neural networks learn to do?
Classify pictures of objects
Translate from one language to another, even without direct experience with the particular language pair
Learn a strategy for searching through a random graph
What are they still struggling to do?
Lake et al.: pattern recognition vs. model building
Cognition is about using these models to understand the world, to explain what we see, to imagine what could have happened that didn't, or what could be true that isn't, and then planning actions to make it so.
Start-up software: intuitive physics and intuitive psychology
Intuitive physics: Infants have primitive object concepts that allow them to track objects over time and to discount physically implausible trajectories – e.g., they know that objects persist over time and that they are solid and coherent.
Intuitive psychology: Infants understand that other people have mental states like goals and beliefs, and this understanding strongly constrains their learning and predictions.
Learning as model building
Explaining observed data through the construction of causal models of the world. 'Early-present capacities for intuitive physics and psychology are also causal models of the world.' A primary job of learning is to extend and enrich these models and to build analogous causally structured theories of other domains.
Human learning is richer and more efficient than state-of-the-art machine-learning algorithms.
Compositionality and learning-to-learn are ingredients that make this kind of rapid model learning possible.
Model-based and model-free methods
Using a model is cumbersome and slow; model-free reinforcement learning can allow real-time control. Humans combine model-based (MB) and model-free (MF) methods, both competitively and cooperatively.
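To make the model-based / model-free contrast concrete, here is a minimal sketch (not from the lecture) on a hypothetical 5-state chain task: tabular Q-learning estimates action values directly from sampled transitions, while the model-based learner fits a transition and reward model and then plans over it with value iteration. All names, sizes, and hyperparameters are illustrative assumptions.

```python
# Illustrative toy example: model-free vs. model-based value estimation
# on a tiny 5-state chain MDP (hypothetical, not from the lecture).
import numpy as np

n_states, n_actions, gamma = 5, 2, 0.9
rng = np.random.default_rng(0)

# Ground-truth dynamics: action 0 moves left, action 1 moves right; reward 1 on reaching the last state.
def step(s, a):
    s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
    reward = 1.0 if s_next == n_states - 1 else 0.0
    return s_next, reward

# --- Model-free: tabular Q-learning updates action values directly from experience. ---
Q = np.zeros((n_states, n_actions))
s = 0
for _ in range(10000):
    a = int(rng.integers(n_actions))       # random behavior policy; Q-learning is off-policy
    s_next, r = step(s, a)
    Q[s, a] += 0.1 * (r + gamma * Q[s_next].max() - Q[s, a])   # temporal-difference update, no model needed
    s = 0 if s_next == n_states - 1 else s_next                # reset episode at the goal

# --- Model-based: fit a transition/reward model, then plan over it with value iteration. ---
T = np.zeros((n_states, n_actions, n_states))
R = np.zeros((n_states, n_actions))
for s_ in range(n_states):
    for a_ in range(n_actions):
        s_next, r = step(s_, a_)           # one sample per (s, a) suffices here because dynamics are deterministic
        T[s_, a_, s_next] = 1.0
        R[s_, a_] = r

V = np.zeros(n_states)
for _ in range(100):                       # value iteration over the learned model
    V = (R + gamma * T @ V).max(axis=1)

print("Model-free Q-values:\n", Q.round(2))
print("Model-based state values:", V.round(2))
```

The trade-off on the slide shows up directly: the model-free learner needs many sampled transitions but acting is just a table lookup, whereas the model-based learner can plan from far less experience at the cost of running value iteration over its model.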
Two challenges: the Characters challenge and the Frostbite challenge
One example supports:
Classification of new examples
Generation of new examples
Parsing an object into its parts
Generation of new concepts from related examples
Lake et al.'s solution: Bayesian Program Learning (BPL), a generative model that builds characters compositionally from strokes and sub-strokes.
How might you address this challenge using neural networks?
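One possible neural-network answer, sketched below under stated assumptions, is metric learning in the spirit of siamese or prototypical networks: train an embedding on many known characters, then classify a brand-new character from a single support example by nearest prototype. The architecture, sizes, and random stand-in data here are assumptions for illustration, not Lake et al.'s model.

```python
# Hypothetical sketch: one-shot classification by nearest class prototype in a
# learned embedding space (siamese / prototypical-network style).
import torch
import torch.nn as nn

class Embedder(nn.Module):
    """Small convolutional encoder mapping a 28x28 character image to a vector."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(64 * 7 * 7, dim),
        )

    def forward(self, x):
        return self.net(x)

embed = Embedder()

# One support example per class (5-way, 1-shot) plus a query image; random stand-ins.
support = torch.randn(5, 1, 28, 28)   # one image for each of 5 novel characters
query = torch.randn(1, 1, 28, 28)

with torch.no_grad():
    prototypes = embed(support)            # (5, dim): one prototype per class
    q = embed(query)                       # (1, dim)
    dists = torch.cdist(q, prototypes)     # distance from the query to each prototype
    prediction = dists.argmin(dim=1)

print("Predicted class for the query:", prediction.item())
```

Trained episodically on alphabets it has already seen, this kind of embedding can support one-shot classification; the generation and parsing tasks on the previous slide would need additional generative machinery (e.g., a decoder or a stroke-level model).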
DQN learns Frostbite slowly – people can do well from brief instruction or from watching a good player:
Construct an igloo
Jump on white ice floes
Gather fish
Don't fall in the water
Avoid geese and polar bears
How might you address this challenge using neural networks?
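For reference, here is a minimal sketch of the kind of agent being criticized, assuming the standard DQN setup: a convolutional Q-network over stacked frames plus epsilon-greedy action selection. The observation shape, the 18-action set, and all parameters are illustrative assumptions; a full agent would also need replay memory, a target network, and an Atari environment, and faster-than-DQN learning would require extra ingredients such as object-level priors, a learned model used for planning, or learning from demonstrations.

```python
# Hypothetical, stripped-down DQN-style core for an Atari-like game such as Frostbite.
import random
import torch
import torch.nn as nn

N_ACTIONS = 18                     # full Atari action set (assumption)

class QNetwork(nn.Module):
    """Maps a stack of 4 grayscale 84x84 frames to one Q-value per action."""
    def __init__(self, n_actions=N_ACTIONS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, 8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, 3, stride=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, 512), nn.ReLU(),
            nn.Linear(512, n_actions),
        )

    def forward(self, x):
        return self.net(x)

def select_action(q_net, obs, epsilon):
    """Epsilon-greedy: explore randomly with probability epsilon, otherwise act greedily."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        q_values = q_net(obs.unsqueeze(0))          # add a batch dimension
        return int(q_values.argmax(dim=1).item())

q_net = QNetwork()
fake_frame_stack = torch.rand(4, 84, 84)            # stand-in for a preprocessed observation
print("Chosen action:", select_action(q_net, fake_frame_stack, epsilon=0.1))
```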
Emergent intelligence vs. built-in intelligence
Built-in:
May be easier to create
Since it is designed, it is likely to be easier to understand
You need to build in just the right structure for what you want to learn to fit within it
May not deal well with quasiregularity
Emergent:
Not as easy to create
Not as easy to understand
Deals with quasiregularity
Involves less prior commitment to structure
What other challenges can you envision?