Attractor and Recurrent Neural Networks in Grid Pattern Formation

MPhil Thesis Defence


Title: "Attractor and Recurrent Neural Networks in Grid Pattern Formation"

By

Mr. Yuezhang LIU


Abstract

Grid cells in the entorhinal cortex exhibit hexagonal spatial firing patterns, 
which are critical to mammalian navigation. The renaissance of deep learning 
has revived the study of grid pattern formation by training recurrent neural 
networks (RNNs), yet the underlying mechanism remains unclear. In this 
thesis, we aim to build connections between RNNs and a classical model, the 
continuous attractor neural network (CANN). By simplifying the RNN 
architecture and comparing it with the CANN, we show that such models can be 
unified from a band-pass filter perspective. Applying this theory, we 
build a minimal model for grid pattern formation.

On the experimental side, we first train RNNs with different settings to 
verify our claims. An error stabilization phenomenon and a generalization 
failure are discovered in the RNN model. Using a joystick visualization, we 
show that both phenomena are attributable to persistent grid activity near the 
border regions, revealing the distinct boundary dynamics of attractor and 
recurrent neural networks.


Date:  			Tuesday, 17 August 2021

Time:			11:00am - 1:00pm

Zoom meeting: 
https://hkust.zoom.us/j/91080399553?pwd=MDdYOGJWNTJsU3kxaVdGWldEdzl4UT09

Committee Members:	Prof. Bo Li (Supervisor)
 			Dr. Qifeng Chen (Supervisor)
 			Dr. Wei Wang (Chairperson)
 			Dr. Yangqiu Song


**** ALL are Welcome ****