Understanding the computation of time using neural network models: a study led by Prof. Changsong Zhou
Researchers at HKBU Physics reveal coding principles of time in neural networks that "can potentially be used to develop new machine learning algorithms that put emphasis on the timing dimension", as commented in a "News & Views" article in Nature Machine Intelligence in September 2020.
Processing time is important in almost all tasks in everyday life, for example in sports, speaking, dancing, or playing music. The study considers three types of time processing: quantifying elapsed time, memorizing short intervals, and forecasting when an upcoming event may occur. A recent study by Prof. Changsong Zhou from the Department of Physics, Centre for Nonlinear Studies, and Institute of Computational and Theoretical Studies, and his collaborator Dr. Zedong Bi from Qingdao University, described the encoding principles of these three types of time using computational modelling. They trained artificial recurrent neural networks on various timing and non-timing tasks, similar to those used in animal experiments, and studied the neural population activity patterns and structure of the trained networks to reveal coding principles that have also been observed in neuroscience experiments. The work has been published in Proceedings of the National Academy of Sciences USA, 117, 10530–10540 (2020).
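One classic candidate code for elapsed time in such networks is "ramping" activity, where a neuron's firing level grows steadily during an interval so that elapsed time can be read off from the activity level. The following is a minimal illustrative sketch of this idea, assuming a simple leaky-integrator unit; it is not the authors' actual trained recurrent network model:

```python
import numpy as np

def ramp_activity(n_steps, drive=0.05, leak=0.01):
    """Leaky integrator: r[t] = r[t-1] + drive - leak * r[t-1].

    With a constant input drive, activity rises monotonically toward
    the fixed point drive / leak, producing a ramp whose level encodes
    how much time has elapsed since the interval began.
    """
    r = np.zeros(n_steps)
    for t in range(1, n_steps):
        r[t] = r[t - 1] + drive - leak * r[t - 1]
    return r

r = ramp_activity(200)
# The ramp increases at every step, so a downstream readout could
# decode elapsed time by comparing r[t] against a learned threshold.
assert np.all(np.diff(r) > 0)
```

Because the activity is strictly increasing and bounded, each activity level corresponds to a unique elapsed time within the interval, which is what makes a threshold readout possible.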
Recently, this work was highlighted in a "News & Views" article in Nature Machine Intelligence, vol. 2, 492–493 (14 September 2020). The authors, Hugo Merchant and Oswaldo Pérez, commented that "Overall, this interesting study provides specific predictions on how the three times are processed by neural networks across different temporal and non-temporal contexts, which in turn can be tested in well-designed neurophysiological experiments. Moving forward, the neural computations behind temporal processing in its three forms should be key for new developments in artificial intelligence and machine learning".