Neurocomputational Models of Interval and Pattern Timing

Read::

```dataview
TABLE WITHOUT ID
file.link as "Related Files",
title as "Title",
type as "Type"
FROM "" AND -"ZZ. planning"
WHERE citekey = "hardyNeurocomputationalModelsInterval2016"
SORT file.cday DESC
```

Abstract

Most of the computations and tasks performed by the brain require the ability to tell time, and process and generate temporal patterns. Thus, there is a diverse set of neural mechanisms in place to allow the brain to tell time across a wide range of scales: from interaural delays on the order of microseconds to circadian rhythms and beyond. Temporal processing is most sophisticated on the scale of tens of milliseconds to a few seconds, because it is within this range that the brain must recognize and produce complex temporal patterns, such as those that characterize speech and music. Most models of timing, however, have focused primarily on simple intervals and durations; thus, it is not clear whether they will generalize to complex pattern-based temporal tasks. Here, we review neurobiologically based models of timing in the subsecond range, focusing on whether they generalize to tasks that require placing consecutive intervals in the context of an overall pattern, that is, pattern timing.

Quick Reference

Top Comments

Let’s say grey is for overall comments. This is actually not super useful to me: it mainly relies on the synfire chain method, a population encoding that relies on delay-line encoding (see the sketch below).
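
For reference, a minimal sketch of the delay-line idea behind a synfire chain: a packet of activity ignited at stimulus onset propagates through a feedforward chain of neuron pools, so elapsed time can be read off from which pool is currently active. The pool count, step size, and noise-free propagation below are assumptions for illustration, not parameters from the paper.

```python
import numpy as np

# Minimal discrete-time sketch of a synfire chain as a delay-line code for time.
# Assumed parameters (not from the paper): 50 pools, one synaptic "hop" per
# 10 ms step, binary pool activity, noise-free propagation.

N_POOLS = 50          # pools of neurons arranged in a feedforward chain
DT_MS = 10            # time represented by one hop between consecutive pools

def run_synfire_chain(n_steps):
    """Propagate a packet of activity down the chain, one pool per step."""
    activity = np.zeros((n_steps, N_POOLS), dtype=int)
    activity[0, 0] = 1                      # stimulus onset ignites pool 0
    for t in range(1, n_steps):
        # each active pool excites the next pool in the chain
        activity[t, 1:] = activity[t - 1, :-1]
    return activity

def decode_elapsed_time(activity_at_t):
    """Read out elapsed time from which pool is active (labeled-line code)."""
    pool = int(np.argmax(activity_at_t))
    return pool * DT_MS

activity = run_synfire_chain(n_steps=40)
print(decode_elapsed_time(activity[25]))    # -> 250 (ms): 25 hops * 10 ms
```

The readout makes the limitation noted above concrete: the code labels elapsed time from a single onset, so representing an interval's place within a larger pattern requires something beyond the bare chain.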

Just an offhand reference to how time intervals are recorded.

Tasks

Topics

Further Reading

---

Extracted Annotations and Comments

Figures