
Attentional and Semantic Anticipations in Recurrent Neural Networks

p. 74-95

Abstract

Why are attentional processes important in driving anticipations? Anticipatory processes are fundamental cognitive abilities of living systems: they allow new events in the environment to be perceived rapidly and accurately, and adapted behaviors to be triggered in response to them. To produce anticipations adapted to sequences of varied events in complex environments, the cognitive system must be able to run specific anticipations on the basis of selected relevant events. More attention must therefore be given to events that are potentially relevant for the living system than to less important ones.

What are useful attentional factors in anticipatory processes? The relevance of events in the environment depends on the effects they can have on the survival of the living system. The cognitive system must therefore be able to detect relevant events in order to drive anticipations and trigger adapted behaviors. The attention given to an event depends on i) its external physical relevance in the environment, such as its duration and visual quality, and ii) its internal semantic relevance in memory, such as knowledge about the event (its semantic field in memory) and its anticipatory power (associative strength to anticipated associates).
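
As a purely illustrative aside, not taken from the article, these two sources of relevance could be combined into a single attentional weight. The saturating duration term, the linear mixture, and all parameter names in the sketch below are assumptions.

    import math

    def attentional_weight(duration_s, visual_quality, semantic_field, assoc_strength,
                           w_ext=0.5, w_int=0.5):
        """Hypothetical attention given to an event, in [0, 1].

        External physical relevance: presentation duration (saturating over time)
        and visual quality in [0, 1].
        Internal semantic relevance: richness of the semantic field in memory and
        associative strength to anticipated associates, both in [0, 1].
        """
        external = (1.0 - math.exp(-duration_s)) * visual_quality
        internal = 0.5 * semantic_field + 0.5 * assoc_strength
        return w_ext * external + w_int * internal

    # Example: a briefly presented, visually degraded but semantically rich event.
    print(attentional_weight(duration_s=0.2, visual_quality=0.4,
                             semantic_field=0.9, assoc_strength=0.7))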

How can we model interactions between attentional and semantic anticipations? Specific types of distributed recurrent neural networks are able to code temporal sequences of events as associated attractors in memory. A particular learning protocol and spike-rate transmission through synaptic associations allow the presented model to modulate attentionally the amount of activation of anticipations (through activation or inhibition processes) as a function of the external and internal relevance of the perceived events. This type of model offers a unique opportunity to account for both anticipations and attention in unified terms of neural dynamics in a recurrent network.
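
The article's exact learning protocol and spike-rate dynamics are not reproduced here, but the general idea can be loosely illustrated with a Hopfield-style rate network: Hebbian autoassociative weights make each event pattern an attractor, heteroassociative weights link each pattern to its successor in a learned sequence, and a scalar attention gain scales how strongly the anticipated associate is pre-activated. The gating rule, the gain value, and all parameter choices below are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 200, 4                                    # units, stored event patterns
    patterns = rng.choice([-1.0, 1.0], size=(P, N))  # random binary memories

    # Autoassociative Hebbian weights: each pattern becomes an attractor.
    W = (patterns.T @ patterns) / N
    np.fill_diagonal(W, 0.0)

    # Heteroassociative weights linking each pattern to its successor in the
    # sequence, so that pattern p "anticipates" pattern p+1.
    W_seq = sum(np.outer(patterns[(p + 1) % P], patterns[p]) for p in range(P)) / N

    def run(cue, attention, steps=30, dt=0.1, gain=3.0):
        """Rate dynamics: recurrent recall plus attention-gated anticipatory input."""
        r = cue.copy()
        for _ in range(steps):
            drive = W @ r + attention * (W_seq @ r)
            r += dt * (-r + np.tanh(gain * drive))
        return r

    cue = 0.5 * patterns[0]   # weak presentation of event 0
    for a in (0.0, 1.0):
        overlaps = patterns @ run(cue, attention=a) / N
        print(f"attention={a}: overlaps with patterns 0..3 ->", np.round(overlaps, 2))

With the attention gain set to 0 the network simply settles on the cued attractor, whereas a higher gain also pre-activates the anticipated successor pattern(s), which is loosely analogous to the attentional modulation of anticipations described above.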


References

Bibliographical reference

Frederic Lavigne and Sylvain Denis, « Attentional and Semantic Anticipations in Recurrent Neural Networks », CASYS, 8 | 2001, 74-95.

Electronic reference

Frederic Lavigne and Sylvain Denis, « Attentional and Semantic Anticipations in Recurrent Neural Networks », CASYS [Online], 8 | 2001, Online since 07 October 2024, connection on 27 December 2024. URL : http://popups.uliege.be/3041-539x/index.php?id=452

Authors

Frederic Lavigne

Laboratoire de Psychologie Expérimentale et Quantitative, Université de Nice - Sophia Antipolis, 24 Ave des Diables bleus, 06357 Nice Cedex 4, France

Sylvain Denis

Laboratoire de Psychologie Expérimentale et Quantitative, Université de Nice - Sophia Antipolis, 24 Ave des Diables bleus, 06357 Nice Cedex 4, France

Copyright

CC BY-SA 4.0 Deed