Abstract
Several recent psychological investigations have demonstrated that planning an action biases visual processing. Symes et al. (2008), for example, reported faster target detection for a changing object amongst several non-changing objects following the planning of a target-congruent grasp. The current experimental work investigated how this effect might compare to, and indeed integrate with, the effects of language cues. First, a cuing effect was established in its own right using the same change-detection scenes. Sentences cued object size (e.g., "Start looking for a change in the larger objects"), and these successfully enhanced detection of size-congruent targets. With two effective sources of bias thereby established (i.e., action primes and language cues), the remaining three experiments explored their co-occurrence within the same task. Thus an action prime (participants planned a power or precision grasp) and a language cue (a sentence) preceded stimulus presentation. Based on the tenets of the biased competition model (Desimone and Duncan, 1995), various predictions were made concerning the integration of these different biases. All of the predictions were supported by the data, including reliably stronger effects of language, and concurrent biasing effects that were mutually suppressive and additive.
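The integration predictions summarised above can be read in terms of two top-down biases feeding into a single competition over object representations. As an illustrative aside only (not the authors' model or data), the minimal Python sketch below shows how a divisive-normalization form of biased competition yields exactly that pattern: a larger effect of the stronger (language) bias, a combined effect larger than either bias alone (additive), and a combined effect smaller than the sum of the two (mutually suppressive). The number of objects, baseline drive, and bias magnitudes are all assumed for illustration.

```python
# Minimal sketch of biased competition with two top-down bias sources.
# All numbers here are illustrative assumptions, not parameters from the paper.

N_OBJECTS = 4          # one target object plus three competitors
BASELINE = 1.0         # bottom-up input shared by every object
ACTION_BIAS = 0.5      # assumed boost from a congruent action prime
LANGUAGE_BIAS = 1.0    # assumed (stronger) boost from a congruent language cue


def target_advantage(*biases: float) -> float:
    """Target's normalized activation minus a competitor's, after
    divisive normalization of bottom-up input plus top-down biases."""
    drive = [BASELINE] * N_OBJECTS
    drive[0] += sum(biases)                # all biases favour the target (object 0)
    total = sum(drive)
    shares = [d / total for d in drive]    # competition: shares sum to 1
    return shares[0] - shares[1]


action_only = target_advantage(ACTION_BIAS)
language_only = target_advantage(LANGUAGE_BIAS)
combined = target_advantage(ACTION_BIAS, LANGUAGE_BIAS)

print(f"action prime alone : {action_only:.3f}")
print(f"language cue alone : {language_only:.3f}")
print(f"both together      : {combined:.3f}")

# The stronger language bias wins on its own, the two biases together beat
# either alone (additive), yet fall short of their simple sum (suppressive).
assert language_only > action_only
assert combined > max(action_only, language_only)
assert combined < action_only + language_only
```

Running the sketch prints advantages of roughly 0.11, 0.20, and 0.27 for the action-only, language-only, and combined conditions, which mirrors the qualitative pattern the abstract reports.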
DOI
10.3389/fnbot.2010.00003
Publication Date
2010-01-01
Publication Title
Front Neurorobot
Volume
4
Organisational Unit
School of Psychology
Keywords
action intentions, biased competition, change detection, language cues, top-down and bottom-up interaction
Recommended Citation
Symes, E., Tucker, M., & Ottoboni, G. (2010) 'Integrating Action and Language through Biased Competition', Front Neurorobot, 4. Available at: https://doi.org/10.3389/fnbot.2010.00003