
Multiple visual objects are sampled sequentially

Abstract

When acting in a complex visual environment, it is essential to be able to flexibly allocate attention to parts of the visual scene that may contain goal-relevant information. The paper by Jia et al. provides novel evidence that our brains sequentially sample different objects in a visual scene. The results were obtained using “temporal response functions,” in which unique electroencephalographic (EEG) signals corresponding to the processing of 2 continuously presented objects were isolated in an object-specific way. These response functions were dominated by 10-Hz alpha-band activity. Crucially, the different objects were sequentially sampled at a rate of about 2 Hz. These findings provide important neurophysiological insights into how our visual system operates in complex environments.

How are multiple objects in a visual scene sampled?

Given that our visual system receives a constant flow of information, mechanisms need to be in place to prioritize relevant over less relevant visual objects [1]. One such mechanism involves the allocation of spatial attention. Spatial attention was initially explained by a searchlight analogy [2–4]. According to this analogy, the allocation of spatial attention results in a gain increase for attended objects and a reduced gain for unattended objects. While this analogy has received strong experimental support [5–8], it may be insufficient to fully describe the spatiotemporal dynamics of attention in daily life. For instance, we usually do not attend to a single object in a visual scene for long but rather continuously explore different parts of the scene by shifting attention. Such shifts might involve either the reallocation of spatial attention while maintaining one’s gaze at a given location (covert attention) or saccades (changes in the position of the eyes, i.e., overt shifts of attention).

Importantly, saccades typically occur to informative parts of the visual scene [9]. This poses an interesting conundrum: how does our visual system know where to direct our gaze when the target objects have not yet been consciously perceived? One potential explanation is that even when we focus on one object in a visual scene, possible candidates for future saccades are also processed, at least partially. This scheme implies that the focus of attention is not limited to a single object or location and raises the important question of whether attention reflects a parallel or serial process [10]. The work by Jia et al. published in PLOS Biology provides support for serial processing of multiple objects [11].

Identifying the dynamics of visual attention using temporal response functions

The experiments by Jia et al. were done using electroencephalography (EEG) in combination with a technique based on so-called temporal response functions (TRFs) [12]. The basic principle underlying TRFs is that an object is presented as a flicker whose luminance varies randomly over time while the participant’s EEG is recorded. The TRF reflects the impulse response of the brain to the visual input (luminance changes of the object in the study by Jia et al.) as measured with the EEG; that is, it is the signal that best accounts for the measured EEG when convolved with the visual input [12,13]. VanRullen and MacDonald previously demonstrated that the TRF of a visual stimulus is a decaying oscillatory response of about 10 Hz, which they termed perceptual echoes [13]. Importantly, the alpha-band response in the TRF links the oscillatory alpha dynamics associated with attentional modulations of sensory processing [14–16] to object-specific attentional modulation of neural processing.
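To make the TRF logic concrete, the following is a minimal simulation sketch in Python (NumPy), not the authors’ actual pipeline: a random luminance sequence is convolved with an assumed decaying 10-Hz kernel to produce a synthetic “EEG,” and ridge regression over a lagged design matrix recovers that kernel as the estimated TRF. The sampling rate, kernel shape, noise level, and regularization are illustrative assumptions.

```python
import numpy as np

fs = 100                                     # sampling rate in Hz (assumed)
lag_times = np.arange(0, 0.5, 1 / fs)        # 0-500 ms of TRF lags
true_trf = np.exp(-lag_times / 0.15) * np.sin(2 * np.pi * 10 * lag_times)  # decaying 10-Hz "echo"

rng = np.random.default_rng(0)
stimulus = rng.standard_normal(60 * fs)      # 60 s of random luminance changes
eeg = np.convolve(stimulus, true_trf)[:stimulus.size]
eeg += 0.5 * rng.standard_normal(eeg.size)   # measurement noise

# Lagged design matrix: column k holds the stimulus delayed by k samples.
n_lags = true_trf.size
X = np.column_stack([np.roll(stimulus, k) for k in range(n_lags)])
X, y = X[n_lags:], eeg[n_lags:]              # drop samples affected by wrap-around

# Ridge regression: trf_hat = (X'X + lambda*I)^(-1) X'y
lam = 1e2
trf_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_lags), X.T @ y)
print(np.corrcoef(trf_hat, true_trf)[0, 1])  # close to 1 if the TRF was recovered
```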

Jia et al. used TRFs to simultaneously tag two objects, presented in the left and right hemifields, with orthogonal flicker signals (Fig 1A and 1B). The time-frequency representations of the impulse responses (TRFs) for each of the stimuli were then derived from the EEG. This approach made it possible to isolate the brain dynamics associated with the processing of each object, providing a measure that can be used to assess how spatial attention affects object-specific neural processing.
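The object-specific isolation rests on the two contrast sequences being statistically uncorrelated, so a joint regression can attribute EEG variance to each object separately. The sketch below (hypothetical parameters, not the study’s code) extends the previous simulation to two stimulus trains driving two different underlying impulse responses.

```python
import numpy as np

fs, dur, n_lags = 100, 120, 50               # assumed sampling rate, duration (s), TRF length
rng = np.random.default_rng(1)
stim_left = rng.standard_normal(dur * fs)    # contrast train tagging the left object
stim_right = rng.standard_normal(dur * fs)   # independent train tagging the right object

lag_times = np.arange(n_lags) / fs
echo = np.exp(-lag_times / 0.15) * np.sin(2 * np.pi * 10 * lag_times)
trf_left, trf_right = echo, 0.5 * echo       # e.g., a weaker response to the right object

eeg = (np.convolve(stim_left, trf_left)[:dur * fs]
       + np.convolve(stim_right, trf_right)[:dur * fs]
       + 0.5 * rng.standard_normal(dur * fs))

def lagged(x, n_lags):
    """Design matrix whose k-th column is x delayed by k samples."""
    return np.column_stack([np.roll(x, k) for k in range(n_lags)])

X = np.hstack([lagged(stim_left, n_lags), lagged(stim_right, n_lags)])
X, y = X[n_lags:], eeg[n_lags:]              # discard wrap-around samples

beta = np.linalg.solve(X.T @ X + 1e2 * np.eye(2 * n_lags), X.T @ y)
trf_left_hat, trf_right_hat = beta[:n_lags], beta[n_lags:]
# Because the two contrast trains are uncorrelated, each estimated TRF reflects
# the brain response tied to its own object only.
```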

Fig 1.

(A) In the study by Jia et al., two objects were presented. Participants were asked to detect the random appearance of a small target square in one of the objects. (B) The contrast of each object changed randomly over time. (C) The temporal response functions (TRFs) were calculated by relating the recorded EEG to the visual input train (contrast change of each object). The TRF can be considered to reflect the impulse response function that best explains the EEG when convolved with the stimulus train, also termed perceptual echoes [13]. The TRFs were dominated by approximately 10-Hz alpha-band activity. The sequential sampling was observed as a rhythmic modulation (about 2 Hz) of the envelope of the alpha-band response.

https://doi.org/10.1371/journal.pbio.2003230.g001

The first experimental observation was that the frequency content of the TRF of both objects was dominated by an approximately 10-Hz response (see Fig 2A in Jia et al.), thus reproducing the perceptual echoes of VanRullen and MacDonald [13]. Importantly, this response was modulated by spatial attention. When one object was clearly task relevant (indicated by a 100% valid attentional cue, rendering the other object task irrelevant), the difference in the power of the TRFs corresponding to the relevant and irrelevant objects revealed a 10-Hz response that was initially negative and then showed a trend for an increase after 0.2 s. The trend for this increase was termed the alpha rebound by the authors. As the authors point out, the alpha rebound is best explained by considering 10-Hz alpha activity to reflect functional inhibition of neural processing [15,17,18]. As such, the initial decrease of approximately 10-Hz power in the difference-TRF may reflect a selective release from inhibition for the relevant object, followed by a later increase in active inhibition of the same object. This finding held up even when the task required participants to track the objects while the objects were moving, showing that the sampling mechanism is not based on spatial locations but is object specific. The 10-Hz envelope of the difference-TRF thus appears to track the time course of attentional deployment between two different objects independently of their respective locations.
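One way to picture the difference-TRF analysis described above is to band-pass each estimated TRF around 10 Hz, take its amplitude envelope, and subtract. The sketch below (SciPy assumed) does this for hypothetical attended and unattended TRFs; the filter settings and the names trf_attended/trf_unattended are placeholders, and the paper’s exact analysis may differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_envelope(trf, fs, band=(8.0, 13.0)):
    """Band-pass a TRF around 10 Hz and return its Hilbert amplitude envelope."""
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, trf)))

def alpha_difference(trf_attended, trf_unattended, fs=100):
    """Alpha-power time course of the attended minus the unattended TRF.

    trf_attended / trf_unattended: 1-D arrays of TRF values over lags
    (placeholder names, e.g., the outputs of the joint regression sketch).
    """
    return alpha_envelope(trf_attended, fs) - alpha_envelope(trf_unattended, fs)

# An early negative difference would correspond to a release from inhibition for
# the attended object; a later sign flip would correspond to the "alpha rebound".
```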

A second important finding was that when one object was more relevant than the other (e.g., more likely to contain a target than the other, indicated by a 75% valid attentional cue), the alpha rebound at 200–400 ms became more pronounced. Specifically, alpha power of the difference-TRF was initially negative but clearly changed sign after 200 ms. This pattern indicates that while attention is initially allocated to the most relevant object, resources are subsequently allocated to the less relevant object a few hundred milliseconds later. Importantly, the strength of the alpha rebound predicted behavioral markers of attentional deployment across individuals. Together, these findings were interpreted to reflect a sequential allocation of attention that is flexibly adjusted to changes in task context by first allocating attention to the most relevant object and a little later to the less relevant object.

Finally, something remarkable occurred when both objects were equally relevant (see Fig 4A and 4B in Jia et al.). The difference-TRF now revealed a slow oscillatory modulation of the 10-Hz envelope, such that the decrease and increase in TRF alpha power continued to alternate for nearly a second (see also Fig 1C). The authors interpret this pattern to reflect attention shifting back and forth between the two visual objects approximately every 0.5 s. As such, the allocation of attention in the presence of multiple objects appears not only to be sequential but also rhythmic in nature.
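In principle, such a rhythmic alternation can be quantified by taking the spectrum of the alpha-envelope difference time course and looking for a low-frequency peak. The following sketch illustrates the idea on an idealized 2-Hz signal rather than real data; the 1-s window, 100-Hz sampling rate, and function names are assumptions.

```python
import numpy as np

def modulation_spectrum(alpha_diff, fs):
    """Amplitude spectrum of an alpha-power difference time course."""
    x = alpha_diff - alpha_diff.mean()           # remove the DC offset
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    amp = np.abs(np.fft.rfft(x)) / x.size
    return freqs, amp

# Demo: a 1-s difference time course that alternates at 2 Hz shows a 2-Hz peak.
fs = 100
t = np.arange(0, 1, 1 / fs)
demo_diff = np.sin(2 * np.pi * 2 * t)            # idealized 2-Hz alternation
freqs, amp = modulation_spectrum(demo_diff, fs)
print(freqs[np.argmax(amp)])                     # -> 2.0
```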

Additional evidence for sequential sampling

The results of Jia et al. relate to other recent intriguing findings of rhythmic attentional sampling. For instance, by quantifying the detection accuracy of subtle changes in two different objects as a function of time, Landau et al. recently demonstrated that visual sampling fluctuated at a 4-Hz rhythm [19]. This is consistent with similar findings by Fiebelkorn et al., who, in addition, investigated the spatiotemporal dynamics of between- versus within-object attentional sampling and found that attention switches between objects at a rate of about 4 Hz and within objects at about 8 Hz [20]. Furthermore, magnetoencephalographic (MEG) recordings during an object-based attention task demonstrated that the shift in visual sampling was reflected by modulations in posterior gamma-band activity [21]. This is important, as neuronal activity in the gamma band has been proposed to reflect feed-forward processing [22,23]. Another recent study by Song et al. suggests that sequential attentional sampling is mediated through oscillations in the 3–5-Hz theta band [24]. Together, these studies provide converging evidence for the notion that attentional sampling is indeed sequential and “jumps” from one object to another. It should be pointed out, however, that the sampling frequency identified by Jia et al. was about 2 Hz (about 2 “samples” of each object per second; see Fig 1C), which is slower than the approximately 4 Hz identified in previous research. As we will discuss below, the factors determining the sequential sampling frequency need to be elucidated in future research.

Future questions

The findings by Jia et al. raise several questions that deserve attention in the future (Fig 1D). One core question pertains to what happens in natural settings in which more than two objects are shown. One might hypothesize that attention will then jump sequentially among all the objects, potentially resulting in less frequent sampling of individual objects when attending multiple objects [25]. As such, the sequential mechanism might relate to the work on visual search by Treisman et al. [26–28]. The scan rate implied by the Jia et al. study is, however, quite slow compared to the scan rate of the Treisman studies, possibly due to differences in the precise detection task and/or the number of items that were used across studies. The relationship between the physiological mechanism identified by Jia et al. and the classical psychophysical work on sequential scanning, which has thus far remained largely unconnected to findings of rhythmic sampling, deserves to be uncovered.

A related question pertains to the factors that determine the sequential allocation of attentional resources across different objects or locations. As Jia et al. demonstrate, the relative task relevance of different objects is an important determinant of the pattern of attentional sampling, suggesting that sequential sampling may be under top-down control. Previous research has shown that reward pairing and long-term memory, but also visual saliency, modulate the allocation of attention [29,30]. How do these factors modulate the spatiotemporal dynamics of sequential sampling? The approach presented by Jia et al. now provides a tool to address these questions.

A third outstanding question is how rhythms of attentional sampling relate to naturally occurring neuronal oscillations. For instance, occipital activity in humans and monkeys shows strong neuronal oscillations in the theta (5–8 Hz) and alpha (8–13 Hz) bands [31–35]. How do these rhythms link to the observed sequential sampling that also is rhythmic in nature? It is important to uncover if and how these different endogenous rhythms are related.

A final question concerns the temporal organization of visual exploration and search. It is plausible that the sequential attentional sampling process is part of the mechanism informing the visual system of where to saccade next. Saccades are obviously a major factor in the allocation of attention [36–38]. By saccading, we shift our gaze to different parts of a visual scene 3–4 times per second [38,39]. How does sequential sampling relate to the timing of saccades? In conclusion, the findings reported by Jia et al. raise exciting questions concerning the nature, flexibility, and function of the spatiotemporal dynamics of attention.

References

  1. Nobre AC, Kastner S. The Oxford Handbook of Attention. 1st ed. Oxford: Oxford University Press; 2014.
  2. Eriksen CW, Yeh Y-Y. Allocation of attention in the visual field. J Exp Psychol Hum Percept Perform. 1985;11(5):583–97. pmid:2932532
  3. Eriksen CW, St. James JD. Visual attention within and around the field of focal attention: A zoom lens model. Percept Psychophys. 1986;40(4):225–40. pmid:3786090
  4. Posner MI, Snyder CR, Davidson BJ. Attention and the detection of signals. J Exp Psychol. 1980;109(2):160–74. pmid:7381367
  5. Kastner S, Ungerleider LG. The neural basis of biased competition in human visual cortex. Neuropsychologia. 2001;39(12):1263–76. pmid:11566310
  6. Desimone R, Duncan J. Neural mechanisms of selective visual attention. Annu Rev Neurosci. 1995;18(1):193–222.
  7. Fries P, Schröder J-H, Roelfsema PR, Singer W, Engel AK. Oscillatory neuronal synchronization in primary visual cortex as a correlate of stimulus selection. J Neurosci. 2002;22(9):3739–54. pmid:11978850
  8. Shapiro KL, Miller CE. The role of biased competition in visual short-term memory. Neuropsychologia. 2011;49(6):1506–17. pmid:21335016
  9. Cavanagh P, Hunt AR, Afraz A, Rolfs M. Visual stability based on remapping of attention pointers. Trends Cogn Sci. 2010;14(4):147–53. pmid:20189870
  10. Eimer M. The neural basis of attentional control in visual search. Trends Cogn Sci. 2014;18(10):526–35. pmid:24930047
  11. Jia J, Liu L, Fang F, Luo H. Sequential sampling of visual objects during sustained attention. PLoS Biol. 2017;15(6):e2001903. pmid:28658261
  12. Lalor EC, Pearlmutter BA, Reilly RB, McDarby G, Foxe JJ. The VESPA: A method for the rapid estimation of a visual evoked potential. Neuroimage. 2006;32(4):1549–61. pmid:16875844
  13. VanRullen R, MacDonald JSP. Perceptual echoes at 10 Hz in the human brain. Curr Biol. 2012;22(11):995–9. pmid:22560609
  14. Mazaheri A, Jensen O. Rhythmic pulsing: linking ongoing brain activity with evoked responses. Front Hum Neurosci. 2010;4:177. pmid:21060804
  15. Jensen O, Mazaheri A. Shaping functional architecture by oscillatory alpha activity: gating by inhibition. Front Hum Neurosci. 2010;4:186. pmid:21119777
  16. Thut G, Miniussi C, Gross J. The functional importance of rhythmic activity in the brain. Curr Biol. 2012;22(16):R658–63. pmid:22917517
  17. Klimesch W, Sauseng P, Hanslmayr S. EEG alpha oscillations: the inhibition-timing hypothesis. Brain Res Rev. 2007;53(1):63–88. pmid:16887192
  18. Foxe JJ, Snyder AC. The role of alpha-band brain oscillations as a sensory suppression mechanism during selective attention. Front Psychol. 2011;2:154.
  19. Landau AN, Fries P. Attention samples stimuli rhythmically. Curr Biol. 2012;22(11):1000–4. pmid:22633805
  20. Fiebelkorn IC, Saalmann YB, Kastner S. Rhythmic sampling within and between objects despite sustained attention at a cued location. Curr Biol. 2013;23(24):2553–8. pmid:24316204
  21. Landau AN, Schreyer HM, Van Pelt S, Fries P. Distributed attention is implemented through theta-rhythmic gamma modulation. Curr Biol. 2015;25(17):2332–7. pmid:26279231
  22. van Kerkoerle T, Self MW, Dagnino B, Gariel-Mathis M-A, Poort J, van der Togt C, et al. Alpha and gamma oscillations characterize feedback and feedforward processing in monkey visual cortex. Proc Natl Acad Sci U S A. 2014;111(40):14332–41. pmid:25205811
  23. Bastos AM, Vezoli J, Bosman CA, Schoffelen J-M, Oostenveld R, Dowdall JR, et al. Visual areas exert feedforward and feedback influences through distinct frequency channels. Neuron. 2015;85(2):390–401. pmid:25556836
  24. Song K, Meng M, Chen L, Zhou K, Luo H. Behavioral oscillations in attention: rhythmic α pulses mediated through θ band. J Neurosci. 2014;34(14):4837–44.
  25. Holcombe AO, Chen W. Splitting attention reduces temporal resolution from 7 Hz for tracking one object to <3 Hz when tracking three. J Vis. 2013;13(1):12. pmid:23302215
  26. Treisman AM, Gelade G. A feature-integration theory of attention. Cogn Psychol. 1980;12(1):97–136. pmid:7351125
  27. Wolfe JM, Horowitz TS. What attributes guide the deployment of visual attention and how do they do it? Nat Rev Neurosci. 2004;5(6):495–501. pmid:15152199
  28. Woodman GF, Luck SJ. Serial deployment of attention during visual search. J Exp Psychol Hum Percept Perform. 2003;29(1):121–38. pmid:12669752
  29. Peelen MV, Kastner S. Attention in the real world: toward understanding its neural basis. Trends Cogn Sci. 2014;18(5):242–50. pmid:24630872
  30. Stokes MG, Atherton K, Patai EZ, Nobre AC. Long-term memory prepares neural activity for perception. Proc Natl Acad Sci U S A. 2012;109(6):E360–7. pmid:22109554
  31. Spyropoulos G, Bosman CA, Fries P. A theta rhythm in awake macaque V1 and V4 and its attentional modulation. bioRxiv. 2017. Available from: http://biorxiv.org/content/early/2017/03/17/117804
  32. Dugué L, VanRullen R. Transcranial magnetic stimulation reveals intrinsic perceptual and attentional rhythms. Front Neurosci. 2017;11:154. pmid:28396622
  33. Palva S, Palva JM. New vistas for alpha-frequency band oscillations. Trends Neurosci. 2007;30(4):150–8. pmid:17307258
  34. Berger H. Über das Elektrenkephalogramm des Menschen. Arch Psychiatr Nervenkr. 1929;87(1):527–70.
  35. Bollimunta A, Mo J, Schroeder CE, Ding M. Neuronal mechanisms and attentional modulation of corticothalamic α oscillations. J Neurosci. 2011;31(13):4935–43. pmid:21451032
  36. Fecteau JH, Munoz DP. Salience, relevance, and firing: a priority map for target selection. Trends Cogn Sci. 2006;10(8):382–90. pmid:16843702
  37. Gottlieb J, Balan P. Attention as a decision in information space. Trends Cogn Sci. 2010;14(6):240–8. pmid:20399701
  38. Itti L, Koch C. Computational modelling of visual attention. Nat Rev Neurosci. 2001;2(3):194–203. pmid:11256080
  39. Bosman CA, Womelsdorf T, Desimone R, Fries P. A microsaccadic rhythm modulates gamma-band synchronization and behavior. J Neurosci. 2009;29(30):9471–80. pmid:19641110