Feedback information sharing in the human brain reflects bistable perception in the absence of report [1]
Andres Canales-Johnson (Conscious Brain Lab, Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition; Cambridge Consciousness and Cognition Lab, University of Cambridge)
Date: 2023-05
In the search for the neural basis of conscious experience, perception and the cognitive processes associated with reporting perception are typically confounded, as neural activity is recorded while participants explicitly report what they experience. Here, we present a novel way to disentangle perception from report using eye movement analysis techniques based on convolutional neural networks and neurodynamical analyses based on information theory. We use a bistable visual stimulus that instantiates two well-known properties of conscious perception: integration and differentiation. At any given moment, observers either perceive the stimulus as one integrated unitary object or as two differentiated objects that are clearly distinct from each other. Using electroencephalography, we show that measures of integration and differentiation based on information theory closely follow participants’ perceptual experience of those contents when switches were reported. We observed increased information integration from anterior to posterior electrodes (front to back) prior to a switch to the integrated percept, and higher information differentiation of anterior signals leading up to reporting the differentiated percept. Crucially, information integration was closely linked to perception and was even observed in a no-report condition when perceptual transitions were inferred from eye movements alone. In contrast, the link between neural differentiation and perception was observed solely in the active report condition. Our results therefore suggest that perception and the processes associated with report require distinct amounts of anterior–posterior network communication and anterior information differentiation. While front-to-back directed information was associated with changes in the content of perception when viewing bistable visual stimuli, regardless of report, frontal information differentiation was absent in the no-report condition and is therefore not directly linked to perception per se.
We focused on neural metrics inspired by information theory [27] to assess the neural underpinnings of awareness in report and no-report conditions while viewing this integrated or differentiated percept. In a recent report-based study on perceptual ambiguity, it was shown that frontoparietal information integration, computed as the amount of information sharing between frontal and parietal EEG/ECoG signals, increased when participants reported an ambiguous auditory stimulus as perceptually integrated compared to when it was reported as perceptually differentiated. By contrast, information differentiation, computed as the amount of information diversity within frontal or parietal EEG/ECoG signals separately, showed the opposite pattern within the same frontoparietal electrodes: it increased when participants reported the bistable stimulus as perceptually differentiated compared to perceptually integrated. This suggests that information integration and information differentiation go hand in hand with observers’ phenomenology of an integrated or differentiated percept of an ambiguous stimulus, a hypothesis that we explore in the visual modality here. One crucial open question, however, is whether the observed changes in neural integration and differentiation depend on reporting, because in this previous study observers had to explicitly report the perceptual switches by pressing buttons. We address this issue here by specifically relating neural metrics of integration and differentiation to changing percepts in both report and no-report conditions [5, 9]. Further, here we also specify the directionality of neural information flow to relate feedforward and feedback activity to changes in perceptual experience, independent of the necessity to report.
By studying conscious contents in the presence and absence of explicit report during visual bistable perception, here we investigate the neural mechanisms underlying perceptual transitions while dissociating perception from report and other related factors. The bistable stimulus we use consists of two overlapping gratings, known as ambiguous plaids or moving plaids [21, 22]. This ambiguous stimulus can be perceived as one plaid moving coherently in a vertical direction, or as two plaids sliding across one another horizontally. The plaid stimulus was thus perceived as perceptually integrated (one object) or perceptually differentiated (two objects), respectively. Because each percept is associated with a unique direction of motion, this stimulus allows us to track perception in the absence of report by capitalizing on the occurrence of optokinetic nystagmus (OKN) [23–26]. OKN is a visually induced reflex comprising a combination of slow-phase and fast-phase eye movements that allow the eyes to follow objects in motion, e.g., when looking at the trees alongside the road while moving past them in a car. Importantly, OKN follows perception when viewing bistable visual stimuli, which makes it useful for assessing perception in the absence of reports [23, 25, 26].
Perceptual ambiguity is a key phenomenon to study the brain mechanisms of conscious perception [10], often experimentally elicited by using “multistable stimuli” [11]. Multistability can be induced in several ways, using a variety of ambiguous stimuli, e.g., binocular rivalry, structure from motion, the Necker cube, and motion-induced blindness. The common feature of such paradigms is that an ambiguous stimulus can be interpreted in two or more ways while the sensory input that reaches the senses remains unchanged. Confronted with this ambiguity, observers experience spontaneous fluctuations between interpretations of the stimulus. Experimental evidence has varied regarding whether changes in perception when viewing multistable stimuli correlate with changes in early sensory or higher-order cortical activity. In support of higher loci, single-cell recordings in monkeys have revealed that the strongest perceptual modulations of neuronal firing occur in higher association cortices, including inferotemporal cortex (ITC) and dorsolateral prefrontal cortex (DLPFC) [12–15]. However, previous human fMRI studies have associated both early and higher sensory regions with perceptual transitions [16]. For example, when face and house stimuli compete for perceptual dominance, category-specific regions in ITC activate more strongly for the dominant percept, even before observers indicate a perceptual switch via button press. Another common observation in human fMRI studies is the engagement of a large network of parietal and frontal brain areas, traditionally associated with attentional and cognitive functions, during the report of perceptual transitions (for an overview, see [11]). One central and unresolved issue in consciousness science in general [8], and multistable perception in particular, is what processes these large clusters of activations in the frontoparietal cortex reflect. The feedback account states that frontal regions actively exert a top-down influence on sensory brain regions to resolve perceptual ambiguity. Evidence in favor of this account shows that targeting specific nodes in this network using transcranial magnetic stimulation (TMS) can shape the rate of perceptual transitions, suggesting their causal influence in resolving, or even driving, perceptual ambiguity [11, 17, 18]. The opposing feedforward account links frontal activity to processing of the consequences of perceiving, and thus to processes occurring after perceptual ambiguity has been resolved by posterior brain regions [18–20]. Here, we contribute to this debate by assessing the extent to which feedforward and feedback processes correlate with changes in multistable perception, in the presence and absence of explicit report about the dominant percept.
Consciousness is a subjective experience, the “what-it-is-likeness” of our experience, e.g., when we perceive a certain scene or endure pain. Having such experiences may be the main reason why life matters to us, and it may set us apart from other smart but nonliving “things,” such as your phone or the internet [1]. Conscious experience is a truly subjective and private phenomenon, and it cannot be observed directly from the outside. To objectively study and understand it, we have to gain access to the inner subjective experience of others, e.g., via self-report tasks or via behavioral tasks that we believe can capture conscious content [2]. When such measures are combined with neuroimaging tools, the neural correlates of consciousness (NCCs) can be sought [3]. However, what we observe in measures of brain activity may not be as pure as we hope: our measurements of “consciousness” may be confounded by several cognitive factors, e.g., the act of reporting, attention, surprise, or decision-making, that arise after conscious experience has emerged. This severely complicates our attempt to isolate the neural basis of conscious experience [4–9]. Here, we address this complication by assessing the influence of report and no-report task instructions on neural measures reflecting perceptual transitions in situations in which sensory input is ambiguous.
To compare the integration and differentiation results with more traditional electrophysiological features of multistable perception, we computed the time-frequency profile of perceptual switches to the differentiated and integrated percepts (Fig 5; using cluster-based corrections for multiple comparisons). We observed increased oscillatory power in a broad frequency band from 1 to 18 Hz at frontal electrode sites, preceding a perceptual transition to an integrated compared to a differentiated percept. This was only observed in the button-press-locked analyses of the report condition, not in any of the other conditions. Interestingly, the same pattern of results was observed in the frontal complexity measure (diff-INFO) (Fig 2D and 2E). Complexity measures have been linked to changes in frequency dynamics [42], and in particular, an increase in low-frequency power can increase the redundancy in neural signals, decreasing information complexity [43–45]. Accordingly, we observed that in the time window that overlaps both measures (from −1,050 ms before response onset), the decrease in complexity negatively correlated with time-frequency power (Pearson’s r = −0.39; p = 0.013). We return to this result in our Discussion.
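To illustrate how such a comparison could be reproduced, the sketch below computes Morlet-wavelet power with MNE-Python, averages it over 1–18 Hz and the frontal channels, and correlates the resulting time course with a complexity (diff-INFO) time course across time points in the overlapping window. This is a minimal sketch under assumptions: the array names, sampling rate, wavelet settings, and the choice to correlate across time points are not taken from the paper, whose exact procedure is described in its Methods.

```python
import numpy as np
from scipy.stats import pearsonr
from mne.time_frequency import tfr_array_morlet

def power_complexity_correlation(frontal_epochs, sfreq, complexity_ts, window_mask,
                                 fmin=1.0, fmax=18.0):
    """Correlate low-frequency power with a complexity time course.
    frontal_epochs: (n_epochs, n_channels, n_times) frontal-ROI EEG.
    complexity_ts: (n_times,) diff-INFO time course on the same time axis.
    window_mask: boolean mask selecting the overlapping window (e.g., from -1,050 ms)."""
    freqs = np.arange(fmin, fmax + 1.0)
    power = tfr_array_morlet(frontal_epochs, sfreq=sfreq, freqs=freqs,
                             n_cycles=freqs / 2.0, output="power")
    # average over epochs, channels, and the 1-18 Hz band -> one value per time point
    band_power = power.mean(axis=(0, 1, 2))
    r, p = pearsonr(band_power[window_mask], complexity_ts[window_mask])
    return r, p
```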
Finally, and most importantly, we have reanalyzed a previously published dataset in which we used an auditory bistable stimulus in combination with EEG measurements [27]. In that study, participants listened to a sequence of tones and indicated with a button press whether they experienced either a single stream (perceptual integration) or two parallel streams (perceptual differentiation) of sounds. In that dataset, there were no systematic associations between eye movements and perceptual switches due to the auditory nature of the task. In S4 Fig, we report the dir-INFO analysis showing a pattern of results highly similar to that reported here for bistable visual stimuli. Again, we observed increased dir-INFO leading up to integrated percepts and uniquely in the feedback direction (see S4 Fig for details). Together, these considerations and additional results from auditory bistability indicate that the results observed here are unlikely to be caused by incidental differences in eye movement patterns between conditions or perceptual states.
We would like to note that the observed EEG effects are unlikely to be the result of differences in eye movement signals between conditions, for several reasons. First, measures of neural differentiation were specific to the report-locked analyses, although similar eye movement patterns were present in all conditions. Second, the difference in the directionality of the dir-INFO effects (presence of a dir-INFO effect in the feedback but not the feedforward direction, no effects for temporal electrodes) indicates that general eye movement confounds are also not likely. Third, OKN signals change gradually over a period of 2 seconds leading up to the perceptual switch, with highly similar time courses for both types of percepts (see Fig 1C). The neural effects we observed do not reflect such a gradual buildup and differ depending on the perceived content, which is not what one would expect based on the pattern of the OKN signals. Fourth, it is also unlikely that eye-related signals (e.g., muscle activity) somehow become embedded in the EEG recordings and drive our anterior-to-posterior effects. The idea would be that if this signal is first measured on frontal and then on posterior electrodes, transfer entropy or volume conduction would inflate measures of directed information flow. This interpretation is unlikely given that we did not observe any differences in OKN signals when comparing to INT and to DIFF periods in the time window 1,200 to 500 ms before the button press, where we observed the differences in our neural measures in the report condition (Fig 1C; dependent-samples t test (to INT vs. to DIF): t(1,39) = 0.093; p = 0.926; BF01 = 5.77). Fifth, when the eyes change their direction maximally, i.e., at the zero-crossings where the derivative of the slope is largest (Fig 1C), or when the eyes most strongly indicate a certain perceptual state (at the peak of the OKN signal in Fig 1C), dir-INFO in the feedback direction was not maximal. If the eyes drove the neural effects, a peak in dir-INFO would be expected at those moments.
After establishing the dynamics of information integration and differentiation in the report condition, and finding that differentiation measures, but not integration measures, were dependent on report, we next analyzed both during the no-report condition (i.e., locked to the OKN crossings). Similar to the report condition, a cluster-based permutation test revealed significant clusters of increased dir-INFO when the visual stimulus was perceived as integrated as compared to differentiated, but uniquely in the feedback direction (Fig 4A). The observed clusters showed a similar range of delays (around 50 to 400 ms), and they were observed in similar time windows (−600 to 0 ms around the OKN crossings) as in the report condition when time-locked to the OKN (Fig 3). We also observed significant clusters showing an interaction between perceptual switch and information direction (p < 0.01), showing stronger integration when the percept switched to integrated as compared to differentiated, and more so in the feedback than feedforward direction (Fig 4C).
The dir-INFO analyses were performed in the same way as in the report condition, but this time locked to OKN crossings instead of button presses. We observed significant clusters of increased dir-INFO when the visual stimulus was perceived as integrated as compared to differentiated, again only in the feedback direction (Fig 3A). Significant clusters spanned delays between 50 and 450 ms and occurred roughly −1,600 to 100 ms around the moment the OKN crossed zero. Interestingly, significant time points were observed much closer in time to the perceptual switches when time-locked to OKN crossings than when locked to responses (Fig 2A). A similar interaction analysis between switch direction and information direction (feedback versus feedforward) was performed. The difference in directed information between integrated and differentiated percepts was stronger for the feedback direction than the feedforward direction (interaction cluster p < 0.01; Fig 3C). For information differentiation locked to the OKN crossings in the report condition, however, no robust clusters were observed (Fig 3D), neither for the front (left panel) nor the back (right panel) ROI. As a strong diff-INFO effect was observed in these same data when aligned to the button press (Fig 2D), we note that this measure may capture processes involved in (manual) report, rather than in the experience of a perceptual state. We return to this nuance in our Discussion.
We next tested for an interaction between the direction of perceptual change and information differentiation by contrasting to DIF and to INT trials within the front and back ROIs (Fig 2E). As predicted, a cluster-based permutation test revealed an interaction between the direction of perceptual change and ROI for information differentiation (p < 0.01), showing higher diff-INFO in the anterior region prior to a change towards seeing the differentiated percept, without changes in diff-INFO in the posterior region (Fig 2E). Finally, we computed diff-INFO in the right and left ROIs using the temporal electrodes. No diff-INFO differences between perceptual switches were observed in either the right or the left ROI (S3 Fig).
We next analyzed the dynamics of information differentiation (diff-INFO) within frontal and parietal signals separately. Neural differentiation metrics quantify the diversity of information patterns within brain signals and have previously been useful for distinguishing between conscious states [38–40] and conscious contents [27]. We performed cluster-based permutation testing on the difference between the diff-INFO time series leading up to the integrated percept versus the differentiated percept for the same set of anterior (front) and posterior (back) electrodes. As expected, significantly increased diff-INFO was observed before the bistable stimulus was reported as differentiated compared to when it was reported as integrated. This effect was observed at approximately −1,300 to 250 ms around the button press, indicating the upcoming perceptual switch (Fig 2E) (note that in the 2,000 ms before the perceptual switch indicated by the response, no previous switches are incorporated). Importantly, no such effect was observed for the posterior electrodes (Fig 2D).
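To make the notion of within-ROI information diversity concrete, the sketch below computes a simple Lempel–Ziv-style score for one epoch of ROI data: each channel is binarized around its median and a normalized dictionary-based phrase count is averaged across channels. This is a generic stand-in rather than the paper's diff-INFO metric (which is defined in its Methods); the binarization, normalization, and array shapes are assumptions.

```python
import numpy as np

def lz_phrase_count(binary_seq):
    """Dictionary-based Lempel-Ziv phrase count of a binary sequence,
    a common proxy for the diversity of patterns in a signal."""
    s = "".join("1" if b else "0" for b in binary_seq)
    phrases, phrase = set(), ""
    for ch in s:
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

def roi_differentiation(roi_epoch):
    """Illustrative differentiation score for one epoch.
    roi_epoch: (n_channels, n_times) EEG from one ROI (e.g., the frontal ROI)."""
    n_channels, n_times = roi_epoch.shape
    norm = n_times / np.log2(n_times)  # rough scale of a maximally random sequence
    scores = [lz_phrase_count(roi_epoch[ch] > np.median(roi_epoch[ch])) / norm
              for ch in range(n_channels)]
    return float(np.mean(scores))
```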
Next, we tested for an interaction between the direction of perceptual change (to integrated, to differentiated) and information direction (front to back, back to front). We performed a cluster-based permutation test on the double subtraction of perceptual switch (to INT minus to DIFF trials) and information direction (feedback minus feedforward). We observed two significant clusters (cluster p < 0.01) showing stronger dir-INFO when perception switched to integrated as compared to differentiated in the front-to-back direction, but not in the back-to-front direction (Fig 2C). Finally, to test for the spatial specificity of the dir-INFO effect (i.e., the anterior–posterior direction), we computed dir-INFO in the right–left direction using temporal electrodes (S1B Fig). No dir-INFO differences between perceptual switches were observed in either the right-to-left or the left-to-right direction (S2 Fig).
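A sketch of how such an interaction (double-subtraction) cluster test could be set up is shown below, using MNE-Python's cluster-based permutation machinery on hypothetical per-subject dir-INFO maps of shape (n_subjects, n_delays, n_times). The variable names and the one-sample formulation of the interaction are assumptions; the thresholds, adjacency, and permutation counts of the published analysis may differ.

```python
import numpy as np
from mne.stats import permutation_cluster_1samp_test

def dir_info_interaction_test(fb_int, fb_dif, ff_int, ff_dif, n_permutations=5000):
    """Cluster-based permutation test on the double subtraction
    (to INT - to DIFF) x (feedback - feedforward).
    Each input: per-subject dir-INFO maps, shape (n_subjects, n_delays, n_times)."""
    interaction = (fb_int - fb_dif) - (ff_int - ff_dif)
    # one-sample test of the interaction against zero across subjects
    t_obs, clusters, cluster_pv, _ = permutation_cluster_1samp_test(
        interaction, n_permutations=n_permutations, tail=0, seed=0)
    # clusters with cluster_pv below the chosen alpha would be reported
    return t_obs, clusters, cluster_pv
```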
(A) Group-level dir-INFO between front and back ROIs (upper row) and between back and front ROIs (lower row) when moving plaids are reported as integrated (to INT; red color), reported as differentiated (to DIF; blue color), and the cluster-based permutation tests between the two. (B) Distribution of single-participant dir-INFO values for the frontal-to-parietal direction (upper row) and for the parietal-to-frontal direction (lower row). Values were extracted based on the significant clusters obtained in the frontal-to-parietal contrast (purple blobs). (C) Group-level dir-INFO statistical interaction effect computed as the difference between to INT and to DIFF trials between front and back ROIs (i.e., a double subtraction). (D) Group-level diff-INFO within the frontal ROI and the corresponding single-subject diff-INFO values (left panel) when moving plaids are reported as differentiated and reported as integrated, and the same for the parietal ROI (right panel). (E) Group-level diff-INFO statistical interaction effect computed as the difference between to DIF and to INT trials between front (yellow color) and back (black color) ROIs. Data are available at
https://osf.io/a2f3v/ .
In the case of the report condition, Fig 2A shows dir-INFO between frontal and parietal signals in both directions, i.e., in the feedback direction (anterior to posterior electrodes) and in the feedforward direction (posterior to anterior electrodes; see S1A Fig for electrode locations). We plot feedforward- and feedback-directed information as a function of signal delay between the two electrode sets, both when the moving plaids are reported as integrated (to INT; red color) and when they are reported as differentiated (to DIF; blue color). Signal delays are plotted on the y-axis, and testing time before the button press is plotted on the x-axis. A cluster-based permutation test performed on the difference between dir-INFO leading up to the integrated percept versus the differentiated percept showed two significant clusters (cluster p < 0.01). One cluster indicated an increase in dir-INFO for the switch-to-integrated condition compared to the switch-to-differentiated condition, at a delay of approximately 50 ms, during the period 1,050 to 900 ms prior to the button press. A second cluster indicated an increase in dir-INFO prior to a switch to the integrated percept at a delay of approximately 250 ms, spanning approximately 1,200 to 500 ms before the button press (Fig 2). Importantly, these significant clusters were observed in the feedback direction only. Note that there are no button presses in the time window of interest, due to our trial exclusion procedure.
We first investigated the neural dynamics of information integration when participants reported perceptual switches by pressing buttons. Here, information integration refers to the transformation of inputs into outputs through recurrent neural dynamics. These nonlinear transformations are essential for pattern extraction in neural networks and may lead to signal amplification and broadcasting [32]. To this end, we computed a metric of information integration known as directed information (dir-INFO), which quantifies directional connectivity between neural signals [33, 34]. Compared to traditional causality detection methods based on linear models (e.g., Granger causality), dir-INFO is a model-free measure and can detect both linear and nonlinear functional relationships between brain signals. We took advantage of previous work that made this measure statistically robust when applied to neural data [33, 35–37]. dir-INFO quantifies functional connectivity by measuring the degree to which the past of a “sender signal” X (e.g., EEG traces of anterior electrodes) predicts the future of another “receiver signal” Y (e.g., EEG traces of posterior electrodes), conditional on the past of the receiver signal Y. Thus, if there is significant dir-INFO between EEG signal X at one time and EEG signal Y at a later time, this shows that signal X contains information about the future of signal Y. Conditioning out the past of signal Y ensures that the delayed interaction provides new information over and above that available in the past of signal Y. For all dir-INFO analyses, we tested multiple delays from 0 ms to 500 ms (in steps of 4 ms) between the sender and receiver signals, which allows us to investigate the characteristic time delay of directed information transfer between anterior and posterior signals during perceptual switches. For all analyses reported here, we lock the data to perceptual switches (either marked by a button press or by eye movement analysis) and inspect the EEG dynamics leading up to this perceptual switch. We excluded all trials in which the previous perceptual switch occurred less than 2 seconds before the switch of interest (at time 0).
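To make the logic of the measure concrete, below is a minimal numpy sketch of a delay-resolved directed-information estimate, I(sender past; receiver future | receiver past), using a closed-form conditional mutual information under a Gaussian assumption. The published analyses used a statistically more robust estimator [33, 35–37]; the array shapes, the assumed sampling rate, and conditioning on a single lag of the receiver are simplifications for illustration.

```python
import numpy as np

def gaussian_cmi(x, y, z):
    """I(x; y | z) in bits, assuming the variables are jointly Gaussian.
    x, y, z: 1-D arrays with one value per trial."""
    def logdet(*cols):
        c = np.atleast_2d(np.cov(np.column_stack(cols), rowvar=False))
        return np.linalg.slogdet(c)[1]
    nats = 0.5 * (logdet(x, z) + logdet(y, z) - logdet(z) - logdet(x, y, z))
    return nats / np.log(2.0)

def directed_info(sender, receiver, delays):
    """Delay-resolved directed information from sender to receiver.
    sender, receiver: (n_trials, n_times) ROI-averaged EEG amplitudes.
    delays: delays in samples (start at 1 to avoid the degenerate zero-lag case).
    Returns (n_delays, n_times): I(sender[t-d]; receiver[t] | receiver[t-d])."""
    n_trials, n_times = sender.shape
    di = np.full((len(delays), n_times), np.nan)
    for i, d in enumerate(delays):
        for t in range(d, n_times):
            di[i, t] = gaussian_cmi(sender[:, t - d],    # sender's past
                                    receiver[:, t],      # receiver now
                                    receiver[:, t - d])  # receiver's own past
    return di

# Example: delays of 4-500 ms in 4 ms steps at an assumed 250 Hz sampling rate
# delays = np.arange(1, 126)
# feedback_di = directed_info(front_roi, back_roi, delays)  # hypothetical arrays
```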
To investigate whether perceptual ambiguity is resolved differently when the stimulus is reported as integrated or differentiated, we computed the time delay between OKN crossings and button presses in the report condition. We observed a longer delay between the OKN crossing and the button press when the stimulus was reported as integrated versus differentiated (t(1,39) = 2.23; p = 0.032; Cohen’s d = 0.352) (Fig 1F). This increase in reaction time after a change in percept has occurred (as confirmed via OKN) suggests a greater cognitive demand when reporting an integrated percept (transitioning from two objects to one) than when reporting a differentiated percept (going from one to two objects). We return to these features when discussing the time-frequency characteristics of perceptual switches.
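For completeness, a minimal sketch of this per-participant comparison (a paired t test plus an effect size computed on the difference scores) might look as follows; the variable names are hypothetical, and the paper's exact effect-size convention may differ.

```python
import numpy as np
from scipy import stats

def okn_to_button_delay_test(delay_to_int, delay_to_dif):
    """Paired comparison of OKN-crossing-to-button-press delays,
    one mean delay per participant and percept type (seconds)."""
    delay_to_int = np.asarray(delay_to_int)
    delay_to_dif = np.asarray(delay_to_dif)
    t, p = stats.ttest_rel(delay_to_int, delay_to_dif)
    diff = delay_to_int - delay_to_dif
    cohens_d = diff.mean() / diff.std(ddof=1)  # d computed on the paired differences
    return t, p, cohens_d
```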
To quantify the predictive value of the OKN crossings for identifying perceptual contents, we performed a decoding analysis using a convolutional neural network (CNN). To measure the generalization performance of the CNN, we split the OKN epochs and their corresponding button press labels into 3 parts: training, validation, and testing. Approximately 70% of the entire dataset was allocated to the training set. The remaining 30% was further split equally to create validation and test datasets, each containing 15% of the original data and labels (see Methods for details). First, we trained a CNN to decode the button presses from the oculomotor signal using a cross-classification procedure in the report condition. We obtained a classification accuracy of 85% (Fig 1D). Next, in the same report condition, a second CNN was trained to decode the labels based on the OKN crossings, reaching a classification accuracy of 79%, indicating that changes in perception could be reliably decoded from OKN changes. We then estimated OKN crossings in the no-report data, labeled each crossing accordingly, and trained a third CNN to decode the labels, obtaining a classification accuracy of 80% (Fig 1D). Finally, another indication that our CNN accurately marks the occurrence of perceptual switches is that, in the report condition, the number of switches estimated from button presses correlates strongly with the number estimated by the CNN (across-subject Pearson’s r = 0.88; p < 0.001; see [23] for a similar analysis).
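The architecture of the CNN is not spelled out in this passage, but a minimal 1-D CNN classifier with the 70/15/15 train/validation/test split described above could look like the sketch below (Keras). The array names, layer sizes, and training settings are assumptions, not the paper's implementation.

```python
import tensorflow as tf
from sklearn.model_selection import train_test_split

def train_okn_cnn(X, y, n_timepoints):
    """X: OKN epochs, shape (n_epochs, n_timepoints); y: binary percept labels
    (0 = to differentiated, 1 = to integrated). Returns the model and test accuracy."""
    # 70 / 15 / 15 split into train / validation / test sets
    X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.30,
                                                      stratify=y, random_state=0)
    X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.50,
                                                    stratify=y_tmp, random_state=0)
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_timepoints, 1)),
        tf.keras.layers.Conv1D(16, kernel_size=7, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X_train[..., None], y_train,
              validation_data=(X_val[..., None], y_val),
              epochs=30, batch_size=32, verbose=0)
    _, test_acc = model.evaluate(X_test[..., None], y_test, verbose=0)  # held-out accuracy
    return model, test_acc
```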
We first characterized changes in oculomotor signals in both directions of perceptual change (vertical/horizontal) in the report condition, time-locked to the button response. We computed the slow phase of the OKN (see Methods) as it distinguishes between visual percepts during binocular rivalry [9, 25, 28–30]. The OKN is particularly useful because its positive and negative zero-crossings tend to precede the perceptual changes indicated by button presses. We observed OKN zero-crossings going from positive to negative before the visual stimulus was reported as integrated, and from negative to positive when the stimulus was reported as differentiated (Fig 1C). In the left panel of Fig 1C (Report: BP), we depict the raw OKN time series time-locked to the button press (−200 to 500 ms) in microvolts. Next, for both the report (Report: OKN; Fig 1C) and no-report (No report: OKN; Fig 1C) conditions, after computing the OKN crossings in the continuous oculomotor signal, we labeled the data with their corresponding button press labels and created epochs locked to the OKN crossings, which, as expected, revealed a clear moment for the perceptual transitions (Fig 1C). In the middle and right panels of Fig 1C, we plot the slow phase of the OKN signal locked to the zero-crossings in arbitrary units, computed as described in the Methods. It is important to note the similarity of the OKN data in the report and no-report conditions, demonstrating its potential to capture the contents of perception in both cases.
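As a rough illustration of the slow-phase extraction, the sketch below derives a smoothed velocity trace from a continuous horizontal gaze signal and locates its zero-crossings. This is a deliberately simplified stand-in for the procedure in the Methods: real OKN processing typically also removes fast-phase (saccadic) segments, and the sampling rate, smoothing width, and function names here are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def okn_slow_phase(horizontal_gaze, fs=1000.0, smooth_ms=100.0):
    """Slow-phase OKN velocity as the smoothed temporal derivative of the
    horizontal gaze trace (arbitrary units). fs: sampling rate in Hz (assumed)."""
    velocity = np.gradient(horizontal_gaze) * fs     # instantaneous velocity
    sigma = (smooth_ms / 1000.0) * fs                # smoothing width in samples
    return gaussian_filter1d(velocity, sigma=sigma)  # suppress fast-phase spikes

def okn_zero_crossings(slow_phase):
    """Indices where the slow-phase velocity changes sign, with the direction of
    the crossing (+1: negative-to-positive, -1: positive-to-negative)."""
    sign = np.sign(slow_phase)
    idx = np.where(np.diff(sign) != 0)[0]
    direction = np.sign(np.diff(sign)[idx]).astype(int)
    return idx, direction
```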
(A) Phenomenology during visual bistability in the report condition. Participants observed an ambiguous stimulus (moving plaids) that is experienced either as one plaid moving vertically (integrated percept; red arrow) or as two plaids moving horizontally (differentiated percept; blue arrows). Perceptual transitions occur either in the integrated-to-differentiated direction (to DIF; blue) or in the differentiated-to-integrated direction (to INT; red). Middle row: Behavioral responses during the task. Participants pressed one button when perceiving that the integrated percept had fully changed into the differentiated percept (blue button) and another button when perceiving that the differentiated percept had fully changed into the integrated percept (red button). Bottom row: Dynamical analyses for EEG and oculomotor (eye tracking) signals. From the oculomotor response, we estimated the speed of the slow phase of the OKN (grey line) to infer participants’ perceptual content around OKN zero-crossings (to INT: red eye; to DIFF: blue eye), which coincides with their reports indexed by button presses (to INT: red button; to DIFF: blue button). (B) In the no-report condition, observers passively viewed the bistable stimulus (no button presses), and perceptual alternations were inferred from oculomotor signals (to INT: red eye; to DIFF: blue eye). EEG y-axis represents voltage (microvolts) and x-axis time (milliseconds); OKN y-axis represents velocity (arbitrary units) and x-axis time (milliseconds). (C) Left panel: Raw oculomotor signal (in microvolts) locked to the button press (Report: BP). Middle and right panels: slow phase of the OKN (in arbitrary units) extracted from the oculomotor signal and locked to the OKN zero-crossings (Report: OKN; No report: OKN). The slow phase of the OKN was obtained by computing the instantaneous velocity of the oculomotor signal smoothed over time (see Methods). Decoding accuracies (D) and histograms (E) for perceptual switches in the report condition locked to the button response, in the report condition locked to the OKN crossings, and in the no-report condition locked to the OKN crossings. (F) Delay between OKN crossings and button responses in the report condition (left panel: single-subject distributions; right panel: group-level analysis). Data are available at
https://osf.io/a2f3v/ .
There were two experimental conditions, performed in alternating runs. During the report runs (Fig 1A), observers pressed one button when the percept changed from vertical movement to horizontal movement and another button to indicate changes from horizontal movement to vertical movement (buttons were counterbalanced across blocks). During the no-report runs (Fig 1B), observers were instructed to maintain central fixation and simply view the stimulus passively; perceptual transitions were therefore rendered task-irrelevant. Participants were informed neither about the relationship between perception and oculomotor signals nor about our goal to infer perception from their eye movements.
Discussion
Perceptual rivalry is the phenomenon whereby the perceptual interpretation of a bistable visual stimulus alternates over time in the absence of physical changes in the presented sensory input. The use of perceptual rivalry as a tool has provided fundamental insight into which neural processes may reflect the content of our conscious experience. However, although great progress has been made in unraveling the neural processes underlying the competition between neural representations and the perceptual change between alternative percepts, it has proven notoriously difficult to separate neural correlates of perceptual switches from processes that have another origin and are associated, e.g., with report, attention, surprise, or commitment to a decision [8,9]. Here, we aimed to tackle this issue using an experimental setup that contained four crucial ingredients. First, we experimentally introduced conditions in which perceptual switches were task-relevant and had to be reported, versus conditions in which perceptual transitions were task-irrelevant and bistable stimuli simply had to be viewed passively [23,25,46]. Second, we relied on neural measures with millisecond temporal resolution, allowing us to pinpoint the relevant neural processes as they evolve over time. Third, we capitalized on novel information-based measures that have been shown to reflect the phenomenology of conscious perception [27,33], allowing us to track specific perceptual content over time, while it naturally alternates. Fourth and finally, we introduced a novel way to analyze eye-tracking data to pinpoint precisely when in time perceptual interpretations alternate, allowing us to time-lock our analyses so as to separate processing leading up to the perceptual change (times prior to the OKN crossing) from processing involved in translating perception into action, as well as other cognitive confounds arising after the perceptual switch (times after the OKN crossing, prior to the button press). This effort has led to several novel results and conclusions, which are summarized below.
We showed that perceptual switches can systematically be inferred from eye movement measurements during passive viewing of a bistable visual stimulus by capitalizing on OKN measures in combination with deep neural network modeling. This confirms previous work showing that OKN and pupil measures can act as reliable indicators of the dynamics of perceptual and binocular rivalry [25,26,28]. In fact, percept durations varied comparably between experimental blocks in which perceptual switches had to be reported (and were therefore task-relevant) and blocks in which observers passively viewed the bistable stimulus (perceptual switches task-irrelevant). In both conditions, observers’ percept durations followed a right-skewed gamma distribution, as has been observed previously across a wide range of bistability paradigms [31], suggesting that the phenomenology of perception was similar during active report and passive viewing. Next, we showed that when the perception of a bistable stimulus was explicitly reported, directed information from anterior to posterior signals was increased before the observers reported perceiving the integrated stimulus (one plaid moving upwards) compared to when the differentiated percept was reported. Note that although the posterior signals were not the source but the receiver of the information flow, posterior signals were still involved in the processing of the percept, as the feedback pattern only emerged from the statistical relationship between anterior and posterior signals. A different situation was observed when signals were analyzed in isolation using the information differentiation metric: before the percept was reported as differentiated (two stimuli, each moving sideways in a different direction), we observed higher information differentiation of anterior signals, as compared to integrated percepts, leading up to the perceptual switch indicated by the button press. This effect was not observed over posterior sensors. These results indicate that our information-based measures derived from electrophysiological activity capture and track the phenomenology of the percept during perceptual ambiguity. They agree with our previous report-based study on auditory bistable perception showing a correspondence between perceptual integration and differentiation and neural metrics of information integration and differentiation [27]. During report, when dir-INFO is high, an integrated percept is likely to be perceived, whereas when diff-INFO is high, a differentiated percept is more likely to be perceived.
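As a side note on this duration analysis, checking whether percept durations follow the typical right-skewed gamma shape is straightforward; a minimal sketch with SciPy (hypothetical data array, location fixed at zero) is shown below.

```python
import numpy as np
from scipy import stats

def fit_percept_durations(durations_s):
    """Fit a gamma distribution to percept durations (seconds) and return the
    fitted shape and scale plus a Kolmogorov-Smirnov goodness-of-fit p-value."""
    durations_s = np.asarray(durations_s)
    shape, loc, scale = stats.gamma.fit(durations_s, floc=0)  # location fixed at 0
    _, ks_p = stats.kstest(durations_s, "gamma", args=(shape, loc, scale))
    return shape, scale, ks_p
```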
When perceptual changes were inferred from eye movements during passive viewing, we observed that our directed integration measure still tracked the perceptual state of the observer, but uniquely so in the feedback direction (from anterior to posterior signals). However, information differentiation measures no longer tracked the evolving percept, suggesting that our differentiation measures may reflect other processes not directly linked to perception, such as differences in task set or in attention due to the necessity to report. In sum, the relative strength of directed information in the feedback direction indicated which percept likely dominated during passive viewing, whereas neural differentiation did not. We would like to note that directed information is thus a “symmetrical measure” in the sense that, when it is high, an integrated percept is likely to be dominant, whereas when it is low, a differentiated stimulus is likely to dominate perception.
During ambiguous perception (e.g., bistability, rivalry), feedback connections are thought to be critical for the comparison of internally generated predictions of sensory input with actual inputs [18,20]. Neurobiologically, feedback connections are vastly more numerous and divergent than feedforward ones, i.e., fewer neurons project in a feedforward than in a feedback manner [47]. We believe that this dominance of feedback signals may account for the increase in directed information we observed prior to perceiving the integrated (vertical) motion percept (Figs 2A, 3A and 4A). More specifically, for our bidirectional plaid stimulus, perceiving an integrated percept of unified vertical motion required the holistic combination of local signals across the visual field. Thus, for perceptual integration, local signals from low-level areas may have required enhanced feedback modulations from higher areas to integrate visual information across space [48]. This integration can be seen as hypothesis testing, in which a high-level interpretation informs low-level features, with feedback projections thought to enhance neural activity [49–51]. Supporting this view, recent studies on binocular rivalry have shown that PFC neurons increase their firing rate before the occurrence of perceptual switches, both in monkeys [29] and in humans [52]. In the case of multiunit activity (MUA) recordings in humans, increased frontal activation precedes the activity of lower areas such as MT, suggesting that the frontal cortex biases sensory information in the lower areas towards one of the two perceptual contents in a top-down feedback manner [52]. Thus, our results agree with this prior research, as stronger top-down modulation preceded the integrated interpretation of our bistable plaid stimulus.
It is important to stress that we used information-specific EEG measures, which allowed us to track the contents of rivalrous states while quantifying the directionality of integration for our observers. This approach is similar to previous binocular rivalry studies using, e.g., alternating face/house stimuli and isolating (the competition between) neural activity in feature- or category-selective brain regions, either using electrophysiology [20] or fMRI [16,53]. The use of these EEG information-based measures enabled us to track the neural signatures of the content of perception with high temporal precision before perception alternated, similar to influential electrophysiological studies using feature-specific neural responses [12,29,52,54]. Because of the low temporal resolution of the BOLD signal, previous fMRI studies aiming to isolate the neural cause of perceptual switches during rivalry have often included a replay condition with yoked switches simulated using video replay. These replay trials mimic perceptual switches but have no neural cause as they are not driven by fluctuations in brain activity. fMRI studies have shown that the DLPFC activates more strongly for perceptual switches during rivalry (as compared to baseline) and often also activates more strongly during rivalry than during replay [7,25,31,46,55]. Frassle and colleagues showed that, when comparing rivalry to replay conditions, there was an increase in DLPFC activity when observers actively reported perceptual changes. However, the difference between rivalry and replay diminished in prefrontal areas during passive viewing [25], suggesting that actively reporting on rivalry recruits additional prefrontal resources compared to reporting on replay, which is not required for passively experiencing a change in percept (see also, e.g., [16]). Similarly, Brascamp and colleagues [31] have shown that prefrontal activation in fMRI (or in their words “responses in executive brain areas” in general) may be driven by the fact that perceptual switches often draw attention (possibly because they are surprising and relevant) and require report but are not related to perceptual transitions per se. They have shown this by designing a task in which switches in perception between two stimuli (dots moving in different directions in the two eyes) could go unnoticed by observers, as if these switches were preconscious (potentially accessible but not actually accessed) [56]. Although these switches went unnoticed by the participants, sensory brain regions still showed neural activity patterns associated with those switches, whereas switch-related modulations in executive areas were minimized (although still numerically higher than baseline). Based on such results, it has been argued that PFC involvement during rivalry may potentially be related to report, or other consequences of perceptual switches, but that the PFC does not drive those switches [25]. Interestingly, our diff-INFO results also suggest that frontal information differentiation may reflect processing associated with report rather than perception per se. In the report condition, we found that reporting the stimulus as integrated took longer than reporting it as differentiated, indicative of a task-dependent increase in the cognitive demand of disambiguating the stimulus identity.
It may be that these differences can explain the frontal information differentiation effect observed during the report of perceptual switches, compared to the absence of this effect during the passive viewing condition. Knapen and colleagues [7] showed that large parts of the prefrontal network, again especially the dorsolateral PFC, activate more strongly for perceptual transitions with longer durations, i.e., when the system takes longer to settle into a particular perceptual state (the transition phase between interpretations is longer) than when the transition phase is shorter. Finally, a recent study [18] has shown that transcranial stimulation of the right inferior frontal cortex (IFC) reduced the occurrence of perceptual changes for bistable stimuli when they had to be actively reported (see also [57] for similar results while stimulating the DLPFC). The authors argued that the IFC may register perceptual conflicts (or the mismatch/prediction error signal) between two possible perceptual interpretations, gradually building up towards the perceptual switch. Therefore, the IFC may be influencing the competition between pools of neurons coding for the different percepts in the visual cortex, thereby “steering” or “codetermining” conscious perception [19]. Our front-to-back dir-INFO results are more in line with this latter finding, suggesting that, even during no-report conditions, subtle influences from frontal signals codetermine perceptual switches during visual bistability (and during report-based auditory bistability; S4 Fig; but see [57]).
The experiment designed here can be regarded as a so-called “no-report paradigm” and not the stricter “no-cognition paradigm,” because eliminating any type of explicit report does not necessarily remove all processes preceding report that could occur throughout the experiment [5]. Observers may still reason, think, or reflect about the presented bistable stimulus and perceptual alternations. This issue is similar to previous attempts to minimize cognitive processes during perceptual rivalry [23,25,46]. In our experiment, the crucial aspect is that perceptual switches were task-irrelevant during the passive viewing condition and task-relevant during the report condition [58]. We cannot and did not control what observers were doing during passive viewing and therefore do not know whether and how observers were reasoning about the perceived stimulus or the occurrence of perceptual switches. Therefore, we only intended to arbitrate between task/report-related and perception-related information-based measures of perceptual transitions. Interestingly, using a similar setup combining OKN measures with a manipulation of the task relevance of perceptual switches, it has recently been shown that task/report-related and perception-related dilations and constrictions can also be separated in the pupil response [23]. Finally, a cautionary note on the experimental design is that it is possible that participants were preparing an overt response in both conditions but simply withheld their response in the no-report condition. This may be a consequence of the alternating-block design, which we chose to equate the conditions as much as possible (e.g., to match rates of perceptual learning, fatigue, and drowsiness over time between conditions). It is possible that perceptual changes were attended to in the no-report blocks, although they were not task-relevant.
We note, however, that we also observed frontal theta-band power increases uniquely in the manual report condition, supporting the separation between report and no-report states in our design. Frontal-central theta power has often been related to the processing of conflict, both at the level of stimuli and for conflicting stimulus–response mappings [59,60]. In the time-frequency response-locked analyses, we observed increased theta-band power leading up to the integrated compared to the differentiated percept, as well as longer reaction times between OKN crossings and report. This pattern of results agreed with the analysis of information differentiation in the report condition (Fig 2D), which was also only observed in the response-locked analyses. Combined, these results suggest that there may be additional differences between reporting an integrated and a differentiated percept, which are contingent on subjective conflict, an exciting possibility for future research that may unify mechanisms of perceptual and cognitive decision-making [61].
Despite much debate in the field, influential theories of consciousness [62], such as Global Neuronal Workspace theory, Recurrent Processing theory, and Integrated Information Theory, agree that the common property of conscious experience relates to the brain’s capacity to integrate information through recurrent processing (the combination of lateral and feedback interactions), enhancing cortico-cortical interactions at a local scale (between nearby regions) or a global scale (between distant regions) [63–65]. Note that such consciousness theories are mostly concerned with explaining which neural processes can be observed when sensory information “crosses the threshold” from subliminal (unconscious) to phenomenal or access consciousness and thus aim to isolate the necessary constituents of conscious experience. Instead, we focus here on which neural processes influence the competition between alternative interpretations of ambiguous visual input (see [19] for a short review of this difference). In light of that latter debate, we show that the amount of directed information flow from anterior to posterior electrodes reflects the likelihood of perceiving the integrated version of an ambiguous stimulus during perceptual ambiguity, even in a context where report is not required. The low spatial resolution of our EEG measurements, however, precludes any claims about the specific origins of these observed feedback signals, and future studies are needed to obtain more spatial specificity as well as a more mechanistic explanation of the neural processes that the directed information measure reflects.
In conclusion, our results suggest that the relevant dynamical mechanism for perceiving different contents during visual bistability, controlling for many factors associated with reporting perception, is the directed information between frontal and posterior signals, rather than the isolated information differentiation contained within the front or the back of the brain.
---
[1] URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002120
Published and (C) by PLOS One
Content appears here under this condition or license: Creative Commons - Attribution BY 4.0.