The post headline may seem an oxymoron, but it is indeed possible to perceive colors unconsciously. How do we know that someone has perceived a color when they don’t report the qualia? We do so by measuring the effects on subsequent behavior. Consider subliminal priming, for example a subliminal modified Stroop test in which color patches are presented subliminally and color lexical terms are then presented consciously in neutral (say black) ink. I’m confident that with such a test one could still get an interaction effect between color and lexical term; the point is that color, even when not consciously perceived, may still influence subsequent behavior.
The experimental paradigm in this PNAS article did not go so far, but restricted itself to color stimuli that were not attended to; that is, the color was indeed perceived, but the task involved attention to form rather than color, so the authors presumed that any effects the color information had on behavior would be completely unconscious. I’m not convinced, but that doesn’t invalidate their otherwise very beautiful study, which once again provides strong evidence for the milder version of the Sapir-Whorf hypothesis, at least as it relates to categorical color perception.
Now, I have written previously about the Sapir-Whorf hypothesis in general, and in particular about the ability of Russians (who have two separate terms for light and dark blue) to visually discriminate between light and dark blue significantly better than their English counterparts, thanks to their richer color lexicon. So this new study, which found that Greek natives (who also have different lexical terms for light and dark blue) were superior to English natives in categorical color perception for light and dark blue, did not come as a surprise or seem ground-breaking; but there are important differences, both in the procedures used and in the processes involved.
This study works at the pre-attentive level and uses physiological measures like ERPs (the authors studied the vMMN, or visual Mismatch Negativity) to determine whether the color stimuli had a differential effect even in pre-attentive perception, and thus provides independent evidence for the effect of language on color perception. I’ll now quote from the abstract and discussion section:
It is now established that native language affects one’s perception of the world. However, it is unknown whether this effect is merely driven by conscious, language-based evaluation of the environment or whether it reflects fundamental differences in perceptual processing between individuals speaking different languages. Using brain potentials, we demonstrate that the existence in Greek of 2 color terms—ghalazio and ble—distinguishing light and dark blue leads to greater and faster perceptual discrimination of these colors in native speakers of Greek than in native speakers of English. The visual mismatch negativity, an index of automatic and preattentive change detection, was similar for blue and green deviant stimuli during a color oddball detection task in English participants, but it was significantly larger for blue than green deviant stimuli in native speakers of Greek. These findings establish an implicit effect of language-specific terminology on human color perception.
This study tested potential effects of color terminology in different languages on early stages of visual perception using the vMMN, an electrophysiological index of perceptual deviancy detection. The vMMN findings show a greater distinction between different shades of blue than different shades of green in Greek participants, whereas English speakers show no such distinction. To our knowledge, this is the first demonstration of a relationship between native language and unconscious, preattentive color discrimination rather than simply conscious, overt color categorization.
To conclude, our electrophysiological findings reveal not only an effect of the native language on implicit color discrimination as indexed by preattentive change detection but even electrophysiological differences occurring as early as 100 ms after stimulus presentation, a time range associated with activity in the primary and secondary visual cortices (22). We therefore demonstrate that language-specific distinctions between 2 colors affect early visual processing, even when color is task irrelevant. At debriefing, none of the participants highlighted the critical stimulus dimension tested (luminance) or reported verbalizing the colors presented to them. The findings of the present study establish that early stages of color perception are unconsciously affected by the terminology specific to the native language. They lend strong support to the Whorfian hypothesis by demonstrating, for the first time, differences between speakers of different languages in early stages of color perception beyond the observation of high-level categorization and discrimination effects strategically and overtly contingent on language specific
I think this fits in with predictive models of perception, wherein earlier stages of visual processing that are unrelated to color discrimination may still be primed by color information that was obtained earlier and processed pre-attentively. I, as always, am excited by this evidence for the Whorfian hypothesis.
The 54th edition of Encephalon, the premium brain carnival, is now up and running on the Neurophilosophy blog. There are many interesting articles there, like the one on color vision (I too have written about color vision extensively in the past), so go have a look and savor whatever takes your fancy!
New research has found that grapheme-color synesthesia is not idiosyncratic but follows some typical patterns. Grapheme-color synesthesia is one of the most common types of synesthesia, wherein one sees a color associated with a visualized letter. Thus, whenever one sees the letter ‘A’ one may also have a perception of the color red. Till now, it was believed that this association of colors with letters was random and idiosyncratic; but new research has revealed that it follows a pattern, with most synesthetes likely to associate typical colors with letters, for example reporting ‘A’ as red and ‘V’ as purple.
Jamie Ward’s team, which found this phenomenon, speculates that the color could be associated with the frequency of the letter. Thus, as ‘A’ is a frequently used letter, it is associated with a common color, red; ‘V’, which is infrequently used in the lexicon, is associated with a similarly infrequently encountered color, purple. I am not sure how their new study differs from their earlier study that also found this association, and I believe there may be some truth to their theory. However, the Science Daily article also talks about saturation, so I thought I would jump in.
Colors can be conceptualized as per the HSV/HSL/HSB system and understood in terms of hue, saturation, and value/brightness. I would personally be inclined to interpret the “‘A’ is red and ‘V’ is purple” mapping as the outcome of mapping the alphabet order (a, b, c, … x, y, z) onto the color order in the rainbow / hue dimension (VIBGYOR). ‘A’ is at one end of the spectrum and thus red in color, while ‘V’ is at the other end and thus more likely to be violet. The frequency of usage of the letter should ideally map to the brightness/value of the synesthetic color, as in color space value is mapped to the amount of light reflected. Saturation, or ‘purity’, of color is a bit more difficult to map onto the alphabet; but one could venture forth and suggest it has to do with how ‘pure’ the letter is: is it always pronounced in one way, or are there multiple pronunciations associated with the same letter?
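This speculated mapping can be sketched in a few lines of code. To be clear, this is purely my illustrative toy, not anything from Ward’s study: the letter-frequency table is a rough approximation for English text, and the linear alphabet-position-to-hue scale is an assumption.

```python
import colorsys

# Approximate relative frequencies of letters in English text (percent);
# rough illustrative values, not data from the Ward study.
LETTER_FREQ = {
    'A': 8.2, 'B': 1.5, 'C': 2.8, 'D': 4.3, 'E': 12.7, 'F': 2.2,
    'G': 2.0, 'H': 6.1, 'I': 7.0, 'J': 0.15, 'K': 0.77, 'L': 4.0,
    'M': 2.4, 'N': 6.7, 'O': 7.5, 'P': 1.9, 'Q': 0.095, 'R': 6.0,
    'S': 6.3, 'T': 9.1, 'U': 2.8, 'V': 0.98, 'W': 2.4, 'X': 0.15,
    'Y': 2.0, 'Z': 0.074,
}
MAX_FREQ = max(LETTER_FREQ.values())

def letter_to_rgb(letter):
    """Map a letter to an (r, g, b) triple in [0, 1] under the
    hypothesized mapping: alphabet position -> hue, frequency -> value."""
    pos = ord(letter.upper()) - ord('A')        # 0 for 'A' ... 25 for 'Z'
    hue = (pos / 25) * 0.78                     # 0.0 = red ... ~0.78 = violet
    # More frequent letters come out brighter (higher value).
    value = 0.4 + 0.6 * LETTER_FREQ[letter.upper()] / MAX_FREQ
    return colorsys.hsv_to_rgb(hue, 1.0, value)
```

Under this sketch, `letter_to_rgb('A')` lands at the red end of the hue axis and comes out bright (A is frequent), while `letter_to_rgb('V')` lands near the blue/violet end and comes out dim, which is exactly the ‘A’-red, ‘V’-purple pattern being discussed.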
Mapping a linear progression of hues along the VIBGYOR axis to alphabet or numeral order is not that hard to envisage or visualize. If neurons of adjacent ‘colorotopic’ and ‘lexicotopic’ maps in the brain (assuming there are such maps for color and lexicon) overlap or cross over, we would have a grapheme-color synesthesia that accounted for the commonalities in hue-letter associations. However, we currently know only of retinotopic-style maps in the brain, and these fit in with our existing knowledge. How the brain stores information about saturation and value, and correspondingly the frequency and purity of letters, and maps between the two, could lead to novel insights into how information is stored in the brain.
I am excited and believe that we are on the verge of breaking new ground (I haven’t yet read the new Jamie Ward paper though), and I have my own theories on why color is so important and may provide us many more clues (color and music are the two most interesting phenomena, I believe). Are you excited? Do you have any theories?
PS: I just found that Jamie Ward is writing a book called “The Frog who Croaked Blue: Synaesthesia and the Mixing of the Senses”, in which he recounts the experience of a synesthete who heard frog croaks as blue and the chirping of crickets as red. To me this immediately conjures up the colortopic map with red at one end (high, shrill noises) and blue at the other (lower, bass noises). This mapping of sounds to colors may again follow the three dimensions of hue, saturation, and value, with the loudness of a sound proportional to the value of the color perceived, and pitch mapped to hue. Also, this may be an idiosyncratic experience, or it may be true of the species as a whole that we map shriller noises to red and soothing, duller sounds to blue/violet.
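The pitch-to-hue, loudness-to-value mapping speculated above can also be sketched in code. Again, this is just my toy model: the logarithmic pitch scale, the 20 Hz–20 kHz band, and the decibel-to-brightness scale are all assumptions for illustration, not anything from Ward’s book.

```python
import colorsys
import math

def sound_to_rgb(freq_hz, loudness_db):
    """Map pitch to hue (high/shrill -> red, low/bass -> blue/violet)
    and loudness to brightness (value), per the hypothesized mapping."""
    freq_hz = min(max(freq_hz, 20.0), 20000.0)            # clamp to a rough audible band
    pitch = math.log(freq_hz / 20.0) / math.log(1000.0)   # 0.0 (low) ... 1.0 (high)
    hue = (1.0 - pitch) * 0.78        # shrill -> red end (0.0), bass -> violet end (~0.78)
    value = min(max(loudness_db / 100.0, 0.0), 1.0)       # louder = brighter
    return colorsys.hsv_to_rgb(hue, 1.0, value)
```

With these assumed scales, a cricket-like chirp around 5 kHz comes out toward the warm/red end, while a frog-like croak around 200 Hz comes out toward the blue end, matching the synesthete’s report.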
Last month I touched on the Sapir-Whorf hypothesis and how Russians show better categorical perception (CP) of color, thanks to the fact that they have a richer color-term lexicon than English speakers.
I have also covered P. Kay’s research on color terms and their evolution earlier. Now a new PNAS paper by Kay et al. shows that while the left hemisphere (LH), which is involved in language, shows a superior CP effect in adults, the reverse trend is seen in infants; i.e., infants show stronger CP of colors when the stimuli are presented to the left visual field (LVF) and hence processed by the right hemisphere (RH).
Their hypothesis was that while the CP of colors in adults is mediated by language, the CP in infants is non-verbal, and the CP in adults may or may not build on this childhood CP ability. The results show that not only does language drive the left-hemisphere dominance in categorical perception of colors; it does so by overriding an inborn RH dominance for the same task. Thus, there is little doubt that the color-term lexicon heavily influences how we categorize colors in adulthood.
Here is their conclusion:
Evidence suggesting that color CP varies cross-linguistically, and that color CP is eliminated by verbal interference, has supported the hypothesis that color CP depends on access to lexical codes for color. However, the finding of color category effects in prelinguistic infants and toddlers has led others to argue that language cannot be the only origin of the effect. The current study finds evidence to support both positions. Color CP is found in 4- to 6-month-old infants, replicating previous infant studies. However, the absence of a category effect in the LH for infants, but the presence of a greater LH than RH category effect for adults, suggests that language-driven CP in adults may not build on prelinguistic CP, but that language instead imposes its categories on a LH that is not categorically prepartitioned. The current findings may therefore suggest a compromise between the two positions: there is a form of CP that is nonlinguistic and RH based (found in infancy) and a form of CP that is lexically influenced and biased to the LH (found in adulthood). Color CP is found for both infants and adults, but the contribution of the LH and RH to color CP appears to change across the life span.
I have blogged extensively before about language, color, and the Sapir-Whorf hypothesis. My position is clear: I lean towards the Sapir-Whorf hypothesis and a mild form of linguistic determinism. Now a study (which I had missed earlier) by Lera Boroditsky and colleagues presents further corroborating evidence that language influences even such basic functions as color perception. As per their 2007 PNAS paper, Russians are able to distinguish between light blue and dark blue more speedily in an objective color-perception task, thanks to the fact that Russian has separate color terms for dark blue and light blue. It is an excellent paper, and I present some excerpts from the introduction:
Different languages divide color space differently. For example, the English term ‘‘blue’’ can be used to describe all of the colors in Fig. 1. Unlike English, Russian makes an obligatory distinction between lighter blues (‘‘goluboy’’) and darker blues (‘‘siniy’’). Like other basic color words, ‘‘siniy’’ and ‘‘goluboy’’ tend to be learned early by Russian children (1) and share many of the usage and behavioral properties of other basic color words (2). There is no single generic word for ‘‘blue’’ in Russian that can be used to describe all of the colors in Fig. 1 (nor to adequately translate the title of this work from English to Russian). Does this difference between languages lead to differences in how people discriminate colors?
The question of cross-linguistic differences in color perception has a long and venerable history (e.g., refs. 3–14) and has been a cornerstone issue in the debate on whether and how much language shapes thinking (15). Previous studies have found cross-linguistic differences in subjective color similarity judgments and color confusability in memory (4, 5, 10, 12, 16). For example, if two colors are called by the same name in a language, speakers of that language will judge the two colors to be more similar and will be more likely to confuse them in memory compared with people whose language assigns different names to the two colors. These cross-linguistic differences develop early in children, and their emergence has been shown to coincide with the acquisition of color terms (17). Further, cross-linguistic differences in similarity judgments and recognition memory can be disrupted by direct verbal interference (13, 18) or by indirectly preventing subjects from using their normal naming strategies (10), suggesting that linguistic representations are involved online in these kinds of color judgments.
Because previous cross-linguistic comparisons have relied on memory procedures or subjective judgments, the question of whether language affects objective color discrimination performance has remained. Studies testing only color memory leave open the possibility that, when subjects make perceptual discriminations among stimuli that can all be viewed at the same time, language may have no influence. In studies measuring subjective similarity, it is possible that any language-congruent bias results from a conscious, strategic decision on the part of the subject (19). Thus, such methods leave open the question of whether subjects’ normal ability to discriminate colors in an objective procedure is altered by language.
Here we measure color discrimination performance in two language groups in a simple, objective, perceptual task. Subjects were simultaneously shown three color squares arranged in a triad (see Fig. 1) and were asked to say which of the bottom two color squares was perceptually identical to the square on top.
This design combined the advantages of previous tasks in a way that allowed us to test for the effects of language on color perception in an objective task, with an implicit measure and minimal memory demands.
First, the task was objective in that subjects were asked to provide the correct answer to an unambiguous question, which they did with high accuracy. This feature of the design addressed the possibility that subjects rely only on linguistic representations when faced with an ambiguous task that requires a subjective judgment. If linguistic representations are only used to make subjective judgments in ambiguous tasks, then effects of language should not show up in an objective unambiguous task with a clear correct answer.
Second, all stimuli involved in a perceptual decision (in this case, the three color squares) were present on the screen simultaneously and remained in full view until the subjects responded. This allowed subjects to make their decisions in the presence of the perceptual stimulus and with minimal memory demands.
Finally, we used the implicit measure of reaction time, a subtle aspect of behavior that subjects do not generally modulate explicitly. Although subjects may decide to bias their decisions in choosing between two options in an ambiguous task, it is unlikely that they explicitly decide to take a little longer in responding in some trials than in others.
In summary, this design allowed us to test subjects’ discrimination performance of a simple, objective perceptual task. Further, by asking subjects to perform these perceptual discriminations with and without verbal interference, we are able to ask whether any cross-linguistic differences in color discrimination depend on the online involvement of language in the course of the task.
The questions asked here are as follows. Are there crosslinguistic differences in color discrimination even for simple, objective, perceptual discrimination tasks? If so, do these differences depend on the online involvement of language? Previous studies with English speakers have demonstrated that verbal interference changes English speakers’ performance in speeded color discrimination (21) and in visual searching (22, 23) across the English blue/green boundary. If a color boundary is present in one language but not another, will the two language groups differ in their perceptual discrimination performance across that boundary? Further, will verbal interference affect only the performance of the language group that makes this linguistic distinction?
They then go on to discuss their experimental setup (which I recommend you go and read). Finally they present their findings:
We found that Russian speakers were faster to discriminate two colors if they fell into different linguistic categories in Russian (one siniy and the other goluboy) than if the two colors were from the same category (both siniy or both goluboy). This category advantage was eliminated by a verbal, but not a spatial, dual task. Further, effects of language were most pronounced on more difficult, finer discriminations. English speakers tested on the identical stimuli did not show a category advantage under any condition. These results demonstrate that categories in language can affect performance of basic perceptual color discrimination tasks. Further, they show that the effect of language is online, because it is disrupted by verbal interference. Finally, they show that color discrimination performance differs across language groups as a function of what perceptual distinctions are habitually made in a particular language.
They end on a philosophical note:
The Whorfian question is often interpreted as a question of whether language affects nonlinguistic processes. Putting the question in this way presupposes that linguistic and nonlinguistic processes are highly dissociated in normal human cognition, such that many tasks are accomplished without the involvement of language. A different approach to the Whorfian question would be to ask the extent to which linguistic processes are normally involved when people engage in all kinds of seemingly nonlinguistic tasks (e.g., simple perceptual discriminations that can be accomplished in the absence of language). Our results suggest that linguistic representations normally meddle in even surprisingly simple objective perceptual decisions.
To me this is another important paper that puts the Sapir-Whorf hypothesis at the forefront. I would love to hear from those who do not endorse the Sapir-Whorf hypothesis: what do they make of these results?
hat tip: Neuroanthropology blog.