Major conscious and unconscious processes in the brain

Today I plan to touch upon the topic of consciousness (a topic many bloggers shy away from) and, more broadly, try to delineate what I believe are the important conscious and unconscious processes in the brain. I will be leaning heavily on my evolutionary stages model for this.

To clarify at the very outset: I do not believe in a purely reactive nature of organisms. Apart from reacting to stimuli/the world, they also act on their own and are thus agents. To elaborate, I believe that neuronal groups and circuits may fire on their own and thus lead to behavior/action. I do not claim that this firing is under voluntary/volitional control (it may be random); the important point to note is that there is spontaneous motion.

  1. Sensory system: To start with, I propose that the first function/process the brain needs to develop is to sense its surroundings. This is to avoid predators/harm in general. This sensory function of the brain/sense organs may be unconscious and need not become conscious: as long as an animal can sense danger, even though it may not be aware of the danger, it can take appropriate action, a simple 'action' being changing its color to merge with the background.
  2. Motor system: The second function/process that the brain needs to develop is a system that enables motion/movement. This is primarily to explore the environment for food/nutrients. Prey is not going to walk into your mouth; you have to move around and locate it. Again, this movement need not be volitional/conscious: as long as the animal moves randomly and sporadically to explore new environments, it can 'see' new things and eat a few. This 'seeing' may be as simple as sensing the chemical gradient in a new environment.
  3. Learning system: The third function/process that the brain needs to develop is a system that enables learning. It is not enough to sense the environmental here-and-now; one needs to learn the contingencies of the world and remember them in both space and time. I am inclined to believe that this is primarily Pavlovian conditioning and associative learning, though I don't rule out operant learning. Again, this learning need not be conscious: one need not explicitly refer to a memory to utilize it; unconscious learning and memory of events can suffice and can drive interactions. I also believe the need for this function is primarily driven by the fact that one interacts with similar environments/conspecifics/predators/prey, and it helps to remember which environmental conditions/operant actions lead to what outcomes. This learning could be as simple as 'stimulus A predicts stimulus B' and/or 'action C predicts reward D'.
  4. Affective/action-tendencies system: The fourth function I propose the brain needs to develop is a system to control its motor system/behavior by bringing it more in sync with its internal state. This, I propose, is done by a group of neurons monitoring the activity of other neurons/visceral organs, thus becoming aware (in a non-conscious sense) of the global state of the organism and of the probability that a particular neuronal group will fire in future; by their outputs they may then be able to enable one group to fire while inhibiting other groups from firing. To clarify by way of example: some neuronal groups may be responsible for movement. Another neuronal group may receive inputs from these, as well as input from the gut saying that no movement has happened for a while and that the organism has also not eaten for a while and is thus in a 'hungry' state. This may prompt these neurons to send excitatory outputs to the movement-related neurons, biasing them towards firing and increasing the probability that motion will take place; by indulging in exploratory behavior, the organism may be able to satisfy its hunger. Of course, they will inhibit other neuronal groups from firing, and will themselves stop firing when appropriate motion takes place/prey is eaten. Again, none of this has to be conscious: the state of the organism (like hunger) can be discerned unconsciously, and the action-tendencies biasing foraging behavior can also be activated unconsciously; as long as the organism prefers certain behaviors over others depending on its internal state, everything works perfectly. I propose that (unconscious) affective (emotional) states and systems have emerged to fulfill exactly this need of being able to differentially activate different action-tendencies suited to the needs of the organism.
I will also stick my neck out and claim that the activation of a particular emotion/affective system biases our sensing too. If the organism is hungry, the food tastes better (is unconsciously more vivid) and vice versa. Thus affects are not only action-tendencies but also, to an extent, sensing-tendencies.
  5. Decisional/evaluative system: The last function (for now; remember I adhere to eight-stage theories, and we have just seen five brain processes in increasing hierarchy) that the brain needs is a system to decide/evaluate. Learning lets us predict our world as well as the consequences of our actions. Affective systems provide some control over our behavior and over our environment, but they are automatically activated by the state we are in. Something needs to bring these together, so that the competition between actions triggered by the state we are in (affective action-tendencies) and actions that may be beneficial given the learning associated with the current stimulus/state of the world is resolved satisfactorily. One has to balance the action-to-reaction ratio and the subjective versus objective interpretation/sensation of the environment. The decisional/evaluative system, I propose, does this by associating values with different external-event outcomes and different internal-state outcomes and by resolving the trade-off between the two. This again need not be conscious: given a stimulus predicting a predator in the vicinity, and an internal state of hunger, the organism may have attached more value to 'avoid being eaten' than to 'finding prey' and may thus not move but camouflage. On the other hand, if the organism's value system is such that it prefers a hero's death on the battlefield to starvation, it may move (in search of food). Again, this could exist in the simplest of unicellular organisms.
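The kind of associative learning sketched in (3) above ('stimulus A predicts stimulus B') can be caricatured with a Rescorla–Wagner-style update. This is only a toy sketch under my own assumptions; the function name, learning rate and trial sequence are all made up for illustration:

```python
# Toy Rescorla-Wagner-style update: learning that stimulus A predicts outcome B.
# The learning rate (alpha) and the trial sequence are illustrative assumptions.
def associative_strengths(trials, alpha=0.3):
    v = 0.0          # current associative strength of A -> B
    history = []
    for outcome in trials:          # 1.0 if B followed A on this trial, else 0.0
        v += alpha * (outcome - v)  # prediction error drives the update
        history.append(v)
    return history

strengths = associative_strengths([1.0] * 10)
# strength grows toward 1.0 as A reliably predicts B
```

Nothing here needs to be conscious: the strength is just a number nudged by each trial's prediction error, yet it suffices to drive behavior.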

Of course, all of these brain processes could (and in humans indeed do) have their conscious counterparts: Perception, Volition, episodic Memory, Feelings and Deliberation/thought. But that is a different story for a new blog post!

And of course one can also conceive of the above in pure reductionist form as the chain below:

sense –> recognize & learn –> evaluate options and decide –> emote and activate action tendencies –> execute and move.

One can also say that movement leads to new sensation, so that the above is not a chain but part of a cycle; all that is valid, but I would sincerely request my readers to consider the possibility of spontaneous and self-driven behavior as separate from reactive motor behavior.
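The cycle above, including the spontaneous (non-reactive) behavior I am asking readers to consider, can be sketched as a toy loop. Every name, probability and table entry here is an illustrative assumption of mine, not a claim about real circuitry:

```python
import random

# Toy sketch of the cycle: sense -> recognize & learn -> evaluate -> emote -> move.
LEARNED = {"predator_cue": "camouflage", "food_cue": "approach"}  # learned contingencies

def brain_cycle(stimulus, hungry, rng):
    if stimulus in LEARNED:          # sensing plus learned recognition
        return LEARNED[stimulus]     # evaluated, decided reaction
    # No external trigger: action tendencies are biased by the internal
    # (affective) state, so behavior can be spontaneous and self-driven.
    p_spontaneous = 0.8 if hungry else 0.1
    return "explore" if rng.random() < p_spontaneous else "rest"

rng = random.Random(0)
print(brain_cycle("predator_cue", hungry=False, rng=rng))  # camouflage
```

The point of the sketch is the second branch: even with no stimulus at all, the organism still sometimes moves.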

Low Latent Inhibition, high faith in intuition and psychosis/creativity

Well, the cluster goes together. Previous research has found that low LI, psychosis (schizophrenia) and creativity are related; previous research has also found that psychotic/some types of creative people have more faith in intuition; and this research ties things together by showing that low LI and high faith in intuition are correlated.

The research in question is by Kaufman, and in it he explores the dual-process theories of cognition: the popular slow high road of deliberate conscious reasoning and the fast low road of unconscious processing. I would rather have the high road consist of both cognitive and affective factors and, similarly, the unconscious low road consist of both cognitive and affective factors. Kaufman focuses on the unconscious low road, and his factor analysis reveals three factors: Faith in Intuition, a metacognition about one's tendency to use intuition; Holistic Intuition, the cognitive factor; and Affective Intuition, the affective factor. With this in mind, let us see what Kaufman's thesis is:

He first introduces the low road and the high road:

In recent years, dual-process theories of cognition have become increasingly popular in explaining cognitive, personality, and social processes (Evans & Frankish, 2009). Although individual differences in the controlled, deliberate, reflective processes that underlay System 2 are strongly related to psychometric intelligence (Spearman, 1904) and working memory (Conway, Jarrold, Kane, Miyake, & Towse, 2007), few research studies have investigated individual differences in the automatic, associative, nonconscious processes that underlay System 1. Creativity and intelligence researchers might benefit from taking into account dual-process theories of cognition in their models and research, especially when exploring individual differences in nonconscious cognitive processes.

Then he explains LI:

Here I present new data, using a measure of implicit processing called latent inhibition (LI; Lubow, Ingberg-Sachs, Zalstein-Orda, & Gewirtz, 1992). LI reflects the brain’s capacity to screen from current attentional focus stimuli previously tagged as irrelevant (Lubow, 1989). LI is often characterized as a preconscious gating mechanism that automatically inhibits stimuli that have been previously experienced as irrelevant from entering awareness, and those with increased LI show higher levels of this form of inhibition (Peterson, Smith, & Carson, 2002). Variation in LI has been documented across a variety of mammalian species and, at least in other animals, has a known biological basis (Lubow & Gerwirtz, 1995). LI is surely important in people’s everyday lives—if people had to consciously decide at all times what stimuli to ignore, they would quickly become overstimulated.
Indeed, prior research has documented an association between decreased LI and acute-phase schizophrenia (Baruch, Hemsley, & Gray, 1988a, 1988b; Lubow et al., 1992). It is known, however, that schizophrenia is also associated with low executive functioning (Barch, 2005). Recent research has suggested that in high-functioning individuals (in this case, Harvard students) with high IQs, decreased LI is associated with increased creative achievement (Carson et al., 2003). Therefore, decreased LI may make an individual more likely to perceive and make connections that others do not see and, in combination with high executive functioning, may lead to the highest levels of creative achievement. Indeed, the link between low LI and creativity is part of Eysenck’s (1995) model of creative potential, and Martindale (1999) has argued that a major contributor to creative thought is cognitive disinhibition.

He then relates this to intuition and presents his thesis:

A concept related to LI is intuition. Jung’s (1923/1971, p. 538) original conception of intuition is “perception via the unconscious.” Two of the most widely used measures of individual differences in the tendency to rely on an intuitive information-processing style are Epstein’s Rational- Experiential Inventory (REI; Pacini & Epstein, 1999) and the Myers-Briggs Type Indicator (MBTI) Intuition/Sensation subscale (Myers, McCaulley, Quenk, & Hammer, 1998). Both of these measures have demonstrated correlations with openness to experience (Keller, Bohner, & Erb, 2000; McCrae, 1994; Pacini & Epstein, 1999), a construct that has in turn shown associations with a reduced LI (Peterson & Carson, 2000; Peterson et al., 2002), as well as with divergent thinking (McCrae, 1987) and creative achievement.

The main hypothesis was that intuitive cognitive style is associated with decreased latent inhibition.

He found support for the hypothesis in his data: people with low LI were high on the Faith in Intuition factor. Here is what he discusses:

The results of the current study suggest that faith in intuition, as assessed by the REI and the MBTI Thinking/Feeling subscale, is associated with decreased LI. Furthermore, a factor consisting of abstract, conceptual, holistic thought is not related to LI. Consistent with Pretz and Totz (2007), exploratory factor analysis revealed a distinction between a factor consisting of REI Experiential and MBTI Thinking/Feeling and a factor consisting of MBTI Intuition/Sensation and REI Rational Favorability. This further supports Epstein’s (1994) theory that the experiential system is directly tied to affect. The finding that MBTI Intuition/Sensation and REI Rational Favorability loaded on the same factor supports the idea that the type of intuition that is being measured by these tasks is affect neutral and more related to abstract, conceptual, holistic thought than to the gut feelings that are part of the Faith in Intuition factor.

Here are the broader implications:

The current study adds to a growing literature on the potential benefits of a decreased LI for creative cognition. Hopefully, with further research on the biological basis of LI, as well as its associated behaviors, including interactions with IQ and working memory, we can develop a more nuanced understanding of creative cognition. There is already promising theoretical progress in this direction.

Peterson et al. (2002) and Peterson and Carson (2000) found a significant relationship between low LI and three personality measures relating to an approach-oriented response and sensation-seeking behavior: openness to experience, psychoticism, and extraversion. Peterson et al. found that a combined measure of openness and extraversion (which was referred to as plasticity) provided a more differentiated prediction of decreased LI.

Peterson et al. (2002) argued that individual differences in a tendency toward exploratory behavior and cognition may be related to the activity of the mesolimbic dopamine system and predispose an individual to perceive even preexposed stimuli as interesting and novel, resulting in low LI. Moreover, under stressful or novel conditions, the dopamine system in these individuals will become more activated and the individual will instigate exploratory behavior. Under such conditions, decreased LI could help the individual by allowing him or her more options for reconsideration and thereby more ways to resolve the incongruity. It could also be disadvantageous in that the stressed individual risks becoming overwhelmed with possibilities. Research has shown that the combination of high IQ and reduced LI predicts creative achievement (Carson et al., 2003). Therefore, the individual predisposed to schizophrenia may suffer from an influx of experiential sensations and possess insufficient executive functioning to cope with the influx, whereas the healthy individual low in LI and open to experience (particularly an openness and faith in his or her gut feelings) may be better able to use the information effectively while not becoming overwhelmed or stressed out by the incongruity of the situation. Clearly, further research will need to investigate these ideas, but an understanding of the biological basis of individual differences in different forms of implicit processing and their relationship to openness to experience and intuition will surely increase our understanding of how certain individuals attain the highest levels of creative accomplishment.

To me this is exciting: the triad of creative/psychotic cognitive style, intuition and latent inhibition seems to gel together. The only gripe I have is that the author could also have measured intuition directly, by using some insight problems requiring 'aha' solutions; maybe that is a project for the future!
Kaufman, S. (2009). Faith in intuition is associated with decreased latent inhibition in a sample of high-achieving adolescents. Psychology of Aesthetics, Creativity, and the Arts, 3(1), 28–34. DOI: 10.1037/a0014822

Exploration/Exploitation == Maximizers/Satisficers?

There is interesting research coverage at the We are Only Human blog regarding whether people may have two different cognitive styles: one based on exploration of novel ideas and the other based on exploitation of, or focus on, a particular familiar idea. The study employs evolutionary concepts and theorizes that these different cognitive styles may be a reflection of the different foraging styles that might have been selected for and relevant in the EEA (environment of evolutionary adaptedness).

Specifically, while foraging for food in a habitat where the food supply and resources are unpredictable, one is faced with a choice on discovering a food source: whether to exploit it (a jungle area having sparse edible leaves) or to move on in search of a potentially better source (a jungle area having abundant edible and nutritious fruits). Both strategies, exploring and exploiting, can be advantageous and may have been selected for. It is also possible that humans can use either strategy based on the environment (food-source distribution), but may be inclined towards one or the other. The authors of the study surmised that both strategies have been selected for and that we have the potential to use either. Moreover, the same foraging strategy we use, or are primed with, would also be visible in the cognitive strategy we use.

They used an ingenious technique to prime the subjects with either of the foraging strategies (go read the excellent We are Only Human blog post) and found that humans were flexible in using the appropriate strategy, given the appropriate context, and that the foraging strategy primed the corresponding cognitive strategy. To boot, those primed with an exploratory foraging strategy were more prone to using exploratory cognitive strategies when confronted with a cognitive task, and vice versa. They also found systematic differences between individuals' cognitive and foraging styles: some were more exploratory than others.

This reminds me of the Maximizer/Satisficer distinction in decision-making style that Barry Schwartz has introduced and brought to public attention. Basically, a Maximizer, when faced with a decision and choice, goes on computing the utility of the different choices and tries to choose the option that maximizes his utility and is the 'best'. A Satisficer, on the other hand, also explores options, but stops his exploration when he finds an option that is 'good enough'. I wonder if, just like the exploratory/exploitative cognitive and foraging styles, this is just another dimension of the same underlying phenomenon: whether to explore more, or to exploit what is available. To take an example: for marriage, a satisficing strategy may work best; as told in “The Little Prince”, one should stop searching for more flowers if one has already had the fortune of possessing a flower.

“People where you live,” the little prince said, “grow five thousand roses in one garden… yet they don’t find what they’re looking for…”

“They don’t find it,” I answered.

“And yet what they’re looking for could be found in a single rose, or a little water…”
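The two decision styles contrast neatly in code: a Maximizer is an exhaustive argmax, a Satisficer is an early-exit search against a threshold. The option list and the 'good enough' bar below are made-up illustrations, not anything from Schwartz's work:

```python
# Maximizer: evaluate every option and pick the best.
# Satisficer: stop at the first option that clears a "good enough" bar.
def maximize(options):
    return max(options, key=lambda o: o["utility"])

def satisfice(options, good_enough):
    for o in options:                  # search can stop early
        if o["utility"] >= good_enough:
            return o
    return options[-1]                 # settle if nothing clears the bar

roses = [{"name": "rose0", "utility": 3}, {"name": "rose1", "utility": 7},
         {"name": "rose2", "utility": 5}, {"name": "rose3", "utility": 9}]
print(maximize(roses)["name"])                  # rose3 (highest utility)
print(satisfice(roses, good_enough=6)["name"])  # rose1 (first 'good enough' one)
```

Note that the Satisficer never even looks at rose3; that unexamined remainder is exactly the exploration it trades away.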

An interesting experiment would be to see if the foraging style, the cognitive style and the decision style are all correlated within individuals, and if priming one style can influence the others.

If so, could there be an underlying neural phenomenon, common to all?

Wray, the author of the We are Only Human blog, makes a bold conjecture and relates this to findings about dopamine levels:

Exploratory and inattentive foraging—actual or abstract—appears linked to decreases in the brain chemical dopamine.

He even relates this to cognitive disorders like Autism and ADHD.

By analogy, in conditions where baseline dopamine is higher, as in bipolar disorder and psychosis, one may be more inclined to a satisficing/'I'm feeling lucky' strategy in which the very first option is acceptable. This may explain the 'jumping-to-conclusions' bias in schizophrenia/psychosis.

To make things more explicit: though the leading dopamine theory in vogue now is 'reward prediction error', a competing, and to me more reasonable, view of dopamine function is incentive salience, i.e. what 'value'/importance the stimulus has for the person in question. The importance can be both positive and negative, and thus we have found that dopamine is involved in both dread and desire. The dominant reward-prediction theory faces many challenges, not the least of which is the response of dopamine neurons to novel events. A dopamine burst is also associated with 'novel' events, and thus dopamine is somehow involved in/triggered by novelty. Baseline dopamine may constrain the dopamine surge felt on a novel event. Thus, in schizophrenia/psychosis, with baseline dopamine high, a dopamine burst on novelty detection may be high enough for the event to feel meaningful, and may not lead to more exploratory behavior. In disorders where baseline dopamine is low, one may require a more profound dopamine burst before a stimulus becomes meaningful, and thus may go on seeking novel stimuli till one finds one 'big enough to trigger salience'.
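My argument here can be restated as a simple threshold model. To be clear, every number below is a purely illustrative assumption of mine, not a value from the literature:

```python
# Toy threshold model: an event becomes 'salient' (and search stops) when
# baseline dopamine plus the novelty burst crosses a fixed salience threshold.
SALIENCE_THRESHOLD = 1.0

def events_sampled_before_salience(baseline, bursts):
    for n, burst in enumerate(bursts, start=1):
        if baseline + burst >= SALIENCE_THRESHOLD:
            return n              # event felt meaningful: exploration stops
    return len(bursts)            # nothing ever felt salient

bursts = [0.3, 0.4, 0.5, 0.9]
print(events_sampled_before_salience(baseline=0.7, bursts=bursts))  # 1: 'jumps to conclusions'
print(events_sampled_before_salience(baseline=0.1, bursts=bursts))  # 4: keeps exploring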

We may extend the salience argument to domains other than incentive. If the chief function of dopamine is to mark salience, then it may also be instrumental in memory and attention: only what is salient gets attention, and only what is salient gets into working memory. Thus, a high dopamine level may predispose one to treating almost everything as salient, leading to delusions of reference (everything is meaningfully related to the self, etc.). Working memory may be taxed by everything trying to get in, hence the poor WM in people with schizophrenia. Also, every trivial thing may grab attention, leading to poor sensory gating and conditions like lack of pre-pulse inhibition. On the flip side, while making sense of one's experience, one may accept the first possible explanation and not search further, thus leading to the persistence of delusions.

An opposite scenario would be one in which one keeps exploring the environment and nothing seems novel, due to low dopamine levels. This would be the classical autistic repetitive and stereotyped behaviors. There would be sensory overstimulation, as nothing is salient and one needs to explore more and more. On the other hand, WM capabilities may be good/savant-like, as not every piece of information grabs attention. Everything would seem insignificant, and the only way to arrive at a decision/choose an action would be via exhaustive enumeration and logical evaluation of all options; even after an obvious explanation for a phenomenon is available, one may keep looking for a better one. No wonder, as per my theory, more scientists would be autistic.

Perhaps I am stretching things too far, but to me the dopamine connection to salience/meaning/importance is worth exploring, and I will write more about it in future. For now, let us be willing to associate salience not just with stimuli related to motivation, but also with stimuli relevant to sensation, perception, learning and memory. If so, the common underlying mechanism responsible for differentiating us as exploratory or exploitative foragers (of food) may also be related to our different cognitive styles, our different decision-making styles and our different baseline dopamine levels.

Dopamine, though, is most strongly related to food and sex. I could even stretch this argument and say this may be related to r and K reproductive styles (note that these styles are species-specific, but I believe individuals within a species may also vary along this dimension). Thus, while explorers may have an r-type reproductive style, the exploiters may have a K-type reproductive style.

At one extreme are r-strategies, emphasizing gamete production, mating behavior, and high reproductive rates, and at the other extreme are K-strategies, emphasizing high levels of parental care, resource acquisition, kin provisioning, and social complexity.

If the K-strategy is what humans have chosen, maybe exploitation in all areas (cognitive, decision-making, foraging) is more relevant and in tune with our nature. Maybe that's why I'll always be on the side of psychosis rather than autism!! Though, to put things in perspective, maybe humans have evolved to use both strategies as the situation demands, and the best thing would be to use the strategy situation-specifically and not lean towards either extreme.

Developmental Stages: New Age concurs

I recently came across a series of articles by Bill Harris, director of the Centerpointe Institute, regarding cognitive development. Bill is a New Age guru, but his articles are relatively well-informed regarding Piaget's developmental stages; moreover, he shares my enthusiasm for developmental stages and believes in extending these stages beyond Piaget's four. The series is still incomplete, and I link to the first two posts in the series.

I liked his linking of these stages with Jean Gebser's structures of consciousness and the consequent archaic, magical, mythical, mental and integral stages. I also liked his emphasis on perspective-taking as an integral part of the developmental process, which I have covered in detail here. However, he doesn't differentiate between the stage at which one starts understanding that others have a different viewpoint/perspective (the social-informational perspective) and the stage at which one starts adopting the viewpoint of another (the self-reflective perspective). See my earlier post for more on these perspective stages as outlined by Robert Selman.

What I didn't like, though, and found many issues with, were the various pathologies he associated with failures of developmental tasks at each stage. These he seemed to pull out of his hat, with neither empirical support nor strong theoretical foundations. Nevertheless, the series of articles may serve as a good refresher on Piaget's theories of cognitive development for readers of this blog.

Some excerpts:

Cognitive development refers to our ability to perform various types of operations on what we encounter in the world and in our awareness. To live in the world, accomplish various things, and deal with the challenge of being human, we first learn to ”work with” (deal with, manage, get things done with) our body, then with objects, then with symbols, concepts, and ideas, and–if development continues to the highest transpersonal or transrational levels of development–we eventually add ways of dealing with life that are beyond the realm of ideas.

Sensorimotor, Piaget’s first stage (the stage before preoperational), is sometimes referred to as archaic in other naming conventions (in this case, in that of Jean Gebser).

Piaget divided cognitive development into four broad stages: 1) sensorimotor (0-2 years), 2) preoperational, or “preop” (2-7 years), 3) concrete operational, or “conop” (7-11 years), and 4) formal operational, or “formop” (11 years onward). Each of these can be divided into several substages. The ages are averages, and since a person could stop and remain at any level, you can find many adults at each level (though not many are found at the sensorimotor stage).

In this discussion I’ll also use some of the stage names used by Jean Gebser and Ken Wilber: archaic (similar to sensorimotor), magic (similar to early preoperational), magic-mythic (late preoperational), mythic (early concrete operational), mythic-rational (late concrete operational), and rational (formal operational). This is just to confuse you, of course.

In the sensorimotor stage, the infant uses senses and motor abilities to understand the world, beginning at first with reflexes and eventually using complex combinations of sensorimotor skills. At the beginning of this stage, the infant cannot yet distinguish itself from its environment (what some have called an experience of oceanic oneness). This has also been called a state of “primary narcissism,” because the infant is embedded in or undifferentiated from the environment.

This should be enough, I suggest, to whet your appetite; go to the original source to get additional servings.

Simulating the future and remembering the past: Are we prediction machines?

This post is about an article by Schacter et al. (pdf) regarding how the constructive nature of memories may crucially be due to the need to simulate future scenarios. But before I get to the main course, I would like to touch upon a starter: Jeff Hawkins' Hierarchical Temporal Memory (HTM) hypothesis. I recommend that you watch this excellent video.

As per Jeff Hawkins, we humans are basically prediction machines, constantly predicting external causes and our responses to them. Traditionally, the behaviorist account has been that we are nothing but a bundle of associations: either conditioned Pavlovian associations between stimuli, or Skinnerian associations between our operant actions and environmental rewards. Thus every behavior we indulge in is guided by our memory of past associations and the impending stimulus. Jeff Hawkins refines this by postulating that we are not passive responders to environmental stimuli, but actively predict what future causes (stimuli) to expect and what our response to those stimuli may be. Thus, in his HTM model, the memory of past events not only exerts influence via a bottom-up process of responding to an impending stimulus; it is also used for a top-down expectation or prediction of incoming stimuli and our responses to them. We are thus prediction machines, constantly using our memory to predict future outcomes and our possible responses.
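The idea of memory-driven, top-down prediction can be caricatured with a first-order sequence predictor. This is a toy of my own devising, emphatically not HTM itself; the example sequence is made up:

```python
from collections import Counter, defaultdict

# Toy memory-as-prediction: store transition counts from past experience and
# use them as a top-down expectation about the next stimulus.
def learn_transitions(sequence):
    counts = defaultdict(Counter)
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, current):
    if current not in counts:
        return None               # no stored association: no expectation
    return counts[current].most_common(1)[0][0]

model = learn_transitions(list("abcabcabd"))
print(predict_next(model, "a"))   # 'b': expectation driven by stored associations
```

The same stored counts serve both roles the post describes: bottom-up recall of what followed a stimulus, and top-down anticipation of what will follow it next.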

Now let's get back to the original Schacter article. Here is the abstract:

Episodic memory is widely conceived as a fundamentally constructive, rather than reproductive, process that is prone to various kinds of errors and illusions. With a view toward examining the functions served by a constructive episodic memory system, we consider recent neuropsychological and neuroimaging studies indicating that some types of memory distortions reflect the operation of adaptive processes. An important function of a constructive episodic memory is to allow individuals to simulate or imagine future episodes, happenings, and scenarios. Because the future is not an exact repetition of the past, simulation of future episodes requires a system that can draw on the past in a manner that flexibly extracts and re-combines elements of previous experiences. Consistent with this constructive episodic simulation hypothesis, we consider cognitive, neuropsychological, and neuroimaging evidence showing that there is considerable overlap in the psychological and neural processes involved in remembering the past and imagining the future.

As per the paper, the same brain areas and mechanisms are involved both in remembering a past event and in imagining a future one, and the regions involved include the hippocampus. These findings in themselves are not so fascinating, but the argument Schacter et al. give as to why the same regions are involved in both memory retrieval and future imaginings, and how this leads to confabulations and false recognitions, is very fascinating. As per them, because we need to simulate future events, and because future events are never an exact replica of past events, we do not store past events verbatim; we store a gist of the event, so that we can recombine the nebulous gist to create different possible future scenarios. Because of this (the need to simulate future events), memory is not perfect, and even normal individuals may confabulate (attribute the source of their memory erroneously) or make false recognitions on memory tests like the DRM.

First, a bit of background on the DRM paradigm. In this test, a list of related words is presented to a subject: e.g. yawn, bed, night, pillow, dream, rest, etc. All of these relate to the theme of sleep. Later, in a recognition test, when the thematically related lure word sleep (which was never actually presented) is shown to normal subjects, they most often say that they had encountered it earlier. However, given an unrelated word like hunger, most are liable to recognize that the word was not encountered previously. What Schacter et al. found was that in subjects who had damage to the hippocampus/other memory areas and were amnesic, this effect of falsely recognizing the gist word was reduced. In other words, those with brain damage to memory areas were less likely to say that they had encountered the related word sleep during the original trial, and this despite their poorer performance, compared to controls, in remembering the old list items. This clearly indicates that remembering the gist, vis-à-vis the details, is a very important memory mechanism.
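To make the logic of the DRM result concrete, here is a toy sketch of my own (not Schacter et al.'s model): memory stores a gist (the dominant theme of the study list) alongside verbatim traces, and recognition endorses anything matching the gist. The word-to-theme mapping and the `gist_strength` knob are purely illustrative assumptions standing in for semantic similarity and for hippocampal damage, respectively.

```python
# Toy illustration of gist-based false recognition in the DRM paradigm.
# THEME is a hypothetical word->theme mapping, a stand-in for semantic relatedness.
THEME = {
    "yawn": "sleep", "bed": "sleep", "night": "sleep",
    "pillow": "sleep", "dream": "sleep", "rest": "sleep",
    "sleep": "sleep",   # the never-presented lure shares the list's theme
    "hunger": "food",   # an unrelated probe
}

def study(words, gist_strength=1.0):
    """Store a 'memory': verbatim traces plus the gist (dominant theme).
    A low gist_strength crudely mimics damage to gist-supporting memory areas."""
    themes = [THEME[w] for w in words]
    gist = max(set(themes), key=themes.count)
    return {"gist": gist if gist_strength >= 0.5 else None,
            "verbatim": set(words)}

def recognize(memory, probe):
    """Endorse a probe as 'old' if stored verbatim OR if it matches the gist."""
    return probe in memory["verbatim"] or THEME.get(probe) == memory["gist"]

studied = ["yawn", "bed", "night", "pillow", "dream", "rest"]
normal = study(studied)                      # intact gist extraction
amnesic = study(studied, gist_strength=0.2)  # degraded gist storage

print(recognize(normal, "sleep"))   # True  -> false recognition of the lure
print(recognize(normal, "hunger"))  # False -> unrelated word correctly rejected
print(recognize(amnesic, "sleep"))  # False -> reduced false recognition
```

The sketch reproduces the qualitative pattern described above: an intact gist mechanism falsely endorses the related lure while rejecting unrelated words, and degrading the gist reduces exactly that false recognition.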

I believe that we should also take the prototype-versus-exemplar differences in categorization between males and females into account here. I would be very interested to know whether the data collected showed the expected differences between males and females, and hopefully the results are not confounded by this gender difference not having been taken into account.

Anyway, returning to the experimental methodology, another sticking point seems to be the extension of results obtained with semantic memory (like that for word lists) to episodic memory.

Keeping that aside, the gist and false-recognition results clearly indicate that the constructive nature of memory is an adaptation (it is present in normal subjects) and is disrupted in amnesics/people with dementia.

Thus, now that it is established that memory is reconstructive and that this reconstruction is adaptive, the question arises why it is reconstructive and not reproductive. To this Schacter answers that it is because the same brain mechanisms used for reconstructing memory from gist are also used for imagining or simulating future scenarios. They present ample neuropsychological, neuroimaging and cognitive evidence for this, and I find it totally convincing.

The foregoing research not only provides insights into the constructive nature of episodic memory, but also provides some clues regarding the functional basis of constructive memory processes. Although memory errors such as false recognition may at first seem highly dysfunctional, especially given the havoc that memory distortions can wreak in real-world contexts (Loftus 1993; Schacter 2001), we have seen that they sometimes reflect the ability of a normally functioning memory system to store and retrieve general similarity or gist information, and that false recognition errors often recruit some of the same processes that support accurate memory decisions. Indeed, several researchers have argued that the memory errors involving forgetting or distortion serve an adaptive role.

However, future events are rarely, if ever, exact replicas of past events. Thus, a memory system that simply stored rote records of what happened in the past would not be well-suited to simulating future events, which will likely share some similarities with past events while differing in other respects. We think that a system built along the lines of the constructive principles that we and others have attributed to episodic memory is better suited to the job of simulating future happenings. Such a system can draw on elements of the past and retain the general sense or gist of what has happened. Critically, it can flexibly extract, recombine, and reassemble these elements in a way that allows us to simulate, imagine, or ‘pre-experience’ (Atance & O’Neill 2001) events that have never occurred previously in the exact form in which we imagine them. We will refer to this idea as the constructive episodic simulation hypothesis: the constructive nature of episodic memory is attributable, at least in part, to the role of the episodic system in allowing us to mentally simulate our personal futures.
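The "flexibly extract, recombine, and reassemble" idea in the quote above can be sketched in a few lines. This is my own toy illustration, not the authors' model: episodes are reduced to gist elements (who/where/what, my assumed slots), the verbatim bindings are discarded, and every recombination of stored elements becomes a candidate future scenario, most of which were never actually experienced.

```python
# Toy sketch of gist extraction and recombination for future simulation.
import itertools

# Two hypothetical past episodes, each a binding of who/where/what elements.
past_episodes = [
    {"who": "Alice", "where": "cafe", "what": "discussed a paper"},
    {"who": "Bob",   "where": "lab",  "what": "ran an experiment"},
]

# Extract gist: per-slot pools of elements, discarding the original bindings.
slots = ("who", "where", "what")
gist = {slot: {ep[slot] for ep in past_episodes} for slot in slots}

# Simulate: every recombination of stored elements is a candidate future scenario.
futures = [dict(zip(slots, combo))
           for combo in itertools.product(gist["who"], gist["where"], gist["what"])]

# Most candidates never occurred in exactly that form.
novel = [f for f in futures if f not in past_episodes]
print(len(futures), len(novel))  # 8 candidate scenarios, 6 never experienced
```

The design choice mirrors the hypothesis: storing pools of elements rather than verbatim episodes trades reproductive accuracy for generative flexibility, which is exactly why such a system can confabulate.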

I’d finally like to end with the conclusions the authors drew:

In a thoughtful review that elucidates the relation between, and neural basis of, remembering the past and thinking about the future, Buckner and Carroll (2007) point out that neural regions that show common activation for past and future tasks closely resemble those that are activated during “theory of mind” tasks, where individuals simulate the mental states of other people (e.g., Saxe & Kanwisher 2003). Buckner and Carroll note that such findings suggest that the commonly activated regions may be specialized for, and engaged by, mental acts that require “the projection of oneself in another time, place, or perspective”, resembling what Tulving (1985) referred to as autonoetic consciousness.

This seems to be a very promising direction. The ‘another time and place’ can normally be simulated within the hippocampus, which also specializes in cognitive maps. We may use these cognitive maps not only to remember past events, but also to simulate new events. In this respect the importance of dreams may be paramount. Dreams (and sleep) may be a mechanism whose primary purpose is not memory consolidation; rather, I suspect that the primary function of dreams is to work on the gist of the memories from the previous day, simulate possible future scenarios, and then keep in store those memories that would help and are likely to be encountered in future. Thus, while dreaming we are basically predicting future scenarios and sorting information as per its future relevance. Not a particularly path-breaking hypothesis, but I’m not aware of any thinking in this direction. Do let me know of any other similar hypotheses regarding the function of dreams as predictors and not merely as consolidators.
