The Birth of Cognitive Psychology:
William James distinguished between primary and secondary memories, a distinction that would later be made in the Atkinson-Shiffrin model between STM and LTM.
Köhler (1925) studied insight: how all the parts of a complex problem fit together (the "Gestalt") (p. 7).
Fechner studied scaling (the relationships among numbers generated in psychological estimates of apparent brightness, weights, etc.) and was the father of psychophysics, the study of the relation between apparent stimulus intensity and physical intensity.
Donders used the subtractive method to estimate the duration of mental events.
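The subtractive logic can be shown as simple arithmetic; the millisecond values below are hypothetical, chosen only to illustrate the method:

```python
# Donders' subtractive method: estimate the duration of one mental stage
# by subtracting the reaction time of a task lacking that stage from the
# reaction time of a task that contains it. Values are hypothetical.

simple_rt = 220   # ms: detect a stimulus and respond (no discrimination stage)
choice_rt = 285   # ms: discriminate the stimulus first, then respond

# The difference estimates the duration of the discrimination stage alone.
discrimination_time = choice_rt - simple_rt
print(discrimination_time)  # 65 ms
```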
Behaviorism Gets In The Way: 1924
Watson's book "Behaviorism", published in 1924, urged psychologists to limit their studies to what they could observe.
Noam Chomsky argued that behaviorism ignored innate, biological components of language learning.
The Information Processing Approach to Understanding The Mind: 1958
(popularized with the advent of computers)
input → sensory store → filter → pattern recognition → selection → STM → LTM
Information processing models commonly break a complex task into stages that can be analyzed separately
Broadbent's (1958) filter model placed the filter between sensation and perception. Broadbent's theory originated from research in which he found that if two different sets of characters were presented as a list to each ear separately, recall was best when subjects reported all the items from one ear and then all the items from the other.
In Sperling's partial report task, subjects were cued by a tone signal to report all the letters in the cued row. Sperling estimated the number of letters "available" in a visual information store (iconic memory) by multiplying the number of letters reported in any row by the number of rows. Sperling's partial report technique showed that the visual information store (iconic memory) persisted for less than a second.
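Sperling's estimate amounts to one multiplication; the display size and per-row average below are close to his 3 × 4 letter arrays but are used here only for illustration:

```python
# Sperling's partial-report logic: because the cued row is chosen at random
# after the display disappears, performance on it estimates performance on
# every row, so letters available = (letters reported per cued row) x rows.

rows = 3                      # a 3 x 4 display: 3 rows of 4 letters
reported_per_cued_row = 3.0   # roughly what Sperling's subjects averaged

available_in_icon = reported_per_cued_row * rows
print(available_in_icon)  # 9.0 letters estimated available in iconic memory
```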
Neuroscience—Cognitive neuroscience is the study of the relation between cognitive processes and brain activity.
Event-related potentials (ERPs) measure brain activity in response to some stimulus, whereas positron emission tomography (PET) measures brain activity while performing some task. ERPs yield good temporal resolution but poor spatial resolution; PET yields good spatial resolution but poor temporal resolution.
Information in the sensory store is lost forever unless it is processed by pattern recognition.
Pattern recognition includes substages such as encoding, comparison, and decision.
Theories of Pattern Recognition
Template matching theory, one of the major extreme theories, was initially rejected because of problems with stimulus variability.
Feature analysis theory was initially supported by Hubel and Wiesel's pioneering research measuring the activity of single cells in the visual cortex. Eleanor Gibson's theory of perceptual learning held that children first learn to discriminate between distinguishing and nondistinguishing features. A major extreme theory.
Structural theories differ from simpler theories in that they emphasize relations among features. The term "geons" was used by Biederman (1987) to describe simple 3-D components of objects. Biederman also showed that deleting vertices produced more segmentation than deleting midsegments from line drawings.
The Stroop effect is used to support the argument that reading words is an automatic skill in most adults, because naming the ink colors is slower when the words are conflicting color names.
Reicher (1969) discovered that a letter was actually perceived better if it was presented in a word rather than among some unrelated letters. In his experiment, a test display was followed by a masking field and two letter alternatives; subjects were best at picking which of the two letters had been in the previous display when the display had been a common word.
The interactive activation model of McClelland and Rumelhart (1981) assumes that letters in a string are processed with bottom-up sensory input combined with top-down activation.
An automatic process is one that can be performed without attention.
"Attention" is used to refer to two ideas in cognitive psychology: selection and capacity.
Hasher and Zacks argued that automatic and effortful tasks differ in developmental trends: automatic tasks should show little change with age.
A secondary task, such as responding as rapidly as possible to an audible tone, is sometimes used to measure the attentional demands of a primary task.
A sudden movement, mention of our own name, or any novel event might attract our attention automatically.
Capacity theories of attention are based on the assumption that tasks require mental effort
Healy (1980) found that when subjects searched for a target letter like "f", they were most likely to miss it if it occurred in a common word like "of".
Late selection theories of attention, such as those of Norman and of Deutsch and Deutsch, placed the filter between STM and LTM.
One of the main purposes of attention is to select what to enter into STM.
Treisman's attenuation model was based on data that indicated that unattended information could sometimes get through.
Consciousness is most closely associated with STM.
The Atkinson-Shiffrin model of memory made a fundamental distinction between memory stores and memory processes.
The predominant form of coding information in the STM is acoustic. Hardyck and Petrinovich (1970) used a biofeedback system to encourage subjects to stop subvocalizing when they read silently. They found that suppression of subvocalization decreased comprehension for difficult materials only.
Baddeley's theory of STM, which coined the term working memory, had three components: the phonological loop, the visuospatial sketchpad, and the central executive.
The two main roles of rehearsal in STM are maintaining information in STM and transferring it to LTM.
Information can be maintained in STM through rehearsal.
Information in STM is lost rapidly unless it is rehearsed.
Rundus (1971) found that the number of rehearsals was predictive of the probability of recalling a word from memory.
Peterson and Peterson (1959) studied forgetting in STM by having subjects try to remember three consonants after a brief period of counting backwards.
"Release from proactive interference" refers to memory gain from changing the types of items to be remembered.
Keppel and Underwood (1962) found that people did consistently worse over the first few trials in the Brown-Peterson short-term memory task, presumably due to a build-up of proactive interference. Wickens, Born, and Allen (1963) showed that performance got worse over the first few trials in the Brown-Peterson short-term memory task but that performance could show a sudden improvement if the category of the memory items was changed.
Waugh and Norman found that forgetting is better explained by the number of intervening items that occur after a to-be-remembered item than by the amount of elapsed time. This result is consistent with interference theory.
George Miller (1956) famously argued that the human memory span is about 7 ± 2 chunks.
In a type of memory span task an attempt is made to measure the number of digits that can be recalled immediately after hearing them.
De Groot (1966) showed that chess experts differed from novices in their ability to remember chess positions from actual game situations (p. 90).
STM capacity can be divided between its two main uses: storage and processing.
Sternberg did research for the telephone company. In 1966 he argued that the search through a list of items in STM is serial and exhaustive. When he varied the number of items held in STM and had subjects search them for the presence or absence of a test digit, he found that response times increased linearly with the number of memory items. Sternberg (1967) used a visual mask to degrade the quality of the test digit. If degradation slowed down the search for a match in STM, then the slope of the response time by set size function should increase.
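Serial exhaustive search amounts to a straight line relating response time to memory set size; the intercept and slope below are rough approximations of the classic results, used only for illustration:

```python
# Sternberg's model: RT = intercept + slope * set_size.
# "Exhaustive" means every item in STM is compared before responding,
# so target-present and target-absent trials share the same slope.

INTERCEPT_MS = 400.0  # approx.: encoding, decision, and response stages
SLOPE_MS = 38.0       # approx.: one STM comparison per memory item

def predicted_rt(set_size: int) -> float:
    """Predicted mean response time (ms) for a memory set of `set_size` items."""
    return INTERCEPT_MS + SLOPE_MS * set_size

for n in (1, 3, 6):
    print(n, predicted_rt(n))  # RT grows linearly: 438.0, 514.0, 628.0
```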
Cavanaugh (1972) found a trade-off relationship between the STM comparison time (slope of the Sternberg function) and memory span for a variety of different items.
The more or less permanent and almost infinitely large memory store is called LTM.
Learning can most clearly be identified with a change in LTM and can be equated with the process of storage.
Atkinson (1972) showed that if a computer program selected items for study according to a model that optimized learning, performance was worse during initial learning (relying on STM), but better for the final test (relying on LTM).
Neural network models of behavior have the ability to learn.
Serial Position Effects
The serial position effect in free recall refers to recall probability across list item input positions. Better recall for the last few items in a list is called the recency effect.
The primacy effect in free recall is probably due to the fact that the first few items get more rehearsals than the other words.
The recency effect in free recall is probably due to the fact that the last few items are more likely to be in STM at the time of retrieval.
Thus, when a list is followed by a few seconds of mental arithmetic before recall, the recency effect is most affected. When the rate of presentation of words in a list is speeded up, the primacy and middle effects are most affected.
Indirect Tests of Memory
Hyde and Jenkins (1969) studied incidental learning, in which most subjects were surprised by a final memory test. They found that incidental learning could be just as good as intentional learning if subjects attended to the meaning of the words.
According to Tulving (1972), episodic memory includes knowledge of personal experience and autobiographical information.
The Levels of Processing Theory
According to levels-of-processing theory (Craik & Lockhart, 1972) the success in recalling anything depends mainly on the operations performed during encoding.
Distinctiveness of Memory Codes
Orthographic knowledge describes our knowledge of spelling rules.
The principle of encoding specificity states that recall is best if the recall situation reproduces the cues present during learning.
Signal Detection Theory
In signal detection theory, moving the decision criterion down (to the left) increases the probability of which kind of error? A false alarm.
In signal detection theory, moving the decision criterion up (to the right) increases the probability of which kind of error? A miss.
In signal detection theory, there are two kinds of errors; one is to report negatively when there is a signal present. This is called a miss.
In signal detection theory, what refers to subject sensitivity, or the discriminability of targets (old items) from distracters (new items)? d-prime.
In signal detection theory, what refers to the decision criterion adopted by the subject in order to discriminate targets (old items) from distracters (new ones)? Beta.
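The standard computations of d-prime and beta from hit and false-alarm rates can be sketched as follows (the function names are my own, and the 0.80 / 0.20 rates are illustrative):

```python
from statistics import NormalDist
from math import exp, pi, sqrt

def _density(z: float) -> float:
    """Height of the standard normal distribution at z."""
    return exp(-z * z / 2) / sqrt(2 * pi)

def d_prime_and_beta(hit_rate: float, fa_rate: float):
    """d' = z(hit rate) - z(false-alarm rate); beta = ratio of the signal
    and noise distribution heights at the subject's criterion."""
    z = NormalDist().inv_cdf
    z_hit, z_fa = z(hit_rate), z(fa_rate)
    d_prime = z_hit - z_fa
    beta = _density(z_hit) / _density(z_fa)
    return d_prime, beta

# Symmetric performance (80% hits, 20% false alarms) reflects a neutral
# criterion: beta = 1, with all discriminability captured by d'.
d, b = d_prime_and_beta(0.80, 0.20)
print(round(d, 2), round(b, 2))  # 1.68 1.0
```

A liberal criterion (more hits but also more false alarms) drives beta below 1; a conservative criterion drives it above 1, while d-prime stays fixed for the same subject sensitivity.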