137 Cards in this Set

What is cognition? What is cognitive psychology?
Cognition: (umbrella term for all higher mental processes) the collection of mental processes and activities used in perceiving, remembering, thinking, and understanding, as well as the act of using those processes. Special emphasis on mental activities that most people engage in every day.
Cognitive Psychology: refers to all processes by which the sensory input is transformed, reduced, elaborated, stored, recovered, and used.
What is Memory?
Memory: the mental process of acquiring and retaining information for later retrieval and the mental storage system that enables these processes.
What is the distinction between structure and process?
Structure: the knowledge you possess; the information in your memory.
Process: an operation on an external stimulus or an internal representation.
What is the Reductionist Approach, what is Neisser's criticism of it, and what is the response?
Reductionist Approach: attempting to understand complex events by breaking them down into their components.
Neisser's Criticism: lacks ecological validity: not able to generalize to the real-world situations in which people think and act. Also, we can't really understand a whole by its parts.
Response: Cognitive psychology is in its early years. The reductionist approach is a starting point and later on might reveal insight into the complex events as a whole. Simple situations can sometimes reveal rather than obscure a process. Also, in the future can put the pieces back together and deal with events as a whole.
Who was Diogenes of Apollonia?
Proposed that everything consists of a common element. Thought is air, because we always breathe in air, so air must be what drives thought.
What is the transcendental method proposed by Kant?
Work backward from observed effects to infer their causes. For example, the factory analogy: infer the machinery inside from the products that come out.
Know how Aristotle and Plato differed in their ideas of universals and particulars.
Plato: universals or forms for everything, known from birth, forms are separate from particulars/things in reality, universals are a realer reality.
Aristotle: Universals/forms are learned through associations, forms are dependent and components of particulars/things in reality.
Where did St. Augustine think that cognition/ memory occurred?
Cognition/memory occurred in the stomach.
What was psychophysics?
The systematic study of the relation between the physical characteristics of stimuli and the sensation that they produce.
Who was Wilhelm Wundt?
Started first psychological laboratory and father of introspection.
What is the introspective method?
(self-observation): a method in which one looks carefully inward, reporting on inner sensations and experiences. Takes much training, only report immediate experiences that were conscious, not mediated experiences where memory intruded.
What is structuralism and who started this movement?
Edward Titchener started Structuralism: the study of the structure of the conscious mind, the sensations, images, and feelings that were the very elements of the mind's structure. But it was unscientific because Titchener was the ultimate authority to validate the observations, leading to bias, e.g. a researcher found evidence for imageless thought, but Titchener said there was no such thing!
Who was Hermann von Ebbinghaus?
Disagreed with Wundt and thought the study of the mind by objective methods was possible; one just had to figure out how. Goal: to study the mind's process of association formation using thoroughly objective methods. Needed material that had no preexisting associations, so he used nonsense syllables: CVCs. He would learn a list, later relearn the list, and note how many fewer trials it took to relearn it (savings scores). Found that forgetting is a function of time. First to devise a reasonably scientific way to study memory and mental processes; one of the strongest influences on cognitive psychology.
What is functionalism and who started this movement?
William James: his approach to psychology was functionalism: the functions of consciousness, rather than its structure, were of interest. Proposed that memory consists of two parts, short-term and long-term memory.
What is behaviorism and how did this movement affect cognitive psychology?
Behaviorism: the scientific study of observable behavior. Antimentalism: the idea that mentalistic concepts such as consciousness, memory, and mind should be excluded from psychology. Observable, quantifiable behavior was the proper topic of psychology.
Behaviorism was rejected for lack of progress, and interest in important questions. Replaced with cognitive psychology and the information processing approach.
What events led to the re-birth of cognitive psychology / science?
a. Challenges and Changes: the 1940s and 1950s: researchers began finding significant instances of instinctive behavior that defied behaviorism's laws.
b. WWII: psychologists worked on practical problems of waging war. Because psychologists had studied behaviorism, they were poorly prepared to answer the war's questions, e.g. how do well-learned tasks break down? Psychologists also met professionals in different fields and got new outlooks and perspectives, returning to their laboratories with broadened interests.
c. Verbal Learning: the branch of experimental psychology that dealt with human subjects as they learned verbal material, stimuli composed of letters or sometimes words. Ebbinghaus had done great research in this area, and much more was done by others in the 1920s and 1930s. Verbal learning researchers were not committed to any theory; they looked at behavior but were not wedded to behaviorist doctrine. Easy transition to the new cognitive psychology of the 1950s and 1960s: if you're not committed to a theory, you don't mind changing it. The researchers in the verbal learning area devised and refined lab tasks of learning and memory still used today.
d. Linguistics (Chomsky's impact): linked changes in verbal learning to the emergence of cognitive psychology in the 1960s. Skinner published a book claiming language is learned the same way animals learn. Chomsky wrote a review criticizing Skinner's scientific approach as using scientific vocabulary but being dogmatic with no substance: language is novel and uses internal rules. A painful blow to behaviorism.
What are the assumptions of cognitive psychology?
a. Mental processes exist: single most defining feature.
b. Mental processes can be studied scientifically. Unlike what Wundt and Titchener thought, objectivity and reliable method is possible.
c. Active information processing: humans are active participants in the act of cognition. Contrasted to the behaviorist who viewed the organism as passive.
What does the scientific method require of experimental cognition?
a. Guiding Principles: a Metatheory: a set of assumptions and guiding principles. The metatheory in cognitive psychology for many years was: Information-Processing Approach: coordinated operation of active mental processes within a multicomponent memory system.
What is a process model?
A hypothesis about the specific mental processes that take place when a specific task is performed. For example, the information processing model.
How do cognitive scientists measure information processing?
Reaction Time (RT): a measure of the time elapsed between some stimulus and the person's response to it (measured in milliseconds). RT is used because mental processes take time; RT differences give clues about what's going on inside. Using time-based measures as a window into the mind focuses the research.
Accuracy: earliest use by Ebbinghaus in 1885. Broad interest is in people's accuracy, such as correct recall and omitted words. More modern work also studies incorrect responses, items not on the studied list. People have high accuracy when memorizing overall ideas, but poor accuracy when it comes to exact verbatim recall.
Other types include verbal reports (used when tasks would take too long for RT to be informative, so the person reports his or her own mental processes) and neuropsychological evidence: where processes take place in the brain.
What are some common guiding analogies for cognitive scientists?
Channel Capacity: any channel—any physical device that transmits messages or information—has limited capacity. Like a telephone wire, our mind can only carry so much information before some is lost. Opened a new avenue in cognitive psych: what is this limitation, and can it be overcome?
The Computer Analogy: Computer scientists developed a machine with very essence of the human mental system.
What is the Lexical Decision Task?
Lexical Decision task: a timed task in which people decide whether letter strings are or are not English words. Process—encode to STM, search LTM, decision, and response.
What is the Standard Information Processing Approach (Atkinson & Shiffrin, 1968)?
Based on the design of computers. "Hardware" comprised of three memory components: sensory memory (input stimulates the sensory receptors), short-term memory (info is stored for a short period so it can be worked with), and long-term memory (the answer is retrieved from LTM and given to STM, which then produces a response).
What is the problem with the Standard Information Processing Approach?
Something more specific and testable was required, so the Process Model was designed.
Process Model: a hypothesis (narrowing the Strict Info-Processing Approach to something simpler and more focused) showing specific mental processes that take place when a task is being performed.
Process Model Design: Input—Encode—Search—Decision—Response
Showing limitations to the process model using the Lexical Decision task: a timed task in which people decide whether letter strings are or are not English words. Process—encode to STM, search LTM, decision, and response.
Word Frequency Effect: it takes significantly longer to judge words of lower frequency than it does to judge high-frequency words (in written language). Encoding, Decision, and Response should be the same for all low, med, and high-frequency; but search should be influenced.
Words used more frequently in our language might be stored more strongly in memory, or maybe even stored repeatedly in memory.
What are some problems with Process Models?
Sequential stages of processing, occurring on every trial: a set of stages that occur one by one and are assumed to completely account for mental processing in the task.
There is no reason to expect that humans are limited to one-by-one processing in all situations.
Independent and Nonoverlapping: any single stage was assumed to finish its operation completely before the next stage in the sequence could begin, and the duration of any single stage had no bearing or influence on the other stages. Humans may not process stages in this strictly independent, nonoverlapping way.
Parallel Processing: evidence that multiple mental processes can operate simultaneously, in parallel.
Context Effect: no mechanism to take into account the effects of context, e.g. the priming effect.
Uses mainly RT as its measure.
Explain Neisser’s (1976) New Information Processing Approach.
New Information-Processing Model: revision to Atkinson and Shiffrin’s model: Memory components are arranged in a triangle, with bidirectional arrows. Each component can now affect each other, and attentional mechanisms have explicit influence throughout.
Parallel Processing: different mental components can operate simultaneously, in parallel.
Context: with bidirectional arrows, long-term memory can have an effect on what is happening right now. Top-Down Processing: when existing context or knowledge has an influence on earlier (currently occurring) or simpler forms of mental processing.
Fixing Narrowness: emphasis on RT has shifted to include variety of techniques including verbal protocol.
What types of photoreceptors does the human retina have and how are they distributed?
Rods in peripheral vision; tens to hundreds of rods converge on one single bipolar cell, so information is lost. Rods relay black, white, and motion detection.
Cones lie in a small area known as the Fovea, which provides us with our most accurate, precise vision. One cone has its own bipolar cell for relaying impulses. Cones relay colors.
What is contralaterality?
Each half of the retina picks up visual information from the contralateral visual field (the left half of each eye picks up info from the right visual field and vice versa).
What is the Retina, what is it made of?
Retina: the retinal surface is composed of three basic layers of neurons: rods and cones, bipolar cells, and ganglion cells.
Rods and Cones (first Layer): form the back layer of neurons on the retina and are the first neurons stimulated by light.
Bipolar Cells (second layer): collect messages and pass them along to the third layer, the ganglion cells.
Ganglion Cells (third layer): axons of ganglion cells converge at the back of the eye, forming a bundle called the optic nerve (blind spot).
How does sensation differ from perception?
Sensation: (without conscious awareness) reception of stimulus from the environment and the initial encoding of that stimulus into the nervous system—transduction.
Perception: (awareness, interpretation) the process of interpreting and understanding sensory information.
What is Compression?
The information is already analyzed and summarized in the eye quite a bit before it gets to the brain—compression. Already processed and summarized record of original stimulus.
What is Helmholtz's trichromatic colour perception theory?
We perceive color through a combo of input from different cones. Only need three cone color receptors to make all combos. Long wavelength for red, medium wavelength for green and short wavelength for blue.
Evidence: color-blind people missing certain cones can't see certain colors.
What is the opponent process theory?
After staring at a color and then looking at white, you see an afterimage of the opposite color: if you looked at red you see green, blue gives yellow, black gives white.
What is the order of neural activation in visual sensation?
Photoreceptors—Bipolar Cells—Ganglion Cells—Axon of Ganglion Cells—Visual Cortex.
More photoreceptors, fewer bipolar cells, and even fewer ganglion cells.
What is the difference between bottom-up and top-down processing?
Bottom-up or data-driven processing system: processing driven by the stimulus, the incoming data.
Top-down or conceptually driven processing effects: context and higher-level knowledge influence lower-level processes.
How do visual saccades and fixations differ?
Saccades: eye sweeps from one point to another in fast movements.
Fixations: pauses where information is gathered.
What is visual persistence? And what is visual sensory memory?
Visual Persistence: the apparent persistence of a visual stimulus beyond its physical duration. You can see a lightning flash for a while after it happens; then it fades away rather than just turning off as it appears to in real life.
Visual sensory memory or Iconic memory: physical eye doesn’t keep seeing lightning; we have a temporary visual buffer that holds information for a brief period of time. Any information held beyond this point would be memory. Icon: Contents of the visual image that resides in iconic memory.
Explain Sperling’s (1960) tachistoscope experiment.
The study used a Tachistoscope: a device for presenting visual stimuli in the lab. An array of letters was presented for a brief duration (like in class), followed by a blank. Subjects were asked to report as many letters from the display as they could. Found they could report 4.5 items correctly, the same for a long exposure (500 ms) and a short exposure (5 ms).
Led to conclusion that we have a span of apprehension: the number of items recalled after any short display (4.5 items).
Sperling challenged this idea. Maybe it's not that we can only store 4.5 items, but that because of fading we can't recall the items fast enough. Maybe the storage amount is actually much larger! He compared the Whole Report condition (P's report any letters they can) with the Partial Report condition (any one of the rows is cued to be reported). P's had 76% accuracy in partial report, versus 36% in whole report, showing they could recall before fading took place. When P's waited 1 s before recall, fading took place and P's could only remember 36%, the same as the whole report condition. Shows that visual iconic memory can hold a great deal of information.
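The partial report logic can be shown with a little arithmetic. A minimal sketch, assuming the 3-row by 4-letter array of the classic version of the study; the accuracy figures are the ones above:

```python
# Sperling's partial-report logic. The 3 x 4 array size is an assumption
# from the classic study; accuracy figures come from the card above.
ROWS, COLS = 3, 4
partial_report_acc = 0.76   # accuracy on the single cued row
whole_report_acc = 0.36     # accuracy reporting the full array

# Because the cue arrives only AFTER the display disappears, accuracy on
# any cued row estimates how much of the whole array was still available
# in iconic memory at cue time.
letters_available = partial_report_acc * ROWS * COLS
letters_reported = whole_report_acc * ROWS * COLS

print(round(letters_available, 1))  # about 9 letters available
print(round(letters_reported, 1))   # but only ~4 survive whole report
```

So roughly 9 of the 12 letters were in iconic memory, about double the 4.5-item "span of apprehension," which is the point of the partial report trick.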
How does backward masking inform our understanding of iconic memory?
Original research thought that iconic memory is subject to Decay: the mere passage of time degrades the icon, making it illegible after some short interval. But ecological validity: in real life we don't have blanks after a stimulus is shown. We have Interference: forgetting caused by the effects of intervening stimulation or mental processing. Sperling did a study and found subjects were much less accurate with interference. Backward Masking: a later visual stimulus can drastically affect the perception of a previous one, e.g. subjects claim they only saw the mask. Backward masking is a particular example of Erasure: when the contents of visual sensory memory are degraded by subsequent visual stimuli (erasing the perception).
What is Focal Attention?
Focal Attention (Neisser): we are not blind during a saccade, as previously thought; we attend to a visual stimulus, and while scanning for a new visual stimulus we are still looking at the old one in iconic memory; therefore vision looks seamless!
What is the format of representation in iconic memory?
Raw sensory, uninterpreted information.
How do template theories of pattern recognition differ from Feature theories?
One has an exact template pattern stored; the other matches features in the mind to the letter in reality.
What is the Template Approach and in which spot was it?
(EARLY THEORY) The Template Approach: stored models of all categorizable patterns. Match the letter you see with the exact template stored in memory.
Criticisms: there are so many forms of letters and shapes out there that there would have to be endless storage room.
With so much stored away, it would take too much time to find a template.
What is the Visual Feature Detection Model and in which spot was it?
(MIDDLE THEORY) Visual Feature Detection: a feature is a very simple pattern, a fragment or component that can appear in combination with other features across a wide variety of stimulus patterns, e.g. a straight horizontal line. Recognize whole patterns by breaking them apart, e.g. T: one straight horizontal line and one straight vertical line.
What is the pandemonium theory?
Selfridge’s Pandemonium: mental mechanisms that process a visual stimulus. Mechanisms are demons.
Data Demons: encode the information.
Computational Demons: each demon has a simple feature it is trying to match e.g. a horizontal line, and then once match is found it shouts.
Cognitive Demons: represent letters, they listen for shouting and also begin to shout. Whichever cognitive demon shouts the loudest gets activated by decision demon e.g. T.
Decision Demon: has the final say in recognizing and categorizing the pattern.
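The demon hierarchy above is easy to sketch in code. A toy version, in which the letters, feature names, and scoring rule are all illustrative assumptions rather than Selfridge's actual inventory:

```python
# Toy Pandemonium: computational demons detect features, cognitive
# demons "shout" in proportion to matched features, and the decision
# demon picks the loudest shouter. Letters and features are invented
# for illustration.
LETTER_FEATURES = {
    "T": {"top horizontal", "center vertical"},
    "L": {"left vertical", "bottom horizontal"},
    "E": {"left vertical", "top horizontal",
          "middle horizontal", "bottom horizontal"},
}

def recognize(stimulus_features):
    """Decision demon: return the cognitive demon shouting loudest."""
    shouts = {}
    for letter, feats in LETTER_FEATURES.items():
        matched = len(feats & stimulus_features)   # features found
        missing = len(feats - stimulus_features)   # features expected but absent
        shouts[letter] = matched - missing
    return max(shouts, key=shouts.get)

print(recognize({"top horizontal", "center vertical"}))  # T
```

The shouting-match metaphor maps directly onto the argmax over match scores; the criticism that this is purely bottom-up (no context) applies to the sketch too.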
What is the connectionism theory (pattern recognition) and in which spot was it?
(NEW THEORY) Connectionist Modeling: computational in the sense that the system doesn't only read but figures out, or computes, what a stimulus means. The basic idea is making connections: large numbers of simple mathematical computations carried out over connections, as in the brain.
Input units: elementary structures, such as a horizontal-line detector, that encode the visual stimulus as input.
Hidden units (hidden inside the brain): when an input unit matches a hidden unit (a letter, e.g. H), a positive weight is attached to the hidden unit; when it doesn't match, a negative weight is attached. When enough input units match and weight a hidden unit such as H, then H is activated.
Output units: the activated hidden unit H passes weight to words containing H; when enough of a word's letters are detected, that word gets enough weight and is chosen, e.g. HEIGHT. This is where top-down processing comes in: knowledge of which letters usually go together (like TH) and which strings are words (WORK, not WORR) is also added as weight to help choose the right word.
What is the recognition by components theory
Recognition by Components (RBC): applying connectionist-style analysis to spatial objects in real life. We have basic geometric forms, called Geons, to match real-life objects to; a few dozen geons (about 36) can be combined to comprise every object in the world.
We find the edges of objects, regardless of orientation and specific characteristics of the object.
Scan regions of pattern where lines intersect. Edges and areas of connection enable us to determine which basic components are present in the pattern.
Evidence: intersections or junction points are crucial to understanding what the object is.
Shortcomings of RBC: major difficulty, same as pandemonium, tied to bottom-up processing, not accounting for context effect.
Evidence that object recognition comes from two places in the brain, one for seeing parts (bottom-up), one for seeing the whole (top-down).
How quickly does pattern recognition copy information from sensory store to working memory?
Pattern recognition copies about 1 letter every 10 msec, up to a max of 5.
What is prosopagnosia? Apperceptive agnosia? Associative agnosia?
Prosopagnosia: a disruption of face recognition.
Apperceptive Agnosia: a basic disruption in perceiving patterns. Can see the color and texture, but can’t put information into a whole.
Associative Agnosia: seems able to construct a mental percept, but the person still cannot associate the pattern with meaning, still cannot link the perceived whole with stored knowledge about its identity.
What are the implications of agnosias for cognitive science?
Shows separate step in linking pattern to meaning.
What is Auditory Sensory Memory?
Auditory Sensory Memory or Echoic Memory: brief memory system that receives auditory stimuli and preserves them for some amount of time for mental system to gain access to it.
What did Darwin find about amount and duration of Auditory Sensory Memory?
Darwin et al. used a shadowing task: a tape-recorded message into the left ear and a different one into the right; the participant was asked to shadow (repeat) one message. Found two differences from visual sensory memory:
Not as much storage space in auditory memory.
Sensory traces last in auditory memory for a longer time, especially if they represent simpler information.
What is the Modality Effect?
Modality Effect: superior recall of the end of the list when the auditory mode is used instead of the visual mode of presentation.
Saying "zero" after hearing the list degraded or erased the last digits in the list.
Both visual and auditory sensory memory hold info for a certain time, but auditory duration varies with the complexity of the stored info.
Both lose sensory memory over time, or through erasure when other stimuli are present.
What is explicit and implicit processing and give an example of each.
Explicit Processing: involving conscious processing, conscious awareness that a task is being performed, and usually conscious awareness of the outcome of that performance e.g. first time driving standard.
Implicit Processing: processing without necessary involvement of conscious awareness, e.g. driving standard for years.
What is attention?
Attention as a mental process.
Attention as a limited mental resource, or fuel.
What is input attention?
Input attention: the basic process of getting sensory information into the cognitive system.
What is alertness and arousal?
Alertness and Arousal: necessary states of the nervous system. Arousal depends on the environment, e.g. on a first date the nervous system is aroused in order to pay attention to the date.
What is Overt Orienting Response?
The Overt Orienting Response or Orienting Reflex (visible from an outside point of view): the reflexive redirection of attention that orients you toward an unexpected stimulus, e.g. a light flashes, grabs your attention, and you turn to see it. A location-finding response of the nervous system that evolved to protect the self against danger.
When the novel stimulus that originally grabbed your attention becomes normal again, this is called habituation.
What is Spotlight attention?
The Covert Orienting Response or Spotlight Attention (can't be seen from an outside point of view): shifting the focus of attention to a stimulus without moving the body or eyes. The spotlight doesn't sweep continuously; it jumps, like saccades.
What is Selective Attention?
Selective Attention: in any moment we are bombarded with sensory information. The ability to attend to one source of info while ignoring or excluding other messages around us.
What is the difference between input attention and controlled attention?
Input Attention: the fast, unconscious, automatic processes of attention; data-driven processing.
Spotlight attention, although it sits between automatic and controlled processing, still does not require conscious effort.
Controlled Attention: requires controlled, conscious effort; allows us to respond to data-driven input with knowledge.
What is the Dual Task Method?
Dual Task or Dual Message Procedures: in an effort to see how we process audition, two tasks or messages are presented such that one task or message captures the person's attention as completely as possible.
Describe two important factors related to selective attention in Norman’s Pertinence Model.
Norman’s Pertinence Model (late selection, things processed way more before selecting to attend to one): all messages receive attention for sensory activation: loudness of info, and pertinence: importance of info. Highest combination of both is selected for attention.
Describe Johnston and Heinz’s Multimode Model of Attention.
Multimode Model of Attention: selective attention can happen at any stage, but the higher the stage, the more capacity it uses, subtracting from what is available for other conscious processes.
Stage 1: sensory analysis, selection occurs based on physical differences e.g. tone, loudness…
Stage 2: grammatical and semantic analysis, selection based on meaning.
Stage 3: awareness, remembering content. Uses up much cognitive capacity, slowing the ability to multitask, e.g. during a test you didn't realize the light was turning on and off.
How does Kahneman’s Capacity theory differ from the early, middle or late section models proposed by Broadbent, Treisman and Norman?
Differs in that Kahneman focuses on resources/attention capacity as our limitation.
What is the Filter Theory, what is its flaw and which level of selection is it?
Broadbent's Filter Theory (early selection, deciding what to attend to early): an auditory mechanism acts as a selective filter; only one message can be passed through the filter at a time.
Flaws: we notice messages we are not attending to, e.g. our name being said. Some information can slip past the filter.
What is the Attenuation Theory, who started it and which level of selection is it?
Treisman's Attenuation Theory (selection at a middle level): all incoming messages receive low-level attention to the physical characteristics of the message. When an unattended message is found to be unimportant, it is attenuated (tuned down rather than fully blocked out). Leaves room for top-down processing and understanding content, and works within the cognitive apparatus, not just filtering based on differences in sound, tone, pitch, etc.
What is the Pertinence Model, who started it and which level of selection is it?
Norman’s Pertinence Model (late selection, things processed way more before selecting to attend to one): all messages receive attention for sensory activation: loudness of info, and pertinence: importance of info. Highest combination of both is selected for attention.
What is Kahneman’s Attention Model?
Attention as fuel, and fuel is limited. Psychological Refractory Period or Attentional Blink: a brief slowdown in mental processing due to having processed another very recent event; the attention needed to process one stimulus slows down processing of the next stimulus. Some things require Controlled Attention (not daydreaming; a willful act to keep reading); some things require only automatic processing.
What were the main findings of Shiffrin & Schneider’s (1977) study on controlled versus automatic search.
Visual Search (in terms of spotlight attention): evidence shows that when T is easy to find (the other letters are all spread out and round, and T is the only angular one), search is largely automatic and must represent very early visual processing. When T is hard to find among L's, visual search rates depend critically on the kinds of distracter patterns through which subjects are searching. But automaticity can be learned with lots of training; it is very specific, so changing the target in any way makes the search hard again.
What is automaticity?
Consumes little if any attentional fuel.
Not open to awareness or introspection.
Occurs without intention or conscious decision.
What are the disadvantages of automaticity?
It is hard to undo what has become automatic.
Sometimes we should be consciously aware of a process that has become too automatic.
Explain and describe the stroop effect.
Stroop task: showing color words in three states: neutral, color words printed in the same ink color, and color words printed in a different ink color. Participants must name the ink color, not read the word. Reading the word RED is automatic, activating its meaning and demonstrating Priming. So when the word RED is printed in GREEN ink, the automatic reading interferes, delaying the response of naming the ink color.
What is Conscious Processing?
Occurs only with intent, with deliberate decision.
Open to awareness and introspection.
Uses attentional fuel, less to go around to other processes.
Describe the procedures, results and implications (including the cost and benefits) of Posner and Snyder’s (1975) study on high validity and low validity priming.
P's were shown a prime, either neutral (+), facilitating (H), or interfering (D), and then were shown a target (HH); P's said whether the two target letters were the same or not. The prime (H) could help 80% of the time (HIGH VALIDITY), or the prime (D) could mislead 80% of the time (LOW VALIDITY).
i. Low validity: a bottom-up priming effect; P's were not relying on being helped, so a misleading prime didn't have much of a cost.
ii. High validity: a top-down priming effect; P's relied on the prime helping them, so when the prime was wrong they were really misled and got a lot wrong.
What is hemi-neglect?
Hemineglect: disruption or decreased ability to look at something in the (often) left field of vision and pay attention to it. Cannot voluntarily direct attention to half of the perceptual world, be it visual, auditory, or any other type of sensation.
How does Logan (1988), in his Instance Theory of Automaticity, explain how something becomes automatic?
In doing a task we can do it 2 ways: automatically, or, if it is new, starting with a controlled algorithm, a slow sequence of processes guaranteed to work. Each time we do the task, an instance of the task is stored in memory. It is hard to find instances if there are only a few, but the more times we do the task via the algorithm, the more instances are stored in memory, the easier it is to find an instance, and the task becomes automatic. Needle-in-a-haystack analogy.
What is the Horse Race Model?
In Logan's theory, each time the task is done, the stored instances race the algorithm, and another instance is stored. When there are very many instances, an instance will beat the algorithm and win the race, meaning the task is automatic.
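The race can be sketched as a tiny simulation. A minimal sketch, in which the 800 ms algorithm time and the gaussian retrieval-time distribution are invented for illustration, not Logan's actual parameters:

```python
import random

# Toy simulation of Logan's instance-theory race: the fixed algorithm
# races every stored memory instance; whichever finishes first
# determines the response time. All times are illustrative assumptions.
ALGORITHM_MS = 800  # slow, step-by-step procedure; always available

def trial_rt(n_instances, rng):
    # Each stored instance is retrieved in a random amount of time;
    # more instances mean more chances that one beats the algorithm.
    retrievals = [rng.gauss(900, 200) for _ in range(n_instances)]
    return min([ALGORITHM_MS] + retrievals)

def mean_rt(n_instances, trials=2000, seed=0):
    rng = random.Random(seed)
    return sum(trial_rt(n_instances, rng) for _ in range(trials)) / trials

# With more practice (more stored instances), mean RT drops: the task
# has shifted from the algorithm to automatic memory retrieval.
for n in (1, 10, 100):
    print(n, round(mean_rt(n)))
```

Printing the means shows RT falling as instances accumulate, which is the needle-in-a-haystack point in reverse: more needles make one easier to find fast.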
According to Sternberg, how do we access info from STM?
The Sternberg Task: P’s first stored list of letters called memory set. Then saw a single letter, the probe, and made yes-no response depending if probe letter was in letters stored. Repeated several hundred times. In the process model for short-term memory scanning, Sternberg deduced that the time to encode, make yes/no decision, and respond would be the same for all. The only step that would differ would be the search and comparison for each probe.
Sternberg’s Results: the search rate through short-term memory is approximately 38 ms per item, very fast. Two possible outcomes he could have found:
Serial Self-terminating Search: items in short-term memory are scanned one by one, and the scan stops when a match is found. The graph of yes’s would be faster than no’s, because yes trials self-terminate earlier on average, giving quicker RTs.
Parallel Search: all items in short-term memory are scanned simultaneously. The graph would be flat: it would take no longer to scan a large memory set than a small one.
The answer is Serial Exhaustive Search: the memory set is scanned one item at a time (serial), and the entire set is scanned on every trial, whether or not a match is found (exhaustive).
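The serial exhaustive account predicts a straight-line RT function with the same slope for "yes" and "no" trials. Here is a sketch of that prediction using Sternberg's ~38 ms/item slope; the 400 ms intercept (encoding + decision + response) is an assumed, illustrative value, not a figure from the study.

```python
# Sketch of the serial exhaustive prediction: RT is linear in memory-set size,
# with identical slopes for "yes" and "no" trials. The 38 ms/item slope comes
# from Sternberg's data; the 400 ms intercept is a hypothetical value for the
# constant stages (encode + decide + respond).
SLOPE_MS = 38       # comparison time per item in the memory set
INTERCEPT_MS = 400  # assumed fixed time for the constant stages

def predicted_rt(set_size):
    """Exhaustive scan: every item is compared on every trial, match or not."""
    return INTERCEPT_MS + SLOPE_MS * set_size

for n in range(1, 7):
    print(f"set size {n}: predicted RT = {predicted_rt(n)} ms")
```

A self-terminating scan would instead predict a "yes" slope about half the "no" slope (a match is found, on average, halfway through the set), which is not what Sternberg observed.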
How did Sternberg determine how long it takes us to scan/search STM?
Since the other stages (encode, decide, respond) are constant across trials, the only step that differs is the search and comparison for each probe; the slope of the RT function across memory-set sizes therefore isolates the scan time, about 38 ms per item.
What is the capacity of STM and how was the capacity found to be increased?
7 ± 2 chunks (Miller). Capacity is effectively increased through recoding: grouping items into larger, meaningful chunks.
What is recoding?
Recoding: the process of grouping items together and then remembering the newly formed groups. It has an active, attention-consuming nature. The goal of a recoding scheme is to make the newly formed units as meaningful, and as related to easily retrievable information, as possible, e.g., remembering a 10-digit phone number as 3 chunks rather than 10 separate digits.
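Recoding can be sketched as a tiny chunking function; the phone number digits and group sizes below are made up for illustration.

```python
def recode(digits, group_sizes):
    """Group a digit string into larger chunks, reducing the number of
    units that short-term memory must hold (e.g., 10 digits -> 3 chunks)."""
    chunks, i = [], 0
    for size in group_sizes:
        chunks.append(digits[i:i + size])
        i += size
    return chunks

# Hypothetical phone number: 10 single digits exceed the 7 +/- 2 limit,
# but recoded as 3 chunks it fits comfortably.
print(recode("5551234567", [3, 3, 4]))  # ['555', '123', '4567']
```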
What are the limitations to Sternberg’s conclusions on Serial Exhaustive Search?
One limitation is interpreting reaction time as necessarily meaning serial exhaustive search; e.g., it could be a parallel search in which longer memory sets simply slow the overall search.
Another is the assumption that the several stages or processes are sequential, each one completed before the next begins.
STM can hold a variety of informational codes. Name four codes and provide evidence for each.
Verbal Codes: early research held that short-term memory used only an acoustic verbal code. Even when letters were presented visually, they were stored acoustically, e.g., P’s made acoustic confusion errors such as D for E but few visual errors such as F for E. The code is usually acoustic-articulatory, because either the sound (acoustic code) or the pronunciation (articulatory code) is important. But this is not the only format; many formats have since been found.
Semantic Code: short-term memory can also retain a semantic, meaning-based code. Wickens’ release-from-PI study: P’s recall of items from the same (semantic) category got worse and worse, but when the category changed, recall improved.
Visual Code: short-term memory can also hold a visual code. Shepard and Metzler had P’s perform a complex, visually based mental task: holding a mental image in short-term memory, then rotating it to compare with a second figure. Mental rotation takes time (RT grows with the angle of rotation); this could not have been done with acoustic or verbal codes.
Other Codes: kinesthetic image e.g. riding a bicycle, feels like physical memory.
What are three differences between STM and WM?
Baddeley shifted emphasis from short-term memory to working memory.
i. Short-term memory:
1. Has focus on the input and storage of new information.
2. No components.
ii. Working memory:
1. A mental workbench, a place where conscious mental effort is applied and information is manipulated.
2. Has components: the central executive, the visuo-spatial sketchpad, and the phonological loop.
Explain the serial position curve.
Serial Position Curve: a graph of item-by-item accuracy on a recall task; serial position 3 is the third item in the list.
Serial Position Effects: Primacy effect: good recall for the early list positions (strong primacy reflects rehearsal, weak primacy a lack of rehearsal); due to LTM. Asymptote: the average, usually poor, memory for information in the middle of the sequence. Recency effect: the high level of correct recall on the final items of the originally presented list; due to STM.
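The U-shaped curve can be illustrated with a toy model that combines a rehearsal-based primacy advantage with a small recency buffer. Every parameter here is invented for illustration, not fitted to data.

```python
# Toy serial position model: early items get extra rehearsal (LTM/primacy),
# the last few items are still in STM (recency). All parameters are made up
# for the demo.
def recall_probability(position, list_length, buffer_size=4):
    primacy = max(0.0, 0.5 - 0.1 * (position - 1))   # rehearsal advantage fades
    recency = 0.5 if position > list_length - buffer_size else 0.0
    asymptote = 0.25                                  # baseline for middle items
    return min(1.0, asymptote + primacy + recency)

curve = [recall_probability(p, 15) for p in range(1, 16)]
for p, prob in enumerate(curve, start=1):
    print(f"position {p:>2}: {prob:.2f}")
```

The printed values are U-shaped: high at the start (primacy), low in the middle (asymptote), high again at the end (recency). A distractor task at recall would, in this picture, wipe out only the recency bump.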
What is the effect of distraction, word frequency and rate of presentation on the serial position curve?
Distraction at the end of the task overwrites the recency portion, because those items are still in short-term memory; it does not change primacy or the asymptote.
Word frequency (how common a word is in the language): high-frequency words are remembered better, mainly in the primacy portion.
Rate of presentation: slower presentation allows more rehearsal, improving the primacy and asymptote portions while leaving recency unchanged.
What is the difference between dissociation and double dissociation?
Dissociation is when a manipulation of the serial position task affects only one system, either LTM (primacy) or STM (recency), while the other is unaffected.
Double dissociation is the reciprocal pattern across two manipulations: one affects LTM but not STM, while the other affects STM but not LTM.
What is release from PI?
Wickens’s Release from PI: when the decline in performance caused by PI is reversed because of a switch in the to-be-remembered stimuli, e.g., switching from remembering letters to remembering numbers.
What is the Brown-Peterson Task?
Brown-Peterson Task: P’s were shown a three-letter stimulus and a three-digit number, then asked to count backwards by threes from that number. The letters were quickly forgotten from short-term memory. Decay?
What is the difference between decay and interference in STM?
Decay: forgetting might be caused simply by the passage of time. The Brown-Peterson task was used to test this and found letters were quickly forgotten from short-term memory. WHY?
Interference versus Decay in Short-term Memory: the evidence was first interpreted as decay, since counting backwards with numbers should not interfere with letters. On closer look, Waugh and Norman argued the forgetting might be caused by interference, not decay. Decay is almost impossible to test: if you give items to remember followed by a blank period, P’s will use the blank period to rehearse, and then you are testing long-term rather than short-term memory.
So conclude that different kinds of interference produce different kinds of forgetting; the results cannot be explained by simple decay.
What are the two kinds of interference?
Keppel and Underwood:
Proactive Interference (PI): when older material interferes forward in time with your recollection of the current stimulus. The more trials in the Brown-Peterson task, the harder recalling the current stimulus becomes, because the previous trials generate interference.
Retroactive Interference (RI): newer material interferes backwards in time with recollection of older items.
What is the key difference between explicit and implicit memory?
Explicit memory is conscious; it includes episodic and semantic memory.
Implicit memory is unconscious (not covered in this class).
What types of information are ‘stored’ in episodic and semantic memory?
Episodic is a person’s autobiographical memory, memory of the personally experienced and remembered events of a lifetime. Different for every person.
Semantic is general world knowledge, relating concepts and ideas to one another, e.g., what does "mother" mean? Similar for people within a culture.
Describe the mnemonic devices (1) Method of Loci and (2) Peg Word.
Method of Loci: based on visual imagery and a set of well-known locations. Mentally place the to-be-remembered list items in the locations (e.g., places in your house); the more outrageous the image, the better it is remembered.
Peg Word: a pre-memorized set of words serves as a sequence of mental pegs onto which to-be-remembered items can be hung, e.g., "one is a bun, two is a shoe"; to remember "boat" then "car," image a boat stuffed in a bun and a car wearing a shoe.
What was Ebbinghaus’s major contribution to the study of memory, what was the problem?
His major contribution was the CVC nonsense syllable, a way of studying the learning of genuinely new information. He developed savings scores and the forgetting curve, and found that longer lists are remembered better because they take more time to learn; with continued relearning there was no forgetting at all.
The problem: CVCs had no meaning, and memory relies heavily on meaning.
What are metamemory and metacognition?
Understanding and insight into the workings of our own memory and overall cognitive system. Metamemory: knowledge about one’s own memory, how it works, and how it fails to work. Metacognition: knowledge about one’s own cognitive system and its functioning.
What is the Isolated Effect?
Isolated effect (von Restorff effect): improved memory for one piece of info that is made distinct or different from the info around it.
Outline the Levels of Processing view (Craik & Lockhart). What was a major problem with this view?
a. 2 types of rehearsal: maintenance (repetition) and elaboration.
Level 1: maintenance rehearsal is seen as shallow processing.
Level 2: elaborative rehearsal is seen as deep processing.
Baddeley’s criticism: there is no method for deciding ahead of time whether a particular kind of rehearsal will prompt shallow or deep processing; the reasoning is circular.
Task effects: different types of tasks produce different kinds of results.
What was the main finding in Bousfield’s (1953) experiment on organization and storage of information in memory?
A list of items was presented, some random, some similar. At test, subjects recalled the items in clusters, grouping by similarity whether or not the list encouraged it at encoding, as a means of rehearsal. Bousfield used a free recall task (a recall task in which items may be recalled in any order) and found P’s tended to recall items by category: the way material was stored governed the way it was recalled. Organization is a necessary condition for memory; any information stored in memory is organized. Rehearsal came to be viewed as organization.
What is the paired-associate learning task argument against decay (Endel Tulving)?
Paired-Associate Learning: a list of stimulus terms is paired, item by item, with a list of response terms; after learning, each stimulus term should prompt recall of its response term, e.g., tree-jam, ice-fly. The number of correct responses grew across repeated trials. P’s were then asked to learn a second paired list in which some word pairs were very similar to the first, e.g., plant-leaf changed to plant-tree. Both proactive and retroactive interference were found.
Problems for decay: the old idea was that when a new list is learned, the old list is simply forgotten; instead, the old list actively interfered with recall of the new list.
What are Paivio’s findings on the dual coding hypothesis?
Paivio used paired-associate learning: learn the list so that the correct response item can be reproduced when the stimulus item is presented, e.g., elephant-book, where seeing "elephant" cues "book." Paivio’s dual coding hypothesis: words that denote concrete objects, as opposed to abstract words, can be encoded into memory twice, once in terms of their verbal attributes and once in terms of their imaginal attributes, e.g., "book" is both a word in the head and an image of a book, and both help encoding. P’s remembered better when the recall test was consistent with the P’s type of rehearsal.
What did Jenkins & Dallenbach do and what were the results of this experiment?
Two groups: after studying the material, one group slept 8 hours while the other did not. P’s who slept escaped the interference of daily life and performed better on the recall test.
What is Encoding Specificity?
Encoding Specificity: in Tulving and Thomson’s view, encoded information is not a set of isolated terms. Each item is encoded into a richer memory representation, one that includes any extra information present during encoding. Retrieval depends on the way the item was encoded: retrieval is better in the original context in which the items were learned (e.g., underwater). Retrieval Cue: a useful prompt for the information to be retrieved (e.g., a cue for recalling "cat").
What were Godden and Baddeley showing with deep-sea divers?
Context dependency of memory: you are better at retrieving information in the same context/environment in which you learned it (underwater).
What is Dissociation and Double Dissociation?
Dissociation: disruption in one component of the cognitive system with no impairment of another, e.g., A is impaired while B is intact. Double Dissociation: reciprocal patterns of cognitive disruption, e.g., in one case A is impaired and B intact, and in another case B is impaired and A intact. Association: A and B are related, so damage to one leads to damage to the other.
What are retrograde and anterograde amnesia?
Retrograde Amnesia: loss of memory for events before brain injury. Anterograde Amnesia: disruption of memory for events occurring after brain injury, especially disruption in acquiring new LTM.
What was the main idea of H.M.’s condition?
H.M. could form no new explicit memories (anterograde amnesia), while his implicit learning was spared; the main idea is that the brain region he lost is needed to encode new connections into long-term memory.
Define semantic memory and give an example of a semantic memory.
Semantic Memory: the permanent memory store of general world knowledge, described as a thesaurus, dictionary, and encyclopedia. (By contrast, any time you say "I remember learning that," it is episodic memory.)
What were the results of Loftus & Palmer’s (1974) experiment?
P’s were shown traffic safety films, then asked to recall how fast the cars were going when they "crashed" versus when there was an "accident." P’s asked with "crashed" remembered the cars going faster.
Be able to describe the components in Collins & Quillian’s “Network” model (i.e., what are nodes? Spreading activation? Intersection search?)
Collins and Quillian (and Loftus) Model: looked at structure of semantic memory and process of retrieving information.
i. Nodes in a Network:
1. The structure is said to be a Network: an interrelated set of concepts or interrelated body of knowledge. Each concept is represented as a Node: a point or location in the semantic space. Nodes are linked together by Pathways: labeled, directional associations between concepts.
ii. Spreading Activation: the mental activity of accessing and retrieving information from this network. A concept sits at baseline until encountered; reading "machine," for example, activates the machine node. Activation then spreads from that concept to all concepts to which it is linked, e.g., dig, metal, bolts… The connections between nodes record elementary facts or propositions: relationships between two concepts, e.g., "isa" links and property statements.
iii. Intersection Search: when the spreads of activation encounter one another. Several intersections may be found; a decision stage determines which intersections are valid and best suited.
iv. Related Concepts: spreading retrieves not only relevant pathways; relatedness can also be based on lexical or phonological features, e.g., sound, as well as semantics.
How was Collins and Quillian’s model tested regarding hierarchical memory?
The test: two concepts closer together in the network should take less time to verify (yes/no) than concepts farther apart. RT indeed increased as semantic distance increased from level 1 to level 3. Conclusion: semantic memory has a hierarchical nature (it takes longer to verify a relationship farther apart in the hierarchy, e.g., at level 3).
What were the problems with Collins and Quillian’s network model, and what were the revisions to their model and Smith’s?
The problem was Cognitive Economy: it is too much to store every feature with every concept (node), so only nonredundant facts are stored in memory. To save space, anything true of "animal" (skin, legs, eyes) is also true of concepts stored under animal, like "bird." This is Inheritance: the members of a category possess, or inherit, the properties of the category itself. Conrad, however, found little evidence for this economical scheme.

The revision to Collins and Quillian’s model incorporated three issues: no strict hierarchical organization; properties of a concept are linked directly to the concept instead of through indirect pathways; and pathways differ in length, with higher-frequency (typical) features on shorter pathways and lower-frequency (atypical) features on longer ones. Overall, the higher the semantic relatedness between concepts, the shorter the pathways and the faster the RT. Semantic Relatedness Effect: concepts that are more highly interrelated can be retrieved and judged true more rapidly than those with a lower degree of relatedness.
Amount of Knowledge: P’s with more knowledge read faster and found it easier to integrate new related knowledge into memory.
What is Smith’s Feature Comparison Model? (i.e., what are feature lists? Defining features? Characteristic features?)
Feature Lists: semantic memory is considered a collection of lists made of semantic features: simple, one-element characteristics or properties of a concept. Defining Features (essential features) sit at the top of the list; less essential, typical properties lower on the list are Characteristic Features. E.g., "robin" is stored with a list of its features, with bird-defining features at the top.
Retrieval is Feature Comparison, e.g., is a robin a bird?
1. Stage 1: a rapid, global comparison of all the features. If hardly any features are in common, or very many are, respond with a quick "no" or "yes."
2. Stage 2: if there is intermediate overlap, return for another comparison of only the defining features. This takes longer, so the yes or no reply is slower.
What were the problem with and contribution of Smith’s Feature Comparison Model, and what were the revisions to his model and Collins and Quillian’s?
Property Statements: Smith’s model had a problem with category versus property relatedness. Category lists have features, e.g., the "bird" list includes wings, feet…, but what would be on a property list of "things with wings"? And why compare lists at all when "robin" already has wings on its own list? There is also a problem with relative properties: "an ostrich is large" depends on the comparison; an ostrich is small compared to a mountain.
Typicality Effects: the typical members of a category can be judged more rapidly than atypical members. Rosch found some items are listed as members of a category much more frequently than others, and judgments about these more frequent items are faster.

The revision incorporated three issues: no strict hierarchical organization; properties of a concept are linked directly to the concept instead of through indirect pathways; and pathways differ in length, with higher-frequency (typical) features on shorter pathways and lower-frequency (atypical) features on longer ones. Overall, the higher the semantic relatedness between concepts, the shorter the pathways and the faster the RT. Semantic Relatedness Effect: concepts that are more highly interrelated can be retrieved and judged true more rapidly than those with a lower degree of relatedness.
Amount of Knowledge: P’s with more knowledge read faster and found it easier to integrate new related knowledge into memory.
What is the ‘typicality effect’?
Semantic Categories: Rosch’s research revealed repeatedly that natural concepts and categories have an internal structure. Category members vary in their typicality, in how well they represent or belong to the category, e.g., a sled is both a toy and a vehicle. Features come in correlated bundles, and some members represent the category better than others.
What is a prototype?
Prototypes: the central, core instances of a category, "a really red red." Typical members are stored close to the prototype and atypical members farther away, e.g., typical birds are robin and crow; an atypical bird is chicken.
Describe priming. (define: prime, probe/target, costs/benefits)
Prime: any stimulus presented first to see whether it influences some later process or information. Target (probe): the stimulus that follows the prime, the later information. Facilitation (a benefit): when the target is easier or faster to process because it was primed. Inhibition (a cost): a negative influence of the prime on processing the target.
Define stimulus onset asynchrony (SOA).
Stimulus Onset Asynchrony (SOA): the length of time between the onset of the prime and the onset of the target.
What was the study using SOA’s done by Neely and what did he find?
Priming is Automatic: we access word meanings automatically (Neely). The study used three conditions: (1) primes that mostly facilitated (think of BUTTER, then see BREAD), (2) primes that mostly inhibited (think of BUTTER, then see FORK), and (3) a neutral prime (see X, then BREAD). Neely also manipulated the SOA and found a benefit at short SOAs but a much larger benefit at long SOAs (more time for activation to spread to prime-related concepts). This shows automatic spreading at short SOAs, but automatic plus top-down priming at long SOAs. A second experiment told P’s to expect the opposite: if shown a body-part prime, expect a machine part, and vice versa. At short SOAs the expectation produced no priming effect, because there was no time for top-down processing to reverse the words, but at long SOAs it did, because P’s could think ahead and activate the expected network.
Is priming implicit or explicit?
Priming is an Implicit Process: in Marcel’s study the prime was followed by a scrambled visual pattern (backward masking). P’s said they did not see the prime at all, yet the primes still facilitated lexical decisions, e.g., child-infant. But priming can also be explicit: Neely’s expect-a-switch condition showed a priming effect at long SOAs.
What is connectionism (knowing)?
Connectionism: a framework in which interconnected nodes in a network, pathways, and priming can be studied.
Connectionist Models (PDP Models): contain a massive network of interconnected nodes. Nodes can represent almost any kind of information, from the line segments used in letter recognition to "has wings." Pathways are weighted wherever two nodes relate to one another: positive weights indicate pathways that facilitate, negative weights pathways that inhibit, e.g., furniture-chair at +0.8 primes "chair," and chair-sofa at +0.7 then favors "sofa."
ii. Connectionism and the Brain: gives us a tool for understanding the brain. The models resemble the structure of networks of neurons: units fire or not, like neurons; positive and negative weights mimic excitatory and inhibitory synapses; and spreading activation and inhibition co-occur, like parallel processing in the brain.
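A minimal spreading-activation sketch over such a weighted network, reusing the illustrative furniture/chair/sofa weights mentioned above; the decay factor and the extra links are assumptions for the demo, not part of any specific model.

```python
# Minimal spreading-activation sketch over a weighted network. The
# furniture-chair (+0.8) and chair-sofa (+0.7) weights echo the card's
# example; the other links and the decay factor are invented for the demo.
# Positive weights facilitate; negative weights would inhibit.
network = {
    "furniture": {"chair": 0.8, "sofa": 0.6},
    "chair": {"sofa": 0.7, "table": 0.5},
}

def spread(source, activation=1.0, decay=0.5):
    """Activate a node, then pass weighted, decayed activation to its neighbors."""
    levels = {source: activation}
    frontier = [source]
    while frontier:
        node = frontier.pop()
        for neighbor, weight in network.get(node, {}).items():
            passed = levels[node] * weight * decay
            # Keep only the strongest activation a node has received so far.
            if passed > levels.get(neighbor, 0.0):
                levels[neighbor] = passed
                frontier.append(neighbor)
    return levels

print(spread("furniture"))
```

Related concepts like "chair" and "sofa" end up partially activated before they are ever presented; that pre-activation is exactly what makes a primed target faster to process.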
How is context important to words?
Words are mostly ambiguous: a word has more than one meaning, e.g., what does "count" mean? Context is needed so the correct meaning can be retrieved from memory; the effect of context is an effect of priming.
What did Meyer & Schvaneveldt find about priming and the lexical decision task?
Priming and the Lexical Decision Task (P’s judge whether a string of letters is a word): Meyer and Schvaneveldt presented two letter strings at a time; respond yes only if both are words, e.g., BREAD-BUTTER. People judged BREAD-BUTTER more quickly than NURSE-BUTTER. The task does not require looking up a word’s meaning, so the result shows we look up meaning automatically anyway.
What is Anomia?
Anomia: the complete and successful retrieval of the semantic concept but inability to find any part of the word that names that concept.
What are the 7 sins of memory?
a. Sins of Omission (can’t remember):
i. Transience: the tendency to lose access to information across time, whether through forgetting, interference, or retrieval failure.
ii. Absent-mindedness: Everyday memory failure in remembering information and intended activities, probably caused by insufficient attention or superficial, automatic processing during encoding.
iii. Blocking: temporary retrieval failure or loss of access, such as the tip-of-the-tongue state, in either episodic or semantic memory.
b. Sins of Commission (remember but in error):
i. Misattribution: remembering a fact correctly from past experience but attributing it to an incorrect source or context.
ii. Suggestibility: the tendency to incorporate information provided by others into your own recollection and memory representation.
iii. Bias: the tendency for knowledge, beliefs, and feelings, to distort recollection of previous experiences and to affect current and future judgments and memory.
iv. Persistence: the tendency to remember facts or events, including traumatic memories, that one would rather forget, that is, failure to forget because of intrusive recollection and rumination.
What is hypermnesia?
Bartlett’s Research: studied the process of memory for meaningful material. P’s studied material, recalled it once shortly afterward, and again at later intervals. Comparing the changes in recall revealed two sides to the task: short periods between tests actually improve recall, called hypermnesia, but over longer periods between tests forgetting occurs.
What is reconstructive memory?
Reconstruction (Bartlett) versus episodic recall: recall is not purely episodic; P’s add in new things. One explanation is proactive interference. Another is that what we already know exerts a strong influence on what we remember about new material: reconstruction.
Extensions of Reconstructive Effects: knowledge of the theme or topic improves people’s memory of the passage.
How did Ebbinghaus & Bartlett differ in their approach to studying LTM?
Ebbinghaus studied data-driven information in LTM; Bartlett studied meaningful, top-down information in LTM.
What were Bartlett’s major findings of how people recalled material over time?
Bartlett’s Research: studied the process of memory for meaningful material. P’s studied material, recalled it once shortly afterward, and again at later intervals. Comparing the changes in recall showed that memory is reconstructive: we construct a memory by combining elements from the original material with existing knowledge (schemata).
i. Two notable aspects:
1. P’s failed to recall details, and the stories became shorter.
2. There was a strong tendency to normalize or rationalize the occurrences in the story, adding to and altering it: a bias of top-down processing.
What is a schema? Script? Header? Frame? Default Value?
Schemata: stored frameworks or bodies of knowledge about some topic. "Script" is often used as a synonym for a schema of a routine activity.
i. Headers: phrases or words that activate a script, e.g., "John went to a restaurant, got a burger, and paid the check."
In the story of a man in a restaurant who forgot his glasses, we assume reading the menu will be a problem even though no menu was mentioned, because the restaurant script has been activated, in turn activating Frames: details about specific events within the script. In script terminology, the menu is a Default Value: the common, typical value of the concept that occupies the frame.
Describe the Bransford & Franks (1971) study on semantic integration.
Semantic Integration: Bransford and Franks looked at how people acquire and remember ideas. P’s listened to 24 sentences (mixed-up parts from 4 basic idea groups), answering a simple question about each. After a break they were given more sentences and had to answer yes or no as to whether each had appeared in the original procedure, also rating their confidence. P’s most confidently "recognized" the sentences that expressed the overall idea groups most thoroughly, producing false positives (calling them old) when they were actually new. Conclusion: P’s had constructed a holistic semantic idea in which related information was stored together in memory: Semantic Integration.
What did Sulin and Dooling find in study about Hitler and Helen Keller?
A theme can distort recall. Sulin and Dooling used one story but changed the name: one group read a random name, the other read about Hitler. At recall, P’s in the Hitler group (because of existing knowledge about Hitler) showed distortions of the story toward the Hitler theme. This thematic effect grew stronger when P’s were tested one week later.
Sulin and Dooling did another experiment, this time using Helen Keller, and concluded that thematic effects are prominent during retrieval or recall, because they were observed a full week after exposure to the passage.
What’s the difference between technical accuracy and content accuracy in memory?
Technical Accuracy (Chapter 6; data-driven): recalling or recognizing exactly what was experienced. For meaningful material there is also Content Accuracy: recalling or recognizing the meaning or content of what was experienced.
Is technical accuracy important? Both kinds are important, e.g., data-driven accuracy for knowing a password, content accuracy for the ideas in what your professor said yesterday. The whole point of memory seems to be to combine the two; if it didn’t, we would remember only isolated fragments.
What is a proposition? How did Sachs’ study support the validity of propositions?
Propositions are simple relationships between two concepts, e.g., "a robin has wings." The propositional approach: the set of semantic nodes connected by labeled pathways, where the entire collection of concepts and relationships expresses the meaning of a sentence or coherent phrase.
Remembering Propositions: Sachs tested whether P’s remember verbatim wording or meaning better. P’s took a recognition test among four alternatives: (1) verbatim repetition, (2) a change in both surface form and meaning, (3) and (4) changes in surface form only. With no delay, P’s were good at recognizing the exact repetition; after a delay they were accurate only in rejecting the meaning-change alternative. Conclusion: we usually remember only the meaning and forget the exact verbatim string. In special situations, e.g., a joke, the verbatim string is retained: after 2 days some verbatim memory remains, after 5 days very little.
What is remembered better Typical or Atypical events?
Smith and Graesser wanted to find out whether typical or atypical events are remembered better. P’s read stories containing both typical and atypical scripted events and were tested for recall after 2 days, 1 week, and 3 weeks. Results: at first, typical information seemed to be remembered better than atypical information, but typical items are easier to guess from the script. After correcting for guessing, atypical events were remembered better than typical events. Conclusion: we store a copy of the generic script as the main memory and tag the specific atypical details onto it.
What are Hannigan and Reinitz’s results on cause and effect?
Hannigan and Reinitz looked at cause and effect in scripts. They manipulated whether P’s saw the cause of an event or its effect, e.g., a slide of an orange on the floor without the slide of a woman taking an orange from the bottom of the pile, or the woman-taking-the-orange slide without the orange on the floor. Some P’s saw the cause, some the effect. P’s who saw only the effect later mistook a new cause slide for an old one: if you see the orange on the floor, you later "remember" seeing the woman take the orange. If there was an effect, there must have been a cause according to the script, so memory made it up (support for scripts).