Clinical Psychology

Week 4 Powerpoints

Question | Answer
Kazdin (1982) | Symptom substitution and generalization
Symptom Substitution | Treatments that focus on symptom reduction without targeting the "underlying causes" of dysfunction risk merely having the reduced symptoms replaced by non-targeted symptoms
Generalization | Changes in one behavior during treatment relate to changes in other, similar domains of behavior
Empirically testing Symptom Substitution | Operationally define the construct, identify its constituent parts, posit alternative explanations, and review studies to determine whether the evidence supports it
New Model: Response Covariance | Alternative to symptom substitution. Two or more behavioral responses to treatment are correlated; the response to treatment affects other responses, and responses can go in either direction (one behavior improves while another worsens or improves)
Kazdin (1982) Take-Home Message | Treatment produces changes in sets of behaviors that relate to changes (both positive and negative) in related or distinct sets of other behaviors.
Examples of poor decision-making? | False Dilemma and Appeal to Ignorance
False Dilemma | Situation in which only two alternatives are considered
Appeal to Ignorance | Something is true because it has not yet been proven false.
What can lead to poor decision-making? | "Gut" feelings
Correlation | Relation between two variables, with no necessary direction of the relation (variable X leads to variable Y, or vice versa)
Causation | Levels of one variable directly or indirectly influence a second variable's levels
Mediation | Two variables are related in some way, and a third variable explains WHY the relation exists.
Moderation | Two variables are related under some circumstances, and a third variable influences the DIRECTION or MAGNITUDE of this relation. (A numerical sketch of mediation and moderation follows the card list.)
Prevention | Decreasing the likelihood that an outcome occurs
Intervention | Decreasing an outcome that has already occurred
Stages of Ethics | Institutional approval, consent, compensation, debriefing, reporting, publication, and sharing data
Psychologists cannot address research questions strictly using: | Experimental approaches
How do designs vary? | In the ability to draw inferences from "cause" to "effect"; designs also vary by number of participants.
Two factors involved in understanding outcomes of research | Internal Validity & External Validity
Internal Validity | Do the data tell you what you think they tell you?
Impact 1 on Internal Validity | History: what participants went through during the study that had nothing to do with the variable of interest
Impact 2 on Internal Validity | Maturation: participants changed over the course of the study in ways that had nothing to do with the study
Impact 3 on Internal Validity | Testing: completing the measures themselves might change participants' responses
Impact 4 on Internal Validity | Instrumentation: changing measures over the course of the study, especially as participants aged
Impact 5 on Internal Validity | Statistical Regression: how many times participants completed the measures
Impact 6 on Internal Validity | Selection Bias: how participants were recruited or assigned to conditions
Attrition | A form of selection bias caused by many participants dropping out before the end of the study, which biases interpretation of the data.
External Validity | Will the study's outcomes apply to people who were not in the sample? (generalization)
Impact 1 on External Validity | Sample Characteristics: the match between the sample and the rest of the people who are the targets
Impact 2 on External Validity | Stimulus Characteristics/Settings: if the study were conducted somewhere else, would there be similar outcomes?
Impact 3 on External Validity | Reactivity: change in participants' behavior because they are in a study
Impact 4 on External Validity | Timing: would assessments at other periods have the same results?
Case Studies | Detailed descriptions about someone, usually involving a new treatment
Case studies are great for generating what? | Research hypotheses
Case studies cannot rule what out? | Threats to internal validity
Single-case studies can partially rule out? | Internal validity issues
Single-case studies | Have multiple measurements of the outcome (before, during, and after); an exact manipulated variable; need to introduce and then remove the treatment
Single-case studies detect what kind of patterns? | Patterns between the manipulated variable and the measured outcomes (a sketch follows the card list)
Correlational designs are not the same as? | Correlational analyses
Correlational Designs have no: | Experimental manipulation or random assignment
Quasi-Experimental Designs | Researcher-based manipulation of a variable, such as treatment condition
Quasi-Experimental Designs have no: | Random assignment to experimental conditions
Quasi-Experimental Designs cannot rule out: | Extraneous influences (variation among participants in different conditions)
Experimental Designs do have: | Random assignment and manipulation
Experimental Designs allow for: | Unambiguous interpretation of the effects of the manipulation on the outcome
Which study design provides the best protection against threats to internal validity? | Experimental Design
Randomized controlled trials | Treatment studies
Which is a quantitative review? | A Meta-Analysis
A Meta-Analysis is a: | Quantitative review of a group of studies addressing the same topic, summarizing the main findings
Effect Size | The average result across studies, expressed on a common scale (a worked sketch follows the card list)
Probability Sampling | Interest in ensuring that the research sample represents the population
Non-probability Sampling | No specific interest in representing a population
Sample size needs to be large enough to: | Ensure statistical power to detect hypothesized effects
Measurement Reliability | The degree of consistency in measurement
3 Examples of Measurement Reliability | Internal consistency, test-retest reliability, and interrater reliability (a sketch of all three follows the card list)
Internal Consistency | Items on a test relate highly with each other
Test-retest Reliability | Measures are stable over time
Interrater Reliability | Different observers provide similar scores for the same person's behaviors
Measurement Validity | Degree to which the construct of interest is accurately measured
3 Examples of Measurement Validity | Face validity, predictive validity, and convergent validity
Face Validity | Does it look like a measure of the construct?
Predictive Validity | Predicting the development of the construct from childhood to adulthood; does a childhood diagnosis predict an adult diagnosis?
Convergent Validity | Does the measure relate to other measures of the same and similar constructs?
Common Kinds of Measures | Self-report, informant report, trained rater, observation, psychophysiological, archives, and performance-based
Statistical Conclusion Validity | Aspects of data analysis that impact the validity of the conclusions drawn
Threats to Statistical Conclusion Validity | Low statistical power, multiple comparisons, and measurement unreliability (a simulation sketch of the first two follows the card list)
Low Statistical Power | No significant effects because the sample size was too small
Multiple Comparisons | Significant effects due to chance, given the number of tests conducted
Measurement Unreliability | No significant effects because the measures were unreliable
Statistical Significance | p < .05
An effect that is not statistically significant: | Reveals little about how meaningful a finding is
Psychological measure scores | Measures yield scores that do not have a direct relation to the real world
Clinical Significance | The degree to which an effect had a meaningful impact on the "real world" functioning of participants
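
A minimal numerical sketch of the mediation and moderation cards above, using simulated data and ordinary least squares with numpy. All variable names, effect sizes, and sample sizes here are invented for illustration: in mediation, controlling for the third variable M shrinks the X-to-Y coefficient; in moderation, the X-by-Z interaction term changes the size of X's effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

def ols(y, *predictors):
    """Least-squares coefficients (intercept first) for y on the given predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# --- Mediation: X -> M -> Y (M explains WHY X relates to Y) ---
x = rng.normal(size=n)
m = 0.8 * x + rng.normal(size=n)          # X causes the mediator M
y = 0.7 * m + rng.normal(size=n)          # M (not X directly) causes Y

b_total = ols(y, x)[1]                     # X's effect ignoring M
b_direct = ols(y, x, m)[1]                 # X's effect controlling for M
print(f"mediation: total X effect {b_total:.2f} -> direct effect {b_direct:.2f}")

# --- Moderation: Z changes the MAGNITUDE of the X -> Y relation ---
z = rng.integers(0, 2, size=n)             # e.g., two groups / circumstances
y2 = (0.2 + 0.6 * z) * x + rng.normal(size=n)   # X's slope depends on Z

coefs = ols(y2, x, z, x * z)               # interaction term x*z captures moderation
print(f"moderation: X slope when Z=0 is {coefs[1]:.2f}, "
      f"interaction adds {coefs[3]:.2f} when Z=1")
```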
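
The single-case-design cards describe repeated measurement while a treatment is introduced and then removed. A small sketch, with made-up numbers, of comparing phase means in a hypothetical A-B-A-B (baseline-treatment-baseline-treatment) series:

```python
import numpy as np

# Hypothetical weekly symptom scores across an A-B-A-B reversal design
# (A = baseline / treatment withdrawn, B = treatment in place).
phases = {
    "A1": [8, 9, 8, 9],   # baseline
    "B1": [5, 4, 4, 3],   # treatment introduced
    "A2": [7, 8, 8, 7],   # treatment removed
    "B2": [4, 3, 3, 2],   # treatment reintroduced
}

for name, scores in phases.items():
    print(f"{name}: mean symptom score = {np.mean(scores):.1f}")

# Scores dropping in every B phase and rebounding in every A phase is the
# pattern between the manipulated variable and the measured outcome that
# single-case designs look for.
```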
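
For the meta-analysis and effect-size cards, a toy sketch (with invented summary statistics) of computing Cohen's d for each study and averaging the studies on that common scale, weighted by sample size:

```python
import numpy as np

# Invented summary statistics from three hypothetical treatment studies:
# (treatment mean, control mean, pooled standard deviation, total N)
studies = [
    (12.0, 15.0, 6.0, 40),
    (10.5, 14.0, 5.0, 120),
    (11.0, 13.0, 8.0, 60),
]

ds, weights = [], []
for m_tx, m_ctrl, sd_pooled, n in studies:
    d = (m_tx - m_ctrl) / sd_pooled    # Cohen's d: mean difference in SD units
    ds.append(d)
    weights.append(n)                  # simple N-weighting for illustration

mean_d = np.average(ds, weights=weights)
print(f"per-study d values: {[round(d, 2) for d in ds]}")
print(f"sample-size-weighted average effect size: {mean_d:.2f}")
```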
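
A sketch of the three measurement-reliability cards using fabricated scores: Cronbach's alpha for internal consistency, a test-retest correlation, and an interrater correlation (numpy only; all data are simulated).

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Internal consistency: Cronbach's alpha over simulated test items ---
true_score = rng.normal(size=200)
items = np.column_stack([true_score + rng.normal(scale=0.8, size=200)
                         for _ in range(6)])          # 6 items tapping one construct
k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")

# --- Test-retest reliability: same measure at two time points ---
time1 = true_score + rng.normal(scale=0.5, size=200)
time2 = true_score + rng.normal(scale=0.5, size=200)
print(f"test-retest r: {np.corrcoef(time1, time2)[0, 1]:.2f}")

# --- Interrater reliability: two observers rating the same behavior ---
behavior = rng.normal(size=50)
rater_a = behavior + rng.normal(scale=0.3, size=50)
rater_b = behavior + rng.normal(scale=0.3, size=50)
print(f"interrater r: {np.corrcoef(rater_a, rater_b)[0, 1]:.2f}")
```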
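
Finally, a simulation sketch of two threats to statistical conclusion validity from the cards above, low statistical power and multiple comparisons, using the p < .05 criterion and scipy's independent-samples t-test. Group sizes and the effect size are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha_level = 0.05            # the p < .05 significance criterion
true_effect = 0.4             # a real but modest group difference (in SD units)

def hit_rate(n_per_group, effect, n_sims=2_000):
    """Proportion of simulated two-group studies reaching p < .05."""
    hits = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_group)
        b = rng.normal(effect, 1.0, n_per_group)
        if stats.ttest_ind(a, b).pvalue < alpha_level:
            hits += 1
    return hits / n_sims

# Low statistical power: a real effect is often missed with a small sample.
print(f"power with n=20 per group:  {hit_rate(20, true_effect):.2f}")
print(f"power with n=200 per group: {hit_rate(200, true_effect):.2f}")

# Multiple comparisons: even with no real effect, some tests 'hit' by chance.
per_test_fp = hit_rate(100, 0.0)
print(f"false-positive rate per null test: {per_test_fp:.2f}")
print(f"expected chance 'findings' across 20 such tests: {20 * per_test_fp:.1f}")
```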
Created by: roxandsocks