SCIENTIFIC RESEARCH
Argument w/ dad: is Ψ just common sense? NO! Why? Bc of the scientific method.
The Scientific Method
- Observation
- Defining a problem
- Proposing a hypothesis
- Gathering evidence/testing the hypothesis
- Publishing results
- Theory building
Hypothesis Testing
- Hypothesis - an educated guess as to why certain things happen. Why did things happen the way they did? Ex: car experiment from chapter one
- Hypothesis must be testable - have to be able to put your hunch to the test
- Interested in answering a specific question.
Operational Definitions
- Ψgists investigate lots of things that are not tangible: depression, aggression
- To measure, use operational definitions - states what you are testing in real world terms
- Ex: depression is measured by a depression scale
Theories
- Theory: summarizes what you learned in your experiment. You can predict the future from a good one.
- Must be falsifiable - have to state how you can prove it wrong. Ex: chicken ex from class
- Very broad, veeeerrrry broad.
Publication
Have to let others see findings, usu in journals. Ex: American Psychologist
RESEARCH METHODS
NATURALISTIC OBSERVATION
- Ψgists observe ppl in the natural setting - researcher goes & watches stuff in the env it usu occurs in. Ex: watching kids on the playground, adults at a bar, monkeys in the jungle
- Info is descriptive, not explanatory. Doesn't tell you why.
Limitations:
- The observer effect: ppl/animals act different when they know someone is watching!
- Observer bias: Believing is seeing. May see the behav you're looking for, even though it wasn't really there!
- Anthropomorphic fallacy - attributing human qualities to animals. Ex: happy doggie. This can lead to incorrect conclusions about the behav of animals.
Recording observations: helps keep bias low by keeping meticulous observation records.
CORRELATIONAL STUDIES
- Look @ relationship between 2+ things.
- Cannot IN ANY WAY imply causation (Cannot! Got that?) Only says that 2 factors are related in a predictable manner. Ex: ice cream & murder
- Correlational study - finds the degree of relationship btw the factors. Once found, can predict from this relationship. Ex: jumping dog & biscuits
Correlation Coefficients
- Expresses the strength & direction of the correlation. Range from +1 to -1.
- Perfect positive (+1): as 1 variable increases, the other matches the increase exactly OR as one decreases, the other matches the decrease exactly.
- Perfect negative (-1): as 1 variable increases, the other matches the decrease exactly (they move in perfect opposition).
- Zero correlation: no relationship. Ex: height & eye color
- Positive correlation: as 1 thing goes up, the other goes up OR as one thing goes down, the other thing goes down. Ex: hr studying & grades. The variables are moving in the same direction.
- Negative correlation: as 1 thing goes up, the other goes down. Ex: # of absences & grade
Remember: The bigger the NUMBER, the stronger the relationship. The sign merely indicates direction. Number = strength, sign = direction. -0.7 is just as strong as +0.7!
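The "number = strength, sign = direction" idea can be sketched in plain Python with Pearson's r (the hours-studied and absences data below are made up for illustration):

```python
# Computing a correlation coefficient (Pearson's r) from scratch.
from math import sqrt

def pearson_r(xs, ys):
    """Correlation coefficient between two lists: -1 to +1."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

hours = [1, 2, 3, 4, 5]           # hours studied (hypothetical)
grades = [60, 70, 75, 85, 90]     # exam grades
absences = [0, 1, 2, 3, 4]        # # of absences (hypothetical)
grades2 = [90, 85, 75, 70, 60]

print(pearson_r(hours, grades))      # near +1: strong positive
print(pearson_r(absences, grades2))  # near -1: just as strong, opposite direction
```

Both coefficients have the same magnitude, so both relationships are equally strong - only the direction differs.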
Scientists graph the relationships they find in their studies. They can be:
- Linear - it forms a straight line when graphed
- Curvilinear - when graphed, it appears as a curve. See page 34.
THE PSYCHOLOGY EXPERIMENT - WHERE CAUSE MEETS EFFECT
- Experiment - tool used to try to confirm or disconfirm a hypothesis.
- An experiment is how you ID cause & effect. This is MEGA IMPORTANT TO KNOW.
- Subjects (participants) - ppl/animals in your experiment. Divided into @ least 2 grps:
- Control grp - don't receive the trmt you're investigating
- Experimental grps - do receive the trmt. Ex: effect of drug on test grades
Variables & Groups
- In experiments, you observe variables - something that can change how the exp turns out.
- Independent variable (IV) - condition the experimenter manipulates. Ex: drug level - 5mg, 10mg.
- What is suspected to be the cause of the behav of interest. Experimental grp exposed to IV.
- Dependent variable (DV) - the results of the exp, what you are measuring
- Ex: give students drug, then test them. Test scores are DV. DV = how much IV affected behav.
- Extraneous variable - outside variables you want to exclude, could alter outcome of experiment.
Experimental Control
Random assignment - helps ensure everything is equal across grps. Ss assigned in such a fashion that ensures they have an = chance of being assigned to any group.
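A minimal sketch of random assignment (subject IDs are made up): shuffle everyone, then split, so every S has an = chance of landing in either grp:

```python
# Random assignment: shuffle the subject pool, then split it in half.
import random

subjects = ["S1", "S2", "S3", "S4", "S5", "S6", "S7", "S8"]
random.shuffle(subjects)  # every ordering equally likely

half = len(subjects) // 2
control = subjects[:half]        # no treatment
experimental = subjects[half:]   # receives the treatment (IV)

print("Control:", control)
print("Experimental:", experimental)
```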
Field Experiments
Study ppl in the real world, in their natural settings. Not as much control as in the lab, though.
Try the Do-It-Yourself example I go over in class!
Evaluating Results
How does the experimenter know if her results mean anything? Did they just occur by chance? Are they real? ARG! What's one to do??? Turn to statistics (No, we do not turn to the Dark Side.) And yes, even Ψ is laden with math. Stats help us determine the odds that our results occurred by chance alone. If those odds are 5/100 or less, we say our results are statistically significant (and there is much rejoicing….) Any Monty Python fans out there??
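One way to sketch the "did it happen by chance?" logic is a permutation test on made-up drug-study scores (this particular test is my illustration, not nec the one used in class): shuffle the group labels thousands of times and see how often chance alone produces a difference as big as the real one.

```python
# Permutation test: is the drug grp's advantage bigger than chance?
import random

experimental = [88, 92, 85, 90, 87, 91]  # got the drug (hypothetical scores)
control = [75, 78, 80, 74, 79, 77]       # got the placebo
observed = sum(experimental) / 6 - sum(control) / 6

pooled = experimental + control
random.seed(0)  # fixed seed so the sketch is reproducible
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)  # pretend grp labels were assigned at random
    diff = sum(pooled[:6]) / 6 - sum(pooled[6:]) / 6
    if diff >= observed:
        extreme += 1

p = extreme / trials
print(f"p = {p:.4f}")  # if p <= .05, statistically significant
```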
Meta-Analysis
Statistical technique that can summarize and combine the results of many experiments into one.
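The core idea can be sketched as an inverse-variance weighted average (a fixed-effect model; the three studies and their effect sizes below are made up): more precise studies count for more in the combined result.

```python
# Fixed-effect meta-analysis sketch: pool effect sizes across studies,
# weighting each by 1/variance (precise studies get bigger weights).
studies = [
    # (effect size, variance) - hypothetical numbers for three studies
    (0.40, 0.04),
    (0.55, 0.02),
    (0.30, 0.08),
]
weights = [1 / var for _, var in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
print(f"Pooled effect size: {pooled:.3f}")
```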
PLACEBO EFFECTS
- Placebo is a fake trmt. Ex: in a drug study, some ppl get the real drug, some get a sugar pill.
- Sometimes you get the placebo effect - behav of person receiving the placebo will change bc they expect it to. Try the Do-It-Yourself example that I give in class!
Controlling Placebo Effects
- Single blind experiment - don't tell the Ss who is getting the real drug & who isn't
- Double blind experiment - neither the experimenter nor the Ss know who is getting the real deal.
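One way a double-blind setup might be coded (a hypothetical sketch): a third party holds the key linking Ss to treatments, while the experimenter & Ss only ever see anonymous vial codes.

```python
# Double-blind sketch: a third party keeps the treatment key sealed.
import random

subjects = ["S1", "S2", "S3", "S4"]
treatments = ["drug", "drug", "placebo", "placebo"]
random.shuffle(treatments)

# The key: held by a third party, not opened until the data are collected
key = dict(zip(subjects, treatments))

# What the experimenter & subjects actually see: just coded vials
vials = {subject: f"vial-{i}" for i, subject in enumerate(subjects, 1)}
print(vials)
```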
The Experimenter Effect
- Can unintentionally infl the outcome of the experiment bc of own beliefs. Can find what you expect to find, which is another case of Believing is Seeing. Ex: evaluation ex from class
- Another prob is the Self-fulfilling prophecy - Experimenter inadvertently conveys to Ss her expectations about the experiment, and the Ss thus conform to these expectations.
CLINICAL METHOD
Case Study
This is a detailed & descriptive study of one individual. Freud used this method.
Ex: detailed study of the life of a serial killer or an exceptional person, like "Rainman" or Stephen Hawking. It's like a natural clinical test - study a person w/ a pre-existing condition bc it would be unethical to create that condition solely for research purposes. For ex, you wouldn't go out and shoot someone in the head just to see how it would affect their brain. Book ex: Phineas Gage. Get to know this example.
SURVEY METHOD
Use carefully worded & selected questions to ensure you're investigating the proper area of interest. Test them out on a small sample before adm to gen pop. This also ensures that your questions aren't offensive or unclear.
Some things can limit the validity of a survey:
- Sample bias - sample not rep of pop. Ex: Presidential poll example from class.
- Social desirability/courtesy bias - ppl respond in way they think the investigator wants, and answer to make themselves look good. Ex: church ex from class
Science and Critical Thinking
Yes, believe it or not, Ψ really does need all the research hoopla. Helps discover the truth.
My-side bias - you only see things that support your side of the argument. Research can help deter us from falling prey to the my-side bias.