Deck 9: Nonreactive Research and Secondary Analysis

1
What reliability problems can arise in existing statistics research?
In research that relies on existing statistics, the researcher inherits whatever reliability problems affected the original data collection, and several such problems can compromise the validity and credibility of the findings. Here are some of the key issues:

1. Sampling Errors: If the sample is not representative of the population, the results may not be generalizable. Poor sampling techniques or a small sample size can lead to sampling bias and errors.

2. Measurement Errors: Inaccurate data collection methods or tools can lead to measurement errors. If the instruments used to measure variables are not reliable or valid, the results will be questionable.

3. Nonresponse Bias: When a significant portion of the respondents does not participate or drops out of the study, the results may be skewed. This is particularly problematic if the nonrespondents differ in important ways from those who do participate.

4. Data Processing Errors: Mistakes made during data entry, coding, or analysis can lead to incorrect results. This includes human error as well as software or computational errors.

5. Misuse of Statistical Techniques: Inappropriate or incorrect application of statistical methods can lead to misleading conclusions. This includes the misuse of statistical tests, incorrect model specifications, and overreliance on p-values without considering effect sizes or confidence intervals.

6. Researcher Bias: Researchers may unintentionally influence the results due to their expectations or preferences. This can occur through the selective reporting of results, data dredging (looking for patterns in the data that support a hypothesis), or confirmation bias.

7. Publication Bias: Studies with significant or positive results are more likely to be published than those with nonsignificant or negative results. This can skew the literature and give a false impression of the evidence.

8. Lack of Replication: Many studies are not replicated, which means that their findings may not be as reliable as initially thought. Replication is essential for confirming the validity of research findings.

9. Confounding Variables: Failure to control for confounding variables can lead to spurious associations between the variables of interest. This can result in incorrect inferences about causal relationships.

10. Longitudinal Data Challenges: In longitudinal studies, where data is collected over a period of time, issues such as attrition (loss of participants over time) and changes in measurement tools or procedures can affect reliability.

11. Ethical Issues: Violations of ethical standards, such as not obtaining informed consent or not ensuring confidentiality, can lead to data that is not reliable or that cannot be ethically used.

Addressing these reliability problems requires careful research design, thorough data collection and processing procedures, appropriate statistical analysis, and a commitment to ethical research practices. Transparency in reporting methods and findings, as well as peer review and replication studies, are also crucial for ensuring the reliability of statistical research.
2
What is secondary data analysis? What are its advantages and disadvantages?
Secondary data analysis refers to the process of analyzing existing data that has been collected by someone else for a different purpose. This type of analysis involves using data that has already been gathered and analyzed by others, such as government agencies, research organizations, or other institutions.

Advantages of secondary data analysis include cost-effectiveness, as it eliminates the need to collect new data, saving time and resources. It also allows researchers to access a wide range of data sources that may not have been available otherwise. Additionally, secondary data analysis can provide valuable historical or longitudinal data, allowing for trend analysis and comparisons over time.

However, there are also disadvantages to secondary data analysis. One major drawback is the lack of control over the data collection process, which can lead to potential biases or limitations in the data. There may also be issues with data quality, as the original data may not have been collected with the specific research questions in mind. Additionally, researchers may face challenges in accessing and interpreting the data, as it may be complex or require specialized knowledge.

In conclusion, secondary data analysis can be a valuable tool for researchers, offering cost-effective access to a wide range of data sources. However, it is important to carefully consider the limitations and potential biases associated with using existing data, and to critically evaluate the quality and relevance of the data for the specific research questions at hand.
3
Describe the aggregation problem in existing statistics.
The aggregation problem in existing statistics refers to the challenge of accurately representing a large and diverse set of data in a way that is meaningful and useful for analysis. This problem arises when trying to summarize or combine data from different sources, or when attempting to draw conclusions from data that is inherently complex and multifaceted. In research using existing statistics, it typically appears as a mismatch between the level at which the data are aggregated (e.g., counties, provinces, or years) and the researcher's intended unit of analysis; drawing individual-level conclusions from such aggregate data risks the ecological fallacy.

One aspect of the aggregation problem is the risk of oversimplification. When aggregating data, there is a tendency to lose important nuances and variations that exist within the original dataset. This can lead to misleading or incomplete conclusions, as the aggregated data may not accurately reflect the true nature of the underlying information.

Another aspect of the aggregation problem is the potential for bias. Aggregating data from different sources or subgroups can introduce biases that skew the overall results. For example, if certain subgroups are overrepresented or underrepresented in the aggregated data, the conclusions drawn from that data may not be representative of the entire population.

Additionally, the aggregation problem can also lead to issues of data quality and reliability. When combining data from multiple sources, there is a risk of including inaccurate or incomplete information, which can compromise the integrity of the aggregated dataset.

Overall, the aggregation problem in existing statistics highlights the need for careful consideration and thoughtful analysis when summarizing and combining data. It is important to be aware of the limitations and potential biases inherent in aggregated data, and to approach the interpretation of such data with caution and skepticism.
4
Explain problems with inferences and validity in content analysis research.
5
Refer to the following paragraph to answer the questions below.
Sally Simpson conducted a content analysis study of the New York Times newspaper between 1915 and 2005. She first identified relevant articles involving government regulation of business. After finding 20,000 such articles, she systematically sampled articles with a sampling interval of 50. She then coded each sampled article based on the subjective meaning it expressed, as pro- or anti-government regulation using a 1 to 10 scale (1 = very anti-regulation, 10 = very pro-regulation).

-How many articles did Ms. Simpson code?

A) 20,000
B) 4,000
C) 1,000
D) 400
E) insufficient information is given
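The sampling arithmetic behind this item can be checked in a couple of lines (a minimal sketch; the numbers come from the paragraph above):

```python
# Systematic sampling: with a sampling interval of k,
# every k-th element of the population is selected.
population_size = 20_000  # relevant articles found
sampling_interval = 50    # every 50th article is coded

sample_size = population_size // sampling_interval
print(sample_size)  # 400 -> option D
```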
6
Which of the following is FALSE about secondary data analysis?

A) A gap may exist between a researcher's conceptualization of a variable and how it is measured in available data.
B) Locating data with specific variables of interest can be time consuming, and sometimes the original researcher may not make the data available.
C) Information about how data was collected may be insufficient to determine whether there is bias.
D) It is very expensive compared to equivalent primary data collection.
E) It facilitates replication.
7
Professor Deve Gowda wanted to use Statistics Canada data to examine the trend in Canadian unemployment rates over the last forty years. However, he found that unemployment was not recorded accurately in each year. This is an issue of

A) reliability.
B) validity.
C) ecological fallacy.
D) ideal types.
E) verstehen.
8
Claude DuPere has a list of measures on the French influence in the New Orleans area. He asked you to identify the one that is NOT an unobtrusive measure. Which one is it?

A) the wear on novels in the New Orleans Public Library written in French
B) walking down a street in New Orleans and noticing that most of the signs in stores in a neighborhood are in French or French-Cajun
C) a list of votes supporting bills on bilingual education in the Louisiana state legislature with the area represented by each legislator noted on the list
D) a box of 300 letters written by people living in New Orleans to relatives living in French-speaking areas outside the state (e.g., Quebec) between 1980 and 1985
E) a survey using a three-page questionnaire partly written in French that was distributed to residents of a neighborhood
9
Dr. Headstrong compared two textbooks. He counted the word "he" 80 times in book A and 20 times in book B. He also found that the word "chairman" appeared 40 times in book A but only 10 times in book B. He can conclude

A) book A is four-times more sexist than book B.
B) the words "he" and "chairman" appeared four times more often in book A than in book B.
C) students using book A will become more sexist than those using book B.
D) a teacher who chooses book A is more sexist than one who chooses book B.
E) C and D.
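The counts themselves support only the frequency comparison, not any causal or attitudinal conclusion. A quick check of the ratios (a minimal sketch using the numbers from the item):

```python
# Word counts reported for each textbook.
counts = {
    "he":       {"A": 80, "B": 20},
    "chairman": {"A": 40, "B": 10},
}

# Each word appears 4 times as often in book A as in book B.
for word, c in counts.items():
    print(word, c["A"] / c["B"])  # he 4.0 / chairman 4.0

# Frequency ratios alone say nothing about how sexist a book "is",
# about effects on students, or about teachers who choose the book.
```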
10
Talk about:
-accretion measures
11
Talk about:
-coding
12
Talk about:
-coding system
13
Talk about:
-content analysis
14
Talk about:
-erosion measures
15
Talk about:
-General Social Survey (GSS)
16
Talk about:
-latent coding
17
Talk about:
-manifest coding
18
Talk about:
-nonreactive
19
Talk about:
-recording sheet
20
Talk about:
-Statistical Abstract of the United States
21
Talk about:
-structured observation
22
Talk about:
-text
23
Talk about:
-unobtrusive measure