Exam 9: Nonreactive Research and Secondary Analysis
Dr. Headstrong compared two textbooks. He counted the word "he" 80 times in book A and 20 times in book B. He also found that the word "chairman" appeared 40 times in book A but only 10 times in book B. He can conclude
B
Talk about:
- manifest coding
A type of content analysis coding in which a researcher first develops a list of specific words, phrases, or symbols, then finds them in a communication medium.
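A minimal sketch of manifest coding in Python, assuming the coding list is the two words from the textbook-comparison item above ("he" and "chairman") and that each book is available as a plain-text string; the function name and the sample passages are illustrative, not part of the original exercise.

```python
import re
from collections import Counter

# Hypothetical coding list: the specific manifest terms chosen before coding begins.
CODING_LIST = ["he", "chairman"]

def manifest_counts(text, terms=CODING_LIST):
    """Count case-insensitive, whole-word occurrences of each listed term."""
    tokens = re.findall(r"[a-z']+", text.lower())
    freq = Counter(tokens)
    return {term: freq[term] for term in terms}

# Stand-in passages playing the role of book A and book B.
book_a = "He said the chairman would decide, and he agreed."
book_b = "She said the committee would decide."
print(manifest_counts(book_a))  # {'he': 2, 'chairman': 1}
print(manifest_counts(book_b))  # {'he': 0, 'chairman': 0}
```

Because the coding scheme is fixed in advance and relies only on surface features of the text, manifest coding tends to be highly reliable, though it can miss the kind of underlying meaning that latent coding targets.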
What is secondary data analysis? What are its advantages and disadvantages?
Secondary data analysis refers to the process of analyzing existing data that was collected by someone else, usually for a different purpose, such as data gathered by government agencies, research organizations, or other institutions.
Advantages of secondary data analysis include cost-effectiveness, as it eliminates the need to collect new data, saving time and resources. It also allows researchers to access a wide range of data sources that may not have been available otherwise. Additionally, secondary data analysis can provide valuable historical or longitudinal data, allowing for trend analysis and comparisons over time.
However, there are also disadvantages to secondary data analysis. One major drawback is the lack of control over the data collection process, which can lead to potential biases or limitations in the data. There may also be issues with data quality, as the original data may not have been collected with the specific research questions in mind. Additionally, researchers may face challenges in accessing and interpreting the data, as it may be complex or require specialized knowledge.
In conclusion, secondary data analysis can be a valuable tool for researchers, offering cost-effective access to a wide range of data sources. However, it is important to carefully consider the limitations and potential biases associated with using existing data, and to critically evaluate the quality and relevance of the data for the specific research questions at hand.
Explain problems with inferences and validity in content analysis research.
What reliability problems can arise in existing statistics research?
Refer to the following paragraph to answer the questions below.
Sally Simpson conducted a content analysis study of the New York Times newspaper between 1915 and 2005. She first identified relevant articles involving government regulation of business. After finding 20,000 such articles, she systematically sampled articles with a sampling interval of 50. She then coded each sampled article for the subjective meaning it expressed, rating it as pro- or anti-government regulation on a 1-to-10 scale (1 = very anti-regulation, 10 = very pro-regulation).
-How many articles did Ms. Simpson code?
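The paragraph gives everything needed for the count: systematically sampling 20,000 articles at an interval of 50 yields 20,000 / 50 = 400 coded articles. A small sketch in Python, assuming a random start within the first interval (the function name and seed are illustrative assumptions, not part of the study description):

```python
import random

def systematic_sample(population_size, interval, seed=0):
    """Return the article indices picked by a systematic sample with a random start."""
    random.seed(seed)                   # fixed seed so the sketch is reproducible
    start = random.randrange(interval)  # random start somewhere in the first interval
    return list(range(start, population_size, interval))

sampled = systematic_sample(population_size=20_000, interval=50)
print(len(sampled))  # 400 -- every 50th article out of 20,000
```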
Claude DuPere has a list of measures on the French influence in the New Orleans area. He asked you to identify the one that is NOT an unobtrusive measure. Which one is it?
Which of the following is FALSE about secondary data analysis?
Professor Deve Gowda was interested in using Statistics Canada data to examine the trend in Canadian unemployment rates for the last forty years. Yet, he found that unemployment was not recorded accurately in each year. This is an issue of