Evaluating a Research Report

Educational Research:

Competencies for Analysis and Application

11/E

Geoffrey Mills and Lorraine Gay

© 2016, 2012, 2009, 2006 Pearson Education, Inc. All Rights Reserved

Gay & Mills

Educational Research, 11e

© 2016 Pearson Education, Inc. All rights reserved.

22-‹#›

After reading Chapter 22, you should be able to do the following:

Evaluate each of the major sections and subsections of a research report.

For each type of research, evaluate the adequacy of a study representing that type.

General Evaluation Criteria

As a professional, it is important to know how to consume and evaluate research.

Evaluating a research study requires knowledge of each component of the research process.

It builds on knowledge gained through previous chapters.

General Evaluation Criteria

Reports share common flaws:

Failure to report reliability and validity information

Research design weaknesses

Biased selection of participants

Failure to state limitations

Insufficient description of the study

The chapter covers evaluative questions for each research strategy and area.

Evaluating a Research Report

Introduction

Problem

Is there a statement of the problem? Does the problem indicate a particular focus of study?

Is the problem researchable? That is, can it be investigated by collecting and analyzing data?

Is background information on the problem presented?

When necessary, are variables directly or operationally defined?

Does the problem statement indicate the variables of interest and the specific relations among the variables that were investigated?

Did the researchers have the knowledge and skills to carry out the research?

Evaluating a Research Report

Introduction

Review of related literature

Is the review comprehensive?

Is the review well-organized? Does it flow?

Is the review more than a series of abstracts or annotations?

Are all cited references relevant to the problem under investigation? Is the relevance of each reference explained?

Are most of the sources primary?

Are references cited completely and accurately?

Evaluating a Research Report

Introduction

Review of related literature

Does the review conclude with a summary and interpretation of the literature and its implications for the problem under study?

Do the implications form an empirical or theoretical rationale for the hypotheses that follow?

Evaluating a Research Report

Introduction

Hypotheses

Are specific research questions listed or specific hypotheses stated?

Is each hypothesis testable?

Does each hypothesis state an expected relation or difference?

If necessary, are variables directly or operationally defined?

Evaluating a Research Report

Method

Participants

Are the size and major characteristics of the population described?

If the sample was selected, is the method of selecting the sample clearly described?

Does the method of sample selection suggest any limitations or biases in the sample?

Are the size and major characteristics of the sample described?

If the study is quantitative, does the sample size meet the suggested guidelines for the method of research presented?
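The sample-size question above can be checked against rough rules of thumb. A minimal sketch, assuming the rule-of-thumb minimums commonly cited in research-methods texts (the exact figures are guidelines, not fixed requirements):

```python
# Illustrative sketch: rule-of-thumb minimum sample sizes for common
# quantitative methods. The thresholds below are widely cited guidelines,
# not requirements from any one source.
MIN_N = {
    "correlational": 30,       # participants overall
    "causal-comparative": 30,  # per group
    "experimental": 30,        # per group
}

def meets_guideline(method, n):
    """Return True if n meets the rule-of-thumb minimum for the method."""
    return n >= MIN_N[method]

print(meets_guideline("correlational", 25))
print(meets_guideline("experimental", 30))
```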

Evaluating a Research Report

Method

Instruments

Do instruments and their administration comply with IRB standards? Were permissions obtained?

Are the instruments appropriate for measuring the intended variables?

Was the correct type of instrument used for data collection?

Is a rationale given for the selection of the instruments used?

Evaluating a Research Report

Method

Instruments

Are the purpose, content, validity, and reliability of each instrument described?

If appropriate, are subtest reliabilities given?

Is evidence presented to indicate instruments were appropriate for the intended sample?

If an instrument was developed, are procedures that establish reliability and validity shared?

If an instrument was developed, are administration, scoring, and interpretation procedures described?

Does the researcher have the needed skills or experience to construct or administer the instrument?
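One internal-consistency reliability coefficient a reviewer might look for when checking the instrument questions above is Cronbach's alpha. A minimal sketch with assumed item scores (the data are hypothetical):

```python
# Illustrative sketch (assumed item scores): Cronbach's alpha, a common
# internal-consistency reliability coefficient for an instrument.
import statistics

# Rows = respondents, columns = items on the instrument (hypothetical data)
scores = [
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
k = len(scores[0])                     # number of items
items = list(zip(*scores))             # item-wise columns
totals = [sum(row) for row in scores]  # each respondent's total score

# alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
item_var = sum(statistics.variance(col) for col in items)
alpha = (k / (k - 1)) * (1 - item_var / statistics.variance(totals))
print(round(alpha, 2))
```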

Evaluating a Research Report

Method

Design and Procedure

Are the design and procedures appropriate for examining the research question or testing the hypotheses of the study?

Are the procedures described in sufficient detail to permit replication by another researcher?

Do the procedures logically relate to one another?

Were the instruments and procedures applied correctly?

Evaluating a Research Report

Method

Design and Procedure

If a pilot study was conducted, are its execution and results described? Is the effect on the subsequent study explained?

Are control procedures described?

Does the researcher discuss and account for confounding variables that were not controlled?

Evaluating a Research Report

Results

Are appropriate descriptive statistics presented?

Are the tests of significance appropriate, given the hypotheses and design of the study?

If parametric tests were used, is there evidence that the researcher avoided violating the required assumptions for parametric tests?

Was the probability level at which the tests of significance were evaluated specified in advance of the data analyses? Was every hypothesis tested?
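The significance-testing questions above can be made concrete with a pooled two-sample t test in which the probability level is fixed before the analysis. A minimal sketch with assumed group scores:

```python
# Illustrative sketch (assumed data): a pooled two-sample t test, with the
# significance level specified *in advance* of the data analysis, as the
# checklist requires.
import math
import statistics

alpha = 0.05  # set before looking at the data
group1 = [82, 75, 90, 68, 77, 85, 79, 88]
group2 = [70, 72, 65, 80, 68, 74, 69, 71]

n1, n2 = len(group1), len(group2)
m1, m2 = statistics.mean(group1), statistics.mean(group2)
v1, v2 = statistics.variance(group1), statistics.variance(group2)

# Pooled variance and t statistic; df = n1 + n2 - 2 for this design
sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
df = n1 + n2 - 2

# Two-tailed critical value for alpha = .05 at df = 14, from a t table
t_crit = 2.145
print(df, round(t, 2), abs(t) > t_crit)
```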

Evaluating a Research Report

Results

Are the tests of significance interpreted using the appropriate degrees of freedom?

If the study was qualitative, was the inductive logic used to produce the results made explicit?

Are the results clearly described?

Are the tables and figures well organized and easy to understand?

Are the data in each table and figure described in the text of the research report?

Evaluating a Research Report

Discussion (Conclusions & Recommendations)

Is each result discussed in terms of the original hypothesis or topic to which it relates?

Is each result discussed in terms of its agreement or disagreement with previous results obtained by other researchers in other studies?

Are generalizations consistent with the results?

Are theoretical and practical implications of the findings discussed?

Evaluating a Research Report

Discussion (Conclusions & Recommendations)

Are possible effects of uncontrolled variables on the results discussed?

Are recommendations for future action made?

Are the suggestions for future action based on practical significance or on statistical significance only?

Evaluating a Research Report

Abstract or summary

Is the problem stated?

Are the number and type of participants and instruments described?

Is the design identified?

Are the procedures described?

Are the major results and conclusions stated?

Design Specific Evaluation Criteria

Survey Research

Are questionnaire validation procedures described?

Was the questionnaire pilot tested and if so, are the pilot test procedures adequately described?

Are directions to questionnaire respondents clear?

Does each item in the questionnaire relate to an objective of the study?

Does each questionnaire item deal with a single concept?

When necessary, is a point of reference given for questionnaire items?

Are leading questions avoided in the questionnaire?

Are there sufficient alternatives for each questionnaire item?

Design Specific Evaluation Criteria

Survey Research

Does the cover letter explain the purpose and importance of the study, and does it give the potential respondent a good reason for cooperating?

If appropriate, is confidentiality or anonymity of responses assured in the cover letter?

What is the percentage of returns, and how does it affect the study results?
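The return-rate question above is simple arithmetic; a minimal sketch with assumed figures (the 70% threshold is one benchmark sometimes cited, not a fixed standard):

```python
# Illustrative sketch (figures assumed): a survey's return rate compared
# against one commonly cited adequacy benchmark.
sent = 250       # questionnaires distributed
returned = 163   # usable questionnaires returned

return_rate = returned / sent * 100
adequate = return_rate >= 70  # 70% is one benchmark sometimes cited
print(round(return_rate, 1), adequate)
```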

Design Specific Evaluation Criteria

Survey Research

Are follow-up activities to increase the number of returns described?

If the response rate was low, was any attempt made to determine any major differences between respondents and nonrespondents?

Are data analyzed in groups or clusters rather than in a series of single-variable analyses?

Design Specific Evaluation Criteria

Correlational Research

Relationship studies

Were variables carefully selected (i.e., was a shotgun approach avoided)?

Is the rationale for variable selection described?

Are conclusions and recommendations based on values of correlation coefficients corrected for attenuation or restriction in range?

Do the conclusions avoid suggesting causal relations among the variables investigated?
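The attenuation question above refers to the classical correction for unreliable measures. A minimal sketch with assumed values (the coefficients are hypothetical):

```python
# Illustrative sketch (values assumed): correcting an observed correlation
# for attenuation due to measurement unreliability.
import math

r_xy = 0.40  # observed correlation between X and Y
r_xx = 0.70  # reliability of the X measure
r_yy = 0.80  # reliability of the Y measure

# Classical correction for attenuation: the correlation expected if both
# variables were measured without error.
r_corrected = r_xy / math.sqrt(r_xx * r_yy)
print(round(r_corrected, 2))
```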

Design Specific Evaluation Criteria

Correlational Research

Prediction studies

Is a rationale given for selection of predictor variables?

Is the criterion variable well defined?

Was the resulting prediction equation validated with at least one other group?
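The cross-validation question above can be sketched as follows: fit a prediction equation on one group, then apply it unchanged to a second group. All data here are assumed:

```python
# Illustrative cross-validation sketch (all data assumed): a prediction
# equation fit on one group is checked against a second, independent group.
import statistics

# Development group: predictor scores and criterion scores (hypothetical)
x_dev = [50, 55, 60, 65, 70, 75, 80]
y_dev = [2.0, 2.3, 2.4, 2.8, 3.0, 3.2, 3.6]

mx, my = statistics.mean(x_dev), statistics.mean(y_dev)
# Least-squares slope and intercept for y_hat = a + b * x
b = sum((x - mx) * (y - my) for x, y in zip(x_dev, y_dev)) / \
    sum((x - mx) ** 2 for x in x_dev)
a = my - b * mx

# Validation group: apply the unchanged equation to new participants
x_val = [52, 63, 74]
y_val = [2.1, 2.7, 3.1]
predictions = [a + b * x for x in x_val]
errors = [abs(p - y) for p, y in zip(predictions, y_val)]
print([round(p, 2) for p in predictions])
```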

Design Specific Evaluation Criteria

Causal-Comparative Research

Are the characteristics or experiences that differentiate the groups (i.e., the grouping variable) clearly defined or described?

Are critical extraneous variables identified?

Were any control procedures applied to equate the groups on extraneous variables?

Are causal relations discussed with due caution?

Are plausible alternative hypotheses discussed?

Design Specific Evaluation Criteria

Experimental Research

Was an appropriate experimental design selected?

Is a rationale for design selection given?

Are threats to validity associated with the design identified and discussed?

Is the method of group formation described?

Was the experimental group formed in the same way as the control group?

Were groups randomly formed and the use of existing groups avoided?

Design Specific Evaluation Criteria

Experimental Research

Were treatments randomly assigned to groups?

Were critical extraneous variables identified?

Were any control procedures applied to equate groups on extraneous variables?

Were possible reactive arrangements (e.g., the Hawthorne effect) controlled for?

Are the results generalized to the appropriate group?

Design Specific Evaluation Criteria

Single-Subject Research

Are the data time-constrained?

Was a baseline established before moving into the intervention phase?

Was condition or phase length sufficient to represent the behavior within the phase?

Is the design appropriate to the question under study?

Design Specific Evaluation Criteria

Single-Subject Research

If a multiple-baseline design was used, were conditions met to move across baselines?

If a withdrawal design was used, are limitations to this design addressed?

Did the researcher manipulate only one variable at a time?

Is the study replicable?

Design Specific Evaluation Criteria

Qualitative Research (In General)

Does the researcher give a general sense of the focus of study?

Does the researcher state a guiding hypothesis for the investigation?

Is the application of the qualitative method described in detail?

Is the context of the qualitative study described in detail?

Design Specific Evaluation Criteria

Qualitative Research (In General)

Is the purposive sampling procedure described and related to the study focus?

Is each data collection strategy described?

Is the researcher’s role stated (e.g., nonparticipant observer, participant observer, interviewer, etc.)?

Are the research site and the researcher’s entry into it described?

Design Specific Evaluation Criteria

Qualitative Research (In General)

Were the data collection strategies used appropriately, given the purpose of the study?

Were strategies used to strengthen the validity and reliability of the data (e.g., triangulation)?

Is there a description of how any unexpected ethical issues were handled?

Are strategies used to minimize observer bias and observer effect described?

Design Specific Evaluation Criteria

Qualitative Research (In General)

Are the researcher’s reactions and notes differentiated from descriptive field notes?

Are data coding strategies described and examples of coded data given?

Is the inductive logic applied to the data to produce results stated in detail?

Are conclusions supported by data (e.g., are direct quotations from participants used to illustrate points)?
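The coding question above asks for examples of coded data; a minimal sketch of the kind of code tally a qualitative report might share (segments and code labels here are hypothetical):

```python
# Illustrative sketch (hypothetical data): tallying coded field-note
# segments, the kind of coded-data example a qualitative report might give.
from collections import Counter

coded_segments = [
    ("Students helped each other during seatwork.", "peer support"),
    ("Teacher restated the directions twice.", "clarifying instruction"),
    ("Two students compared answers quietly.", "peer support"),
    ("Teacher circulated and checked progress.", "monitoring"),
]
code_counts = Counter(code for _, code in coded_segments)
print(code_counts.most_common())
```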

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Threats to internal validity

Did the researcher effectively deal with problems of history and maturation by documenting historical changes over time?

Did the researcher effectively deal with problems of mortality by using a sample large enough to minimize the effects of attrition?

Was the researcher in the field long enough to minimize observer effects?

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Threats to internal validity

Did the researcher take the time to become familiar and comfortable with participants?

Were interview questions pilot tested?

Were efforts made to ensure intraobserver agreement by training interview teams in coding procedures?

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Threats to internal validity

Were efforts made to cross-check results by conducting interviews with multiple groups?

Did the researcher interview key informants to verify field observations?

Were participants demographically screened to ensure that they were representative of the larger population?

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Threats to internal validity

Were data collected using different media (e.g., audiotape, videotape, etc.) to facilitate cross-validation?

Were participants allowed to evaluate research results before publication?

Are sufficient data presented to support findings and conclusions?

Were variables repeatedly tested to validate results?

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Threats to external validity

Were constructs defined in a way that has meaning outside the setting of the study?

Were both new and adapted instruments pilot tested to ensure that they were appropriate for the study?

Does the researcher fully describe participants’ relevant characteristics, such as socioeconomic structure, gender makeup, level of urbanization and/or acculturation, and pertinent social and cultural history?

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Threats to external validity

Are researcher interaction effects addressed by fully documenting the researcher’s activities in the setting?

Were all observations and interviews conducted in a variety of fully described settings and with multiple trained observers?

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Reliability

Is the researcher’s relationship with the group and setting fully described?

Is all field documentation comprehensive, fully cross-referenced and annotated, and rigorously detailed?

Were observations and interviews documented using multiple means (e.g., written notes and recordings)?

Design Specific Evaluation Criteria

Evaluating Validity and Reliability in Qualitative Studies

Reliability

Was the interviewer’s training documented, and is it described?

Was the construction, planning, and testing of all instruments documented, and are they described?

Are key informants fully described, and is information on groups they represent and their community status included?

Are sampling techniques fully documented and sufficient for the study?

Design Specific Evaluation Criteria

Narrative Research

Does the researcher provide a rationale for the use of narrative research to study the chosen phenomenon?

Is there a rationale for the choice of the individual studied?

Does the researcher describe data collection methods and give particular attention to interviewing?

Does the researcher describe appropriate strategies for analysis and interpretation (e.g., restorying)?

Design Specific Evaluation Criteria

Ethnographic Research

Does the written account (i.e., the ethnography) capture the social, cultural, and economic themes that emerged from the study?

Did the researcher spend a full cycle in the field studying the phenomenon?

Design Specific Evaluation Criteria

Case Study Research

Was the phenomenon under investigation appropriate for a case study research method?

Is there a rationale for the selection of the case (i.e., unit of analysis)?

Does the researcher provide a clear description of the case?

Design Specific Evaluation Criteria

Case Study Research

Was an appropriate analysis of the case, or cross-site analysis, conducted?

Is there a clear link between the data presented in the case study and the themes that are reported?

Design Specific Evaluation Criteria

Mixed Methods Research

Did the study use at least one quantitative and at least one qualitative research method?

Did the study investigate both quantitative and qualitative research questions?

Is a rationale for using a mixed methods research design provided?

Is the type of mixed methods research design stated?

Design Specific Evaluation Criteria

Mixed Methods Research

Is the priority given to quantitative and qualitative data collection and the sequence of their use described?

Are qualitative and quantitative data collection techniques clearly identified?

Are the data analysis techniques appropriate for the type of mixed methods design?

Was the study feasible given the amount of data to be collected and concomitant issues of resources, time, and expertise?

Design Specific Evaluation Criteria

Action Research

Does the area of focus involve teaching and learning in the researcher’s own practice?

Was the area of focus within the researcher’s locus of control?

Is the area of focus something the researcher was passionate about?

Design Specific Evaluation Criteria

Action Research

Is the area of focus something the researcher wanted to change or improve upon?

Does the researcher state questions that were answerable given the researcher’s expertise, time, and resources?

Does the researcher provide an action plan detailing the effect of the research findings on practice?
