
The reliability of inclusion criteria in meta-analyses in education: a review of studies

Marta Pellegrini

Abstract


Reliability of meta-analysis standards in education: an overview of studies.

Research syntheses, such as meta-analyses and systematic reviews, are methods for combining the results of different primary studies on a given theme. These methods became widespread in educational research in the early 1980s, with the aim of providing more reliable information for teaching practice. As with primary studies, however, not all reviews are reliable enough to inform practice about programs and strategies that are effective for learning. Although some systematic reviews and meta-analyses have weaknesses, it is possible to identify which procedures and standards are more valid and reliable for carrying out meta-analyses. This article reviews and examines studies that have evaluated the methodological factors affecting effect sizes in meta-analyses of educational practices. The studies reviewed show that the following methodological factors affect effect sizes: publication bias, sample size, study design, outcome measures, and intervention duration. The conclusion specifies which inclusion criteria, based on the review results, are more reliable for carrying out meta-analyses that aim to inform educational practice.


Keywords


Effect size; Inclusion criteria; Meta-analysis; Methodological factors; Reliability



DOI: https://doi.org/10.7358/ecps-2017-016-pell



Journal of Educational, Cultural and Psychological Studies (ECPS)
Registered by Tribunale di Milano (19/05/2010 n. 278)
Online ISSN 2037-7924 - Print ISSN 2037-7932

Research Laboratory on Didactics and Evaluation - Department of Education - "Roma Tre" University


Executive Editor: Gaetano Domenici - Managing Editor: Valeria Biasci
Editorial Board: Giuditta Alessandrini - Eleftheria Argyropoulou - Massimo Baldacci - Joao Barroso - Richard Bates - Christofer Bezzina - Paolo Bonaiuto - Lucia Boncori - Pietro Boscolo - Sara Bubb  - Carlo Felice Casula - Jean-Émile Charlier - Lucia Chiappetta Cajola - Carmela Covato - Jean-Louis Derouet - Peter Early - Franco Frabboni - Constance Katz - James Levin - Pietro Lucisano  - Roberto Maragliano - Romuald Normand - Michael Osborne - Donatella Palomba - Michele Pellerey - Clotilde Pontecorvo - Vitaly V. Rubtzov - Jaap Scheerens - Noah W. Sobe - Francesco Susi - Giuseppe Spadafora - Pat Thomson
Editorial Staff: Fabio Alivernini - Guido Benvenuto - Anna Maria Ciraci - Massimiliano Fiorucci - Luca Mallia - Massimo Margottini - Giovanni Moretti - Carla Roverselli
Editorial Secretary: Nazarena Patrizi