
Definizione di una procedura di codifica delle domande aperte basata sui modelli delle indagini internazionali

Giorgio Asquini

Abstract


Definition of a Coding Procedure of Open-ended Questions Based on the Models of International Studies

The paper presents an in-depth study within the «Problem-solving and geographical skills» project funded by Sapienza University of Rome in 2011. The main research instrument consists of open-ended items, which served as the basis for designing a coding procedure for student responses aimed at maximising control over coding and coders and, therefore, reliability. Bearing in mind that, when analysing open answers, the margins for the assessor's subjective interpretation are certainly greater than with classic structured items, the adoption of strict coding and coding-control procedures makes it possible to steer this type of semi-structured question toward objectivity, thereby significantly reducing errors of interpretation. The main reference is the procedure used in OECD-PISA for open-ended questions, together with the procedure for assessing written tests established in the IEA study on «Written Composition». The aim is to improve the quality of the study's dataset with the least possible burden on resources. The results confirmed the effectiveness of the procedure in terms of coding reliability, with a low estimated error level. It was also possible to provide timely feedback to each coder, thereby enabling an improvement in coding ability.
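As an illustration only (not the procedure described in the paper), coder reliability of the kind reported above is commonly summarised through observed agreement and Cohen's kappa. The sketch below, in Python and with invented data, shows how the two indices could be computed for a pair of coders who assigned category codes to the same ten responses.

# Illustration only: hypothetical category codes assigned by two coders
# to the same ten open-ended responses.
coder_a = [2, 1, 0, 2, 1, 1, 0, 2, 2, 1]
coder_b = [2, 1, 0, 1, 1, 1, 0, 2, 2, 0]

def observed_agreement(a, b):
    # Share of responses receiving the same code from both coders.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    # Agreement corrected for the agreement expected by chance,
    # estimated from each coder's marginal distribution of codes.
    n = len(a)
    p_observed = observed_agreement(a, b)
    p_expected = sum((a.count(c) / n) * (b.count(c) / n) for c in set(a) | set(b))
    return (p_observed - p_expected) / (1 - p_expected)

print(f"Observed agreement: {observed_agreement(coder_a, coder_b):.2f}")  # 0.80
print(f"Cohen's kappa:      {cohens_kappa(coder_a, coder_b):.2f}")        # 0.70

Kappa values above roughly 0.60-0.70 are usually read as acceptable agreement; the specific indices and thresholds used in the project are those reported in the full paper.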


Keywords


Assessment, Coding, Open-ended items, Problem-solving, Reliability


References


Ackerman, T. A., & Smith, P. L. (1988). A comparison of the information provided by essay, multiple-choice, and free response writing tests. Applied Psychological Measurement, 12(2), 117-128.

Albertoni, D. (1988). L’addestramento dei valutatori delle prove IEA/IPS. Ricerca Educativa, 5(2-3), 95-112.

Anderson, P., & Morgan, G. (2008). Developing tests and questionnaires for a national assessment of educational achievement. Washington, DC: The World Bank. http://siteresources.worldbank.org/EDUCATION/Resources/278200-1099079877269/547664-1222888444288/National_assessment_Vol2.pdf (consulted 25/08/2014).

Asquini, G. (2006). La capacità di problem solving dei quindicenni. In M. T. Siniscalco (a cura di), Il livello di competenza dei quindicenni italiani in matematica, lettura, scienze e problem solving. Rapporto nazionale di PISA 2003. Roma: Armando.

Asquini, G., & Corsini, C. (2010). L’evoluzione dei risultati in lettura nelle diverse edizioni di PISA. In AA.VV., PISA 2006. Approfondimenti tematici e metodologici (pp. 177-201). Roma: Armando.

Bennett, R., & Ward, W. (Eds.). (1993). Construction versus choice in cognitive measurement: Issues in constructed response, performance testing, and portfolio assessment. Hillsdale, NJ: Lawrence Erlbaum Associates Publishers.

Benvenuto, G. (1993). L’affidabilità delle valutazioni. In AA.VV., La produzione scritta nel biennio superiore (pp. 27-34). Campobasso: IRRSAE Molise.

Bolasco, S. (2010). TaLTaC2.10. Sviluppi, esperienze ed elementi essenziali di analisi automatica dei testi. Milano: LED.

Domenici, G. (1996). Gli strumenti della valutazione (2a ed.). Napoli: Tecnodid.

INVALSI (2010). Rilevazione degli apprendimenti – SNV. Prime analisi. INVALSI. http://www.invalsi.it/download/rapporti/snv2010/Rapporto_SNV_09_10.pdf (consulted 25/08/2014).

INVALSI (2014). Rilevazioni nazionali degli apprendimenti 2013-14. Rapporto risultati. INVALSI. http://www.invalsi.it/areaprove/rapporti/Rapporto_Rilevazioni_Nazionali_2014.pdf (consulted 25/08/2014).

Jonassen, D. H. (2011). Learning to solve problems: A handbook for designing problem-solving learning environments. New York: Routledge.

McGaw, B. (2008). The role of the OECD in international comparative studies of achievement. Assessment in Education: Principles, Policy & Practice, 12(2), 223-243.

Morris, A. (2011). Student standardised testing: Current practices in OECD countries and a literature review. OECD Education Working Papers, 65. Paris: OECD.

OECD (1999). Measuring student knowledge and skills. A new framework for assessment. Paris: OECD.

OECD (2003). The PISA 2003 Assessment framework. Paris: OECD.

OECD (2004). Problem solving for tomorrow’s world. Paris: OECD.

OECD (2012). PISA 2009 Technical report. Paris: OECD.

OECD (2013). PISA 2012 Assessment and analytical framework. Paris: OECD.

Polya, G. (1945). How to solve it. Princeton, NJ: Princeton University Press.

Rychen, D., & Salganik, L. H. (2000). Definition and selection of key competencies (DeSeCo). Paris: OECD.

Sabella, M. (2014). Primi della classe si nasce? Indagine longitudinale sul Summer Learning Loss nella scuola secondaria di I grado. Roma: Nuova Cultura.

Toch, T. (2006). Margins of error: The education testing industry in the No Child Left Behind era. Education Sector Reports. http://www.educationsector.org/sites/default/files/publications/Margins_of_Error.pdf (consulted 25/08/2014).

Ward, W. C., Dupree, D., & Carlson, S. B. (1987). A comparison of free-response and multiple-choice questions in the assessment of reading comprehension (RR-87-20). Princeton, NJ: Educational Testing Service.

Weber, K., Radu, I., Mueller, M., Powell, A., & Maher, C. (2010). Expanding participation in problem solving in a diverse middle school mathematics classroom. Mathematics Education Research Journal, 22(1), 91-118.




DOI: https://doi.org/10.7358/ecps-2014-010-asqu

Copyright (©) 2014 Journal of Educational, Cultural and Psychological Studies (ECPS Journal) – Editorial format and Graphical layout: copyright (©) LED Edizioni Universitarie



 


Journal of Educational, Cultural and Psychological Studies (ECPS)
Registered by Tribunale di Milano (19/05/2010 n. 278)
Online ISSN 2037-7924 - Print ISSN 2037-7932

Research Laboratory on Didactics and Evaluation - Department of Education - "Roma Tre" University


Executive Editor: Gaetano Domenici - Associate Executive Editor & Managing Editor: Valeria Biasci
Editorial Board: Eleftheria Argyropoulou - Massimo Baldacci - Joao Barroso - Richard Bates - Christofer Bezzina - Paolo Bonaiuto - Lucia Boncori - Pietro Boscolo - Sara Bubb - Carlo Felice Casula - Jean-Émile Charlier - Lucia Chiappetta Cajola - Carmela Covato - Jean-Louis Derouet - Peter Early - Franco Frabboni - Constance Katz - James Levin - Pietro Lucisano - Roberto Maragliano - Romuald Normand - Michael Osborne - Donatella Palomba - Michele Pellerey - Clotilde Pontecorvo - Vitaly V. Rubtzov - Jaap Scheerens - Noah W. Sobe - Francesco Susi - Giuseppe Spadafora - Pat Thomson
Editorial Staff: Fabio Alivernini - Guido Benvenuto - Anna Maria Ciraci - Massimiliano Fiorucci - Luca Mallia - Massimo Margottini - Giovanni Moretti - Carla Roverselli
Editorial Secretary: Nazarena Patrizi

