Please use this identifier to cite or link to this item: http://doi.org/10.25358/openscience-8442
Authors: Nagel, Marie-Theres
Zlatkin-Troitschanskaia, Olga
Fischer, Jennifer
Title: Validation of newly developed tasks for the assessment of generic Critical Online Reasoning (COR) of university students and graduates
Online publication date: 2-Dec-2022
Year of first publication: 2022
Language: English
Abstract: In recent decades, the acquisition of information has evolved substantially and fundamentally affects students’ use of information, so that the Internet has become one of the most important sources of information for learning. However, learning with freely accessible online resources also poses challenges, such as vast amounts of partially unstructured, untrustworthy, or biased information. To learn successfully using the Internet, students therefore require specific skills for selecting, processing, and evaluating online information, e.g., to distinguish trustworthy from distorted or biased information and to judge its relevance with regard to the topic and task at hand. Despite the central importance of these skills, their assessment in higher education is still an emerging field. In this paper, we present the newly defined theoretical-conceptual framework Critical Online Reasoning (COR). Based on this framework, a corresponding performance assessment, the Critical Online Reasoning Assessment (CORA), was newly developed and underwent the first steps of validation in accordance with the Standards for Educational and Psychological Testing. We first provide an overview of the previous validation results and then expand them with further analyses of the validity aspects “internal test structure” and “relations with other variables”. To investigate the internal test structure, we conducted variance component analyses based on generalizability theory with a sample of 125 students, and we investigated the relations with other variables by means of correlation analyses. The results show the expected correlations with external criteria and confirm that the CORA scores reflect the different test performances of the participants and are not significantly biased by modalities of the assessment.
With these new analyses, this study substantially contributes to previous research by providing comprehensive evidence for the validity of this new performance assessment that validly assesses the complex multifaceted construct of critical online reasoning among university students and graduates. CORA results provide unique insights into the interplay between features of online information acquisition and processing, learning environments, and the cognitive and metacognitive requirements for critically reasoning from online information in university students and young professionals.
DDC: 370 Education
Institution: Johannes Gutenberg-Universität Mainz
Department: FB 02 Sozialwiss., Medien u. Sport
Place: Mainz
ROR: https://ror.org/023b0x485
DOI: http://doi.org/10.25358/openscience-8442
Version: Published version
Publication type: Journal article
Document type specification: Scientific article
License: CC BY
Information on rights of use: https://creativecommons.org/licenses/by/4.0/
Journal: Frontiers in Education
Volume: 7
Pages or article number: 914857
Publisher: Frontiers Media
Publisher place: Lausanne
Issue date: 2022
ISSN: 2504-284X
Publisher URL: https://www.frontiersin.org/articles/10.3389/feduc.2022.914857/full
Publisher DOI: 10.3389/feduc.2022.914857
Appears in collections: DFG-491381577-G

Files in This Item:
  File: validation_of_newly_developed-20221129164836112.pdf
  Size: 1.71 MB
  Format: Adobe PDF