Full metadata record

DC Field | Value | Language
dc.contributor.author | Nagel, Marie-Theres | -
dc.contributor.author | Zlatkin-Troitschanskaia, Olga | -
dc.contributor.author | Fischer, Jennifer | -
dc.description.abstract | In recent decades, the acquisition of information has evolved substantially, fundamentally affecting how students use information; the Internet has become one of their most important sources of information for learning. However, learning with freely accessible online resources also poses challenges, such as vast amounts of partially unstructured, untrustworthy, or biased information. To learn successfully using the Internet, students therefore require specific skills for selecting, processing, and evaluating online information, e.g., distinguishing trustworthy from distorted or biased information and judging its relevance to the topic and task at hand. Despite the central importance of these skills, their assessment in higher education is still an emerging field. In this paper, we present the newly defined theoretical-conceptual framework Critical Online Reasoning (COR). Based on this framework, a corresponding performance assessment, the Critical Online Reasoning Assessment (CORA), was developed and underwent initial validation in accordance with the Standards for Educational and Psychological Testing. We first provide an overview of previous validation results and then expand on them with further analyses of the validity aspects "internal test structure" and "relations with other variables". To investigate the internal test structure, we conducted variance component analyses based on generalizability theory with a sample of 125 students, and we investigated the relations with other variables by means of correlation analyses. The results show the expected correlations with external criteria and confirm that the CORA scores reflect the different test performances of the participants and are not significantly biased by modalities of the assessment. With these new analyses, this study substantially extends previous research by providing comprehensive evidence for the validity of this new performance assessment, which measures the complex, multifaceted construct of critical online reasoning among university students and graduates. CORA results provide unique insights into the interplay between features of online information acquisition and processing, learning environments, and the cognitive and metacognitive requirements for critically reasoning from online information in university students and young professionals. | en_GB
dc.description.sponsorship | Funded by the Deutsche Forschungsgemeinschaft (DFG) – Project number 491381577 | de
dc.rights | CC BY | *
dc.subject.ddc | 370 Erziehung | de_DE
dc.subject.ddc | 370 Education | en_GB
dc.title | Validation of newly developed tasks for the assessment of generic Critical Online Reasoning (COR) of university students and graduates | en_GB
jgu.type.contenttype | Scientific article | de
jgu.type.version | Published version | de
jgu.organisation.department | FB 02 Sozialwiss., Medien u. Sport | de
jgu.organisation.name | Johannes Gutenberg-Universität Mainz | -
jgu.journal.title | Frontiers in Education | de
jgu.publisher.name | Frontiers Media | de
jgu.subject.dfg | Geistes- und Sozialwissenschaften | de
Appears in collections: DFG-491381577-G

Files in This Item:

File | Description | Size | Format
validation_of_newly_developed-20221129164836112.pdf | | 1.71 MB | Adobe PDF