The 35th Language Testing Research Colloquium
 
Title: Cognitive Diagnostic Assessment: Modeling and Application
Date: Monday, July 1 - Tuesday, July 2, 2013
Instructors: Eunice E. Jang, University of Toronto & Jimmy de la Torre, Rutgers University
 
The demands of educational accountability and global migration have pushed educational and language assessment toward utilization-oriented practices integrated into the contexts where learning takes place. Cognitive diagnostic models (CDM) and their application to assessment (cognitive diagnostic assessment, CDA) are widely researched and practiced as a way to enhance the utility of assessment by providing cognitively diagnostic feedback. Grounded in a deep understanding of theories of learning and language proficiency development, CDA employs a psychometrically rigorous probabilistic modeling approach to make inferences about students' skill mastery profiles. The purposes of the workshop are threefold: (1) to provide an overview of one of the most recent advances in the fields of assessment and measurement; (2) to give participants hands-on experience with various cognitive diagnosis models and their applications to specific assessment contexts; and (3) to engage participants in stimulating discussions about the potential benefits and limitations of CDM in language assessment.
 
The development of cognitive diagnosis modeling (CDM) reflects the need to develop and use assessments that provide information with more diagnostic and pedagogical value than traditional approaches offer. Building on theoretical accounts of the mastery of cognitive skills, CDM takes a psychometric profiling approach to identify individual students' strengths and areas for improvement in a domain of interest. This workshop aims to provide participants with the opportunity to learn the theoretical underpinnings of various CDM approaches and to apply that knowledge to language assessments. The intended audience includes anyone interested in skills or cognitive diagnosis who has some familiarity with item response theory or classical test theory. No previous knowledge of cognitive diagnosis is required.
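To make the idea of a skill mastery profile concrete, the sketch below illustrates one widely used CDM, the DINA ("deterministic input, noisy and" gate) model, in which an examinee is likely to answer an item correctly only if he or she has mastered every skill the Q-matrix assigns to that item, up to slip and guess probabilities. The Q-matrix, item parameters, and responses here are invented for illustration and are not taken from the workshop materials; the code simply scores every possible mastery profile against one examinee's responses under a flat prior.

# A minimal sketch of DINA-model classification (illustrative values only,
# not from the workshop materials). Requires numpy.
import itertools
import numpy as np

# Q-matrix: rows are items, columns are skills (1 = the item requires that skill).
Q = np.array([
    [1, 0, 0],   # item 1: skill 1
    [0, 1, 0],   # item 2: skill 2
    [1, 1, 0],   # item 3: skills 1 and 2
    [0, 1, 1],   # item 4: skills 2 and 3
    [1, 1, 1],   # item 5: all three skills
])
slip  = np.array([0.10, 0.10, 0.20, 0.20, 0.15])  # P(incorrect | all required skills mastered)
guess = np.array([0.20, 0.20, 0.10, 0.10, 0.05])  # P(correct | a required skill is missing)

n_skills = Q.shape[1]
profiles = np.array(list(itertools.product([0, 1], repeat=n_skills)))  # all 2^K mastery profiles

def dina_likelihood(responses, alpha):
    """P(observed responses | mastery profile alpha) under the DINA model."""
    eta = np.all(alpha >= Q, axis=1).astype(int)        # 1 if alpha covers every required skill
    p_correct = (1 - slip) ** eta * guess ** (1 - eta)  # item success probabilities
    return np.prod(np.where(responses == 1, p_correct, 1 - p_correct))

responses = np.array([1, 1, 1, 0, 0])                   # one examinee's item scores

posterior = np.array([dina_likelihood(responses, a) for a in profiles])
posterior /= posterior.sum()                            # flat prior over profiles

print("Most likely mastery profile:", profiles[np.argmax(posterior)])  # here: skills 1 and 2

In operational CDA work the slip and guess parameters are estimated from data and competing models are compared for fit, which is part of what the hands-on sessions and the Arpeggio application listed in the agenda address.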
 
Dr. Eunice Eunhee Jang is an Associate Professor in Applied Psychology and Human Development at the Ontario Institute for Studies in Education, University of Toronto. Her research interests lie at the intersection of new educational and language assessment initiatives and educational accountability issues. Her recent work has examined Grade 6 English language learners' reading skill development patterns using cognitive diagnosis modeling. She has also collaborated with various stakeholders, including policy makers, educators, and researchers, on the multidimensional factors that affect school success in challenging school circumstances. She currently has three active programs of research. The first evaluates the feasibility of a new performance-based classroom assessment system that the Ontario Ministry of Education developed in collaboration with teachers. The second examines how formative diagnostic feedback from assessment can have a positive impact on pedagogy and student motivation. The third, which she leads, is a new computer-based French as a second language (FSL) assessment system to be used primarily for placement and diagnostic profiling of undergraduate students. In addition, in collaboration with international researchers, she examines the effects of interactive diagnostic feedback on students' learning in technology-rich environments that simulate real-life learning contexts. She co-authored the research monograph OECD Reviews on Evaluation and Assessment in Education: Denmark (Shewbridge, Jang, Matthews, & Santiago, 2011). She is a member of the editorial advisory boards of Language Assessment Quarterly, Language Testing, and Educational Measurement: Issues and Practice, and she serves on the Young Learner Assessment Subcommittee, the TOEFL Committee of Examiners, the WBTT Research Advisory Board, and the Supporting ELL Advisory Group.
 
Dr. Jimmy de la Torre is an Associate Professor in the Department of Educational Psychology at Rutgers University. His research interests include latent variable models for educational and psychological measurement and the use of assessment to improve classroom instruction and learning. His recent work in CDM includes the development of various cognitive diagnosis models, the implementation of estimation code for these models, and the development of a general framework for model estimation, test comparison, and Q-matrix validation. He is an ardent advocate of CDM and, to date, has conducted more than a dozen national and international CDM workshops. He received the 2008 Presidential Early Career Award for Scientists and Engineers, given by the White House, and the 2009 Jason Millman Promising Measurement Scholar Award, given by the National Council on Measurement in Education.
 
Agenda:
Monday, July 1, 2013
9:00 a.m. - 10:00 a.m.
10:00 a.m. - 12:15 p.m.   Introduction to CDA and CDM
                          Theoretical backgrounds:
                          1. Paradigm issues in testing
                          2. Research on CDM/CDA in language testing
                          3. Diagnosis modeling frameworks
1:15 p.m. - 3:00 p.m.     Diagnostic assessment development
                          Q-matrix development
                          Q-matrix validation
3:00 p.m. - 5:00 p.m.     Hands-on activities
                          Wrap-up

Tuesday, July 2, 2013
9:00 a.m. - 10:30 a.m.
10:30 a.m. - 12:15 p.m.   Diagnosis modeling
                          Evaluation of model fit
1:15 p.m. - 3:00 p.m.
3:00 p.m. - 4:00 p.m.     Arpeggio application and validation
                          Use of diagnostic information
4:00 p.m. - 5:00 p.m.     Current issues and future directions in CDM/CDA
 
 
 