
Latent Variable Mixture models to track Longitudinal Differentiation Patterns (completed)

In everyday educational work or clinical practice, development is something that is aspired to, monitored, and worked on, to make sure that students learn or that patients improve on their current condition.

What happens if the ruler changes?

Usually, progression addresses the question of how far you have moved along a ruler, and it assumes that as long as the same ruler (i.e., the same measurement instrument) is used, scores can naturally be compared across time. If the ruler were to change, or if what the ruler is trying to measure changed along the way, the common ground for comparison would disappear. Hence, you would risk comparing apples and oranges.

Can we compare apples and oranges? Photo: Colourbox

Yet, in some situations, you are unable to use the same ruler or you have to redefine what you are measuring, and changing from an apple into an orange would itself be a genuine sign of development!

Redefinition might be necessary

For instance, we can expect that progressing from the level of a novice student teacher to the level of a more expert veteran teacher is not simply "growing" more of the same competence, but that it actually requires redefining your understanding of what teaching practice is. Similarly, the reported quality of life of patients might undergo a response shift as they redefine and re-evaluate what quality of life means for them as a person while their disease progresses or after impactful events such as operations.

Aim

In such situations, we need to rethink the extent to which simple growth comparisons remain useful and to provide alternative ways to measure and model such differentiating developmental patterns. This project aims to develop sound statistical procedures for tracking development in such settings.

Tools

From a psychometric perspective, the project will focus on longitudinal measurement equivalence and on the creative use of latent variable mixture models to account for inequivalent progress trajectories and individual differences in development.
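To make the notion of longitudinal measurement equivalence concrete, the sketch below (not project code; the data are simulated and all variable names are invented for illustration) compares a configural model, in which the measurement parameters may differ across occasions, with a scalar model that constrains loadings and intercepts to be equal over time, using the lavaan package in R. Only under such constraints do latent means share a common scale, so that estimated change is a like-for-like comparison rather than apples versus oranges.

```r
## A minimal sketch, assuming a simple setting: three indicators of one
## construct measured at two occasions. All data are simulated and all
## variable names (y1_t1, ..., y3_t2) are placeholders, not project data.
library(lavaan)

## Simulate toy data for illustration only
set.seed(123)
pop_model <- '
  eta_t1 =~ 0.8*y1_t1 + 0.7*y2_t1 + 0.6*y3_t1
  eta_t2 =~ 0.8*y1_t2 + 0.7*y2_t2 + 0.6*y3_t2
  eta_t1 ~~ 1*eta_t1
  eta_t2 ~~ 1*eta_t2
  eta_t1 ~~ 0.5*eta_t2
'
dat <- simulateData(pop_model, sample.nobs = 500)

## Configural model: same factor structure at both occasions, but loadings
## and intercepts free to differ across time ("the ruler may change")
configural <- '
  eta_t1 =~ y1_t1 + y2_t1 + y3_t1
  eta_t2 =~ y1_t2 + y2_t2 + y3_t2
  y1_t1 ~~ y1_t2    # residual covariances between repeated indicators
  y2_t1 ~~ y2_t2
  y3_t1 ~~ y3_t2
'

## Scalar model: equal loadings and intercepts across time ("same ruler"),
## so the freed time-2 latent mean expresses change on a common scale
scalar <- '
  eta_t1 =~ 1*y1_t1 + L2*y2_t1 + L3*y3_t1
  eta_t2 =~ 1*y1_t2 + L2*y2_t2 + L3*y3_t2
  y1_t1 ~ i1*1
  y1_t2 ~ i1*1
  y2_t1 ~ i2*1
  y2_t2 ~ i2*1
  y3_t1 ~ i3*1
  y3_t2 ~ i3*1
  y1_t1 ~~ y1_t2
  y2_t1 ~~ y2_t2
  y3_t1 ~~ y3_t2
  eta_t1 ~ 0*1      # reference occasion: latent mean fixed to zero
  eta_t2 ~ NA*1     # latent change estimated relative to time 1
'

fit_configural <- cfa(configural, data = dat, meanstructure = TRUE)
fit_scalar     <- cfa(scalar,     data = dat, meanstructure = TRUE)
anova(fit_configural, fit_scalar)  # likelihood-ratio test of equivalence
```

If the constrained model fits clearly worse, treating observed change scores as directly comparable across time becomes questionable, which is exactly the kind of situation the project targets.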

We will make R packages and Shiny applets for the methods and statistical models developed in the project available here, along with R code for papers written in the context of the project.
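In the same spirit, a latent variable mixture model can allow for qualitatively different developmental trajectories instead of a single average growth curve. The sketch below is not the project's own code: it simulates toy longitudinal data with two latent classes and fits a two-class growth mixture model with the lcmm package; all names and numbers are illustrative assumptions.

```r
## A minimal sketch, not project code: simulated data with two latent classes
## that follow different trajectories, analysed with a growth mixture model.
library(lcmm)

## Simulate toy longitudinal data: 200 persons, 4 occasions, 2 latent classes
set.seed(1)
n_person  <- 200
occasions <- 0:3
id    <- rep(seq_len(n_person), each = length(occasions))
klass <- rep(sample(1:2, n_person, replace = TRUE, prob = c(0.6, 0.4)),
             each = length(occasions))
time  <- rep(occasions, times = n_person)
score <- ifelse(klass == 1, 10 + 0.2 * time, 8 + 1.5 * time) +  # class means
  rep(rnorm(n_person, sd = 1), each = length(occasions)) +      # person effect
  rnorm(n_person * length(occasions), sd = 1)                   # noise
longdat <- data.frame(id = id, time = time, score = score)

## One-class reference model: a single average growth curve
m1 <- hlme(score ~ time, random = ~ time, subject = "id", data = longdat)

## Two-class growth mixture model: class-specific intercepts and slopes,
## with initial values taken from the one-class solution
m2 <- hlme(score ~ time, mixture = ~ time, random = ~ time,
           subject = "id", ng = 2, data = longdat, B = m1)

summary(m2)    # class-specific growth parameters
postprob(m2)   # class sizes and posterior classification quality
```

Comparing solutions with different numbers of classes (e.g., via BIC) is one way to detect differentiating developmental patterns of the kind the project studies.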

Financing

The project is funded by the Research Council of Norway.

Publications

  • Chen, Jianan; van Laar, Saskia & Braeken, Johan (2023). Who are those random responders on your survey? The case of the TIMSS 2015 student questionnaire. Large-scale Assessments in Education. ISSN 2196-0739. 11(1). doi: 10.1186/s40536-023-00184-6. Full text in Research Archive
  • Van Laar, Saskia & Braeken, Johan (2022). Caught off Base: A Note on the Interpretation of Incremental Fit Indices. Structural Equation Modeling. ISSN 1070-5511. 29(6), p. 935–943. doi: 10.1080/10705511.2022.2050730. Full text in Research Archive
  • Van Laar, Saskia & Braeken, Johan (2022). Random Responders in the TIMSS 2015 Student Questionnaire: A Threat to Validity? Journal of Educational Measurement. ISSN 0022-0655. p. 1–32. doi: 10.1111/jedm.12317. Full text in Research Archive
  • Steinmann, Isa; Sanchez, Daniel; van Laar, Saskia & Braeken, Johan (2021). The impact of inconsistent responders to mixed-worded scales on inferences in international large-scale assessments. Assessment in Education: Principles, Policy & Practice. ISSN 0969-594X. doi: 10.1080/0969594X.2021.2005302. Full text in Research Archive
  • Van Laar, Saskia & Braeken, Johan (2021). Understanding the Comparative Fit Index: It's all about the base! Practical Assessment, Research, and Evaluation (PARE). ISSN 1531-7714. 26(26), p. 1–25. doi: 10.7275/23663996. Full text in Research Archive
  • Steinmann, Isa; Strietholt, Rolf & Braeken, Johan (2021). A Constrained Factor Mixture Analysis Model for Consistent and Inconsistent Respondents to Mixed-Worded Scales. Psychological Methods. ISSN 1082-989X. doi: 10.1037/met0000392. Full text in Research Archive

View all works in Cristin

  • Steinmann, Isa; Braeken, Johan & Strietholt, Rolf (2021). Identifying Inconsistent Respondents to Mixed-Worded Scales in Large-Scale Assessments.
  • Van Laar, Saskia & Braeken, Johan (2021). Random responders in international large-scale assessments in education: A threat to validity?
  • Van Laar, Saskia & Braeken, Johan (2019). Random responders in international large-scale assessments in education: A threat to validity.
  • Braeken, Johan & Van Laar, Saskia (2019). Self-reported personality of Parents & Perceived Temperament of their Infant.
  • Van Laar, Saskia & Braeken, Johan (2019). Decomposing the Comparative Fit Index: Effect of model characteristics on CFI performance.
  • Van Laar, Saskia & Braeken, Johan (2018). Measurement Invariance in Within-Group Designs.
  • Van Laar, Saskia & Braeken, Johan (2018). Model fit: Rules-of-thumb vs Sampling distributions.
  • Van Laar, Saskia & Braeken, Johan (2018). Measurement Invariance in Within-Subjects Designs.

View all works in Cristin


Contact

Any queries regarding the project can be directed to Johan Braeken.


The project is funded by a FRIPRO Young Research Talents grant, awarded in 2017 by the Research Council of Norway.
Anticipated start and end dates: 01-09-2017 to 30-06-2021.

Participants

  • Johan Braeken, University of Oslo
  • Saskia Van Laar, University of Oslo
Detailed list of participants