- Singmann, Henrik;
- Heck, Daniel W;
- Barth, Marius;
- Erdfelder, Edgar;
- Arnold, Nina R;
- Aust, Frederik;
- Calanchini, Jimmy;
- Gümüsdagli, Fabian E;
- Horn, Sebastian S;
- Kellen, David;
- Klauer, Karl C;
- Matzke, Dora;
- Meissner, Franziska;
- Michalkiewicz, Martha;
- Schaper, Marie Luisa;
- Stahl, Christoph;
- Kuhlmann, Beatrice G;
- Groß, Julia
Researchers have become increasingly aware that data-analysis decisions affect results. Here, we examine this issue systematically for multinomial processing tree (MPT) models, a popular class of cognitive models for categorical data. Specifically, we examine the robustness of MPT model parameter estimates under two important decisions: the level of data aggregation (complete-pooling, no-pooling, or partial-pooling) and the statistical framework (frequentist or Bayesian). These decisions span a multiverse of estimation methods. We synthesized the data from 13,956 participants (164 published data sets) with a meta-analytic strategy and analyzed the magnitude of divergence between estimation methods for the parameters of nine popular MPT models in psychology (e.g., process-dissociation, source monitoring). We further examined moderators as potential sources of divergence. We found that the absolute divergence between estimation methods was small on average (<.04, with MPT parameters ranging between 0 and 1); in some cases, however, divergence amounted to nearly the maximum possible range (.97). Divergence was partly explained by a few moderators (e.g., the specific MPT model parameter, uncertainty in parameter estimation), but not by other plausible candidate moderators (e.g., parameter trade-offs, parameter correlations) or their interactions. Partial-pooling methods showed the smallest divergence within and across levels of pooling and thus seem to be an appropriate default method. Using MPT models as an example, we show how transparency and robustness can be increased in the field of cognitive modeling. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
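To make the three levels of data aggregation concrete, the following is a minimal, hypothetical sketch (not the paper's analysis pipeline): it estimates a single probability parameter from simulated per-participant binomial data under complete pooling, no pooling, and a crude moment-based partial pooling. The simulated data, sample sizes, and shrinkage formula are all illustrative assumptions; hierarchical MPT estimation in practice uses dedicated software (e.g., latent-trait or beta-MPT models).

```python
# Illustrative only: three pooling levels for one probability parameter
# theta in (0, 1), using simulated per-participant binomial data.
import random

random.seed(1)

# Simulate 20 participants, each with a true theta near 0.7,
# observed over 30 Bernoulli trials (all values are assumptions).
n_trials = 30
true_thetas = [min(max(random.gauss(0.7, 0.1), 0.01), 0.99) for _ in range(20)]
successes = [sum(random.random() < t for _ in range(n_trials)) for t in true_thetas]

# Complete pooling: one estimate from the aggregated counts.
complete = sum(successes) / (len(successes) * n_trials)

# No pooling: independent per-participant estimates.
no_pool = [k / n_trials for k in successes]

# Partial pooling: shrink individual estimates toward the group mean.
# This moment-based weighting is a crude stand-in for full hierarchical
# Bayesian estimation, shown only to convey the idea of shrinkage.
var_obs = sum((p - complete) ** 2 for p in no_pool) / (len(no_pool) - 1)
var_binom = complete * (1 - complete) / n_trials   # within-person sampling variance
var_between = max(var_obs - var_binom, 1e-6)       # between-person variance
weight = var_between / (var_between + var_binom)   # shrinkage factor in (0, 1)
partial = [complete + weight * (p - complete) for p in no_pool]

print(f"complete pooling:      {complete:.3f}")
print(f"no pooling range:      {min(no_pool):.3f} to {max(no_pool):.3f}")
print(f"partial pooling range: {min(partial):.3f} to {max(partial):.3f}")
```

Because the shrinkage weight lies between 0 and 1, the partial-pooling estimates always fall between the no-pooling estimates and the complete-pooling value, which is one intuition for why partial pooling tends to diverge least from both other levels.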