What we often forget when using statistics in our daily research practice is that statistics is itself a rapidly developing research field. For instance, in recent years Bayes Factors have been proposed as an alternative to p-values, and their adoption in psychological papers has increased quickly, with many scholars claiming that they should become the standard approach to statistical inference. However, while researchers are quick to adopt Bayes Factors, there are problems and challenges that often go unnoticed. Furthermore, many Bayesian statisticians argue against using both p-values and Bayes Factors, and suggest yet other alternatives.
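To make the contrast with p-values concrete, here is a minimal sketch of what a Bayes Factor is: the ratio of how well two hypotheses predict the observed data. The example below (in Python; the function name `binomial_bf01` and the specific hypotheses are our own illustrative choices, not part of the tutorial) compares H0: theta = 0.5 against H1: theta ~ Uniform(0, 1) for binomial data, where both marginal likelihoods happen to have closed forms.

```python
from math import comb

def binomial_bf01(k, n):
    """Bayes Factor BF01 for k successes in n trials:
    H0: theta = 0.5  vs  H1: theta ~ Uniform(0, 1).
    BF01 > 1 favours H0; BF01 < 1 favours H1."""
    m0 = comb(n, k) * 0.5 ** n  # marginal likelihood under the point null
    m1 = 1.0 / (n + 1)          # under a Uniform(0,1) prior the integral is exactly 1/(n+1)
    return m0 / m1

# e.g. 60 successes in 100 trials
bf01 = binomial_bf01(60, 100)
```

Unlike a p-value, this quantity can express evidence *for* the null: with 50 successes in 100 trials, BF01 is well above 1.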
In statistics classes, students often learn a zoo of different models without grasping how they all link together. In consequence, they often fit several models, one for each research question. Hierarchical, multi-level, or mixed-effects models constitute a unifying framework that makes it possible to address many questions in a single model, including questions that are not easily answered with standard ANOVAs (e.g. trial-by-trial effects). However, this increased flexibility comes at the cost of more complex computation. We will discuss the logic, benefits, and pitfalls of using mixed-effects models, and then introduce Markov chain Monte Carlo (MCMC) as a particularly well-suited fitting algorithm. We will introduce the R package brms, an easy-to-handle wrapper for fitting Bayesian mixed-effects models via MCMC in the probabilistic programming language Stan.
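The core idea behind MCMC can be shown in a few lines. The sketch below (in Python rather than R/Stan, and deliberately far simpler than what brms fits: a single mean with known standard deviation and a flat prior; the function name `metropolis_normal_mean` is our own) implements a random-walk Metropolis sampler, the simplest member of the algorithm family that Stan's samplers build on.

```python
import math
import random

def metropolis_normal_mean(data, n_samples=5000, step=0.5, seed=1):
    """Minimal random-walk Metropolis sampler for the mean mu of normal data
    (known sd = 1, flat prior), illustrating the core idea behind MCMC."""
    rng = random.Random(seed)

    def log_lik(mu):
        # Log-likelihood of the data under Normal(mu, 1), up to a constant
        return -0.5 * sum((y - mu) ** 2 for y in data)

    mu = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = mu + rng.gauss(0, step)        # symmetric random-walk proposal
        log_accept = log_lik(proposal) - log_lik(mu)
        if math.log(rng.random()) < log_accept:   # Metropolis acceptance rule
            mu = proposal                         # accept: move to the proposal
        samples.append(mu)                        # reject: keep the current value
    return samples

data = [1.2, 0.8, 1.5, 0.9, 1.1]
draws = metropolis_normal_mean(data)
posterior_mean = sum(draws[1000:]) / len(draws[1000:])  # discard burn-in
```

The chain of accepted values approximates draws from the posterior, so summaries such as the posterior mean fall out of simple averages over the samples. Tools like brms automate exactly this step, only with far more efficient samplers and for much richer (hierarchical) models.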