Why so many terms in models these days?
It's now more intuitive to add lots of terms using a regression framework than the older ANOVA framework
The move to model comparison frameworks (e.g. AIC) makes it intuitive to start with more complex models
It's just statistical machismo/showing off/more impressive
Why wouldn't I put in every factor I've noticed that varies/could have an effect?
I'm trying to address pseudoreplication
The bar has been raised and I now have to collect data with more complex structuring, leading to more complex models
Mixed effect models are now more common and so I don't have to worry much about losing degrees of freedom when I add random factors
It is more predictive with more variables
The bar has been raised and I have to collect more data than before so I can use more degrees of freedom/support more complex models
It satisfies reviewers/avoids challenges from reviewers
It seems to be the best practice everybody is using now
I'm trying to get more power
I'm moving away from hypothesis testing towards exploratory approaches
Mixed effect models are now common and it's more acceptable to add lots of terms when they're random
You're wrong - models aren't using more terms than 10 years ago
It's just so easy to build complex models in the R package lme4/lmer
Select up to 6 answers.
See this poll on: https://poll.fm/8477932/embed