Why so many terms in models these days?
I'm moving away from hypothesis testing towards exploratory approaches
It's just so easy to build complex models with lmer in the R package lme4
It satisfies reviewers / preempts challenges from reviewers
The move to model comparison frameworks (e.g. AIC) makes it intuitive to start with more complex models
It seems to be the best practice everybody is using now
Why wouldn't I put in every factor I've noticed that varies/could have an effect?
The bar has been raised and I now have to collect data with more complex structuring, leading to more complex models
It's just statistical machismo/showing off/more impressive
Mixed effect models are now common and it's more acceptable to add lots of terms when they're random
Mixed effect models are now more common and so I don't have to worry much about losing degrees of freedom when I add random factors
The bar has been raised and I have to collect more data than before so I can use more degrees of freedom/support more complex models
It is more predictive with more variables
I'm trying to address pseudoreplication
It's now more intuitive to add lots of terms using a regression framework than the older ANOVA framework
You're wrong - models aren't using more terms than 10 years ago
I'm trying to get more power
Select up to 6 answers.
See this poll on:
https://poll.fm/8477932/embed