Tag Archives: Logistic regression

Recommended: ROSE: A package for binary imbalanced learning

This page is moving to a new website.

Logistic regression and other statistical methods for predicting a binary outcome run into problems when the outcome being tested is very rare, even in data sets big enough to ensure that the rare outcome occurs hundreds or thousands of times. The problem is that attempts to optimize the model across all of the data end up focused predominantly on the negative cases, and can easily ignore or misclassify all or almost all of the positive cases, since they constitute such a small percentage of the data. The ROSE package generates artificial balanced samples to allow for better estimation and better evaluation of the accuracy of the model. Continue reading
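ROSE itself is an R package (its smoothed-bootstrap approach generates synthetic points rather than exact copies). Purely to illustrate the balancing idea in a language-agnostic way, here is a minimal Python sketch of a simpler relative, random oversampling of the minority class; the function name and data layout are my own, not part of ROSE:

```python
import random

def oversample_minority(data, seed=0):
    """data: list of (features, label) pairs with labels 0/1.
    Resample the minority class with replacement until both
    classes are equally represented (random oversampling)."""
    rng = random.Random(seed)
    by_class = {0: [], 1: []}
    for row in data:
        by_class[row[1]].append(row)
    minority = min(by_class.values(), key=len)
    majority = max(by_class.values(), key=len)
    # Draw extra minority rows (with replacement) to close the gap.
    extra = [rng.choice(minority) for _ in range(len(majority) - len(minority))]
    return majority + minority + extra

# Example: 95 negatives and 5 positives become 95 of each.
raw = [((i,), 0) for i in range(95)] + [((i,), 1) for i in range(5)]
balanced = oversample_minority(raw)
```

A model fit to the balanced sample can no longer score well by simply predicting the majority class, which is the failure mode described above; the trade-off is that duplicated (or, in ROSE's case, synthetic) rows must be kept out of the evaluation set.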

PMean: Nonparametric tests for multifactor designs

Dear Professor Mean, I want to run nonparametric tests like the Kruskal-Wallis test and the Friedman test for a setting where there may be more than one factor. Everything I’ve seen for these two tests only works for a single factor. Is there any extension of these tests that I could use when I suspect that my data is not normally distributed? Continue reading
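As background for why these tests are inherently single-factor, recall that the Kruskal-Wallis test ranks the pooled data and compares rank sums across the levels of one grouping variable. A minimal pure-Python sketch of the H statistic (ignoring the tie correction), using the standard rank-sum definition:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction) for k
    independent samples; H is compared to a chi-square
    distribution with k - 1 degrees of freedom."""
    # Pool all observations, remembering which group each came from.
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(pooled)
    # Sum the pooled ranks within each group.
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)

h = kruskal_wallis_h([1, 2, 3], [4, 5, 6])
```

Because the statistic is built from rank sums over the levels of a single factor, a second factor has no place in the formula, which is exactly why extensions (such as rank-transform approaches) are needed for multifactor designs.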