Commit a6753f3e authored by Alexandre Gramfort, committed by Joel Nothman

[MRG+1] RidgeCV normalize (#9302)

* FIX : normalize was not passed to grid search in RidgeCV

* update what's new
parent a08555a2
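For context, a rough sketch of the consistency this commit restores. It is illustrative only and assumes a scikit-learn release where Ridge and RidgeCV still accept the normalize parameter (it was deprecated and removed in later versions); the rescaling factor and variable names are ours. With cv set, RidgeCV delegates to a grid search over Ridge, so normalize=True should now lead to the same selected alpha as an explicit GridSearchCV over Ridge(normalize=True):

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import Ridge, RidgeCV
    from sklearn.model_selection import GridSearchCV

    X, y = load_diabetes(return_X_y=True)
    X = 10. * X  # rescale features so that normalization actually matters
    alphas = (0.1, 1.0, 10.0)  # RidgeCV's default grid

    # RidgeCV with an explicit cv uses GridSearchCV over Ridge internally;
    # after this fix the inner Ridge also receives normalize=True.
    ridge_cv = RidgeCV(alphas=alphas, normalize=True, cv=3).fit(X, y)

    gs = GridSearchCV(Ridge(normalize=True), cv=3,
                      param_grid={'alpha': alphas}).fit(X, y)

    # Both code paths should now agree on the selected regularization strength.
    assert gs.best_estimator_.alpha == ridge_cv.alpha_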
@@ -471,10 +471,13 @@ Bug fixes
   by :user:`Andre Ambrosio Boechat <boechat107>`, :user:`Utkarsh Upadhyay
   <musically-ut>`, and `Joel Nothman`_.

 - Add ``data_home`` parameter to
   :func:`sklearn.datasets.fetch_kddcup99` by `Loic Esteve`_.

+- Fix inconsistent results between :class:`linear_model.RidgeCV`
+  and :class:`linear_model.Ridge` when using ``normalize=True``
+  by `Alexandre Gramfort`_.

 API changes summary
 -------------------
...
@@ -1119,7 +1119,8 @@ class _BaseRidgeCV(LinearModel):
                 raise ValueError("cv!=None and store_cv_values=True "
                                  " are incompatible")
             parameters = {'alpha': self.alphas}
-            gs = GridSearchCV(Ridge(fit_intercept=self.fit_intercept),
+            gs = GridSearchCV(Ridge(fit_intercept=self.fit_intercept,
+                                    normalize=self.normalize),
                               parameters, cv=self.cv, scoring=self.scoring)
             gs.fit(X, y, sample_weight=sample_weight)
             estimator = gs.best_estimator_
...
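To make the one-line change above concrete: previously only fit_intercept was forwarded to the inner Ridge built for the grid search, so RidgeCV(normalize=True, cv=...) silently cross-validated and refit an un-normalized Ridge. A hedged sketch of the behaviour the fix restores (illustrative, same normalize-era assumption as above); pinning the grid to a single alpha makes the refit model directly comparable to a plain Ridge:

    import numpy as np
    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import Ridge, RidgeCV

    X, y = load_diabetes(return_X_y=True)
    X = 10. * X  # rescaled so normalization changes the solution

    # Single-alpha grid: cross-validation has nothing to choose, so the
    # refit best_estimator_ should match a directly fitted Ridge.
    ridge_cv = RidgeCV(alphas=[1.0], normalize=True, cv=3).fit(X, y)
    ridge = Ridge(alpha=1.0, normalize=True).fit(X, y)

    # With normalize forwarded to the inner Ridge, the coefficients agree;
    # before the fix the grid-search path ignored normalize and they differed.
    np.testing.assert_allclose(ridge_cv.coef_, ridge.coef_)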
@@ -383,6 +383,16 @@ def _test_ridge_loo(filter_):
     return ret


+def _test_ridge_cv_normalize(filter_):
+    ridge_cv = RidgeCV(normalize=True, cv=3)
+    ridge_cv.fit(filter_(10. * X_diabetes), y_diabetes)
+
+    gs = GridSearchCV(Ridge(normalize=True), cv=3,
+                      param_grid={'alpha': ridge_cv.alphas})
+    gs.fit(filter_(10. * X_diabetes), y_diabetes)
+    assert_equal(gs.best_estimator_.alpha, ridge_cv.alpha_)
+
+
 def _test_ridge_cv(filter_):
     ridge_cv = RidgeCV()
     ridge_cv.fit(filter_(X_diabetes), y_diabetes)
...
@@ -462,6 +472,7 @@ def check_dense_sparse(test_func):
 def test_dense_sparse():
     for test_func in (_test_ridge_loo,
                       _test_ridge_cv,
+                      _test_ridge_cv_normalize,
                       _test_ridge_diabetes,
                       _test_multi_ridge_diabetes,
                       _test_ridge_classifiers,
...
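The new test plugs into the existing dense/sparse harness: each _test_* helper receives a filter_ callable, and test_dense_sparse runs every helper on both dense and sparse inputs. A minimal, self-contained sketch of that pattern (the filters and data loading here are our own stand-ins, not the ones defined in the test module):

    import scipy.sparse as sp
    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import Ridge, RidgeCV
    from sklearn.model_selection import GridSearchCV

    X_diabetes, y_diabetes = load_diabetes(return_X_y=True)

    def _test_ridge_cv_normalize(filter_):
        # Same check as the test added above, with a plain assert.
        ridge_cv = RidgeCV(normalize=True, cv=3)
        ridge_cv.fit(filter_(10. * X_diabetes), y_diabetes)

        gs = GridSearchCV(Ridge(normalize=True), cv=3,
                          param_grid={'alpha': ridge_cv.alphas})
        gs.fit(filter_(10. * X_diabetes), y_diabetes)
        assert gs.best_estimator_.alpha == ridge_cv.alpha_

    # Hypothetical filters standing in for the module's dense/sparse filters:
    # identity keeps the design matrix dense, csr_matrix makes it sparse.
    for filter_ in (lambda X: X, sp.csr_matrix):
        _test_ridge_cv_normalize(filter_)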