From fe07b8c726922582edf1889df69c65d7dc79bca4 Mon Sep 17 00:00:00 2001
From: Aman Dalmia <amandalmia18@gmail.com>
Date: Mon, 13 Feb 2017 18:54:49 +0530
Subject: [PATCH] DOC: added explanation for LARS (#8310)

---
 doc/modules/linear_model.rst | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/doc/modules/linear_model.rst b/doc/modules/linear_model.rst
index 8b6c232597..887a590f23 100644
--- a/doc/modules/linear_model.rst
+++ b/doc/modules/linear_model.rst
@@ -270,7 +270,7 @@ Comparison with the regularization parameter of SVM
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 The equivalence between ``alpha`` and the regularization parameter of SVM,
-``C`` is given by ``alpha = 1 / C`` or ``alpha = 1 / (n_samples * C)``, 
+``C`` is given by ``alpha = 1 / C`` or ``alpha = 1 / (n_samples * C)``,
 depending on the estimator and the exact objective function optimized by the
 model.
 
@@ -398,7 +398,11 @@ Least Angle Regression
 
 Least-angle regression (LARS) is a regression algorithm for
 high-dimensional data, developed by Bradley Efron, Trevor Hastie, Iain
-Johnstone and Robert Tibshirani.
+Johnstone and Robert Tibshirani. LARS is similar to forward stepwise
+regression. At each step, it finds the predictor most correlated with the
+response. When multiple predictors are equally correlated, instead of
+continuing along the same predictor, it proceeds in a direction
+equiangular between them.
 
 The advantages of LARS are:
 
-- 
GitLab
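
The selection step the patch describes — pick, at each step, the predictor most correlated with the current residual — can be sketched outside the patch with plain NumPy. This is a minimal illustration of that single step, not the full LARS algorithm (the equiangular-direction update is omitted); the toy data and variable names are hypothetical, not from scikit-learn.

```python
import numpy as np

# Hypothetical toy data: 100 samples, 5 predictors, with the response
# driven almost entirely by column 2.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = 3.0 * X[:, 2] + 0.1 * rng.standard_normal(100)

# LARS-style selection: correlate each predictor with the current
# residual (here, the centered response, since no predictor has
# entered the model yet) and take the largest in absolute value.
residual = y - y.mean()
correlations = X.T @ residual
best = int(np.argmax(np.abs(correlations)))

print(best)  # the predictor LARS would bring into the active set first
```

In the full algorithm, once a second predictor reaches the same absolute correlation, the coefficient path continues along the direction equiangular between the active predictors rather than along either one alone; scikit-learn exposes this as the `Lars` estimator in `sklearn.linear_model`.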