| sklearn.linear_model |
Explanation |
Example |
| linear_model.ARDRegression |
ARD (automatic relevance determination) regression, a Bayesian linear model whose per-feature priors can drive the weights of uninformative features toward 0 |
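A minimal sketch, assuming the entry refers to sklearn's `ARDRegression`; the toy data mirrors the other rows and is not from the original:

```python
from sklearn import linear_model

# ARD places a separate precision prior on each coefficient,
# so weights of uninformative features can be pruned toward 0
reg = linear_model.ARDRegression()
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
```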
|
| linear_model.BayesianRidge |
Bayesian ridge regression; the regularization strength is estimated from the data rather than fixed in advance |
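A minimal sketch with the same toy data as the other rows (illustrative, not from the original):

```python
from sklearn import linear_model

# The precisions of the weight prior and of the noise
# are estimated from the data during fitting
reg = linear_model.BayesianRidge()
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
```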
|
| linear_model.ElasticNet |
Elastic Net regression, which combines the L1 penalty of Lasso with the L2 penalty of Ridge |
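A minimal sketch; the alpha and l1_ratio values here are illustrative assumptions, not from the original:

```python
from sklearn import linear_model

# l1_ratio mixes the penalties: 1.0 is pure Lasso, 0.0 is pure Ridge
reg = linear_model.ElasticNet(alpha=0.001, l1_ratio=0.5)
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
```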
|
| linear_model.ElasticNetCV |
Elastic Net regression; given a list of alphas, selects the best one by cross-validation |
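A minimal sketch in the style of the LassoCV row; `cv=3` is an assumption needed so the folds fit three samples:

```python
from sklearn import linear_model

# Tries each candidate alpha by cross-validation and keeps the best
reg = linear_model.ElasticNetCV(alphas=[0.1, 1, 10], cv=3)
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_, reg.alpha_)
```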
|
| linear_model.HuberRegressor |
Huber regression, a linear model that is less sensitive to outliers than ordinary least squares |
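A minimal sketch with the same toy data as the other rows (illustrative):

```python
from sklearn import linear_model

# The Huber loss is quadratic for small residuals and linear for
# large ones, which limits the influence of outliers
reg = linear_model.HuberRegressor()
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
```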
|
| linear_model.Lars |
Least Angle Regression (LARS), an efficient fitting algorithm for high-dimensional data |
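A minimal sketch; the second feature is made non-collinear with the first so the LARS path is well defined (data illustrative):

```python
from sklearn import linear_model

# LARS adds features one at a time, moving the coefficients along
# the equiangular direction of the most correlated features
reg = linear_model.Lars()
reg.fit([[0, 0], [1, 2], [2, 3]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
```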
|
| linear_model.LarsCV |
LARS with the stopping point on the regularization path chosen by cross-validation |
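LarsCV needs more than a handful of samples for its internal folds, so this sketch uses synthetic data (all values are illustrative assumptions):

```python
import numpy as np
from sklearn import linear_model

rng = np.random.RandomState(0)
X = rng.randn(30, 3)
y = 2 * X[:, 0] + 0.1 * rng.randn(30)  # only the first feature matters

# The stopping point on the LARS path is chosen by 5-fold CV
reg = linear_model.LarsCV(cv=5)
reg.fit(X, y)
print(reg.coef_, reg.alpha_)
```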
|
| linear_model.Lasso |
Lasso regression; it can shrink the coefficients of linearly correlated variables to exactly 0 |
from sklearn import linear_model
reg = linear_model.Lasso(alpha=0.001)
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
output:
[ 0.9985 0. ] 0.0015
|
| linear_model.LassoCV |
Lasso regression; given a list of alphas, selects the best one by cross-validation |
from sklearn import linear_model
# alphas overrides n_alphas; cv=3 so the folds fit three samples
reg = linear_model.LassoCV(alphas=[0.1, 1, 10], cv=3)
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_, reg.alpha_)
output:
[ 0.85 0. ] 0.15 0.1
|
| linear_model.LassoLars |
Lasso regression fitted with the LARS algorithm instead of coordinate descent |
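A minimal sketch mirroring the Lasso row, with LARS as the solver; the alpha value and data are illustrative assumptions:

```python
from sklearn import linear_model

# Same L1-penalized objective as Lasso, solved along the LARS path
reg = linear_model.LassoLars(alpha=0.001)
reg.fit([[0, 0], [1, 2], [2, 3]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
```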
|
| linear_model.LassoLarsCV |
LassoLars with alpha selected by cross-validation |
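Like LarsCV, this needs enough samples for cross-validation, so the sketch uses synthetic data (illustrative assumptions):

```python
import numpy as np
from sklearn import linear_model

rng = np.random.RandomState(0)
X = rng.randn(30, 3)
y = 2 * X[:, 0] + 0.1 * rng.randn(30)  # only the first feature matters

# alpha is chosen by 5-fold cross-validation over the LARS path
reg = linear_model.LassoLarsCV(cv=5)
reg.fit(X, y)
print(reg.coef_, reg.alpha_)
```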
|
| linear_model.LassoLarsIC |
LassoLars with alpha selected by an information criterion (AIC or BIC) |
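A sketch with synthetic data (illustrative); the information criterion replaces cross-validation, so no folds are needed, but the noise-variance estimate requires more samples than features:

```python
import numpy as np
from sklearn import linear_model

rng = np.random.RandomState(0)
X = rng.randn(30, 3)
y = 2 * X[:, 0] + 0.1 * rng.randn(30)  # only the first feature matters

# alpha is chosen by minimizing BIC along the LARS path
reg = linear_model.LassoLarsIC(criterion='bic')
reg.fit(X, y)
print(reg.coef_, reg.alpha_)
```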
|
| linear_model.LinearRegression |
Ordinary least squares linear regression |
from sklearn import linear_model
reg = linear_model.LinearRegression()
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
output: [ 0.5 0.5] 2.22044604925e-16
|
| linear_model.LogisticRegression |
Logistic regression; despite the name, it is essentially a classification algorithm |
from sklearn import linear_model
reg = linear_model.LogisticRegression()
reg.fit([[0, 1], [0, 2], [0, 3], [1, 1], [1, 2], [1, 3]], [1, 1, 1, 2, 2, 2])
print(reg.coef_, reg.intercept_)
print(reg.predict([[0, 100], [1, 100], [1, 4]]))
output:
[[ 1.04846721 -0.1433622 ]] [-0.13840227]
[1 1 2]
|
| linear_model.Ridge |
Ridge regression; alpha controls how strongly the coefficients are shrunk |
from sklearn import linear_model
reg = linear_model.Ridge(alpha=0.001)
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_)
output: [ 0.49987503 0.49987503] 0.000249937515621
|
| linear_model.RidgeCV |
Ridge regression; given a list of alphas, selects the best one by cross-validation |
from sklearn import linear_model
reg = linear_model.RidgeCV(alphas=[0.1, 1, 10])
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
print(reg.coef_, reg.intercept_, reg.alpha_)
output: [ 0.48780488 0.48780488] 0.0243902439024 0.1
|