## 5. Remedies for the hidden problem: regularization

### 1. L1 regularization: Lasso regression

L1 regularization penalizes the sum of the absolute values of the coefficients, which drives some of them exactly to zero. Those terms disappear from the fitted curve, so L1 also acts as a form of feature selection.

### 2. L2 regularization: Ridge regression

L2 regularization shrinks the coefficients evenly rather than zeroing them out: it does not reduce the number of terms, but it keeps all the coefficients at comparable, moderate magnitudes. This is where it differs in principle from L1 regularization.
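A minimal sketch of this difference on synthetic data (using sklearn's `make_regression`, an assumption made here for illustration): with the same penalty strength, Ridge keeps every coefficient non-zero while Lasso zeroes out the uninformative ones.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: 10 features, only 3 of which actually influence y
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=1.0).fit(X, y)

# L2 shrinks every coefficient but leaves them all non-zero;
# L1 pushes the coefficients of uninformative features exactly to zero.
print('Ridge non-zero coefficients:', np.sum(ridge.coef_ != 0))
print('Lasso non-zero coefficients:', np.sum(lasso.coef_ != 0))
```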

## 7. Python implementation

### 1. Multiple linear regression

```python
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# x and y are the prepared feature matrix and target vector
X_train, X_test, Y_train, Y_test = train_test_split(x, y, test_size=0.3, random_state=1)

model = LinearRegression()
model.fit(X_train, Y_train)

# R^2 score on the held-out test set
score = model.score(X_test, Y_test)
print('Model test score: ' + str(score))

Y_pred = model.predict(X_test)
print(Y_pred)
```
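The snippet above assumes `x` and `y` are already loaded. To try it end-to-end, sklearn's built-in diabetes dataset (an assumption made here for illustration; any regression dataset works) can stand in for them:

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# 442 samples, 10 physiological features; target is disease progression
x, y = load_diabetes(return_X_y=True)

X_train, X_test, Y_train, Y_test = train_test_split(x, y, test_size=0.3, random_state=1)
model = LinearRegression().fit(X_train, Y_train)
print('Model test score: ' + str(model.score(X_test, Y_test)))
```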

### 2. Ridge regression

```python
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# x and y are the prepared feature matrix and target vector
X_train, X_test, Y_train, Y_test = train_test_split(x, y, test_size=0.3, random_state=1)

# alpha sets the strength of the L2 penalty
model = Ridge(alpha=1)
model.fit(X_train, Y_train)
score = model.score(X_test, Y_test)
print('Model test score: ' + str(score))

Y_pred = model.predict(X_test)
print(Y_pred)
```
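Rather than fixing `alpha=1` by hand, sklearn's `RidgeCV` can select the penalty strength by cross-validation over a candidate grid. A sketch on the diabetes dataset (both the dataset and the log-spaced grid below are assumptions for illustration):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import RidgeCV

x, y = load_diabetes(return_X_y=True)

# Try a log-spaced grid of penalty strengths; RidgeCV keeps the one
# with the best cross-validation performance.
alphas = np.logspace(-3, 3, 13)
model = RidgeCV(alphas=alphas).fit(x, y)
print('Selected alpha:', model.alpha_)
```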

### 3. Lasso regression

```python
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split

# x and y are the prepared feature matrix and target vector
X_train, X_test, Y_train, Y_test = train_test_split(x, y, test_size=0.3, random_state=1)

# alpha sets the strength of the L1 penalty
model = Lasso(alpha=0.1)
model.fit(X_train, Y_train)
score = model.score(X_test, Y_test)
print('Model test score: ' + str(score))

Y_pred = model.predict(X_test)
print(Y_pred)
```
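After fitting, the learned weights are available in `model.coef_`; coefficients that Lasso has driven exactly to zero correspond to features dropped from the model. A sketch on the diabetes dataset (an assumed example), with a deliberately stronger penalty than above so the effect is clearly visible:

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Lasso

x, y = load_diabetes(return_X_y=True)

# A fairly strong L1 penalty: some coefficients end up exactly zero
model = Lasso(alpha=1.0).fit(x, y)
dropped = int(np.sum(model.coef_ == 0))
print('Features dropped by the L1 penalty:', dropped)
```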