### Logistic Regression

#### Hypothesis

The feature vector prepends a constant component x0 = 1 that carries the intercept term:

X = [x0 x1] = [1 IP-Address]
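As a minimal sketch of this layout in NumPy (the feature values below are made up for illustration; in practice x1 would be a numeric encoding of the IP-address field):

```python
import numpy as np

# Hypothetical numeric feature column (placeholder values)
x1 = np.array([[3.1], [0.7], [2.4]])

# Prepend a column of ones so x0 = 1 carries the intercept:
# each row of X is [1, feature]
X = np.hstack([np.ones((x1.shape[0], 1)), x1])
print(X.shape)  # (3, 2)
```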

#### Types of Logistic Regression

1. Binary logistic regression: the categorical response has only two possible outcomes. Example: spam or not spam.

2. Multinomial logistic regression: three or more categories with no ordering. Example: predicting which type of food is preferred (vegetarian, non-vegetarian, vegan).

3. Ordinal logistic regression: three or more categories with an ordering. Example: movie ratings from 1 to 5.
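The difference between the first two types shows up in the output layer: binary logistic regression maps a single score through a sigmoid, while the multinomial case maps one score per class through a softmax. A small sketch with made-up scores (the class labels are only illustrative):

```python
import numpy as np

def sigmoid(z):
    # Binary case: one score -> probability of the positive class
    return 1 / (1 + np.exp(-z))

def softmax(z):
    # Multinomial case: one score per class -> a probability per class
    e = np.exp(z - np.max(z))  # shift for numerical stability
    return e / e.sum()

# Hypothetical model scores, purely for illustration
binary_score = 0.8                           # e.g. spam vs. not spam
class_scores = np.array([1.2, 0.3, -0.5])    # e.g. vegetarian / non-veg / vegan

p_spam = sigmoid(binary_score)   # single probability, thresholded at 0.5
p_food = softmax(class_scores)   # probabilities over classes, summing to 1
print(p_spam, p_food)
```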

### Python Implementation

```python
import numpy as np

def weightInitialization(n_features):
    # Initialize weights and bias to zero
    w = np.zeros((1, n_features))
    b = 0
    return w, b

def sigmoid_activation(result):
    # Squash the linear score into (0, 1)
    final_result = 1 / (1 + np.exp(-result))
    return final_result

def model_optimize(w, b, X, Y):
    m = X.shape[0]

    # Prediction
    final_result = sigmoid_activation(np.dot(w, X.T) + b)
    Y_T = Y.T
    # Cross-entropy cost
    cost = (-1/m) * (np.sum((Y_T * np.log(final_result)) + ((1 - Y_T) * (np.log(1 - final_result)))))

    # Gradients of the cost w.r.t. w and b
    dw = (1/m) * (np.dot(X.T, (final_result - Y.T).T))
    db = (1/m) * (np.sum(final_result - Y.T))

    grads = {"dw": dw, "db": db}
    return grads, cost

def model_predict(w, b, X, Y, learning_rate, no_iterations):
    costs = []
    for i in range(no_iterations):
        grads, cost = model_optimize(w, b, X, Y)
        dw = grads["dw"]
        db = grads["db"]

        # weight update
        w = w - (learning_rate * (dw.T))
        b = b - (learning_rate * db)

        if (i % 100 == 0):
            costs.append(cost)
            # print("Cost after %i iteration is %f" % (i, cost))

    # final parameters
    coeff = {"w": w, "b": b}
    gradient = {"dw": dw, "db": db}
    return coeff, gradient, costs

def predict(final_pred, m):
    # Threshold the predicted probabilities at 0.5
    y_pred = np.zeros((1, m))
    for i in range(final_pred.shape[1]):
        if final_pred[0][i] > 0.5:
            y_pred[0][i] = 1
    return y_pred
```
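To see the whole pipeline in motion, here is a self-contained toy run with the same gradient step written inline (the synthetic dataset and hyperparameters are illustrative choices, not from the original notebook):

```python
import numpy as np

np.random.seed(0)

# Synthetic binary dataset: class 1 sits at larger feature values
X = np.vstack([np.random.randn(50, 2) - 1, np.random.randn(50, 2) + 1])
Y = np.hstack([np.zeros(50), np.ones(50)]).reshape(1, 100)

m, n_features = X.shape
w = np.zeros((1, n_features))  # as in weightInitialization
b = 0
learning_rate = 0.1

for i in range(1000):
    # Forward pass: sigmoid of the linear score
    final_result = 1 / (1 + np.exp(-(np.dot(w, X.T) + b)))
    # Gradients, matching model_optimize
    dw = (1 / m) * np.dot(X.T, (final_result - Y).T)
    db = (1 / m) * np.sum(final_result - Y)
    # Gradient-descent weight update
    w = w - learning_rate * dw.T
    b = b - learning_rate * db

# Threshold at 0.5, as in predict()
final_pred = 1 / (1 + np.exp(-(np.dot(w, X.T) + b)))
y_pred = (final_pred > 0.5).astype(float)
accuracy = np.mean(y_pred == Y)
print(accuracy)
```

On this well-separated toy data the training accuracy should be high; the cost recorded every 100 iterations is what the next section's plot visualizes.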

#### Cost vs. Number of Iterations

Full vectorized implementation: https://github.com/SSaishruthi/LogisticRegression_Vectorized_Implementation/blob/master/Logistic_Regression.ipynb