DNN (Deep Neural Network)
Keras basics
A linear neural network
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(1, input_dim=2))
model.add(Activation('sigmoid'))
model.summary()
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_2 (Dense) (None, 1) 3
_________________________________________________________________
activation_2 (Activation) (None, 1) 0
=================================================================
Total params: 3
Trainable params: 3
Non-trainable params: 0
_________________________________________________________________
Only a single neuron is used here.
There are 2 inputs (input_dim=2) and 1 output
-> 3 trainable parameters (w1, w2, b).
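The single neuron above can be sketched in plain NumPy (a stand-in for the Keras layer; the weight and input values are hypothetical):

```python
import numpy as np

def sigmoid(z):
    # logistic activation: maps any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# the same Dense(1, input_dim=2) layer: weights w1, w2 and a bias b
w = np.array([0.4, -0.2])   # hypothetical weight values
b = 0.1                     # hypothetical bias
params = w.size + 1         # 2 weights + 1 bias = 3, matching Param # above

x = np.array([1.0, 2.0])    # one sample with 2 input features
y_hat = sigmoid(x @ w + b)  # forward pass of the single neuron
print(params, y_hat)
```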
model = Sequential()
model.add(Dense(5, input_dim=2)) #hidden layer: 5*2+5 = 15 trainable parameters
model.add(Activation('sigmoid'))
model.add(Dense(1)) #output layer
model.add(Activation('sigmoid'))
model.summary()
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_3 (Dense) (None, 5) 15
_________________________________________________________________
activation_3 (Activation) (None, 5) 0
_________________________________________________________________
dense_4 (Dense) (None, 1) 6
_________________________________________________________________
activation_4 (Activation) (None, 1) 0
=================================================================
Total params: 21
Trainable params: 21
Non-trainable params: 0
_________________________________________________________________
Trainable parameters: hidden layer 5*(2+1) = 15 + output layer 1*(5+1) = 6, so 21 in total.
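The counts above follow the general rule params = units * (inputs + 1), which a quick check confirms:

```python
def dense_params(units, inputs):
    # each unit has one weight per input plus one bias
    return units * (inputs + 1)

hidden = dense_params(5, 2)   # 15
output = dense_params(1, 5)   # 6
total = hidden + output
print(hidden, output, total)  # 15 6 21
```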
model.compile(loss='binary_crossentropy', optimizer='adam') #compile the model
model.fit(X, y, batch_size=4, epochs=2000, verbose=0) #X, y: the training data (4 samples here)
print(model.predict(X)) #4x1 matrix of probabilities (predict_proba is removed in recent Keras; predict returns the same values)
p = model.predict(X)
print(p > 0.5)
print((p > 0.5)*1) #expressed as binary 0/1
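The thresholding step can be sketched with NumPy alone (the probability values are hypothetical, standing in for the output of model.predict(X)):

```python
import numpy as np

# hypothetical 4x1 probability matrix, as model.predict(X) might return
p = np.array([[0.03], [0.97], [0.95], [0.06]])

labels_bool = p > 0.5    # boolean mask: True where probability exceeds 0.5
labels = (p > 0.5) * 1   # multiplying by 1 casts the booleans to 0/1 integers
print(labels.ravel())    # [0 1 1 0]
```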