
# tf.keras.losses.binary_crossentropy

Computes the binary crossentropy loss.


#### Standalone usage:

```
y_true = [[0, 1], [0, 0]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
loss = tf.keras.losses.binary_crossentropy(y_true, y_pred)
assert loss.shape == (2,)
loss.numpy()
array([0.916 , 0.714], dtype=float32)
```

#### Args

| Arg | Description |
|---|---|
| `y_true` | Ground truth values. shape = `[batch_size, d0, .. dN]`. |
| `y_pred` | The predicted values. shape = `[batch_size, d0, .. dN]`. |
| `from_logits` | Whether `y_pred` is expected to be a logits tensor. By default, we assume that `y_pred` encodes a probability distribution. |
| `label_smoothing` | Float in `[0, 1]`. If > `0`, smooth the labels by squeezing them towards 0.5. That is, use `1. - 0.5 * label_smoothing` for the target class and `0.5 * label_smoothing` for the non-target class. |
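For illustration, here is a minimal sketch of passing raw logits together with label smoothing. The tensor values below are illustrative assumptions, not taken from this page:

```python
import tensorflow as tf

# Sketch: y_pred holds raw, unnormalized scores, so from_logits=True;
# label_smoothing=0.1 squeezes the hard 0/1 targets towards 0.5.
# (Values are illustrative, not from the documentation.)
y_true = [[0.0, 1.0], [0.0, 0.0]]
logits = [[2.0, -1.0], [-1.0, 2.0]]  # unnormalized model outputs
loss = tf.keras.losses.binary_crossentropy(
    y_true, logits, from_logits=True, label_smoothing=0.1)
assert loss.shape == (2,)  # one loss value per sample; last axis is reduced
```

Note that the last dimension of the inputs is averaged over, which is why the returned shape drops `dN`.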

#### Returns

Binary crossentropy loss value. shape = `[batch_size, d0, .. dN-1]`.
