
tf.keras.losses.sparse_categorical_crossentropy

Computes the sparse categorical crossentropy loss.

Standalone usage:

```python
y_true = [1, 2]
y_pred = [[0.05, 0.95, 0], [0.1, 0.8, 0.1]]
loss = tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)
assert loss.shape == (2,)
loss.numpy()
# array([0.0513, 2.303], dtype=float32)
```
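With probability inputs, each returned element is simply the negative log of the predicted probability at the true class index. A minimal sketch reproducing the numbers from the standalone usage above:

```python
import numpy as np
import tensorflow as tf

y_true = [1, 2]
y_pred = [[0.05, 0.95, 0.0], [0.1, 0.8, 0.1]]

loss = tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)

# Each element is -log(p[true_class]): -log(0.95) ≈ 0.0513, -log(0.1) ≈ 2.303.
manual = np.array([-np.log(y_pred[i][y_true[i]]) for i in range(len(y_true))])
np.testing.assert_allclose(loss.numpy(), manual, rtol=1e-3)
```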

Args:
- `y_true`: Ground truth values.
- `y_pred`: The predicted values.
- `from_logits`: Whether `y_pred` is expected to be a logits tensor. By default, `y_pred` is assumed to encode a probability distribution.
- `axis`: (Optional) Defaults to -1. The dimension along which the entropy is computed.

Returns:
Sparse categorical crossentropy loss value.
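To illustrate `from_logits`, a short sketch (the logit values here are arbitrary, chosen only for the example): passing raw scores with `from_logits=True` gives the same loss as applying softmax yourself and passing probabilities with the default `from_logits=False`.

```python
import numpy as np
import tensorflow as tf

y_true = [1, 2]
logits = [[2.0, 5.0, 0.5], [1.0, 3.0, 1.0]]  # arbitrary raw scores

# Raw scores with from_logits=True; softmax is applied internally.
loss_from_logits = tf.keras.losses.sparse_categorical_crossentropy(
    y_true, logits, from_logits=True)

# Equivalent: normalize to probabilities first, then use the default.
probs = tf.nn.softmax(logits)
loss_from_probs = tf.keras.losses.sparse_categorical_crossentropy(y_true, probs)

np.testing.assert_allclose(
    loss_from_logits.numpy(), loss_from_probs.numpy(), rtol=1e-4)
```

Passing logits directly is numerically more stable than applying softmax in the model and is the usual choice when the final layer has no activation.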

[{ "type": "thumb-down", "id": "missingTheInformationINeed", "label":"필요한 정보가 없음" },{ "type": "thumb-down", "id": "tooComplicatedTooManySteps", "label":"너무 복잡함/단계 수가 너무 많음" },{ "type": "thumb-down", "id": "outOfDate", "label":"오래됨" },{ "type": "thumb-down", "id": "samplesCodeIssue", "label":"Samples / code issue" },{ "type": "thumb-down", "id": "otherDown", "label":"기타" }]
[{ "type": "thumb-up", "id": "easyToUnderstand", "label":"이해하기 쉬움" },{ "type": "thumb-up", "id": "solvedMyProblem", "label":"문제가 해결됨" },{ "type": "thumb-up", "id": "otherUp", "label":"기타" }]