**1. First Things First: What are Loss Functions?**

A *loss function* (or *objective function*, or *optimization score function*) is one of the three parameters (the first one, actually) required to compile a model:

```python
model.compile(loss='categorical_crossentropy',  # <== LOOK HERE!
              optimizer='adam',
              metrics=['accuracy'])
```

**2. "cat. crossentropy" vs. "sparse cat. crossentropy"**

We often see `categorical_crossentropy` used in multiclass classification tasks. At the same time, there's also `sparse_categorical_crossentropy`, which raises the question: **what's the difference between these two loss functions?**

**3. The Answer, In a Nutshell**

- If your targets are **one-hot encoded**, use `categorical_crossentropy`. Examples of one-hot encodings:
  - `[1,0,0]`
  - `[0,1,0]`
  - `[0,0,1]`
- But if your targets are **integers**, use `sparse_categorical_crossentropy`. Examples of integer encodings (*for the sake of completion*):
  - `1`
  - `2`
  - `3`
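To see why the two losses give the same result, here's a minimal numpy sketch (the probability values are made up for illustration): both losses compute the negative log of the probability the model assigned to the true class; the only difference is how that true class is encoded.

```python
import numpy as np

# Hypothetical softmax outputs for 3 samples over 3 classes.
probs = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.3, 0.5],
])

# The same labels in the two encodings:
one_hot = np.array([[1, 0, 0],
                    [0, 1, 0],
                    [0, 0, 1]])   # what categorical_crossentropy expects
integers = np.array([0, 1, 2])    # what sparse_categorical_crossentropy expects

# categorical_crossentropy: -sum(y_true * log(y_pred)) per sample;
# the one-hot vector zeroes out every term except the true class.
cat_loss = -np.sum(one_hot * np.log(probs), axis=1)

# sparse version: directly index the probability of the true class.
sparse_loss = -np.log(probs[np.arange(len(integers)), integers])

print(np.allclose(cat_loss, sparse_loss))  # True — identical losses
```

In other words, the sparse variant just skips the one-hot step, which saves memory when you have many classes.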

If you enjoyed this post and want to buy me a cup of coffee... The thing is, I'll *always* accept a cup of coffee. So feel free to buy me one. Cheers! ☕️