# Softmax or `LogSoftmax`

First, a quick look at how fast `e**x` grows — this growth is why working in log space matters:

```python
>>> import math
>>> for x in range(20):
...     print(x, math.e**x)
...
0 1.0
1 2.718281828459045
2 7.3890560989306495
3 20.085536923187664
4 54.59815003314423
5 148.41315910257657
6 403.428793492735
7 1096.6331584284583
8 2980.957987041727
9 8103.083927575381
10 22026.465794806703
11 59874.14171519778
12 162754.79141900383
13 442413.3920089202
14 1202604.2841647759
15 3269017.372472108
16 8886110.520507865
17 24154952.753575277
18 65659969.13733045
19 178482300.96318707
```
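Since `e**x` explodes this quickly, a naive softmax overflows as soon as any logit exceeds about 709 (the limit of a double). A minimal sketch in plain Python (function names are mine, not from the original post) of the standard max-subtraction trick that a log-softmax uses to stay finite:

```python
import math

def naive_softmax(xs):
    # Overflows for large logits: math.exp(710) raises OverflowError.
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def stable_log_softmax(xs):
    # Subtract the max first, so every exp(x - m) is <= 1 and cannot overflow;
    # then log_softmax(x) = x - (m + log(sum(exp(x - m)))).
    m = max(xs)
    log_sum = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - log_sum for x in xs]

logits = [1000.0, 1001.0, 1002.0]
# naive_softmax(logits) raises OverflowError here,
# while the log-space version is perfectly well behaved:
print(stable_log_softmax(logits))  # ≈ [-2.4076, -1.4076, -0.4076]
```

Exponentiating the result recovers an ordinary softmax that sums to 1, even though the logits themselves are far beyond what `math.exp` can handle.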

## An example comparing the outputs of `Softmax` and `LogSoftmax`

```python
>>> import torch
>>> import torch.nn as nn
>>> m = nn.Softmax(dim=1)
>>> input = torch.randn(2, 3)
>>> output = m(input)
>>> input
tensor([[-1.1723,  0.3103,  1.7434],
        [ 0.1054,  0.0876,  1.9890]])
>>> output
tensor([[0.0419, 0.1845, 0.7736],
        [0.1168, 0.1148, 0.7684]])
>>> mm = nn.LogSoftmax(dim=1)
>>> output2 = mm(input)
>>> output2
tensor([[-3.1724, -1.6899, -0.2568],
        [-2.1470, -2.1649, -0.2634]])
>>> torch.exp(output2)
tensor([[0.0419, 0.1845, 0.7736],
        [0.1168, 0.1148, 0.7684]])
```

As the last call shows, `torch.exp` applied to the `LogSoftmax` output reproduces the `Softmax` output exactly: `LogSoftmax` is simply `log(Softmax(x))`, computed in a numerically stable way.



## Jimmy Shen

Data Scientist/MLE/SWE @takemobi