
Is there an error in the positional encoding example? For example, when calculating PE(1, 3), I'd expect i = 1, since 3 = 2*1 + 1.

So for “World” (pos = 1, d_model = 4):

PE(1, 0) = sin(1 / 10000^(2*0 / 4)) = sin(1 / 10000^0) = sin(1) ≈ 0.84

PE(1, 1) = cos(1 / 10000^(2*0 / 4)) = cos(1 / 10000^0) = cos(1) ≈ 0.54

PE(1, 2) = sin(1 / 10000^(2*1 / 4)) = sin(1 / 10000^0.5) = sin(0.01) ≈ 0.01

PE(1, 3) = cos(1 / 10000^(2*1 / 4)) = cos(1 / 10000^0.5) = cos(0.01) ≈ 1.00
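
A quick sanity check in Python (a minimal sketch; pos = 1 and d_model = 4 come straight from the example above) reproduces those numbers:

    import math

    d_model = 4
    pos = 1  # "World", the second token

    for dim in range(d_model):
        i = dim // 2  # pair index: dims 2i and 2i+1 share one frequency
        angle = pos / 10000 ** (2 * i / d_model)
        value = math.sin(angle) if dim % 2 == 0 else math.cos(angle)
        print(f"PE({pos}, {dim}) = {value:.4f}")
    # -> 0.8415, 0.5403, 0.0100, 1.0000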

I also wondered if these formulae were devised with 1-based indexing in mind (with 0-based i the longest wavelength is 2π · 10000^((d−2)/d) rather than 10000 · 2π, though for larger d the difference is negligible), as the paper states

> The wavelengths form a geometric progression from 2π to 10000 · 2π

That led me to this chain of PRs - https://github.com/tensorflow/tensor2tensor/pull/177 - it turns out the original code was actually quite different from what's stated in the paper. I guess slight variations in how you calculate this encoding don't affect things too much?
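
For what it's worth, my reading of the discussion around that PR is that the tensor2tensor implementation lays the encoding out as all sines followed by all cosines, rather than interleaving (sin, cos) pairs the way the paper's notation suggests (its timescale spacing seems to differ slightly too). If that's right, the two layouts are just permutations of each other along the embedding dimension, which would explain why it doesn't matter much: each dimension still carries one fixed sinusoid, and the learned weights downstream adapt to whichever ordering they see. A rough sketch of the two layouts (my own reconstruction, not the actual tensor2tensor code):

    import math

    def pe_interleaved(pos, d_model):
        # Paper layout: dims (2i, 2i+1) hold (sin, cos) at frequency 1/10000^(2i/d_model)
        return [(math.sin if d % 2 == 0 else math.cos)(pos / 10000 ** (2 * (d // 2) / d_model))
                for d in range(d_model)]

    def pe_concatenated(pos, d_model):
        # tensor2tensor-style layout (as I read it): all sines first, then all cosines
        half = d_model // 2
        angles = [pos / 10000 ** (2 * i / d_model) for i in range(half)]
        return [math.sin(a) for a in angles] + [math.cos(a) for a in angles]

    # Same values, just reordered across the embedding dimensions:
    print([round(v, 4) for v in pe_interleaved(1, 4)])   # [0.8415, 0.5403, 0.01, 1.0]
    print([round(v, 4) for v in pe_concatenated(1, 4)])  # [0.8415, 0.01, 0.5403, 1.0]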


