# [2019.10.10] KNN Strikes Back

When I started my work with recommenders during my previous job, I
thought that using methods based on similarity was somehow obsolete.
It was time to move forward to matrix factorisations, I supposed. My
first disillusionment was the toughest one. The fold-in procedure in
ALS is so computationally intensive that it's hardly possible to use
it in real-time systems. Yet it was possible, and together with my
team,
we got it running. Only several years later did I discover that in
pure SVD, fold-in is nearly as cheap as in vanilla KNN. Yet everybody
used ALS, not SVD, because of the fame the former had won during the
Netflix Prize competition. Now it's time to use Deep Learning for
recommenders, even though deep models are known, in many cases, to
perform worse than classical similarity-based methods from the late
90s. Research is about the whys, but engineering is about getting the
shit working, no questions asked. If something works, don't try to
mend it. That's the great commandment of an engineer.
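
To see why pure-SVD fold-in is so cheap, here is a minimal sketch with a hypothetical toy ratings matrix (the matrix, the factor count `k`, and the new user's ratings are all made up for illustration). Folding in a new user is just a projection onto the item factors, a single matrix-vector product:

```python
import numpy as np

# Hypothetical toy ratings matrix: rows = users, columns = items.
R = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
    [0.0, 1.0, 5.0, 4.0],
])

# Truncated SVD: R ~ U @ diag(s) @ Vt, keeping k latent factors.
k = 2
U, s, Vt = np.linalg.svd(R, full_matrices=False)
U, s, Vt = U[:, :k], s[:k], Vt[:k, :]

# Fold-in a new user: project their ratings onto the item factors.
# Cost is O(n_items * k) -- no retraining of the whole model.
r_new = np.array([4.0, 3.0, 0.0, 1.0])
u_new = r_new @ Vt.T @ np.diag(1.0 / s)

# Predicted scores for the new user over all items.
scores = u_new @ np.diag(s) @ Vt
```

The whole fold-in is one pass over the item-factor matrix, which is in the same ballpark as scoring a user against item similarities in vanilla KNN. In ALS, by contrast, folding in a user means solving a regularized least-squares system.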

But when you create a tool that is supposed to work, not to be fancy,
you have a choice. And I've chosen similarity-based methods for the
first version of our new library. And you know, it works :)