In my experience it was a nice reader, though it hyper-specialized a bit too much. I read a few articles about cars, and it fed me nothing but car articles. It became impossible to read anything else, further cementing its belief that I love to read about cars.
That seems to be the state of the industry everywhere.
YouTube does this on an extreme level as well. Just the other day I watched 3 videos in a row about the WRC Kenya stage, after never having watched anything related to rally or cars on my YouTube/Google account, an account I've had for almost 2 decades by now.
But after watching those 3 videos, Google now thinks I'm all about cars, and 80% of my recommendations are now WRC, F1, or other car-related videos. My other interests, which it has known about for years, are almost completely gone.
How can they screw up something so basic? Does anyone actually like things to work like this? Am I missing something super obvious?
This is imho one of the big draws and competitive advantages of TikTok. TikTok is by no means immune to this, but it handles it much better than Youtube or Instagram. It actively shows you some amount of content outside your current bubble, and reacts strongly to any signal you give it about that content (including watch time). It also seems to stop showing you topics when you show continued disinterest.
My Youtube has been ruined by my kid's shows. I literally have none of my interests showing up anymore because I've let them watch a few things.
I actually work on a recommendation team at a large content company, and this has me realizing we don't really think about this, i.e. how to build an overall profile that isn't skewed by one-offs. I think part of the issue is that many models are black boxes without much explainability, so it's not always clear or easy to understand why a model produced the recommendations it did.
We mostly just optimize for engagement: if our recommendations are driving more engagement, we dial into that; if they're not, we adjust course.
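For what it's worth, something like the sketch below is the kind of fix I have in mind (purely hypothetical names and constants, nothing we actually ship): keep per-topic interest scores that decay over time and require signals from several distinct sessions before a topic is allowed to dominate, so a single binge can't wipe out the rest of the profile.

```python
import time
from collections import defaultdict

# Hypothetical sketch: per-topic interest scores with exponential time decay,
# so a one-off binge fades quickly unless the interest keeps recurring.
HALF_LIFE_DAYS = 14   # score halves every two weeks without new signals
MIN_SESSIONS = 3      # a topic needs signals from several distinct sessions
                      # before it can dominate recommendations

class InterestProfile:
    def __init__(self):
        self.scores = defaultdict(float)        # topic -> decayed score
        self.last_update = defaultdict(float)   # topic -> timestamp of last signal
        self.sessions = defaultdict(set)        # topic -> session ids that touched it

    def _decay(self, topic, now):
        dt_days = (now - self.last_update[topic]) / 86400
        self.scores[topic] *= 0.5 ** (dt_days / HALF_LIFE_DAYS)

    def record(self, topic, session_id, weight=1.0, now=None):
        now = now or time.time()
        self._decay(topic, now)
        self.scores[topic] += weight
        self.last_update[topic] = now
        self.sessions[topic].add(session_id)

    def weights(self, now=None):
        now = now or time.time()
        out = {}
        for topic in list(self.scores):
            self._decay(topic, now)
            self.last_update[topic] = now
            score = self.scores[topic]
            # Cap topics seen in only one or two sessions so a single binge
            # can't crowd out long-standing interests.
            if len(self.sessions[topic]) < MIN_SESSIONS:
                score = min(score, 1.0)
            out[topic] = score
        total = sum(out.values()) or 1.0
        return {t: s / total for t, s in out.items()}
```

The half-life and session threshold are made up; they'd obviously need tuning against real engagement data rather than picked by gut feel.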
This is exactly the problem with any recommendation engine, IME. They either:
1. Feed you nothing but content about something you looked at once.
2. Show you stuff you already bought. No, I don't need another exact or very similar XYZ widget; show me something RELATED yet very different, thank you very much.
3. Show you absolute garbage that is in no way related to anything you like.
---
Exhausting, really, and probably why most people find AI/ML largely useless as it exists today.
From a practitioner's perspective, I view this type of behavior as tuning the model towards the exploitation side of the exploration/exploitation trade-off. I think a lot of recommendation engines do this (looking at you, YouTube) because it's more profitable.
Which at its core is probably an alignment problem in the way the models are evaluated: they are measured on their short-term effects, and on that horizon exploitation wins. But if you look at the long-term effect of recommendations, you really need a healthy dose of exploration to keep your users around.
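To make the trade-off concrete, here's a toy epsilon-greedy sketch (illustrative only, not how any real platform is built): with epsilon near zero the system only exploits whatever it has recently seen you engage with, which is exactly the "three rally videos and now everything is cars" loop; a modest epsilon keeps probing other topics so dormant interests can resurface.

```python
import random

# Toy epsilon-greedy recommender over topics (illustrative only).
# estimated_engagement: topic -> running estimate of how much the user engages.
def recommend(estimated_engagement, epsilon=0.1):
    if random.random() < epsilon:
        # Explore: occasionally surface a topic regardless of its current
        # estimate, so new or dormant interests still get a chance.
        return random.choice(list(estimated_engagement))
    # Exploit: otherwise show the topic with the highest estimated engagement,
    # which with epsilon ~ 0 collapses into "nothing but car videos".
    return max(estimated_engagement, key=estimated_engagement.get)

def update(estimated_engagement, topic, reward, lr=0.1):
    # Nudge the estimate toward the observed engagement (e.g. watch time).
    estimated_engagement[topic] += lr * (reward - estimated_engagement[topic])
```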