I'm using OpenSUSE Tumbleweed, which has this option enabled by default.
Until about a year ago, whenever I tried to download a moderately large file (>4 GB), my whole system would grind to a halt and stop responding.
It took me MONTHS to figure out what the problem was.
It turns out that a lot of applications use /tmp to store files while they're downloading. And many of them don't clean up after a failed download; some don't even move the file into place after a successful one, but instead extract it and copy the extracted files to the destination, leaving even more stuff sitting in /tmp.
Yeah, this isn't a problem if you have 4x more RAM than the size of the files you download. And surely that's the case for most people. Right?
If it's easily reproducible, I guess checking `top` while downloading a large file might have given a clue, since you could have seen that you were running out of memory?
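For anyone who wants to check their own setup, here's a minimal sketch of the diagnosis, assuming a systemd-based distro where /tmp may be mounted as tmpfs (RAM/swap backed):

```
# Confirm whether /tmp is a tmpfs mount and see its size limit and current usage
findmnt -o TARGET,FSTYPE,SIZE,USED /tmp

# Overall memory picture; tmpfs contents count against "shared" memory
free -h

# Live view while a large download is in progress; watch available memory shrink
top
```

If `findmnt` reports tmpfs and available memory drops roughly in step with the download size, files landing in /tmp are eating RAM.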