
I'm using OpenSUSE Tumbleweed that has this option enabled by default.

Until about a year ago, whenever I would try to download moderately large files (>4GB) my whole system would grind to a halt and stop responding.

It took me MONTHS to figure out what the problem was.

Turns out that a lot of applications use /tmp to store files while they're downloading. And many of them don't clean up on failure; some don't even move the file to its destination on success, but instead extract it and copy the extracted files over, leaving even more stuff in /tmp.

Yeah, this is not a problem if you have 4X more RAM than the size of the files you download. Surely that's the case for most people. Right?
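If you want to check whether your own /tmp is RAM-backed, something like this should show it (a sketch; `findmnt` is part of util-linux and present on most Linux distros, but yours may differ):

```shell
# Check whether /tmp is a tmpfs mount (backed by RAM/swap) and how big it is.
# If /tmp isn't its own mount point, fall back to df for the parent filesystem.
findmnt -no SOURCE,FSTYPE,SIZE /tmp || df -h /tmp

# tmpfs defaults to half of physical RAM, so compare against total memory:
free -h
```

If the filesystem type comes back as `tmpfs`, every byte a downloader parks in /tmp is competing with your applications for memory.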



How did you figure out that this was the problem?

If it's easily reproducible, I guess checking `top` while downloading a large file might have given a clue, since you could have seen that you're running out of memory?


I was trying to solve another problem related to mounting, ran `df -h` a couple times and noticed that:

  1) tmpfs is mounted to /tmp
  2) available size on /tmp is very low
  3) my free ram indicator in status bar is red
And then I tried downloading some files while watching htop. It was immediately obvious that this was what was causing the hangs.
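The three observations above can be reproduced with a couple of standard commands (a sketch; htop shows the same numbers, these just make them explicit):

```shell
df -h /tmp    # 1) Filesystem column says "tmpfs"; 2) Avail is nearly zero
free -h       # 3) files sitting in a tmpfs /tmp count toward the "shared"
              #    column, which eats into "available" memory

# Find the leftover partial downloads taking up the space:
du -sh /tmp/* 2>/dev/null | sort -rh | head
```

Deleting the stale files frees the RAM immediately, since tmpfs pages are released as soon as the files are unlinked.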



