
I can believe the author. I have a fast internet connection, but I have DNS set up to resolve over DoH, routed through Tor.

This is where the idea of loading everything from 7 different CDNs before a page can show anything at all falls on its head.

My uncached domain resolve times are in the range of 1-10 seconds. So in an unhappy scenario, when things chain up - because a page needs one resource before it can decide that another resource from another domain needs to be loaded, and so on - I can easily wait 10-30s for a web page to load. Combine that with the idiotic FOUC prevention that many websites' designers fall for, and it means staring at white space the entire time until some stupid web font loads, despite the actual text content having been loaded for most of that time.
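A toy model of why the chaining hurts so much: each newly discovered domain pays a fresh, uncached resolve before its resource can even start downloading, and the resolves can't overlap because each URL is only discovered after the previous resource is fetched and parsed. This is a minimal sketch with made-up numbers, not a measurement of any real site:

```python
def chained_load_time(resolve_s: float, fetch_s: float, chain_depth: int) -> float:
    """Total stall time for a serial dependency chain of resources,
    each on a new uncached domain (hypothetical model, not a benchmark)."""
    # Serial chain: resolve + fetch must complete before the next
    # URL is even known, so the costs add rather than overlap.
    return chain_depth * (resolve_s + fetch_s)

# 5 s per uncached resolve (mid-range for DoH over Tor) and a
# 3-deep chain already stalls the page for over 15 seconds.
print(chained_load_time(resolve_s=5.0, fetch_s=0.2, chain_depth=3))
```

With everything served from one origin, the same page would pay that resolve cost once instead of once per domain in the chain.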

So if the wifi is combined with high-latency DNS, it's certainly possible. Not everything is about raw throughput.



I'm not disputing that website developers do stupid shit all the time. We know that's true. I'm disputing that it has anything to do with React.

I've seen websites that are driven by vanilla JS that use 30MB images for backgrounds. I don't claim that's a reason to use a JS framework.



