Hacker News

That's the future but raycasting currently has some major drawbacks... namely that everything looks like a noisy mess when you look around.

Curious if anyone has plotted GPU compute increases against display resolution + update frequency increases? When do the two lines cross?



I'm fairly certain that pixelpoet is referring to direct raycasting, not raytracing in the sense of light transport simulation. Direct raycasting isn't noisy.


I got nice and downvoted for it too, serves me right.

The noisy / Monte Carlo one is path tracing (which I've been doing since age 15 or so, and commercially for over a decade), and that's indeed not what I meant. But I guess all the expert gfxcoders on HN have done the efficiency analysis versus rasterisation.
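The distinction being drawn here can be sketched in a few lines: direct raycasting solves a deterministic intersection equation (same ray in, same answer out, so no noise), while path tracing estimates an integral by averaging random samples, which leaves visible variance at low sample counts. A minimal illustrative sketch, not production code; the function names are hypothetical and the "lighting" integrand is a toy stand-in:

```python
import math
import random

def raycast_sphere(origin, direction, center, radius):
    """Direct raycasting: deterministic ray-sphere intersection.
    Assumes `direction` is normalized. Returns nearest positive hit
    distance t, or None on a miss. No randomness, hence no noise."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c_term = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c_term
    if disc < 0.0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def estimate_irradiance(n_samples):
    """Monte Carlo flavour of path tracing: average random samples of a
    toy integrand (a cosine term over a random elevation angle). The
    estimate converges as n_samples grows; small n_samples = noise."""
    total = 0.0
    for _ in range(n_samples):
        total += math.cos(random.random() * math.pi / 2.0)
    return total / n_samples
```

Calling `raycast_sphere` twice with the same ray always returns the same distance; calling `estimate_irradiance(16)` twice returns two different values, which is exactly the per-pixel noise being discussed.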

Meh, I always have to remind myself how bad it is here for gfx stuff, might as well have been discussing cryptocurrency...



