Hacker News | new | past | comments | ask | show | jobs | submit | dmacfour's comments | login

For me it's the soundtracks to Deus Ex (basically all the games), Mr. Robot, and Halt and Catch Fire.


That is so specific, I can't believe there is someone else out there who flows to Deus Ex. I've had whole sessions with Human Revolution on repeat. You might also like some synthwave: https://www.youtube.com/watch?v=_Gajv2yJt5M The Moebius FM and Frequency channels have many nice long mixes.


Check out the most recent comment about the paper on OpenReview. This doesn't seem like isolated behavior:

https://openreview.net/forum?id=tO3ASKZlok


I have a background in ML and work in software development, but studied experimental psych in a past life. It's actually kind of painful watching people slap phrases related to cognition onto things that aren't even functionally equivalent to their namesakes, then parade them around like some kind of revelation. It's also a little surprising that there's no interest (at least publicly) in using cognitive architectures in the development of AI systems.


> The transformer architecture absolutely keeps state information "in its head" so to speak as it produces the next word prediction, and uses that information in its compute.

How so? Transformers are state space models.
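To make the disagreement concrete, here is a minimal numpy sketch (mine, not from the thread) of single-head causal self-attention: the output at every position is recomputed as a pure function of the whole context window, with no recurrent hidden state carried between calls, which is the usual contrast drawn with RNNs and state space models.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head causal self-attention. The output is a pure function
    of the whole input sequence X -- nothing is carried between calls."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # causal mask: position t may only attend to positions <= t
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                    # 5 tokens, embedding dim 8
W = [rng.normal(size=(8, 8)) for _ in range(3)]
out = self_attention(X, *W)
print(out.shape)  # (5, 8)
```

Note that position 0 can only attend to itself, so its output is exactly its own value projection; whatever "state" exists is just the cached context, recomputed or re-read on every step.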


There's an absurd amount of astroturfing in discussions about AI. Especially on Reddit.


grounding


"There are two cultures in the use of statistical modeling to reach conclusions from data. One assumes that the data are generated by a given stochastic data model. The other uses algorithmic models and treats the data mechanism as unknown. The statistical community has been committed to the almost exclusive use of data models. This commitment has led to irrelevant theory, questionable conclusions, and has kept statisticians from working on a large range of interesting current problems. Algorithmic modeling, both in theory and practice, has developed rapidly in fields outside statistics. It can be used both on large complex data sets and as a more accurate and informative alternative to data modeling on smaller data sets. If our goal as a field is to use data to solve problems, then we need to move away from exclusive dependence on data models and adopt a more diverse set of tools."

-Leo Breiman, "Statistical Modeling: The Two Cultures," like 24 years ago

Machine learning isn't the native language of biology, the author just realized that there's more than one approach to modeling. I'm a statistician working in an ML role and most of the issues I run into (from a modeling perspective) are the reverse of what this article describes - people trying to use ML for the precise things inferential statistics and mechanistic models are designed for. Not that the distinction is that clear to begin with.
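A toy illustration of the two cultures (my own sketch on made-up data; `knn_predict` is a hand-rolled nearest-neighbour average, not a library call): the data-model culture assumes a generating mechanism and estimates an interpretable parameter, while the algorithmic culture treats the mechanism as unknown and just predicts.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 1))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)

# "Data model" culture: assume y = beta * x + noise, estimate beta.
# The slope is an interpretable quantity you can build inference around.
beta, *_ = np.linalg.lstsq(np.c_[np.ones(200), X], y, rcond=None)
print(beta[1])  # slope estimate, close to the true 2.0

# "Algorithmic model" culture: treat the mechanism as unknown and
# just predict -- here a crude 5-nearest-neighbour average.
def knn_predict(x0, k=5):
    idx = np.argsort(np.abs(X[:, 0] - x0))[:k]
    return y[idx].mean()

print(knn_predict(1.0))  # a prediction, but no interpretable parameter
```

On a linear mechanism both cultures do fine; the disagreements in the thread are about which questions (effect estimation vs. raw prediction) each tool is actually designed to answer.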


Agreed wholeheartedly. I have argued with the VP of our department about this paper quite a few times.

I feel like Breiman sets up a strawman that I've never encountered when I work with colleagues who were trained in the statistics community. That doesn't mean it didn't exist 25 years ago when he wrote it. I concede that we are sometimes willing to make simplifying assumptions in order to state something particular, but it's almost like we've been culturally conditioned to steep everything we say in every caveat possible.

Whereas I am constantly having to point out the poor feedback we've had about some of the XGBoost models despite the fact that they're clearly the most "predictive" when evaluated naively.
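For what it's worth, a small sketch of what "most predictive when evaluated naively" can look like (synthetic data, names mine): a very flexible fit wins on training error, while a holdout set tells a different story.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 60)
y = x + rng.normal(scale=0.4, size=60)       # true relationship is linear

# naive split: first 40 points to fit, last 20 held out
x_tr, x_te = x[:40], x[40:]
y_tr, y_te = y[:40], y[40:]

def fit_poly(deg):
    """Fit a degree-`deg` polynomial; return (train MSE, holdout MSE)."""
    c = np.polyfit(x_tr, y_tr, deg)
    mse = lambda xs, ys: float(np.mean((np.polyval(c, xs) - ys) ** 2))
    return mse(x_tr, y_tr), mse(x_te, y_te)

for deg in (1, 9):
    tr, te = fit_poly(deg)
    print(f"degree {deg}: train MSE {tr:.3f}, holdout MSE {te:.3f}")
```

The higher-degree fit can never lose on training error (the bases are nested), which is exactly why a naive in-sample comparison flatters the most flexible model.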


This is largely my feeling as well.


You’re setting up a dichotomy where there isn’t one. Measuring the impact of changes isn’t any less necessary if you hand UI work off to genAI; if anything, it’s more important.

