Yes, a human can hack together a compiler in two weeks.
If you can't, you should turn off the AI and learn for yourself for a while.
Writing a compiler is not a flex; it's a couple of very well-understood problems, most of which can be solved with existing libraries.
Parsing is solved with yacc, bison, or sitting down and writing a recursive descent parser (works for most well-designed languages you can think of).
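To make that concrete, here's a minimal recursive descent sketch in C for a toy expression grammar. Everything in it (the grammar, the function names, the evaluate-as-you-parse shortcut) is my own illustration, not kernel C; a real compiler would build an AST instead of computing values:

    /* Toy recursive descent parser: one function per nonterminal.
       Grammar (hypothetical, for illustration only):
         expr   := term   (('+'|'-') term)*
         term   := factor (('*'|'/') factor)*
         factor := NUMBER | '(' expr ')'
    */
    #include <ctype.h>
    #include <stdio.h>
    #include <stdlib.h>

    static const char *p;           /* cursor into the input string */
    static long expr(void);         /* forward decl: factor recurses into expr */

    static void skip(void) { while (isspace((unsigned char)*p)) p++; }

    static long factor(void) {
        skip();
        if (*p == '(') {            /* '(' expr ')' */
            p++;
            long v = expr();
            skip();
            if (*p == ')') p++;
            return v;
        }
        return strtol(p, (char **)&p, 10);   /* NUMBER */
    }

    static long term(void) {
        long v = factor();
        for (;;) {
            skip();
            if (*p == '*')      { p++; v *= factor(); }
            else if (*p == '/') { p++; v /= factor(); }
            else return v;
        }
    }

    static long expr(void) {
        long v = term();
        for (;;) {
            skip();
            if (*p == '+')      { p++; v += term(); }
            else if (*p == '-') { p++; v -= term(); }
            else return v;
        }
    }

    int main(void) {
        p = "2 * (3 + 4) - 5";
        printf("%ld\n", expr());    /* prints 9 */
        return 0;
    }

Operator precedence falls out of which function calls which; that's the whole trick.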
Then take your AST, translate it to an IR, and feed that into anything that generates code. You could use Cranelift, or you could roll your own.
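The AST-to-IR step is basically a post-order walk. Here's a hypothetical sketch; the Node type, the lower() function, and the textual three-address IR are all made up for illustration (a real backend like Cranelift exposes a builder API rather than consuming text, but the shape of the traversal is the same):

    #include <stdio.h>

    typedef struct Node {
        char op;                  /* '+', '*', ... ; 0 marks a constant leaf */
        long value;               /* only used when op == 0 */
        struct Node *lhs, *rhs;
    } Node;

    static int next_tmp = 0;

    /* Post-order walk: lower both children, then emit one instruction
       combining their temporaries. Returns the id of the temporary
       holding this subtree's result. */
    static int lower(const Node *n) {
        if (n->op == 0) {
            int t = next_tmp++;
            printf("t%d = const %ld\n", t, n->value);
            return t;
        }
        int a = lower(n->lhs);
        int b = lower(n->rhs);
        int t = next_tmp++;
        printf("t%d = %c t%d, t%d\n", t, n->op, a, b);
        return t;
    }

    int main(void) {
        /* AST for (2 + 3) * 4 */
        Node two = {0, 2, NULL, NULL}, three = {0, 3, NULL, NULL};
        Node four = {0, 4, NULL, NULL};
        Node sum  = {'+', 0, &two, &three};
        Node prod = {'*', 0, &sum, &four};
        lower(&prod);
        return 0;
    }

Running it prints five IR instructions, one temporary per subtree; register allocation and instruction selection happen downstream of this.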
AFAIK the Linux kernel strongly depends on GCC extensions and GCC-specific behavior, so maybe that's why this is such an interesting part of the project? Also, extensions like inline assembly seem wildly complicated to add to an existing compiler WHILE replicating the syntax and semantics of another compiler (which has a different software architecture).
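For a taste of why: this is real GCC extended asm syntax (x86-64, trivial example). A GCC-compatible compiler has to parse the template string, understand constraint strings like "+r" and "r", allocate matching registers, and honor the clobber list:

    /* %0 binds to the "+r" read-write operand (a), %1 to the "r"
       input (b); "cc" tells the compiler the condition codes are
       clobbered by the addq. */
    static inline unsigned long add_asm(unsigned long a, unsigned long b)
    {
        asm ("addq %1, %0" : "+r"(a) : "r"(b) : "cc");
        return a;
    }

And that's the easy case; kernel code layers macros, asm goto labels, and memory clobbers on top of this.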
> Parsing is solved with yacc, bison, or sitting down and writing a recursive descent parser (works for most well-designed languages you can think of).
No human being writes a recursive descent parser for "Linux Kernel C" in two weeks, though. And AFAIK there's no downloadable BNF for it that you can hand to a parser generator either; you have to write it, test it, and refine it. And you can't do that in two weeks.
Yes yes, we all know how to write a compiler because we took a class on it. That's like "Elite CS Nerd Basic Admission". We still can't actually do it at the cost being demonstrated, and you know it.
So did most of us; join the club. What you can't do is write such a compiler for $20k if you want to put food on the table, or do it in two weeks (which is roughly what that much of your time costs, at least until AI eats your job). And let's be honest: it's not going to build something of the complexity of Linux either. Hobby compilers run hobby code. Giant decades-old source trees test edge cases like no one's business.
Exactly. And in the case of free software there isn't even competition with a financial incentive, so many projects (though not all) can live a long time without good output. I think many people don't appreciate the usefulness of 'non-useful' things.
The US already provides publicly accessible conjunction-avoidance data based on the data points it has. They don't have the same number of satellites in the sky to make real-time observations in as many different directions, though.
> so, these systems should have existed for decades now.
Dubious. Perhaps if Congress could be persuaded to invest in tons of radio telescopes / radars positioned all around the world, but good luck with that. The space-based approach used by SpaceX is something that presently only SpaceX is equipped to implement. Using star trackers to spot conjunctions only gives you high-quality data on space debris / satellite maneuvers if you have a huge net of star trackers in orbit, and that's something only SpaceX has been able to do.
I fully agree. Not reading your own references should be grounds for banning, but that's impossible to check. Hallucinated references cannot be read, so by definition they should get people banned.
Like when you only need a single table from another researcher's 25-page publication: you'd cite it to be thorough, but it wouldn't be so bad if you didn't read much of the rest of the text. Perhaps none at all.
Maybe one of the most helpful habits isn't reading every reference in detail, but actually looking each one up in the first place?