Hacker News | krtkush's comments

I wonder if it makes sense for an absolute beginner to do the nand2tetris course, since it too includes building a compiler.

I highly recommend nand2tetris to everyone. For me, nothing ever explained the whole domain, from logic gates and the inner workings of a CPU to compilers, better than this course.

I think it's worth mentioning Gustavo Pezzi's lectures at pikuma.com. The one on "Digital Electronics" and the one on "Interpreters & Compilers" really helped me.

On a side note, why is imrozim's comment dead? What in the world is wrong with it? It's perfectly fine IMO.


nand2tetris only requires programming ability at the level of someone who's taken freshman level CS IIRC.

You could take Harvard's CS50 and then tackle it.


Someone claiming to be from the Raycast team said in the original HN thread that they were not aware of the advert and were not involved with it in any manner.


For a deep dive into the topic, I recommend the book Means of Control by Byron Tau [1]

[1] https://www.penguinrandomhouse.com/books/706321/means-of-con...


…and It’s Actually Worse Than You Think


It is not a cartoon; it is an interpretation of the position in Bauhaus style.


Maybe Bauhaus style is cartoony.


They need to fix macOS first. It’s one of the worst OSes I have ever used: apps keep crashing, there are random UI/UX glitches, and bad decisions overall.

I’ll probably ditch Mac if this degradation continues.


Are you primarily using Electron-based apps, or true native macOS apps?

Maybe I’m lucky but I run macOS daily without any problems.

(Yes, there are annoying fit-and-finish issues in the UI, but no issues with stability.)


> Are you primarily using Electron-based apps, or true native macOS apps? Maybe I’m lucky but I run macOS daily without any problems.

There’s an in-between abomination — Catalyst-based apps from Apple (quickly migrated from iOS to macOS). Reminders, Notes and others are downright unnavigable and unusable with a keyboard and are so, so terrible in their UX. It’s a shame that Apple hasn’t spent any effort on fixing those and making them true native macOS apps.

For the last several years, there has been nobody at Apple who has good taste and a deep and committed interest in UX.


Most Apple apps are somewhat bad nowadays. It largely defeats the marketing purpose of the "ecosystem", because the third-party stuff doesn't necessarily integrate the "special sauce" (like Sharing for passing stuff around). So if you end up just running third-party apps that are web-app wrappers or custom UI implementations, it raises the question of whether to even use Apple hardware. Yes, it's top of the line, but it is also very expensive at any given level of performance.

They are just milking their media/dev niches at this point and mostly cater to the common denominator with low expectations, for premium prices.

If you gotta run Chrome, Microsoft Office, Google web apps and the like, it doesn't feel worth it. Meanwhile, the indie app market is insane, with expensive subscriptions for utilities that are basically free elsewhere.

And I lowkey hate what iOS has become. Convoluted and unpredictable. Now ugly as well.


I don't think so. There might be some, but definitely not a majority.

My biggest complaint is with Firefox - it works fine on my older Mac, but crashes on Tahoe and only works after a system restart.


Same here. While Liquid Glass might be a bit distracting, I don't remember the last time I had an app crash. It's been quite a while.

26.1 fixed a lot of the buggy/laggy feeling too.


Same here (except when switching branch in a repo with Xcode open…)


Yes. The impossible-to-disable system services (photoanalysisd and friends) are an abomination of software design.


Try "killall -STOP photoanalysisd"; this pauses the process instead of killing it (killing it would just result in launchd restarting it). You can resume it with "-CONT".


I use macOS daily on different machines and don't have that experience. I also manage many Macs, and I don't hear people reporting this kind of instability to me.


The article says this will apply to macOS as well.


How does one start acquiring skills like these?


Spending a lot of time debugging code. Eventually, the pattern recognizer in your brain will pick out the bugs. The term for this is "code smell".

For example, when I'd review C code I'd look at the str???() function use. They are nearly always infested with bugs, usually either neglecting to add the terminating zero or neglecting to allocate sufficient storage for it.


It is crazy that any time someone works at the application layer and wants to manipulate a string, which is a very, very common thing to do when writing applications, they have to consider \0, which ought to be an implementation detail.

How can that language still be so popular?


Programming is the consideration of implementation details. When you manipulate strings in C you consider the terminating nul byte, just like when you manipulate strings in Python you consider how it stores codepoints, or when you manipulate strings in Swift you think about grapheme clusters. There is no free lunch. (Though, of course, you can get reduced-price lunches based on the choices you make!)


Pardon my ignorance, since I don't know C, but is it true to say that the string "Foo" takes 4 bytes because of the null-terminating byte? Or is there no concept of string length? I could see this getting annoying: since "Foo" is three chars long, you would assume its length is 3, but we could be speaking of the actual number of bytes, in which case I assume it is sizeof(char)*3+1, i.e. the three chars (F, o, o) plus one null byte.


The string length in C is "whatever number of bytes are there between the beginning of the string and the first \0 character". That's different from "how much memory is being used by this string" because you usually allocate a bigger buffer.

The length of the string "Foo", when properly terminated, is 3. The minimum number of bytes needed [1] to represent that string properly is 4 (3+'\0'). The actual number of bytes used by that string is whatever you asked for and received when using "malloc".

[1] Assuming ASCII and 1-byte characters.


strlen("Foo") == 3 but you need 4 bytes to store it.


The language is just fine. The real question is: Why do people not use a string library that abstracts this away safely?


Oh, people tried. Every C programmer tried it. I tried multiple times. They all failed.

Back when I was musing about what D would be like, I happened across some BASIC code. I was drawn to the use of strings, which were so simple in BASIC. I decided that D would be a failure if strings weren't as easy to use as in BASIC.

And D strings turned out to be better than I'd dared hope!

I proposed an enhancement to C to get much of that benefit, but it received zero traction in the C community. Oh well.

https://www.digitalmars.com/articles/C-biggest-mistake.html


Why does the language not provide one?


Because at that time, C's creators didn't know how things would evolve. After all, computers were a new thing.


OK, but the question asks why one isn't provided today.


There are many string libraries.


As you can expect, the answer to your question is the obvious one.


I do not think it is an obvious or trivial question. I think the problem is mostly that there is no money for enhancing the C ecosystem and educating people about the possibilities. The corporate money goes into random new things.


I think most of the money goes to new languages that have a better strings story, yes.


C was popular because, if one is familiar with assembler, it takes about an hour to become adept at programming in it.

It's also an easy language to write a compiler for. At one point I counted over 30 C compilers available for DOS.


Okay, I want to make a desktop app that runs on Linux. Which language should I use? Java?


Some current trendy options would be Kotlin (with Kotlin Multiplatform) or C# (with Avalonia UI).

Edit: I guess I should've at least asked myself if the question was rhetorical.


My problem with "cross-platform" GUIs that run on Linux is that they aren't made to run on the Linux desktop; they are made to run on Android, iOS, Windows, macOS, and finally the Linux desktop.

All I want is a menubar, a toolbar, a statusbar, and some dialog windows. I don't want fading transitions when I click a tab.

It's crazy that I'm forced to write header files just to have a menubar.

Zig 1.0 can't come soon enough.


Wouldn't Qt or GTK be good for this, then?

Or... https://quickshell.org/ ?


Whatever you do, please do not use a language that makes it difficult to provide security updates: https://www.debian.org/releases/trixie/release-notes/issues....


That question is kind of the point I want to make. We live in 2025 and C is still an option for new applications, i.e., the wrong abstraction layer for application-level development.

No doubt there are valid reasons to use it; that is just the state of things, unfortunately.


Because whatever language you think should be popular instead is running on a mountain of C code, but the reverse isn't true.


The D implementation and runtime library has zero C code in it.


And when you run that compiler implementation, what language family was used to implement the OS and kernel it's running on, the firmware you're using, etc.?

That's what I meant, not that self-hosted compilers don't exist.


Lots of C applications nowadays don’t actually use any of the str functions or null termination.


I get the feeling these kinds of skills are very rare because they fall in the category of "understanding and debugging other people's code/mess", while most people prefer to build new things (and often struggle to debug even their own work).

It takes a lot of passion and dedication to security and reverse engineering to get there.


Practice, and having supernatural perseverance (although probably not in that order)

I'd guess the curriculum is half reverse engineering and half reading any write-ups to see the attacks and areas of attack for inspiration


By reading and keeping up with the published work in browser exploit development, replicating it yourself, and then finding you have a knack for spotting vulnerabilities in C++ code.



Read the blogs of the guys creating the bugs.


> I hadn’t even realized that Apple had rolled out Time in Daylight

Unfortunately, it is not the most accurate. If you wear full-sleeved clothing that covers your Apple Watch, it will report wrong numbers.


Yeah unfortunately you have to have the screen exposed for a few minutes each time you go outside.

I spoke to the team at Apple; they said that as long as you get a couple of minutes in the sunlight while you're on a walk, it will use sensor fusion to count the rest of the time outside.


As an Indian living in a western European country, I very much prefer the gray/neutral colors here. I always found the excessive and ugly use of color in India overwhelming. Though, I agree, a bit more color in winter wear would be nice.


I have a similar setup with Tailscale and Nginx Proxy Manager.

