I understand that you like zero values as an alternative to initializing object references to nil/null the way Java does. I think the parent comment explains pretty well what the issues with this approach are. Taking Rust as an example (because that's what I'm most familiar with), it's possible to simply enforce that variables are initialized explicitly or that safe constructors are provided, avoiding all the nil safety issues.
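To illustrate what I mean, here's a minimal Rust sketch (my own example, the type names are made up): the compiler rejects use of an uninitialized variable, every field has to be supplied explicitly rather than defaulting to a zero value, and "no value" has to be opted into with Option and handled at the use site.

```rust
struct Config {
    url: String,
    retries: u32,
}

fn main() {
    // let c: Config;            // declared but never initialized...
    // println!("{}", c.url);    // ...the compiler rejects this use outright

    // Explicit construction: every field must be provided, no silent zero values.
    let c = Config { url: String::from("https://example.com"), retries: 3 };

    // Absence is spelled out with Option and must be handled where it's used.
    let maybe: Option<Config> = None;
    match maybe {
        Some(cfg) => println!("{} ({} retries)", cfg.url, cfg.retries),
        None => println!("no config provided"),
    }

    println!("{} ({} retries)", c.url, c.retries);
}
```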
I'm currently in uni and most courses are 3 credits, meaning 45 hours of lectures per semester plus 90 hours of tutorials, assignments, and studying.
Those 45 hours of lectures are usually condensed material with little to no time to practice. It's expected that you practice during the other 90 hours (and on your own time if you plan on having straight A's).
While you may get more hands-on experience in a few months of working full-time, you usually learn much fewer concepts.
All of those hours of practice merely cover what was already gone over in lecture, except largely unguided, and you're not really expected to go back and correct your mistakes once the assignment is graded and handed back to you.
Once you get into a job, you're constantly revisiting past mistakes and doing new things, all the while with coworkers who are helping you; they don't want to wait for you to make a mistake, they want to help you get it right the first time if you need the help.
Uni courses rarely cover real-world knowledge that you will use on the job. Aside from some specialized jobs, most of what they teach you is either too low-level or mostly useful as background knowledge. Many practices aren't covered in college courses; even things as simple as version control have only recently started to become common in coursework.
You're going to be learning a lot on the job, and at a decent job what you learn will make what you went through in college pale in comparison.
> you're constantly revisiting past mistakes and doing new things, all the while with coworkers who are helping you; they don't want to wait for you to make a mistake, they want to help you get it right the first time if you need the help.
I genuinely don't think most workplaces actually resemble this ideal. Sometimes you learn ... plenty of times you do something repetitive. Sometimes you don't even learn about your own bugs (hello agile). And sometimes they give you great advice, and plenty of times they just don't.
In the U.S., MedlinePlus [0] seems to be the best repository for vetted medical information. It's government-backed and links to the relevant organizations where appropriate.
1. I think it's good enough for concurrency. A big selling point of linear types is that, since there can only be a single reference to a linear value at any time, concurrency is trivial. You just send linear values to other threads over a channel, and they're consumed from the point of view of the sending thread (see the sketch after this list).
2. The set of types is divided into two universes: the Free (i.e., unrestricted) universe, containing types like bool, int, and records and unions of other free types; and the Linear universe, containing linear types. So non-linear types are allowed and are the default for anything that's not a resource with a particular lifecycle (i.e., anything other than memory, file handles, socket handles, that kind of thing).
3. References are possible and they work like borrowing in Rust.
4. No refcounting or GC; deallocation points are determined at compile time, like in Rust.
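Since the points above keep comparing to Rust, here's a rough analogy in Rust's ownership terms (my own sketch, not the linear language's actual syntax): sending an owned, non-Copy value over a channel moves it, so the sending thread can no longer touch it, which is essentially the "consumed from the sender's point of view" property in point 1; references cover point 3; and the value is freed at a statically determined point, as in point 4.

```rust
use std::sync::mpsc;
use std::thread;

// A "resource-like" type: not Copy, so the compiler tracks its ownership.
struct FileHandle {
    path: String,
}

fn print_path(h: &FileHandle) {
    println!("handle points at {}", h.path);
}

fn main() {
    let (tx, rx) = mpsc::channel::<FileHandle>();

    let handle = FileHandle { path: String::from("/tmp/data.txt") };

    // Sending moves the value: the sender can no longer use `handle`,
    // i.e. it is consumed from the sending thread's point of view (point 1).
    tx.send(handle).unwrap();
    // println!("{}", handle.path); // error[E0382]: use of moved value

    let worker = thread::spawn(move || {
        let received = rx.recv().unwrap();

        // Borrowing: a temporary reference, no ownership transfer (point 3).
        print_path(&received);

        // When `received` goes out of scope here, it is freed at a point the
        // compiler determined statically: no refcounting, no GC (point 4).
    });

    worker.join().unwrap();
}
```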
I'm curious to know how much bias there is against smaller languages.
In my experience reading job postings, listings for roles where you may work with more "niche" technologies often include more common technologies alongside the ones they're actually looking for, and expect you to learn the niche technology during onboarding.