Hacker News · fishtoucher's comments

I think I have a fairly similar experience to the author. It's different in some respects (I'm not aphantasic), but I resonate strongly with the lack of autobiographical memory and the feeling of being an observer in my own history.

I can't count how many times someone's asked me what I did last weekend and I told them I just had a quiet one at home, then later I'm reminded I actually went skiing or did something notable that I completely forgot. It's not just passing conversations with strangers either; this happens with family just as much.

I take a more pessimistic view on how it affects my life than the author does, though. He handwaves the downsides, even saying that he '... had learned my lessons from them [trials in university], even though I forgot how they unfolded.' I understand the desire to think that, but I'm not sure how you can really justify it. The brain definitely learns to compensate in other ways (the really unbalanced cognitive tests I took as a kid are evidence of that), but memory deficiencies are undoubtedly a disadvantage.


This is uncanny; I was going to write almost this exact comment. I've been told mine is due to a deficiency in working memory, which can then lead to the brain not converting things to long-term memory, something ADHDers commonly present with.


I'm in the opposite camp: I also have poor working memory, but I have extremely good episodic memory instead. Good enough that I routinely have to remind other people about our shared experiences. I'd never claim to have eidetic memory, but I've only met a couple of other people whose memory is like mine (one is my mother, so that's kind of cheating).

It's interesting, though, because I identify with a lot of what the article says about spatial and knowledge memory too. I often have to remember where something was in order to "step into" the memory again.


Depression causes similar memory impairments.


> and I told them I just had a quiet one at home, then later I'm reminded I actually went skiing or something

This is wild!


It's a ridiculous and extreme example but it actually did happen. I was going skiing quite often at the time though so maybe it didn't have the sticking power it would've normally.

In general, without some kind of trigger my ability to remember specific episodes is nonexistent. The OP talks about this too, but those times when you have to give three fun facts about yourself, or when someone asks about my hobbies, are moments of existential dread, because I genuinely have no idea what to say.

I realised recently that there are whole years of my life I'm basically missing when I think back. Like I know where I was living, what job I was working, and my general circumstances, just not anything specific I actually did.


I have come back from week-long international trips, having arrived by plane on Monday or Tuesday, and by Thursday someone will ask me what I did last weekend and I'll draw a blank. The social pressure of not having an answer, and of not wanting to pause for a whole minute to reconstruct the last week from memory, makes me say something like, "nothing much, you?". By the second or third day of being back in my routine, I forget what it was like outside of it; I feel so immersed in it day-to-day that unless I remember there was a disruption, I won't really remember anything...


Fuchsia may not be outright dead, but it's definitely on life support and would've been killed a long time ago if senior people at Google weren't personally backing it. It had great foundations, but without a concrete use case or product, development was constantly pulled in different directions. It seemed like every year a new niche for Fuchsia was on the horizon: six months of development time would be dedicated to it, an extremely hacky demo would get the public hyped up, and then the whole thing would be abandoned because it didn't make any business sense. Starnix, for example, has been completely deprecated; there was even a newer system to replace it, which also got cancelled.

* My knowledge is a couple years old at this point and I haven't kept up with recent developments so maybe the future is brighter than I think.


For the record, Starnix has never been cancelled. Source: I work on Fuchsia.


So the hope is that Starnix can emulate Linux syscalls well enough, while gVisor has been abandoned in later Google Cloud stuff because it couldn't emulate Linux syscalls well enough. Uh-huh.


Several operating systems successfully provide a Linux emulation mode. gVisor has very different constraints and requirements. It's also still heavily used and under active development, so I'm not sure how you determined it is abandoned.


Newer generations of Google Cloud services run Linux VMs instead of gVisor.


I work on Starnix and I've never heard of anything meant to replace it. What are you talking about?


They might be thinking about POSIX Lite losing favor


POSIX Lite didn't lose favor. It's still an important part of writing Fuchsia-native software: it enables us to use the C++ and Rust standard libraries with minimal upstream changes. It was never meant to enable running all existing programs, only to lower the barrier. There isn't really much software that has been ported to run on Fuchsia natively. Instead, runners are implemented or ported, and those provide the environment applications require. For instance, a Flutter runner, a web runner (Chromium), and Starnix (a Linux runner of sorts) provide the basis for running many existing applications.


Right, that is the current status

But the historical perspective is that Starnix is a relatively recent addition to Fuchsia. Even though Fuchsia is roughly 10 years old now, Starnix has only been useful for about 2 years (the RFC was 4 years ago).

Before Starnix came along to help run Linux apps, as you said, “There isn't really much software that has been ported to run on fuchsia natively”. Because POSIX Lite wasn’t / isn’t being used much. So I guessed the OP could have been thinking about that. But who knows.


This merging of semantically different concepts into the same language feature was the biggest mistake of Go in my opinion. The gains in simplicity are entirely offset by the losses in correctness and comprehension.

The worst example is pointers since they're used to represent both references and optional values. It's often very difficult to know which meaning a function is using at first glance.
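
To make the ambiguity concrete, here's a small sketch (the `User`, `Rename`, and `Describe` names are invented for illustration, not from any real API): both functions take `*User`, but for entirely different reasons, and the signatures alone can't tell you which is which.

```go
package main

import "fmt"

type User struct{ Name string }

// Rename takes *User because it mutates its argument in place:
// the pointer means "reference semantics".
func Rename(u *User, name string) {
	u.Name = name
}

// Describe takes *User because nil is a meaningful "no user" value:
// the pointer means "optional". The signature looks identical to Rename's;
// only reading the body (or the docs) disambiguates.
func Describe(u *User) string {
	if u == nil {
		return "anonymous"
	}
	return u.Name
}

func main() {
	u := &User{Name: "Ada"}
	Rename(u, "Grace")
	fmt.Println(Describe(u))   // prints "Grace"
	fmt.Println(Describe(nil)) // prints "anonymous"
}
```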


> The worst example is pointers since they're used to represent both references and optional values.

using a pointer as an optional value is an anti-pattern. the correct idiom is to return an extra boolean via multiple return values (the "comma ok" style):

    type positive struct{}
    
    func new_positive(i int) (*positive, bool) {
       if i >= 0 {
          return &positive{}, true
       }
       return nil, false
    }
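
For context, the call site then uses the same comma-ok pattern as map lookups and type assertions. A runnable sketch (repeating the constructor so the snippet compiles on its own; names follow the example above):

```go
package main

import "fmt"

type positive struct{}

// new_positive reports validity through the boolean rather than
// forcing callers to interpret a nil pointer as "absent".
func new_positive(i int) (*positive, bool) {
	if i >= 0 {
		return &positive{}, true
	}
	return nil, false
}

func main() {
	// The ok boolean makes the "is this present?" question explicit.
	if _, ok := new_positive(3); ok {
		fmt.Println("3 accepted") // prints "3 accepted"
	}
	if _, ok := new_positive(-1); !ok {
		fmt.Println("-1 rejected") // prints "-1 rejected"
	}
}
```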


A new operating system done right gets engineers salivating and a lot of very senior staff have personal attachment to Fuchsia.

Personally I think it's a lost cause; every few years they change direction in hopes of finding a concrete use case that doesn't displace some existing Google project. The Nest was Fuchsia's greatest achievement and even that was held together with strings and glue behind the scenes.


> Neurotypical people actually get a kind of high from amphetamines

This is intellectual dishonesty to the point of nausea. Amphetamine is very well understood as far as drugs go, and it affects those with and without ADHD in the same way. Neurotypicals also experience increased focus at therapeutic doses (just ask anyone who's taken it as a study aid), and ADHD sufferers also experience a high from the flood of dopamine. The most damning evidence for me is the absolutely obsessive relationship many ADHD patients have with their medication. It's immediately obvious and unlike any other I know of.

Society (American society in particular) has just decided that the medical benefits of amphetamine outweigh the risks for people with ADHD, and vice versa for those without.


Isn't there evidence that neurotypicals only _think_ they are focussing better and actually aren't? Also note that the 'high' you're talking about would likely only apply at higher doses than ADHDers actually take.


I think you are generally correct, but just because someone disagrees with you, or hasn't seen the same information you've seen, doesn't mean it is intellectual dishonesty.

There are good reasons why this seems to be true even if it isn't. Take a hyperactive person with ADHD, who has poor executive control of motor function, and give them a therapeutic dose of a stimulant: suddenly they can control motor function. On the surface it appears to be almost exactly the opposite of giving someone a high stimulant dose. They appear to be opposite responses, but in fact they are the same type of response; it's just that the ADHD person is regaining a level of executive control that the non-ADHD person already had anyway.


No, what I want is a higher salary and lower tax.


What will you cut from government spending to get the lower tax rate that won't affect anyone negatively?

I know you didn't say this, and I'm not trying to put words into your mouth, but I'd rather have higher taxes and a better safety net than lower taxes and a look-after-number-one, dog-eat-dog world.


Not OP and not necessarily in favour of blanket lower taxes:

Dramatically push up capital gains taxes on non-speculative investments like land ownership, housing and superannuation (capital movement must be slowed here)

Australia has too much unoptimised capital appreciating in these “non-functional” assets which should be moved into more “functional” investments

——————

I use “functional” in the sense that certain asset types have potentially emergent utility over others, e.g. investing in a growing business or a working-age person has a higher probability of high or non-linear growth than linearly appreciating assets like bonds, stocks or land.

Businesses and people “do” things invariant of their quantitative value, while stocks, land and bonds are just fancy ways of dressing up bearer bonds/instruments

Our economic system needs to re-emphasise the importance of systems which provide utility, over allowing static capital to simply syphon into the shortest path to profit.

——————

Disclaimer: I’m not an economist and don’t claim to be


As an Aussie who has spent significant time in the USA: have a look at what lower taxes have done to the quality of life there.

You DO NOT want Australia to be like the USA, trust me.


Australians are getting an income tax cut next year, so that's nice.

