
I think the biggest mistake people make when thinking about mathematics is assuming that it is fundamentally about numbers.

It’s not.

Mathematics is fundamentally about relations. Even numbers are just a type of relation (see Peano numbers).

It gives us a formal and well-studied way to find, describe, and reason about relations.


The most commonly used/accepted foundation for mathematics is set theory, specifically ZFC. Relations are modeled as sets [of pairs, which are in turn modeled as sets].
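For concreteness, a minimal sketch of those encodings in LaTeX (the standard ZFC constructions; nothing here beyond what the sentence above states):

    % Kuratowski ordered pair: pairs need nothing beyond sets.
    (a, b) := \{\{a\}, \{a, b\}\}
    % A relation is then just a set of such pairs:
    R \subseteq A \times B
    % Even the naturals reduce to sets (the von Neumann encoding,
    % which satisfies the Peano axioms):
    0 := \varnothing, \qquad n + 1 := n \cup \{n\}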

A logician / formalist would argue that mathematics is principally (entirely?) about proving derivations from axioms - theorems. A game of logic with finite strings of symbols drawn from a finite alphabet.

An intuitionist might argue that there is something more behind this, and we are describing some deeper truth with this symbolic logic.


Vast piles of mathematics exist without any relational objects, and not exclusively in the intuitionistic sense either. Geometers say it's about rigidity. Number theorists say it's about generative rules. To a type-theorist, it's all about injective maps (with their usual sense of creating new synonyms for everything).

The only thing these have in common is that they are properties about other properties.


You just said the same thing as GP, but it sounds like you’re trying to argue with them about it.

Perhaps there’s a math formula to describe the relation between your messages’ properties.


But that thing the property comes from is called a number, isn't it?

I prefer a more direct formulation of what mathematics is, rather than what it is about.

In that case, mathematics is a demonstration of what is apparent, up to but not including what is directly observable.

This separates it from historical record, which concerns itself with what apparently must have been observed. It also separates it from literal record, since an image of a bird is a direct reproduction of its colors and form.

This separates it from art, which (over-generalizing here) demonstrates what is not apparent. Mathematics is direct; art is indirect.

While science is direct, it operates by a different method. In science, one proposes a hypothesis, compares against observation, and only then determines its worth. Mathematics, on the contrary, is self-contained. The demonstration is the entire point.

3 + 3 = 6 is nothing more than a symbolic demonstration of an apparent principle. And so is the fundamental theorem of calculus, when taken in its relevant context.


To form or even to define a relation you need some sort of entity to have a relation with.

My wife would probably have gone postal (angry-mad) if I had tried to form an improper relationship with her. It turns out that I needed a concept of woman, girlfriend and man, boyfriend, and then had to navigate the complexities involved in invoking a wedding to turn the disjoint sets of {woman} and {man} into the set of {married couple}. It also turns out that a ring can invoke a wedding on its own, but in many cases way more complexity is required.

You might start off with a much simpler case, with an entity called a number. How you define that thing is up to you.

I might hazard that maths is about entities and relationships. If you don't have a notion of "thingie", you can't make it "relate" to another "thingie".

It's turtles all the way down and cows are spherical.


A former Wikipedia definition of mathematics: "Mathematics is the study of quantity, structure, space and change."

Current definition:

"Mathematics is a field of study that discovers and organizes methods, theories, and theorems that are developed and proved for the needs of empirical sciences and mathematics itself."

In order to understand mathematics you must first understand mathematics.


Only mathematics can define objects in a non-recursive way. Human language can't (Münchhausen trilemma).

> I think the biggest mistake people make when thinking about mathematics is that it is fundamentally about numbers. It’s not. Mathematics is fundamentally about relations.

Eh, but you can also say that about philosophy, or art, or really, anything.

What sets mathematics apart is the application of certain analytical methods to these relations, and that these methods essentially allow us to rigorously measure relationships and express them in algebraic terms. "Numbers" (finite fields, complex planes, etc) are absolutely fundamental to the practice of mathematics.

For a work claiming to do mathematics without numbers, this paper uses numbers quite a bit.


I think of pure math as choosing a set of axioms and then proving interesting theorems with them.
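As a tiny illustration in Lean 4 (where Nat is built up Peano-style, so arithmetic facts reduce to computation):

    -- Given the inductive definition of Nat and of +,
    -- this theorem is discharged by pure computation:
    example : 3 + 3 = 6 := rfl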

Prime numbers are the queens/kings of mathematics though.

This is one of the areas where memorization/deep familiarity with material is important.

Sometimes, when I have a difficult problem, I will spend time reading up as much on the principles of the problem and then go to bed.

Sometimes, I wake up with the answer.


Actually, we have the descendants of both with us now, and they are roughly the same size in terms of spectatorship:

Circus Maximus - NASCAR - 250,000 spectators

Colosseum - football - 50,000-80,000 spectators


Brings back memories.

Here's my recollection, with a focus more on C++ features:

I started with Visual C++ 1.0. There were no templates.

Visual C++ 2.0 had templates.

There was no Visual C++ 3.0. They went straight to 4.0 to sync with the MFC (Microsoft Foundation Classes) version numbers.

IIRC, you could use a bit of the STL with 4.0.

Visual C++ 5.0 brought mainly optimizer changes.

Visual C++ 6.0 was actually pretty good. However it lacked partial template specialization. I was a Boost author at the time and lots of Boost code had specific workarounds for Visual C++ 6.

Visual C++ 2002 also had no partial template specialization.

Visual C++ 2003 was the first version with partial template specialization and that could compile all of Boost.

Visual C++ 2005 and 2008 did not have many changes.

Visual C++ 2010 tried to get back to 6 in the IDE (it was deliberately marketed as such). It also had some C++11 features, but no variadic templates.

Visual C++ 2012 had no variadic templates.

Visual C++ 2013 was the first with variadic templates.

Nowadays, Visual C++ is doing much better at tracking the C++ standard and often has compiler and library features before Clang and GCC.
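For anyone who didn't live through it, here's a minimal sketch (in present-day standard C++, nothing MSVC-specific) of the two features mentioned above. The first half is the kind of code Boost had to work around on VC6 and VC2002; the second half needs VC2013 or later:

    #include <cstddef>
    #include <iostream>

    // Primary template: assume T is not a pointer.
    template <typename T>
    struct is_ptr { static constexpr bool value = false; };

    // Partial specialization for T*: the feature VC6/VC2002 lacked.
    template <typename T>
    struct is_ptr<T*> { static constexpr bool value = true; };

    // Variadic template: the feature that first arrived in VC2013.
    template <typename... Args>
    constexpr std::size_t arity(Args&&...) { return sizeof...(Args); }

    int main() {
        std::cout << is_ptr<int>::value << ' '       // prints 0
                  << is_ptr<int*>::value << ' '      // prints 1
                  << arity(1, 2.0, "three") << '\n'; // prints 3
    }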


https://www.youtube.com/watch?v=ncHmEUmJZf4

This is a fun presentation by Matthew Kulukundis (designer of Google's hash table), with Hyrum Wright (of Hyrum's Law fame) offering objections from the crowd about its design and rollout at Google.
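For context, the table discussed in the talk shipped (if I remember correctly) as absl::flat_hash_map. A minimal usage sketch, assuming the Abseil library is available:

    #include <iostream>
    #include <string>

    #include "absl/container/flat_hash_map.h"

    int main() {
        // Mostly a drop-in replacement for std::unordered_map, but with
        // a flat, cache-friendly layout (control bytes + slots in one
        // array) and no pointer stability across rehashes.
        absl::flat_hash_map<std::string, int> counts;
        counts["swiss"] = 1;
        counts.try_emplace("tables", 2);

        for (const auto& [key, value] : counts) {
            std::cout << key << " -> " << value << '\n';
        }
    }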


I think that is what this article is addressing.

You can never have a thorough protocol that always works.

Doctors need to be trained in the limits of protocol.

This is why the humanities are important. Doctors should not just be unthinking executors of protocols. Trained human intuition, expertise, experience still matters. Knowledge of the human factors still matters.


Crucially, I don’t think the humanities are the solution here. I’d rather have a doctor who follows best practices than one who follows the supposed benefits of a humanities education — in particular, I object to the idea that a doctor would follow their intuition when there is an established best practice. The humanities are valuable, but I don’t think they solve the problem statement laid out in the article.


Bioethics fall under the humanities!


A lot of it is just "being human", and you don't need separate college education for that.

Trouble is, too many doctors have internalized the ideal of being efficient robots instead.


There are lots and lots of different ways of "being human" - an infinity of ways, really, but most fall into broadly recognizable patterns. The Humanities, properly understood, are the study of "being human" - which involves both the way you experience this (which, yeah, for most people is a learned behavior: it's hard to get outside your own perspective and evaluate your own experience), and also the way others' perspectives influence them.

There's no "just" about it. (It's like saying "Facebook is just a CRUD app, right?" - which from one point of view might be literally true, but is hardly relevant to any of the problems Meta has to solve.) Much like tech, the humanities are a path of life-long learning, for which a college course of study can be a helpful starting point (though it isn't strictly necessary), but is hardly adequate.

Efficient robots, though: yes. Many who work in tech have also internalized that mind-set.


> Several years ago, I was involved in a case that illuminates the difficult position many doctors today find themselves in. The patient was pregnant, close to delivery, and experiencing dangerous declines in her baby’s heart rate. She had been on a blood thinner, which kept me, the anesthesiologist, from placing an epidural in her back. She also had strange airway anatomy, which would make it a struggle to put her to sleep quickly if an emergency cesarean section became necessary. I advised the obstetrician to perform an elective cesarean section now, in advance, while we had good working conditions, and not to wait for an emergency, where time is of the essence, and where the delay needed to induce general anesthesia might seriously injure the baby.

I am a doctor and that scenario scares me. This has a very high likelihood of stuff hitting the fan and you need to think about your plan when it does.

You want stuff to hit the fan during the daytime, when everyone is around. In this case, during the day, surgery is around, ENT is around, and other anesthesiologists are around; all of these can rush in if needed to help you secure an airway. You also have the neonatologists around.

If it happens in the middle of the night, the staffing will be much reduced and you won’t have as many resources available.

One of the most important things to learn as a doctor is when algorithms and guidelines actually apply to the current situation.

“Life is short, the art long, opportunity fleeting, experiment treacherous, judgment difficult”

- Hippocrates


When I had a stat section late at night and the nurse who was circulating the case (for the non-medical, a “circulator” is a nurse whose job is to get whatever is needed to make the surgery happen smoothly) didn’t know how to hook up the Glidescope (um, the best airway-securing device ever) while I’m trying to mask-ventilate a full-term patient and save the baby (you don’t want to mask-ventilate highly pregnant patients; their stomachs empty slowly and they are at high risk for vomiting and then inhaling it; you want a tube straight past their vocal cords so that the lungs are protected), I went to the nurse manager on the next regular day and said that not knowing what a Glidescope is and how to set it up was an unforgivable lack of knowledge. I don’t directly blame the nurse; she was thrown into a situation she had not been trained for. I blame those who didn’t teach her before putting her on night shifts with very few other nurses around.

“This is a chance to do this case electively, in a controlled manner, rather than in a situation in which Bad Things are monumentally more likely to occur. At noon, I can have all the help in the world. At two AM, it’s me, and I only have two hands and one brain.”

As I have said in codes before, I’m eventually out of ideas, so if you have one that we haven’t tried yet, talk. I will not judge you as dumb. I may not do it, but I will listen and consider it seriously before making that call.


Ho bíos brakhús, hē dè tékhnē makrḗ, ho dè kairòs oxús --- Ὁ βίος βραχύς, ἡ δὲ τέχνη μακρή, ὁ δὲ καιρὸς ὀξύς


Ah, it's Romanized Greek. Yes, I've forgotten most of the Classical Greek I took in college but I'm not as stupid as I sometimes feel.

https://en.m.wikipedia.org/wiki/Ars_longa,_vita_brevis


Apart from networking, the hourglass design is common in many other domains.

Electricity is an hourglass. Coal plants, solar panels, gas turbines, wind turbines, nuclear power plants all produce electricity. This is then consumed by electric cars, computers, washing machines, etc.

LLVM IR is an hourglass. Many compilers and languages produce LLVM IR. This is then converted to many different instruction sets.

I think if you want many-to-many non-coupled relationships, you will end up with some sort of hourglass design eventually.
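A minimal sketch of the shape in C++ (all the names here are hypothetical, purely for illustration): the waist is a single type both sides agree on.

    // The narrow waist: one type every producer emits and every
    // consumer draws. Neither side knows the other exists.
    struct Power { double watts; };

    struct Producer {
        virtual ~Producer() = default;
        virtual Power generate() = 0;
    };

    struct Consumer {
        virtual ~Consumer() = default;
        virtual void consume(Power p) = 0;
    };

    struct SolarPanel : Producer {
        Power generate() override { return Power{350.0}; }
    };

    struct WashingMachine : Consumer {
        void consume(Power) override { /* spin the drum */ }
    };

    int main() {
        SolarPanel plant;
        WashingMachine appliance;
        appliance.consume(plant.generate());
    }

Adding a new plant or a new appliance touches only one side of the waist: N producers plus M consumers, instead of N times M pairwise adapters.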


POSIX is also an hourglass, right? It sets expectations for apps about how the OS is interfaced with, and expectations for how an OS must be shaped so that POSIX-compliant apps can interface with it.

Details may vary, but that baseline makes it much easier to, for example, have emacs on Windows, Mac, and every flavor of Linux under the sun.


Interfaces are the narrow waists of potential hourglasses: many consumers, many producers.

Any market is an hourglass too.


I personally find software development to be enchanting and am often thankful to have a job like this.

In the comfort of my home, I get to make creations where I am limited only by my knowledge, imagination, and attention. I can experiment easily and explore completely different designs with an ease that any other discipline would be envious of. And then I can give millions or billions access to my creation with ease.

That seems pretty enchanting and almost god-like and magical to me.


Seems as good a place as any to ask: how do I get back to this state?

I used to feel exactly this way, but after 3 years now slinging code for money I’ve become very disillusioned. Anyone have any tips for finding the magic again?


Yes I also went through this state.

There are a couple of unlocks:

Do you remember how cool building software seemed when you were starting out? The only problem back then was that you couldn't actually build it yet. Well, now you can; most people don't realize that. You can create the software for your loved one, or friend, or yourself that you always wanted.

The thing is to take it slow. Use a language you always wanted to use. Change that color scheme. And when you type, do so slowly, cherishing each keystroke.

We take so much for granted. The fact that writing text can make atoms move about and transistors turn on or off millions of times a second is just astonishing. You are taking part in humanity's most sacred rituals. Give it its due, and soon the love will come back.


Write a lecture about best practices to avoid burnout.

Computer science is such a new human endeavor that there is so much uncharted territory about what the future of training the next generation looks like.

Obviously, some pretty bad habits led to the massive amount of burnout I am seeing amongst computer scientists.


Find another profession and make programming a hobby instead.

Some fortunate people can make their professional life also their passion and derive fulfillment from it, but not everyone can be like that. For everyone else, work makes damn near anything miserable, even if it's something you otherwise love.

You loved programming before you made money from it, and you hate programming now that you make money from it. The problem is clearly money and you need to decouple it from programming.

Yes, I'm aware changing your line of work is easier said than done, but that's the only proper way of resolving your conflict, in my opinion.


i did this, i get much more enjoyment from computing and tech without all the tedium, and i can dream of projects i want to work on (and sometimes do them :p) - only for my own enjoyment ofc.


start with creating a smol indie game you yourself would want to play


> after 3 years now slinging code for money I’ve become very disillusioned

.. with hardware/software or the systems of humans which create them?


Both. It feels even now like we're just monkeys sitting on a tower of cards swinging around tools we don't understand. There are a lot of very smart people that have built the software and hardware we have today, and lots of smart people coming in to replace them; but the majority of engineers are not that, and (myself included) largely pretend at being wizards while in reality we are all the sorcerer's apprentice.

The systems we've built for society are even worse. For every story about how tech is improving lives, you hear another two about yet another horrible dystopian usecase for the same technology, and it's never hard to imagine a dozen more.


> We are all the sorcerer's apprentice

Back in the mists of time, on day 1 of a BigCo internship, the incoming cohort of interns was briefed by the outgoing cohort, without managers present. One of their pearls of wisdom was that a great manager (and by extension, team) could lead and defend an oasis of excellence in a desert of industry detritus. Today, this dynamic can also be seen in serial founding teams and spinoff projects.

If time is more spiral than linear, then cyclical patterns are not merely swings of a pendulum, but a recursive opportunity for redemption or relapse. In that worldview, merely waiting would be enough to experience a different set of local maxima. But human actors with agency can move to new contexts instead of waiting for cyclical change, or work with peers to spark new cycles.

Stargate/Fringe/Eureka posit multiple universes and versions of ourselves, each universe further having multiple timelines. If it were possible for us to see an infinite tapestry of futures, would we be invigorated or paralyzed by choice? Looking in the other direction, we have access to the near-infinite, unfinished work of our ancestors, some of whom deserve the sorcerer moniker.

One way to gain perspective on local minima/maxima is to step back to a different scale of time or space. The classic film "Powers of 10" [1] offers a spatial perspective. There are history of technology books which span decades to centuries, showing recurring patterns of both good and bad. One tiny example is the decoding of the Maya language, which was unsolved for centuries, then cracked by an art history teacher on vacation.

There is a 1930s sci-fi book [2] which spans 2 billion years and many generations of humans going through cycles of technology and social structures. The successor book has an even longer timeframe. As a thought exercise, it leaves one with the impression that everything has already happened before, and will happen again, but slightly differently in future spirals of time and space. Where does that leave mortal humans prioritizing action within finite lifespans?

If we can't find a sorcerer from whom to learn, we can choose which sorcery we want to teach. Be the change.

> For every story about how tech is improving lives, you hear another two about yet another horrible dystopian usecase for the same technology, and it's never hard to imagine a dozen more.

If one looks into the history of tech, they can find initially benevolent tech that was twisted into dystopian purposes. But there are also examples of tech designed for one purpose, later adapted to positive use. Even hostile social media can be tamed by manually curated lists of non-hostile writers. With the proliferation of open-source, it has never been easier to Embrace & Extend, or harder to choose a direction.

If you have the space/time/tools to build something tangible that fuses software with the physical world, it can combine the dream-catcher inspiration of software with physical constraint. If lacking inspiration, recreate a pioneering tech demo or meet a hyper-local need, then extend that foundation.

[1] "Powers of 10", https://www.youtube.com/watch?v=0fKBhvDjuy0

[2] https://en.wikipedia.org/wiki/Last_and_First_Men

[3] "Only those questions that are in principle undecidable, we can decide." --Heinz von Foerster https://news.ycombinator.com/item?id=8018832


Get a hobby


All hail the Faustian screen ;)


What is really interesting is looking at the meta-analysis cited in the Vox article:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3932762/

This reaches the conclusion that beta blockers are harmful. However, if you look at the meta-analysis, specifically figure 2, you find that the conclusion is mainly driven by a single trial: the 2008 POISE trial.
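To see why one trial can dominate a pooled result, recall the usual fixed-effect (inverse-variance) pooling, sketched here in LaTeX:

    \hat{\theta}_{\text{pooled}} = \frac{\sum_i w_i \hat{\theta}_i}{\sum_i w_i},
    \qquad w_i = \frac{1}{\widehat{\operatorname{Var}}(\hat{\theta}_i)}

A large trial like POISE has a far smaller variance than the small trials, so its weight can outstrip all the others combined, and the pooled estimate moves toward it.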

If you go to the POISE trial: https://www.thelancet.com/journals/lancet/article/PIIS0140-6...

You find that they discovered fraud in at least some of the hospitals:

" Concern was raised during central data consistency checks about 752 participants at six hospitals in Iran coordinated by one centre and 195 participants associated with one research assistant in three of 11 hospitals in Colombia. On-site auditing of these hospitals and cases indicated that fraudulent activity had occurred. Before the trial was concluded, the operations committee—blinded to the trial results at these hospitals and overall—decided to exclude these data (webappendix 1). "

We have an important question - should pre-op patients be given beta blockers - and the largest, most definitive trials to answer that question have at least some taint of fraud.

