Hacker News | localuser13's comments

With enough layers you will also weed out almost all of the good actors. Normal people are busy and don't have the time or patience to jump through too many hoops to promote their cool new research, or to respond in a thread where someone linked it.

Reddit has more friction when signing up, or when posting while your account is new or low-karma.

The main subreddits will basically shadowban you until your account is aged and has more than X karma.


This is why I don’t create a Reddit account or post there: there are so many rules that dissuade new accounts. I don’t even bother to try.

Reddit is fantastic, to me. It's worth the struggle to get past the initial bullshit.

There are a lot of flaws, though. Their appeal system is very broken, for instance.


Which in itself is annoying, IMO. It creates a whole separate set of problems. You need karma, so people post in karma-farming subs to get a few crumbs. Then you get auto-banned from a dozen of the top subreddits preemptively for farming.

Reddit hasn't been overrun by bots yet, for the most part, although I don't know how long it can hold out.


Maybe not overrun by spam, but the number of bots I see on popular subs is definitely not zero.

You don’t have a choice.

We live with GenAI, and the human to bot ratio is now leaning in a different direction. The old norms are dead, because the old structures that held them up are gone.

This "more hoops means losing participation" idea on this thread keeps assuming that the community is unaffected by the macro trends.

It’s weirdly positing that HN posts and users are somehow immune to those trends.


If I understand correctly, they are literally giving things away for free for a six-month period, and we are complaining that they don't promise it will stay free forever?

No, you did not understand correctly. They are not “literally giving things away for free”; they are providing a very conditional free trial, which is a business decision and nothing new. Then a commenter speculated they might extend that program because they didn’t say they won’t, and I pointed out it doesn’t make sense to assume they will. No one in this immediate thread made any complaint; we’re discussing the facts of the offering.

If you want your code to run, you need a Python interpreter that supports the newest of your dependencies. You may not use features that came after 3.6 (though you obviously do), but even if just one dependency or sub-dependency uses a Python 3.10-specific feature, you now need an interpreter at least that new.

That is true, and it is also a huge pet peeve of mine. If more library maintainers showed some restraint in using the newest and hottest features, we'd have much less update churn. But on the other hand, this is what keeps half of us employed, so maybe we should keep at it after all.

Hundreds; this is one of the most famous functional programming exercises (mostly thanks to SICP). Just check https://github.com/search?q=scheme%20compiler&type=repositor... (858 results on GitHub).

Is it? Gemini 3-pro-preview and 3-flash-preview, ranked #2 and #3 respectively, had 44% and 37% true positive rates and whopping 65% and 86% false positive rates. That is worse than a coin toss. Anything more than 0% (3% to be generous) is useless in the real world. This leaves only Grok and GPT, with 18%, 9% and 2% success rates.
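A back-of-the-envelope sketch of why a false positive rate that high is fatal: precision collapses once you account for the base rate. The 1-in-100 base rate of backdoored binaries below is a hypothetical assumption, not a figure from the article.

```python
# Precision of a detector given its true/false positive rates,
# under an assumed base rate of backdoored binaries.
def precision(tpr, fpr, base_rate=0.01):
    true_pos = tpr * base_rate           # backdoored and flagged
    false_pos = fpr * (1 - base_rate)    # clean but flagged anyway
    return true_pos / (true_pos + false_pos)

# Gemini 3-pro-preview figures from above: 44% TPR, 65% FPR.
print(round(precision(0.44, 0.65), 3))  # 0.007: almost every flag is a false alarm
```

At that precision, an analyst triaging the flags would look at roughly 150 clean binaries for every real backdoor.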

In fact, this is what authors said themselves: "However, this approach is not ready for production. Even the best model, Claude Opus 4.6, found relatively obvious backdoors in small/mid-size binaries only 49% of the time. Worse yet, most models had a high false positive rate — flagging clean binaries." So I'm not sure if we're even discussing the same article.

I also don't see a comparison with any other methodology. What is the success rate of ./decompile binary.exe | grep -E "(exec|system|/bin/sh)"? What is the success rate of state-of-the-art alternative approaches?
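The grep-style baseline above can be sketched in a few lines of Python (the filename and the pattern are illustrative; a real baseline would scan the decompiler's output, and real backdoors are rarely this obvious):

```python
import re

# Naive baseline: flag a binary if its raw bytes contain a shell-spawn pattern.
PATTERN = re.compile(rb"exec|system|/bin/sh")

def naive_flag(path):
    with open(path, "rb") as f:
        return bool(PATTERN.search(f.read()))

# Tiny stand-in "binary" for demonstration:
with open("suspect.bin", "wb") as f:
    f.write(b"hello\x00/bin/sh\x00world")
print(naive_flag("suspect.bin"))  # True
```

Even a trivial baseline like this would make the models' true/false positive rates much easier to interpret.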


I've set up several email servers over the past 10 years (my personal one, my new personal one, one for my small company), and it worked every time.

I think legends about email being impossible to set up are greatly exaggerated.


And they can send and receive messages no problem? Fair enough, if that's the case. Every time I've revisited the idea of running my own, I've come back to the prospect of having to monitor blacklists to request my own removal.


They do. I honestly believe personal email is not a problem. I use maddy nowadays (instead of suffering through postfix), but I believe it's not maintained anymore and I will switch eventually.

It gets hard when you onboard other people onto your email server. I never did this, but a friend of mine hosted a server for a small group of people (~20 members), and at some point we had problems with deliverability. For example, people set up automated forwards to their personal Gmail. Gmail hated that, since everything - including incoming spam - was forwarded. We also had a "broadcast" email address that was equally hated by other email servers.

One obvious issue is that maintaining anything is work. Last year my email server was down for a total of 15 minutes (because of load problems on the host), and that was exactly when an important client decided to send me an email (I only know because he complained through another channel). And that was in summer, while I was on holiday; I fixed it as soon as I received a monitoring alert, sitting on a bench in a park on a mobile hotspot. And that was still too slow. People are really used to reliable email nowadays.


I might give it a try, then. It's something I've been interested in for a long time.


I also like to take potshots at Americans, but come on. It's unlikely that a newspaper called "The Berliner", in an article about Berlin, included this line specifically thinking about citizens of a far-away foreign country who don't use metric units that often.

Occam's razor says that it's actually one of our noble and enlightened European journalists who made that sloppy remark without realising it.


>None of the code is open source

Well, not all of it; for example, mObywatel was recently open-sourced (in a ridiculous way, but still).

I think you raise some important points. In my opinion, a lot of code funded by public money should be open-sourced, but it's not as clear-cut as some people believe. I'll use this comment to point out some of the fallacies that people responding to you make:

>Also open source government code means other governments can fork it, overall lowering implementation costs, while still keeping code sovereignty.

This is completely unrelated. The French government won't deploy a Polish public health management website just because they found it on GitHub. For projects of this magnitude you need deep mutual cooperation between both governments, and a lot of changes. Making the code open source is the least important part; the code can just be shared privately.

In fact, there are many such European code, data and information sharing initiatives. There are meetings and conferences where countries can discuss this on a technical level. The code is shared, just not via public channels.

>The government - and taxpayers - should care that having closed-source software means they are tied to the company that wrote it forever, so changes and bugfixes will be much more expensive.

If a private company owns code used by the government for critical purposes and can hold the government hostage, that's outrageous and taxpayers should riot. This probably happens[1], but most code is either written by the government itself, or at least the government owns the code and can switch contractors if necessary.

In particular, AFAIR the government code we're discussing right now was written by COI (~central informatics department), which is a public institution.

[1] For example, governments use Azure and GCP, even though - to me - it's clearly shortsighted. Fortunately there was a wake-up call recently, and it changes slowly.


>> Also open source government code means other governments can fork it, overall lowering implementation costs, while still keeping code sovereignty.

> This is completely unrelated.

This is an option that does sometimes happen, and there is motivation to make it happen more often, at least for EU-wide services. It also doesn't have to happen between countries; it can happen at the local level too, like between the administrations of cities in the same country. The main reasoning here is about spreading awareness and building the mindset that sharing code at all levels, and working together even on such internal tools, can be good and should be increased.

> French government won't deploy a Polish public health management website just because they found it on Github.

Some governments also have their own platforms, specifically for co-working on code across administrations. They are usually not public, for reasons.

> For projects of such magnitude you need deep mutual cooperation between both governments, and a lot of changes. Making the code open-source is the least important part, the code can be just shared privately.

You still have to put it under a licence when you are co-working, even when it's shared privately. Open source does not necessarily mean that the source is automatically accessible to the whole world.


