Hacker News | new | past | comments | ask | show | jobs | submit | westernpopular's comments | login

> It also links to a Telegram exporter.

Telegram has this natively already


Indeed! That was unexpected for me.


For what it's worth I do the exact same thing (although not on Mac so I haven't encountered this bug), so not too niche I suppose!


I tested it a bit more and realized that for me it always happens with PDFs (maybe only PDFs) if I select multiple lines.

Out of curiosity: if you go to https://legacy.python.org/psf/contrib/contrib-form/contribut..., select the top two lines, and do the Ctrl+L, V, A, C taps, does it copy the URL or the text?

Also found a related bug report: https://bugzilla.mozilla.org/show_bug.cgi?id=1770723


This bug was originally reported on macOS but is now reported for all platforms. So I guess this is my issue to follow :)

https://bugzilla.mozilla.org/show_bug.cgi?id=1817233

Who knows maybe one day I'll look into fixing it. Until then I guess I'll just keep using my bash script with raycast that removes newlines and then pastes.
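For context, a script along those lines can be very short. This is a hypothetical sketch (the commenter's actual script isn't shown); it assumes macOS, where `pbpaste`/`pbcopy` access the clipboard and `osascript` can send the paste keystroke, which is how Raycast script commands typically do it:

```shell
#!/usr/bin/env bash
# Read the clipboard, strip newlines, and write the result back.
pbpaste | tr -d '\n' | pbcopy
# Then paste into the frontmost app by simulating Cmd+V.
osascript -e 'tell application "System Events" to keystroke "v" using command down'
```

The `tr -d '\n'` step is the whole trick: it joins the PDF's hard-wrapped lines into a single line before pasting.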


Bemoaning people maximizing terminals - what a weird hill to die on.


Nobody ever talked about beginnings. The question is, even if the universe always was, why does it exist? It wouldn't be impossible for there to be nothing and never have been anything.


It's an abstraction. If there was no beginning, there was never the possibility of nothing. Just because you can conceive of a question doesn't mean it is valid.


Why does this only talk about lyrics?


Well that's probably because Indian accents are not present as a category and everyone is forced into the US/UK/AU categories?


Could be. But still one would expect the Indian accent to lean towards the UK's because of the colonial past.


> To be able to predict accurately sentences that make sense, GPT-4 must have a internal way of representing concepts, such as "objects', "time", "family" and everything else under the sun.

[citation needed]


That's more or less exactly what these models do by design; otherwise the predictors/estimators for the next token in a sequence would fail spectacularly. These models aren't just plucking random tokens out of a bag and ordering them via some brute-force large-scale memory lookup. There is an internal representation of the tokens in a more meaningful sense. That sense, however, is limited to the statistical/mathematical framework upon which the models are built. It's a huge leap (in my opinion a completely unjustified exercise in wishful thinking) to call it "reasoning."


While one might well argue over whether they "must have" this, it's clear that they do end up with internal models of the world. Even quite literally:

https://twitter.com/wesg52/status/1709551516577902782



How can any system use a word such as "time" in a way that makes sense without representing the word? It definitely has at least one representation: the binary ASCII code 01110100011010010110110101100101.
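That bit string can be checked mechanically. An illustrative one-liner (not from the thread; assumes `xxd` is available, as it is on most systems where it ships with vim):

```shell
# Print each byte of "time" as 8 bits and concatenate them.
printf 'time' | xxd -b -c1 | awk '{printf "%s", $2}'; echo
# -> 01110100011010010110110101100101
```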


Reminder that Replit's CEO bullied an open-source maintainer into killing his independent project: https://news.ycombinator.com/item?id=27424195


Fun fact: the unusual way of saying eighty in French ("four twenties") actually came about through contact with local Celtic languages! In linguistic terms this is called "substratum influence".


You can use https://www.composerize.com/ to see what a `docker run` command would look like as a docker-compose file
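For instance, a `docker run` invocation maps onto compose keys roughly like this (an illustrative example, not tool output; the image and paths are made up):

```yaml
# docker run -d -p 8080:80 -v ./site:/usr/share/nginx/html nginx
# corresponds approximately to:
services:
  web:
    image: nginx
    ports:
      - "8080:80"
    volumes:
      - ./site:/usr/share/nginx/html
```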


Wow I didn’t know! Thanks

