I truly enjoy how the naming conventions seem to follow how I did homework assignments back in the day: finalpaper-1-dec2nd, finalpaper-2-dec4th, etc etc.

Where does "Smol" come from? It's supposed to mean "Small" right? If yes then what's the etymology and reason for popular usage?


It's just internet speak from the days of Tumblr. It usually has cutesy connotations.

Tumblr speak has a bunch of wacky things, notably "chimkin nuggers."


In the specific case of SmolLM, it originates from the meme in this dataset https://huggingface.co/datasets/bigcode/the-stack-smol


> before I realized people thought he is a kook

But what is your own opinion?


I didn’t listen long enough to get to his really fringe stuff.


But why would you call it fringe then, if you didn't hear it yourself?


Hilariously naive comment: "if you have money you should spend it on salaries". This is not a charity, it's a transaction: the salary is paid in exchange for revenue-generating activities, or activities supporting them, in a viable business. If the business is only doing $2.7M in ARR, then it's entirely valid for the whole lot to get laid off.


It’s not about being a charity; the entire point of startups is to trade capital for time to market. If the CEO doesn’t know what to do with 130 people, they don’t know what to do with 70.


What are you talking about? How does it follow that "if they don't know what to do with 130, they also don't know what to do with 70"? This is literally capital allocation: the decision was that the people laid off couldn't be deployed effectively within this particular org. But maybe they would be effective elsewhere. Don't take this stuff personally, or else the working world will always be difficult.


Tekin's conclusion: "it will send a clear message to the wider Ruby community (and those who may be considering joining it) that the majority does not stand with DHH and his toxic views."

He is going to be ultra surprised to learn what the majority thinks and how it's not what he thinks it is.


what does the majority think then according to you?


Additional context — DHH's latest blog post: https://world.hey.com/dhh/as-i-remember-london-e7d38e64

Using your personal brand to espouse the values of ethnonationalism fundamentally serves the capital class wishing to divide and exploit social order among those who labor. This is so rich, coming from the guy who literally created a tool that increases the value of labor.

So, if I had to guess, the smart, critical thinkers in the _global_ Ruby community might find that this whole situation reeks.

If I were an immigrant to the UK and a Rails developer, and DHH were getting re-platformed while saying crazy stuff like this, I would think twice about my career choices going forward, or push the Ruby community not to stand behind a garbage attitude like this, even if it comes from a BDFL-type personality. I _invested_ my life into promoting the use of your tool, while you disparage me based on skin color and country of origin for the sake of some 'ye olde country' vibefest?

Does DHH even know where his principles lie?


Sad, truly the end of an era. Big thanks to all the maintainers!


Yeah, mind sharing any of the scripts? I looked at the docs briefly, and it looks like we need to install ALL of NeMo just to get access to Parakeet? Seems ultra heavy.


You only need the ASR bits -- this is where I got to when I previously looked into running Parakeet:

    # NeMo does not run on 3.13+
    python3.12 -m venv .venv
    source .venv/bin/activate

    git clone https://github.com/NVIDIA/NeMo.git nemo
    cd nemo

    pip install torch torchaudio torchvision --index-url https://download.pytorch.org/whl/cu128
    pip install ".[asr]"

    deactivate
Then run a transcribe.py script in that venv:

    import sys

    import nemo.collections.asr as nemo_asr

    model_path = sys.argv[1]
    audio_path = sys.argv[2]

    # Load from a local .nemo checkpoint...
    asr_model = nemo_asr.models.EncDecRNNTBPEModel.restore_from(restore_path=model_path)

    # ...or, alternatively, download from Hugging Face ('org/model'):
    # asr_model = nemo_asr.models.EncDecRNNTBPEModel.from_pretrained(model_name=model_path)

    output = asr_model.transcribe([audio_path])
    print(output[0])
With that I was able to run the model, but I ran out of memory on my lower-spec laptop. I haven't yet got around to running it on my workstation.

You'll need to modify the python script to process the response and output it in a format you can use.
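
For what it's worth, here's a rough sketch of that post-processing step (my own guess, not something I've tested against a specific NeMo version): the exact return shape of transcribe() has varied across releases, so the tuple handling and the .text attribute below are assumptions to verify.

    # Hypothetical post-processing (my addition, not part of the original script).
    # NeMo's transcribe() return shape varies by version: some releases of the
    # RNNT models return a (best_hypotheses, all_hypotheses) tuple, and entries
    # may be plain strings or hypothesis objects with a .text field.
    import json

    def to_text(entry):
        return entry.text if hasattr(entry, "text") else str(entry)

    def dump_transcripts(output, audio_paths, dest="transcripts.json"):
        if isinstance(output, tuple):  # (best, all)-style return
            output = output[0]
        records = [
            {"audio": path, "text": to_text(entry)}
            for path, entry in zip(audio_paths, output)
        ]
        with open(dest, "w") as f:
            json.dump(records, f, indent=2)

    # usage: dump_transcripts(asr_model.transcribe([audio_path]), [audio_path])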


Thanks!


Wish it was on the web app as well!


The HN peanut gallery remains undefeated


System prompts are fine and all, but how useful are they really when LLMs clearly ignore prompt instructions at random? I've had this with all the different LLMs: explicitly asking them not to do something works maybe 85-90% of the time. Sometimes they just seem "overloaded", even in a fresh chat session, and, like a human would, they get confused and drop random instructions.

