
(Article is from 2023, so the title should be updated to say "32 years ago", or something)

The biggest loss in TUIs is the latest wave of asynchronous frameworks, which bring the joy of dropped keypresses to the terminal.

In any TUI released before the year 2000, if you press a key when the system wasn't ready, the key would just wait until the system was ready. Many TUIs today still do this, but increasingly frequently (with the modern "web-inspired" TUI frameworks), the system will be ready to take your keypress, and discard it because the async dialog box hasn't registered its event listener yet.

Other than that antipattern, TUIs are doing great these days. As for terminal IDEs, Neovim has never been more featureful, with LSPs and other plugins giving all the features this article discusses. I guess it isn't a mouse-driven TUI, so the author wouldn't be interested, but still.



Yes. Back in the DOS days, and even before, when people used actual terminals, there was a keystroke buffer. You'd see people who really knew the interface fly through tasks being multiple keystrokes ahead of the UI. Stuff would just flash onto the screen and disappear as it processed the input that was already in its buffer. It should be possible to implement this with modern frameworks, but it requires thought.
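
A minimal sketch of that thought in Python asyncio (hypothetical names, not any particular TUI framework's API): keypresses are queued the moment they arrive, and whichever handler is active drains the queue once it is ready, so nothing typed during setup is lost.

    import asyncio

    keystrokes: asyncio.Queue = asyncio.Queue()

    def on_raw_key(key: str) -> None:
        # Called from the terminal input loop: enqueue immediately, never discard.
        keystrokes.put_nowait(key)

    async def dialog() -> None:
        await asyncio.sleep(0.5)          # pretend the dialog takes a while to build
        while True:
            key = await keystrokes.get()  # type-ahead: sees keys pressed during setup
            if key == "\x1b":             # Esc closes the dialog
                return
            print(f"dialog handled {key!r}")

    async def main() -> None:
        task = asyncio.create_task(dialog())
        for k in ["y", "y", "\x1b"]:      # the user types ahead before the dialog is ready
            on_raw_key(k)
        await task

    asyncio.run(main())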


Yeah. I used to work as a phone surveyor, the one you hate. Our software ran on a terminal connected to a mainframe. I got used to it after a few weeks and was very productive.

Costco Canada vision shops still used terminals connected to an AS/400 machine when I snooped around last month.


In the late 90s I was required to slowly replace dumb terminals with PCs. One of the older ladies taking phone orders was most put out by this, understandably. She was lightning fast on that terminal. She'd never used a PC (I hit on the idea of using solitaire to help her learn to use a mouse, which worked amazingly well), and was never able to reach the same speed with one as she'd had on her dumb terminal. It's hard to beat the performance of dedicated devices.


While I agree that dedicated devices can be more efficient than Windows-style user interfaces, and even more so than browser-based user interfaces, many people don't use those modern interfaces in efficient ways.

I have observed countless times how many people fill in a field, then move their hand to the mouse to move the focus to the next field or button, then move their hand back to the keyboard, instead of just pressing Tab to move the focus. It's painful to watch. Knowing just a few keyboard shortcuts makes filling in forms so much faster.

Things are getting worse, unfortunately. Modern user interfaces, especially web interfaces, are made by people who have no idea about those efficient ways of using them, and are starting to make it more and more difficult to use any other method than keyboard -> mouse -> keyboard -> mouse -> ... . Tab and Shift+Tab often don't work, or don't work right. You can't expand comboboxes with F4, only the mouse. You can't type dates, but have to painstakingly select all the parts in inefficient pickers. You can't toggle options with the spacebar. You can't commit with Enter or cancel with Esc.


It's for this reason that I dream of us going back to keyboard-first HCI. I wish the underlying BIOS could easily boot and run multiple operating systems simultaneously and there were keys that were hardwired to the BIOS to switch out of whatever GUI crap you were in to the underlying "master control mode".

I wish we'd made better correspondence between the GUI and the keys on the keyboard. For example, the ESC should always be top-left of the keyboard and every dialog box should have an escape that basically always does the same thing (go back/cancel) and is wired to the hardware key. Instead of drop-down menus at the top of the screen, we could have had pop up menus at the bottom of the screen that positionally correspond to the F1-F12 keys.


I recall reading somewhere that the entire point of Solitaire (at least the original implementation that came with Windows 3.0) was to teach users how to click and drag, so I'm not surprised that it was good for teaching your colleague how to use a mouse.


An inventory management app was one of my first paid software engineering projects. Sometime in the early 00s I had to rewrite it for Windows because the ancient DOS codebase had a bunch of issues running on then-modern Windows versions. I sat down with the users and watched how they were using the DOS version, including the common patterns of keyboard navigation, and then meticulously recreated them in the WinForms version.

For example, much of the time would be spent in a search dialog where you had a textbox on top and a grid with items right below. In the TUI version, all navigation was with arrow keys, and pressing the down arrow in the textbox would move the focus to the first item in the grid. Similarly, if you used the up arrow to scroll through the items in the grid all the way to the top, another press would move the cursor to the textbox. This was not the standard focus behavior for Windows apps, but it was very simple to wire up, and the users were quite happy with the new WinForms version in the end.
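
Not the original WinForms code, of course, but the same focus wiring is only a few lines in most toolkits. A rough sketch of the idea in Python/tkinter (widget names are made up):

    import tkinter as tk

    root = tk.Tk()
    search = tk.Entry(root)
    items = tk.Listbox(root)
    for name in ("widget A", "widget B", "widget C"):
        items.insert(tk.END, name)
    search.pack(fill=tk.X)
    items.pack(fill=tk.BOTH, expand=True)

    def to_list(_event):
        # Down in the search box jumps to the first item in the grid.
        items.focus_set()
        items.selection_clear(0, tk.END)
        items.selection_set(0)
        items.activate(0)
        return "break"                    # swallow the default key handling

    def maybe_to_search(_event):
        # Up from the first item jumps back to the search box.
        if items.index(tk.ACTIVE) == 0:
            search.focus_set()
            return "break"

    search.bind("<Down>", to_list)
    items.bind("<Up>", maybe_to_search)
    root.mainloop()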


The world needs more of this. It is nowadays rare for programmers to sit down with users and observe what they are doing. Instead we have UX designers designing the experience and programmers implementing that.


It is so frustrating that I’m not good enough to create software for myself. Maybe I should just buckle down and start working on that.

I use an iPhone and have found a lot of usability issues. Some apps such as Stocks are perhaps not too difficult to recreate in Obj-C. I’m kind of an old timer, so I prefer Obj-C even though I don’t know anything about it.


The "sit down with users" part seems to be the most crucial one. Sadly, nowadays the developers of such software often aren't even on the same continent, and a Zoom call can't accomplish this easily.


In my own little world, I saw this first with mail and news readers. It was fast and simple to read mail and news with pine and tin: The same keystroke patterns, over and over, to peruse and reply to emails and usenet threads.

As the network ebbed and flowed, email too often became unreadable without a GUI, and what was once a good time of learning things on usenet became browsing web forums instead. It sucked. (It still sucks.)

In the greater world, I saw it happen first at auto parts stores.

One day, the person behind the counter would key in make/model/year/engine and requested part in a blur of familiar keystrokes on a dumb terminal. It was very, very fast for someone who was skilled -- and still pretty quick for those who hadn't yet gotten the rhythm of it.

But then, seemingly the next day: The terminals were replaced by PCs with a web browser and a mouse. Rather than a predictable (repeatable!) series of keystrokes to enter to get things done, it was all tedious pointing, clicking, and scrolling.

It was slow. (And it's still slow today.)


I saw this at an airport. I took the same plane twice, one year apart; in between they had replaced the terminal with a web UI. On the first trip it took the hostess (well into her 50s) 15 seconds to find my booking and print my pass. On the second trip (on the web UI), it took 4 hostesses teaming up for what felt like a good 5 minutes to do the same thing.


Costco still uses AS/400 company-wide for their inventory system I think


Interesting. Looks like it suits them perfectly. I wonder if the AS/400 is running in an emulator or on a real machine.


I doubt it, probably just running on a regular Power ISA rack mount server from IBM. Though I guess technically all IBM i aka AS/400 is running on an emulator.

https://en.wikipedia.org/wiki/IBM_i#Technology_Independent_M...


Nope, we still have an IBM i deployment kicking around at $DAYJOB, and it's running natively on POWER hardware. Way back in the days of the original OS/400 running on AS/400 hardware, IBM had the foresight to have applications compile to MI (Machine Interface) code, which is a bytecode format closer to something like LLVM IR than to JVM or CLR bytecode. When a PGM object is copied or created on an IBM i system, TIMI (Technology Independent Machine Interface) takes the MI code and translates it to a native executable for the underlying platform.

We probably still have a couple of PGM objects kicking around on our modern POWER hardware that were originally compiled on an old AS/400 system, but they run as native 64-bit POWER code like everything else on the machine.

The IBM midrange line gets a lot of undue disgust these days. It's not sexy by any means, sure, but just like anything running on modern-day z/OS, you know that anything you write for it is going to continue to run decades down the line. Well, as long as you limit the amount of stuff you have running in 'modern' languages; because Java, Node, Python, Ruby, etc. are all going to need upgrades, while anything written in 'native' languages (RPG, COBOL, C/C++, CL) compiles right down to MI and will keep working forever without changes.


In some ways the IBM mainframe line is an amazing piece of engineering. My understanding is that the emulation layers can even emulate hardware bugs/issues from specific lines of long-dead equipment so that ancient code (that was written with these issues in mind) will still function as expected.


Nitpick:

The Machine Interface dates back to AS/400's predecessor, the System/38.


Thanks, I was desperately trying to remember because I swore there was something beforehand, but it's been a very long time since I did the reading.


As far as I know, there are no AS/400 emulators.

It's still updated by IBM and runs on POWER. It's just called "i" now.

I believe the naming went something like AS/400 -> iSeries -> System i5 -> System i -> i


It's funny that they keep renaming it and everyone still calls it AS/400. I remember when they wanted people to call it iSeries but everyone just still used AS/400. I didn't even know about the others you posted and I still use the AS/400 occasionally.


Fun story: When I worked at Blockbuster I had my computer access revoked and was summoned to explain, because a colleague told management I was “hacking” when they saw me doing this on the computer system.


Makes me wonder if that’s where the TV trope of a hacker flying through screens faster than you can see came from


Was that still on the VMS-based blockbuster video system?

Weird question, but I accidentally ended up with one of those in my hands, one that ran in a (probably non-Blockbuster) place from 1996 to 2000 :)


> You'd see people who really knew the interface fly through tasks being multiple keystrokes ahead of the UI.

I remember.

This, unfortunately, killed people: Therac-25. Granted, the underlying cause was a race condition, but the trigger was the flying fingers of experts typing ahead, unknowingly having been trained to rely on the hardware interlock present in older models.


> This, unfortunately, killed people: Therac-25. Granted, the underlying cause was a race condition

So it didn't kill people, something else was the cause


I'm not trying to shift blame to the operators here, but in the absence of flying fingers, nobody would have died. Many, many people received the right treatment from the Therac-25 machine.

Also, the author of the buggy software had no idea it would be used to operate a machine without a hardware interlock as, AFAIR, it was not modified prior to being used with the Therac-25 model.


Remember the venomous, desperate BEEP! when the keystroke buffer was full. (Or was it when pressing too many keys at once?) Like a tortured waveform generator constantly interrupted by some higher-priority IRQ. Good times.


The keyboard buffer size was something like sixteen keystrokes. This was bad news if you noticed your input wasn't working and needed to press Ctrl + whatever to quit the program: the buffer was full and unable to accept the Ctrl + whatever, so instead it had to be Ctrl + Alt + Del.

Three decades later I learn that there were utilities to make the keyboard buffer bigger. But, in those days before search engines, how was I to know?


Yes! That phenomenon drives me crazy. I used to be able to use a computer at warp speed by staying ahead of its responses with chains of rapid keyboard shortcuts etc. Now it's like I'm trying to stride through molasses.


Windows maintains both a synchronous and an asynchronous key state. The async one gives the state of keys in a polled fashion, and the other the state as applied by messages as you pump the Win32 message queue (it's in sync with respect to the messages you have observed from the message queue).

https://devblogs.microsoft.com/oldnewthing/20041130-00/?p=37...
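
A toy illustration of the two calls from Python via ctypes (Windows only; just for peeking at the state, not something you'd build real input handling on):

    import ctypes
    user32 = ctypes.windll.user32

    VK_SHIFT = 0x10
    # GetAsyncKeyState: the polled, "right now" state of the key.
    # GetKeyState: the state as of the messages this thread has already pumped.
    async_state = user32.GetAsyncKeyState(VK_SHIFT)
    sync_state = user32.GetKeyState(VK_SHIFT)

    print("down (async):", bool(async_state & 0x8000))
    print("down (sync): ", bool(sync_state & 0x8000))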


Rough quote: "in 1984 we had at my house",

so even 41 years seems to be in scope.

I was expecting

- early projects that ended in Visual Studio 1.0 or NetBeans soon after (2 to 9 years too early for them)

not

- "vim (1991) was not out yet" (not-a-quote, but my feeling upon looking at ncurses instead of floating windows)


I snickered a little because I know Visual Studio didn't have a version 1.0. Wikipedia identifies the first version as Visual Studio 97, which was at version 5.0. I remember before that there was "Microsoft Developer Studio 4.0" which came out around Windows 95, and could run on 95 or on NT 3.51. There was a Visual C++ 1.0 and a Visual Basic 1.0 released at different times. Meanwhile there were also the workhorses, Microsoft C and MASM. In those days, Borland and Watcom were real competitors to Microsoft for C and C++.


Yeah, by 1995, Visual Basic / C++, Delphi / Borland C++, and Symantec C++ were all-conquering.

A few years before, it was very different - VisualAge and Rational Application Developer were the big names in the early 90s in "professional" IDEs. Interface Builder for university spin-outs or funky startups (and SunWorks / Forte Studio for the less-funky ones). CodeWarrior on the Mac (perhaps with THINK! hanging on too). I think Softbench was popular for scientific software, but I never actually saw it myself.

And then just a few years later, the rise of Java turned things upside down again and we got JBuilder, Visual Cafe, & NetBeans as the beginning of yet another new wave. The Visual Studio suite really began to take off around then, too.

In short, the 90s were a time of huge change and the author seems to have missed most of it!


An all-in-one like Rational Rose may be making a comeback in terms of these agentic AI projects, because now you actually can turn a spec into code without layers of tagging and UML.


I wasn't paying attention to when 30 years ago actually was...

So disappointing to expect a GUI Smalltalk System Browser and seeing DOS TUIs.

And then delight recalling Turbo C/Pascal and MS C 4.0 with CodeView that even worked in 43 or 50 line modes.


Yes, me too, I was expecting either Smalltalk or LISP machine GUIs.

Having said that, some old TUIs were clearer and faster even on weaker hardware. This should be a lesson for us today. Color transitions and animated icons flying over the desktop are NOT what I need, but speed, clarity, and discoverability of more rarely used functionality are vital.


May 1988 -- Smalltalk/V 286 -- on IBM-PC, PS/2 or compatible, with an 80286 or 80386

"INTRODUCTION TO THE SMALLTALK/V 286 ENVIRONMENT"

http://stephane.ducasse.free.fr/FreeBooks/SmalltalkVTutorial...


That was my introduction to Smalltalk.


ditto

So much better than the TUI Smalltalk/V


Methods.


Yes. I'd seen Methods a year earlier and chose to make a prototype with Lotus 123 instead. Then Smalltalk/V 286 became available.


Anyone else on here recall IBM VisualAge for Smalltalk -> VAST, or Cincom Smalltalk?


When people love an IDE product so much that they can't work without it, they have overspecialised to their detriment. And possibly to the detriment of the code itself.

> As for terminal IDEs

The GNU/Linux terminal is the killer app. Multiple terminals in a tiling window manager is peak productivity for me. (Browser in a separate virtual workspace.)

And modern scaling for a big display is unbeatable for developer ergonomics.


> When people love an IDE product so much that they can't work without it, they have overspecialised to their detriment.

I think you are wrong.

https://en.wikipedia.org/wiki/Muscle_memory

Being extremely good at something increases the gap between said something and everything else. That doesn't mean being extremely good at the first thing is "over-specialization to detriment". If someone is equally mediocre at everything, they have no such gap, so no "over-specialization to detriment"; but is that really worth desiring? I think not.


> Being extremely good at something increases the gap between said something and everything else.

You're also potentially over-specializing at one level while at the same time neglecting other levels.

Musicians run into this problem when, for example, they rely solely on muscle memory to make it through a performance. Throw enough stress and complicated music at them and they quickly buckle.

Meanwhile, a more seasoned performer remembers the exact fingers they used when drilling the measure after their mistake, what pitch is in the bass, what chord they are playing, what inversion that chord is in, the context of that chord in the greater harmonic progression, what section of the piece that harmonic progression is in, and so forth.

A friend of mine was able to improvise a different chord progression after a small mistake. He could do this because he knew where he was in the piece/section/chord progression and where he needed to go in the next measure.

In short, I'm fairly certain OP is talking about these levels of comprehension in computer programming. It's fine if someone is immensely comfortable in one IDE and grumpy in another. But it's not so fine if changing a shortcut reveals that they don't understand what a header file is.


What if the IDE is a LeapFrog 2-in-1 Educational Laptop?


If you make usable products that solve problems for others with it, then it’s a great IDE…


Why is it to their detriment? It's not like they're stuck with it forever. "Can't work without it" is really "won't work without it because they prefer installing it over going without."


As someone who started when only rich people could afford GUIs, I don't understand what is so killer about it.

We used text terminals because that was what we could afford, and nowadays I gladly start a terminal window only when I have to.


The killer thing about it is that it is a gateway to the shell, all the command line tooling and the best cross-platform UI.


Xerox PARC, Atari, Amiga and many others had shells, without needing to live in a teletype world.

It is only cross platform as long as it pretends to be a VT100.


It's not about needing to live in a teletype world, it is about how language/text is just a better interface for a general use computer. A computer's primary feature is that it is programmable, and an interface that allows you to take advantage of that is superior to one that doesn't. The programmable GUIs (Smalltalk and the like) all failed to gain traction, which left the shell (and maybe spreadsheets) as the best UI for this. Though as AIs mature we might see a shift here, as they could provide a programmable interface that could rival shell scripting.


The reason why GUIs became so popular so quickly after they were introduced is because text is not "just a better interface for a general use computer".

Like OP, I remember the days when command line was all you had, and even then we used stuff like TUI file managers to mitigate the pain of it.


But GUIs never took off as a UI for a general purpose computer; they became the UI for applications on a general purpose computer. For them to be the former requires them to be programmable. Smalltalk is the best/most-famous example of a graphical UI for a general purpose computer I can think of...

The main point is that for a general purpose computer the UI needs to integrate programming. Programming is how you use a computer. The shell (text) is currently the primary UI that inherently allows programming.


CLI is also specific to apps in practice, and I don't see any obvious difference between scripted CLI and scripted (with the likes of Active Scripting or AppleScript, or for that matter Tcl etc) GUI apps.


The difference is that you don't use Active Scripting, AppleScript, Tcl, etc. as your primary UI. The shell is a scriptable UI.


Is a modern phone a general purpose computer?

What kind-of UI does a modern phone present?


A modern phone is not a general purpose computer. They are proprietary, locked down devices. Appliances.


"The PinePhone is a smartphone that empowers users with control over the device. It is capable of running mainline Linux, features hardware privacy switches, and is designed for open-source enthusiasts."

Perhaps I simply failed to see your definition of "a general purpose computer".

Please say what rules must be passed to meet your definition.


Great that Microsoft, Apple, and Google are on the right path then, with AI-voice-controlled and gesture-driven OSes.


> I gladly only start a terminal window when I have to.

Exactly so. I am perfectly able to work entirely in a text/CLI world, and did for years. I don't because I don't have to. I have better, richer alternative tools available to me now.

It was very odd to join Red Hat briefly in 2014 and meet passionate Vi advocates who were born after I tried Vi and discarded it as a horrible primitive editor.


Good luck writing Java with notepad.


Tons of people did that but with nvi/vim and calling javac by hand.


We did that back in 1996; however, the sentiment applies to most languages.

For example, Notepad versus the Turbo C++ described in the article.


Was literally a thing in some colleges.


[flagged]


People should also stop using terminal emulators. It is pretty silly to base software around ancient printing terminals. Everyone knows for a fact that only tech illiterates use a console instead of a GUI. Since all great devs use a GUI. Just a fact.

Also, people should stop playing 2D games. It is pretty silly to base your entertainment on ancient technology when modern GPUs can render super-complex 3D scenes.

And don't make me start on people who still buy vinyl...


Current GPUs can't compete with my brain 'rendering' a Slash'EM/NetHack scene with my pet cat while I kick some foes' asses with my Doppelganger Monk full of Wuxia/Dragon Ball/Magical Kung Fu techniques.


Honestly hard to disagree with your first point even though it's sarcasm.

It's still quite easy to end up with a terminal you need to reset your way out of (e.g. with a misguided cat), not to mention annoying term mismatches when using tmux/screen over SSH, across OSes, or (and this is self-inflicted) in containers.


Completely disingenuous. Stop the snark.

For UI there exists a straight up superior alternative, which keeps all of the benefits of the old solution. Neovim is just straight up better when used outside of a terminal emulator.

What is true for TUI vs. GUI is not true for CLI vs. GUI (or TUI for that matter); pretending the argument I made applies to the latter is just dishonest. You cannot replace CLI interfaces adequately with GUI or TUI interfaces, but you can totally replace TUI interfaces with GUI. See neovim as an example. It is superior software when used outside of the terminal.


Maybe on paper. But the snappy low-latency feel of TUI apps in the terminal is a joy, and unequaled in GUIs.


>Maybe on paper. But the snappy low-latency feel of TUI apps in the terminal is a joy, and unequaled in GUIs.

This is not true at all. Terminal emulators are GUIs; the TUI is just another layer on top of that GUI. Using a TUI will always introduce additional latency, depending on the quality of the terminal emulator.

I do not know what GUIs or TUIs you are using, but my KDE Apps are all extremely snappy.


TUIs are the best cross-platform apps. They run on all the major and minor platforms in general use; GUIs cannot compete, with browsers being the next closest thing. They can be integrated with the shell and also work perfectly well remotely without issues. TUIs are superior in many ways to GUIs and have a place in the ecosystem.


> TUIs are superior in many ways to GUIs and have a place in the ecosystem.

There's another reason you don't mention.

Consistent UI.

TUI apps can (and in the Windows world usually do) use the same keyboard controls, derived from IBM CUA, as their GUI equivalents do.

This is why I use Tilde in the Linux shell: the same commands work in it as in Pluma or Leafpad or Mousepad or whatever: Ctrl+O opens a file, Ctrl-X/C/V to cut/copy/paste, Ctrl+N for new, etc.


TUIs do not even run the same across terminal emulators.

It is a total joke to call something which depends on how the underlying terminal emulator interprets specific ANSI escape sequences "multi platform".


Most of my work is done on remote machines. Nothing beats tmux+tuis in this paradigm.


I'd rather stick with RDP or browser-based workflows.


They are fine; however, RDP requires more bandwidth, and most of the stuff I run is terminal commands anyway.

The company I work for has a great browser-based IDE, but that’s something I would never set up and maintain for a personal project.


Modern terminals do color just fine: 24-bit color support has existed since 2010-ish, and has been mainstream since 2015.
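
If you want to check a particular emulator, a quick test (a plain truecolor SGR sequence, nothing exotic assumed) is to print something like this; most modern terminals render it as a solid orange bar:

    # 38;2;R;G;B selects a 24-bit foreground color; \x1b[0m resets it.
    print("\x1b[38;2;255;128;0m" + "#" * 40 + "\x1b[0m")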

There's nothing wrong with graphical IDEs... or text user interfaces. Great developers use both. Low effort troll is low effort.


+1 - crap code can come out of notepad / emacs / vi or IDE-flavor-of-the-day or even the AI code sausage maker. Testing, specification, knowing what you are building and why still matters.


Agreed, we used TUIs because we couldn't afford anything better on MS-DOS, CP/M, 8 bit home computers.

People on better systems like the Amiga and Atari were already past that.


Vim was born on the Amiga, and AmigaOS came with some Emacs clone.


I surely don't remember such clone.

As for where Vim was born, that hardly matters; it was someone with a UNIX culture background who happened to own an Amiga.


> I surely don't remember such clone.

I think they mean MicroEmacs. Despite its name, it was not Emacs, but it had Emacs-like keyboard shortcuts, multiple buffers, and macros, which was quite neat for a free 1986 application on a home computer.


I guess that is it, thanks for the memory refresher, and to be more precise, MEmacs.


Amiga OS 3.1 has it under the Workbench floppy sets. You get it by default.


The Amiga 500 shipped with AmigaOS 1.2 in 1987; AmigaOS 3.1 was released in 1994, almost at the end of the commercial life of the Amiga.

As the sibling comment points out, MicroEmacs isn't really Emacs.

Also Emacs history is older than UNIX, and overlaps with Lisp Machines.


SSH comes to mind.


How so? I use remote machines all the time; why would I need a TUI for that? VSCode and Zed support editing on remote machines, and the remote drives are also mounted on the local machine. What purpose would any TUI have? What even are the potential benefits?

Right now I can use the exact same software I use on my local machine. Can you give me any reason why I should consider anything else?



