
Just to be clear here: yes, he is using a 32-bit 72 MHz Cortex-M4 and a 12 MHz Cortex-M0 as an I/O controller for an 8-bit 8 MHz CPU. It also appears that the RAM for the Z80 is coming out of the 64 KB built into the M4.

Even the MCUs built into the screen and SD card are probably all more powerful.

This is awesome.



It is awesome. I had a Z80 machine running CP/M (and then Z/PM) in the '80s, and when SCSI hard drives became a thing I remember hooking up a 20 MB one to the Z80 (the max hard drive size was 8 MB, so it showed up as three volumes :-) and noting that the computer system on the disk drive was much more powerful than my Z80 system. It was an interesting inversion, but one which resurfaces in computation a lot, where a base class for something might be quite complex but the controlling class relatively simple. In the software world you can write 'hello_world.c' in like 5 lines of C code, but the amount of code that actually has to run on a Linux machine to make that work is quite extensive.


The SCSI controller I got for my Amiga 2000 had a Z80 on it.

I've come to see IBM-compatible PCs of the '80s and up until some point in the '90s as a curious aberration, in that until GPUs became common, they were the systems most likely to "just" have a single CPU and no co-processors, outside of really stripped-down low-end home computers.

E.g. an 8-bit home computer with just a tape deck might have had only one CPU. But as with your Z80 machine, for many you got a second CPU as part of the package the moment you added a hard drive or even a floppy drive (the 1541 floppy drive for the C64, for example, was a full 6502-based computer that you could download programs to over the serial bus). And many others had CPUs all over the place (the Amiga 500 and 2000 I had used 6502-compatible cores in their keyboard controllers, for example...)

The PC first "caught up" in the early '90s, and finally got to where we are now, where I have x86 servers at work with dozens of non-x86 CPUs in things like hard drives and on controller cards.


Interesting viewpoint. I always thought of the Amiga as catching up when they threw away the funny custom chips and went full PowerPC! - part of the problem with the later models being the chip bandwidth bottleneck (as was related to me by an Amiga programmer) and of course the dreadful expense for Commodore of keeping everything up to date. But you're right about the ARM CPUs everywhere these days. I suppose we're back there again today.

So, now we're back where we were, I wonder if the cycle will repeat. After all, even if we didn't get Larrabee, we did get the Intel HD3000.


Dropping the custom chips didn't happen in any Amiga. AmigaOS only runs on PowerPC after years of wrangling over rights following the bankruptcy of Commodore, by which point the IP was severely outdated, and nobody really knows who owns all the rights anymore anyway.

The PPC transition happened because there had been a number of 3rd-party PPC co-processor cards for classic Amigas (that let you run code on both m68k and PPC), so there was already a viable, reasonably popular target to port AmigaOS to.

At the time of the Commodore bankruptcy, Commodore was actually not going towards PPC but PA-RISC, coupled with a new set of custom chips ("Hombre") that dropped planar graphics for chunky pixels and included 3D acceleration. Dropping m68k was more out of necessity: Motorola failed to get the needed performance out of the 68040 and 68060, and the next generation was outright cancelled, so there was no alternative but to transition. But there were never any plans to stop using custom chips.

> part of the problem with the later models being the chip bandwidth bottleneck (as was related to me by an Amiga programmer)

Chip bandwidth was a bottleneck for "everyone". Commodore was dealing with that by including VRAM in the next-generation chip designs. But the Amiga was more vulnerable in this respect because a lead in multimedia was essential to the Amiga image. Lots of people bought PCs even without graphics cards. People were still launching their software from DOS at the time of the Commodore bankruptcy. Windows 3.x was not something people bought for the graphics and animation.

But nobody would buy an Amiga without good graphics performance.

That the planar graphics that were the default for the Amiga until '92 also hampered efficient 3D was a much bigger deal.
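To illustrate why planar graphics hampered 3D: with chunky pixels, setting a pixel's colour index is a single byte store, while planar bitplanes slice that index one bit per plane, so every pixel write becomes several read-modify-write operations. A minimal sketch in C (the buffer layout and function names are hypothetical illustrations, not actual Amiga hardware access):

```c
#include <stdint.h>

#define WIDTH  320
#define HEIGHT 200
#define DEPTH  5  /* 5 bitplanes -> 32 colours, classic Amiga style */

/* Chunky: one byte per pixel, a single store. */
void put_pixel_chunky(uint8_t *buf, int x, int y, uint8_t colour)
{
    buf[y * WIDTH + x] = colour;
}

/* Planar: the colour index is spread one bit per plane, so one
   pixel write turns into DEPTH read-modify-write operations. */
void put_pixel_planar(uint8_t planes[DEPTH][HEIGHT * WIDTH / 8],
                      int x, int y, uint8_t colour)
{
    int byte = y * (WIDTH / 8) + x / 8;
    uint8_t mask = 0x80 >> (x % 8);   /* leftmost pixel is the high bit */
    for (int p = 0; p < DEPTH; p++) {
        if (colour & (1 << p))
            planes[p][byte] |= mask;
        else
            planes[p][byte] &= ~mask;
    }
}
```

For texture-mapped 3D, where each pixel gets an arbitrary colour, the chunky version is one store per pixel while the planar one touches five different memory areas, which is the gap Hombre's chunky-pixel design was meant to close.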

> the dreadful expense for Commodore of keeping everything up to date.

Commodore actually never invested much in engineering. It was one of the ongoing stories of missed opportunities. They got extremely lucky breaks time and time again, due to a series of extremely talented people delivering fantastic products at just the right moment, and it finally caught up with them. Jack Tramiel was famously tightfisted, and after he was ousted Commodore had a bunch of managers who, to the outside, seemed to care very little about the company, and even less about engineering. What kept costing them money on engineering was a complete lack of focus: repeated re-designs, products that were canned right before release, management edicts to make changes way too late in product cycles.

> After all, even if we didn't get Larrabee, we did get the Intel HD3000.

Sure, but they don't represent a single general-purpose CPU core. It's inevitable we'll get more functionality on-chip as yields for larger dies increase. But what matters is whether we continue to see parallelisation and off-loading, versus seeing single-core performance start to rise rapidly again (I wouldn't bet on the latter).


And the wheel turns: http://www.catb.org/jargon/html/W/wheel-of-reincarnation.htm...

It's been going on for a long time.



