• Ran my first Linux at home on an i486-DX2 (33 MHz, 4 MB RAM), which delivered decent X11R6 performance in color in 1992, with a 14" CRT.
• Ran my first real UNIX at home on a PA-RISC (HP 9000-715/75 with HP-UX 9.03 and 96 MB RAM) in 1997, 20" color CRT.
• Today, Linux is still here, but on a 2-CPU, 140-core AMD server with 2 TB RAM, hundreds of TB of NAS storage, and a 40" TFT... (and it still takes too long to open the bloated Web browser!)
I remember trying to run a game, Rise of the Triad, which was built on an improved Wolfenstein engine IIRC, and having it struggle on my 386 unless I made the viewport as small as possible. At which point it told me to buy a 486... well, I did eventually, so I guess it worked.
Had the same experience with Doom II. Got it to run surprisingly well on a brand new Tandy 486DX2 + 4MB RAM, though I seem to recall having issues with SoundBlaster compatibility.
The 486 killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.
The 486 DX2 66 MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, staying at the top for that long.
The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.
Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.
Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.
I remember arguments (and benchmarks) around all the variations of the 486, since bus speed and clock speed were decoupled (the /2 is clock doubling). For some applications, a 50 MHz 486 with a 50 MHz bus would beat a DX/2 66 MHz with a 33 MHz bus.
And sometimes the DX/4 100 MHz would be the slowest of them all, on a 25 MHz bus.
Especially since, when actual clock-quadrupled chips eventually came out, they had to call themselves ridiculous things like "5x86" instead of DX/4. (The Am5x86 133 runs at 4x33 MHz.)
As I noted in my other comment (1), in 1985 Amiga OCS bitplane graphics (separating each bit of a pixel index into a separate memory area) was a huge boon for 2D capability, since it lowered bandwidth use to 6/8ths, but it made 3D rendering a major pain in the ass.
The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.
In hindsight there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they would've seen the need for 3D rendering is tantalizing.

1: https://news.ycombinator.com/item?id=47717334
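To make the bitplane pain concrete, here is a minimal sketch (in plain C rather than period 68k assembly, assuming a 320-wide, 6-bitplane screen) of what plotting a single pixel costs in chunky vs. planar layouts:

    /* Chunky vs. planar pixel plot: illustration only. */
    #include <stdint.h>

    #define WIDTH  320
    #define PLANES 6

    /* Chunky (VGA mode 13h style): one byte per pixel, one store. */
    void plot_chunky(uint8_t *fb, int x, int y, uint8_t color)
    {
        fb[y * WIDTH + x] = color;
    }

    /* Planar (Amiga OCS style): each bit of the color index lives in
       a separate bitplane, so one pixel costs up to 6 read-modify-
       writes on sub-byte masks. */
    void plot_planar(uint8_t *plane[PLANES], int x, int y, uint8_t color)
    {
        int     offset = y * (WIDTH / 8) + (x >> 3);
        uint8_t mask   = 0x80 >> (x & 7);

        for (int p = 0; p < PLANES; p++) {
            if (color & (1 << p))
                plane[p][offset] |= mask;
            else
                plane[p][offset] &= ~mask;
        }
    }

Fast texture mappers want the chunky case; the planar case is roughly why Amiga renderers ended up drawing chunky in fast RAM and converting afterwards.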
> The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.
The intention was good, but the Akiko chip was functionally almost useless. It was soon surpassed by CPU chunky-to-planar algorithms. I don't think it was ever even used in any serious way by any released games (though it might have been used to help with FMV).
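For reference, the naive CPU-side chunky-to-planar (C2P) conversion looks something like this sketch (my illustration in C; real Amiga C2P routines used far cleverer word-merging tricks, which is how plain 68k code ended up outrunning Akiko):

    /* Naive C2P: split an 8-bit chunky buffer into bitplanes.
       Assumes n_pixels is a multiple of 8. Illustration only. */
    #include <stdint.h>

    void c2p_naive(const uint8_t *chunky, uint8_t *plane[],
                   int n_pixels, int n_planes)
    {
        for (int p = 0; p < n_planes; p++) {
            for (int i = 0; i < n_pixels; i += 8) {
                uint8_t byte = 0;
                for (int b = 0; b < 8; b++) {
                    byte <<= 1;
                    byte |= (chunky[i + b] >> p) & 1;  /* pixel i+0 ends up in the MSB */
                }
                plane[p][i / 8] = byte;
            }
        }
    }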
Ah, I was under the impression that it had a native chunky mode, but it was actually a built-in C2P routine? Anyhow, it seems it was useful (1) when running on stock CD32s, but not in conjunction with faster machines.
Which brings me to my pet peeve: the already slow 68020 (68EC020) at 14 MHz was crippled because, even though it had a 32-bit bus, it was only connected to a 16-bit RAM bus (chip RAM).
This 16-bit memory (2 megs) is also where the framebuffer and audio live, so the stock CPU in the A1200 has to share bandwidth with display signal generation and the graphics and audio processing.
All in all, it meant the Amiga 1200 had only about twice the memory throughput of the Amiga 500 (about 10 megabytes/s vs. about 5 megabytes/s).
If the A1200 had come with at least some extra 32-bit memory (it existed as a third-party add-on), the CPU could have had its own uncontested memory with a throughput of about 20-40 megabytes/s.
Imagine the difference it would have made if the machine had just a little extra memory.

1: https://forum.amiga.org/index.php?topic=51616.msg544232#msg5...
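For a rough feel of where figures like these come from, a back-of-envelope sketch (the transfer rates are illustrative assumptions on my part, not measurements):

    /* Back-of-envelope memory throughput, illustration only:
       bytes/s ~= bus width in bytes * effective transfers/s. */
    #include <stdio.h>

    int main(void)
    {
        /* 16-bit chip RAM, assuming ~7.09M transfers/s (roughly the
           chip bus clock): ~14 MB/s raw, shared between the CPU and
           display/blitter/audio DMA, which is how you end up with
           effective CPU figures around ~5 (A500) to ~10 (A1200) MB/s. */
        double raw_chip = 2 * 7.09e6;

        /* Hypothetical uncontested 32-bit fast RAM for the 68EC020,
           at an assumed 5M-10M transfers/s. */
        double fast_lo = 4 * 5.0e6;    /* 20 MB/s */
        double fast_hi = 4 * 10.0e6;   /* 40 MB/s */

        printf("raw chip bus:    ~%.0f MB/s\n", raw_chip / 1e6);
        printf("32-bit fast RAM: ~%.0f-%.0f MB/s\n",
               fast_lo / 1e6, fast_hi / 1e6);
        return 0;
    }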
That's just a tiny detail. That the chipset wasn't 32-bit was another disappointment.
The bigger problem was that Commodore as a company was aimless.
Yeah, and it took ~7 years to make those marginal improvements over the earlier Amiga chipset! I'm ignoring ECS, since it barely added anything over OCS for the average user.
> The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.
I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…
Was it necessarily a dead end? Consider the ways Intel and later AMD managed to upgrade and reinvent x86, which until x64 still retained so much of the original x86 instruction encoding and heritage (heck, even x64 retains some of the instruction encoding characteristics).
Had the Amiga retained relevance for longer, and without the push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA Apollo 68080 would've matched late-1990s P-IIs, and FPGAs aren't speed monsters to begin with.
The 68060 is pretty good to be fair, but it never ended up being widely used and Motorola definitely saw PPC as the future.
Maybe if these theoretical new 68k Amigas had become a huge market hit they could have taken the arch further and it could have remained competitive, but all the other 68k shops had pretty much given up or moved on already (Apple was going PPC, Sun went SPARC, NeXT gave up on their 68k hardware, Atari was exiting the computer business entirely, etc.), so I don't know that the market would have been there to support development against the vast amount of competition from the huge x86 bastion on one hand and the multitude of RISC newcomers on the other.
The argument is that 68k is "CISCier" than x86 (the addressing modes in particular), so making a performant modern out-of-order superscalar core for it would be harder than for x86.
I believe that. But Commodore could have plunked a cheap 68020 into their machines for backwards compatibility (like how the MSX2 had an MSX1 SoC inside, the PS2 had a PS1 SoC, the PS3 had a PS2 SoC, and so on) and put another "real" socketed CPU in as a co-processor. Or made big-box machines with CPUs on PCI cards, for infinite expansion options. "True" multitasking, perfect for CAD, 3D rendering and non-linear video editing. It would have been very cool to have an architecture where the UI could be rendered with almost hard-realtime guarantees while heavy processing happened elsewhere.
There were no tech problems IMHO; it was all management problems. They could have chosen any of a handful of completely different (edit: even mutually exclusive!) tech paths and still have won, but instead they chose to do almost nothing except bleed the company dry.
Edit: I don't mean that their success was certain if they executed better. I mean they did almost nothing and got the guaranteed outcome: failure. (And their engineers were brilliant but had very little resources to work with.)
Commodore so slowly and ineffectually improving on the OCS didn't help, but the original sin of the Amiga was committed in the beginning, with planar graphics (i.e., slow and hard to work with, even setting aside HAM) and TV-oriented resolutions/refresh rates (i.e., users needing to buy a "flicker fixer"). It's like they looked at one of the most important reasons for the PC and Mac's success—a gorgeous, rock-solid monochrome display—and said "Let's do exactly the opposite!"
IIRC interlaced display and 6 bitplanes were a compromise to allow color graphics in 1985 with the memory bandwidths available at the time.
Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early 90s, and until Doom the graphics capabilities didn't look outdated.
By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory bandwidth gains were gone) were a downside/sin that should've been fixed to stay relevant.
The original Pentiums (socket 4, 60 or 66 MHz) had the infamous floating point division bug, had underwhelming perf for anything not FP bound (most things), ran hot, and were too expensive for what you got. A DX/4 100 was nearly always a more rational choice.
Second gen Pentiums, starting with the 75 MHz, were great.
Yeah, it does alright and is a significant step up from a DX/2, but Quake came out in '96 and the P60 came out as a super expensive workstation-class CPU in '93. If you were a gamer in '96 it is unlikely you were rocking a P60, because it was never good value for money.
Doom was released at the end of '93. In 1992 most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 was a good choice even in 1994-1996.
Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.
Funny, I'm working with an Intel 686 right now. It's brutal to get stuff to build, e.g. anything Rust/cargo-related (missing deps, but mostly the hardware being slow). Recently I've been trying to fix a maturin problem I ran into. But the backwards compatibility is cool: Python 3.11 on 32-bit Debian 12.
The CPU I'm working with is a Celeron M at 900 MHz, single core, no HT, struggling to build Python wheels (several hours).
Hard to imagine now, but this was a huge turning point: a genuinely powerful CPU in a "Pee-Cee", available for less than RISC-workstation money. I had to wait a while; mine was an AMD DX2-66 since I didn't have the budget for Intel... add Slackware, and countless hours messing with XF86Config, and I had a poor man's Sun workstation.
We ran a 3-line BBS (Renegade and then Wildcat) on OS/2 on a 486-33 with 12 MB RAM. This was in 1994 or so. A great way to multitask several DOS applications!
I've got an AMD-branded 286 chip from my first owned-by-me PC blu-tacked to the case of my home desktop PC, which is powered by a Ryzen something-or-other from a few years ago (with a 1060 6GB card from a few years before that, because I wasn't gaming enough to justify a new graphics card along with the other upgrades at the time).
I too have one sitting on my desk, a 486 DX2 66 MHz. I've had it for probably 25 years now, bringing it from job to job like the magical lost artifact it is. I remember how much more capable it was for playing Doom and Descent than the 33 MHz, or heaven forbid the SX. Of course, shortly after, the Pentium came out and blew everything away. The good ol' days of giant Gateway 2000 towers.
I loved my 486 DX2 66 MHz IBM PS/1 (2168), which had a whopping 8 MB of RAM. Not only did it really enable me to experience the fullness of PC gaming of the era, but it was the first computer I was able to install an internal modem into, and the computer I used to get SLIP dial-in access to the state university mainframe and thus to the Internet (before that I was limited to the Prodigy walled garden). It was this computer that let me play early MUDs via telnet, let me play my first graphical MMORPG (Ultima Online), and introduced me to real visual programming (Visual Basic).
To a significant degree, the 486 DX2 was the primary computing platform that created the foundation I needed to learn computing in depth and enabled my later career, and it set many of the formative moments in my life. Thanks, Intel: even though you're a shadow of your former self now, you were a beast in the 90s.
I remember getting my first 486 33 MHz computer and being able to play Ultima 7: The Black Gate, and later Ultima 7 Part Two. This was a turning point for me, as the game was way ahead of anything on the console side of things. DOS 6!
I think the turning point was that flat framebuffers and plenty of CPU power for the first time eclipsed specialized 2D hardware (Amiga, Mega Drive, SNES, etc.).
Flat framebuffers and "powerful" CPU's also enabled easier software rendering (Doom/Duke) of 3d, compared to the Amiga where writing textured rendering for an Amiga is a PITA due to video memory layout with separted bitplanes spreading bits of each pixel into different memory locations (the total memory bandwidth reduction in 1985 by using 5 or 6 bitplanes became a fatal bottleneck at this point).
It wasn't always full framerate though, and the 2D chipsets did help in the "classic" action games that were still very much all the rage.
The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar); yet the Pentium could still do graphics better in some respects (as shown with Quake).
Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from when it comes to price (and comfort/ease).
For me, the 486 was right between my (actually my Dad's) first computer, a 386, and my first personal computer (Pentium MMX). During those couple of years my friends had 486s and I was always jealous. I used to drool at the Best Buy catalog that came every Sunday in the mail.
Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.
Great throwback... they were awesome procs. With a few SIMMs (4-16 MB) it could do multimedia madness never seen before (play a CD-ROM game with MPEG-1 video). The 486 DX4 100 was the last Intel chip I had before going to Pentium clones (the AMD K series and the shitty Cyrix 6x86).
While the speed increases weren't as dramatic, do note that even in single-core speed, unlike what the clocks would suggest, the Ryzen 7 is much, much more than 1.23x faster than the P4. The P4 was a particularly fragile architecture, and the achieved IPC on real code was typically well below 1, often closer to 0.5. The X3D variants of Ryzen have been measured running above 3 IPC on average on real, complex loads. So the single-core uplift from that P4 to a modern AMD core is about the same as from that 300 MHz Pentium to the 3.8 GHz P4; it just took 20 years, not 8. Of course, now we also have 8 times the cores.
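Using the figures quoted above (achieved IPC of ~0.5 vs. ~3, and the ~1.23x clock ratio, so call the modern core roughly 4.7 GHz as an assumption), the arithmetic works out like this:

    /* Single-core uplift, P4 vs. modern Ryzen, from the figures in
       the comment above (the 4.7 GHz is my assumption, implied by
       the ~1.23x clock ratio). */
    #include <stdio.h>

    int main(void)
    {
        double p4_ghz  = 3.8, p4_ipc  = 0.5;  /* P4: high clock, low achieved IPC */
        double zen_ghz = 4.7, zen_ipc = 3.0;  /* X3D Ryzen: measured ~3 IPC       */

        double p4_ops  = p4_ghz  * p4_ipc;    /* ~1.9e9 instructions/s  */
        double zen_ops = zen_ghz * zen_ipc;   /* ~14.1e9 instructions/s */

        printf("single-core uplift: ~%.1fx\n", zen_ops / p4_ops);  /* ~7.4x */
        return 0;
    }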
A switch from the exponential regime to something immensely slower was a qualitative change. The difference is so vast that it's completely reasonable to say that clock speeds haven't changed a single bit since 2006 or so (and even for raw ops/s speeds, which have improved much more, it's debatable).
Clock speeds used to go up in a straight line (the popular "interpretation" of Moore's law), but once the P4 hit a (kind of useless) 3.8 GHz, we leveled off for decades.
Ahhh, but it gave me the opportunity to run real programs, coming from an XT!
*Edited to add an example: I could for the first time use AutoCAD.
The price difference between a 286 and a 386SX was negligible, but the software I could use was in another league.
Yeah, by the time we were getting into it the 486 was already out, but we wanted the real 32-bit bus and had to be a bit careful when looking at used computers (as by that time the 386SX and DX machines were about the same price).
It comes across as so incredibly insane to me that people from the late 80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea whatsoever what else we can do with these things, we've reached the end."
On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do - or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another Rube Goldberg machine to handle that!").
The 80s and 90s were filled with new things computers could do - spreadsheets, wysiwyg word processors, games - things that simply were impossible before (or not done).
In the 2000s through now we've mostly had improvements - 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing and it's somewhat awoken a similar spirit to the 80s/90s - but not the same breadth. Dad bringing home a computer because he wants to do spreadsheets, and you finding it can run DooM or even play music.
The first 80286-based system (IBM PC AT), the first 80386 (Compaq Deskpro 386), and the 80486 all had people writing about their suitability as servers, the implied consensus being that normal people didn't need them.
The Pentium is the first one, I think, that this didn't happen, because by then it turned out that people need a computer that can do what they are currently doing—but faster—much more often than they need servers.
Suddenly, it was possible to imagine running advanced software on a PC without having to spend 25,000 USD on a workstation.
https://en.wikipedia.org/wiki/Amiga_Hombre_chipset
It was going to be HP PA-RISC based and have an AGA Amiga SoC, including a 68k core.
(Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)
It was a life-changing machine.
Ordered, I believe, from the depths of a Computer Shopper magazine.
Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.
I built a 486 Compaq Novell server for the company I worked for and named it Godzilla - gives a sense of how the 486 was seen.
sigh
The 386SX was crap: 16-bit wide bus, IIRC.
I know with my 286 you could pair a 287 next to it... not sure if it really made a difference you could discern outside of hyper-specific uses, though.
Played some awesome games, like DOOM and Wolfenstein. Later, Duke3D was the shit. But I can't remember if I ran it on the same setup or something newer.
The lack of imagination is just disturbing.