Oldskooler Ramblings

the unlikely child born of the home computer wars

The IBM PC 5150: What if?

Posted by Trixter on December 27, 2012


Did you know that the IBM PC was not originally designed around the i8088?  Various sources conflict somewhat (including offline sources such as “Blue Magic: The People, Power and Politics Behind the IBM Personal Computer”), but the general consensus is that early 5150 designs considered CPUs that were both less powerful (such as the MOS 6502 and Zilog Z80) and also more powerful (such as IBM’s POWER predecessor 801, or the Motorola 68000 CPU).

I find this fascinating to daydream about sometimes.  What if the IBM PC had not been built around the 8088?  How would the personal computer industry have progressed in the 1980s?  Who would the leader(s) be today?

The CPU possibility I keep drifting to is the 68000.  The IBM PC+8088 design had limitations that the industry spent nearly 15 years working around, the most infamous being the 1MB RAM ceiling (640KB typically available) and, adding insult to injury, only being able to access it one 64K segment at a time.  And while an address translation chip could have flattened out the 16:16 segment:offset address space, there were some annoying limitations in the 8088 itself, such as only having four general-purpose 16-bit registers (and each of those four had a specialization married to certain sets of instructions, so you couldn’t use them as flexibly as you would have liked).  The 68000 by comparison was immensely more powerful and had none of these limitations:  It had eight 32-bit general-purpose data registers, and another eight 32-bit address registers.  On top of that, the 68000 had a flat 16MB address space (no segments!).  The 68000 had some drawbacks too (big-endian architecture, and misaligned word accesses would fault the CPU), but the core architecture was so much more powerful that it would have drastically changed how the IBM PC was perceived and used.
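
(Here’s a quick C sketch – purely illustrative, not period code – of what that 16:16 segmentation means: every real-mode pointer is a segment:offset pair funneled into a 20-bit physical address.)

    #include <stdio.h>
    #include <stdint.h>

    /* Real-mode 8086/8088 address translation: every pointer is a
       segment:offset pair, and physical = segment * 16 + offset.
       Only 20 address lines exist, so the result wraps into 1MB. */
    static uint32_t phys(uint16_t seg, uint16_t off)
    {
        return (((uint32_t)seg << 4) + (uint32_t)off) & 0xFFFFFu;
    }

    int main(void)
    {
        /* Many different pairs alias the same physical byte... */
        printf("%05lX\n", (unsigned long)phys(0xB800, 0x0000)); /* B8000 */
        printf("%05lX\n", (unsigned long)phys(0xB000, 0x8000)); /* B8000 again */
        /* ...and no single segment value can reach more than 64KB at once. */
        return 0;
    }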

However, it would also have drastically changed how the IBM PC was marketed.  The PC was already expensive at launch ($1565 without any monitor or disk drives, about $3800 today), but designing the machine around the 68000 would have required even more expensive components.  This, along with IBM’s existing familiarity with the i8086 family (which would shave months off the time to market), was ultimately behind IBM’s decision to abandon all others and go with the i8088.

The 68000 was used in more powerful home computers that came after the IBM PC, such as the Apple Macintosh, Atari ST, and Amiga 1000.  On launch day, each of those computers greatly outshone the original IBM PC.  (BTW, the Amiga launch is particularly impressive and fascinating to watch, although I can only find a small snippet of the full launch video online.)  If the 68000 had been used in the PC, would those later machines ever have existed?  While it’s interesting to fantasize that a 68000-based PC would have prevented the Atari ST and Amiga from ever existing, I think a more realistic outcome would be that the PC would have been treated like the Apple Lisa:  Powerful, but way too expensive, and perceived as a high-end business workstation that would find its way into some niche business markets but never be considered a computer for the home.  (As to what computer(s) would have triumphed in the PC’s absence, that’s a What If? for another day.)

What about other potential CPUs?  What do you think might have happened to the IBM PC if it had been based on the same CPU as found in the Commodore 64 or the ZX Spectrum?

12 Responses to “The IBM PC 5150: What if?”

  1. Arguably the reasons that IBM’s architecture was as influential as it was were that IBM was very well-respected and trusted by businesses, and that the architecture was open, leading to an ecosystem of third-party clones and add-ons.

    Those aren’t technical reasons, so they would have been just as true if IBM had used a different CPU. I think history would have played out similarly, though we would have hit the limitations of the architecture at different times and in different ways.

    Modern PCs have evolved so far from their roots that there are very few vestiges left – some instruction encodings, a few port addresses, the PIT frequency. So most things now are what they are for technical rather than historical reasons, and the modern PC would probably not be very different from what it is if we’d started with a different CPU.
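
    A quick C sketch of that last vestige (the constants are the well-known PC values; the example is purely illustrative):

        #include <stdio.h>

        /* The PC's 8253/8254 timer input clock: 14.31818 MHz / 12, an
           NTSC-derived constant that still governs PC timekeeping. */
        #define PIT_HZ 1193182u

        int main(void)
        {
            /* A divisor of 0 means 65536: the classic ~18.2 Hz BIOS tick. */
            printf("default tick: %.4f Hz\n", (double)PIT_HZ / 65536.0);
            /* For ~1000 Hz interrupts you'd program a divisor of ~1193. */
            printf("divisor for 1000 Hz: %u\n", PIT_HZ / 1000u);
            return 0;
        }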

    It would have been nice not to have to deal with 64K segments, though – I spent so many hours tracking down bugs that turned out to be related to that.

  2. yuhong said

    “Modern PCs have evolved so far from their roots there are very few vestiges left – some instruction encodings, a few port addresses, the PIT frequency.”
    It is more than just that. Many of the things on the PC/AT motherboard, like the 8254, the 8259s, and the 8237s, are still in modern chipsets, though some of them now have better replacements that modern OSes use instead.

  3. yuhong said

    Personally, my favorite is the MS OS/2 2.0 fiasco involving the 80386:
    http://yuhongbao.blogspot.ca/2012/12/about-ms-os2-20-fiasco-px00307-and-dr.html

  4. Joe said

    “The 68000 had some drawbacks too (big-endian architecture…”

    What?! Big-endian architecture is not a drawback; it is the most straightforward way of thinking about multi-byte data values! It’s the standard for every major network protocol! How can you call big-endian a drawback?

    • Trixter said

      It’s a PITA to program for. With a little-endian architecture, if I only want to read the first word of a dword, I simply read the word at that location. If I want to read the high word, I just read the next one. http://en.wikipedia.org/wiki/Endianness#Optimization echoes much of my sentiments.

      Keep in mind I am programming for 16-bit hardware. Most people on big-endian machines are performing 32-bit work in a single op.
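
      A contrived C snippet to show what I mean (purely illustrative, not production code):

          #include <stdio.h>
          #include <stdint.h>

          int main(void)
          {
              /* One 32-bit value viewed as two 16-bit words. */
              union { uint32_t dw; uint16_t w[2]; } u = { 0x12345678 };

              /* Little-endian: w[0] is 0x5678 - the low word sits at the base
                 address, so a 16-bit read needs no pointer adjustment at all.
                 Big-endian: w[0] is 0x1234; the low word lives at offset +2. */
              printf("word at base address: %04X\n", u.w[0]);
              return 0;
          }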

      • Joe said

        Some fair points, and my shock was relieved once you got me to consider the hardware you were programming on. However, I thought you were talking about moving to the 68k. In that architecture, all things being equal, would you really prefer it to be little-endian?

        Anyways, I’m about 10 years younger than you and didn’t get into low-level programming until the end of the 90s. Everything I worked on then was either an ARM or a PPC, where big-endian was good.

        • Trixter said

          My original comment in the article was definitely meant to be of a year-1980 mindset. Today, you are correct that endian issues aren’t really issues. Accessing the first 8, 16, or 24 bits of a 32-bit dword is done so infrequently today anyway that it doesn’t really matter how many extra steps it takes.

        • Covoxer said

          Yes, basically it would have been better if the 68000 were little-endian. Don’t forget that the 68000 had a 16-bit memory bus, and most instructions were faster working with 16 bits than with 32. So if it had been little-endian, that would have given quite a few opportunities for code optimization.

      • Chuck G said

        Jim, it would have fit in quite nicely with the overall IBM scheme of things. The S/360, after all, is big-endian. Note also that the 68K demands alignment–words on word addressing boundaries.

        We were really hoping that IBM would select the 68K, but word had it that, as the DisplayWriter used the 8086, a lot of the existing technology would be reused–and that turned out to be very much the case.

        Handling large data structures with the 68K is a walk in the park compared with the segment mumbo-jumbo of the 8086. Porting Unix to the 68K was similarly simple – porting to the x86 base was pretty difficult.

        There’s a good reason that most laser printers of the time (particularly those with PostScript) did *not* use the 8086.

        Had National been on schedule with their NS32xxx series, the game might have been very different.

  5. Covoxer said

    I completely agree with your opinion about a possible 68k version. I’m pretty sure it could have been an even bigger failure than the Lisa. Some more thoughts about using the 68000:

    1. What OS would they use? I doubt they would have used UNIX – a very un-IBM-like decision. But if they had, it would have increased the hardware requirements (especially RAM), making the system even more expensive. Otherwise they would have had to use some proprietary OS, which would have added to development costs (and the product price) and made it much less attractive for independent developers. There would have been no easy way of porting any existing software (also, at that time, there was not much business software for UNIX; CP/M had the larger PC-oriented software market).
    So if they had chosen the 68000, any possible OS choice would have been worse than PC/DOS, additionally reducing the chances of success.

    2. Features/price. First of all, the CPU price alone: all the 68k-based machines in the PC price range appeared some 5 years later, when the 68k was much cheaper than at the time of the PC’s development. It was simply too expensive for this market. Second – mainboard complexity. The PC had an 8-bit bus; the 68k had a 16-bit bus. Basically, you can double the price by doubling the number of components and the PCB size (using 16-bit components would have been even more expensive at that time). Third – a less compact instruction set, requiring more memory for the same functionality. And RAM was not cheap at the time of the PC’s development. If I remember correctly, the original PC was sold in versions with just 16KB (and BASIC in ROM).
    That’s for the price. But what about features?
    Unlike the Lisa, a 68k PC would not have had any revolutionary GUI or complete office software suite bundled with it. And to keep it cheaper, it would not have had tons of memory and mass storage like the workstations. So the only advantage over the CP/M machines would have been a tremendously fast CPU. But would anyone have cared? At that time people were perfectly satisfied with a 1MHz 6502 and a 3MHz Z80. All work was done in monochrome text, and 64KB of RAM was considered large. Would anyone have seen any difference running a text editor or spreadsheet on a 68k in a 64KB system with a text display? Obviously not.
    So the feature-to-price ratio would have been worse than the Lisa’s. It would have been a bit cheaper of course, but the Lisa provided something unique and really useful for that price. With a 68k PC we’d probably have gotten some CP/M wannabe, with a lack of software, no extra possibilities, and a price close to the workstations… IBM made that mistake once. It was called the 5100. They were smart enough not to make it again.

    3. Can we really compare it to the Mac/Amiga/ST? No. The Macintosh was the first 68k-based machine with an accessible price. But how was that achieved? It was developed much later, when the 68k was already cheaper. It was also extremely stripped down to reduce the price – no expansion at all (not even RAM expansion! and you had to buy a Lisa to actually develop software for it – not very developer-friendly). Would the PC have had any success with no expansion capabilities? No, of course not. After all, it had to compete with the perfectly expandable S-100 systems, which already had a strong market share. Besides, IBM would not have been able to produce something like the Mac using only off-the-shelf components, and they would not have invested in custom IC development for the PC.
    So, basically, the PC could not have been as cheap as the Macintosh and still been successful.
    There’s no point comparing it to the Amiga and ST, as they appeared much later (CPU and memory prices had dropped significantly by that time), were built around highly integrated custom chipsets, and also lacked expansion capabilities.

    So if IBM had put the 68000 in the original PC, I think it would soon have been dead (much as happened with the RT/PC later, for quite similar reasons). With the PC dead, the CP/M market would have grown undisturbed alongside Apple’s. So the definitive PC architecture would have been established either by some strong brand embracing CP/M, or by the Macintosh consigning it to oblivion. Or, alternatively, by IBM making a second, more successful attempt at entering the PC market. ;-)

  6. Covoxer said

    So what about other CPUs? The most interesting alternative is the Z80.
    Let’s imagine a PC with a Z80. It would have gotten CP/M straight away, along with all the business software already available. Expansion bus – I think IBM would have invented something proprietary, similar to ISA, but if they had stuck with S-100, it would have been even better for growing popularity. Basically, with the IBM brand name, they would have easily established themselves in the CP/M market. On the other hand, the definitive PC of the future would then not have been an IBM PC, but a CP/M machine – an already existing open standard, now continuing to grow with IBM supporting it instead of dismissing it.
    Consequences? Instead of Microsoft we’d have DRI today. Gary Kildall would probably still be alive and the richest man on Earth. Zilog would have had a chance to become what Intel is, and Intel would be in the place of modern AMD. PC evolution would have had one more generation step – from 8-bit to 16-bit. And one more thing – PCs would have had a fairly advanced multitasking OS from the start, MP/M, and it would probably have grown into a future GUI OS. Also, if Zilog had developed the 16-bit successor architecture, chances are it would have been much cleaner than x86. Look at actual Zilog CPU designs – even the Z80 (as an extension of the 8080) – they usually tended to make more elegant architecture decisions than Intel did. But who knows?

    And what about the 6502? I once heard a rumor that IBM actually wanted to simply resell existing Atari PCs, but they failed to come to an agreement. Anyway, if that was the case, or if they had developed a 6502-based architecture in-house, it would have been basically the same. So what would it have been?
    They’d have had to develop some OS. I’m not aware of any true OSes for the 6502, but for an open architecture and ease of software development, some real OS would have had to be developed. Anyway, with or without an OS, it would have been highly incompatible with anything on the market, so porting software would have been much harder than it was with the actual PC. Most ports would probably have come from the Apple II. On the other hand, the abundance of programmers familiar with the 6502 would probably have helped initial software growth compared to the original PC.
    The hardware would have been quite similar to what it was and probably, in some ways, similar to the Apple II. It would also have been cheaper than an 8088-based PC, which is good for initial sales. And the 6502 would have been a perfect choice for future hardware clones – cheap and simple.
    So would it have survived? I think yes. At the time, even the Apple II was quite competitive with the CP/M machines. With IBM’s marketing, they would probably have succeeded at capturing the entire business market. Some basic business software would have been easy to port from the Apple, and new software would have grown fast.
    Consequences? We’ve already seen the following CPU in that line – the 65C816 – except that with the huge success of the PC and clones, it would have appeared much earlier. Then 32-bit versions would have followed, etc.
    Software – now this is hard to tell, since we can’t predict what sort of OS would have evolved in this case. If it was IBM’s own development, then we’d have IBM now instead of Microsoft (after they abandoned the PC hardware market); if it was developed by Microsoft (or anyone else for that matter), then they would be on top now. What else?
