jacquesm 6 years ago

Beautiful piece of work. This bit jumped out at me:

"First problem: I don’t have a 16-bit code segment! Second: I don’t have a way to generate 16-bit code with GCC."

I had that exact same problem writing a loader for a now-defunct OS, and for years I kept a copy of Borland C++ handy so that I could compile a trimmed-down 16-bit version of the filesystem that would load the rest of the OS before jumping into 32-bit mode. The tricky bit was that the loader had to interact with the BIOS in 16-bit mode, and I could not find a way to cleanly jump back to 16-bit mode from 32-bit mode once I got there, so multiple transitions across that 16-to-32-bit boundary were out.

So instead I cut out everything in the filesystem code that had to do with writing and updating, leaving just enough to read a couple of files, place them in memory, flip to 32-bit mode and then jump to the equivalent of 'init'.
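
For what it's worth, modern GCC has since grown a way around the "no 16-bit code" problem from the quote: building with -m16 (or putting a .code16gcc directive at the top of a GNU as file) emits code that runs in a 16-bit segment by prefixing the 32-bit operations. Here is a minimal sketch of the kind of real-mode helper such a loader might use before the switch to protected mode; the build flags and the flat-addressing assumptions are mine, not the author's:

    /* Hypothetical sketch, not the author's loader: print via the BIOS
       teletype service (int 10h, AH=0Eh) from C compiled for a 16-bit
       code segment. Assumed build with a modern i686 toolchain:
       gcc -m16 -ffreestanding -fno-pie -c stub.c */

    static void bios_putc(char c)
    {
        unsigned short ax = (unsigned short)(0x0e00 | (unsigned char)c);
        __asm__ __volatile__("int $0x10"
                             : "+a"(ax)      /* AH=0Eh teletype, AL=character */
                             : "b"(0x0007)   /* BH=page 0, BL=attribute */
                             : "cc", "memory");
    }

    void loader_banner(void)
    {
        const char *msg = "loading...";
        while (*msg)
            bios_putc(*msg++);
    }

Back when that loader was written none of this existed, which is why Borland C++ (or DJGPP plus hand-written assembly thunks) was the usual escape hatch.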

The x86 compilers of old had a whole pile of 'memory models' you could write for, each with different limits on code and data segment sizes.
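
To illustrate (a sketch of that era, not anything from the article): the far keyword and the MK_FP() macro below are non-standard Borland/Microsoft extensions, so this only builds with a 16-bit DOS compiler such as Turbo/Borland C, not with a modern GCC.

    /* In the small model a plain pointer is a 16-bit offset into a single
       64 KiB data segment; a 'far' pointer carries an explicit
       segment:offset pair so it can reach anything in the 1 MiB space. */
    #include <dos.h>                     /* MK_FP, FP_SEG, FP_OFF */

    static char far *video = (char far *)MK_FP(0xB800, 0x0000); /* text-mode RAM */

    void put_char_top_left(char c)
    {
        video[0] = c;       /* character cell */
        video[1] = 0x07;    /* attribute: light grey on black */
    }

The model you picked (tiny, small, medium, compact, large, huge) decided which pointers and calls were near or far by default, which is why mixing libraries built for different models was such a reliable source of pain.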

DJ Delorie's excellent GCC to DOS port (DJGPP) was another very important tool in that whole process.

Also beautiful how web.archive.org provided one of the key bits of information; I suspect that in the long term it will be as important as Wikipedia.

  • bringtheaction 6 years ago

    > Also beautiful how web.archive.org provided one of the key bits of information; I suspect that in the long term it will be as important as Wikipedia.

    Ironically, the platform the OP's post itself is published on produces a blank page when I submit it to web.archive.org and to archive.is.

    OP, if you are here: could you please repost your post elsewhere as well, for example on medium.com, and reply to me here with a link, so that we (the users of HN; just to clarify, I have no affiliation with either of the archival services mentioned, I am just a regular user who submits URLs to them) can archive your post? It was an interesting post and I would like it to still be available in 1, 2, 5, or 10 years.

newnewpdro 6 years ago

Having used OS/2 for a number of years before switching to GNU/Linux, I can't recall a single piece of OS/2 native software that I miss and would like to run in emulation.

The vast majority of what I (and my computer geek peers) did under OS/2 was run MS-DOS software in a multitasking environment. OS/2 never really got much in the way of quality native software developed for it, at least not anything that we poor students had access to, anyway.

  • ptx 6 years ago

    I miss the Workplace Shell[1]. But I guess you can't really download that separately, and it doesn't make sense without the rest of the system.

    There was a file manager for Unix that was inspired by it – DFM[2] – but not really the same thing, and the project seems to have died a long time ago. I remember using it with IceWM in Red Hat Linux 5.0 to get something vaguely OS/2-like. :)

    [1] https://en.wikipedia.org/wiki/Workplace_Shell

    [2] http://web.archive.org/web/20120717020125/http://www.kaisers...

  • dade_ 6 years ago

    At the time, being able to run Windows applications on OS/2 was a big feature, so there wasn't much need for native applications. Then Windows 95 came out, its apps were not compatible, and the feature ended up accelerating OS/2's demise. I used it to run my BBS; OS/2 was great for multitasking and stability. The built-in networking was great too, coming from Windows 3.1 at the time. It was also very common in voicemail systems and is still used in IBM cheque scanners. I don't have any interest in running IBM Works, but there has been interest in running old BBS software lately...

    • newnewpdro 6 years ago

      The Windows compatibility was much touted, but it was never impactful enough to compel any significant number of DOS/Windows users to switch, so it's not like that feature ceasing to work with the arrival of Windows 95 sent masses of OS/2 users back. There simply never were masses of users.

      What IBM needed was some killer OS/2-specific programs to attract the masses. There was a window (har har) of time where they probably could have done it, but they missed it. They certainly had the resources at the time.

      It's also important to remember that PC hardware support was an absolute nightmare back then. OS/2 needed drivers written for everything, and very little was standardized; there wasn't even USB. Hardware compatibility problems, combined with the lack of attractive native software, were the major barriers to mass adoption IMHO.

      edit: This stroll down memory lane just reminded me of the pile of 30+ 3.5" floppy disks required to install OS/2, which I had to write myself from the CD-ROM under DOS, because OS/2 couldn't use that particular CD-ROM drive successfully yet.

    • chaoticmass 6 years ago

      I've come across some old IBM Intellistation PCs which were used as check scanners. They had a bunch of weird PCI cards inside and the systems were running OS/2. I dug around the hard drives and found check images. Whoever let those go without wiping the drives really goofed up. Good thing I found them and not someone else.

  • krylon 6 years ago

    I read somewhere that IBM thought that by offering DOS and Windows compatibility they could lure in users, which backfired, because it left developers with no incentive to offer dedicated OS/2 versions of their software.

    IIRC, the main reason it caught on in banks was that those were big IBM customers to begin with, and that OS/2 offered good support for talking to IBM mainframes out of the box.

  • jhbadger 6 years ago

    Galactic Civilizations was a pretty great Reach for the Stars/Master of Orion style game that was for a long time OS/2 only, as the developers were OS/2 fanatics. Later versions did get Windows ports, though.

dom96 6 years ago

Hrm. As someone who wasn't alive during the OS/2 era: does anyone have, or know of, an overview of the features that made it particularly stand out?

  • geocar 6 years ago

    • Ability to run multiple applications at the same time

    • Filesystem with long file names

    • Buggy DOS programs couldn't crash the entire system

    These sound silly and lame by today's standards, but you have to look at OS/2 through the lens of the late 1980s: the Apple Macintosh couldn't multitask anything and crashed all the time! DOS was a world where important files were stored in a directory called USPLDNGS or NODELETE. In the 1990s, when Microsoft tried its hand at long filenames, we got gems like PROGRA~1.
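
    (For the curious, here is a rough sketch of how those ~1 short aliases get derived. It is deliberately simplified and not the exact VFAT algorithm, which also keeps up to three extension characters, strips more illegal characters, and falls back to hash-based names after a few collisions.)

      #include <ctype.h>
      #include <stdio.h>

      /* Simplified 8.3-style mangling: keep the first six usable characters,
         uppercased, then append ~N to disambiguate collisions. */
      static void short_alias(const char *longname, int collision, char *out)
      {
          char base[7] = {0};
          int n = 0;
          for (const char *p = longname; *p && n < 6; p++) {
              if (*p == ' ' || *p == '.')      /* drop spaces and dots */
                  continue;
              base[n++] = (char)toupper((unsigned char)*p);
          }
          sprintf(out, "%s~%d", base, collision);
      }

      int main(void)
      {
          char s[16];
          short_alias("Program Files", 1, s);
          puts(s);                             /* prints PROGRA~1 */
          return 0;
      }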

    Microsoft and Apple didn't really solve any of these problems for consumers until 2001 (Windows XP and OS X), but by then it was already too late for OS/2.

    • Grazester 6 years ago

      Windows XP? What happened to Windows 2000?

      • detaro 6 years ago

        I guess the key word is "for consumers", which NT and 2000 weren't aimed at (and anecdotally, the vast majority of consumers did not use).

  • myrandomcomment 6 years ago

    I ran OS/2 on a 486DX/66 with a VL-bus Cirrus Logic card and 8MB of RAM. I could run X-Wing in DOS, have a download going in the background from a BBS, have my FidoNet client running... etc. I could run Word in one Windows session and another application in a second Windows session, and when that application crashed Windows it did not crash my other Windows sessions. I could name things "This is my paper on some dumb stuff the prof wanted me to write about.txt" and save it to the file system. OS/2 was pretty cool. It also ran pretty much every ATM and PBX you interacted with up until the early 2000s.

    It was better than Windows in every way except the fact that it was being sold and marketed by IBM. MS wrote the early code for it, and it was going to be the next thing after Windows 3, but then they did NT.

    • Zardoz84 6 years ago

      There are still ATMs running OS/2.

  • projektfu 6 years ago

    I wanted OS/2 to succeed because it was reasonably fast and stable, and it included a very good filesystem (HPFS), at least compared to FAT. But "users" don't find those features compelling. 32-bit OS/2 was also limited to 512MB, but at the time that was an unattainable ideal for a PC.

    I think it would have had a chance if it had been as easily configured as the Mac, which was all graphical and simplified. But OS/2 was like DOS, requiring power users to edit config.sys and the like. Windows 95 provided more graphical configuration options, although in the end, if you wanted to run games, you were still dealing with the crazy.

  • newnewpdro 6 years ago

    True 32-bit protected-mode preemptive multitasking was the primary advantage over contemporary Windows.

    People were using DESQview on MS-DOS to achieve this level of multitasking in lieu of OS/2; OS/2 did it better. Windows 3.1 used cooperative multitasking, and wasn't even protected mode if memory serves.
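
    (A toy sketch of why cooperative scheduling was so fragile; this is not Windows or DESQview code, just the general shape of the problem: the "scheduler" only gets control back when a task chooses to return, so one misbehaving task starves everything else.)

      #include <stdio.h>

      typedef void (*task_fn)(void);

      void task_a(void) { puts("task A: did a little work, yielding"); }
      void task_b(void) { puts("task B: did a little work, yielding"); }
      void hog(void)    { for (;;) { /* never yields: the whole "system" hangs */ } }

      int main(void)
      {
          /* Round-robin "scheduler": it can only run the next task when the
             current one returns. Swap hog into the table and nothing else runs. */
          task_fn tasks[] = { task_a, task_b /*, hog */ };
          for (int round = 0; round < 3; round++)
              for (unsigned i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
                  tasks[i]();
          return 0;
      }

    With preemptive multitasking the kernel's timer interrupt takes control back whether or not a task cooperates, which is what OS/2 (and later NT, and Windows 95 for 32-bit apps) did.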

    Windows NT would be where MS tech caught up with IBM; OS/2 had been out in the wild for quite some time by then.

    But the general public didn't embrace OS/2, and Windows 95 effectively killed it.

    • dragonwriter 6 years ago

      > Windows NT would be where MS tech caught up with IBM

      OS/2 was originally a Microsoft/IBM joint product, and work on NT at Microsoft began during that time as “NT OS/2”; the cooperation broke down in large part over the different directions MS and IBM wanted to take OS/2.

      • nickpsecurity 6 years ago

        Although that's true as far as I know, people new to the topic might misread it as saying NT was a derivative of OS/2 that they ran with against IBM. It was an enhanced clone of OpenVMS by the OpenVMS team they poached, per Russinovich:

        http://windowsitpro.com/windows-client/windows-nt-and-vms-re...

        They kept the better architecture, which could eventually be turned into a solid server OS. For time to market, they ditched the quality, the high availability, etc. They added a GUI. It was backward compatible with DOS apps, plus compatible with OS/2 stuff if I remember right. Tada! Eventually they added quality and security back in with the SDL, plus clustering. Bill had already achieved dominance at that point, with OS/2 and every other desktop being an also-ran.

        As far as OS/2 goes, I read that the original versions of NT were developed on OS/2 workstations, which the developers gave up grudgingly when forced to dogfood NT. They also used UNIXen for some server stuff and ran the business on an AS/400. They seem to have just used whatever was best at each thing, with a long-term plan to replace it all with their own competing products, copying one, improving on others, and integrating some (i.e. open source).

        • Joeri 6 years ago

          For those interested in the topic, there’s a fascinating book “Show Stopper” detailing the making of NT.

          https://www.goodreads.com/book/show/1416925.Show_Stopper_

          The core NT design, with its kernel personalities, is what allowed them to do things like add Linux support, but in the book they describe how there was so much pushback, because it pushed RAM requirements up to 8 MB (beyond the abilities of consumer PCs at the time), that Gates himself had to intervene several times to keep Cutler's architecture intact.

          I also really loved the bits about Cutler’s personality. They came across as more myth than fact but you do get the sense that working for him must have been a singular experience.

        • kyberias 6 years ago

          A thing to note is that NT started with portability in mind, and the first processor it ran on was a MIPS.

          • mrbill 6 years ago

            I ran it on a MIPS Magnum R4000 for a while. I forget the name of the compatibility feature, but you could run 16-bit x86 Windows apps (they may have been 32-bit, it's been 20 years) with only a slight performance hit.

          • digi_owl 6 years ago

            And it demonstrated how valuable binary compatibility across generations is, as Windows has never really gotten off the ground outside of x86.

            This is in large part because of corporate and consumer demand for being able to run their existing software on new computers.

            Something that both the FOSS world and others should take note of (and no, app stores do not remove this issue).

            • kyberias 6 years ago

              Well, I guess at the time there just wasn't any demand for Windows for Alpha or MIPS workstations. PCs had huge demand.

              • scruffyherder 6 years ago

                Oh, there was, just from people with deep pockets. Have some application written in VB6? Need it to go faster, where money is no object? Get VB for the DEC Alpha. I've never actually seen anyone do that, but there was such a thing.

                Now when it came to a massive DEC Alpha to run SQL Server, then absolutely. It was the ultimate hardware solution to a software scalability problem, and it was not cheap.

              • pjmlp 6 years ago

                The arrival of free UNIX clones also helped, as the companies that might have transitioned to such Windows systems migrated to BSD and Linux distributions instead.

          • nickpsecurity 6 years ago

            Yes. Its predecessor is still a work-in-progress port to x86, rather than actually running on it, due to its less portable design. That was a real improvement by the NT team.

        • skissane 6 years ago

          > It was an enhanced clone of OpenVMS by the OpenVMS team they poached, per Russinovich

          I think the word "clone" is too strong. DR-DOS is a clone of MS-DOS because it attempts to implement the same APIs, with the objective that most (ideally all) software written for MS-DOS would run on DR-DOS without modification. By contrast, Windows NT doesn't implement any of the APIs of VMS, and VMS software cannot run on Windows NT without modification. Microsoft did take people and high-level ideas from VMS, but they were never trying to build a clone of OpenVMS, which would require aligning API details (as opposed to just high-level concepts).

      • yuhong 6 years ago

        The OS/2 2.0 debacle was so bad, BTW, that it is my favorite topic (especially the MS part). I have mentioned before that DR would not easily have been able to clone OS/2 the way they did DR-DOS.

        • Zardoz84 6 years ago

          DR-DOS was not a clone of MS-DOS. The truth is that it was CP/M-86 with stuff added to make it MS-DOS compatible. I used DR-DOS 5 when I was a child and I remember that it worked better than MS-DOS 6. Sadly, Microsoft used a dirty trick: Windows 3.1 would deliberately fail if it detected any OS that wasn't MS-DOS.

          • einr 6 years ago

            Sadly, Microsoft used a dirty trick: Windows 3.1 would deliberately fail if it detected any OS that wasn't MS-DOS.

            That code, while present in the codebase, was deactivated in the release version of Windows 3.1, so it actually ran just fine on DR-DOS.

          • yuhong 6 years ago

            Early versions of DR-DOS were (kind of), but eventually I think they redesigned the kernel to use true DOS data structures, etc.

            • Zardoz84 6 years ago

              DR-DOS 5 still had the old CP/M sauce, a BDOS-like kernel renamed "IBMDOS.COM". They may have removed old CP/M stuff that was no longer necessary for running MS-DOS programs, but the source code remained an evolution of the old CP/M.

          • orionblastar 6 years ago

            MS-DOS mirrored CP/M-80 API calls so programmers could convert CP/M programs to DOS. That is what I heard.

            • pjmlp 6 years ago

              Yes, that was the case.

    • kjs3 6 years ago

      OS/2 was 16-bit until 2.0 (and technically still had 16-bit chunks under the hood after that). Windows NT was where MS 'forked' OS/2 after the 2.0 days; IBM and MS were partners in developing OS/2.

      • scruffyherder 6 years ago

        The real joke is that the '32-bit' version was written in parallel, and suppressed by IBM.

        You can read more, and actually run them here: https://www.pcjs.org/blog/2016/02/08/

        It's a little tricky to get the 1987 version running ( https://www.pcjs.org/disks/pcx86/os2/misc/football/87058/ ), but there it is: OS/2 1.0 hacked up with 32-bit extensions to support multiple v86 machines. Windows/386 was a thing in 1987, along with Xenix on the 386. OS/2 could have been there too, but IBM was too busy trying to cover its bases with the 5170, the brain-dead 286 machine that dragged down the entire industry.

        Besides delaying the 32-bit version for years, they also wouldn't let Microsoft just port Windows over to run on top of OS/2. And of course SAA had to be 180 degrees opposite of Windows, out of spite.

        It's really no surprise that once Windows 3.0 started to sell, NT OS/2 3.0 suddenly became Windows NT 3.1.

      • newnewpdro 6 years ago

        And those things make OS/2 stand out how exactly?

        I'm under the impression OS/2 wasn't an attractive OS until 3.0 came along.

        • kjs3 6 years ago

          I didn't say that they made OS/2 stand out, and I obviously wasn't addressing that. But since you asked: OS/2 1.x and 2.x were reasonably stable, multitasking, networked server operating systems with an advanced (for the time) filesystem and other interesting features, with early versions delivering that on an i286. And there were a couple of (possible) killer apps: Notes, DB2 (on something smaller than an AS/400), etc. You can debate whether it was superior to Novell, Banyan, Xenix or a bunch of other contemporary competitors. It was also a usable, though less clearly advanced, desktop operating system at the time.

      • kyberias 6 years ago

        Windows NT started from scratch. It was never an OS/2 fork.

        • danboarder 6 years ago

          Not from scratch. It started as "OS/2 NT" in partnership with IBM. Microsoft renamed OS/2 NT to Windows NT. See here: http://www.itprotoday.com/management-mobility/windows-nt-and...

          • kyberias 6 years ago

            Yes from scratch. Names matter little.

            From your source: "Microsoft's internal project name for the new OS was OS/2 NT, because Microsoft's intention was for the new OS to succeed OS/2 yet retain the OS/2 API as its primary interface."

            Note how Russinovich talks about a "new OS".

            The NT kernel never had any OS/2 code in it; it was a totally different project. The first target processor architecture for NT wasn't even x86; it was MIPS.

            OS/2 in NT was merely a subsystem layer next to DOS and POSIX, somewhat like the Linux subsystem layer in Windows 10 today.

            A good source is this:

            https://www.amazon.com/Show-Stopper-Breakneck-Generation-Mic...

            • newnewpdro 6 years ago

              > The first target processor architecture for NT wasn't even x86; it was MIPS.

              I thought it's pretty well established now that the NT in the name was named after the target architecture: Intel i860 "N10", not MIPS.

              • kyberias 6 years ago

                You're right. My point was, however, that it wasn't an x86.

            • kjs3 6 years ago

              Agreed. I don't mean 'fork' in the sense that they took the code and went their own way; I mean 'fork' in the sense that they took all the IP and lessons learned and reimplemented an OS/2 they had sole control over.

              "retain the OS/2 API as its primary interface"

              Linux isn't Unix.

              KJ

          • Zardoz84 6 years ago

            Also, HPFS was like NTFS v0.9.

        • kjs3 6 years ago

          It was in no way started from scratch. Microsoft took all they had learned from OS/2, including the API, and had basically the same team (including David Cutler, who was originally in charge of "OS/2 v3") reimplement an operating system they didn't have to share with IBM. To pretend NT is somehow not directly related to OS/2 is the worst sort of historical revisionism; it's in fact a lie.

  • nickpsecurity 6 years ago

    Here's a nice article about it:

    http://techland.time.com/2012/04/02/25-years-of-ibms-os2-the...

    The lasting benefit it seems to have, as jacquesm said, is reliability. Here's a claim from the article that I doubt, but it shows they rarely crash:

    "Oh, and another thing: OS/2, Waldhauer, says, doesn’t crash. It’ll run for a decade without requiring rebooting."

    It's simple. It doesn't change a lot. It rarely crashes, if that's to be believed. It's integrated into important systems thanks to the legacy effect. It also has at least one company at any given time updating it to work on modern stuff. So that sounds like a set of benefits a lot of big companies would be interested in. I think the drawbacks are too significant, though: it's a closed, barely-living technology without much of an ecosystem, and with who knows what security or high-availability attributes.

  • tyingq 6 years ago

    It competed with Windows 3.1, but had preemptive multitasking versus Windows' cooperative multitasking. It often ran Windows programs faster/better than Windows did, and it had pretty good MS-DOS compatibility. For other reasons, it failed. It was the OS of choice for a lot of bank ATMs, though.

    • jacobush 6 years ago

      The way it ran Windows apps was by loading Win 3.1 under the hood. IIRC Microsoft hurt IBM with licensing here.

  • YZF 6 years ago

    It had Rexx as a system scripting language and also a full TCP/IP stack vs. the kludges that were part of Windows at the time...

    • tyingq 6 years ago

      Hmm. I remember KA9Q, Crynwr packet drivers, Trumpet Winsock, etc. I don't think the creators of any of these got enough credit or compensation. Probably other similar solutions as well. I think there was an open source ne2000 driver with similar popularity.

    • kitd 6 years ago

      Interesting fact: when you ran a Rexx program, the OS precompiled it to IR and prepended it to the script. As a result, Rexx programs could run pretty quickly (for the time).

      Source: wrote whole systems in OS/2 Rexx!

  • jacquesm 6 years ago

    It was a real contender to Windows: an actual OS rather than a bunch of cobbled-together APIs pretending to be an OS. It was also super stable. And it got killed in the marketplace.

  • pjmlp 6 years ago

    The component system, called SOM, was more OO than COM and even allowed for metaclasses.

    IBM was the only company, after Lucid, that played with using a Smalltalk-like development environment for C++ on OS/2, but it required too many resources.

  • sedatk 6 years ago

    DOS was becoming outdated and insufficient in the '90s: no multitasking, no crash protection, no unified UI, an unreliable file system. Yet it ran on millions of computers.

    OS/2 was one of the spiritual successors to DOS. It was DOS and Windows compatible, could run DOS apps in parallel, and apps wouldn't crash the whole system. It came with a reliable file system called HPFS as well.

    Its direct competitor was Windows 95, which bet on better compatibility and better hardware support and so eventually prevailed. After switching to the Windows NT kernel with Windows 2000, Windows had no technical shortcomings left compared to OS/2.

FullyFunctional 6 years ago

This is impressive work, but I was particularly interested in his passing mention of running Linux apps on Mac. I could use that (EDA tools support Linux at best, if even that). Lo and behold, there's something called Noah that claims to do this. Is there anything else? (I'm a heavy user of VMware Fusion, but that's pretty heavyweight.)

yuhong 6 years ago

This reminds me that when SYSCALL was modified to change EFLAGS and SWAPGS was created, they only cared about Linux, which didn't use call gates (not an issue here, since this API emulation runs in user mode). It also reminds me of espfix, which is generally not needed for OS/2 programs to work correctly.

orionblastar 6 years ago

OS/2 2.x had WinOS2 to run 16-bit DOS and Windows programs. OS/2 3.0 did 32-bit OS/2, but not 32-bit Windows.

Still, running 16-bit DOS and Windows programs meant that developers did not target an OS/2 binary, because a Windows binary was all they needed.