WalterBright 14 days ago

The article neglects to mention Datalight C, the first C compiler using data flow analysis optimizations. It did such a good job that the magazine reviewers decided it was cheating by deleting the meat of their benchmark codes. What actually was going on was the benchmark code was proven by DFA to do nothing, so was removed.

Within a year other compilers started doing DFA.
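
For illustration, here's a made-up benchmark kernel of the sort DFA can prove does nothing (not the actual magazine benchmark):

     /* the result is never used and there are no side effects, so data
        flow analysis can prove the whole loop is dead and delete it */
     int benchmark()
     {
         int i, sum = 0;
         for (i = 0; i < 30000; i++)
             sum += i;
         return 0;     /* sum is never read, so the loop does nothing */
     }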

The article also neglects Zortech C++, the first native-code-generating C++ compiler. ZTC++ was the catalyst that made C++ a major language, as the PC was where 90% of the programming was. Before ZTC++, C++ and Objective-C were neck and neck, judging from the volume on comp.lang.c++ and comp.lang.objective-c. With ZTC++, the volume of the former exploded, and Objective-C disappeared into oblivion (later resurrected by Apple for a while).

  • kragen 13 days ago

    interestingly the article claims (that tiemann claims that) g++ came out in 01987, making it the first native-code-generating c++ compiler, and http://www.edm2.com/index.php/Zortech_C%2B%2B says zortech c++ came out in 01988. did zortech c++ actually come out earlier than that? or, did g++ actually come out later than that?

    if not, i think tiemann beat you by a bit there, though i agree that zortech was more influential throughout the 80s and 90s

    https://gcc.gnu.org/releases.html says the first version of gcc to include g++ was 1.15.3 on 01987-12-18. unfortunately it doesn't seem to be preserved in https://ftp.gnu.org/old-gnu/gcc/ where the oldest version present is gcc 1.23 from 01988-06-26; possibly there's a comp.sources archive or something?

    • WalterBright 13 days ago

      Tiemann released his compiler in Dec 1987, labeling it a beta release. ZTC++ came out in April 88, and was a shrink-wrapped commercial product. I obviously passed ZTC++ around to friends and associates before it was released, but didn't keep any records of that. That was in the days of mailing floppy disks around.

      It comes down to whether you consider a beta a release or not.

      I don't know if Mike Tiemann wrote his own code generator or not. ZTC++ was written entirely by myself, from preprocessor to object file.

      • kragen 13 days ago

        he didn't write his own code generator, no; g++ used (and uses) gcc's code generator, which i think he'd written some of, but not nearly all of. and of course it used gcc's preprocessor, cccp, as well. but it wasn't generating c output like cfront, it was producing the same ir the c frontend did

        according to that page, there were four more numbered versions of gcc before april, but i don't know enough to say whether they were beta-quality or not; i didn't start using gcc myself until four years later. but if you were on the net at that time you could get a copy of g++, beta or no, while zortech's product wasn't available yet at any price

    • j16sdiz 13 days ago

      Funny that you prefix 1987 with 01987, as if 5-digit years are needed anytime soon... yet conveniently saying 80s or 90s without the "19" part, like we won't exist in 50 more years.

      • kragen 13 days ago

        you won't

        • ngcc_hk 11 days ago

          Not me but for 30+ high chance

  • kragen 14 days ago

    these are indeed significant omissions, and you wrote both datalight c and zortech c++

    • WalterBright 13 days ago

      Some compilers I've written are Northwest C, Datalight C, Zortech C, Zortech C++, Symantec C++, Digital Mars C++, and the Digital Mars D compiler. The last one, DMD, still has code in it from those previous compilers :-/

      I've also written a Java compiler (not from scratch) and a Javascript compiler/interpreter, and designed the ABEL (Advanced Boolean Expression Language) PLD compiler.

      • kragen 13 days ago

        i don't know if you should admit to that last one, you might have to hide from people who program cplds

        • WalterBright 13 days ago

          ABEL is a far easier tool for programming PLDs than CUPL or any of the other such languages.

mannyv 14 days ago

This totally neglects to mention that the commercial C compilers kicked gcc's ass when it came to performance. Sun's cc and IBM's xlc were substantially better across the board, and I know the latter supported profile-based optimization.

There were also a bunch of commercial c compilers, all of which I've forgotten.

gcc was the lowest common denominator compiler, with the benefits and drawbacks of being in that position.

In any case gcc won the *nix compiler wars because it was free and easily accessible. Getting a license for the commercial compilers took work and/or funds, and the latter was something most unix users didn't have.

  • johngossman 14 days ago

    And they weren’t cheap either. The Sun compiler was over $1000 IIRC.

    • pjmlp 14 days ago

      That was the reason why GCC adoption took off in the first place.

      During the UNIX freebie days no one cared about Stallman's freedoms.

      After Sun became the first UNIX vendor to split UNIX into multiple SKUs, one for developers and one for users, GCC suddenly became relevant after all.

      For languages like Ada it was even worse, because the SunOS/Solaris SDK only contained the traditional UNIX compilers; something like the Sun Ada compiler was an extra purchase.

      • anthk 14 days ago

        You all forgot something: on stability and speed, Unix compilers rot over time.

        From 1996-1999, the stable choice on both speed and reliability was the GNU one.

        https://pages.cs.wisc.edu/~blbowers/fuzz-2001.pdf

        FVWM ran circles around MWM or CDE itself in speed. Rxvt was much lighter than an xterm; not everyone needed to plot Tek graphs. Even some late Irix users preferred JWM to the proprietary options.

        Nowadays GNU is the slightly bloated one, with Guix as the 'essential' GNU distro, making lots of 32-bit machines without SSDs a crawling nightmare to install.

        • pjmlp 14 days ago

          Can't say much in that regard, because we only used GCC on Linux distributions; everywhere else it was the vendors' C compilers.

          Interesting that you mention stability, exactly during the GCC vs egcs politics time frame.

          • guenthert 13 days ago

            I worked in a small shop in the late nineties where we used gcc during development within the MS Windows environment, because MS' own compiler (Visual C++, iirc) produced either (depending on settings) no warnings or buried the relevant ones under warnings about questionable code in the system header files. We had to use MS's compiler for the final product, as gcc couldn't produce shared libraries in that environment at that time; otherwise we would have gladly used only gcc.

            • pjmlp 13 days ago

              Anyone that tries to use UNIX stuff on non-UNIX OSes ends up having to deal with interoperability pains, hence why cygwin and mingw keep being leaky abstractions, regardless of how much better GCC might generate raw code versus VC++.

              Additionally, nowadays clang also ships alongside VC++, and is officially supported by Microsoft as well.

        • nineteen999 14 days ago

          > From 1996-1999, the stable choice on both speed and reliability was the GNU one.

          Meh, gcc 2.8 was buggy as hell on Linux/i386 by around 1998, as was gcc 2.95 up to 2.95.3. The issues with it are what led to the egcs fork which eventually replaced the original gcc branch. So it wasn't all sunshine and roses.

          • guenthert 14 days ago

            I thought it was missing features in the C++ compiler and its less-than-perfectly-clear error messages which led to the split. The C compiler was pretty sound early on. Sure, occasionally there were bugs, but users, including Linus Torvalds, complained loudly about those, because expectations were already high.

          • CalChris 13 days ago

            EGCS was started because of Stallman. They refused to criticize him directly but it was clear from their Declaration of Independence email that it was Stallman. They even used its language but avoided Stallman’s name to make the eventual merge politically possible.

            https://gcc.gnu.org/wiki/History#EGCS

      • einpoklum 13 days ago

        > During the UNIX freebie days no one cared about Stallman's freedoms.

        On the contrary, they cared plenty, because GNU freedoms let them have a usable, reasonable-quality (and sometimes high-quality) environment both as developers and as users, which they could otherwise not get unless they were rich or employed by a corporation (who would then have them do what it wanted, not what they might have wanted to spend their time on).

        • p_l 13 days ago

          GCC usage IIRC exploded when Sun made their compiler suite an extra paid option instead of bundling it as part of the standard SunOS install.

          Before that, GCC was of interest particularly on platforms that didn't provide a good cc - and there were a lot more platforms then, with varying quality of software - whereas by comparison the big-name workstation vendors that survived longer also had higher-quality compilers.

          When the compiler suite was included as standard, GCC had much less of a benefit for many users (I think gdb actually might have driven more adoption?)

        • pjmlp 13 days ago

          No they didn't; that is why, before Stallman, most UNIX stuff was either free because AT&T wasn't allowed to sell UNIX as a commercial product, or under an MIT/BSD license.

          Which is how the UNIX workstation market came to be, and how even Windows ended up with BSD code in its networking stack.

      • matheusmoreira 14 days ago

        > During the UNIX freebie days no one cared about Stallman's freedoms.

        No one cares about their freedom until they are denied it.

        • pjmlp 14 days ago

          The rise of non-copyleft licenses proves that the only freedom most people care about is their wallet's.

          Most GPL software won't survive its authors; even for the Linux kernel it remains to be seen what happens when Linus et al. are gone and it gets taken over by newer, VC-driven generations.

          • kragen 13 days ago

            non-copyleft free-software licenses like the bsd license or the 'mit license' provide plenty of freedom in stallman's sense; you can study, copy, modify, and redistribute what they cover. when you're talking about people who only get wallet freedom, you're talking about free-tier proprietary licenses, shareware, free-for-noncommercial-use licenses, illegally copied software, that kind of thing. and while certainly there are lots of people using tinkercad or pirated windows, obviously those ecosystems don't have the vitality of blender, netbsd, and linux

            • pjmlp 13 days ago

              It isn't the same, otherwise those licenses wouldn't exist, nor be fully embraced by those who don't want to be tainted by the freedom ideology.

              • kragen 13 days ago

                it isn't the same, no

          • medo-bear 14 days ago

            > The rise of non-copyleft licenses proves that the only freedom most people care about is their wallet's.

            This ignores how much freedom the wallet requirement takes away.

            • pjmlp 13 days ago

              Apparently most people care more about being able to make a living out of selling software, as supermarkets are quite bad at taking GitHub stars, while others would rather not pay for the work of others.

              Nothing related to Stallman's freedoms.

              • medo-bear 13 days ago

                A lot of people also make money standing on the shoulders of people who made free software.

                • matheusmoreira 13 days ago

                  Which is why the only good free software license is the AGPLv3.

                  https://web.archive.org/web/20120620103603/http://zedshaw.co...

                  > Why I (A/L)GPL

                  > I want people to appreciate the work I’ve done and the value of what I’ve made.

                  > Not pass on by waving “sucker” as they drive their fancy cars.

                  Anything other than AGPLv3 is really just transferring wealth directly into the pockets of the beggar barons.

                  https://zedshaw.com/blog/2022-02-05-the-beggar-barons/

                  • pabs3 13 days ago

                    There is no reason those barons can't make money on top of AGPLv3 software too; it's easy to comply with, if they thought anyone would bother to enforce the license.

                    Also, if you have a desirable app that they want to make money off, they have the ability to just reimplement it from scratch, or make a protocol-compatible equivalent, so the license does not matter in the slightest to them.

              • matheusmoreira 13 days ago

                Plenty of people seem to be making decent livings with GitHub Sponsors.

                • pjmlp 13 days ago

                  As much as street artists manage to get by.

  • kragen 13 days ago

    gcc started being competitive with most of the proprietary compilers quite early on, as i recall; for a number of processors, notably including the ns32000, it reputedly produced better code than the vendor compiler

  • klelatti 14 days ago

    > This totally neglects

    > … Part 1 …

stevekemp 14 days ago

I've recently been working on a CP/M emulator, and having a lot of fun working with the Aztec C compiler referenced in the list.

It's strange to go back to old-school function definitions:

     main(argc, argv)
      int argc;
      char *argv[];
     {
But a C compiler in only 40k, assembler in 20K, and linker in another 20K is a whole world of tiny software compared to what we use right now.

  • nineteen999 14 days ago

    The Hi-Tech C compiler for CP/M works quite well also, and generates pretty compact code.

    sdcc can be used to cross-compile for CP/M from a modern PC, if you provide a crt0.rel and libc. I rolled my own and was able to port a few useful programs and utilities with it.

    • anthk 14 days ago

      You might like this:

      https://t3x.org/t3x/0/index.html

      A Pascal-like language; it compiles on Unix and to DOS and CP/M, and it can run under CP/M too. It can compile to both binaries and bytecode.

      I ran a faithful Ladder port natively on a 386 under a GNU Unix; it's really good:

      https://t3x.org/t3x/0/programs.html

    • stevekemp 14 days ago

      I haven't gotten round to trying the Hi-Tech compiler yet, primarily because it won't run under my emulator. I just added a couple of missing syscalls but the only output I see is "Out of memory".

      (Removing the "-v" flag causes it to generate some submit files, with $$$ suffix, but I don't yet support those in my CCP.)

      I'll try it on real hardware next week and if it works there I guess that'll be another fun bug to hunt down!

      • nineteen999 14 days ago

        I didn't bother with emulating CP/M itself, just the Z80 CPU+RAM and disk interface, which I made compatible with Udo Munk's z80pack. I was able to use the stock z80pack BIOS/BDOS so I didn't have any issue getting it to run.

        You are going deep!

        • stevekemp 14 days ago

          Yeah I'm doing things the hard way, for sure! But the project grew little by little, and I'm enjoying the process.

          (And it has to be said one of the reasons I'm interested is because I have a Z80-based single-board computer which runs CP/M natively which helps for testing things.)

  • fuzztester 11 days ago

    >It's strange to go back to old-school function definitions:

    Ha ha, yes.

    The first edition of The C Programming Language book by Kernighan & Ritchie (the book popularly called just K&R) used those definitions.

    The second edition (the ANSI C one) used the newer kind of definition, the ANSI prototype style, with both function parameters and return values having explicit (mandatory?) types.

    I remember religiously using the ANSI style as soon as the C compilers I used supported it.
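
    For contrast with the K&R snippet upthread, the same definition in the ANSI prototype style (a minimal, made-up example):

         int main(int argc, char *argv[])
         {
             return 0;
         }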

  • pjmlp 14 days ago

    Around 2000, the HP-UX 10 aC compiler that we had available still wasn't fully ANSI/ISO C89 compliant for function declarations; talk about taking its time.

  • fuzztester 13 days ago

    Turbo Pascal 3.x - under 40K, w/ basic text editor built-in. But could only create .COM files, not .EXEs. Lightning-fast DX.

    • kragen 13 days ago

      there's no such thing as a .exe file

      • fuzztester 12 days ago

        Of course there was, and is.

        • kragen 12 days ago

          nope, totally not a thing in cp/m, sorry

          • fuzztester 11 days ago

            Replying a bit late ...

            I never said a word about CP/M, sorry.

            Referring to my original comment that you replied to, I was talking about Turbo Pascal on DOS, which (DOS, not Turbo Pascal) did have EXE files, from early versions that I had used. But the TP version I mentioned could only create .COM files.

            In fact, I remember the DEBUG utility was DEBUG.EXE, again from early DOS versions, although it was possibly DEBUG.COM earlier.

            • kragen 11 days ago

              > Referring to my original comment that you replied to, I was talking about Turbo Pascal on DOS,

              you posted that comment in reply to stevekemp talking about the c compilers he was trying on his cp/m emulator; maybe you clicked the wrong reply link

              plausibly early versions of tp could only create .com files because they originally came from cp/m before being ported to ms-dog

              > it was possibly DEBUG.COM earlier

              yes, though the cp/m version was ddt.com, with a rather rebarbative ui reminiscent not of ddt but of teco

              • fuzztester 11 days ago

                >you posted that comment in reply to stevekemp talking about the c compilers he was trying on his cp/m emulator; maybe you clicked the wrong reply link

                No, I clicked the right reply link that I meant to.

                But I see the issue now.

                Although his context was small sized compilers on CP/M, my (implied) context was not.

                Mine was just small sized compilers, period.

                My comment about TP 3 was in that context, and was not referring to CP/M. IOW, I was just saying, TP 3 was another small compiler (and I omitted saying that it was on DOS). I did not even know that TP had existed on CP/M, although I had used that OS briefly, before DOS, and had actually used a Pascal compiler on it, in a programming course I was attending. I remember the machine had those big 8-inch floppy disks. (1) But that was not the TP compiler, it was some other one, I don't remember which.

                So I can understand why you thought I was talking about CP/M, and hence said that it had no EXE files.

                (1) Or it might have been an MP/M machine, because I vaguely remember that we had to log in to it.

                https://en.m.wikipedia.org/wiki/MP/M

                • kragen 10 days ago

                  admittedly my initial reply wasn't that helpful in clarifying ;)

                  yeah, tp3 is great. there were a lot of pretty decent compilers for ms-dog; being able to address hundreds of kilobytes of ram (even if awkwardly) and having separate 64k address spaces for code and data really reduces the difficulty of getting a decent compiler running. the compiler scene for cp/m is fairly dismal by comparison

                  also, the 8080 and even the z80 were a lot less hospitable to c compiler output than the 8088. on many small processors, though not the z80, sdcc by default compiles everything that isn't specifically marked as __reentrant as non-reentrant, so your parameters and local variables are statically allocated; this isn't compliant with the c standard but is surely a better default tradeoff for the 8080 (which sdcc doesn't support) and probably for the z80 as well. see https://sdcc.sourceforge.net/doc/sdccman.pdf#page=49 for details
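
                  as a rough illustration (invented function names, and the default differs per port; e.g. the mcs51 backend is non-reentrant by default):

                       /* parameters and locals of f() may be statically
                          allocated on a non-reentrant-by-default port */
                       int f(int x)
                       {
                           int y = x + 1;
                           return y;
                       }

                       /* __reentrant asks sdcc for stack allocation,
                          so g() can safely call itself */
                       int g(int x) __reentrant
                       {
                           return x <= 1 ? 1 : x * g(x - 1);
                       }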

                  • fuzztester 10 days ago

                    >admittedly my initial reply wasn't that helpful in clarifying ;)

                    NP :)

                  • fuzztester 10 days ago

                    >yeah, tp3 is great

                    Sure was, in many ways. That's why it had so many hardcore fans, and why people still talk about it.

                    Speaking to your second paragraph, IIRC, TP 3 even had overlays, although I never used that feature, maybe because of being in the early stages of my career.

      • cowboylowrez 13 days ago

        no, I've seen them, they exist.

        • kragen 13 days ago

          you can't run them on cp/m

          • cowboylowrez 13 days ago

            TIL! so if I wanted to save an exe to cpm, would "3x3" as a file extension work instead? are there other prohibited file name extensions?

            • kragen 13 days ago

              oh, you can store them on a cp/m disk, you just can't run them under cp/m

  • kragen 14 days ago

    have you tried bds c? it's fully free software now, runs under cp/m, and is reputed to be much nicer than aztec c, so i'm curious to hear your point of view

    fwiw i suspect implementing only ansi c function headers and declarations would result in a slightly smaller c compiler than implementing only the k&r style ones

    • stevekemp 13 days ago

      I downloaded it thanks to your comment, from this site:

      http://www.cpm.z80.de/develop.htm

      Worked first time to create a simple "Hello World" program, but many of the included examples fail to compile for me, with random errors:

           RM.C: 20: String too long (or missing quote)
      
           CP.C: 56: Curly-braces mismatched somewhere in this definition
      
      I'll have to explore more thoroughly later, but thanks for the prod!

      • kragen 13 days ago

        hmm, that's disappointing! are there in fact missing quotes and mismatched curly braces in those files? i wonder if you got a corrupted copy of the compiler

        • stevekemp 13 days ago

          The code looks good; there seem to be some surprises with files that use "#include <stdio.h>" or similar. The include lines are the ones the errors refer to - but since those files come with the compiler distribution I'd have expected their contents to be well-formed.

          I removed the includes from a few files, changed FILEs to ints, and added "#define EOF -1", etc, which let some more simple code compile & link. But it felt like those changes shouldn't have been necessary.

          I guess the good news is the compiler ran without me having to add any new CP/M syscalls/bdos functions to my emulator - unlike hisoft which required that I implement "T_GET" (Get date and time) and a couple of other functions I'd been missing such as F_SIZE (For getting a file size).

MaxBarraclough 14 days ago

> According to Michael Tiemann:

> > I wrote GNU C++ in the fall of 1987, making it the first native-code C++ compiler in the world.

I thought Walter Bright's Zortech C++ compiler held that honour, but apparently [0] that was released in 1988.

[0] http://www.edm2.com/index.php/Zortech_C%2B%2B

johngossman 14 days ago

This makes me feel very old. I remember that issue. I was a Pascal snob at the time but interested because C had a reputation for speed. A year later I got my first job and read K&R, not realizing I would be using C and C++ for most of my career. It mentions Lattice C. Does anybody here remember if Microsoft C was a fresh start, or did they license the entire codebase and use it as a starting point? I used Lattice on both PC and Amiga, but quickly dumped it on Amiga in favor of Aztec C, for which I still have fond memories. It was much faster and produced an intermediate ASM file which was useful for figuring out how to optimize. Nostalgia is a powerful drug. I'm sure I'd blanch with horror having to develop with those tools today.

  • eesmith 14 days ago

    The linked-to article says:

    "One that has survived is Lattice C, one of the most performant of the compilers tested by Byte. Lattice C was so good that it was licensed by Microsoft and sold as Microsoft C V1.0 before being replaced by Microsoft’s own in-house Microsoft C V2.0 compiler."

    • johngossman 14 days ago

      I read that, but I'm not convinced it is correct. It is at best ambiguous--development could have gone in-house but started with the Lattice codebase. It is also incorrect at least as to version numbers. MSC 3.0 was the first version that wasn't developed by Lattice, whatever the provenance of the source.

      • eesmith 14 days ago

        Peter Norton wrote this in 1983:

        "The Microsoft C compiler has interesting historical roots. Although Microsoft itself works with C, this compiler is not a direct Microsoft product. Instead, it is an adaptation of the famous and highly regarded Lattice C compiler."

        There are also ads like https://archive.org/details/PC-Mag-1984-09-04/page/n51/mode/... saying "Microsoft C compiler / Includes Lattice C and the MS Librarian".

        I can't find info about v2 or v3.

        Edit: https://winworldpc.com/product/microsoft-c-c/2x says "Microsoft C 1.0 and 2.0 are a rebranded version of Lifeboat Associates Lattice C", so your doubt seems well founded.

        • pjmlp 14 days ago

          Another historical note: if I remember correctly, Microsoft was the last C compiler vendor on MS-DOS to add C++ support, which happened with the Microsoft C/C++ 7.0 release, their last release for MS-DOS as well, before Visual C++ came to be.

teo_zero 14 days ago

I still remember my first steps with C on my Amiga, after years of BASIC. I opted for the Dice compiler because Lattice was too expensive. At that time the norm was to send the money via mail and receive floppy disks in exchange via the same means. By pure chance, the Dice distributor in my country lived in my city and we could close the deal in person, in a parking lot like smugglers or pushers!

  • robinsonb5 14 days ago

    Dice was very nice - and later on vbcc was pretty cool, too. In fact vbcc is so nice I used it for a toy CPU project a couple of years ago, because its backend interface is unusually well-documented and accessible. (I think tangling with either gcc or llvm would likely have added another six months to the project.)

kragen 14 days ago

i would have thought 01983 was a bit early for

> many insist that C is the programming language and that it will last forever

but it's 41 years later and c is #2 on https://www.tiobe.com/tiobe-index/, followed by c++, which is almost a superset. items #4, #5, and #6 use c syntax but are high-level languages, and item #7 is a redesign of c as a high-level language with different syntax by, in large part, the bell labs team that designed c in the first place. the #1 item is a high-level language that departs significantly from c syntax, but its almost-universally-used interpreter is written in c

41 years isn't forever but it's longer than computers had existed in 01983. so that's not a bad showing, especially given the thousands of programming languages that existed before c

______

(for future reference, the somewhat questionable ranking in the article i linked is currently python, c, c++, java, c#, js, golang, 'visual basic', sql, fortran, delphi/object pascal, assembly language, ruby, swift, scratch, matlab, php, kotlin, rust, and r.)

  • phicoh 13 days ago

    Yes, both C and Unix are amazing designs. Far from perfect, but good tools for many jobs. In my opinion, it is now time for Rust to take over. But comparing Rust to C shows it took a huge amount of improvement over C to replace it.

    • kragen 13 days ago

      i don't think rust has replaced c, or is likely to

      preceding c, there was a large, complex, portable algol-family language capable of handling a range of applications similar to c's, called pl/1. if you read 'the elements of programming style', from about the time c was born, many of the examples are in fortran; most others are in pl/1. gary kildall's first job working on microcomputers was at intel writing pl/m, a cut-down pl/1 for microcomputers, for which he wrote cp/m, from which ms-dos grew. ibm's database db2 is still written in a dialect of pl/1 called pl/x. one of wirth's first languages was a thing called pl/360, which was a version of pl/1 with extra low-level facilities for bootstrapping a better language

      in short, pl/1 was very widely used and highly influential, even though it's almost forgotten today

      and that's because it kind of sucked. compiling it required a large computer and even then was slow. even at the beginning, it was highly complex, so learning it took a long time. this got worse over time as more and more features were piled onto it, often as ill-advised efficiency hacks that then couldn't be removed, because people's programs depended on them. because it was highly complex, and there were a lot of surprising problems that resulted from complex interactions of all its complex features; compiler developers improved this situation significantly by working hard to provide better error messages, but that only diminished the problem rather than eliminating it

      the preceding paragraph describes rust exactly as well as it describes pl/1, so if someone succeeds in producing a c-like small, simple language that provides enough of rust's unique advantages (which are indeed very compelling) i expect that it will replace rust

      i doubt it will replace c tho

      • phicoh 13 days ago

        In my opinion, Rust, though large, is in some sense quite minimal. Lots of features in Rust are actually needed to allow the compiler to statically reason about memory safety.

        And this is why Rust is likely going to take over from C, not because it is the most C-like design, but because it can provide static safety guarantees.

        There is considerable pressure, at least for internet-facing code, to show that it is safe. And that is way too hard to do (and keep doing) for C.

        Of course, somebody may come up with a brilliant design. But lots of organizations don't want to wait for that. They don't want to keep writing new code in C. And Rust by now has a mature compiler and a very active community.

        • kragen 13 days ago

          yes, i know the goals and achievements of rust

          don't conclude there isn't a better solution just because you haven't seen one yet

          • phicoh 13 days ago

            I'm not saying anything about what is possible. What would be the amount of time it takes to create a new language, create a stable compiler, create a sizable community? It can easily be ten years.

            There are lots of organizations that are not going to wait ten years. And the language still has to be designed in the first place before waiting is even an option.

            And 10 years from now, the people who have running Rust code are not going to switch to a language that is similar to Rust but may have a slightly better syntax or otherwise be closer to C.

            • kragen 13 days ago

              you said, 'Rust, though large, is in some sense quite minimal. Lots of features in Rust are actually needed to allow x'

              this logically entails that x is not possible without those features. if x is possible without them, they are not actually needed to allow x. so, although you hadn't noticed, you were saying a lot about what is and isn't possible

              there are lots of organizations still using cobol, and people who have running cobol are not in any hurry to switch, either. but i have the luxury of not having to care

              • phicoh 13 days ago

                Obviously Rust has some features, like async, that are not needed for 'C with safety'.

                But in my experience you need most of the other features. So any 'C with safety' will have a lot of the complexity of Rust. Though of course the syntax and defaults may be different.

                At some point it becomes very subjective. Does a language need a UTF-8 string type, or is an array of 8-bit characters enough?

                The research question would be, what is a minimal subset of Rust that still allows for programs that are commonly written in C to be easily written in this subset.

                Some things, like the tuple type or 'if let', can be removed, but is it still convenient to write programs without them? Whether closures can be removed is a bit trickier to see.

                In any case, C struck a very good balance between being small and powerful. In my opinion Rust strikes a similar balance. Rust is a lot bigger, but it gives powerful tools to write reliable code.

                • kragen 11 days ago

                  it is impossible for you to have experience with better alternatives that haven't been invented yet; they are unlikely to look like subsets of rust. c is not a subset of pl/1

                  • phicoh 11 days ago

                    Indeed, I don't. But languages can be compared based on features. Even a language that doesn't exist can be described based on what features it needs to have. I'm not an expert in type systems, but there is a huge amount of research in how type systems can be described, what features you need to make a type system safe, etc.

                    If we compare C and PL/I, then obviously the syntax differences between C and PL/I are mostly based on the taste of the language designer. But there are a lot of features in PL/I that simply have no equivalent in the C language, either because when C was designed they no longer needed to be in the language, or because the features were moved to the standard library.

                    In particular, I/O is something that lots of languages have as part of the language and C doesn't. So it is in many ways obvious why C is a lot smaller than PL/I without losing any power.

                    But now we go in the opposite direction. We know what the safety outcome should be. So we can make a list of things that need to be in a 'C with safety', even if we cannot predict what the exact syntax will look like.

                    • kragen 9 days ago

                      safety isn't achieved by adding capabilities, but by omitting capabilities. the hard part is how to avoid omitting necessary capabilities

                      > In particular I/O is something that lots of language have as part of the language and C doesn't. So it is in many ways obvious why C a lot smaller than PL/I without losing any power.

                      that's not why; often obvious things are wrong

bwanab 14 days ago

Late 70s, I worked for a company that built embedded monitoring equipment. They'd been using the Intel 8051, coding it in assembly, and had just done their first Z80 boards. I was fresh out of college and told them I could code it in C in 1/4 the time they'd been getting assembly code done. They gave the go-ahead and I bought Whitesmiths' cross compiler. I was able to code and test on the pdp-11s, then burn ROMs to test on the target machine. They were stunned when I requested another 4K of ROM, but went ahead. We did several devices this way before I moved on. They never went back to assembly though. Fun times.

  • kragen 13 days ago

    haha, yes, that's usually the way it goes!

HarHarVeryFunny 14 days ago

gcc really was a game changer.

Software tools used to be really expensive. I remember in the 1980s buying both Rational Systems' Instant-C and WATCOM C (a 32-bit DOS compiler), each for many hundreds of dollars, before switching to free Linux and gcc.

Instant-C was, at the time, a magical kind of product. It was basically a C interpreter that could resolve undefined symbols at runtime - run your program, then write a function and continue if it was missing. You edited code by function rather than by file, so everything was incremental and instantaneous, a big contrast to the normal experience back then of kicking off a big build and practicing your juggling while it progressed.

  • cenamus 14 days ago

    Was there a connection to any Lisps? Because that sounds extremely like the interactive development process there.

    • HarHarVeryFunny 14 days ago

      I don't know if that was part of the inspiration or not - I don't recall Rational themselves selling LISP. It was really just a way to speed up the whole compile-link-debug cycle. It was only a relatively short-lived product - I suppose since the speed of Turbo-C provided much of the same benefit.

      • anthk 14 days ago

        A LISP 'repl' environment (read-eval-print loop) works exactly like that. Stop on an error, fix it, continue as if nothing happened. Unix and C were designed to fault/abort fast on errors, and to show no output on good behaviour.

Blackstrat 13 days ago

Good article. As of the early 90s, Watcom was by far the best compiler option for professional development on Windows and OS/2. We used both Microsoft and IBM before settling on Watcom. Borland was fun on home PCs, but we didn’t approve it for commercial development. Also did a large LAN based system in UCSD-Pascal in the mid-80s. Visual Basic and the Web really ruined my programming desires.

massifist 14 days ago

I remember sometime back in the 90's buying a used copy of Visual C++ 5.0 from some dude at the mall. It felt kinda shady, like a drug deal was going down, but it turned out well and I scored a legit(?) copy for much cheaper than market value. I was very excited when I got it home, and it kept me busy for a long while. There's a lot of good documentation (and source code) on those CDs and it taught me a lot, (pretty much) coming from QuakeC.

I think at some point I also started using DJGPP, which pushed me toward GCC (and Linux).

  • int_19h 13 days ago

    In those days, the CDs included the entirety of MSDN for local use - indeed, most CDs in the box were MSDN.

    And IMO the documentation was better written, too.

thefaux 14 days ago

> FORTRAN was provided for the IBM 1401 by an innovative 63-pass compiler

Must have been fantastic to debug when there was an unexpected interaction between passes 37 and 51.

  • mjcohen 14 days ago

    IIRC, it was a 53-pass compiler.

    • Someone 13 days ago

      63, according to the paper they wrote about it in 1965. https://ibm-1401.info/1401-IBM-Systems-Journal-FORTRAN.html:

      “The 1401 FORTRAN compiler has 63 phases, an average of 150 instructions per phase, and a maximum of 300 instructions in any phase.”

      The phases aren’t all what I would call a compilation phase, though. For example, phase zero:

      “Phase 00 - Snapshot. Loads a snapshot routine into 350 positions of core storage. This routine lists a specified amount of core storage.”

svilen_dobrev 14 days ago

Around 1988-1990, there was a MetaWare High C (and C++) compiler, which had platform coverage comparable to GNU's - RISCs, Intel x86, etc. It had its own "yield" extension (yeah, generators/coroutines, back then), and a few others. It was maybe the first heavily optimizing one, both locally and globally; the assembly code it produced looked rather weird, but was faster and/or smaller than the others'.

no idea what happened to that company

blairfrandeen 14 days ago

You had me at "Free University Compiler Kit"

  • e63f67dd-065b 14 days ago

    It was written by Tanenbaum, who worked at the VU Amsterdam (Vrije Universiteit), which translates roughly into English as the Free University of Amsterdam. The VU remains to this day a fairly significant institution in CS academia.

stergios 13 days ago

Hopefully the next part covers some of the compilers available on the 68k personal computers of the mid-to-late 80s. MegaMax and Manx come to mind as some of the more prominent examples.

Koshkin 13 days ago

The Tiny C compiler deserves to be mentioned.