mrkeen 15 days ago

It's not in the abstract, but towards the end of the paper:

> Our analysis suggests that the language of A-normal forms is a good intermediate representation for compilers.

I messed around with both CPS and ANF transforms for a while. Here are more approachable treatments of them:

  https://matt.might.net/articles/by-example-continuation-passing-style/

  https://matt.might.net/articles/a-normalization/
I wasn't sure which one I'd end up using. I suspected CPS would make it easier to do tail-call optimisation, which is very important to me. But once I implemented CPS, I found I could not figure out how to assign a type to the continuation term that CPS introduces. I had no such typing problem with ANF, so that's what I continued with. I haven't tried the tail-call optimisation in ANF yet.
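For concreteness, here's a sketch of what the two transforms do to the same tiny program (TypeScript, all names mine; the ANF/CPS shapes follow the linked articles, not any particular compiler):

```typescript
// Direct style: a nested expression with two calls.
const inc = (n: number) => n + 1;
const double = (n: number) => n * 2;
const direct = (x: number) => double(inc(x));

// CPS: every function takes an extra continuation argument and
// passes its result to it instead of returning; control flow is
// made explicit in the chain of continuations.
type Cont<A> = (a: A) => void;
const incK = (n: number, k: Cont<number>) => k(n + 1);
const doubleK = (n: number, k: Cont<number>) => k(n * 2);
const cps = (x: number, k: Cont<number>) =>
  incK(x, (v1) => doubleK(v1, k));

// ANF: every intermediate result is bound to a name, but functions
// still return normally -- no continuation argument is threaded.
const anf = (x: number) => {
  const v1 = inc(x);     // let v1 = inc x
  const v2 = double(v1); // let v2 = double v1
  return v2;
};
```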
  • k_g_b_ 15 days ago

    The problem with typing continuations - in my opinion - stems from the fact that nearly all discussion of continuations starts from a functional context/view/programming language, and often also in an untyped language/formalism.

    Additionally, CPS ends with "style", suggesting that it's fully expressible in the original programming language. But if we truly type it after a style transformation, we need to introduce polymorphism everywhere, see e.g. the first example for Haskell on https://en.m.wikipedia.org/wiki/Continuation-passing_style
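    The "polymorphism everywhere" point shows up in the type a CPS-converted term gets in the source language itself. A TypeScript rendering of the usual `(a -> r) -> r` encoding (names are mine):

```typescript
// A CPS computation producing an A: it takes a continuation
// expecting an A and yields whatever answer type R the caller
// picks. The answer type must stay polymorphic, which is why
// typing CPS "in style" forces this R parameter onto every
// definition in the program.
type CPS<A> = <R>(k: (a: A) => R) => R;

const pure = <A>(a: A): CPS<A> => (k) => k(a);

const bind = <A, B>(m: CPS<A>, f: (a: A) => CPS<B>): CPS<B> =>
  (k) => m((a) => f(a)(k));

// Run a computation by passing the identity continuation.
const run = <A>(m: CPS<A>): A => m((a) => a);
```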

    If we use CPS as an IR, typing the continuations as functions is wrong and unnecessary - a continuation does not return anything, or more precisely: it does not return. This means its type is constructed just from its argument type(s). In an IR we can simply add constructs for continuations and their types.

    A lambda \x:Nat.5 has type Nat -> Nat, or in one non-infix syntax: ΠNat.Nat (https://en.m.wikipedia.org/wiki/Lambda_cube). A continuation c with argument Nat simply has type `Cont Nat`, where `Cont` is a type constructor taking the argument type.

    A `Cont Nat` isn't something you would see in a CPS IR, because it's mostly useless: its type describes a computation that takes an argument and then diverges. The only place for such a continuation would be as the single `exit` continuation provided to the main entry point of the program. More useful is e.g. a `Cont (Nat, Cont Bool)`, and this brings us to the "passing" part of CPS.
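    In a language with an empty/bottom return type, the non-returning reading can be made concrete. A minimal TypeScript sketch (names mine; `never` plays the role of "does not return", and an exception fakes the final exit, since plain TS has no real jumps):

```typescript
// A continuation never returns: its type is built only from its
// argument type, mirroring `Cont Nat` above.
type Cont<A> = (a: A) => never;

// Invoking a continuation is a jump, not a call that comes back.
// We model the program's single exit with an exception.
class Exit { constructor(public value: unknown) {} }
const exitCont: Cont<number> = (n) => { throw new Exit(n); };

// `Cont<[number, Cont<boolean>]>`: a continuation that receives a
// number *and* another continuation to send its answer to -- the
// "passing" part of CPS.
const isPositive: Cont<[number, Cont<boolean>]> = ([n, k]) => k(n > 0);
```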

    Some like to use the syntax `Nat -> !` and that highlights the non-returning nature, but I think this is again too confusing in the context of functional programming - just leave out the `->`.

    See also https://okmij.org/ftp/continuations/undelimited.html and note that it argues that the untyped call/cc operator might be the origin of the confusion: it represents the captured continuation as a function, and a continuation also has an application operation like a function, but its semantics are different.

    As far as I can tell, there are very few formal treatments of continuations that don't start from the lambda calculus. The only two papers I've found are on the Continuation Calculus (https://arxiv.org/abs/1309.1257, https://arxiv.org/abs/1409.3313), with no other follow-up.

undecidabot 15 days ago

Also worth reading: Compiling with Continuations, Continued (2007) [1]

[1] https://www.microsoft.com/en-us/research/wp-content/uploads/...

olsonjeffery 14 days ago

I work on an open-source, System F-level lambda calculus named System R (https://github.com/olsonjeffery/system_r).

I’m interested in this because I’m writing this LC dialect as a deliberate, extensible IR with a targeted “bottom-dialect” (a System F, if you will) that can be translated to any number of lower representations.

My goal is to use it as an extensible dialect that can be built up to any arbitrarily complex dialect of LC (Calculus of Constructions, algebraic effects, etc.). Those “higher dialects” can themselves be targets for some “front end” language that’s palatable to end-programmers.

There are a lot of thoughtful insights in this paper, namely the idea of “combining steps” to optimize away/remove IRs. This also corresponds to the same practice in e.g. Koka (which goes straight to C or wasm/js; heck, maybe they cite this paper?)

I personally reflect on this because I think an extensible IR is just a stop-gap on the way to an optimized process: it lets you “grow” the language semantics you want, but ultimately you’ll have to rewrite because of performance. This happened with rustc as well.

Thanks for sharing!

nextaccountic 15 days ago

Why is the text rendering in this PDF so bad? The curves in the letters don't look smooth (apparent after zooming in). It's not a scanned PDF, but it looks like it was converted from PostScript.

So maybe the question is: why is text in .ps files so bad? Is it bad after being printed too?

  • antonvs 15 days ago

    Text in PostScript files is generally excellent. This was either scanned and then OCR'd to allow copying text, or converted from PS to PDF badly.

    Here's a better version: https://users.soe.ucsc.edu/~cormac/papers/pldi93.pdf

    There's still pixellation if you zoom the font. That could be due to the type of font that was originally used, and/or due to the conversion.

    Note that the article is over 30 years old. Technology has improved a lot since then. A typical screen resolution in 1993 was 800x600.

  • vbezhenar 15 days ago

    I think it's a scanned image.