It recently came to my attention that Netflix had released Atlas, a "backend for managing dimensional time series data". Now, if you care about theory and find it far more amusing than implementations and real-world cases, this may look like just another tool for the grey suit-n-tie business programming that your colleagues would probably bring up when discussing, say, how to better handle metrics for your microservice-based system or something like that; so you wouldn't exactly be dying to dig deep into its implementation. However, get this: it seems they used a stack-based language to handle the queries in that system! That is not something you see on a daily basis, and it is also not usual in real-world, money-making development; it ends up being that one small token of joy for that Monday. The moment I saw it, I was reading something about continuations and linear logic - the kind of reading you do to lie to yourself that you are not a full-on corporate sell-out yet - and it just hit me: the relationship between uniqueness typing (or substructural type systems) and stack-based languages.
From the mid 60s to the late 70s, we had a huge growth in programming paradigms. By huge here we mean that we went from the one and only (un)structured imperative programming to branches of functional languages (from ISWIM to ML), logic languages (SQL and Prolog), array languages (APL) and much, much more. Even C, arguably the most (in)famous to date, came from that era. Yes, one may argue that, for instance, LISP has been around since the days of COBOL, but we can also say that Scheme (from around that period) gave one of the most important kicks to its growth. There's a lot more to this and the discussion is not so simple, but what we want to bring attention to here is one of the forgotten paradigms from that era: Forth and stack-based languages.