A brief, incomplete, and mostly wrong history of programming languages (2009) (james-iry.blogspot.com)
224 points by zdw on June 26, 2023 | hide | past | favorite | 65 comments


My favorites of this evergreen article:

1990 - A committee formed by Simon Peyton-Jones, Paul Hudak, Philip Wadler, Ashton Kutcher, and People for the Ethical Treatment of Animals creates Haskell, a pure, non-strict, functional language.

1995 - Brendan Eich reads up on every mistake ever made in designing a programming language, invents a few more, and creates LiveScript. Later, in an effort to cash in on the popularity of Java the language is renamed JavaScript. Later still, in an effort to cash in on the popularity of skin diseases the language is renamed ECMAScript.

This article should be updated to include TypeScript, Rust, Zig, Nim etc.


> 1991 - Dutch programmer Guido van Rossum travels to Argentina for a mysterious operation. He returns with a large cranial scar, invents Python, is declared Dictator for Life by legions of followers, and announces to the world that "There Is Only One Way to Do It." Poland becomes nervous.

Lmaoo


Besides the main text, there are many comments with appropriate corrections, e.g.:

"Jacquard's loom wasn't concurrent? It was pretty thoroughly multithreaded, I'd have thought!"


2015 - Graydon Hoare invents the Rust programming language, which breaks everything by fixing a problem that nobody actually has.


2015 - A group of sentient extraterrestrial cephalopods, known as "Rustaceans", instruct Graydon Hoare to transcribe the Rust Book onto three golden tablets. Followers of the new movement demand that "everything be re-written in Rust!" Nothing ever is.


Many projects are started, but only to accelerate the compilation of ECMAScript. None succeed.


2012: Steve Ballmer invents Typescript after losing a chair-throwing contest with one of his VPs. The language fails to convince any bored grandmas to switch back to Windows from Linux, and a bunch of them write the first version of VSCode with just 640k of assembly to flip him the birdie for getting rid of Clippy.


> Ashton Kutcher

At age 12?


This is one of my favorite evergreen posts on HN. If there were posters of it, I'd buy one. So many good lines to choose from, but I'll just highlight this one today:

> Lambdas are relegated to relative obscurity until Java makes them popular by not having them


This one tickled me a lot

>Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic.


But the entry for Java doesn't mention that it doesn't have Lambdas. I assume that the Java that is mentioned has them?


Java added lambdas in version 8, released in 2014, after this was written. The author probably saved the lack-of-lambdas joke for C#, which had had lambdas since around 2007.
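For anyone who never suffered through pre-8 Java, here's a minimal sketch of what the lambda joke is about (class and variable names here are made up for illustration):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class LambdaDemo {
    public static void main(String[] args) {
        // Before Java 8: an anonymous inner class just to pass behavior around
        Function<String, String> shoutOld = new Function<String, String>() {
            @Override public String apply(String s) { return s.toUpperCase(); }
        };

        // Java 8 (2014): the same thing as a lambda
        Function<String, String> shoutNew = s -> s.toUpperCase();

        List<String> words = Arrays.asList("java", "lambda");
        System.out.println(words.stream().map(shoutNew).collect(Collectors.toList()));
    }
}
```

Five lines of boilerplate collapse into one expression, which is roughly the whole story.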


This is my favourite kind of bullshitting, i.e. when you mix plausible lies with definite lies and the truth, messing with the brain.

Like how cereal was invented by a joint US Army/Navy programme to invent a resilient, easily transportable, high nutrient non-perishable food product for wartime use on ships and in the trenches. The programme of course being run by Admiral Kellogg and General Mills.


But where does Captain Crunch fit in?


I think you’re talking about Lt. John “Captain Crunch” Sydney (referred to as “Captain” when in command of his tugboat) who earned the nickname when he rather infamously pushed a cargo barge full of grain into a dairy transport causing both ships significant hull damage and making a rather soggy mess.



2022 - A research charity called "OpenAI" accidentally breeds a language model in its lab (by feeding most of the Web into a neural network) that can generate code on request - and quickly morphs into a for-profit. Instantly, Google sheds 12,000 coders, anticipating a strong decline in coding skill demand.

2023 - Rust, then the "most liked programming language", gets forked off as "Crap" by a subgroup of haters in brown t-shirts, so that they can have a proper in-fight. In an interview, an anonymous rebel leader states "I guess I got bored of the vi vs. Emacs debate a bit, so I was looking for something new."


Why do you believe, or know, that the 12,000 laid off were related to AI and coding no longer being needed? 1. Google Search and YouTube are declining. 2. What % of them were coders? 3. Where was it stated that this was related to AI competition?


How does it taste, this onion?


Onion-y! I guess rather than ask questions people would rather panic over the death of developers.


2006 - Graydon Hoare begins work on a new language and compiler. Understanding the difficult nature of compiler programming, Hoare creates a language with the same level of difficulty.

2012 - Microsoft asks, "What would JavaScript be, if it were C#?" and releases TypeScript

2012 - Google asks, "What would C be, if i̶d̶i̶o̶t̶s̶ Ivy grads could use it?" and releases Go.


* 2006 - ...named Rust


This just reminds me of how far Scala has fallen. The 3.0 debacle, the fact that the community had a civil war... it really is a shame, considering how close it is to being a strongly typed Python replacement.

Programming Scala is the closest to just expressing how I think. For people who love C#, it has all the best C# features and an even more expressive type system.

Every time I have to write Go, Python or C++ at work I'm missing features from Scala. :(


Scala was the only time in the past decade I actually enjoyed programming, but it wasn't a great idea to use for our relatively large team, as the amount of expression allowed comes at the expense of subjecting others to your mental model of how things should be done and having them spend undue time grokking it. I view this largely as a deficiency in style guidance, but Scala attracts very opinionated people in my experience and I say good luck to any style guide being upheld across a large org or team.

Kotlin has been nothing but great for me though. (Different company). It’s largely the best parts of Scala for me and without the headaches that come from some allowed patterns.


>I view this largely as a deficiency in style guidance, but Scala attracts very opinionated people in my experience and I say good luck to any style guide being upheld across a large org or team.

That's a great point, and probably one of the biggest downfalls of Scala. Not only did it allow for both 'java++' style programming, it also had full on almost 'haskell' style as well, and the community was split on which way was better!

So yeah, any given codebase in the company could be a very tight, neat, compact monad transformer heavy functional program, or it could be a mess of untyped actors passing messages around to mutable state ridden monstrosities!


Multi-paradigm languages, and there are several popular ones including Scala, need strong software architecture enforcement. The software architects need to decide which style to use, e.g. purely functional and monadic, vs nicer-looking Java-style OO.


What a day to be alive, when Java-style OO is described as a nicer looking option!


I felt that way for quite a while, and still miss Scala sometimes, but I find that TypeScript does a really good job of letting me express in the type system what I actually want. (Yeah, it's got some holes, yeah I miss the JVM, yeah JavaScript is lurking under there, but I can correctly write code that other people can understand in short order and the compiler will scream very loudly if people push my code's buttons wrong.)


It's still around; deep in the trough of disillusionment perhaps, but things have settled down and the fundamental strengths of the language are as good as ever. I work in Scala full time and wouldn't want to use anything else.


What 3.0 debacle?

I've just started a new project and we are using Scala 3. It's wonderful. (Admittedly, it's blank-slate, so no dependencies to previous Scala, e.g. the old Scala approach to meta-programming.)

> expressing how I think.

What you are saying is basically that you like ML-family languages, which also include Haskell, OCaml, Rust, and Kotlin. Increasingly, other languages are taking the ML-family lessons from 1973 on board!


It’s interesting to me that there are multiple calls in the comments for updated entries (rust, zig, elixir, etc). But no one has actually thrown one down here. I notice that no one even mentioned Swift or Kotlin.

For me, as a long-time practitioner of many languages, one of the telling points is that most or all of the entries I might write for the newer languages would lack personality. Historically, languages had opinions about things. C was common assembler. Smalltalk was objects. Lisp was lists. You could easily riff on these languages' "first principles."

Modern languages feel more like political committees trying to keep as many constituents happy as possible.


I see it as a positive sign: after 60 years of mainstream programming the field is finally maturing enough that we've got some consensus about what languages should look like (Scoping should be lexical. Types should exist but mostly be inferred. Etc.). We're past the "cambrian explosion" stage and now there's more iterating and polishing what works.


Totally agree: a world where every language had its own "thing" and felt the need for that thing to warp every aspect of the language would be much worse.


"In a history of Smalltalk I wrote for ACM, I characterized one way of looking at languages in this way: a lot of them are either the agglutination of features or they’re a crystallization of style. Languages such as APL, Lisp, and Smalltalk are what you might call style languages, where there’s a real center and imputed style to how you’re supposed to do everything. Other languages such as PL/I and, indeed, languages that try to be additive without consolidation have often been more successful." — Alan Kay in https://queue.acm.org/detail.cfm?id=1039523


Zig very much has opinions about things and is not created by a committee. I suspect that is why people here want to see it included. Rust is a bit of a mixed bag: while it has strong opinions, it also suffers a bit from committee design, but that is true of C++ too, which is in the article.


> Rust is a bit of a mixed bag: while it has strong opinions, it also suffers a bit from committee design, but that is true of C++ too, which is in the article.

Too true, but C++ is rare in that, while being opinionated, it is strongly opinionated in multiple directions (multiple opinions?).


C++ has all the opinions!


> C++ has all the opinions!

Yeah, but it didn't create any of its own, it just inherited them ;-)


It virtually inherited them :P


Not sure if this is just my algorithm, but this page is the top hit if you google “mostly wrong”


And the 3rd and 4th are HN submissions of the page, for me.


It’s for me as well.


'In spite of its lack of popularity, LISP (now "Lisp" or sometimes "Arc") remains an influential language in "key algorithmic techniques such as recursion and condescension"[2].'


It'd be interesting to bring this up to date.


In case it isn't loading for you too:

https://web.archive.org/web/20230625182521/http://james-iry....

my favorite part:

  > Haskell gets some resistance due to the complexity of using monads to control side effects. Wadler tries to appease critics by explaining that "a monad is a monoid in the category of endofunctors, what's the problem?"


I was at Edinburgh when this came out, and it went round the mailing list. Wadler replied saying he wasn't aware of the fact, and wanted to know the proof.

I was sharing an office a few floors down with Roko of Basilisk fame, who was the category theory guy. He thought about it for a few minutes and then took us through the proof on the board.
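For anyone curious what the board proof amounts to, here is a hedged sketch of the statement itself (standard category-theory notation, not the thread's actual proof):

```latex
% A monad is a monoid in the category of endofunctors (sketch).
% T is an endofunctor on a category C, equipped with natural transformations:
\mu  : T \circ T \Rightarrow T \quad (\text{multiplication, Haskell's } \texttt{join}), \qquad
\eta : \mathrm{Id}_{\mathcal C} \Rightarrow T \quad (\text{unit, Haskell's } \texttt{return})
% subject to:
\mu \circ T\mu = \mu \circ \mu T \quad (\text{associativity}), \qquad
\mu \circ T\eta = \mu \circ \eta T = \mathrm{id}_T \quad (\text{unit laws})
```

These are exactly the monoid axioms in the monoidal category (End(C), ∘, Id), with functor composition playing the role of the tensor product — hence "a monoid in the category of endofunctors."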


this is the hardest I've ever been flexed on at HN

well done


At least they haven't won the Putnam[1]

1: https://news.ycombinator.com/item?id=35079


Needs a recursive function iterating through JS frameworks of the past 20 years.


>Capitalization Of Boilerplate Oriented Language

This is not only true of COBOL, my friend.


The mention of Arc really dates it. I'm disappointed that pg moved on from it so quickly, there was a lot of hype around Arc during its conception but now I rarely hear about it even though I follow everything Lisp-related obsessively.


At least for me, part of the allure of the Lisp way of thinking is that it's perfectly okay, even desirable, to whip up an entire language just for one project.

My favorite part of SICP is this quote:

> It is no exaggeration to regard this as the most fundamental idea in programming: The evaluator, which determines the meaning of expressions in a programming language, is just another program.

> To appreciate this point is to change our images of ourselves as programmers. We come to see ourselves as designers of languages, rather than only users of languages designed by others.

For me, Arc primarily being used for writing HN is not a sign of failure, but a great example of why Lisps are cool and can really lead to revolutionary thinking.


There is a distinction between making a new metalanguage, and making a new embedded language in an existing metalanguage. SICP and others promote the latter, while Arc (from my recollection) was supposed to be the former - the next big metalanguage.


SICP definitely promotes both. The latter could be done without ever writing eval, which is the climax of the book.
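To make the "evaluator is just another program" point concrete, here is a toy sketch — not SICP's metacircular evaluator, just a made-up evaluator for prefix arithmetic, to show how little code "determining the meaning of expressions" can take:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class TinyEval {
    // Evaluate a prefix expression like (+ 1 (* 2 3)) from a token stream.
    static int eval(Iterator<String> tok) {
        String t = tok.next();
        if (t.equals("(")) {
            String op = tok.next();
            int a = eval(tok);      // recursive calls mirror the expression tree
            int b = eval(tok);
            tok.next();             // consume the closing ")"
            return op.equals("+") ? a + b : a * b;
        }
        return Integer.parseInt(t); // a bare number evaluates to itself
    }

    static int run(String src) {
        // Naive tokenizer: pad parens with spaces, then split on whitespace
        List<String> toks = Arrays.asList(
            src.replace("(", " ( ").replace(")", " ) ").trim().split("\\s+"));
        return eval(toks.iterator());
    }

    public static void main(String[] args) {
        System.out.println(run("(+ 1 (* 2 3))")); // prints 7
    }
}
```

It only knows + and *, but extending it is just adding cases — which is the seed of the "whip up a language per project" mindset discussed above.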


> On the back of his napkin he designs Programmable Hyperlinked Pasta (PHP). PHP documentation remains on that napkin to this day.

Hahahhaha oh my


Stops just short of the triumvirate inventing Go, a language which protects programmers from other programmers' bullshit.


We need entries for Rust and Zig


A lot of Zig happened in between trips to Spicy Village (true story, rc w17).


Related. Others?

A Brief, Incomplete, and Mostly Wrong History of Programming Languages - https://news.ycombinator.com/item?id=33001242 - Sept 2022 (1 comment)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages (2009) - https://news.ycombinator.com/item?id=27559618 - June 2021 (31 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages (2009) - https://news.ycombinator.com/item?id=11936058 - June 2016 (24 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages (2009) - https://news.ycombinator.com/item?id=7796142 - May 2014 (13 comments)

A Brief And Mostly Wrong History Of Programming Languages - https://news.ycombinator.com/item?id=7149634 - Jan 2014 (22 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages (2009) - https://news.ycombinator.com/item?id=6953863 - Dec 2013 (12 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages (2009) - https://news.ycombinator.com/item?id=6504217 - Oct 2013 (8 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages (2009) - https://news.ycombinator.com/item?id=5695816 - May 2013 (33 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages - https://news.ycombinator.com/item?id=3503896 - Jan 2012 (45 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages - https://news.ycombinator.com/item?id=1327746 - May 2010 (13 comments)

A Brief, Incomplete, and Mostly Wrong History of Programming Languages - https://news.ycombinator.com/item?id=1310127 - May 2010 (1 comment)

A brief, incomplete and mostly wrong history of programming languages - https://news.ycombinator.com/item?id=599164 - May 2009 (14 comments)


Just surpassed 48 comments here tho :/


Yes, those earlier threads are surprisingly small for such a well known piece.


The first rule of D is...


ChatGPT is spot on:

2004 - The Google team realizes they're sitting on a pile of user data bigger than the Eiffel tower and starts the development of MapReduce, because finding pictures of cats among a petabyte of data is harder than finding a needle in a haystack.

2005 - David Heinemeier Hansson takes a break from blogging about how much money he has to create Ruby on Rails, a framework designed for developers who want to turn coffee into web apps. A vibrant community of hipsters who can't code in PHP quickly forms around it.

2006 - Guido van Rossum wakes up from a fever dream and decides that Python needs to be harder to write. Thus, Python 3 is born. Many developers remain happily in Python 2, ignoring the encroaching darkness.

2007 - Amazon sees that people are having a hard time storing and retrieving their data, so it introduces Amazon Web Services (AWS) to the world. Developers around the globe rejoice, then promptly get lost in the labyrinthine AWS Management Console.

2009 - Ryan Dahl combines JavaScript, the most misunderstood language, with server development, the most misunderstood part of software development, and creates Node.js. Programmers are finally given the opportunity to write callback hell on the server side.

2010 - Instagram is created by Kevin Systrom and Mike Krieger, offering filters to make your bad photos look vintage. It's quickly discovered that the Valencia filter is the most efficient way to generate envy in your followers.

2011 - After creating C++, Bjarne Stroustrup decides that it isn't complex enough. He proposes C++11, an update that adds enough features to make any sane developer's head spin.

2012 - Facebook, faced with maintaining the world's most visited website, accidentally creates React, a JavaScript framework designed to solve problems no one knew existed. It quickly becomes popular among developers who are tired of enjoying their job.

2013 - John-David Dalton releases Lodash, a library that brings sanity to JavaScript developers by giving them the ability to write code that can actually be understood by humans.

2014 - Apple, tired of Objective-C's quirky syntax and all the fun developers are having with memory management, announces Swift. It's hailed as "Objective-C without the C," though it soon becomes clear that it's also "Objective-C without the fun."

2015 - A group of developers decides JavaScript hasn't been overused enough and creates Electron, a framework for building desktop apps with web technologies. The world rejoices as their RAM is consumed by hundreds of Slack threads.

2016 - Google invents TensorFlow, its AI and machine learning framework. In a surprise twist, nobody understands it. Still, cat identification accuracy skyrockets.

2017 - The Bitcoin bubble inflates to its maximum size. Suddenly, everyone and their grandmother becomes a cryptocurrency expert. Developers use their GPUs for mining rather than gaming, sparking widespread panic in the gaming community.

2018 - The world finally admits that managing state in JavaScript apps is horrifying. Dan Abramov gives us Redux, so we can be horrified about state management in a structured and predictable way.

2019 - TypeScript becomes mainstream, as JavaScript developers realize that sometimes it's nice to know if your function is going to explode before you run it.

2020 - COVID-19 hits, and developers around the world discover they can work from home, leading to a surge of productivity and innovation, until they realize Netflix has all of Friends.

2021 - Rust, a language developed by Mozilla, has its time in the sun as developers scramble to write more efficient and less buggy code. Meanwhile, a group of committed enthusiasts prepares a secret plot to rewrite everything in Rust, including this text.


This has been around on HN quite a few times - so many that I'm just going to say "try the 'past' link" rather than pasting them all.


Indeed, even 11 years ago people were commenting on how many times it had been posted with substantial amounts of comments and upvotes:

https://news.ycombinator.com/item?id=3505201



