1. Python was designed by testing syntax with novice users to see what they could adopt easily.[1] More than 90% of current Python users weren’t born when it was created. They all had to learn it, and Python is the easiest language to learn because Guido and his teammates, unlike $LANGUAGE_DESIGN_GOD, approached the problem as experimental scientists rather than auteurs.
2. Python is conceptually compact, dominated by hash tables with string keys (see the short illustration after this list). The initial leader in the ecosystem, Perl, is conceptually sprawling and difficult to reason about.
3. Python also took lessons from the Unix shell, a mature environment for accommodating beginners and experts.
4. Python had a formal process for integrating C modules from early on.
5. Python’s management has an elegant shearing layer structure, where ideas can diffuse in from anywhere.
6. $NEXT_GENERAL_PURPOSE_LANG (Ruby, Go) weren’t enough better to displace Python. Both were heavily influenced by Python’s syntax, but ignored the community-centric design process that had created that syntax in favor of We Know Best.
7. Speaking of open source entrepreneurialism, JavaScript has become a real rival thanks to the Web (and node), but it is handicapped by the inverse failure mode: where Go is dominated by a handful of Googlers, JavaScript was effectively unmanaged at the STDLIB level for a crucial decade, and now it can’t recover. (I’d also guess that having to write a module system that works well in the chaos that is Web clients and simultaneously the Unix world is a daunting design problem.)
8. Python got lucky that data science took off.
9. Python got lucky that its inevitable screwups (Python 3) didn’t quite kill it.
10. Swift and Kotlin both define programming as serving the compiler (specifically LLVM) rather than serving the coder’s problem. (I haven’t discussed Rust so far since it isn’t attempting to compete with 98% of Python use cases, but if you squint you can see it as going one step further than Swift and Kotlin and in effect forcing the coder to be a sort of human compiler who thinks in types and memory management. This is not a criticism of Rust, BTW.)
0. And behind all of this is Moore’s Law and the demographic explosion of programmers. Python was an implicit, perhaps unconscious bet that if you served people thoughtfully, the tradeoffs with serving the needs of contemporary silicon wouldn’t matter as much.
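Point 2 is concrete enough to demo. A minimal illustration (my example, just standard CPython behavior): modules, classes, and instances are all string-keyed dictionaries under the hood.

```python
# Python's core data model really is string-keyed hash tables.
import math

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
print(p.__dict__)          # {'x': 1, 'y': 2} -- instance attributes are dict entries
print(type(vars(Point)))   # mappingproxy -- the class namespace is a (read-only) dict
print("pi" in vars(math))  # True -- module globals are a dict too
```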
I can't stop thinking about this. WRT Perl specifically, it’s fascinating how the two competitors adopted Unix shell patterns. Python is handicapped to this day by not automagically snarfing up environment variables, etc. But Perl leaned hard into TECO-style gibberish and the meta-syntax that is regular expressions, confronting beginners with arbitrary complexity. It feels like Wall embraced the system administrator side of coding — the side that has an enormous capacity for tracking corner cases and managing impedance mismatches. Wall was trained, perhaps not coincidentally, as a linguist, a field where contingent facts really matter. Guido, on the other hand, was an accomplished mathematician. (This is the Dwarf / Elf distinction from Cryptonomicon.)
Guido van Rossum was not an accomplished mathematician. He has a master's degree in Mathematics and Computer Science and got a bronze medal in the International Mathematical Olympiad, but the former is because that's where CS was taught, and the latter because he was a smart high school student.
As he described it:
"But after that first year [of college], it turned out that the real math I wasn’t particularly good at. And I think there were some great teachers at that university and some super cool topics being taught. And I couldn’t keep up. I remember something about a particular form of group theory. And I knew—and a few other students who were like, “Oh, you’ve got to go do graph theory, or group theory.” And I was like—it went way too fast. And I suddenly realized I didn’t have the skills to keep up with those topics. But in the meantime, starting almost from my first month I entered the math department, I had been learning to program because they had I think one of the first-year undergraduate courses was programming in Pascal."
I believe that "trained" is also not the correct characterization.
He got his degree in CS at a time when CS at the University of Amsterdam was part of the math department. He also took CS classes at Vrije Universiteit, under an arrangement where students could attend courses at either school.
He didn't study group or graph theory and was not interested in mathematics. Instead, he took computer classes:
> the programming classes I got at University of Amsterdam in the math department, were a pretty haphazard collection of topics that didn’t interest me. I mean, sometimes the topics interested me. I mean I remember one semester we were all learning ALGOL 68. And the teacher was super excited, but other times, it was like numerical programming, calculating what the error is after a certain set of matrix operations. And I was not interested in anything involving floating-point numbers, basically. But Tanenbaum, or his group [at Vrije Universiteit], taught topics like operating systems, databases, networks, and languages, I believe. Yeah, Tanenbaum himself did a class where he taught like seven non-mainstream programming languages. And that was all I just soaked that up.
The math department had to "customize" his degree "a little bit for [his] situation."
There is no indication that he was a trained mathematician, either as someone trained to do mathematical research or as someone trained to apply mathematics to other problems.
I have an undergrad degree in math, having taken the coursework for both the pure and applied tracks, and even when I was fresh out of college I wouldn't say I was trained to be a mathematician.
Kotlin didn't target LLVM at all when it was designed; that feature came much later. The primary target for Kotlin is, and always was, the JVM. And Kotlin was designed with usability in mind from day one; that was the justification for it. It's weird to say Kotlin defines programming as serving the compiler. The compiler bends over backwards to serve the user.
Rust isn't so much "competing" as it is "complementary" to Python. This is very much how Python was billed originally in its early days: as a scripting language for C.
The slow parts of your Python program can be rewritten in Rust or C, your choice. So refreshing.
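To make that workflow concrete, here's a minimal sketch using Python's standard ctypes module. The shared library name (libfast.so) and the function in it (sum_squares) are hypothetical stand-ins for whatever hot loop you ported to C:

```python
# Hypothetical example: the hot loop was rewritten in C and compiled with
#   cc -shared -fPIC -o libfast.so fast.c
# where fast.c defines: long sum_squares(long n) { ... }
import ctypes

lib = ctypes.CDLL("./libfast.so")            # load the compiled C library
lib.sum_squares.argtypes = [ctypes.c_long]   # declare the C signature
lib.sum_squares.restype = ctypes.c_long

print(lib.sum_squares(10_000_000))           # the slow part now runs in C
```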
My background is philosophy of language. I studied formal/mathematical logic in grad school. I was always embarrassed that I couldn't code, but the computer science classes were teaching languages that were inscrutable even for someone with my background, with syntax heavily focused on jargony math and technical concepts like object orientation (likely Java at the time).
Around 2010, I was talking with a friend about this failing of mine, and he said "you should try Python, I've heard it is popular with non-math folks." So I bought a book, and as soon as I opened it, I could just read it. It took me a couple days of reading to wrap my head around object orientation, but on the functional side, I could have written fizz buzz maybe half an hour after opening the book.
Humans have logic pre-built into our brains; it's just that we use natural language as our syntax. Python cleverly used as much of the natural language syntax as was practicable to remove the barriers to entry for non-math majors. Whitespace is a perfect example of a natural language syntax feature.
The whitespace thing is actually one of python's major flaws. That feature attaches syntactic meaning to non-printing characters. From a human standpoint, there're many examples of silence having some kind of meaning. From an engineering standpoint, that entire methodology is insane. Communication needs to be positive and deliberate.
Remember that Apple SSL bug, "goto fail"? That was a whitespace bug: even though the C misfeature predated Python, everyone's eyes had been trained to slide right off that particularly crass shortcut, since Python was widespread by that point.
>Communication needs to be positive and deliberate.
I don't know what you mean by this.
>The whitespace thing is actually one of python's major flaws. That feature attaches syntactic meaning to non-printing characters. From a human standpoint, there're many examples of silence having some kind of meaning. From an engineering standpoint, that entire methodology is insane.
It's not non-printing characters, it's alignment. The period is a parallel for the semicolon in programming, signaling the end of a unit, but the whitespace in Python is a parallel to the bullet point or the poetic stanza. Both of those map onto Python's atomic statements.
Most people's concern is the hanging indentation. Here, I would argue that we can effectively prove that hanging indentation is vastly more parallel to natural language than braces. Simply search for "handwritten recipes" and you will see that in natural-language assembly exercises -- effectively a real-world parallel to programming -- human beings naturally default to hanging indentation when grouping sub-categories of items together.
Does this parallel the jargony math you'll find in math books? No. But it trivially flows from human beings in the real world, and there is nothing formally incorrect in the syntax. Thus, we would likely find that a layperson would intuitively understand the hanging indentation, whereas braces as syntax are jargony and must be learned.
> Here, I would argue that we can effectively prove that hanging indentation is vastly more parallel to natural language than braces.
I see where you're coming from. However, I would counter that in the use cases you've shown, as well as all of the use cases I've seen in the real world, there are important differences between them and Python.
1. Hanging indentation is most often used in things that are small in scope, where you can easily see both the start and the end of the list. Python blocks can grow without bound.
2. Hanging indentation in real life doesn't require perfect alignment. Python does.
I don't think it has anything to do with Python. I had plenty of 'if' errors in C++ caused by indenting the second line and not putting braces around it prior to Python. They were always painful to debug. I finally just _always_ put a brace around an 'if' block, regardless of whether it is one line or many, and I've never had that problem again. I think the problem is that C lets you omit the brace for one line; it should always require a brace.
Python executes like it reads, which seems like a positive feature to me. Makes errors like C's two-line if block impossible.
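A small sketch of that contrast (my own example, not the actual Apple code): the C bug in question is shown in comments, with the Python rendering below it.

```python
# The C shape behind "goto fail" lied via indentation:
#
#     if (err)
#         goto fail;
#         goto fail;   /* indented like the block, but runs unconditionally */
#
# The equivalent Python cannot lie, because the indentation IS the block:
err = False
if err:
    print("handling error")
    print("still handling error")   # inside the if, exactly as it reads
print("done")                        # only this runs when err is False
```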
Spaces being non-printing characters makes functionally little difference, unless there are examples of text editors that do not render the offset empty space where they're used.
Spaces vs tabs, sure, that's an argument.
But it doesn't seem reasonable to argue that {something that is visible in a text editor} is different than any other kind of character.
It's not like Python was using the ASCII bell character or somesuch.
> It is far and away the most common footgun novices run into when I'm answering questions about why their code doesn't work.
It was, in my experience, before Python 3 clamped down on mixed spaces and tabs; before the `SyntaxError`s got better (for example the handling of `try` without `except`); and before VSCode got so popular (such that all the novices were using who-even-knows-what random editor and you had to figure out every time how they were actually producing the indentation, whether it does/can convert tabs to spaces).
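For illustration, the clamp-down in action: this tiny snippet compiles a block whose two lines are indented with a tab and with spaces respectively, which Python 3 rejects outright rather than guessing.

```python
# Python 3 refuses to guess what a tab "means" relative to spaces.
source = "if True:\n\tx = 1\n        y = 2\n"  # tab-indented, then space-indented

try:
    compile(source, "<pasted code>", "exec")
except TabError as err:
    print(err)   # inconsistent use of tabs and spaces in indentation
```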
And, oddly enough, before LLMs. Not so much because they explain anything all that well, but because lazy clueless people now get correctly indented code generated in-place rather than copying and pasting from Stack Overflow and not having any clue how to make the pasted code line up properly.
But now I far more often see people who are clueless enough that they can't distinguish the REPL from a command line ("why is `pip install ...` a syntax error?"), or are struggling with whatever is the latest attempt by the Python team to make Python easier to install and manage on Windows, or who seemingly can't wrap their head around the idea that Python has to know where on disk to look for the installed libraries (or why it won't work to just put everything in the system environment). And in terms of the language itself, probably the biggest stumbling blocks are things like command-query separation ("why can't I `foo.append(bar).append(baz)`?") and just using functions properly (which typically boils down to "why doesn't `return foo` let me refer to `foo` in the calling function?", but generally stated completely differently).
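The command-query stumbling block in a few lines, for anyone who hasn't hit it: mutating methods like list.append return None, so they can't be chained.

```python
foo = []
print(foo.append("bar"))           # None -- append mutates in place, returns nothing
# foo.append("bar").append("baz")  # AttributeError: 'NoneType' object has no attribute 'append'
foo.append("baz")                  # the idiomatic form: one statement per mutation
print(foo)                         # ['bar', 'baz']
```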
Whitespace cleanliness is only part of the problem. The much bigger issue is the fact that the ending of a block is invisible. This leads to several problems:
1. It becomes much more difficult to tell beginners where the end of a block is, because the ending of the block isn't something you can describe verbally. You have to point to it instead.
2. Students who might be prone to using the wrong number of "}" or "end" tokens to close a block will instead un-indent the wrong number of times. It doesn't prevent the kinds of mistakes that mismatched parens tend to cause (see the short example after this list).
3. When closing a block, the thing you're trying to align a prior block with might not be on screen any longer. I don't know why, but I've seen more off-by-one spacing gaffes after the close of a nested block than I ever expected to see.
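A minimal illustration of point 2 (my own example): dedenting one level too far is syntactically valid, so the mistake compiles and simply does the wrong thing.

```python
# Intended: count every word across all lines. The last statement was meant
# to stay inside the loop, but was dedented one level too far.
lines = ["a b", "c d e"]
total = 0
for line in lines:
    words = line.split()
total += len(words)   # runs once, after the loop -- counts only the last line

print(total)          # 3, not the intended 5
```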
That said, I don't doubt that Python arrived at the choice of whitespace blocks empirically. However, I strongly suspect that the majority of users learning how to program in the late 80's and early 90's were familiar with monospaced text, which was ever-present in the productivity applications of the day, and thus found whitespace alignment easier to reason about.
And to be clear, I don't really care that much about Python's whitespace blocks. I don't like them, but I'm at the age where I feel like quibbling over minor syntax gaffes is beneath me, and Python is my top "swiss army knife" language when I need a job done quickly. My only point is that I feel like whitespace blocks are a relic of the time when they were developed, and I don't think they are as beginner friendly as advertised.
EDIT: After stepping away from the post, I realized that in my perfect world, a beginner language would look a lot more like Lua than Python. Then I remembered how popular Roblox is, and how popular Garry's Mod was before that. Given the success of those platforms, there might be something to it.
The Big Question:
How did Earth become a planet with oceans and life, when it formed so close to the hot Sun?
What Scientists Did:
- They used a "radioactive clock" made from two elements: manganese and chromium
- Manganese-53 breaks down into chromium-53 over time (like ice melting at a steady rate)
- By measuring these elements in meteorites and Earth rocks, they figured out WHEN Earth's basic chemistry was locked in (a toy version of this math is sketched below)
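To make the clock concrete, here's a toy calculation. The half-life of manganese-53 (roughly 3.7 million years) is real, but the isotope ratios below are made-up placeholders, not numbers from the study:

```python
import math

HALF_LIFE_MYR = 3.7   # approximate half-life of Mn-53, in millions of years

def elapsed_myr(initial_ratio: float, measured_ratio: float) -> float:
    """How long it takes initial_ratio of Mn-53 to decay down to measured_ratio."""
    return HALF_LIFE_MYR * math.log2(initial_ratio / measured_ratio)

# Hypothetical numbers: a rock that preserves 57% of the starting Mn-53
# locked in its chemistry roughly 3 million years after the clock started.
print(round(elapsed_myr(1.0, 0.57), 1))   # ~3.0
```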
Key Finding:
Earth's chemical recipe was set within just 3 million years after the Solar System formed (that's super fast in space terms!)
The Problem:
At that point, early Earth was missing the ingredients for life—especially water, carbon, and other "volatile elements" (stuff that evaporates easily when hot)
Why Earth Was Dry:
Close to the Sun, it was too hot for water and other volatile stuff to stick to the rocks that built Earth—they stayed as gas and floated away
The Solution:
About 70 million years later, another planet called Theia (which formed farther from the Sun where it was cooler) crashed into Earth:
- This collision created our Moon
- It also delivered water and other life-essential ingredients to Earth
The Big Takeaway:
Earth needed a cosmic accident to become livable. Without that lucky collision bringing water from the outer Solar System, we wouldn't be here!
Why This Matters:
If Earth needed such specific, lucky events to support life, habitable planets like ours might be much rarer in the universe than we thought.
Look at your examples. Translation is a closed domain; the LLM is loaded with all the data and can traverse it. Book and music album covers _don't matter_ and have always been arbitrary reworkings of previous ideas. (Not sure what “ebook reading” means in this context.) Math, where LLMs also excel, is a domain full of internal mappings.
I found your post “Coding with LLMs in the summer of 2025 (an update)” very insightful. LLMs are memory extensions and cognitive aides which provide several valuable primitives: finding connections adjacent to your understanding, filling in boilerplate, and offloading your mental mapping needs. But there remains a chasm between those abilities and much work.
So many articles should prepend “My experience with ...” to their title. Here is OP's first sentence: “I spent the past ~4 weeks trying out all the new and fancy AI tools for software development.” Dude, you have had some experiences and they are worth writing up and sharing. But your experiences are not a stand-in for "the current state." This point applies to a significant fraction of HN articles, to the point that I wish the headlines were flagged “blog”.
Clickbait gets more reach. It's an unfortunate thing. I remember Veritasium even saying in a video something along the lines of feeling forced to make clickbaity videos because it works so well.
The reach is big enough that they don't have to care about our feelings. I wish it wasn't this way.
I stand corrected:
> The Microsoft era (2007-2014)
> In Jan 2007 I joined Microsoft as an evangelist.
Exactly at the time that Microsoft pushed for the XAML crap... So indeed, they didn't reinvent the wheel; they just never gave it up, despite its complete failure.
I like XAML, the whole point of it is tooling support which was undermined when the tools were so buggy for so long. Now that many of the bugs have been fixed and computers are much faster it’s become a really productive UI paradigm for me. I’ll eventually switch over to Avalonia for better performance and cross platform support.
Yes, among many other things. Applied Researcher, so full stack and constant greenfield.
I also worked at DevDiv MS during the Foundation series era, so I know why many of the issues with WPF are there: basically political infighting and penny pinching. It didn't help that Gates quit and Ballmer took over. Patterns and Practices didn't properly understand WPF, so they gave bad official advice on how to solve problems. It also didn't help that this was near peak object-oriented mania and language features like generics were still new, so there was a lot of unnecessary boilerplate. Code generation did help with this. WPF has a very high skill threshold for making custom controls that are not just compositions of existing controls; open sourcing WPF has helped a lot with this.
WPF is a flawed implementation of a really good idea, and if you can work around the flaws like I do then it’s actually really great. I totally understand why other people don’t like it.
I know it’ll sounds like I’m a snobbish hipster but the vast majority of people never get to the skill level required to properly use these tools. And while React starts out easier it doesn’t stay that way.
Some headlines, such as this one, are catnip to up-voters* despite the article's contributing nothing to the established discussion c. 2015, let alone 2025. I don't know how you disrupt this dynamic and redirect to "go read X", where X is _Team Topologies_ or whatever, but it would improve HN.
* (not a criticism; the topic is important to hackers)
I suspect it's driven by the Olde School Linux / Free Software contingent of HN commenters / voters. Here is an example[1]:
"The #1 story on Hacker News at 2023:08:21T15:41Z is a 2021 discussion of Linux desktop packaging tools.
Hypothesis: HN story up-voters are heavily drawn from Free / Open Source Software folks interested in issues that were broadly discussed in "tech" two decades ago (Linux for the desktop!) and are much less broadly discussed today."
That anodyne observation garnered 5 downvotes. I mean, of course it was silly to treat Linux desktop packaging tools as the most important story in tech in 2023! Overall the dynamic feels like Wikipedia: people who participate are atypical, and nothing annoys them more than one's pointing out that they are atypical.
I agree, and expressed something similar in another comment on this thread (https://news.ycombinator.com/item?id=43493805): HN is trending away from being a general site for software/hardware makers and more towards a Linux/OSS-specific site.
That's a notable trend overall, but not really a problem on its own. Every community is inevitably going to lean a certain way.
But it doesn't really explain the anger. To be blunt, most of the comments criticizing DF/Apple in this thread sound unhinged to me, e.g., like Apple lives in these folks' heads rent free. I have difficulty understanding that mindset toward something you can so easily avoid just by buying a product from another company. E.g., there are countless companies whose values don't align with my own, so I don't use their products, and that's the end of it. Why are these folks spending so much of their energy hating a company they can so easily avoid? And why is it just Apple that garners this hatred? Why not Nintendo, for example, a company with a similar approach overall (closed, emphasis on product experience over specs)?
My hypothesis is that about 300 people whose identities were formed by participating in the Slashdot / LWN / etc. communities c. 2000-2005 are active HN participants in 2025. They saw the dream of Linux beating Windows fail—and worse, they saw Macintosh become the high-status alternative to Windows. They saw the Olde Internet of hand-coded web sites be swamped by the arrival of humanity using smart phones, and they hate it. They are like 60 year old sports fans upset about the rise of analytics, or '70s rock fans bemoaning hiphop, or Socrates berating scribblers for displacing orators. Evidence: the 400-point popularity of dozens of recent stories on Firefox minutia—Firefox does not matter, nor does Brave (note: I worked for Mozilla for four years). The many, many stories about reviving the pre-smartphone Web. Probably other topic clusters I am forgetting—Web standards?
Are you saying these things make you angry at Apple? I.e., I'm still having trouble understanding, e.g., why not just make software for the many other platforms that are more aligned with your values? E.g., if I made games, I'd probably ship on Steam and not even try to ship on Nintendo, but I wouldn't hate Nintendo for that, I'd just target platforms that I'm more aligned with. I'm just having a hard time picturing hating another platform for having values different than mine when there are so many options available, why not just focus on the other options?
> I'm still having trouble understanding, e.g., why not just make software for the many other platforms that are more aligned with your values?
I want to make software for users of smartphones, not for Apple users. This has nothing to do with values. Gatekeeping the whole app ecosystem for their own profit isn't a value anybody besides the shareholders should share.
What's the most sympathetic way to state your position? E.g., I'm struggling with "I'm angry because I can't make money off of iPhone users the way I want to"; that just doesn't sound sympathetic to me. Does it to you? (This may just be a conflict of values again, i.e., it sounds entitled to me to want equal access to sell your products to all smartphone users regardless of their choices; that sounds super invasive to me, in fact.)
Also, I'd be curious what you think of console makers like Nintendo, who take a similar approach to revenue cuts and access as Apple. Specifically, why is there so much hate directed at Apple for using this model, but not at Nintendo?
Why do you think criticizing Apple's predatory strategies comes from hate? Do you hate everything you think is bad? The sympathetic way is to not take disagreement as hate. Why do you hate people who dislike Apple's way of doing business? Do you hate everybody who dislikes things you love?
About Nintendo, I don't develop games nor play them, I couldn't care less about what the market leaders do.
As an Apple user, I'm happy with the gatekeeping, it's why I use their products, and frankly I want more of it. If it was up to me I'd like to see them enforcing their human interface guidelines a lot more.
Why can't you just develop for Android or Palm or Windows and be happy while leaving us alone to enjoy what we like? Can you at least appreciate the irony of calling Apple greedy when your main beef is they get in the way of you making money?
There are also a lot of older Windows users, and users of failed platforms from the 80s and 90s, that absolutely hold deep seated anger at Apple and Steve Jobs. There were similar attitudes about Microsoft and Bill Gates in the 90s. After Gates left Microsoft a lot of that vitriol seemed to have dissipated, but some still aim it at Apple. The anger seems to usually be economically rooted, i.e. in terms of how Apple charges high prices for their hardware.
[1] https://ospo.gwu.edu/python-wasnt-built-day-origin-story-wor...