Hacker News
Hedy is a gradual programming language that helps kids to learn Python (hedycode.com)
144 points by edtechdev on April 26, 2021 | hide | past | favorite | 48 comments


Despite the criticism in the older HN thread, I actually found this talk pretty compelling. In particular, the idea that you start with a language level that is missing capabilities so that you directly experience the problem and then understand the motivation for the additional complexity of the next level.

When I taught Python, beginning students would spend a huge amount of time just struggling with how to quote strings. Why do strings show up with \n in them sometimes and actual line breaks sometimes? Why are there sometimes backslashes that appear and disappear in front of other backslashes or quotation marks? Why does the output of typing "print x" give you something different from "x" (which actually shows you repr(x))? There are very good reasons for all of this, but it's a lot to try to explain and absorb at once.

Quoting strings is second nature to me. But to understand why it's necessary, you have to imagine what happens when you don't quote strings and then you run into trouble like "how do I write a quotation mark?" It's just harder to imagine encountering that problem and then imagine how you would solve it and what might work or not work, than it is to actually experience the problem and work with a concrete instance of it.
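The repr-versus-print confusion described above fits in two lines at the Python prompt (a minimal sketch, not from the original comment):

```python
# The same string, shown two ways: repr() keeps the quotes and escapes,
# while print() renders the actual characters.
s = 'She said "hi"\nBye'
print(repr(s))   # shows the escaped form, the way the REPL echoes `s`
print(s)         # the \n becomes a real line break
```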

The switch from print as a statement (no parentheses) to print as a function in Python 3 (parentheses required) feels like a problem with some similarities. To the beginner, Python 3 syntax has just added some superfluous extra punctuation to print. To experienced programmers, it makes sense to treat print as a function because then you can pass it to other functions, re-assign it, and so on; so we underestimated its impact on beginners. To see "print(3)" and answer "why did you have to write (3) instead of 3?" you have to explain higher-order functions or monkey-patching or something that a beginner doesn't know about, and they have to imagine the hypothetical situation in which that would be necessary. That takes a lot more work and abstract reasoning than having the situation right in front of you and discussing what to do about it. It's not much of a leap from there to postponing the extra syntax until we have the concepts to talk about why we need it.
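The higher-order-function motivation mentioned above is at least demonstrable in Python 3 (a small sketch; the helper name is made up for illustration):

```python
# Because print is an ordinary function, it can be handed to other
# functions like any other value, which Python 2's print statement
# could not do.
def apply_to_each(f, items):
    for x in items:
        f(x)

apply_to_each(print, [1, 2, 3])   # prints 1, 2, 3 on separate lines
```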


Felienne Hermans, who seems to be leading this research effort, is also writing "The Programmer's Brain". It has a sort-of "early access" version that I've been reading as chapters have been coming out. From what I've read so far she has been doing her homework in terms of what science really knows about learning, memory and so on. What I really appreciate is that she also looks outside of the field of programming for answers to questions surrounding how learning and thinking "works".

If she has the same approach to Hedy (and I have no reason to doubt that) then I hope that the research being done with this project will bring in some new insights from outside of programming (and ideally also come up with some of their own, but I have a hunch that there's a ton of low-hanging fruit outside of the programming world that could be useful here).

https://www.manning.com/books/the-programmers-brain


I long for the day when Manning runs out of those terrible, unrelated generic portraits for their cover art. There is nothing I associate less with new technology and software than pictures of people from antiquity.


>the idea that you start with a language level that is missing capabilities so that you directly experience the problem and then understand the motivation for the additional complexity of the next level.

This is how I want documentation written for features added to a language.

For example, why were arrow functions added to JavaScript? The documentation should explain the problem and motivation for why it was added to the language with demo code so someone can directly experience the problem.

I want this for every single feature added after the 1.0 release. Not just for JS, but for every language, library, and framework.


This is commonly referred to as a "rationale" document. It exists for some languages, but not all, and helps to establish both why something has been done and why something else has not.

http://www.ada-auth.org/standards/12rat/html/Rat12-TOC.html

That is the Ada 2012 Rationale, for example. It details changes to the language standard with examples and explanations, as well (in the Epilogue sections) as things that were dropped. It would definitely be nice to have this for more languages, especially as it can help you grok what's changed in a specific version/edition/whatever (not just a full Language X rationale, but Language X 2.0's rationale for changes over Language X 1.0).


Python Enhancement Proposals (PEPs) are very good at this in my opinion.


I pretty much never use Python. I've false started on getting into it a couple times, but never quite got enough fun or dollars out of it to stick with it.

That said, whenever an interesting or controversial PEP starts making the rounds in forums like this one, I almost always wind up reading the whole thing on account of how well thought-out they are.


I try to do something like this when I'm teaching classes. Do the "why should I listen to this? What's it going to get me?" bit right at the start.

Give some examples so folks can decide if it's relevant to them, then show them how to do it (with follow-along coding or a hands-on lab if at all possible), and only at the end dive into how it works.

That way, even if people zone out partway through, they may have learned something useful.


Last time I tried introduction to programming classes I introduced concepts in the following order: variables (store results!), if-else (actually get interesting results!), then arrays (lists of variables!), for-loops (iterate over your lists!), which seemed to make sense.
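That ordering can be shown end-to-end in a few lines of Python (a sketch of the progression, not from the parent comment):

```python
score = 7                    # variables: store results
if score > 5:                # if-else: actually get interesting results
    print("passed")

scores = [7, 3, 9]           # lists: lists of values
for s in scores:             # for-loops: iterate over your lists
    if s > 5:
        print(s, "passed")
```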

Do you have any more suggestions?


> why did you have to write (3) instead of 3?

This question does not get asked if print is presented as a function from the get-go.

I can see where you're coming from about letting students/children experience the problem themselves before being fed a solution, but in this case the problem only arises due to your having been taught something some way. It is not inherent.


She's talking about delaying the fundamental concept of a function, and using the absence of the function syntax in order to motivate the need for it.

Do you have kids? My nephew has been learning math. He won't be told about f(x) until he reaches algebra. Until then the presence of parentheses would just be arbitrary added complexity. In fact he hasn't even been shown parentheses in the context of application order yet.

I agree with you that you could start with that syntax. I also think you could start with = instead of "is". In both of those cases though we've introduced distinct concepts - function application and operators - and her entire approach is based on introducing one concept per learning step.


Beginners typically use "print" before they know what a function is.


Your first paragraph is exactly why Lua is so good for beginners. You want arrays? Implement them. You want classes? Implement them! On the other hand, the one built-in data structure is powerful and can do a lot without implementing these other things.


Python is just for playing with strings, which is a contrived problem brought about by UNIX-style OSes. And other confusing, non-fundamental stuff that just leads to months of cognitive dissonance when one inevitably philosophizes over it: objects, classes, exceptions, the ability to modify global variables from other modules, metaclasses, etc.

> There are very good reasons for all of this, but it's a lot to try to explain and absorb at once.

The reason is because you're embedding strings into the program, which is just a string. There's no reason languages have to be like this. The ironic part is that developers are just like children and do not understand the significance of this either, which is why string injection vulns still exist today.
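A concrete instance of that string-in-a-string problem, sketched with Python's sqlite3 module (the table and input are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

user = "alice' OR '1'='1"  # attacker-controlled input

# Unsafe: the input is spliced into the query string, so it changes
# the meaning of the SQL rather than staying a plain value.
unsafe = f"SELECT name FROM users WHERE name = '{user}'"
print(conn.execute(unsafe).fetchall())   # every row leaks out

# Safe: the driver treats the value as data, not as SQL text.
safe = conn.execute("SELECT name FROM users WHERE name = ?", (user,)).fetchall()
print(safe)                              # no user is literally named that
```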


Tell me brother, tell me the thoughts on the outside of this San Franciscan echo chamber.


Just based on my experience teaching elementary-school kids things, I think kids would be most helped by making an iPad-native environment for programming. Things like using a cursor in a multi-line text input are not natural for kids in early grades even when I think mentally they have the problem-solving ability to start programming. It ends up being a "tax" on teaching things and they spend more of their time struggling with the interface than they do struggling with the actual programming concepts.

Plus it is just really boring to struggle with syntax and typing. It is much more interesting to think about how to get a program to work, and kids are only really going to learn things well if they find it interesting. At least my kids.


Out of curiosity, why do kids need to learn programming early at all? It seems like you could just wait until cursor and multi-line text entry wasn't difficult for them to comprehend, and start teaching them then.

I could be totally wrong, but I've never seen an instance where kids exposed to dumbed-down "reduced complexity" material outperform those that weren't, except in sports and art. You can teach the mechanics of a tennis swing with small racquets and oversized balls, but can you really teach pointers with an iPad environment?


> Out of curiosity, why do kids need to learn programming early at all?

Simon Peyton Jones gave a great lecture on this topic at Strange Loop a few years ago [0]. Basically, "computer science" can be reframed as the study of information, computation and communication. Thinking about these subjects rigorously and solving the problems surrounding them is a foundational skill that is much more broadly applicable than just in programming, and really does not need to involve computers at all (as he shows in his lecture).

> I could be totally wrong, but I've never seen an instance where kids exposed to dumbed-down "reduced complexity" material outperform those that weren't, except in sports and art.

I'm not sure I follow. I mean, that's how children learn to read and write, is it not? Or how they learn maths at school: one step at a time.

[0] https://www.youtube.com/watch?v=y-xgWLYQc4g


I have no quarrel with the assertion that computing is one of the most important fields to learn in modern society, if not the single most important; what I am suspicious of is that there's any long-term performance benefit of making children learn "pre-computing" through iPad apps at an age where they're not ready to learn real computing.

Of course if a particular child shows great initiative and engagement with pre-computing, they'll do well with real computing, but then why not just start with computing? For the ones that don't care, why not let them continue reading books they enjoy and playing outside?

That's how American children learn to read and write. It's not how Russian children used to learn to read and write; they started school much later, and immediately began fairly mature studies of classic Russian literature and poetry (Pushkin, for example).


making children learn "pre-computing" through iPad apps at an age where they're not ready to learn real computing

I just think there's a bit of a gap right now. I think apps like Tynker that provide sort of pre-computing problem-solving are pretty good. I also like environments like Replit that are easy to jump right into and get a Python environment without a bunch of random setup. And all of this is way better than I had when I was a kid! But I think there could be something in the middle, maybe something that started to engage with topics like list manipulation a bit more than the current batch of pre-computing apps do.


I have no quarrel with the assertion that reading is important. I'm suspicious of the need for pre-reading (letters)

If a child shows great initiative with letters, they'll do well with reading. But why not just start with reading?

Russian children don't start with letters on an iPad. They start later, immediately tackling classic literature.


This analogy is not accurate. It is impossible to read without understanding letters; it is not impossible to program without completing simple pre-programming exercises on an iPad.

Other people in this thread are pitching simplified programming exercises as an advantageous performance benefit for children to show them before real programming. This is akin to showing kids simplified reading exercises before reading. I do not believe there is any tangible long term benefit to making kids look at a picture of a dog and choose the letter "D" from a floating letter cloud to complete __OG, and I also do not believe there is any tangible long term benefit to making kids complete simple loop structures on an iPad or in a video game in advance of actually starting to program.

It is not crucial for the development of children for adults to provide them with neutered facsimiles of reality.


You're right, they should learn digital logic first...

I'm not entirely joking either; a lot of these "teach kids to code" efforts either start far too high-level and don't really help to dispel the myth that computers are magic, or border on outright corporate indoctrination ("use this locked-down environment, don't even think of trying to go outside it").


Hell, teaching me to code started too high-level, and I was an engineering student.

On review, I think I probably took too aggressive of a stance against the concept. If you look at my other comments in the thread, I made a lot of references to the fake "ladder of math", and I thought I detected a similar attitude in "teaching kids to code without using real code". I just don't think there's any benefit in the pedagogical method of "teaching by pruning away the actual realities of the subject", and I think that believing that method is useful leads to more dire consequences down the line.

I mean, if you want to teach a kid piano, you can just start teaching them piano once their fingers are nimble enough to act independently of each other. It's not particularly important to give them a "starter piano" with four keys, and it's more satisfying to bang away on the real one anyway.


But you want to start with simple songs and not with playing Bach directly. Much of what we believe about teaching/learning (computers) is also influenced by survivor bias. I learned programming because I absolutely wanted to, and read programming books before even having a simple home computer. That approach doesn't work for everybody, though.


> I've never seen an instance where kids exposed to dumbed-down "reduced complexity" material outperform those that weren't

Isn't literally every subject taught exactly this way? Books for toddlers don't have chapters for a reason. Practicing basics as early as possible means that you don't have to think about little things when you move on to more complex subjects. You spent a long time practicing addition before anybody even told you about the concept of multiplication.


The reason that books for toddlers lack chapters is that authors believe that toddlers aren't ready to comprehend chapters, not that it is an immutable truth of the universe that humans can't handle a separated narrative structure in writing until they've orbited the sun five or six times. Soviet children started Pushkin right when they started school- when I asked my Russian teachers for good examples of Russian children's books, they said that the concept didn't really exist when they were children, because they started studying real literature pretty much as soon as they started school.

The idea that there are "basics" that can be safely ignored once learned is patently false. Everything I've read from great performers in any field- piano, cooking, mathematics, physics, etc- contains constant encouragement to return to and refine one's understanding of the fundamental skills of the practice. I did indeed spend a long time practicing addition before anyone informed me that it was time to "move on" to multiplication, and it wasn't until I graduated college with a degree in aerospace engineering that I had read a few books about the history of mathematics and found out that the manipulation of Arabic numerals via "addition" or "multiplication" was simply shorthand for counting, which itself is a fundamental mathematical art. What was taught to me as "multiplication" is actually "one particular algorithmic notation for representing the counting of multiple sets", not "objective mathematical law".

Had I disabused myself of the notion that there was any part of mathematics that I could "stop thinking about" because they were "little", I would have made a much better showing in school.


Yep. We don't start 1st graders with calculus, we start them with arithmetic. Variables are called variables by late elementary school, algebra is more properly taught as algebra in 8th or 9th grade (US), with geometry, trig, and calculus following that.

You can develop a similar plan for teaching programming that starts with the basics and progresses (variables, linear control flow (aka, no conditionals or loops), conditionals, loops, functions/subroutines, complex types like records, recursion, modules and classes; in roughly that order). You aren't hiding anything, you aren't misstating anything, you just don't move on to advanced things until it's time to. Of course, this progression would probably be a lot faster than the arithmetic -> calculus progression.


"Until it's time to" is a completely arbitrary choice, though. I still am not convinced that it's tremendously important to introduce loops and conditionals before typing symbols into a computer is introduced. Would a class of students introduced to pre-programming seriously outperform a class of students not introduced to pre-programming in the long term? Certainly the short-term gains would be obvious, but I don't think there would be any difference in the long term, and the brilliant students from each group would be roughly equally matched after many years, with the primary differentiating factors being genetic ability and amount and type of studying, like it is in most other subjects.

The ex-Soviets I know speak reverently of the superiority of Soviet mathematics education, which began much later than American mathematics education and was far more rigorous.

There's certainly nothing wrong if a child wants to explore pre-programming on their own, but I think there is little to no value in deliberately exposing them to it in a bastardized form in order to "prepare them for what they'll see later".

Also, the assumption that there actually should be an arithmetic -> calculus progression is itself a pernicious issue- reference page 24 of this PDF for an extremely accurate takedown of the American mathematics curriculum: https://www.maa.org/external_archive/devlin/LockhartsLament....


I learned programming in the 2nd grade on a TRS-80: no mouse or even a text editor, just a command line to add lines of BASIC. If you needed to add a line, hopefully you had some numbers left between lines you already entered :)

I think the simplicity and limitations of computers at the time had a lot to do with them being accessible to a second grader.


Because when you are older, people treat programming as talent and not as skill to be learned.

So if you don't come into it already knowing how to think, you will be written off as "not technical type" or some such. And you will write yourself off too.

Pointers don't matter at all. Loops and ifs matter a lot.


I certainly agree with that, but I also believe that the "writing off" is the direct result of a system that takes it upon itself to distill the utter chaos of human knowledge into a learnable "system" that can be fed to humans of different ability levels.

"Teaching kids computing through iPad apps" sounds like more systematic reduction. The kids that don't get it won't care, and the kids that do get it would have got regular computing, too.


Well, compare to mathematics, for example: you could wait until kids are good at using a pencil and paper to start teaching them mathematics. But there's really no need; with modern iPad apps you can teach addition, subtraction, multiplication and some of the associated reasoning skills before kids have the motor skills to be able to write numerals well. Do they need to learn math this early? I don't know, why not try? If you can get kids to like learning things then why not.


Alan Kay has worked on that problem space since the late 1960s. Here's his recounting of the history of the Dynabook and how it compares to the iPad: https://www.quora.com/American-computer-pioneer-Alan-Kay-s-c...

It's a shame that his vision of tech education isn't shared by hardware manufacturers. There was the OLPC project, which didn't seem to gain much traction. The iPad could easily fulfill this role, yet the OS still limits the device to media consumption and any programming on it is cumbersome. There's an iPad Scratch app that might be worth a try: https://apps.apple.com/us/app/scratchjr/id895485086


I tried iPad Scratch; to me it really seems like a desktop app that was directly ported to iPad, awkward to use. Also, I could be wrong, but it seems like it is not really designed to engage kids directly, as much as to be used as a tool in conjunction with some other sort of content. I appreciate the effort, but it doesn't seem as high-quality as, for example, some of the mathematics apps like MathTango.


> making an iPad-native environment for programming

Why be US-centric?

> Things like using a cursor in a multi-line text input are not natural for kids in early grades

Sure they are. In past decades, many children started working their family computer at ages 6-10. Maybe they started with games like Space Invaders or Digger or Xonix, but soon enough they were editing text files.

> it is just really boring to struggle with syntax and typing

Yeah, so let's not teach children to express themselves in writing... that stuff is boring.



This is great. I think the biggest misunderstanding with computers is how abstraction works, and why we use it. Hedy seems to be a pretty good example of why people don't need to see the big picture all at once. One of the biggest things that stopped me from really digging in and writing code was the fear that I didn't fully understand what was going on, so abstracting away the complicated backend seems like a great way to foster a sense of confidence in a young programmer. My only concern is how fast this can be learned: while it seems like a great track to follow for a few years, one of the things that actually made me get into programming was sitting down and reading some documentation, point blank. It takes a few days, but the deep dive is one of my favorite ways to learn a language.


I ran through the demo for a bit. It's an interesting idea, but I'm very much not into changing the rules out from under the user. That creates an unnecessary layer of complexity, and prohibits exploration.

This might seem a little far afield, but I've been playing a lot of Baba Is You lately. For a game that's all about rewriting rules, a lot of logic is completely arbitrary. Why does "Stop" take precedence over "Defeat", and why does "Rock is Rock" take precedence over "Rock is Water"? In the end, it doesn't matter, because the game is consistent, so you learn the rules once and can rely on that behavior.


I'm still not convinced Python's syntax is great for beginners.


Could you elaborate why? Personally I agree, and the reasons I have are the use of whitespace as part of the syntax (being one of the only modern languages to do so) and the need to explicitly add the 'self' parameter in class methods.
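For the explicit-self point, here is a minimal example of what beginners must write in every method (just the standard class syntax, nothing Hedy-specific):

```python
class Greeter:
    def __init__(self, name):
        self.name = name           # attribute access requires explicit self

    def greet(self):               # ...and self must be declared in every method
        return "Hello, " + self.name

print(Greeter("Hedy").greet())
```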

I would still lean on one of the modern Logos to teach kids or non-technical people to program, such as MSWLogo or NetLogo (I used to teach programming to people from Social Sciences using NetLogo).


Python was originally derived from a teaching language called ABC.

It says something that Python has grown so complex as to need a new intro version.


About time something got named after Lamarr! The usual "inventor of Bluetooth, in a way" always seems like a bit of a stretch, but her story deserves getting told and naming a fun learning language might be just the right amount of spotlight.

(I'm blindly assuming that it was named after her, haven't checked)


In my opinion Hedy badly needs syntax highlighting, along with visible indentation marks (space and tabs).


I'm sure the developers don't mind that feature being added at some point either, but they aren't prioritizing it. It's a research language that is being designed as they are doing workshops with children to see what does and does not work. Figuring out what helps them learn best has priority over adding features just because we're familiar with them as experienced programmers.


I don't think the goal should ever be "to learn Python". It's never about syntax or programming language. It's not about learning "how to code" either. It's about learning what can be done and how things work. What needs to come before how.


So it's like the programming language equivalents of Weenie Hut Jr and Super Weenie Hut Jr.




