Stop Making Students Use Eclipse (2020) (nora.codes)
270 points by ducaale on Nov 7, 2021 | hide | past | favorite | 335 comments


I don’t agree with this perspective: why do entry-level students have to understand the internals behind compiling their code, and weird edge-cases like misspelling import? In my experience this is actually the stuff that hinders learning and confuses students.

The author claims that students won’t be able to do real-world programming without learning this esoteric stuff, but actually this is the exact stuff they don’t need to know: full teams can and have built huge java projects just by using the intellij project wizard and maybe a tiny bit of Gradle from stack overflow. Actually students can solve most of the author’s problems by spending 5 seconds on Stack Overflow.

I’m not saying that these more esoteric details are completely useless. Learning e.g. the difference between char and UTF8 “character” is actually pretty important so that emoji inputs don’t crash your website. But this is stuff that should come in later classes, particularly for students who are interested in those details. Not everyone needs to know it.

In fact my experience is that some colleges have the exact opposite problem: they teach students old tools and techniques which are tedious and not actually used much later on. Like writing code by hand and deducting points for syntax errors, or using some outdated framework e.g. jQuery. When I saw the title my first thought was literally “yeah stop making students use eclipse, they should be using IntelliJ!”


Completely disagree.

> full teams can and have built huge java projects just by using the intellij project wizard and maybe a tiny bit of Gradle from stack overflow.

I would not ever want to work with anyone from those types of teams. They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.

> But this is stuff that should come in later classes, particularly for students who are interested in those details. Not everyone needs to know it.

Yes, they do. They absolutely do. There seems to be this mindset that's starting to infect software development that pushes the idea that people can be good at programming computers without actually knowing anything about how those computers work. I just do not understand how anyone could seriously believe that. My experience over the past 20+ years tells me that people who avoid learning about these topics are generally just not good at their jobs, or at best can do a decent job of things, but then completely freeze when they encounter something outside of their narrow comfort zone.

> In fact my experience is that some colleges have the exact opposite problem: they teach students old tools and techniques

That's lame, but has nothing to do with the point you're trying to make.


So you only like to work with experienced engineers who are already experienced with the specific tooling you use on your projects. Fine.

Students by definition are not experienced engineers. They’re in a given class to learn the topic of that class, and hopefully that class will be a challenging program of learning. In CS, the fact that they’re even working in Java is often incidental; they’re not there to learn Java, but generalisable CS concepts.

I’ve got no problem with teaching students practical software engineering skills; sure, that can be very valuable. It should be in a course on that topic, though.


> So you only like to work with experienced engineers who are already experienced with the specific tooling you use on your projects. Fine.

That's not what I said at all. Inexperienced in general is fine. I benefited from mentorship earlier in my career, and I'm happy to pay it forward.

Inexperienced with my specific tooling is a weird point to make, because it directly contradicts my point. If you teach people how things work: the underlying computer, OS, system libraries, and teach people the nuts and bolts of how to do their jobs: version control, the command line, filesystems, etc., then I especially don't care if they've used my specific tooling before, because they probably have enough foundational and background knowledge to be able to pick up different tooling and languages without too much trouble.

The people who are just taught to open an IDE, click "new project", and never leave the IDE, are the kinds of people that I expect will have trouble if they join my team and we're using different tooling than they're used to.


It's always felt to me like, "Oh you want to learn to cook? Great!" This is a pickaxe. I want you to mine some iron slag out of this vein.

"What does this have to do with cooking?"

"Well, you want to use a knife to cut your vegetables don't you? How on earth do you expect to get a knife to cut your vegetables if you don't mine for iron slag, refine it, build a forge, learn blacksmithing, make thousands of inferior and useless knives until by trial and error you finally have a knife you can cut vegetables with!"

"Ok, so once that is done then I can be a cook, right?"

"What? are you a fucking moron? Look at this asshole who thinks he can be a chef if he knows how to make a knife! What an idiot! You need a pot and a pan and a kitchen, but to get a kitchen you need a house and to get a house you have to buy the land and then to get the food you have to learn agriculture and animal husbandry, woodworking, stone-masonry, how to quarry, political intrigue, bartering, build a working economy you and 7 other people can all agree on using..."

It isn't that difficult. All I really needed was a problem and a line on the tools I needed to solve it. All the other stuff might make me competent should I ever need to rebuild the world's technology stack from the ground up, a.k.a. Dr. Stone, but it never earned me a dime of income or served any purpose beyond attempting to dissuade me from a computer career.


I agree, and I also feel that students new to computer programming shouldn't be using Eclipse. They should be learning the fundamental concepts and their implementation; having to learn a whole, very complex IDE as well would just add to that difficulty.


What fundamental concepts?

if, while, and other programming concepts? I'd agree on that.

But the article suggests learning about files, file formats and other low level stuff.

So maybe they should first learn how a filesystem works, how it keeps internal consistency, and how the bytes are mapped into some physical storage. That's pretty fundamental too, right?

What we have here are "leaky abstractions"[1]. If the abstraction an IDE provides is not good enough that students can work with it without knowing about the lower layer, then the abstraction is leaky. And that is the problem. Not "should I teach them with an IDE or without".

[1]: https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-a...


> What we have here are "leaky abstractions"[1]. If the abstraction an IDE provides is not good enough that students can work with it without knowing about the lower layer, then the abstraction is leaky.

Do you suggest teachers fix Eclipse so it no longer has leaky abstractions rather than just teach students how to work in a modern file system?


I agree with you. For example, in an algorithms class they need to focus on the right things such as O() and not on the Java compiler or the Python interpreter or even worse (god forbid) on gcc.


>> full teams can and have built huge java projects just by using the intellij project wizard and maybe a tiny bit of Gradle from stack overflow.

> They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.

How does one follow from the other? How does not learning unnecessary skills hinder the ability to solve new problems? If a skill is really needed later, one person from the team can learn it and share the most important bits of knowledge. If a skill is not needed now, why learn it? YAGNI can also be applied to learning.

I find that not understanding how compilers work, or how to use them for low-level tasks, is a minor problem when developing. I have never met a developer who had a problem like that. A much bigger problem during development is not listening to and understanding your users, and an IDE will not help you with that. That is a much better skill to learn than the compiler CLI.


"How does not learninging unnecessary skills hinders ability to solve new problems?"

I disagree here. Life is not so easy that you can ignore many problems. If you don't have certain skills, you don't know the best way to solve a problem, and that can lead to all sorts of issues.

It's like saying: why learn chemistry, advanced math, or physics in high school when you might not use them beyond a few simple formulas, and if you need them you can just learn from your team? I don't think this is how the real world works.


> It's like saying: why learn chemistry, advanced math, or physics in high school when you might not use them beyond a few simple formulas, and if you need them you can just learn from your team? I don't think this is how the real world works.

The point of learning all that in high school is not to learn these things specifically, but to teach you how to learn things fast, so that you can easily learn what's useful when you work.

No one cares what you learned in chemistry class at 15; it's all approximations, if not outright wrong sometimes, and it does not matter in the slightest.


Hard disagree. While the atomic models taught in school are indeed way too simplistic, vaguely remembering them gives you a much better worldview than not knowing anything about molecules/atoms. The same goes for biology, and we should indeed push for somewhat stronger education in most countries (as is apparent from stupid antivaxxers), as having at least the correct mindset about scientific things is exceedingly important.


The correct mindset to "not be an antivaxxer" definitely does not correlate with raw knowledge. Proper philosophy and logical reasoning classes are infinitely more efficient for that, though a bit more background in statistics and probabilities would also likely help as it seems that a lot of people struggle with that.


A great rule of thumb I’ve read on HN is that a great developer is one who understands at least one layer of abstraction below where he/she works (but two is better). That doesn’t include compiler intrinsics if you are programming in high-level languages, but in my opinion it does include the general tooling of the language, its execution model, etc.


I call it the n-1 rule, and IMO it can be generalized to any creative/craft profession.

A great driver understands how engines work. A great wood carpenter understands the different kinds of timber and their properties.


> YAGNI can also be applied when learning things.

I don't think that's true. I think learning things that are related to your work will almost always make you better at your work, even if you don't use that specific knowledge directly. Human brains are big pattern matching machines, and the more data you can feed it, the better it will be at looking at problems and truly understanding them.

> I find that not understanding how compilers works or how to use them in low level tasks is minor problem when developing.

I'm not saying that you need to understand enough to build your own toy compiler (though that can also be very useful in understanding how to write better programs). But at least understanding it from a basic level, like "javac takes in my source code, turns it into some kind of machine-readable intermediate representation, does optimizations, and outputs bytecode" is something that I expect every Java programmer to know.


> They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.

The vast, VAST majority of programming is of the type OP described. It’s ugly and has warts, but it’s good enough to deliver business value. I’ve seen it first hand.

The truth is that developers working on projects like that outnumber everyone else probably 10-1. See https://www.hanselman.com/blog/dark-matter-developers-the-un...


While that may be true, programmers working on these types of projects will end up being better at doing them (they will build it faster, write fewer bugs, and have an easier time fixing the bugs they do write) if they understand more about computers, operating systems, and the tooling around their chosen language and IDE.

Sure, maybe they can still get things done without that knowledge, and it'll work, but that's not really the point.


> I would not ever want to work with anyone from those types of teams. They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.

Okay but what matters is not who you want or do not want to work with, but "can it make a business that works". That's the only metric.

At my first job, the owner "coded" pretty much everything in a visual programming language (Max/MSP). Was it pretty? No. Did it allow him to start a business and sell things? Definitely, and that's the only thing that matters.


"The only thing that matters"

With this myopic attitude you'll get your business up and off the ground fast. Good job.

Then you will be mercilessly hacked.

Once you rename your startup to avoid the negative publicity, you'll hit a performance wall.

You'll lose customer data (and the associated customers) while you try to figure out what the four letters in ACID actually mean.

AWS will take 80% of your profits.

A different company will have taken their time, done it properly, and eaten your lunch. They'll take the customers that lost data, as well as the customers that couldn't stand your slow-ass website.

Then they will mercilessly undercut you, because they're not scaling to infinity to compensate for an O(n^3) algorithm being used to scan database tables in their entirety by Mr I Know How to Bang Out Code.


You live in a parallel universe. Hacked? So what? Equifax got hacked too. Its valuation is literally up 150% since then lol. People make more money selling WordPress themes than I do selling low-latency C++.


Equifax is a monopoly. Run-of-the-mill companies will lose contracts and money when their infrastructure goes down or some crypto miner explodes their aws bill.

Optimizing is a different item: it reduces costs... more profit, better prices for customers, and the potential to undercut other companies with prices they can't match, or to offer more features.


O(n^3) bugs happen, sure, once every two years. So the other 99.99% of the time, wondering about them is useless... just as much as nitpicking over which version of Eclipse or Gradle or whatever to use. In the grand scheme of software projects, these are not super relevant. Making code robust to crazy unexpected inputs is much more important. Making code that is easy to read (that is, readable without knowing tons of frameworks or technical details) is much more important... at least in business computing.

Now, there are niches where these issues do matter...


Both POV are myopic.

One wants to get things done to the exclusion of all else, possibly backing you into a corner if you ever need to scale.

And the other wants to “do it the right way from day one“: an undertaking that would cost more than a small business could ever hope to afford, or that slows you down and gives rivals time to catch up.


I don't really get why you seem to believe there are just two extremes and that's it. I don't think "do it right from day one" means that you never accumulate technical debt or that you write the perfect architecture that anticipates every possible future need.

Doing it right, to me, means hiring people who can adapt to changing conditions and needs, and who can make (or help you make) the right trade offs when you have to balance faster time-to-market (which can involve cutting corners) against avoiding technical debt.

The team that lives inside IntelliJ and can't do anything outside it is not going to do that for you.


>I don't really get why you seem to believe there are just two extremes and that's it.

Well, we’re only discussing a specific dimension of software development here. In reality there are a whole bunch of things not even related to software that will cause problems.


It's not about being pretty, it's about understanding when you're incurring lots of technical debt that's going to bite you in the ass down the road. There's nothing wrong with implementing a complex system in the likes of Max/MSP necessarily but it should be a reasoned decision, not something you do reflexively because you just don't know any better.


Indeed. I had a boss who didn't "know any better" about a lot of things, so he often could only think of one solution to any problem.

My thought when people only have one solution to a problem: If you can only think of one solution, how do you know it's not the worst one?


There are millions of businesses that "work", but that doesn't mean they are places I want to work. Personally, I put the bar higher than "I'll just use the magic wizard and hope it works", and I'd prefer working with like-minded people.


But it isn't the job of the university to create people you want to work with, is it?


My preference is to work with teams of people who strive for something more than "I'll just use the magic wizard and hope it works." Others can have other preferences.

I don't have an opinion on where these people develop those attitudes/skills.


No, unless it's a "professionalization degree" or whatever you call it in your corner of the world.


That's the huge majority of all university degrees, though? In my country, from the figures I can find online, there are ~20k "pro" masters vs a few hundred "research" ones. The research path is entirely negligible (I know it firsthand, having been through it; we were a single-digit percentage of the students compared to the "pro" path).


It actually is, because why would I pass them during an interview if I don't want to work with them? Thus, the students will be locked out of good jobs - and it's the university's job to give the student the necessary skills to find a good job.


Maybe "can it make a business that works" is the only thing that matters to you, but it's not to me. If I have to spend my days dealing with developers who can't think outside of a tiny box, I will be miserable. I don't care how much money the business makes; that's not how I want to live my life.

And what happens after you get that business running and bringing in some revenue, but you need to start tackling harder problems in order to keep your current customers happy and attract new customers? You're stuck with a bunch of developers poorly suited to solving those problems, and a bunch of infrastructure that wasn't designed with this sort of feature expansion in mind.

So you hire some new people who have to rewrite a bunch of things, or who have to spend months or quarters untangling the mess. You could even miss your market opportunity. Even if you don't, you've still wasted a lot of time and money.


> Maybe "can it make a business that works" is the only thing that matters to you, but it's not to me.

What I want doesn't matter any more than what you want regarding this. What matters is what the society as a whole wants, and the current situation is in big part the manifestation of it.

> And what happens after you get that business running and bringing in some revenue, but you need to start tackling harder problems in order to keep your current customers happy and attract new customers? You're stuck with a bunch of developers poorly suited to solving those problems, and a bunch of infrastructure that wasn't designed with this sort of feature expansion in mind.

You're assuming that you would even have gotten the first customers if things had been "done right". My experience of seeing entire businesses held up by 1kloc of "academic" (I'm not saying this as a compliment) Python scripts, with 20% of it being globals, says otherwise.


> "can it make a business that works"

For how long? Until it needs a rewrite because it’s completely unmanageable. Yes I know, the rewrite cycle is part of the business now, but I don’t think it’s necessary. We can do better.


> > In fact my experience is that some colleges have the exact opposite problem: they teach students old tools and techniques

> That's lame

It's not lame, it's fine. It is just fine, and usually better, to teach basic concepts using a simple tool - even if it is not fashionable or up-to-date.


I think the parent's point was that some people teach using old tools that no one uses anymore, when teaching with current-generation tools would be just as easy and much more useful.

Agree that teaching old techniques can be useful in the cases where the current techniques build on what came before.


This article is in the context of computer science education, though - not real world programming.

Undergrad CS students aren't learning software development first and foremost. They're learning logic and math, data structures and algorithms. They need a low friction environment so they can learn the academic stuff without wasting their (or their TA's) time.

So while I agree that many "real world" programming jobs require dealing with the kind of boring minutia that IDEs try to abstract, this stuff shouldn't necessarily be a prerequisite for students on a CS track. There's plenty of time for the real world later :)


> They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.

Interesting. In other fields, like surgery, narrow specialization is desirable.


A surgeon is still a doctor who spent years studying medicine though.


You generally specialize after you have learned the basics. All oncologists are doctors; all architects who design very large buildings are capable of designing houses, etc. It seems obvious that the basics for a developer include how the machine works.


A surgeon still spent many years studying medicine at a broad level before moving into training for that specialty, though.

And many developers I know do end up specializing in particular sub-fields (ML, databases, whatever), but they already have a wide breadth of knowledge and experience to serve as a foundation underneath that. In contrast, I also know some developers who didn't take that route, and jumped into specialization early, and who are completely useless at productionizing their work.


Very much agree.

IDEs can be great, but in the hands of someone just learning, they encourage a cargo-cult style that becomes a barrier to actually knowing what you're doing. I've had to work with people who are still stuck cargo-culting things 15+ years into their careers. Not fun.

Relatedly, they can also help people get just enough success (in the cargo-cult sense) to get a degree, when they would be much better off in a different field. It does them (and their future teammates) a disservice.


> I would not ever want to work with anyone from those types of teams. They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.

The thing is, they are the ones least likely to cause difficult-to-debug build issues, because they aren't the ones reaching for "very clever" features and the latest version of the build tools in the first place.


They're also the kinds of people who will use the same version of their tools long after they've stopped being supported, and then when you need one of those new features, or run into a problem that you can't work around, you have to drag all that stuff into the present, which is always a huge pain in the ass. And the kind of developers who got you in that situation in the first place will be the least well-suited to help get you out.

Regardless, I don't think "I don't want knowledgeable people because they might write something 'too clever' that causes problems in the future" is a particularly winning strategy. You're basically saying you want dumb people because they will never try to be too smart for their own good.


They are absolutely the ones to create hard-to-debug issues because they layered a whole bunch of complex things incorrectly or unknowingly.

For example, recently I spent more time than I'd like finding out that we linked two incompatible versions of a library. All the visible dependencies were correct, but at some point someone bundled an old version in a fat jar and that got included transitively via another dependency. If you understand the details you know why you don't distribute libraries as fat jars. But if you're just following some guide it's easier to fat jar it than correctly specify your dependency tree.

If your education was really effective you recognize this is just the static/dynamic linking problem again in new clothes.


It’s true that clever code can only be written by knowledgeable engineers, but that doesn’t mean that knowledgeable engineers all write clever code.


When I learned programming with Eclipse years ago, I spent so much time worrying about the magic behind the scenes that it left me with no confidence in my programming. When I finally learned to write my own makefiles and compile on my own, I went back to using IDEs and like them, but you shouldn’t learn with them.


So should we teach students how to write makefiles before we allow them to write their first line of code?


I pity the student who only knows how to hit the compile button in the IDE and not `javac`. That's a terrible thing to inflict on a beginner; everything will seem like a mystical black box to them, with no way forward when they encounter their first compilation problem.


I completely disagree. When I was a beginner programmer my only concern was “how do I make games?” I didn't really care about the game engine, the compilation process, or even the algorithms; my focus was on making a game. From middle school until early high school all my coding was in IDEs, and I never really tried to use the command line. It just wasn't my concern; it was much easier to hit the green run button. Throughout that time I was learning all the basic concepts: how a program is executed, what control flow is, how you write good code, how the program even runs (I recall one time in middle school trying to figure out whether all the lines in a program run at once or in sequence), how you simplify booleans, how you debug, etc.

It was only later, once I was interested in web development, that I started picking up Linux, learning more deeply about the compilation process, learning algorithms, etc. For a beginner, all that stuff is just more roadblocks to building your interest and foundation. But later I started hitting a wall with web development and 3D game programming where I didn’t have the fundamentals, and that's when I started learning all the other skills besides just typing some code and hitting the green run button. The transition from beginner to intermediate programmer is where I would say one should learn the behind-the-scenes concepts, not while you’re still building an interest and foundation.

Also, an interesting aside: the programming environment I started on was called “Kids Programming Language”. It was a kind of simplified black-box environment that I'd compare to Flash (but it didn't cost money). Even today I prefer to introduce coding in such black-box environments, like processing.js, maybe because of that.


I am helping my son learn C++ for his school classes. This shit that makes sense for us makes no fucking sense for a beginner.

Take the simple for loop, for(int i = 0; i < something; i++){}. There is a large number of syntax errors a beginner can make: he can forget to open or close the (), he can forget the semicolons or put them in the wrong place, he can forget the {}.

For sure I do not start by teaching my son about the C++ compiler and linker, though I do show him how the program runs in the IDE debugger and how changes update.

Low-level stuff is needed much later, and maybe only by some of the students; at the beginning it is just too much junk that has to be learned like a parrot.


> For sure I do not start by teaching my son about the C++ compiler and linker

You could, or at least the tip of that iceberg. Compiling a simple program often is a simple one-line command. Perhaps you can show him how to automate this with a shell script or (gasp) a makefile.

The reason why it is not "junk" is that it will teach him a bunch of important things: that you can "talk" to a computer using different languages, that those languages have different capabilities, different properties and different grammars (syntax), and that languages are all about grammar and semantics.

> Low level stuff [...] at the beginning is too much junk that needs to be learned like a parrot.

I see a problem with teaching there, rather than a problem with what to teach. Anytime you make people learn by heart something that needs to be understood first, there's a failure to teach.

It seems these days that we are trying to rush people who are not even comfortable with files and folders to "code". That's stupid in itself.

How about teaching them how to use a computer before trying to program it? Is it not a saner approach? Even if they don't get programming at least they'll know a bit better how those things actually work.

(And why the hell does your son have to learn C++ as his first language when there are more beginner-friendly ones? That's another major failure, if true.)

It follows the general philosophy that a computer is just a tool that should be at people's command. But that approach only makes good consumers: people who are helpless when problems come. Farmers had that sort of problem and had to fight to have the right to repair their "tool".

The correct and actually beneficial philosophy is user-tool co-evolution, that is mutual adaptation. And for this to happen, the user has to really understand their tool.


> The reason why it is not "junk" is that it will teach him a bunch of important things: that you can "talk" to a computer using different languages, that those languages have different capabilities, different properties and different grammars (syntax), and that languages are all about grammar and semantics.

This would be a good point if the year were 2003. Compiling code and setting up a proper coding environment is a solved problem that isn't worth the headspace of a young and aspiring programmer. In the modern day, you can open up the console in your browser and start writing and running code just like that.

What's far more important for a beginner is to learn algorithms and data structures, which are timeless concepts that are highly valued and can be applied to any language or environment (and to give OP credit, C++ is a fine language for learning them). Recursion vs iteration, control flow, data structures, time complexity: gaining a mastery of these topics will enable the student to write programs that make it worthwhile to then learn the auxiliary technologies that make their programs usable and sharable -- Unix commands, compilers, linkers, etc. -- learning those tools as a means to an end.
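As a sketch of why time complexity is the part worth teaching first (the function names and the duplicate-finding task are my own toy example, not from the thread): two ways to answer the same question, identical results, very different growth rates:

```cpp
#include <cstddef>
#include <unordered_set>
#include <vector>

// O(n^2): compare every pair of elements.
bool has_duplicate_quadratic(const std::vector<int>& v) {
    for (std::size_t i = 0; i < v.size(); i++)
        for (std::size_t j = i + 1; j < v.size(); j++)
            if (v[i] == v[j]) return true;
    return false;
}

// O(n) expected: remember what we've already seen in a hash set.
bool has_duplicate_linear(const std::vector<int>& v) {
    std::unordered_set<int> seen;
    for (int x : v)
        if (!seen.insert(x).second)  // insert reports "already present"
            return true;
    return false;
}
```

A student who understands why the second version scales better is well placed to pick up build tools later; the reverse is rarely true.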

> The correct and actually beneficial philosophy is user-tool co-evolution, that is mutual adaptation. And for this to happen, the user has to really understand their tool.

I would argue that a lot of existing technologies, such as Unix, are less philosophy-driven and more backwards-compatibility-driven, where things exist purely for historical reasons. The idiosyncrasies of Unix can be a headache for a beginner, and it'd be a better use of their time to interact with those technologies only when absolutely necessary.


> This would be a good point if the year were 2003. Compiling code and setting up a proper coding environment is a solved problem that isn't worth the headspace of a young and aspiring programmer. In the modern day, you can open up the console on your browser and start writing and running code just like that.

"Modernity" has nothing to do with it. In the old days you could switch on your family computer and start entering BASIC just like that too. The only difference between then and now is that the leverage is much stronger.

Adding and multiplying matrices has been a solved problem for even longer, yet we teach students how to do it by hand. It's a solved problem, yet we expect them to understand the solution and be able to execute it.

I don't know why it shouldn't also be the case in CS, unless what you want is code monkeys [1].

> Recursion vs iteration

You cannot explain that properly without first explaining how compilers target stack machines, routine calls and jumps. If you skip that you just end up saying "anyway use iteration when your program mysteriously blows up". And if you then go low-level all of a sudden, your students will discard your explanation and remember the previous rule of thumb. That's one of the reasons why some people bad-mouth us, saying we abuse the terms "science" and "engineering" in "computer science" and "software engineering".
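As a concrete illustration of the "mysteriously blows up" failure mode, here is a hedged Java sketch (names invented for the example): the recursive version is logically correct but dies on a large input because each call consumes a stack frame.

```java
public class StackDemo {
    // Logically correct, but each call pushes a new stack frame.
    static long sumRecursive(long n) {
        if (n == 0) return 0;
        return n + sumRecursive(n - 1);
    }

    // Same computation in one stack frame, regardless of n.
    static long sumIterative(long n) {
        long total = 0;
        for (long i = 1; i <= n; i++) total += i;
        return total;
    }

    // Helper so the failure can be observed without crashing the program.
    static boolean recursionOverflows(long depth) {
        try {
            sumRecursive(depth);
            return false;
        } catch (StackOverflowError e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(sumIterative(1_000_000));       // fine
        System.out.println(recursionOverflows(1_000_000)); // true on default JVM stack sizes
    }
}
```

Without the stack-machine mental model, the StackOverflowError above really does look like the program "mysteriously blowing up".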

CS 101 should always be in assembly language on some 8-bit microcontroller. If you don't do that, you are cutting corners.

[1] https://www.joelonsoftware.com/2005/12/29/the-perils-of-java...


>You cannot explain that properly without first explaining how compilers target stack machines, routine calls and jumps. If you skip that you just end up saying "anyway use iteration when your program mysteriously blows up"

Isn't this an implementation detail? I mean, it depends on whether your course is focused on teaching first-level programming (take some simple problem, work out the math, and code it with if, for, while and some variables) or whether you actually have a course about, say, C++, where you go into detail about why these old languages are designed the way they are.

Just to make things clear, I don't like "magic"; I too prefer to understand how the computer works from the logic gates up, though I am still missing some pieces of that knowledge. But what I noticed is that the introduction to programming is very hard. I remember wasting hours explaining stuff to a student colleague, and after those many hours he had the revelation: "so this computer is dumb, we have to tell it to do everything step by step, all the details...". So you have the problem of teaching the students algorithms (for my son they use some C-like pseudocode), then you want them to start running those algorithms ASAP so they can get feedback from the machine, but the syntax is a big issue. It would not be efficient to also teach them to use the command line (Windows and Linux), run the compiler, troubleshoot when the command fails to run (maybe a wrong directory or paths), and then parse the syntax errors to find the actual mistake behind a sometimes unrelated message. With an IDE you get the red squiggly lines as you type, with some hints on what is missing (before you write 10 new lines and don't know which one is the problematic one).

But I agree with you if the student is not in the first year of learning programming, and indeed people who study computer science will study compilers, CPU architecture, data structures and a lot of other stuff.


> Isn't this an implementation detail? I mean, it depends on whether your course is focused on teaching first-level programming (take some simple problem, work out the math, and code it with if, for, while and some variables) or whether you actually have a course about, say, C++, where you go into detail about why these old languages are designed the way they are.

As much as the fact that numbers are stored in a finite number of bits. Your iterative factorial won't blow up for 100!, but it gives an obviously wrong result, like a negative number.
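For example (a minimal Java sketch; the method name is made up): with 64-bit `long` arithmetic the loop runs happily, but the product silently wraps around.

```java
public class OverflowDemo {
    static long factorial(int n) {
        long result = 1;
        for (int i = 2; i <= n; i++) {
            result *= i; // wraps silently on overflow -- no error, no exception
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(factorial(20));  // 2432902008176640000, the largest n! that fits in a long
        System.out.println(factorial(21));  // negative: the product wrapped past Long.MAX_VALUE
        System.out.println(factorial(100)); // 0: by then the product has accumulated 64+ factors of two
    }
}
```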

Don't blame C/C++; blaming the tool is generally a bad idea (now, if your school teaches cooking using flamethrowers, that's a different story...). Many languages make no promises about tail-call optimization. A lot of languages don't have "big numbers" support (as in Lisp) -- even "modern" ones. Very few languages will accept recursive data structure definitions without using pointers (or references if you want, but the label you put on this can of worms doesn't make much difference). Some will actually give a right-ish result because they use floating-point numbers internally, but then you might have to explain why the hell 1/10 does not give exactly 0.1 (which is why, in some algorithms, you compare not-so-real numbers using <= instead of ==).
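The 1/10 point can be demonstrated in a few lines of Java (a minimal sketch):

```java
public class FloatDemo {
    public static void main(String[] args) {
        double sum = 0.1 + 0.2;
        // 0.1 and 0.2 have no exact binary representation, so the sum
        // comes out as 0.30000000000000004, not 0.3.
        System.out.println(sum);
        System.out.println(sum == 0.3); // false

        // The standard workaround: compare within a tolerance
        // instead of testing for exact equality.
        double eps = 1e-9;
        System.out.println(Math.abs(sum - 0.3) < eps); // true
    }
}
```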

You cannot escape the fact that computers are mechanical machines, not abstract math machines (unless you load a program that does precisely that). Part of "the Art of Programming" is about overcoming that limitation.

Abstractions are lies by definition. Lies are sometimes a necessary evil, but abstraction abuse is a poison.


I’m neutral on whether knowing how to compile from terminal is essential or not, but I disagree with this:

> Compiling code and setting up a proper coding environment is a solved problem that isn't worth the headspace of a young and aspiring programmer. In the modern day, you can open up the console on your browser and start writing and running code just like that.

No one should get into writing more complex programs without understanding the basics of the tooling, and at most we have a few auto-generators that will create a skeleton, which can and will break with even the slightest change.


In Romanian high schools they learn C++, and maybe some high schools still teach Pascal. This is the reality. Before high school I helped him with some Lua scripting in Garry's Mod, so he is already a bit familiar with concepts like variables, ifs, comparisons, etc.

>The reason why it is not "junk" is that it will teach him a bunch of important things: that you can "talk" to a computer using different languages, that those languages have different capabilities, different properties and different grammars (syntax), and that languages are all about grammar and semantics.

My point is not that he should not know about the low level stuff eventually, but keep that for the second or third year.

But in the end the reality is that we write text and some tool makes it run and print stuff on screen. It is not important that everyone understands that the compiler has three parts/stages and that the linker searches for stuff in some weird paths using some weird algorithm... For the first years of teaching I think it is OK to focus on understanding the fundamentals and not the tools; we would waste our time teaching them about the C++ linker or the Java classpath if they will use Python or JS in the real world.

I identified two hard parts:

1. Creating the solution for the problem, like, say, adding all numbers from 1 to n. My son can't invent the solution; I have to work through similar problems with him over and over again until things connect (it is the same for math and physics: we repeat the same type of problem until it makes sense and he can solve it himself).

2. The syntax. I can work with him and make some simple programs, and the next week he fails to use "cout" and "cin" properly. This means I failed to notice that he did not properly understand how this crap works, and I have to do more examples with cout and cin too.
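For what it's worth, the "add all numbers from 1 to n" exercise from point 1 might look like this (a hedged sketch in Java rather than the C++ used in class; method names are invented):

```java
public class SumDemo {
    // The step-by-step solution a beginner typically writes first.
    static int sumLoop(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++) {
            total += i;
        }
        return total;
    }

    // The closed-form shortcut, once the pattern clicks: n * (n + 1) / 2.
    static int sumFormula(int n) {
        return n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        System.out.println(sumLoop(100));    // 5050
        System.out.println(sumFormula(100)); // 5050
    }
}
```

Seeing that the loop and the formula agree on many inputs is itself a good exercise in checking one's own solution.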


I think you hit the nail on the head. For a beginner, learning the language is so much more important than the compiler, and there are only so many things you can pay attention to at a time. When you're starting out it's very important to learn only the minimal subset needed. Later you can expand if you need to, but I have been programming for close to 2 decades and I can count on one hand how many times I had to call a compiler directly. I didn't code much Java though, so maybe there it's different?


I have been programming for 3 decades, the last 2 as a day job. It seems to me that there is a deep cultural chasm around the turn of the century. Before 1990 you had next to no Internet, and computers were shipped with book-sized printed manuals that tried to explain how the thing worked, not just how to operate it. After 2000 they began to come with a single sheet showing how to plug them in and a www address for perverts who want to download manuals.

However it's not entirely a matter of when you started to touch the keyboard. Around me I see "people who understand" and "people who use" of various ages. Guess who helps who more often.


The point we are discussing is teaching in the first years. I agree that you don't want to hire a C++ programmer who doesn't know what "linking" means, or a Java developer who does not know what the classpath is, or a JS developer who does not know that he can leak memory too and should sometimes clean up manually in JS as well.


In my experience the people who are helping are the ones who use tools that let them work more efficiently. They have more time to learn new things, while the ones who use less efficient tools spend much of their mental effort on the tools themselves.


I’ve been programming in Java for about 10 years now. I would consider myself an expert - I’m not just some guy who built CRUD apps; I’ve gotten messy with Java. I came up programming for Minecraft servers - lots of interesting challenges there.

The only time I have ever used javac was for fun. Not once professionally.


I think I'm in your camp. When you're a programmer, there's a lot of things you rely on that have been done by someone else and you don't really understand, neatly packaged.

There's not much point in teaching specific things, because there are thousands of specific things that make a program work. Yesterday my code failed because the whitelist at the server was checking my ipv6 address instead of the ipv4 on the website interface. The day before I was stumped on a Rust borrow checker thing.

You end up learning the things that you run into. If you've never needed compiler flags, it's actually not that easy to learn without a motivation. Same goes with memory barriers, garbage collection, and so on.

There's just so many rabbit holes. Telling people they need to know this or that is futile, what they need to know is how to figure out whatever their issue is. Some common tools with some common comments about pros and cons, warnings about where the caves are and what's more trivial than it looks.

My wife is starting a CS degree, and I'll be looking over her shoulder providing this kind of advice, rather than recommending anything in particular.


It's not that you need to use it. It's that you need to understand classpath and classloaders to solve some problems. And if you always use IDE automation there's no incentive until it's too late. Stuff is autoimported and autoinstalled. There's dozens of forms where you can add paths to classpaths depending on what button you press to run stuff. It's all magic when it works, and when it doesn't work it's random changes in random places to fix it. There's many more variables than if you just run the compiler from the command line.
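To make the classpath point concrete, here is a minimal Java sketch of how a classpath problem actually surfaces at runtime (the missing class name is invented for the example):

```java
public class ClasspathDemo {
    // Asks the classloader to resolve a class by name,
    // exactly as the JVM does behind the IDE's "magic".
    static boolean canLoad(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            // This is the error an IDE's automation normally hides from you.
            return false;
        }
    }

    public static void main(String[] args) {
        // Resolves: java.util.ArrayList ships with the JDK itself.
        System.out.println(canLoad("java.util.ArrayList"));
        // Fails: a hypothetical class that is on no classpath entry.
        System.out.println(canLoad("com.example.NotOnTheClasspath"));
    }
}
```

Once you've seen `ClassNotFoundException` raw like this, the IDE's many classpath dialogs stop being magic and start being just different ways to set one variable.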

Good education should give you understanding of the basics, not just higher-level stuff. It's easier to learn high-level stuff when you got the basics down than to unlearn the "magic black box" approach and dive into basics when you need it.


> And if you always use IDE automation there's no incentive until it's too late.

Too late for what, though? At any point in time there's a certain subset of knowledge that's most relevant to what you're trying to do. If that's learning overall programming concepts rather than the internals of a language which you may or may not use, then NOT using the IDE automation would be detrimental.

Furthermore, in every single project that I've worked on professionally, attempting to manually manage dependencies and the way things are compiled would earn me odd looks, since that's far more tedious and less productive than just using Maven/Gradle. Not only that, but manually attempting to manage imports, as opposed to letting my IDE do everything that's needed in the background, would provide me precisely 0 benefit.

In my eyes, "It just works", isn't a bad thing at all, as long as you:

  - can look into the lower level mechanisms (but only once actually needed)
  - can find information about all of it, to make the above easier
Why would you "unlearn" something if you'll still use it in 95% of the cases, the lower level approaches being the outlier?


> in every single project that i've worked on professionally, attempting to manually manage dependencies and the way things are compiled would earn me odd looks, since that's far more tedious and less productive than just using Maven/Gradle

Of course, we're not talking about doing javac manually when you're working, we're talking about learning the basics before going on to use the automation. BTW Maven and Gradle have plenty of gotchas of their own. If you don't understand what exactly happens you won't solve "magical" problems like old builds of non-snapshot versions remaining in the repo for example.

> manually attempting to manage imports as opposed to letting my IDE do everything that's needed in the background would provide me precisely 0 benefit.

you say that, and just last month I've had to help a guy manually remove an import of Token class from a wrong package that "magically broke his code" :)

> Why would you "unlearn" something if you'll still use it in 95% of the cases, the lower level approaches being the outlier?

Because I've seen a lot of people decide something that always worked is not their responsibility to fix when it doesn't work. That's how you end up with 1 guy in the company that understands the build system :)


> you say that, and just last month I've had to help a guy manually remove an import of Token class from a wrong package that "magically broke his code" :)

Hmm, the IDE should offer you multiple classes to import from in a dropdown if there's any ambiguity, as well as highlight any of the methods that aren't present or whose signatures differ, and provide popups with essentially the same information that you'd get from the compiler, without even having to compile the project. In that regard, I'd say that IDEs are actually superior. A single ctrl+click that takes you to the source of the class that you've imported should also clear up any confusion about the import.

Importing the wrong thing and not bothering to figure out why their code doesn't work honestly feels like not being familiar with the language itself (as opposed to the tooling) and for some reason not trying to figure it out on their own by reading what the IDE is telling them - to me, that does sound more like learnt helplessness than anything else, which would also manifest itself when writing code and running into design patterns that they don't understand, as well as any number of other situations.

> Because I've seen a lot of people decide something that always worked is not their responsibility to fix when it doesn't work. That's how you end up with 1 guy in the company that understands the build system :)

To me, this also feels like more of an organizational issue: if companies don't have a RACI matrix (https://en.wikipedia.org/wiki/Responsibility_assignment_matr...) and have a low bus factor (https://en.wikipedia.org/wiki/Bus_factor), then that's on them. If no one takes initiative and the company fails to find employees who will, that's also on them. If only one person understands how things work and doesn't document it, nor are there processes in place to ensure any sort of knowledge transfer, then that's also on them.

That does sound like a dysfunctional environment and at that point builds will be the least of your worries - figuring out how to get code to prod, how to work with the DB and run tests (if any are present), as well as any other steps that haven't been formalized with an "Infrastructure as Code" approach will cause far worse problems down the road. Good luck if any of your servers ever go down and you haven't used something like Ansible for configuring them, but instead it was done N years ago by someone who's long gone along with their knowledge about how to do that.

If you work in such an environment and care about personal success, consider either changing the company or changing companies.


> Importing the wrong thing and not bothering to figure out why their code doesn't work honestly feels like not being familiar with the language itself (as opposed to the tooling) and for some reason not trying to figure it out on their own by reading what the IDE is telling them - to me, that does sound more like learnt helplessness than anything else

To be fair, they have 30 files open in the IDE and 100 lines are highlighted because of spelling errors; it's easy to miss the line highlighted in red as opposed to yellow or black. If you look at the console output you will know immediately why the build failed, but that's hidden behind 5 different things begging for your attention.

It is learned helplessness, but there's a reason people learn it - they went straight for the industrial tools with dozens of variables that were never explained. They never learnt how it works under the hood and what all the knobs do. They were taught to follow the tutorial, ignore the warnings, leave the defaults as is and click the big green triangle to run the program. 99% of the time it works and when it doesn't maybe start from the beginning again and follow the tutorial more carefully :)

It's a generalization of course, there are people who problem solve better and worse in any demographic, but in my opinion the tendency is - depending on tools and automation early in your education makes you a worse problem-solver.

> If no one takes initiative and the company fails to find employees who will, that's also on them. If only one person understands how things work and doesn't document it, nor are there processes in place to ensure any sort of knowledge transfer, then that's also on them.

There's organizational goals and processes, and there's company culture. The latter is far more important in practice. And it's mostly shaped by the attitudes and expectations of people, not by putting a document on company's wiki or talking about it in a meeting.


> To be fair they have 30 files open in the IDE and 100 lines are highlighted cause of spelling errors, it's easy to miss the line highlighted in red as opposed to yellow or black. If you look at the console output you will know the reason build failed immediately, but that's hidden behind 5 different things begging for your attention.

Then why not address those things first? Most IDEs should allow you to set up inspection profiles or customize the spellchecker and version this configuration so all developers would get a consistent and noise free experience, just like you should be doing anyways when not using an IDE but instead using a code editor together with a linter/formatter and Git hooks.

JetBrains products are actually pretty good in that regard. Personally, I've sometimes enabled almost all of the warnings, apart from some of the conflicting ones, just to learn about more concerns that I would otherwise not have thought of myself. Not something most people should do in their everyday lives when they want to get things done, but being able to do so is nice.

> They were taught to follow the tutorial, ignore the warnings, leave the defaults as is and click the big green triangle to run the program.

That sounds like that joke about everyone ignoring 99 warnings within their project because the code compiles and runs. Once again, I do believe that this is a bit orthogonal to the IDE vs. code editor debate, because if there are warnings within your project you should address them regardless of the tools that you use. If it's a bad tutorial, then why are you using it?

Of course, one can also talk about how defaults should be sensible and the default configurations/examples should never have warnings, but I guess that's just what you get when people don't pay attention to the quality of the things that they make, which is at the very least understandable in our current world.

> It's a generalization of course, there are people who problem solve better and worse in any demographic, but in my opinion the tendency is - depending on tools and automation early in your education makes you a worse problem-solver.

This is an interesting argument. I don't necessarily agree that automation is the cause of this; it is merely a canary of sorts. Perhaps the people who don't have the patience to struggle with the "traditional/old" way of doing things would simply not stick around in the industry for long (this probably applies to situations where they'd have to write ASM instead of Python, for example), due to the frustration of it all and of not getting any demonstrable results early on, which is extremely discouraging, as opposed to the more patient and persistent people. In that regard, IDEs indeed enable a wider variety of people to stay within the industry.

I cannot comment on whether that's a bad thing or a good thing, much like some people said that we'd not always have calculators with us and yet almost everyone does have a smartphone in their pocket.

> There's organizational goals and processes, and there's company culture. The latter is far more important in practice. And it's mostly shaped by the attitudes and expectations of people, not by putting a document on company's wiki or talking about it in a meeting.

This is debatable. Those two are not mutually exclusive. Look for environments where both live up to your standards and contribute to both in a positive way.

I certainly have. For example, in one of our projects all of the configuration now lives in Git and is managed through Ansible, so we know when, why and by whom things were changed. I've also made running services more consistent, containerized apps to get rid of the dependency hell, and written plenty of scripts automating the bits where people made mistakes, e.g. long DB migration names that need a particular format like V20211107.5.4.2.21.1.0_Alter_Some_Table_Do_Something

At the same time I've also onboarded new people, helped them get started, explained things both about the projects and about the industry in general, and pointed them towards useful learning resources. Being pleasant to work with and having a healthy environment doesn't need to come at the expense of anything else, apart from maybe people's egos sometimes (including mine).

Of course, no one wants to do things for just putting a checkbox in some corporate form, but in practice most of the meaningful ways to minimize risks and make people's lives easier down the line are worth it and there are no excuses not to implement them in any mature company. That's why having documentation and enforcing that it's present is a good idea, especially if you do IaC and most of it is actually code, that's commented as well as the rest of your codebase should be.


You realize the IDE relies on the compiler to do all that magic stuff, right? So when you get to the point where you have a build error because some odd/unexpected import/target all of a sudden everything in the project is an error and the IDE is suddenly not so helpful.


> So when you get to the point where you have a build error because some odd/unexpected import/target all of a sudden everything in the project is an error and the IDE is suddenly not so helpful.

This has not been my experience, at least with IntelliJ in the context of Java projects. Bad imports result in the IDE highlighting the erroneous usages of the imported class, by showing which methods aren't available. The red squiggles provide immediate feedback and let you know whether the problem lies in the method parameters or the method itself isn't found, before you try to compile/run anything manually. Tooltips with the error contents are also pretty useful; no need to manually go to the file/line based on some text output in a terminal.

Bad language targets/JDK choices are far "louder" as far as the number of errors goes, but they're easy to figure out, since if you ignore everything being red in the IDE and instead try to compile the project, you'll get the exact same output that the compiler generates. It's just a matter of Googling those errors, regardless of how you compile your projects.

The only problems of this sort that I've run into have all been because of the Spring framework and its configuration mappings not being detected properly, because some developer decided to initialize it in a non-standard way that confuses the IDE. Then again, using a code editor instead of an IDE wouldn't provide any useful functionality/hints for Spring either, so I guess that ended up being roughly equivalent.


I should preface this by saying my experience primarily comes from Scala and .NET; I don't really use Java.

Sure, you can play Google whack-a-mole, but a lot of the time the compiler message alone isn't going to help you, and when you Google the message there are 20 questions on SO where the top answer has 1000 votes and says something like "delete the bin folder and restart".

But that's not the issue. The issue is that you're importing two projects that target different .NET frameworks.

The dependency project builds fine on its own.

The project using the dependency just loudly proclaims that the imported type doesn't exist.

Rider gives you a few squiggly lines and tells you "Computer says no"

You yourself have to figure out that one of the projects has changed its target framework while the other one hasn't.


I spent about 3 years in the 90s needing to mess with Java code. I would not dream of going through all those giant piles of enterprise code without an IDE.


It’s good that we don’t throw people directly into a professional environment when teaching them how to code, no?


I see no point in wasting time. I learned programming on my own. I started with machine code. But I grabbed every tool that could help (including IDEs) as soon as they appeared. Programming to me was/is a tool I use to make products. I see no point in suffering for the heck of it. Instead of fucking with 10-mile-long command lines and remembering every squiggly thingy, I think it is better to explain to students how computers really work.


It's nice to learn how tools work internally, so you can fix them when they break. Once you know how to do that, it's fine to graduate into something that does the work for you.


I have yet to write Java bytecode manually, but a high-level understanding of it does make me a better developer, in my opinion. It's not about the number of times one uses it, but about knowing the underlying abstraction.


Yes, what a pity. They will have to read the compilation error in the IDE's "compile" window and not the console. Or just see the red squiggly lines in the code. That is a problem for some reason.

And you know what happens when people use a black box and it stops working? They start learning how it works then, when it actually has some relevance for them. Until then it's just a more complicated way of doing things that doesn't benefit them in any way whatsoever.


> people use a black box and that stops working? They start learning how it works then.

No, they don't. Because they don't even know where to start. They never learned anything outside of the compile button and the project settings.

I know because I was in that boat once. Nothing ever boosted my programming skills more than learning the basics or another layer of what's actually going on behind the abstractions.

You also need to consider hardware. School computers are not very good in general. The one I had to use for Eclipse took at least 2 minutes to start the IDE, until it finally reacted to mouse clicks. You can guess how fun the rest was. But I didn't even consider using a lightweight tool back then. Why? Because the only editor I knew was Notepad, and nobody would use that voluntarily. And nobody taught anything other than fat IDEs. There was a time when I'd been programming for over a year and could not code in languages like C because the program didn't have a Compile button. It was confusing and I gave up.

You underestimate how bad an IDE-only approach to teaching programming is. It comes with batteries but also learned helplessness included.


>No, they don't. Because they don't even know where to start. They never learned anything outside of...

This is fixable though: instead of just telling them to use the tools, you can teach students how to use them well. It also doesn't hurt to inculcate a mindset that shows them how to look for help when they are stuck. Discarding the IDE entirely is throwing the baby out with the bathwater, in my opinion.


I'm not saying they should ignore IDEs entirely. But teaching everything in the context of IDEs is limiting; it hides a lot of things that students don't even know exist. I went through 5+ years of education without knowing what e.g. the classpath is. Guess how much fun it was to fix classpath issues for the first time. One single lesson outside the IDE would have filled this gap.

> instead of just telling them to use the tools you can teach students how to use them well

Yes, that would be great, if only I'd ever met a teacher able and motivated enough to do that. Incidentally, it's easier to teach how to run a couple of CLI commands and read the documentation than to explain a behemoth including everything from version control to debuggers, all thrown in your face at once.


> One single lesson outside the IDE would have filled this gap.

Hmm, what would that lesson look like? I doubt there is a way to teach everything about out-of-IDE/advanced Java development in just an hour or two. You mentioned CLASSPATH, but there are also JARs with manifest files, there is a multitude of build systems with their own configuration languages/plugins/conventions, there are ClassLoaders, there are Java compiler plugins... Just too much.

You either list terms for two hours straight without any details and exercises or you choose some semi-random subset of topics, which may very well not include CLASSPATH for some reason. Alternatively, you teach for ten hours instead of two.


>and read the documentation

This part stays intact either way though. You're not wrong that teaching students to run a couple of CLI commands is easier, but they still need to understand what they're doing, otherwise it can be just about as opaque as clicking a button in a graphical program.

Considering the vast majority of them will be using IDEs if they are coding in Java professionally, personally I feel it is worth taking some extra time to at least give them a starting-off point and then letting them access the documentation that the IDEs themselves provide. Whether this comes after teaching them the basics of the language through the barebones text editor + CLI tools approach or is what they are taught from the ground up is a matter of choice.


So you are saying they don't know where to start, and as an example of someone not knowing where to start you present... yourself, doing the exact thing you claim they can't do.


> yourself, doing the exact thing you claim they can't do

I'm confused. Where did I describe doing something as example of not being able to do it? Maybe I worded something wrong. My point was just that I was completely lost and confused in many areas for years until I went out of my way to learn things properly, despite everyone teaching me to not do it like that.


You wrote:

"No, they don't. Because they don't even know where to start. They never learned anything outside of the compile button and the project settings. I know because I was in that boat once."

The original statement was that people can learn about lower-level details when stuff breaks. You quoted this statement, and then added the claim that they can't learn [when stuff breaks], "because they don't even know where to start". You supported that by adding personal experience: "I know because I was in that boat once".

However, for other people in that proverbial boat, it seems to me that they can do the same thing that you apparently did, which is to learn what they need to know - something you deny is possible: "no, they don't."

Also, since we are quibbling about text interpretation, I'm fairly confident that nobody is _teaching_ you not to use a non-IDE approach to development. Maybe they are advising it (on the basis of IDEs being more approachable to a student) or demanding it (on the basis of an entire team working with the same tools), but _teaching_? I don't think that's right.


I think it’s time to move on :-)

That ship sailed long ago. There's honestly no need for the majority of programmers these days to go the CLI or vim/emacs way, or whatever the one true way is now. It's just not needed.

And the ones who need it for their work they learn it, know it, and do it.

Or they do it for the fun and satisfaction of maintaining, running, and keeping oiled something antique and/or analogue.


Good luck throwing CI scripts at someone who can only drive an IDE.


Everybody who can code started out not being able to do either of those things.

Cross the bridge when you get there.


I chew CI scripts for breakfast and CD for lunch.

Jokes aside, it's a decent part of my day job and I've never faced any problem. Neither have my younger colleagues, who were never even exposed to those esoteric programming thingies.

The problem is the belief that the CLI or mildly esoteric CS practices and skills have to be some kind of mandatory initiation.


> There’s honestly no need for majority of programmers these days to go cli

I never did CS classes (I taught a couple once).

I agree that you don't need to learn CLI in the first year.

The idea that the majority of "programmers" can tackle production code without ever being introduced to a CLI is terrifying, though. I'm astonished to learn that some people think you can get a Computer Science degree without learning processor architecture, IO, operating systems and so on.

FWIW, I hate git. It's the devil to learn, and it solves a load of problems I've never had. I wouldn't inflict it on a first-year student.

Of course you need to know what a file is to use git; but you need to know what a file is to use any VCS. I don't see how you can teach someone programming before they know what a file is.

Is a CS course now little more than a way to get an introductory programming qualification?


Everything does seem like a mystical black box with no way forward when you're just getting started. Does not matter if it's GUI, CLI, IDE or a raw compiler.

What is `SyntaxError: invalid syntax` for `print(hello world)`? What is `NameError: name 'Print' is not defined`? I don't have any names, I just want to show a string.
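
Both of those errors can be reproduced deliberately, which is itself a useful first exercise: the messages can then be read calmly instead of mid-panic. A small sketch (the embedded snippets are intentionally broken):

```python
# Reproducing the two beginner errors mentioned above, on purpose.

# 1) `print(hello world)` never runs at all: the parser rejects it
#    before execution, so this is a compile-time SyntaxError.
try:
    compile("print(hello world)", "<lesson>", "exec")
except SyntaxError as err:
    print("SyntaxError:", err.msg)

# 2) `Print` is a name lookup, and Python names are case-sensitive,
#    so the interpreter reports an undefined name rather than a typo.
try:
    exec("Print('hello world')")
except NameError as err:
    print("NameError:", err)
```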

The only differences are: a) amount of new things to learn (e.g. students already know how to use GUI, but learning CLI is a whole separate skill); b) quality of the tool (e.g. how relevant/readable error messages and configuration are); c) your ability to get help with the specific tool you're using.

You have to stop digging into details somewhere. You can always go deeper, if you want.

In my opinion, it's perfectly reasonable to learn one thing at a time. You can start with learning CLI and then learn Java using javac. You can also start with learning some IDE and its build facilities and use Java completely inside it.


What sort of compilation problem would that be that's only solvable by manually running javac outside your IDE?


> What sort of compilation problem would that be that's only solvable by manually running javac outside your IDE?

Figuring out compilation errors when the IDE's configuration and the known system environment variables hold different values. Notably the resolution of classpaths.
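
A hedged sketch of that failure mode (paths and versions are made up for illustration):

```shell
# Hypothetical: the IDE compiles against its own project settings,
# while a bare `javac` in a shell silently falls back to $CLASSPATH.
export CLASSPATH=/opt/libs/guava-19.jar   # stale machine-wide value

javac Main.java                           # resolves classes against guava-19
javac -cp libs/guava-31.jar Main.java     # an explicit -cp overrides CLASSPATH
```

The IDE build and the shell build then disagree about which classes exist, and the errors look baffling until you know where javac gets its classpath from.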

The fact that you are asking about it is scary.


Way to gatekeep, bravo.

I've programmed java professionally for 6+ years, never had to deal with that.


Even scarier that you have that attitude, given your experience.


Or maybe you're just wrong? Or maybe I'm out there solving business problems while you're dabbling with your tools?


All these things are exposed by the IDE if you are willing to look for them. This kind of purism is pointless in my opinion. I did start off with just vi and javac and never complained, but I feel like it's just a matter of knowing the tools you use, even if some of them are an abstraction over the barebones stuff.


We're talking about how you should learn to use the tools you are using.

Talking about implicitly learning from your IDE is a lot like learning to program with React versus the browser console (or a sandbox). I'm not sure how 'purism' applies.


> We're talking about how you should learn to use the tools you are using.

I don't think there's any difference between using javac and using an IDE. They're both tools that you configure and then press a button to run. What's the difference?

And the danger with saying 'you should understand it at the lower level' is anyone can challenge you if you understand the tools you use at a lower level, and at some point you won't, because nobody understands it all, and then they can point and laugh at you like you're pointing and laughing at others.


An IDE and a command line are just different interfaces to the same thing, the compiler. One is not somehow superior to the other, in principle they are interchangeable.


I'm not saying it has to be implicit. If we are talking about a student environment there's no reason why educators can't do a deep dive to explicitly learn the features of the tools they are using.


I'm viscerally in this camp. I just cannot fathom using a black box. I went to college to engineer, not to reuse, at least not blindly. And things like Eclipse have just too many layers and principle-less helpers. Even as an intern, when we had to make tiny Eclipse wizards (which didn't add anything of value, just cute Visual Studio-style helpers to set up some files), I was physically unhappy, because it's the opposite of how I'm wired. Whenever something doesn't go your way, well, you're back to square one, having to understand Java, javac, the JVM AND the Eclipse stack and logic on top.


Do you have any clue how many layers there are below the java/javac layer? What made you decide that that particular layer is where it's at, and that just one layer up everything is bad?


Good point. Somehow the compiler is the critical tool; Eclipse was failing to provide value while impeding my work, too early for me. I was happier with a makefile most of the time because I had the information in short form under my eyes. With the JVM you have a spec, it's somehow meaningful; Eclipse's APIs were quite horrendous, and if you need to go that low to understand why your project fails to run, it's not a good tool.


I completely understand that. I'm not a java programmer; my experience is limited to fixing one tiny bug, in one java program, and I struggled a bit with Eclipse before realising building from the commandline was also possible and pretty simple.

But I do most of my work in Visual Studio, which I prefer over vim+make+gdb because a powerful IDE does give you a significant advantage over a 'mere' text editor. I also find that working in vim imposes a much higher mental load than doing the same in MSVC (and don't get me started on vi without the 'm'...).


I have no conclusion on the matter to be honest. Like many I wasted time trying to setup emacs for language L and had a broken support system (especially regarding debugging). But I often felt that IDEs brought as much cost as value. And up until very recently, linting and ergonomics were below what I'd got with emacs (and if you've seen the regular posts about magit you have an idea of what I mean).


I pity the HN commenter who only knows how to hit the "power on" button on his computer and not how the CPU will actually power up and start running instructions.


> everything will seem like a mystical black box

The whole reason computers work at all is because of the many many layers of abstraction. At some point everything seems like a mystical black box to you too.

Sure it might be at a lower level. But there's always a point where you get to "I just do this and it does what it says it should. I don't understand how it works." Do you feel like a terrible thing has been inflicted on you because you don't know how the Linux kernel works internally, or how `idiv` is implemented, or which SRAM cells your CPU uses?

You have to stop somewhere, and running javac manually is something almost no Java developers do so learning it definitely seems like a terrible thing to inflict on beginners.


Java is a terrible thing to inflict on beginners


My favourite programming language is supreme because of some objective reason and totally not because it’s my favourite.

Yeah we’ve all heard that one before


Hard to disagree with that! Though it does have some redeeming features: it's statically typed, runs pretty fast, has absolute rock solid code intelligence (the best of any language in my experience) and a ton of learning resources.


Unfortunately it's ridiculously verbose, and its static types mostly aren't inferred.


Sarcasm? I hope?


You have written exactly what I think about this, but more clearly than I could write it down.

Every rebuttal and complaint against this seems to centre around how the moment someone touches an IDE they become a gibbering idiot who is unable to ever learn anything else, as if it is impossible that the next course could be about how the programs they've been writing are all processed and put together by the tool chain.

I once spent a very frustrating and miserable day in the computer labs at university unable to make a program work, and got zero points on that homework, because of a missing bracket. Did I learn anything about programming? Or how computers work? Did the application of anything I'd previously learned about computer science or programming work? No. I learned nothing. I had no time for that, because none of it was relevant. I'd simply missed a bracket very early on in the program, wasn't experienced enough to see it, and that was that.

I did learn something critical though, later on, from a friend: "here, use this NetBeans program and turn on 'syntax highlighting' and check out how it points out 'syntax errors' just like how a word processor highlights spelling mistakes". And from then on I could actually learn programming, and later on that semester we learned about javac and putting stuff together on the command line and big O notation, and all the rest of that good stuff that people have moral panics about not starting with.

It's like everyone's decided to have absolutely zero concept of time, or parallelism, or forgotten that students spend more than 15 minutes of their entire life studying before being ejected into the workforce to spam tech debt everywhere. They're taking multiple courses in parallel per semester, followed by more semesters of more courses, followed by more years of more semesters. There's time to learn more than one thing, and space to not have to learn it all in the same place, and a student studying one thing at a time when they're starting from zero is not mutually exclusive with them becoming a skilled programmer who enters the workforce and writes quality code to create useful programs.


Both of you have made great points.

A better idea might be two-pass learning:

The first pass: Something pure like Structure and Interpretation of Computer Programs, but more approachable for freshmen. And a more beginner-friendly language like Python is preferable. The key is to: 1) Focus on the idea of computation without any distractions. 2) Be less scary. I remember how I gave up C++ several times when I tried programming all by myself when I was a kid. A decade later I decided to try again, and it's fun because it's JavaScript and I can make useful stuff from day 1.

The second pass: Something like Computer Systems: A Programmer's Perspective. It goes down the implementation rabbit hole and doesn't skip the essential details and tools, leaving the students well equipped to deal with real-world problems.


Item 1 already exists: https://composingprograms.com/


I took a look, because I thought: "Do they even touch on recursion and all the things that SICP does using Scheme, which are usually inelegant or cumbersome in Python? Let's check recursion."

They do have the examples at https://composingprograms.com/pages/17-recursive-functions.h... but nowhere on the whole page is the word "overflow" or "stack" even mentioned. Almost as if one does not want to mention that recursion is a bad idea in Python in cases where you do not know the recursion depth, because of the limitations of Python itself. So the book teaches an approach that often might not be a viable solution, simply ignoring the problems of the chosen tool. On the surface the book looks like good teaching material, but these problems need to be stated so that beginners know about them.

Don't get me wrong. This is already a much better learning experience than with most C/C++ tutorials, but it definitely leaves things to be improved.
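
The omission matters because CPython has no tail-call optimization and caps the call stack, so SICP-style recursion fails past a depth of roughly a thousand frames by default. A small sketch of what such a page leaves unsaid:

```python
import sys

# A naive recursive sum in the SICP style. Fine for small inputs, but
# CPython does not optimize tail calls and enforces a recursion limit.
def rsum(n):
    return 0 if n == 0 else n + rsum(n - 1)

print(rsum(100))                 # 5050
print(sys.getrecursionlimit())   # typically 1000 by default

try:
    rsum(100_000)                # far past the default limit
except RecursionError as err:
    print("RecursionError:", err)
```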


This post resonates with me: I never really learnt much about javac and haven't needed it in the ~5 years I've been working with Java daily. For the most part, working with Ant/Maven/Gradle is enough and is actually more important to know, whereas javac errors during a build can mostly be addressed with a quick Stack Overflow search.

Knowledge about the lower-level stuff is probably nice to have, yet not really needed to be productive in a professional capacity within the industry, unless you're not using any sort of build system (a red flag, if you expect to manage dependencies easily), or are doing something more advanced and specific. It's like using a Linux distro daily but not really knowing how to, or caring about, compiling your own software: when the Wi-Fi drivers fail to work on new hardware, that'll probably become relevant, but otherwise not necessarily.

However, I'd also like to go in the opposite direction and warn against the dangers of not introducing students to higher-level tools like IDEs and linters properly. I was one of the unfortunate few who were also taught Java in university and had to write it on paper, as well as in Eclipse/NetBeans. The problem was that none of the staff ever introduced us to the actual capabilities of the IDE, or told us about alternatives like the JetBrains products, or talked about readability or refactoring at all (neither in the Bachelor's nor the Master's courses in CS).

Thus, I saw some students write code without readable indentation, comments, or understandable variable names, treating the entire Java language like some pseudo-assembler and happily ignoring any and all of the IDE's suggestions to fix errors or improve the way their code was written. Not only that, but they were also not familiar with any of the refactoring tools, even the more basic ones like extracting a variable or a method, or renaming variables, or even code generation capabilities like getters/setters, equals/hashCode, toString, constructors/builders and so on.

To them, these IDEs were just like using Vim at a very basic capacity - knowing how to enter the insert mode and not much more, thus gaining no actual benefit from the tool in the first place, though actually working in the industry would probably teach this to them, due to the necessity to follow code style guides or refactor code in legacy codebases.


Strongly disagree with "full teams can and have built huge java projects just by using the intellij project wizard and maybe a tiny bit of Gradle from stack overflow." I wouldn't want to work with someone who doesn't understand how build systems work. Specifics are irrelevant but knowing how to work with one build system means you don't have some vague textbook picture of how it works but you actually had to see it how it glues together. IDEs are nowhere near good enough to abstract that complexity away and it's a big part of day to day work.


> I don’t agree with this perspective: why do entry-level students have to understand the internals behind compiling their code, and weird edge-cases like misspelling import?

TFA doesn’t say that. It says to teach entry level students about control structures and such, and then, after two semesters or so, to teach them about those internals.


I think the key problem here is that most people here are programmers and have forgotten what it means to NOT be a programmer.

Moreover, most are assuming that everyone who is taught programming is going to become a professional programmer, as if a 100% class success rate were a given.

But unfortunately, no. Most intro-to-programming courses are dealing with the issue of "how to get these people over the hump and not have them quit", and nothing more.


I disagree but from a different angle than the other person: IME a major barrier to students being able to complete assignments was basic mastery of syntax. Now, you might think, but of course, they're students.

But I saw this as the main barrier at ALL levels and in all classes. People who could understand algorithms and linear algebra being unable to write a counting for loop that iterates over an array without looking it up and carefully copy pasting it.

There's just an innate familiarity with the text of your program and how it translates to an AST that you MUST command. And having tools that can "did you mean" auto-fix things doesn't help, especially with imports, when almost all languages have non-trivial import systems.

I obviously encourage teaching modern IDEs, but I think we would do students a disservice if we didn't make them use a fairly plain text editor for most learning. I started with BlueJ, which provided a "compile" button and syntax highlighting and that was about it, and I was later (after having stopped using BlueJ, of course) confused by people who were using Eclipse but couldn't get their imports working right.

Maybe that's my main point actually: none of these tools are good enough that they let you actually not understand the underlying system. They're just time-savers/auto-completes, which can confuse newbies.
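
For reference, the counting loop in question is only a few tokens of Java, which is exactly why its absence from a student's muscle memory is so telling. A minimal sketch:

```java
// A counting for loop over an array: the kind of basic syntax the
// comment above reports students looking up and copy-pasting.
public class CountingLoop {
    public static void main(String[] args) {
        int[] values = {3, 1, 4, 1, 5};
        int total = 0;
        for (int i = 0; i < values.length; i++) {
            total += values[i];
        }
        System.out.println(total); // prints 14
    }
}
```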


What you are calling "esoteric details" are the basic building blocks. Understanding them gives you a mental model that helps you solve the problems when they arise. And it's really not that difficult to grasp.


My entire batch and I vehemently hated how we had to configure confusing development environments before actually writing a line of code. That was a huge deterrent, and only a few of us have liked this profession since.

All those misspelled-import errors and command line tools are far easier to deal with than configuring an intimidating tool like Eclipse/JetBrains.


IMO this is all about curriculum structure.

By all means, introduce students to coding using the easiest, most engaging method possible. To me, that's seeing something big, visual, and getting right into the code as soon as possible, like using an IDE with a mostly working sample project they just need to finish something easy in.

Thing is, a good course should then, at some point, force the learner to understand what's happening under the hood, or at the very least, be interesting enough to draw them to want to understand it.

I don't really use Eclipse, but in the Visual Studio case it would be as simple as looking at the output window after hitting "run" and walking through what the IDE is actually doing in the background.


> When I saw the title my first thought was literally “yeah stop making students use eclipse, they should be using IntelliJ!”

I disagree with this point. Universities should stop driving students into the arms of proprietary software vendors. Eclipse does the job and works well enough, with a license that does not limit students later on. If they have learned Eclipse, they will be able to learn other IDEs later, if the job requires it.

Of course it would be good if, for a low cost in frustration and time, students could somehow learn what it all translates to on the command line.


To this and comments below -

We dance around it all the time, but there is no way around the inherent technical complexity of this field.

I started out with Python and Project Euler, went through JS and web apps "ELO hell", picked up some C to understand lower levels, ended up with a microcontroller - then went all the way up to category theory and Haskell / Scala coding.

If you ask me if I know how to code, my answer is all the more resolute - It depends.

More practical personalities will definitely manage better in the current landscape.

But the sands are always shifting.


> Actually students can solve most of the author’s problems by spending 5 seconds on Stack Overflow.

The problem is that those students will end up in the workplace one day, and someone on the team will have to help them solve problems that are outside the scope of the project and could be worked around by knowing the proper tools.

anecdotal example: I spend at least 4 hours a week helping colleagues fix JetBrains shenanigans that would be non-blocking if only they knew basic mvn or git CLI


The big problem seems to be that universities are running into this iPad generation of children who do not know the basics of how computers work and are organized, because apps have veneered over all of that detail.

It's not uncommon to see astronomy and biology professors lamenting that their students don't understand file directories, or realize that a file can be opened by different programs.


> When I saw the title my first thought was literally “yeah stop making students use eclipse, they should be using IntelliJ!”

Exactly the same


My first real uni programming class was Java and we used BlueJ. I tried to set up Eclipse but I was so freaking lost. And I've been a computer nerd for most of my life, just not a programmer at that time. I am so glad we were set up with the simpler IDE; learning your first couple of languages is hard enough as it is.


> Like writing code by hand and deducting points for syntax errors

We like to discuss the gap between university and enterprise, but for once both are (ironically) totally aligned: interviews at big companies are like this, with a whiteboard instead of a sheet of paper.


What kind of students are you talking about? University students who want an education in computer science, or people who want training in software development?


How is jQuery outdated?


your factual proof?


> why do entry-level students have to understand the internals behind compiling their code

It's a good primer before they write compilers in their later years of a CS degree.


IntelliJ looks better but does less for you. Eclipse by default compiles the whole project, for example, and by default shows all compilation errors. With IntelliJ you need to look for these.


> With IntelliJ you need to look for these.

Erm what? Click "build", and IntelliJ shows all compilation errors with their locations


IntelliJ also does on-the-fly compilation and checking, at least if you enable it - which may be a performance problem for huge projects, but definitely not for students.


In Eclipse, you don't have to press anything. You have a prominent "problems view" where they are all listed, essentially as you type. You don't have to expand nodes, and errors are all sorted on top. (And you can filter them in a million ways, easily, and make multiple copies of the view with different filters if you want.)

Then you see them in the package/project view too. Each package, project, and file with an error has a red icon on it.

IntelliJ does not show them in the project view at all. It shows them only after you press "build", so you need to develop the habit of pressing it constantly. Oh, and the console shows exactly one error. The others are hidden in a separate view, inside collapsed nodes, and you have to click multiple times to see them.


> IntelliJ does not show them in the project view at all. It shows them only after you press "build"

It does better than that: it shows them directly inline with your code, with all the tips and hints and refactorings to make it work.

When you change something to be incompatible with existing code, it will immediately show a handy link inline to display all the places across your code that depend on this code and are now problematic.

There's a problems view that has an overview of all problems in the current file and in the entire project (but true, a single overview of the entire project only appeared sometime in 2020)


What you are describing is quite literally what the intellij "Build Project" button is for.


Which you have to press. In the default Eclipse setup, it just happens without you having to press anything.


> Using Java or Python in a professional IDE like IntelliJ IDEA, NetBeans, PyCharm, or Eclipse is not a good first introduction to programming for computer science students, whether they’re in the field to become web developers, systems software engineers, or academic computer science researchers.

I was forced to program Java without an IDE and my experience is that I learned less in all those lessons over a year than in a few select days at work during my first year.

Furthermore, according to a friend who is a specialist teacher this is a common problem:

Most kids are inclined to see how things work as a whole, while most teachers want to teach analytically by tearing everything apart.

I'm grown-up now and don't mind diving deep into stuff (in fact I enjoy it) but in my understanding we really should try to go with the flow when teaching:

Teach kids the wonders of programming (and everything else) and then go into the details as needed.

Because, if my friend is right and I understood him correctly (he doesn't teach programming), most kids don't want to learn programming because they want to know how to please javac, but because they want to create cool stuff.


I feel like a lot of people misread the article.

It doesn't say you shouldn't use an IDE to teach students.

It says you should avoid professional IDEs, and that at some later point you shouldn't neglect teaching them about other things like compilers and files. At some point they should have an understanding of some of the underlying mechanics that a professional IDE abstracts away.

Also it's about teaching adults at University, not kids


When one teaches driving, does one start with the ignition system and the gas pump, or does one start with the high-level controls and how to behave in traffic?

Does one (ideally) start in a new, safe car or in an old one with an unsynchronized stick shift (the kind where you need to match the engine RPM before the gear lever will engage)?

Because that, in my eyes, is how programming is taught many places.

Edit:

What is the result of all this?

It takes longer to grasp the easy parts and the parts that fuel motivation. People drop out.

What if we instead taught students using proper tools (e.g. non-nerfed versions of Eclipse, IntelliJ, or NetBeans for Java) and then used the time saved in the beginning to dive deeper and deeper towards the end of the course?


The emphasis was on professional.

The article suggests using a simpler IDE.

You don't learn to drive in a formula 1 car


I didn't learn to drive in a Formula 1 car, and I don't drive a Formula 1 car with my driver's license. I drive a regular car that's quite similar to the vehicle I learned to drive in. As do virtually all drivers.

So if we go with the assumption that university is meant to prepare students for a job in the industry, and that the industry uses "professional" IDEs in 99% of circumstances, then universities should use IDEA and Eclipse too.


It's hyperbole.

Virtually everything is designed such that there are different versions for people of different skill levels. Golf clubs, bikes, skis, you name it.

University is about education: learning computer science concepts, not churning out Java Eclipse programmers.

There's no reason to use a tool designed to do work on an enterprise codebase to teach the concepts behind a linked list.


Eclipse is poor tooling for teaching. Far too many buttons and too much noise getting in the way. But the opposite, programming with no debugger and no linter, is just as bad. It punishes you for the smallest syntax mistakes by making you wait for the compiler and read its error messages.

A debugger makes it far better to understand the execution of code line by line, and should be standard in education, but it's not.

I feel that Visual Studio Code, with some modifications, would be perfect to begin with. Then, when building things at a higher level, introduce the reasons to use Eclipse, like how it handles packages, etc.


> A debugger makes it far better to understand the execution of code line by line, and should be standard in education, but it's not

In practice, from what I've seen teaching programming (not using Eclipse or Java), this tends to cause "debugging tunnel vision" instead, where someone will just keep stepping through code that has a problem and not even know what happened until the wrong result is seen.


To be fair I’m taking an assembly course right now, a massive component of which is learning (command line!) gcc...


Literally the second paragraph is about an alternative Java IDE suitable for teaching introductory programming


What kind of modifications?


I do not agree. If you follow a course about algorithms or data structures, or Java, or parallel programming, you should focus on that. Learning an operating system or the command line at the same time will introduce unnecessary complexity.

Anyway, when I was a student at university, most of my colleagues knew the command line and had a strong understanding of operating systems. By the time we did the OS courses, the only things we had to learn were the more advanced architecture and system calls.

As a working professional the IDE is a huge time saver. From quick refactoring to debugging, to step through, to step into, to watch memory, to renaming, to jumping to definition, to method or variable exploring, to function definition, to writing common patterns, to running unit tests, everything sits at one keyboard shortcut away. And the IDE is smart enough to correct my mistake as I am typing, so by the time I hit build, the software builds without errors.

Without an IDE, my development time will take longer.

And why shouldn't someone who aspires to become a professional use the tools a professional uses?

If anything, the IDE showing his mistakes might make it a more pleasurable experience, since he doesn't have to search Google or Stack Overflow as often. And by having a good experience, he will be encouraged to learn.


> As a working professional the IDE is a huge time saver. From quick refactoring to debugging, to step through, to step into, to watch memory, to renaming, to jumping to definition, to method or variable exploring, to function definition, to writing common patterns, to running unit tests, everything sits at one keyboard shortcut away. And the IDE is smart enough to correct my mistake as I am typing, so by the time I hit build, the software builds without errors.

I'm glad someone mentioned this. When I was learning programming for the first time, the Eclipse debugger helped me understand for/while loops, because I could see exactly how the computer behaves as it runs through my code line by line.


Umm. You do realize you can set a watchpoint in a command-line debugger?


I'm a very heavy gdb CLI user with my own collection of batch-debug scripts (bisect stuff, fuzzing triage stuff, 'trace every change to variable x' stuff...), and... I wouldn't inflict that on a beginner. The Eclipse debugging experience is OK: quite simple to start doing interesting stuff with, and you can even patch and continue, which will help a novice a lot.

The trick is to 'teach' debugging with the least friction possible. Evaluating the code in your head and comparing with what you observe is the underlying concept, the rest is tools, and they need to get out of the way.

As for the OP, I learned Java with Notepad+CLI and it was terrible. So demotivating. When I discovered Eclipse, I became so happy with Java and programming: 'create class', move stuff around, nice autocompletion, find class, refactor, automatic background builds, correction suggestions. And I'm a 'using the mouse breaks my flow' old fart. Yes, Eclipse is buggy and heavy, but damn does it make a difference. It's the standard I compare any IDE to.


What are the benefits of CLI tools (including debuggers) over GUI tools? From my experience, most people prefer GUI tools and are more productive with them.

I understand CLI is useful for scripting, but outside of that I'd rather not deal with it.


The main reason I prefer CLI tools is that they're more consistent. You typically don't install a specific version of IntelliJ IDEA, but it's very common to install a specific version of Java. You can hardly specify the exact configuration of a GUI tool (unless it stores its config in a file which you can blindly copy between machines), but that's always possible with CLI tools.

For example, in my C++ course there are some strict requirements on how the code should be formatted. They're specified like this: install this specific version of clang-format on this specific Ubuntu 20.04.3 and run it with these flags, with this config in this directory. Fortunately enough, installing and using clang-format on Windows and macOS yields exactly the same results in about 90-95% of cases (the probability space being the space of all programs that students write during the year, not the space of all configurations).

Of course, one can and is encouraged to attach clang-format to their favorite IDE like CLion.

It's impossible to make all students use the same OS comfortably: macOS wouldn't run on non-Macs, Windows costs extra money, Linux consistently does not work on about 10% of laptops unless a specialist spends five hours messing with Wi-Fi firmware/touchpad drivers on each laptop individually. Installing Desktop Linux in a VM is terribly slow as well.

However, requiring a command-line-only version of Linux is much better: it takes less disk space, it requires less RAM, it does not have a slow GUI (as it does not have any GUI), and you don't really need to use it all the time, only when you submit the solution.

Switching to web-based IDEs does not help: there are some power students who want to use their favorite vim/emacs configuration and compile locally, instead of trying to catch a 3G signal while commuting in the subway from the dorm to the university.


I think you are just pointing out that the issue here is the C++ ecosystem (not saying other languages have a better one), not CLI vs GUI.

Linting/formatting with the same configuration should provide the same results on all operating systems, unless there is an issue with the tool used.

Easy workaround would be to provide a Docker image with necessary tools to perform linting/formatting, which would ensure everybody can use it, and should produce the same result everywhere.


Sure. But I'm yet to see a single language whose GUI tools are more consistent than its CLI tools. Agda and Coq come close, because it's impossible to write either outside of their main IDE. But they're more proof assistants than general-purpose programming languages.

Docker is a good idea, actually. Should be easier to set up than a Linux VM. Still, it's a CLI tool.


As program complexity increases, GUIs lose their discoverability benefits. CLIs often become easier to remember, particularly if you have a vague idea about how to do something. Setting a breakpoint in gdb is b myfunc. Setting a conditional breakpoint is b myfunc if myvar == something. Setting a conditional read watchpoint for a specific thread? rwatch myvar thread 3 if myothervar != 6


> scripting

Exactly. So you have the flexibility and power to do pretty much anything you want.

Compile and test every single commit on this branch, easy.

Being fast, responsive and working on remote machines is also a big plus.


An IDE highlighting your mistakes is the definition of fast and responsive. Having to wait for multiple seconds of recompilation is the very opposite of that.


FYI, you can do this in VIM/Emacs and other CLI editors

You can also use CLI tools at the same time as using an IDE


> FYI, you can do this in VIM/Emacs and other CLI editors

At which point, you are using an IDE, instead of running javac!


I guess you could extend the definition of an IDE but then there's no GUI


A command line debugger is not as discoverable for a learner who doesn't know how to make it work.


Importing external libraries is absolutely the most frustrating part of learning a new programming language. I’m glad Eclipse handled that for me in my first few programming classes. New students shouldn’t need to get hung up on imports before they get to write a single bit of logic. If they’re serious then they will get to have all that fun later when they are a bit more advanced.


100% agreed that this is a huge problem.

My first programming intro (I did EEE) was in MATLAB, where this isn't an issue. When we moved on to C/C++, though, this was an enormous headache. Even after a couple of years in industry, it could be a massive pain in the arse in terms of configuring CI/CD and porting cross-platform.

To be honest, I think the best way to solve this is not so much to use IDEs that handwave over things, but a beginner-friendly, useful language where the problem simply doesn't exist - namely Python.


> but a beginner-friendly, useful language where the problem simply doesn't exist - namely Python.

Python? I would rather teach my child Lisp or C; whitespace being significant is very confusing.

For first-time programming I would make a simple language that uses syntax like if/end_if, for/end_for, function/end_function. It would make more sense than tons of {} or stupid tabs and spaces.


In my experience, Python's whitespace-based blocks usually "click" with students faster than {} or start-end blocks. Especially if they previously learned visual programming like AppInventor/Scratch, having subsections indented to the right will be more natural than end statements.

More generally, no program will be easily readable to a human without indentation. Having to scan ahead to find the endif, then track back to where I was, then keep a mental note of the "block" of code that I'm looking at is really annoying, while indented blocks make that separation visually obvious. So if we already have to use indentation for our own sanity, why do we make ourselves add clutter like end statements for the compiler's benefit, when in reality it has absolutely no problem understanding our indentation directly?
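To make the point concrete, here is a tiny made-up Python example where the indentation itself delimits the blocks, with no braces or end statements anywhere:

```python
# The indentation *is* the block structure: no braces or "end" keywords needed.
def evens(numbers):
    result = []
    for n in numbers:
        if n % 2 == 0:        # everything indented under the "if" belongs to it
            result.append(n)
    return result             # dedenting closes both the "if" and the "for"

print(evens([1, 2, 3, 4, 5, 6]))  # → [2, 4, 6]
```

The indentation a reader would add anyway for readability is exactly what the interpreter uses to find the block boundaries.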


I think it would make the code easier to read for a beginner. It could also make the error reports much clearer: instead of "Unexpected symbol }" you can print something like "This for is missing its matching end_for".

I don't like indentation; it is easy to mess up when you copy and paste code or make edits. It is like background-coloring the blocks of code: it might look more visual, but it causes issues when we actually manipulate the text.

It is a shame that at the beginning of learning to program you are hit with so many concepts, and on top of that with such complex programming languages. Why did we not invent a simple, more limited language that is perfect for teaching? I assume some will say: why should we waste time teaching a toy language when we can teach the real thing?


There can't be such a thing as a perfect teaching language because there are many aspects to learning programming and the way you'd make those things easy aren't always compatible. "Typed" or not? Automatic or manual memory management? Pointers? ...

But when it comes to teaching the core idea of encoding an algorithm with lines of text, the main thing seems to be that the less syntax there is to think about, the better. Parentheses, brackets, semicolons, variable declarations, etc. are all in the way. We have two very good languages for this - Python and Lua. I much prefer Python for many reasons, some of which I explained earlier, but Lua also works very well.

The copy-paste issue is real though, I'll give you that. Modern editors (e.g. PyCharm) completely solve it, but when using less sophisticated tools, it is a pain. Still, I can't imagine teaching programming without indentation being an important part of it, and Python conveniently forces you into it. Perhaps Lua with format-on-save would be better in this regard; I might try that some day...


I disagree, but let me know if you have experience teaching someone Python. Python replaces, say, both "{" and "}" with a single character: a space.

When I teach my son, I always tell him: when you open a ( or {, always add its matching closing one (if the editor does not do it), and only after you have closed it do you add your stuff in.

So for a for loop, I teach him to start with for(){} and then fill in the stuff. It makes sense to be more explicit. We did not have issues with types, though; it makes sense that int and float are different things and that you need to be clear about it.


I have taught a handful of programming intro classes in Python, JavaScript and Java (to mainly high school students). Java was by far the worst, since it has a lot of boilerplate that beginners just need to memorize, which is very discouraging. Starting off with types also made the lessons less focused, but I should've adjusted the pre-programming lessons to fix that. JavaScript was great for the first few programs where the automatic type casting let us have quick and satisfying results, but then the messiness of the stdlib made anything that wasn't self-contained (reading/writing files, drawing...) a pain. In both of these cases, students' code was utterly unreadable and issues were hard to find, despite in one case saying indentation was actually required (was that immoral?). I try to guide students to follow their drawn flowcharts along with the code to see where it goes wrong, but a lot of the time, their code was all over the place, which made that difficult.

With Python, that last issue was basically gone, since the layout of the code is the thing that instructs the interpreter. Getting the syntax correct wasn't an issue in either of the languages, but this time, correct syntax necessarily also meant nice layout. The students' code was much more similar and that made it easier to guide a large group through analyzing their code to find problems.

The way I transition students from flowcharts to code is through an intermediate "step list" notation that they're already used to from procedures they learn in chemistry and other lab classes, as well as generally in instruction manuals. That notation often uses nested (indented) sub-lists, which are almost directly copyable to Python. If I didn't have that step, perhaps the benefit of Python code layout wouldn't be as pronounced, but I think this way of teaching is generally beneficial, even when not targeting Python.

Some notes from the Python classes: the editor should be configured to turn tabs into spaces - no tab character should ever reach the file, even if copy-pasted (I'm 100% on team tabs, but they're horrible for beginners in Python). Skip "idiomatic" things that seemingly simplify code (open file contexts, "or default" expressions, ternaries...) and instead use longer versions - the less knowledge required the better, despite the resulting code being more "complex". Use f-strings and ignore string concatenation for as long as you can. The turtle library is your friend.
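As a small illustration of the f-string advice above (the names and values here are made up):

```python
name = "Ada"
score = 95.5

# String concatenation forces explicit str() casts and is easy to get wrong:
line1 = "Student " + name + " scored " + str(score) + "%"

# An f-string reads almost like the sentence it produces:
line2 = f"Student {name} scored {score}%"

print(line2)  # → Student Ada scored 95.5%
```

Both produce the same string, but only the second lets a beginner see the output's shape directly in the code.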


I've helped a lot of children learn Python (volunteering) and the whitespace hasn't actually been a huge issue. They seem to get the hang of it quickly. The typing, dealing with syntax errors which don't clearly explain the problem, and motivation are the main barriers, I would say. It is quite nice for them to learn a language that is used in anger professionally.

The first experience of most children here (in the UK) at school and elsewhere with programming is Scratch, a visual block-based programming language. So no typing or syntax errors, and very quickly you can make games etc. It's got its pros and cons.


> Python

I'm tempted to share this xkcd, called "Python Environment": https://xkcd.com/1987/

I'd say that Python is an absolutely stellar language when it comes to readability and getting things done thanks to its rich ecosystem, but at the same time is about as bad as Ruby when it comes to managing dependencies.

In my eyes, something like Java/.NET/Go could be better choices for when you need to work with dependencies as a beginner, whereas Python is probably the better choice when you don't.


Have you used it recently? On any popular version+os+arch combination, pip will "just work" for anything a beginner would ever want to do. Binary packages are pre-compiled, packages are installed in per-user dirs automatically to avoid permission issues and conflicts with system packages (where those exist).

The only issue comes up when you have multiple apps with conflicting dependencies, but at that point you're probably not a beginner anymore and someone will have taught you about pipenv. Just remember to type "pipenv install" instead of "pip install" and all your dependency problems go away. Most IDEs pick it up automatically too.


> On any popular version+os+arch combination, pip will "just work" for anything a beginner would ever want to do. Binary packages are pre-compiled, packages are installed in per-user dirs automatically to avoid permission issues and conflicts with system packages (where those exist).

That's often not the case in my experience. I've had several Python packages fail to compile on my machine (x64, so not even something less common like ARM). Others failed because of missing system libraries without any clear indication what library they're looking for.

It's not as bad as NodeJS's package management, but Python certainly has problems.

Then again, most of these problems aren't really problems for most use cases if you use virtualenvs. I've seen many (ex-)academics use Python without venvs and curse their system for not working properly. This results in pip being invoked as root, subsequently conflicting with the OS package manager, and then some cursing about how Python has broken their system. venvs really need to be in the spotlight more often, because I think a lot of people have terrible experiences with Python purely because they don't know they exist (or how to use them).


This basically sums up my experience, though perhaps more so in projects with Flask or any other loosely coupled codebase with many dependencies, rather than the likes of Django which are more tightly integrated and perhaps better tested.

In particular, I remember problems when trying to compile DB drivers for a project; what worked in Windows didn't work in Linux containers and vice versa. I remember having to mess around with the system configuration manually and track down .whl packages, rather than just being able to do a "pip install" inside of a venv. That was in Python 3, though I don't remember which version it was or the particulars of the situation, so sadly I can't comment more, merely note that I had problems.

That's actually been the case with most languages and toolchains so far, so maybe my computer is just cursed, or maybe it's just the fact that I use Windows for development some of the time, while other times I boot into Linux, whereas containers might run something different like Alpine Linux, which has plenty of its own idiosyncrasies.


This has been my experience.

To import matplotlib, for example, you type "pip install matplotlib" into the terminal to install it.

Then, you add "import matplotlib" to the top of your .py file.

That's it.


For introductory programming courses there shouldn't be any dependencies.

Also, NuGet is no better than pip.


For solving the sieve of Eratosthenes you need 0 dependencies, and Eclipse gets in the way all the time (I have seen it, and it's painful to watch).
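For reference, a zero-dependency sieve really is just a few lines in any language; a sketch in Python, using nothing but the standard library:

```python
def sieve(limit):
    """Return all primes below limit, with zero external dependencies."""
    is_prime = [True] * limit
    is_prime[0:2] = [False, False]          # 0 and 1 are not prime
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit, p):
                is_prime[multiple] = False  # cross out multiples of p
    return [n for n, prime in enumerate(is_prime) if prime]

print(sieve(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Nothing here needs a project wizard, a build file, or an import beyond the language itself.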


Ironically, importing external jars is particularly easy in Java... and frankly rarely necessary in an early course. Java gives you a lot!


It should be frustrating because you should avoid doing it.


I think people need to appreciate that universities are not updating their courses and tools that often. Eclipse was a pretty reasonable choice a few years ago. I used it for well over a decade before grudgingly switching to Intellij. If you are doing Java, it's still pretty nice. It always was pretty good at that.

For a university software engineering course, it's more than adequate. It's representative of what people in the industry would use, or would have used until a few years ago. It's open source and easily customizable. It has most of the bells and whistles you could want in such a tool. And it runs on modest hardware and more than meets the requirements for implementing some simple toy project for school; which is a big deal when you deal with students working on old/slow laptops.

Are there other alternatives? Of course, many. But it helps to have students all use the same thing when supporting them, and it helps when that tool is not Notepad or something similarly backwards. I actually taught Java to students back in 1996-1998; trust me, weird/odd/misguided choices about what to use to edit files were something I had to deal with. Actually, one of the key things to learn in university is learning new things. So a slightly complicated but genuine IDE is exactly the right kind of thing to learn to wrap your head around in university.


And I think people need to realize that, 25 years ago, the Fortune 500 in the US listened to the trade press, decided that off-shoring development was the magic bullet, and now there's an entire industry in India devoted to cranking out absolute baseline-level programmers who have learned nothing but programming control structures in Java using Eclipse. Some of us work in companies that are STILL following that model, and falling further and further behind in our markets because of it.


IMO Eclipse is still a reasonable choice for a university (or a developer). It's open source, actually free, and comes with all the bells and whistles one needs.

Unfortunately, it's gotten pretty buggy over the years, and its tooling and contribution process seem very unwelcoming towards new contributors.


That is a misguided characterization, as it makes it seem that IntelliJ is not open source, not "actually" free, and does not come with all the bells and whistles one needs: https://github.com/JetBrains/intellij-community/blob/idea/21...

I am trying really hard to wean myself off weighing in on this discussion every time it comes up, because it's getting to be the emacs-vs-vi of IDEs, but we should at least stick to factual arguments.


The community edition is not the full IDE(A). The full IDE is not open source. They're welcome to their business model, and I'm sure the ultimate edition comes with all the bells and whistles, but the community edition does not.

Feature comparison: https://www.jetbrains.com/idea/features/editions_comparison_...

Please do stick to factual arguments.


Feels like one of those "let's introduce them to programming with C++ in a text editor and debug by printf because that's how I learned in the 80s" takes. I had a professor like that, and it turned me off from programming for a few years before I came back later through higher-level languages.


Controversial opinion ahead. Take a deep breath.

If a student can't handle C/C++ in nano/vim/emacs and debugging with gdb, maybe a computer science degree just isn't right for them.


Yeah, I don't agree with that at all. That's like saying mechanical engineers should all start by learning hammers and nails. Computer Science isn't about the tools! C/C++ and nano/emacs are only tools, and mostly outdated tools used less and less by most companies. Most professional software engineers starting today are not using these tools; they are using Java, C#, JavaScript, and Swift, with IDEs.

Also, learning Computer Science has very little to do with programming languages or tools. Computer Science is a science (it's in the name). You can learn about logic, grammars, discrete math, computational complexity etc. without knowing anything about Java or C#.


Do you think the only purpose of a CS degree is to teach software development? Many people with the degree don't end up writing software. The point of a CS degree isn't to teach you the tools and languages being used at companies in 2021; go to a coding bootcamp if you want that. It's to teach computer science. After you learn that, picking up languages and tools is easy.

The point with learning with simple tools is that you learn from first principles with as much as possible stripped down. Most of the time C has a straightforward correlation with the instructions emitted by the compiler, and the language rarely does something behind your back. Try explaining memory management (in detail) to a first year student using JavaScript or Python.


> Do you think the only purpose of a CS degree is to teach software development

No, it's not. But its purpose is almost certainly not teaching people how to use CLIs.

None of those CLIs have a lick to do with the 'not writing software' parts of CS. They are an implementation detail, that either helps you solve a particular problem, or get in the way. Most of the time in the undergraduate curriculum, they get in the way.


I agree that the purpose of a CS degree is not to teach any particular tool.

It is to teach a person how to learn and how to approach problems. If someone does not get that, then yes, a bootcamp will be better for them.


> The point of a CS degree isn't to teach you the tools and languages being used at companies in 2021; go to a coding bootcamp if you want that.

Thought experiment: so where's the degree for people who want practical and applicable skills and an in-depth knowledge of actually building systems with the modern tools that are popular within the industry, as opposed to just being familiar with the theoretical aspects? Surely you'll agree that a bootcamp is in no way equivalent to a degree of any sort (due to time constraints in most cases; the very concept means that you won't get 4-5 years' worth of material), nor do the current degrees actually focus on the practical aspects that much.

After getting a Bachelor's and then a Master's in CS, I still feel that the degrees left me unprepared for software development in the industry, since a lot of the things that I learnt while working at a company in parallel weren't covered at all! And surely we can't expect people to gradually pick up all of the tools and practices as they go along, because not everyone is going to end up in companies that follow the best practices.

Some examples of things that I feel weren't covered enough:

  - organization: issue management systems, versioning policies, team organization (silos vs open communications) and other management aspects
  - organization: release management, ways of shipping software (public repositories of various sorts, using SFTP servers, container registries etc.)
  - practical system design: when to use SOAP, REST, gRPC, GraphQL or others
  - practical system design: the tradeoffs of event based systems, distributed systems, what happens when you have message queues, other ways of dealing with backpressure
  - practical system design: more about horizontal scalability and how to achieve it, versus vertical scalability as well as eventual consistency/single source of truth
  - practical system design: how to do DDD properly and work with bounded contexts, to split systems into logical chunks
  - security: talking more about permission management, ACLs, as well as encryption algorithms and systems for managing credentials (e.g. Vault) and identities/permissions (e.g. OAuth2)
  - security: talking more about VPNs, tunnels, encryption, service meshes, observability/auditing, intrusion detection etc.
  - development: how to efficiently use all that IDEs offer you and how to work with large legacy codebases while limiting the impact of everything breaking, like in the book "Working Effectively with Legacy Code"
  - development: proper testing, not just saying that tests are important and giving vague examples of unit testing, but a course that would focus on nothing but the many ways of testing, everything from unit tests, integration tests, browser automation, performance testing and getting to 100% test coverage whilst also ensuring that no CI steps pass when it falls below that
  - development: speaking of which, CI should also be covered more, everything from Jenkins, to GitLab CI, GitHub Actions, the benefits of all of these, as well as the tradeoffs that are there, how to setup your project's build steps and tests in a fully automated way
  - ops: how to manage servers, use something like Chef, Puppet or Ansible, install packages, properly configure permissions and a bit more about all of the POSIX cruft that has accumulated over the years and that you'll have to deal with as well as everything from creating systemd services properly, to navigating the filesystem
  - ops: using log shipping systems (like Graylog), monitoring systems for the servers (like Zabbix), performance monitoring systems for the applications (like Skywalking APM), analytics systems (like Matomo)
  - and so on...
For example, in university, when talking about designing systems we might cover the 4EM model and various aspects of planning, but where are the practical examples of systems that have succeeded because of these practices, of how in particular the models translate to good implementations in practice, or, quite the contrary, of the most typical mistakes that one can make (e.g. splitting up the system wrong and ending up with really chatty services)? Case studies and post-mortems would be exceedingly useful in that regard.

Yet, the current outcome is that people learn a lot about theory and two people with similar knowledge in that regard might have wildly different outcomes overall, because one of them would get to work in a company that follows best practices and lets them learn a lot of new things, whereas the other ends up in a body shop where their knowledge is silo'd and they can't learn much, instead just writing code to meet the business goals.

Perhaps universities fail at doing this because they're not even trying to, since apparently the industry moves too fast for them to maintain a relevant curriculum. But if that's the case and we as a society value formalized education, what other approach would let me learn about all of the above and more, even the concepts and practices that I'm not aware of? It feels like we're still stuck in the medieval approach of "guilds", where people are handed over to those more knowledgeable (senior devs & teams) and learn by doing, which has certain tradeoffs and perhaps isn't the best approach overall.


> Thought experiment: so where's the degree for people who want practical and applicable skills and an in depth knowledge of actually building systems with the modern tools that are popular within the industry, as opposed to just being familiar with the theoretical aspects?

Many European countries have a two-tiered system of higher education. There are research universities that teach both theoretical subjects (such as CS and history) and research-based applied subjects (such as medicine and law). Then there are vocational universities where the teaching focuses on more immediately applicable skills.

The drawback of a system like that is that people generally consider degrees from research universities more prestigious. As a result, research universities can be more selective with their student intake and their graduates have better employment prospects.

Also, when we are talking about degree-based education, the focus should always be on what is needed 10-20 years from now. The delay from making decisions to getting a nontrivial number of graduates to positions where they are ready to contribute is easily a decade. When there are more immediate needs, the industry has the responsibility to provide the necessary education.


> Thought experiment: so where's the degree for people who want practical and applicable skills and an in depth knowledge of actually building systems with the modern tools that are popular within the industry, as opposed to just being familiar with the theoretical aspects?

Sounds like a software engineering degree to me.


Hmm, in that case, I'd have to say that we have nothing of the sort here in Latvia. At the Riga Technical University, where I got my Bachelor's and Master's, I completed the following degrees:

  - Professional Master Degree in Computer Systems and Qualification of Programming Engineer
  - Bachelor Degree Of Engineering Science in Computer Control and Computer Science
While the names might sound similar to what you're suggesting, the courses were still somewhat lacking in regards to the practical aspects that I mentioned above. Some of the lecturers did diverge from the officially approved contents and did teach us more, notably about CI/CD and DevOps concepts, as well as geospatial and temporal databases, and we did get some group projects where we implemented entire systems, taking on the roles of architects, developers, designers etc. and later talking about where we failed or what could have been improved, yet in my eyes that still isn't enough.

The curriculum was too slow to change otherwise, at least in my opinion. I guess their way out of that situation is the "Professional" degree, which requires you to do case studies and spend months working in an actual company, which in my case was just continuing to work at the company I was already at during my Bachelor's. That sort of reminds me of the guild/apprenticeship systems I mentioned previously.

I might be wrong, but the balance between theory and practice ends up at an impasse often. Maybe degrees should just take longer to complete and cover things more thoroughly? Or do we as a society expect them not to be enough to make someone ready for working in the industry, and accept that as something normal?


I kinda feel insulted.

I started years ago with C++ in HS with Code::Blocks and felt like it was a painful environment to operate in; getting external libraries meant copying files manually from the internet. I don't know whether I eventually figured it out,

because then somebody showed me C#, and I quickly realized how it made my life 10 times easier, just by allowing me to focus on actual problem solving instead of fighting with an obscure language at some ancient version, with probably some random compiler and some random IDE, because in the C++ world there are many versions of everything, each with its own quirks. Bonus points when I have to compile LLVM and it takes 30 minutes and eats all available RAM, whereas compiling .NET's Roslyn is an order of magnitude "cheaper".

Thus, dealing with C++'s "weaknesses" is not a good proxy for your performance during your engineering or master's degree in comp sci.

Let alone that computer science ain't a school of programming or of using X debugger; it's mostly math and theoretical informatics.


1. "can't handle" !== "can't handle when presented with for the first time with no prior programming experience". It's like saying someone shouldn't pursue a math degree because you gave them a differential equation to solve first day of high school and they didn't know what to do.

2. You're mixing "low-level tools" and "ancient user interfaces". An argument could be made that a student should understand low-level tools, just like a mechanical engineer should understand physics. But there is absolutely no argument to be made for using old-school UIs, the same way there's no reason for an engineer to use compasses and rulers instead of CAD. Actually, even that makes more sense, because there might be situations where they won't have anything else, like in the field. There is no situation where the average programmer would only have access to CLIs.

3. CompSci !== SWE. A CS student developing novel learning-based image processing methods has no need for knowledge of obscure gdb commands and C++'s pointer and dependency voodoo. A SWE student could benefit from that knowledge, but even then, will rarely need more than a surface understanding of it when not actually developing using those tools.


> There is no situation where the average programmer would only have access to CLIs.

It was fairly typical for me to have SSH-only access to the production/staging server and no other tools than Vim/git to alter/store its configuration.


If you have SSH, you probably have SFTP, so just mount it locally and work with your local tools. If you somehow don't have SFTP, FISH works on a bare SSH connection. Many modern tools also have native support for remote filesystems (vscode, anything from JetBrains...).

Not to mention you probably shouldn't be doing that directly in the first place, especially as "just" a programmer. Sysadmins need to deal with CLI stuff, but programmers generally have no reason to. Of course many of us have both of these roles, but that's beside the point. As programmers, we should not even need to see the production server and as sysadmins, we shouldn't need to know things like gdb and proper development in CLI editors.

We have the tools, we just don't use them. How many issues from bad edits in remote scripts could've been prevented if we used an IDE with linting via sftp instead of barebones vim?


> If you have SSH, you probably have SFTP

Absolutely not always. The SSH may be provided through multiple proxies by a mess of convoluted Bash scripts. Because reasons or compliance or whatnot.

> Not to mention you probably shouldn't be doing that directly in the first place, especially as "just" a programmer.

Yeah, if you have a dedicated sysadmin to start with, and not just a team of five-ten-fifty enthusiasts who are fully comfortable with managing servers themselves.

> How many issues from bad edits in remote scripts could've been prevented if we used an IDE with linting via sftp instead of barebones vim?

Yeah, that would be wonderful.


I like Vim. I like CLI. I like JetBrains IDEs. I use whichever tool I deem appropriate for the situation.

Mastering vim/git/cli has been the most productive thing I've learnt as a software engineer. It's something I would recommend. Not to freshman student but at some point


> A CS student developing novel learning-based image processing methods has no need for knowledge of obscure gdb commands and C++'s pointer and dependency voodoo.

I guess you could argue that AlexNet was the product of one very motivated CS student who was very proficient with C++ and CUDA. This particular skill set had a very deep impact on the computer vision community. Today, however, you don't need it because all that has been abstracted away. But someone had to lead the way.


Definitely! I'd say we need more people like that (although for obvious security reasons, I'd personally prefer the use of a safer lang than C/C++). But we also need different people who don't care about performance or integration, but have unique ideas that are worth exploring. Let them experiment and quickly iterate using the easiest, most high-level tools we have, then let the GPU magicians rewrite and optimize their work for production.

The best progress happens when you have both of these groups, not just one. As well as some application development plebs like myself (jk) to allow end users to benefit from all of that progress.


Even if this comment is technically true, those are really not the things you should start with in programming classes, imho.

Having some prior experience with similar but simpler things will enable the students to pick up those more advanced things later more quickly, with less effort, and especially with much less pain.

Also, things become simpler to teach as more background is already known to the students. Needing to explain "everything" at once is not a simple task. It usually causes a lot of confusion, as people don't yet have the proper mental models to sort things into, and it makes things harder to actually teach.

My point is: lessons in themselves shouldn't be "the worthiness test".

The goal is to teach people programming, not torture them. You can test and verify the understanding of the things taught in the next step.

Also, it's in general much simpler to teach things along a structure where you go from the big picture gradually down to the details instead of the other way around — otherwise people may not see the forest for the trees for a long time. Needing too many details just to get basic things going will cause confusion and a lot of unnecessary frustration, in my opinion.


Why draw the line there? Why not just throw everyone straight into assembler? Or, better yet, manually flipping bits in memory.


Woah, woah, let's not get ahead of ourselves! Nobody's touching a keyboard until they can prove to the professor they can write a bug-free compiler with pen and ink.


You are perhaps right, for some narrow definition of "computer science degree".

Perhaps, though, the root of these endless discussions is that the name of the degree means different things to different people.

Some see that degree as a way to learn programming, some see it as a route to a doctorate, ultimately to a career in theoretical science, some see it as understanding how hardware works, and so on.

Ultimately, since schools and students have such a wide interpretation of the term, making blanket statements in any form about computer science education is always both right and wrong depending on context.


If we're really talking about a bachelor's degree in a computer-* subject, then I'd expect the graduate to have the basics nailed - not just how to write Javascript using an IDE. I'd expect a graduate to have passing familiarity with a couple of different languages; to have used a CLI and a VCS; to know what an OS does; and to have a reasonable knowledge of processor architecture. Learning how to use some specific IDE is just dead weight - students aren't expected to write complex systems where IDEs are really useful; there isn't enough time on the course.

It's perfectly easy to learn to write simple programs using a text-editor and a command-line. If that's too challenging, please stay away from production code!

There's an approach to teaching that's been referred to as "circular learning" - the first time around, you cover most of the basics, then you go around again, deepening your understanding - lather, rinse, repeat. "Spiral learning" is probably a better term. It does require the curriculum to be somewhat integrated, though - not just a bunch of independent, standalone modules.

Like, you can't learn to program without some understanding of computer architecture. It's very helpful when learning computer architecture to have some programming skills. But you get taught these things in separate, supposedly-standalone modules that don't reinforce one-another.


"Computer science degree" is not an ambiguous term, in fact (in the US) the goals and curriculum of the degree are explicitly defined by accrediting institutions. At the very least, it's not intended just to teach web development, and that's not up for debate.


It's not just the degree itself, it's what the expectations from that degree are. Why students are pursuing a CS degree informs what the goals should be and, in turn, what they should and shouldn't learn. And that varies greatly between students as well as institutions.

OP wasn't saying "you can't pass the designated curriculum of a CS degree without this arcane knowledge", they were saying you shouldn't pursue it in the first place. There are many reasons to pursue a CS degree, therefore there are many different situations you'd need to evaluate their statement on. It's quite ambiguous.


Early courses should enable a student, who has seen enough to want to pursue something further, to do that. Once you know what you want to do, you can slog through the muck of non-IDE programming because you’re motivated. Not everybody comes in with that much motivation.

I did the slog part before I went to university, but there were people who, at the age of 21, decided to try CS, and did the intro courses. Thank goodness for Eclipse, that an English major can feasibly learn enough in a semester not to back out.

Not everyone wants to Go Build Real World Programs, either. Please let’s not raise the bar to exclude anyone who isn’t already and won’t soon be hooked up to a terminal speaking in tongues and reciting the fast inverse square root incantations of their forebears.


Well, to be fair, she also calls for using Python for intro programming classes, not Java.


What we need is a good separation between computer science and computer engineering - as there is a separation between physics as a science and mechanical or civil engineering. What this article laments is students coming out of CS with little practical know-how. But practicality/application is not a requirement of a pure science. Let the scientists do science, and the engineers do engineering. And stop making a CS degree a requirement for a software engineering job.


There is a 3-way split:

Computer science is about conceptual development and research. It's learning what Turing-complete means etc.

Computer engineering is about systems and processes and practical application of the concepts of CS. It's about things like why TLA+ is a good approach to developing safe and effective systems, it's about the SDLC generically and the potential different approaches and their cost/benefits.

Computer programming is a technical skill. It's learning the specifics of a particular environment and language.

Any CS graduate should understand programming, because it's a basic required skill; same for a CE graduate. But specific skills in Java vs C++ etc. are virtually irrelevant to a grad getting their 1st job. It's also something that any student should have already picked up as a side effect of a competent CS/CE course.

But does a CS grad need to know the difference between agile and Agile and iterative development and CI/CD and...

I'd say no, because it's not particularly relevant to type theory or provability etc. It's also not particularly relevant to ongoing CS research.

Our industry is well short of good CE graduates, because the focus is on the stupid details of languages instead of the conceptual grounding needed to actually engineer.

Same with the dumb interviewing about reversing a list on a whiteboard.

We've adopted the language of engineering, without the actual approach.


> Our industry is well short of good CE graduates,

Are there any? My son had a summer job in a small recruiting firm in Silicon Valley. Customers (you know them all) had just two requirements:

- a proper degree from top 10 universities;

- 5+ years of experience in the field;

All others need not apply (and there were so many of them - recent Berkeley/Stanford graduates…)


To me Computer Engineering should be about engineering computers. What runs on them is entirely secondary. Not really even involved, the people using the computer have to figure out how to develop that...


Actually, it's a 4-way split! With the 4th being Software Engineering, which addresses things such as methodologies. This is what most undergrads are really seeking.


Even if you're approaching computer science from a theoretical standpoint, you're going to be using the computer. If you're able to do that efficiently, you can try out ideas and concepts better and have a better learning experience.

The question is whether this should be taught as a part of a degree or if the onus is on the institution. I don't know. But these can be hard and intimidating topics to learn on your own.


You don't teach a kid to swim by starting in the deep end. There's a reason training wheels exist for learning to ride a bike. Same applies here. You need to understand what code even is before diving into the specifics of the compilation step.

That said, I agree with the title but for different reasons. Eclipse is terrible compared to other available options (JetBrains is so vastly superior).

I do agree that Java is a terrible first language to teach, it has so much arcane boilerplate to get through before doing anything useful. Especially the whole "public static void main(String[] args)" incantation you're taught up-front. So many meaningless words for a beginner.

And yes I do agree "how to use your computer" type stuff should be covered at least after the first assignment, things like how to use a terminal and the filesystem as mentioned. And IMO git should never be taught CLI-first, it's way too hard to wrap your head around without the visual of the commit graph. GitKraken does it best, and is free for students.


> I do agree that Java is a terrible first language to teach, it has so much arcane boilerplate to get through before doing anything useful. Especially the whole "public static void main(String[] args)" incantation you're taught up-front. So many meaningless words for a beginner.

The boilerplate is annoying, but it's an obstacle that can be overcome. My lecturer/lab tutor for introductory Java just gave us a pre-prepared:

    public class Main {
    
    }
And told us to write all the code between the two curly braces and not to worry about what's outside of them. The main method is just another piece of boilerplate to ignore.


Everyone I knew growing up learned to swim by getting tossed in the deep end. The kids whose parents babied them didn't go to the pool or have many friends. I'm sure those kids are all managers at fast food places or live off the dole now. Of course I'm a dad of 4, full time, so my viewpoint on kids is different from "parents" who feed their kids baby formula and hire babysitters and nannies instead of taking care of their kids themselves, and so therefore have no idea how resilient humans are and how we can only achieve what we're led to believe is possible. If you're restricting your kids so much as to protect them from life, they'll only get hurt later, and usually those people have no idea how to deal, so they blame everyone around them for things they're responsible for, and we currently have so many of these kids, now adults, that they're passing laws to protect themselves later in life because they never had to face real adversity ever in their lives. Anyways. I disagree. A bunch.


I went off on bad parenting because I think it goes hand in hand with bad teaching. But honestly after reading it I just think it's funny. I'm obviously dealing with other people's kids :D too much man, too much. Some people's children, ya feel me?


I think there is really important reason "to force students to figure this stuff out on their own".

Figuring stuff ON YOUR OWN is what higher education is all about. It is not about teaching one tool or the other and if someone is learning tools instead of "how to learn to use those tools" he is doing it wrong.

Dev work is, in my experience, mostly about "figuring stuff out on your own", and I expect that this should still be the main goal of higher education.


> Figuring stuff ON YOUR OWN is what higher education is all about.

I share this view. Not everyone agrees.

Maybe it depends on what you mean by "higher education". I mean a degree. If you have a degree, that should signify that you know how to study on your own, and have a general overview of your degree subject. You should be able to make some sense of a research paper in your subject, even if you can't follow it in detail.

A degree isn't a vocational training course. If you hold a good bachelor's degree, then you should be qualified to undertake a master's degree, which in itself isn't much use, IMO; the purpose of a master's degree is entry-level training in research. You get to find out whether you want to/have the ability to undertake a doctorate.

If what you want is to learn how to do a certain job, then surely a vocational course is more appropriate (the hint is in the word "vocational").


We use Java in the database programming course with JDBC. We use Eclipse and Maven, and I give a 5-minute introduction and a boilerplate; the students used a web-based Java IDE in the previous year.

I will not subject my students to managing compilation, classpaths, and dependencies just to get on with JDBC (and other abstraction layers).

This sounds strange, but learning how to compile or run a program is, in my opinion, not necessary in an academic context if that's not one of the goals of the particular course.


I taught Java to tens of thousands of middle/high school kids using Eclipse. It is certainly not a friendly interface to beginners, but is vastly preferable to teaching the command line. This post comes across as highly dogmatic and ignores the realities of teaching in a classroom.


It's nice to see a reply from an actual educator. Maybe you could elaborate on your experience. How was the language selected? Did you ever try a language specifically designed for pedagogical purposes?

I can't find anything on the author's page about being an educator, so I assume they're talking about what they themselves would have preferred.


I started a Bay Area summer camp in 2013 and expanded it over five years into multiple locations and a year-round after-school program. Java is widely known for its role in the AP Computer Science course, so there is a pretty strong demand from parents. I also developed introductory courses for Python, and built a custom browser-based teaching platform with Skulpt (skulpt.org) - the idea was to give kids a giant green run button so they would keep testing and iterating on their programs, and to parse the native Python error messages into helpful suggestions.

My contention with the article is that instead of bringing up any real issues that educators have with Eclipse (my list goes on and on), it posits that students would be better served by using the command line and `javac`. It strikes me as completely ignorant of the realities that come with a diverse group of students with varying degrees of prior knowledge. Unfortunately, this opinionated pedagogical thinking makes its way into curricula all the time.

My opinion, supported by years of physical presence at the head of many classes, is that students are most successful when they have fun. Dealing with the terminal, or really any repetitive process, is not fun. This is why my entire Java curriculum explained concepts through animation and game design (https://www.youtube.com/watch?v=KNphPVif26I), while trying to lift as much of the burden of syntax and the runtime environment as possible.

Eclipse is open-source, and I'd love to see an article that motivates an Eclipse developer to create a kid-friendly version of the software for classrooms.


Were you teaching programming or Java?


Using Java at all is not a good first introduction to programming for computer science students.


Fully agree with this. The fact that the "hello world" program involves several keywords (public, static, void, main, String[] args, System.out.println) that students will have no idea about until 1-2 months later is not ideal. Compare that to Python, where "print('hello world')" is very intuitive.


I don't see much difference between `System.out.println` and `print`. Both should be spelled exactly as they are. You cannot say `write`, `say` or `Print` in Python, despite those being somewhat synonymous with `print` in English.

They are parts of syntax, and any language will have some syntax to learn, and there are always syntactic quirks.

Granted, Python has lighter syntax than Java, but I don't think Java is too verbose for beginner's purposes.


what's System

what's out

why is there a period, what does that mean

what's ln

what's public

what's static

what's void

what's main

what's String[]

what's args

Java is as verbose as it gets, with lots of unnecessary information at a beginner level
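For what it's worth, every one of those words does have a short answer; the trouble is just that a beginner can't absorb them all on day one. A commented version of the canonical boilerplate (nothing here is from any particular course, just the standard first program):

```java
// The canonical first Java program, with each "meaningless word" annotated.
public class HelloWorld {        // public: visible from anywhere; class: Java's unit of code
    // static: callable without creating an object; void: returns nothing;
    // main: the method name the JVM looks for at startup;
    // String[] args: the command-line arguments, as an array of strings
    public static void main(String[] args) {
        // System: a built-in class; out: its standard-output stream;
        // println: print the argument followed by a newline ("ln" = line)
        System.out.println("Hello, world");
    }
}
```

Whether any of that helps on day one is exactly the debate here: Python defers all of it, Java front-loads it.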


Why parentheses? Why is the string quoted? Why can't I write `print("a") print("b")` on the same line? Why `print`, if it shows on a screen, not a printer? What is `.py`? What is a file? Why does `write` on a file do something different than `print`?

I agree that Java shows you more syntax, which is intimidating. But the answer to all these questions, for all languages, is: "it's special syntax, you should write it exactly as it is. We will learn more details later in the course." There is literally no need to dive in. You wouldn't dive into variadic functions, builtins and CPython implementation details when asked about the name `print` or how it can take multiple arguments, would you?

You get some kind of syntax questions no matter what programming language you teach, even Python. Granted, there may be less with Python in the beginning, but in lots of scenarios the difference is negligible.


I think AP's point here wasn't about that command, it's all of the class structure required to get to the coding. You either have to introduce that as "don't worry about that, it'll come later" or introduce OOP basics on the way to simply printing hello world.


Yes, and I would say "don't worry about that, it'll come later, it's boilerplate". Same as I would do with `import` in Python, actually.


Right!? I was like, you must be going to a "school" online where they just hire people who can use bigger words than them :D ooo00hh he can write in Java! He said type inheritance, I need to shake my head like I understand. Look at the color of his backpack, this guy's hired already, green backpack people are amazing. Java isn't even the deep end, it's just verbose, they should use it as a compare and contrast with Javascript, just to mess with people. Wait, so is this the script version of Java? Why doesn't it look like Java? Then show them python, and if nobody runs out screaming, you keep teaching.


You'll note that I actually do say this in the article.


This is interesting, since I normally fall on the “let’s keep this approachable” side of things - but in this case I disagree.

Maybe I’m just old, but I found using the tools directly very comforting. Learning some static html in a computer camp in notepad.exe was approachable and empowering. I could do it anywhere. I don’t think connecting code to files or compiling it yourself (in simple cases) adds too much complexity while learning.

The first time I opened Visual Studio I was terrified. I was suddenly not in control of my project. What if I bork it? What if there is an incompatible project change? I think these tools are fantastic and should be used; I just also think learning your toolchain is useful. Disclaimer - I learned these at my own pace, outside of the pressures of a classroom.


I believe it's a question of how good the tools are. If you're "using the tools directly", you probably don't have many knobs to fiddle with, and everything either works by default or doesn't work for some well-known and searchable reason, e.g. "your Java source file should have the same name as the only public class".

However, if you try some higher-level tools, they may:

* Try to guess your preferred options and guess incorrectly.

* Hide some low-level details which actually matter in your specific case.

* Provide error messages that are not easy to look up, because the tool itself is not as popular as its lower-level counterpart.

For example, I found Eclipse (GUI) and Gradle (CLI) very frustrating to use, but IntelliJ (GUI) and Maven (CLI) worked wonderfully for me. The only downside of Maven compared to pure javac, for me, was that it needed some boilerplate config. Other than that, it felt magical.


I just would like to add that I think this was an outstandingly written post.

For example, look at the first sentence: it promptly sets the stage ('first introduction programming for computer science students') and confers the main point (about using a professional IDE), all while remaining straightforward to understand. The reader knows exactly what is coming! I thought the remainder of the article was equally sharp, well-argued, and stylistically quite pleasant.

Sadly, the main points of the article seem to be sorely misrepresented in this comment thread. I do not know why, but I really would not want this to discourage the author from writing more.


Thanks! I appreciate the compliment and I'm glad you took away what I was intending!


Maybe "professional" deserved emphasis, since that part seems lost on pretty much everybody.


Starting with the CLI only makes sense for students who are already using CLIs, the connection to familiar ground will help them. For other students it's just another burden.

Not that Eclipse is the ideal solution. These days I would probably start off teaching beginners in a web IDE like Repl.it to lower the friction even more. Of course there is a ton of things about development environments that need to be taught before they are fully-fledged programmers, but there is no need to frontload that and overwhelm them at the beginning.


Nora.codes makes a lot of sense in this essay.

CS instruction is a mixed bag. I started out in CS in its Middle Ages, so my own initial experiences aren't relevant. As a graduate student for many years, though, I have watched and taught new CS students, which has given me a perspective on learning CS. More importantly, I've had an opportunity to observe my own daughter, who started programming in college, got a degree in CS and will soon be finishing grad school.

Observing how my daughter and her friends at other colleges were taught, I saw many unnecessary difficulties (Eclipse being one of them) and on the other hand some outstanding approaches to teaching them CS.

Her first language was Racket (interesting) and her first book was How to Design Programs, 2nd Ed. (an excellent choice, see [1]). This course taught her basic principles and heuristics for designing programs. She was able to go back to these ideas when she was uncertain how to proceed and, through the steps elucidated in the text, was able to work through her assignments; by the end of the semester she wrote a simple video game, going above and beyond the assignments' requirements. Essential to making this course accessible to someone who had never programmed before college was the controlled environment provided by Racket. I believe that she did all of her programming within the IDE provided by installing Racket. For an absolute beginner, a controlled environment hiding the complexities of the file system, etc. worked very well. She wasn't sure if she was going to like CS, but this experience helped her decide to keep going. Thumbs up for Racket and How to Design Programs.

A couple of semesters later, she had to take the Data Structures course. Inexplicably to me, the course, which covered subjects like singly and doubly linked lists, hash tables, and binary trees, used Java and Eclipse. I feel like this is just teaching bad habits. If you are using Java, why aren't you learning to use Java's own collection libraries? Why use a garbage-collected language, without explicit randomly addressable memory visible to the new programmer, to teach how one builds doubly linked lists? The book was terrible. Thumbs down on Java and Eclipse for teaching Data Structures.

So, I agree with the essay that learning the details of Java in an intro course is inappropriate. The Racket course provided a series of increasingly more comprehensive subsets of the language as the semester advanced, perfect for insulating the students from incidental complexity.

And I also agree with Nora.codes that learning the details matters. Java was too high level and too full of other stuff (like type erasure) that got in the way, and it really obscured the ideas behind pointers and implementing one's own binary trees.

[1] https://htdp.org/2019-02-24/index.html
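To make that complaint concrete, here is roughly the kind of structure such a course asks students to hand-roll (a sketch, not taken from her actual coursework). In Java, the `prev`/`next` fields are object references doing the job of pointers, while allocation and deallocation stay invisible behind the garbage collector:

```java
// A minimal hand-rolled doubly linked list, as a Data Structures
// course might assign it. References stand in for pointers; there is
// no malloc/free for the student to reason about.
class DoublyLinkedList {
    static class Node {
        int value;
        Node prev, next;   // these references are the "pointers"
        Node(int value) { this.value = value; }
    }

    Node head, tail;

    void append(int value) {
        Node n = new Node(value);
        if (tail == null) {      // empty list: n is both ends
            head = tail = n;
        } else {
            tail.next = n;       // link the old tail forward to n
            n.prev = tail;       // link n back to the old tail
            tail = n;
        }
    }

    int[] toArray() {
        int count = 0;
        for (Node n = head; n != null; n = n.next) count++;
        int[] out = new int[count];
        int i = 0;
        for (Node n = head; n != null; n = n.next) out[i++] = n.value;
        return out;
    }
}
```

In C, the same exercise forces the student to confront allocation, freeing, and dangling pointers, which is arguably the whole point of the assignment.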


Anecdotally I find these Java IDEs - IntelliJ, Eclipse, etc. all painfully slow. Is there some magic configuration that's required to make them run decently?

On Android Studio, for example, there are times when I tap a menu item and it takes a good 5-6 seconds to open. I'm on one of the latest Ryzen processors and I have 16GB RAM. Do I need 32GB?

EDIT: SSD for my work files and OS and the applications themselves


My current configuration is: Ryzen 3 1200, 16 GB of RAM, SSD for the OS, another SSD for the IDEs/code.

In my experience:

  - NetBeans is reasonably fast for code editing, but slows down once you try to do refactoring or even search for things; overall it ends up feeling a bit lacking compared to the alternatives
  - Eclipse is always horrible, both in regards to usability and performance; some people swear by its extensibility and workflows, but I do not share their experiences
  - IntelliJ (and other JetBrains products) are slow when opening and indexing projects, as well as somewhat slow when not given >2 GB of memory for the IDE itself, or when swap is hit; perhaps the best functionality though
  - Android Studio feels like an exception to the above, as it's almost always been slow in my experience
Some suggestions:

  - don't attempt to do anything while your project is opening/indexing, especially with medium to large codebases (e.g. >100k SLoC) or large numbers of dependencies (e.g. >20 packages)
  - disk performance also probably matters a lot here; I'd suggest using SSDs for any development workstation and HDDs only for storing data for longer periods of time, or fewer larger files
  - consider increasing the amount of memory that's available to the IDE itself, which you should be able to do in the case of IntelliJ through Help > Change Memory Settings
  - the JDK that you use may or may not have an impact on this; switching from Oracle JDK to OpenJDK improved performance in prod projects by a factor of 10, and might yield minute improvements for IDEs
  - consider whether you need swap or a paging file, since if memory gets persisted to disk, accessing it might slow things down more than necessary; or decrease your OS's swappiness value https://www.howtogeek.com/449691/what-is-swapiness-on-linux-and-how-to-change-it/
The latter two steps probably aren't too important; as long as you have enough memory and a fast enough disk, you should be good: currently IntelliJ menus open in <1 second once the project has been indexed and there are no problematic third-party plugins present, e.g. Codota (an earlier version of which thrashed my CPU usage).


It kills me that, given all that, people would still say "Java is as fast as C++!"


That's a simple statement about a complicated situation, which doesn't do it justice.

When comparing synthetic benchmarks, C++ is 1-5x faster: https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

When comparing real world benchmarks (webdev), C++ still is faster, but only sometimes, it all depends on the framework used: https://www.techempower.com/benchmarks/#section=data-r20&hw=...

I think that this is largely a sign of various small inefficiencies and design decisions adding up in any piece of code - and the larger the codebase, the more of these there will be, which is why for non-trivial programs C++ can actually be slower than Java. Not because the ideal C++ code is slower than the ideal Java code (with realistic runtimes), but rather because in practice you'll never get to this ideal code.

JetBrains paid a certain price in performance, just so they could get rich cross platform IDEs working. That said, Visual Studio is written in both C++ and C#, and suffers similarly in many hardware configurations. I think Wirth's law is at play here: https://en.wikipedia.org/wiki/Wirth%27s_law

With that in mind, write code in whatever will give you the best results, whether you're optimizing for speed/ease of development, or runtime performance. In most companies, the former two are more important than the latter, which is okay with me, as long as i don't have to pay for their servers.


> I think that this is largely a sign of various small inefficiencies and design decisions adding up in any piece of code - and the larger the codebase, the more of these there will be, which is why for non-trivial programs C++ can actually be slower than Java. Not because the ideal C++ code is slower than the ideal Java code (with realistic runtimes), but rather because in practice you'll never get to this ideal code.

but this isn't what we observe in practice: all the C++ apps I know and use regularly consistently work better than their Java equivalents AND perform faster. JetBrains IDEs are unusable for me coming from Qt Creator and I don't understand how anyone can stay sane with them - and I try CLion pretty much every year since it's been announced.

Besides, what matters to compare language is not how the ideal, infinitely optimized app looks like, but how the average one does. And, to me, the average Java app is unambiguously terrible when compared to the average C++ one.


I have 16GB for Android development and it gets painful, 32GB is recommended.

Are you paging to disk?

You may want to increase your heap size if you haven't already.

https://developer.android.com/studio/intro/studio-config#adj...
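For reference, that page boils down to editing Android Studio's studio.vmoptions file (Help > Edit Custom VM Options). The values below are illustrative only - tune them to your machine:

```
# studio.vmoptions - JVM flags for the IDE process itself
# (illustrative values; comments must be on their own lines)
-Xms1g
-Xmx4g
```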


Why not use `perf` to figure out why Eclipse is slow, fix the problem, and then send a patch?


I suppose IDEs can be a good productivity tool, but they seem to be a poor teaching tool in a university setting. It's difficult to learn how to drive a car by peering through the smoked glass at the chauffeur.

For first-year classes, I suggest restricting lectures to concepts (linked lists, big-O notation, and so forth) and not coding. Let students learn coding by themselves, by reading an assigned textbook on the language of choice, supplemented by a few one-page handouts on linux, vim/emacs, version control, etc.

Assignments should involve both writing short essays about concepts and writing code. The latter should be expected to work, to be well-documented and, as the class progresses, to be elegant.

In mid-term and final examinations, students should be expected to again write essays and code, but this time by hand, on paper, and in a room without computers, books, cell phones, etc.

The goal, it seems to me, is to expose students to fundamental concepts and give them the skills to learn more by themselves. The use of IDEs can come up later, as they start working on larger projects.


The author appears to be a purist. While I respect the position, I'm a pragmatist and respectfully disagree. Also the title seems misleading. It should probably be "Stop making students use IDE".

Schools' job should be to teach how to learn in my opinion. Back when I was in university, I was taught a lot of things too and some of it were fundamentals thus necessary. Others could have been cut out for more discovery based learning.

I "learned" Java after many other languages (C, C++, Perl, C#, Python, JS, etc). I've used Eclipse and NetBeans but I feel IntelliJ is special. It has a great feedback mechanism that teaches the language too as a side effect. Dealing with the command line to compile code does teach a few things but after a short while it's just busy work imo. Learning imports? Sure, but again, not having to spend much time in mechanics means one can spend more time on more exciting aspects like refactoring.

It all depends on what the teaching focuses on after allowing or disallowing the use of IDE.


I'm curious what you think I'm a purist about - CLI usage? Languages? That wasn't the intention of the post, so if that's what's coming across, my writing is poorer than I thought.


I’m at a school right now where they teach C++ using Visual Studio on Windows. I transferred here, and at my old school we used gcc on the command line (using WSL if you were on Windows). The amount of time wasted in class trying to get VS to behave right is ridiculous. On top of that, none of the kids know how to troubleshoot anything outside of copy pasting their code out to Notepad and asking VS to build a new project (and sometimes that doesn’t work, and they are stuck). It’s a productivity killer, and they’re all super scared of the command line (so when Web Dev with npm or Cloud Computing with Docker is taught, they have to be hand held for the first few weeks on how to get to know the command line).

Apparently employers don’t care (probably because when they get to work everyone else uses an IDE anyway) but I don’t think it’s a good idea in the long run.


Honestly, figuring out the tooling is going to be a big thing you'll have to get used to. Eclipse with Maven on Linux is way easier than on Windows. Dead simple even. If you're going to be building code, you'll probably be using Linux to run in prod, so you might as well get introduced to that now in school. You have to do some extracurricular learning, but everyone has to. That's just how it all works. Java is taught because it is more C than Python, where you don't really have to think about how things work, you can just throw shit at the wall and it'll stick. But XX year old systems you'll be working with in the field won't be like that. So learn typed languages, learn you some nix and get used to running into frustration, because figuring things out the hard way is usually what it takes.


Unless you are at a place using Java 8 with Tomcat on Windows for production, with PowerShell scripts inside Jenkins for build systems. Good times. Nothing like hitting Stack Overflow and coming up empty because your entire tool chain is stupid and wrong.


My heartfelt condolences.


I don't know what the rage is all about with Eclipse really. I'm a hardcore dev for 30+ years, kernel, embedded, I do a lot of C; the last time I wrote some Java was in 1995 to give it a spin, none since... ... And I use Eclipse CDT as my primary "IDE" all day. It's stable, it's as quick as the other JS monsters out there, and one thing it has is that the Indexer is unique -- the Eclipse Indexer will index /anything/ -- even enormous codebases. And it's not "ctags" dumb indexing, it's full C-preprocessor-expanded macros.

And really, if you use the "Makefile" projects, it's just an IDE. And it's better than everything I tried (as of last year) -- most IDEs out there can't even parse "Make" output correctly to extract the error messages!


I had to teach web dev to students who were coming straight out of the intro courses that used Eclipse and yeah... everything that the author is saying here is right, and these IDEs do a disservice to them. A surprising number of students have no idea how to work with the command line or file systems, and in general develop poor debugging skills (and, in turn, poor programming skills, since they're used to Eclipse etc. filling in everything for them).

I think a good middleground is the IDE that CS50 has their students use. It gives them exposure to the command line and allows them to actually debug on their own. Of course, utilization of that IDE in intro CS courses means that these courses won't be using java anymore (and I frankly have no idea why it's still a mainstay of cs courses to this day).


Where I work, I was just assigned a project to add functionality to. The README instructions for the project started thus:

1) Download Visual Studio Code from https://code.visualstudio.com.

2) Clone the repository from BitBucket at <blah>.

3) Download Kafka and Zookeeper, and follow such-and-such directions to run them.

4) Make configuration changes <foo>, <bar>, and <blatz> to connect to the nonprod Mongo database.

5) To run the program, right-click on Application.java in VSCode and then click on Run.

6) To run unit tests, right-click on their names in VSCode and click on Run Unit Test.

The real world is just like the university! IDEs are how you engage with the program in the enterprise -- and you'd better be using the same one that the original author used because if something goes wrong, you're not going to be able to find someone who can fix it without that particular IDE.

This idea that software should build and run with just the compiler and simple, scriptable build tools is a quirk of open-source development and distribution practice from the 90s and it does not generalize. In the real world, if it can't be done from the IDE it generally can't be done. I know, I know, CI/CD -- but that's generally provided by templates from the corporate CI/CD team and the average line programmer need never touch it, nor know how it works. You may think this is a totally clownshoes way to handle software, and you may be right. But billion-dollar companies run on software handled this way -- and worse ways! -- and you're not going to be able to change it. This stuff was written by tired programmers who just want to play with their kids after putting in their 9-5. You need to budget for these kinds of changes during work hours if you want them done -- and knowing how to operate javac, and structuring your software so that it can be edited, compiled, and run with any tools that are up to the job, do not deliver value to the customer and will not be budgeted for.


> but that's generally provided by templates from the corporate CI/CD team

That's rather a "corporate" perspective. And having to rely on corporate infrastructure, build-teams and so on results in a mushroom outlook, in my experience, which I found depressing and disempowering.

In most modern non-corporate development work, you are the CI/CD team.


Where I work, most projects are IDE-agnostic: clone the repo, cd to the folder and run "gradle build". In this case a person can choose whatever IDE they like, or just use command line tools.


They should have to use EMACS. And git. And the UNIX command line. And learn to suffer. And get off my lawn. Right.

At least with an IDE, you have a set of tooling that plays together.


Can we maybe agree that the real question is not if/when do people need to learn toolchains they use, but why do we need this knowledge in the first place?

Basically, regardless of whether you are a student or have been using $toolchain for the past 10 years, troubleshooting it (or even just the need to understand it) gets in the way of doing the actual work.

We should strive to have better tools, that just work, without the need to debug cryptic errors.


Given how much time and effort is spent on developing tool chains, I'm afraid one that "just works" is a pipe dream.


I agree completely here. I refrained from recommending them in this article because it would have made it even more controversial, but both Elm and Rust have made _incredible_ strides on this front.


Teaching programming is a technical skill.

Teaching engineering is about conceptual skills and how to apply them, preferably in a standard and proven and consistent way.

Teaching science is about research and developing concepts.

They are 3 different sorts of education. An introduction to computing needs to cover all 3 so that the student can then specialize.

The details of Java or C or C++ or <insert language> is not the point in 1st year courses. It should be about teaching all 3 aspects. Learning "import" is not important from a file system/CLI POV, but learning about libraries and APIs and how they relate to your programming and engineering is.

Learning about file systems and UTF-8 and OSes etc need you to have grasped the concepts of persistent vs ephemeral storage and i18n/l10n and string manipulation.

They are concepts that are relevant to the engineering side of computing.

IDEs vs CLIs vs non-integrated DEs aren't that relevant to 1st year courses.

A well structured course won't need to push Java vs C++ vs Ruby vs Rust vs Python. The conceptual elements are mostly the same and the title of Wirth's book, "Algorithms + Data Structures = Programs" summarizes it well.


I agree with the title. No one should use Eclipse unless there is no other option.

The article appears to be typical comp sci masochism though. I've built Java projects with make and javac, and it was fine, for 2006 when everything was much simpler. But the only reason I ever did that was because I bought into the terminal Stockholm syndrome that afflicts programmers, and switched from using JCreator to emacs.

Eventually I came to my senses and bought an IntelliJ license.

The thing that I really think software pedagogy needs is some coursework on the history of software, which explains why things are the way that they are, and how they got to be that way. Everything about software and how computers actually work is this dirty, messy accumulation of ideas piled higgledy-piggledy atop one another over the decades. If you just try to understand the result without understanding the strata and accumulations that created that result, you miss an enormous context, and you are in enormous danger of just recreating wheels.


They reference an improved student centered IDE and then jump into what IDEs are good for and finally that in first or second year:

>What they can’t do, unless they’ve figured it out on their own, is operate a computer outside of the confines of the IDE they’ve been taught. In none of the curricula I’ve seen, through personal experience or reading syllabi provided by other students, is there a place for students to get past the myriad of barriers that constitute the use of a computer in the modern day.

I have to unpack this but first of all, the use of a computer in the modern day for many engineers does involve an IDE, so I question whether that is even a true claim, but continuing the things missing from an IDE are:

> First, in a Java-focused curriculum, it insulates the student from the javac command line program, and the command line environment itself. Second, it catches some basic mistakes and allows the student to defer learning about the finnicky language requirements that aren’t deemed core to the curriculum, like imports and file naming requirements.

I don’t know about others but this didn’t fill me with horror. Those are the things that would make a student in an introductory class want to quit, and the number of drops in CS is already very high. If as the author said it takes 2 to 3 years to grasp control structures, why mess with the finicky software engineering headaches when it’s an added load on the students? By the time they graduate they should have a complete view of the system and in my experience things like the stack, logic gates, and compilation are taught in intro CS, the students are just not forced into it when coding which is really another endeavor.

Here I do agree with the author that some more granularity in the department of CS would help, but I’d argue that would lead to more specialization and fewer graduates who understand the entire system.


Personally, I'm against making students use Eclipse because it's a terrible IDE and this constitutes cruel and unusual punishment.

Seriously though, I have trouble following the author's argument here. For example, she points out that details like the Windows filesystem being case-insensitive are hidden. Well, that's true if you use an IDE or not. And how does not using an IDE expose you to that?

Also, what 300 level software engineering course is taking time out to teach you how to use an IDE? Is my experience abnormal? I can remember showing up to second year and being told "we're using C now, here's an 8 page pamphlet about how that works" and that was it.

I'm all for not making learning programming arbitrarily more difficult in the beginning as that's how you lose people. Having done things is not a badge of honor. In fact, encouraging this kind of gatekeeping as acceptable is (IMHO) far more harmful.


Absolutely not. There's significant educational benefit in timely, corrective feedback, which an IDE with incremental compilation and a linter is perfect for.

Eclipse isn't a good IDE for teaching[0], but it's significantly better than the command line and javac.

By introducing too many concepts, you water down the educational value and motivation to learn. Seeing output on the screen is exciting, don't get in the way of this.

javac doesn't provide value for beginners; it's a roadblock to a user seeing output on the screen. Take it out of the way of new developers, maybe show it in a "commands executed" window to drive curiosity, and introduce it properly in a later course, but only after a user feels comfortable with the basics of software development.

[0] I don't think a good IDE for teaching exists currently. Customizing IntelliJ would be fantastic for this, and I'd love to see if someone's working on this.


To start with, shouldn't universities just not teach programming? I'm taking a graduate class on artificial intelligence which involves coming up with a novel project and implementing it, but we have free rein to choose what tech stack we want. It's just assumed that everyone can handle picking up a new programming language if they have to, despite people from completely different backgrounds being in the course (such as my neighbor, who majored in physics).

You don't need class time to learn how to follow codecademy tutorials or to learn how to Google. You need class time to learn a bunch of theory and algorithms and higher level topics. Pointing out syntax errors is a waste of a TA's time.

We don't have addition or algebra as part of the core curriculum at universities, so why is basic computer competency, like writing the most basic of programs or using CLIs, a part of the curriculum at a lot of schools?


College algebra?

Programming is very difficult to learn in a short period of time if you've never worked in it. Translating other concepts to how they work in the software world requires time and work. From what I've seen intro to programming classes are much harder than they ought to be and fail to translate the math concepts we all already know into the programming world.

Writing programs isn't basic computer competency. Using CLIs is more advanced computer competency, but still not at the level of writing software.

Students struggle to resolve off-by-one errors between array indexes and lengths. Objects, instances and inheritance aren't even on their radar yet.


To start with, universities need to decide if they're teaching computer science or computer programming. Currently a bachelor's in the former is actually mostly the latter.


I don't think "computer programming" is an appropriate subject for a batchelor's degree. That's called a "training course".


Where did you get this impression?


From my own degree.


Not sure why you were downvoted - I don't think a student that needs to be hand-held through learning a specific programming language should be on a university course.

Perhaps the answer is that many low to middle-grade universities are prepared to enroll students that are not really university-standard candidates. As a consequence, the school has to provide hand-held training in the specific language the course depends on, because the student hasn't been taught how to study.

You may not know how to study independently at the beginning of a university course; but if you haven't learned by the end, then you shouldn't get to graduate.


There should be at least some time spent on the basics to ensure that the degree is accessible to everyone; otherwise only people who started early or put in lots of extra effort will be able to do it.


Universities shouldn't teach programming in X language, producing monoglots.


> Using Java or Python in a professional IDE like IntelliJ IDEA, NetBeans, PyCharm, or Eclipse is not a good first introduction to programming for computer science students

I agree. But I would say that using Java itself is not a good first introduction to programming for Computer Science students. Python interpreter in a textual terminal seems like a better place to start. Alternatively, JS code in an HTML page. Students should not feel they're steering a large airplane with an infinite number of indicators, knobs and switches; it should be a low-epistemological-noise environment - where, initially, the only new thing to learn is the language and what output you get when trying to (compile and) run a program.


As a teen, I taught myself Java several years ago (pre 1.2) on BlueJ, and I'm so happy to see that it is still a thing! https://www.bluej.org

It really helped a lot back then!


This is horrible advice: yes, use an old rusty hammer instead of an airgun to build that house; if you don't, the house will suck and you won't understand how it was... built? Huh? I have 4 kids, and 3 can write code; one has Downs and she is just taking her time is all :). They all learned on IntelliJ Ultimate babay! My 10 year old can write in 3 diff languages already, thanks to the help he gets from his dad and the error checking his IDE does. I've been using IDEs since DOS edit. If you don't use one, I give 0 fuches.


Stop teaching how to drive, teach combustion engines and mechanics instead.


This strikes me as an unfair comparison - we don't have for cars what IDEs are for programming. The best analogy I can think of is, "you should teach people to drive the car before you let them use Tesla Autopilot".


But before you can teach them combustion engines you have to teach them combustion. AP Chemistry should be required for all 15 year olds.


I bet most of them understand what happens throwing out sulphur dust into water.


I started programming with text editors. I wouldn't recommend that to anyone. While learning how build systems work is valuable, it shouldn't be done by inflicting pain or wasting time.

And learning languages, algorithms and data structures is more valuable than learning the build systems.

Also, most assumptions in the article are false.

Most courses targeting Java or .NET will teach what the underlying framework is, what the difference is between the compiler and the VM, what JIT means, what the memory model is, and what a library is. Without going into unnecessary details.


> Also, most assumptions in the article are false.

Can I ask what assumptions I made that you consider to be false?


Interesting problem. I imagine many here grew up on computers not because it was valuable or strategic but because they couldn’t not, such was its draw. Learning the concepts by brute force was the norm, and the idea of not knowing what a file or a file system is feels foreign because you’ve lived and breathed it for so long before higher education.

I guess there are people today who don’t fit that mold and approachability becomes important, shouldn’t be surprising.


This was the initial concept behind the Pi - something to give A Level students (16-18) a basic computer to learn the basics on before they went to university.


I don't really see much difference, for a newbie, between a button in an IDE and copy-pasting some arcane command to get output out of a mess of text they don't understand, which they also copy-pasted into a file...

Many forget that these students are essentially starting from zero, and just getting something that prints text in the first place is a good start. With luck they should also learn some flow control and functions too.


Anecdotally if I’m collaborating/helping a newer programmer with python, I can tell if they are using and have only used PyCharm. PyCharm users never seem to understand virtualenvs or requirements.txt. Collaboration with pycharm users at that level always gets frustrating as they seem to routinely fail to update requirements.txt - as their setup isn’t using it and it isn’t easy for them to do
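For anyone following along, the workflow PyCharm hides behind its GUI is only a few commands. A minimal sketch (the install step is commented out since the packages depend on the project):

```shell
# Create an isolated per-project environment and record its dependencies.
python3 -m venv .venv              # per-project interpreter + site-packages
. .venv/bin/activate               # use this environment in the current shell
# pip install <your-dependencies>  # install whatever the project needs
pip freeze > requirements.txt      # pin exact versions for collaborators
deactivate
```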


It is almost scary to me how many people here think that by knowing how to run a compiler from a command line, they now know how a computer works.


This discussion shows that there are several types of developers who have radically opposite ways of functioning. There are those who think that a car driver can ignore all aspects of the car other than the interior. There are those who say that knowing how a car works makes us a better driver. I think we have to accept that these differences do not mean that one is better than the other.


Do people still use IDEs? I've worked as a web dev for years and the only person I know who used an IDE was an older DBA. Honestly, just learning how to run and compile shit from the CLI is the most direct way to learn to program, and not that hard. VSCode is free? I learned this way on my own, in bootcamp, and now in grad school. Only older YT videos suggested Eclipse and such.


I legitimately left a job once over being made to use Eclipse. An absolutely abominable, horrible IDE.

Our environment was so fragile that we needed everyone to be on Eclipse for things to work. To this day I don't really like Java, and I typically don't even apply to Java roles. Eclipse is just that bad.

Visual Studio, on the other hand, is one of the best IDEs ever made; you can tell Microsoft puts in some effort.


For first time students I'd argue they should stick to good IDEs with minimum hassle. They can learn the command line stuffs later. Back when I taught myself C++ the least thing I wanted was to learn the compiling process at the same time, so I picked Visual Studio. After that I switched to manual compiling and linking when I studied C.


I am going through the process of creating a stepping stone, in the form of a small app code sample that already has the feedback (including UML visualization) in place, as a way to address this in my tutorials and upcoming Flutter dev book series.

It's kind of the right way to pair with the livebook way of teaching Flutter frontend, and frontend in general.


Lemme up the ante. Before starting a course students should design and assemble a computer they'll be learning to program on.

Oh and while we are at it please prohibit all languages with automatic memory management, safe arrays, range checking etc. etc. Real programmers code by flipping switches.


Why not build it from the ground up? That is, first have them learn physics, without teaching them calculus, expecting them to figure that one out themselves. Next they can learn semiconductors. Then gates, and so on. Why not have them design their own CPU and assembler, of course never giving them tools for this... Once they have some working system, we can move on to things like data structures.


We could use it to teach them mining, smelting…


> Use a language that teaches the fundamentals of the paradigm you’re interested in, like Scheme or Python. (Please, please not Java.)

If not Java then at least use a statically typed language. I feel that understanding types, and type-systems, is fundamental to computer science.


Only if you want to teach that. An introduction to programming might very well skip over static typing checks.


Reading all the comments, the bulk of which are in agreement with my own beliefs in this, it is perhaps better to write an alternative article that has the opposite position.

Or, indeed, instead of <<article>> an <<!article>>


I would appreciate that! A lot of people have criticized this piece pretty vaguely and I'd love to see some points in opposition laid out in a more long-form piece.


Looking at the screenshot of jGRASP I can see it looks similar to IntelliJ. The icons are identical. I assume it’s based on IntelliJ. What’s the point of the article?

Yes Eclipse is too complicated but I think IntelliJ is good for starting.


I'm 3 months into a 3 year part time Software Development university course. First module is Java and using Eclipse. Can someone recommend something else I should be learning in line with this article/your view?


I like IntelliJ Ultimate; it's not expensive, it's essentially a Netflix subscription, but better, endless fun :). They have built-in educational materials, and you can easily jump through libraries to see how a function works, all the way down. It's fantastic. The MIT courses suck, even if they weren't evil. Anyone looking to support MIT should look up Aaron Swartz and then go elsewhere.


I'd recommend this one: https://missing.csail.mit.edu/


Yes, stop making students use Eclipse, or BlueJ, or whatever it is. They can choose whatever they want. In the real world, very rarely does a workplace force you to use a specific IDE. Most of the time they use Eclipse because no one knows how to set up their project in another IDE, so they're stuck with it even though it sucks. I remember a project where I had to set up the linking of each project manually inside Eclipse. I asked myself, "this is ridiculous; if I switch my PC, I have to re-do all this again?"

Switched to Intellij and use other jetbrains product, never looked back


This guy is making a bigger deal out of this than it needs to be. On days 1 and 2 use javac and notepad.exe (or whatever). On day 3 move to an IDE.

This is how we did it back at my school and it works.
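For what it's worth, the whole "days 1 and 2" toolchain really is just two commands. A sketch (guarded so it only runs where a JDK is installed):

```shell
# Write, compile, and run a first Java program from the command line.
cat > Hello.java <<'EOF'
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world");
    }
}
EOF
if command -v javac >/dev/null; then
    javac Hello.java   # compile: produces Hello.class
    java Hello         # run: prints "Hello, world"
fi
```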


> Please, please don't Java.

Weak. My university starts with C++.


This completely misses the point. The deeper problem is that it’s a Java focused curriculum.


You may note that I do, in fact, call that out as the deeper problem:

> Use a language that teaches the fundamentals of the paradigm you’re interested in, like Scheme or Python. (Please, please not Java.)


I thought it would be a rant why they don't allow using community version of Intellij.


I didn't know about jGRASP. Good to learn it exists.


I'm glad he put in there that he’s not talking about MIT or CMU. There is a world of difference between a CS student there and someone learning programming at the average American university.


*she


Maybe he didn't get the memo



