I don’t agree with this perspective: why do entry-level students have to understand the internals behind compiling their code, and weird edge-cases like misspelling import? In my experience this is actually the stuff that hinders learning and confuses students.
The author claims that students won’t be able to do real-world programming without learning this esoteric stuff, but this is exactly the stuff they don’t need to know: full teams can and have built huge Java projects just by using the IntelliJ project wizard and maybe a tiny bit of Gradle from Stack Overflow. In fact, students can solve most of the author’s problems by spending 5 seconds on Stack Overflow.
I’m not saying that these more esoteric details are completely useless. Learning e.g. the difference between char and a UTF-8 “character” is actually pretty important so that emoji inputs don’t crash your website. But this is stuff that should come in later classes, particularly for students who are interested in those details. Not everyone needs to know it.
In fact my experience is that some colleges have the exact opposite problem: they teach students old tools and techniques which are tedious and not actually used much later on. Like writing code by hand and deducting points for syntax errors, or using some outdated framework, e.g. jQuery. When I saw the title my first thought was literally “yeah, stop making students use Eclipse, they should be using IntelliJ!”
> full teams can and have built huge Java projects just by using the IntelliJ project wizard and maybe a tiny bit of Gradle from Stack Overflow.
I would not ever want to work with anyone from those types of teams. They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.
> But this is stuff that should come in later classes, particularly for students who are interested in those details. Not everyone needs to know it.
Yes, they do. They absolutely do. There seems to be this mindset that's starting to infect software development that pushes the idea that people can be good at programming computers without actually knowing anything about how those computers work. I just do not understand how anyone could seriously believe that. My experience over the past 20+ years tells me that people who avoid learning about these topics are generally just not good at their jobs, or at best can do a decent job of things, but then completely freeze when they encounter something outside of their narrow comfort zone.
> In fact my experience is that some colleges have the exact opposite problem: they teach students old tools and techniques
That's lame, but has nothing to do with the point you're trying to make.
You only like to work with experienced engineers who are already experienced with the specific tooling you use on your projects? Fine.
Students by definition are not experienced engineers. They’re in a given class to learn the topic of that class, and hopefully that class will be a challenging program of learning. In CS, the fact that they’re even working in Java is often incidental; they’re not there to learn Java, but generalisable CS concepts.
I’ve got no problem with teaching students practical software engineering skills; sure, that can be very valuable. It should be in a course on that topic, though.
> You only like to work with experienced engineers who are already experienced with the specific tooling you use on your projects? Fine.
That's not what I said at all. Inexperienced in general is fine. I benefited from mentorship earlier in my career, and I'm happy to pay it forward.
Inexperienced with my specific tooling is a weird point to make, because it directly contradicts my point. If you teach people how things work: the underlying computer, OS, system libraries, and teach people the nuts and bolts of how to do their jobs: version control, the command line, filesystems, etc., then I especially don't care if they've used my specific tooling before, because they probably have enough foundational and background knowledge to be able to pick up different tooling and languages without too much trouble.
The people who are just taught to open an IDE, click "new project", and never leave the IDE, are the kinds of people that I expect will have trouble if they join my team and we're using different tooling than they're used to.
It's always felt to me like, "Oh you want to learn to cook? Great!" This is a pickaxe. I want you to mine some iron slag out of this vein.
"What does this have to do with cooking?"
"Well, you want to use a knife to cut your vegetables don't you? How on earth do you expect to get a knife to cut your vegetables if you don't mine for iron slag, refine it, build a forge, learn blacksmithing, make thousands of inferior and useless knives until by trial and error you finally have a knife you can cut vegetables with!"
"Ok, so once that is done then I can be a cook, right?"
"What? are you a fucking moron? Look at this asshole who thinks he can be a chef if he knows how to make a knife! What an idiot! You need a pot and a pan and a kitchen, but to get a kitchen you need a house and to get a house you have to buy the land and then to get the food you have to learn agriculture and animal husbandry, woodworking, stone-masonry, how to quarry, political intrigue, bartering, build a working economy you and 7 other people can all agree on using..."
It isn't that difficult. All I really needed was a problem and a pointer to the tools needed to solve it. All the other stuff may make me competent should I ever need to rebuild the world's technology stack from the ground up, a la Dr. Stone, but it never earned me a dime of income, nor served any purpose beyond attempting to dissuade me from a computer career.
I agree, and I also feel that students new to computer programming shouldn't be using Eclipse. They should be learning the fundamental concepts and their implementation; having to learn a whole, very complex IDE on top of that would just add to the difficulty.
if, while, and other programming concepts? I'd agree on that.
But the article suggests learning about files, file formats and other low level stuff.
So maybe they should first learn how a filesystem works, how it keeps internal consistency, and how the bytes are mapped into some physical storage. That's pretty fundamental too, right?
What we have here are "leaky abstractions"[1].
If the abstraction an IDE provides is not good enough that students can work with it without knowing about the lower layer, then the abstraction is leaky. And that is the problem. Not "should I teach them with an IDE or without".
> What we have here are "leaky abstractions"[1]. If the abstraction an IDE provides is not good enough that students can work with it without knowing about the lower layer, then the abstraction is leaky.
Do you suggest teachers fix Eclipse so it no longer has leaky abstractions rather than just teach students how to work in a modern file system?
I agree with you. For example, in an algorithms class they need to focus on the right things such as O() and not on the Java compiler or the Python interpreter or even worse (god forbid) on gcc.
>> full teams can and have built huge Java projects just by using the IntelliJ project wizard and maybe a tiny bit of Gradle from Stack Overflow.
> They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.
How does one follow from the other? How does not learning unnecessary skills hinder the ability to solve new problems? If a skill is really needed later, one person from the team can learn it and share the most important bits of knowledge. If a skill is not needed now, why learn it? YAGNI can also be applied to learning things.
I find that not understanding how compilers work, or how to use them for low-level tasks, is a minor problem when developing. I've never met a developer who had a problem like that. A much bigger problem during development is not listening to and understanding your users, and an IDE will not help you with that. That is a much better skill to learn than a compiler CLI.
"How does not learninging unnecessary skills hinders ability to solve new problems?"
I disagree here. Life is not so simple that you can just ignore whole classes of problems. If you don't know certain skills, you don't know the best way to solve a problem, and that can lead to all sorts of issues.
It's like saying: why learn chemistry, advanced math, and physics in high school when you might never use them beyond a few simple formulas, and if you need them, just learn them from your team? I don't think that's how the real world works.
> It's like saying: why learn chemistry, advanced math, and physics in high school when you might never use them beyond a few simple formulas, and if you need them, just learn them from your team? I don't think that's how the real world works.
The point of learning all that in high school is not to learn those things specifically; it's to teach you how to learn things fast, so that you can easily pick up what's useful once you're working.
No one cares what you learned in chemistry class at 15 years old; it's all approximations, if not outright wrong sometimes, and it doesn't matter in the slightest.
Hard disagree. While indeed atomic models taught in school are way too simplistic, it’s a much better world view to vaguely remember than not knowing anything about molecules/atoms. The same for biology, and we should indeed push for somewhat stronger education in most countries (as is apparent from stupid antivaxxers), as having at least the correct mindset on scientific things is exceedingly important.
The correct mindset to "not be an antivaxxer" definitely does not correlate with raw knowledge. Proper philosophy and logical reasoning classes are infinitely more efficient for that, though a bit more background in statistics and probabilities would also likely help as it seems that a lot of people struggle with that.
A great rule of thumb I’ve read on HN is that a great developer is one who understands at least one layer of abstraction below where he/she works (but 2 is better). That doesn’t mean compiler intrinsics if you are programming in a high-level language, but it does, in my opinion, include the general tooling of the language, its model of execution, etc.
I don't think that's true. I think learning things that are related to your work will almost always make you better at your work, even if you don't use that specific knowledge directly. Human brains are big pattern matching machines, and the more data you can feed it, the better it will be at looking at problems and truly understanding them.
> I find that not understanding how compilers work, or how to use them for low-level tasks, is a minor problem when developing.
I'm not saying that you need to understand enough to build your own toy compiler (though that can also be very useful in understanding how to write better programs). But at least understanding it from a basic level, like "javac takes in my source code, turns it into some kind of machine-readable intermediate representation, does optimizations, and outputs bytecode" is something that I expect every Java programmer to know.
> They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.
The vast, VAST majority of programming is of the type OP described. It's ugly and has warts, but it's good enough to deliver business value. I’ve seen it first hand.
While that may be true, programmers working on these types of projects will end up being better at doing them (they will build it faster, write fewer bugs, and have an easier time fixing the bugs they do write) if they understand more about computers, operating systems, and the tooling around their chosen language and IDE.
Sure, maybe they can still get things done without that knowledge, and it'll work, but that's not really the point.
> I would not ever want to work with anyone from those types of teams. They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.
Okay but what matters is not who you want or do not want to work with, but "can it make a business that works". That's the only metric.
At my first job, the owner "coded" pretty much everything in a visual programming language (Max/MSP). Was it pretty? No. Did it allow him to start a business and sell things? Definitely, and that's the only thing that matters.
With this myopic attitude you'll get your business up and off the ground fast. Good job.
Then you will be mercilessly hacked.
Once you rename your startup to avoid the negative publicity, you'll hit a performance wall.
You'll lose customer data (and the associated customers) while you try to figure out what the four letters in ACID actually mean.
AWS will take 80% of your profits.
A different company will have taken their time, done it properly, and eat your lunch. They'll take the customers that lost data, as well as the customers that couldn't stand your slow-ass website.
Then they will mercilessly undercut you, because they're not scaling to infinity to compensate for an O(n^3) algorithm being used to scan database tables in their entirety by Mr I Know How to Bang Out Code.
You live in a parallel universe. Hacked? So what? Equifax got hacked too, and its valuation is literally up 150% since then, lol. People make more money selling WordPress themes than I do selling low-latency C++.
Equifax is a monopoly. Run-of-the-mill companies will lose contracts and money when their infrastructure goes down or some crypto miner explodes their aws bill.
Optimizing is a different matter: it reduces costs. More profit, better prices for customers, the potential to undercut other companies with prices that wouldn't otherwise be feasible, or to offer more features.
O(n^3) bugs happen, sure, once every two years. So for the other 99.99% of the time, worrying about it is useless, just as much as nitpicking over which version of Eclipse or Gradle or whatever to use. In the grand scheme of software projects, these are not super relevant. Making code robust to crazy unexpected inputs is much more important. Making code that is easy to read (that is, readable without knowing tons of frameworks or technical details) is much more important. At least in business computing.
One wants to get things done to the exclusion of all else, possibly trapping you in a corner if you ever need to scale.
And the other wants to “do it the right way from day one“, an undertaking that would cost more than a small business could ever hope to afford, or that slows you down and gives rivals time to catch up.
I don't really get why you seem to believe there are just two extremes and that's it. I don't think "do it right from day one" means that you never accumulate technical debt or that you write the perfect architecture that anticipates every possible future need.
Doing it right, to me, means hiring people who can adapt to changing conditions and needs, and who can make (or help you make) the right trade offs when you have to balance faster time-to-market (which can involve cutting corners) against avoiding technical debt.
The team that lives inside IntelliJ and can't do anything outside it is not going to do that for you.
>I don't really get why you seem to believe there are just two extremes and that's it.
Well, we’re only discussing a specific dimension of software development here. In reality there are a whole bunch of things not even related to software that will cause problems.
It's not about being pretty, it's about understanding when you're incurring lots of technical debt that's going to bite you in the ass down the road. There's nothing wrong with implementing a complex system in the likes of Max/MSP necessarily but it should be a reasoned decision, not something you do reflexively because you just don't know any better.
There are millions of businesses that "work", but that doesn't mean they are places I want to work. Personally I put the bar higher than "I'll just use the magic wizard and hope it works", and I'd prefer working with similarly minded people.
My preference is to work with teams of people who strive for something more than "I'll just use the magic wizard and hope it works." Others can have other preferences.
I don't have an opinion on where these people develop those attitudes/skills.
That's the huge majority of all university degrees? In my country, from the figures I can find online, there are ~20k "pro" masters vs. a few hundred "research" ones. The research track is entirely negligible (I know it firsthand, having been through the research path, which held a single-digit percentage of the students compared to the "pro" path).
It actually is, because why would I pass them during an interview if I don't want to work with them? Thus, the students will be locked out of good jobs - and it's the university's job to give the student the necessary skills to find a good job.
Maybe "can it make a business that works" is the only thing that matters to you, but it's not to me. If I have to spend my days dealing with developers who can't think outside of a tiny box, I will be miserable. I don't care how much money the business makes; that's not how I want to live my life.
And what happens after you get that business running and bringing in some revenue, but you need to start tackling harder problems in order to keep your current customers happy and attract new customers? You're stuck with a bunch of developers poorly suited to solving those problems, and a bunch of infrastructure that wasn't designed with this sort of feature expansion in mind.
So you hire some new people who have to rewrite a bunch of things, or who have to spend months or quarters untangling the mess. You could even miss your market opportunity. Even if you don't, you've still wasted a lot of time and money.
> Maybe "can it make a business that works" is the only thing that matters to you, but it's not to me.
What I want doesn't matter any more than what you want regarding this. What matters is what the society as a whole wants, and the current situation is in big part the manifestation of it.
> And what happens after you get that business running and bringing in some revenue, but you need to start tackling harder problems in order to keep your current customers happy and attract new customers? You're stuck with a bunch of developers poorly suited to solving those problems, and a bunch of infrastructure that wasn't designed with this sort of feature expansion in mind.
You're assuming that you would even have gotten the first customers if things were "done right". My experience says otherwise: I've seen entire businesses held together by 1 kloc of "academic" (I'm not saying this as a compliment) Python scripts, 20% of which were globals, and succeeding.
For how long? Until it needs a rewrite because it’s completely unmanageable. Yes I know, the rewrite cycle is part of the business now, but I don’t think it’s necessary. We can do better.
> > In fact my experience is that some colleges have the exact opposite problem: they teach students old tools and techniques
>
> That's lame
It's not lame, it's fine. It is just fine, and usually better, to teach basic concepts using a simple tool - even if it is not fashionable or up-to-date.
I think the parent's point was that some people teach using old tools that no one uses anymore, while teaching with the current-generation tools would be just as easy to teach, but would be much more useful.
Agree that teaching old techniques can be useful in the cases where the current techniques build on what came before.
This article is in the context of computer science education, though - not real world programming.
Undergrad CS students aren't learning software development first and foremost. They're learning logic and math, data structures and algorithms. They need a low friction environment so they can learn the academic stuff without wasting their (or their TA's) time.
So while I agree that many "real world" programming jobs require dealing with the kind of boring minutia that IDEs try to abstract, this stuff shouldn't necessarily be a prerequisite for students on a CS track. There's plenty of time for the real world later :)
> They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.
Interesting. In other fields, like surgery, narrow specialization is desirable.
You generally specialize after you have learned the basics. All oncologists are doctors, all architects that design very large buildings are capable of designing houses etc. It's seems obvious that the basics for a developer includes how a machine works.
A surgeon still spent many years studying medicine at a broad level before moving into training for that specialty, though.
And many developers I know do end up specializing in particular sub-fields (ML, databases, whatever), but they already have a wide breadth of knowledge and experience to serve as a foundation underneath that. In contrast, I also know some developers who didn't take that route, and jumped into specialization early, and who are completely useless at productionizing their work.
IDEs can be great, but in the hands of someone just learning they encourage a cargo cult style that becomes a barrier to actually knowing what you're doing. I've had to work with people who are still stuck cargo culting thing 15+ years into their careers. Not fun.
Relatedly, they can also help people get just enough success (in the cargo cult sense) to get a degree, when they would be much better off in a different field. It does them (and their future teammates) a disservice.
> I would not ever want to work with anyone from those types of teams. They are the kind of programmers who have a very narrow range of abilities, and whenever they see a problem outside those abilities, they get completely lost.
The thing is, they are the ones least likely to cause difficult-to-debug build issues, since they aren't the ones using "very clever" features or the latest features of build tools in the first place.
They're also the kinds of people who will use the same version of their tools long after they've stopped being supported, and then when you need one of those new features, or run into a problem that you can't work around, you have to drag all that stuff into the present, which is always a huge pain in the ass. And the kind of developers who got you in that situation in the first place will be the least well-suited to help get you out.
Regardless, I don't think "I don't want knowledgeable people because they might write something 'too clever' that causes problems in the future" is a particularly winning strategy. You're basically saying you want dumb people because they will never try to be too smart for their own good.
They are absolutely the ones to create hard-to-debug issues because they layered a whole bunch of complex things incorrectly or unknowingly.
For example, recently I spent more time than I'd like finding out that we linked two incompatible versions of a library. All the visible dependencies were correct, but at some point someone bundled an old version in a fat jar and that got included transitively via another dependency. If you understand the details you know why you don't distribute libraries as fat jars. But if you're just following some guide it's easier to fat jar it than correctly specify your dependency tree.
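To make the contrast concrete, here is a hedged Gradle sketch (all coordinates are made up for illustration): when libraries declare their dependencies instead of bundling them, the build tool can see and resolve the version conflict; a fat jar hides it entirely.

```groovy
// Hypothetical coordinates, for illustration only.
dependencies {
    implementation 'com.example:app-lib:2.0'     // pulls in libfoo 2.x
    implementation 'com.example:legacy-lib:1.4'  // pulls in libfoo 1.x
}
// With declared dependencies, `gradle dependencies` shows the conflict and
// Gradle resolves libfoo to a single version. But if legacy-lib ships as a
// fat jar with libfoo 1.x classes baked in, none of that machinery applies:
// both versions land on the classpath and the winner depends on ordering.
```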
If your education was really effective you recognize this is just the static/dynamic linking problem again in new clothes.
When I learned programming with Eclipse years ago, I spent so much time worrying about the magic behind the scenes that it left me with no confidence in my programming. When I finally learned to write my own makefiles and compile on my own, I went back to using IDEs and liked them. But you shouldn’t learn with them.
I pity the student who only knows how to hit the compile button in the IDE and not `javac`. That's a terrible thing to inflict on a beginner; everything will seem like a mystical black box to them, with no way forward when they encounter their first compilation problem.
I completely disagree. When I was a beginner programmer my only concern was “how do I make games?” I didn't really care about the game engine, the compilation process, or even the algorithms; my focus was on making a game. From middle school until early high school all my coding was in IDEs, and I never really tried to use the command line. It just wasn't my concern, and it was much easier to hit the green run button. Throughout that time I was learning all the basic concepts: how a program is executed, what control flow is, how to write good code, how the program even runs (I recall one time in middle school trying to figure out whether all the lines in a program run at once or in sequence), how to simplify booleans, how to debug, etc.
It was only later, once I was interested in web development, that I started picking up Linux, learning more deeply about the compilation process, learning algorithms, etc. For a beginner, all that stuff is just more roadblocks to building your interest and foundation. But eventually I started hitting a wall in web development and 3D game programming where I didn’t have the fundamentals, and that's when I started learning all the other skills besides just typing some code and hitting the green run button. The transition from beginner to intermediate programmer is where I would say one should learn the behind-the-scenes concepts, not while you’re still building an interest and foundation.
Also, an interesting aside: the programming environment I started on was called “Kids Programming Language”. It was a kind of simplified black-box environment that I'd compare to Flash (but it didn't cost money). Even today I prefer to introduce coding in such black-box environments, like processing.js, maybe because of that.
I am helping my son learn C++ for his school classes.
This shit that makes sense for us makes no fucking sense to a beginner.
So for the simple for loop for (int i = 0; i < something; i++) {} there is a large number of syntax errors a beginner can make: he can forget to open or close the (), he can forget the semicolons or put them in the wrong place, or he can forget the {}.
For sure I do not start by teaching my son about the C++ compiler and linker, though I do show him how the program runs in the IDE debugger and how changes update.
Low-level stuff is needed much later, and maybe only by some of the students; at the beginning it is too much junk that has to be learned like a parrot.
> For sure I do not start by teaching my son about the C++ compiler and linker
You could, or at least the tip of that iceberg. Compiling a simple program is often a simple one-line command. Perhaps you can show him how to automate this with a shell script or (gasp) a makefile.
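Concretely, such a makefile is about as small as they come; a minimal sketch, assuming a single-file program hello.cpp and g++ on the PATH:

```make
# Minimal sketch: one source file, one binary.
# (Recipe lines must start with a tab.)
CXX = g++
CXXFLAGS = -Wall -Wextra

hello: hello.cpp
	$(CXX) $(CXXFLAGS) -o hello hello.cpp

clean:
	rm -f hello
```

Running `make` rebuilds hello only when hello.cpp changed, which is itself a nice lesson in how build tools think.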
The reason why it is not "junk" is that it will teach him a bunch of important things: that you can "talk" to a computer using different languages, that those languages have different capabilities, different properties and different grammars (syntax), and that languages are all about grammar and semantics.
> Low level stuff [...] at the beginning is too much junk that needs to be learned like a parrot.
I see a problem with teaching there, rather than a problem with what to teach. Anytime you make people learn by heart something that needs to be understood first, there's a failure to teach.
It seems these days that we are trying to rush people who are not even comfortable with files and folders to "code". That's stupid in itself.
How about teaching them how to use a computer before trying to program it?
Is it not a saner approach? Even if they don't get programming at least they'll know a bit better how those things actually work.
(And why the hell does your son have to learn C++ as his first language when there are more beginner-friendly languages? That's another major failure, if true.)
It follows the general philosophy that a computer is just a tool that should be at people's command. But that approach only makes good customers: people who are helpless when problems come. Farmers had that sort of problem and had to fight for the right to repair their "tools".
The correct and actually beneficial philosophy is user-tool co-evolution, that is mutual adaptation. And for this to happen, the user has to really understand their tool.
> The reason why it is not "junk" is that it will teach him a bunch of important things: that you can "talk" to a computer using different languages, that those languages have different capabilities, different properties and different grammars (syntax), and that languages are all about grammar and semantics.
This would be a good point if the year were 2003. Compiling code and setting up a proper coding environment is a solved problem that isn't worth the headspace of a young and aspiring programmer. These days, you can open up the console in your browser and start writing and running code just like that.
What's far more important for a beginner is to learn algorithms and data structures, which are timeless concepts that are highly valued and can be applied to any language or environment (and to give OP credit, C++ is a fine language for learning these concepts). Recursion vs iteration, control flow, data structures, time-complexity -- gaining a mastery of these topics will enable the student to write programs that make it worth while to then learn the auxiliary technologies that can make their programs usable and sharable -- unix commands, compilers, linkers, etc -- learning these tools as a means to an end.
> The correct and actually beneficial philosophy is user-tool co-evolution, that is mutual adaptation. And for this to happen, the user has to really understand their tool.
I would argue that a lot of existing technologies such as unix are less so philosophy driven and more so backwards-compatibility driven where things exist purely for historic reasons. The idiosyncrasies of say unix can be a headache for a beginner, and it'd be a better use of their time to only interact with those technologies when absolutely necessary.
> This would be a good point if the year were 2003. Compiling code and setting up a proper coding environment is a solved problem that isn't worth the headspace of a young and aspiring programmer. These days, you can open up the console in your browser and start writing and running code just like that.
"Modernity" has nothing to do with it. In the old days you could switch on your family computer and start entering BASIC just like that, too. The only difference between then and now is that the leverage is much stronger.
Adding and multiplying matrices has been a solved problem for even longer, yet we teach students how to do it by hand. It's a solved problem, yet we expect them to understand the solution and be able to execute it.
I don't know why it shouldn't also be the case in CS, unless what you want is code monkeys [1].
> Recursion vs iteration
You cannot explain that properly without first explaining that compilers compile down to stack machines, routine calls, and jumps. If you skip that, you just end up saying "anyway, use iteration when your program mysteriously blows up".
And if you do suddenly go low-level like this, your students will discard your explanation and remember the earlier rule of thumb instead. That's one of the reasons some people bad-mouth us, saying we abuse the terms "science" and "engineering" in "computer science" and "software engineering".
CS 101 should always be in assembly language on some 8-bit microcontroller. If you don't do that, you are cutting corners.
> You cannot explain that properly without first explaining that compilers compile down to stack machines, routine calls, and jumps. If you skip that, you just end up saying "anyway, use iteration when your program mysteriously blows up"
Isn't this an implementation detail?
I mean, it depends on whether your course focuses on first-level programming (take some simple problem, then model and code it with if, for, while and some variables),
or whether you actually have a course about, say, C++ where you go into detail about why these older languages are designed the way they are.
Just to make things clear, I don't like "magic"; I too prefer to understand how the computer works from the logic gates up, though I am still missing some pieces in my knowledge. But what I noticed is that the introduction to programming is very hard. I remember wasting hours explaining things to a fellow student, and after all those hours he had the revelation: "so this computer is dumb, we have to tell it to do everything step by step, all the details...". So you have the problem of teaching students algorithms (for my son they use some C-like pseudocode), then you want them to start running those algorithms ASAP so they can get feedback from the machine, but the syntax is a big issue. It would not be efficient to also teach them to use the command line (Windows and Linux), run the compiler, troubleshoot when the command fails (maybe a wrong directory or paths), and then parse the syntax errors trying to find the actual mistake behind a sometimes unrelated message. With an IDE you get the red squiggly lines as you type, with hints on what is missing (before you write 10 new lines and don't know which one is the problematic one).
But I agree with you if the student is not in the first year of learning programming; and indeed, people who do computer science will study compilers, CPU architecture, data structures and lots of other stuff.
> Isn't this an implementation detail? I mean, it depends on whether your course focuses on first-level programming (take some simple problem, then model and code it with if, for, while and some variables), or whether you actually have a course about, say, C++ where you go into detail about why these older languages are designed the way they are.
It's an implementation detail as much as the fact that numbers are stored in a finite number of bits. Your iterative factorial won't blow up for 100!, but it gives an obviously wrong result, like a negative number.
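A minimal Java sketch of that wrong result (class and method names are illustrative): a `long` has 64 bits, so the iterative factorial silently wraps around instead of blowing up, while `BigInteger` computes the true value.

```java
import java.math.BigInteger;

// Fixed-width integers wrap around silently: 21! comes out negative,
// and 100! comes out 0 (it has 97 factors of two, enough to zero all
// 64 bits). BigInteger has no such limit.
public class FactorialDemo {
    static long factLong(int n) {
        long f = 1;
        for (int i = 2; i <= n; i++) f *= i; // overflows silently past 20!
        return f;
    }

    static BigInteger factBig(int n) {
        BigInteger f = BigInteger.ONE;
        for (int i = 2; i <= n; i++) f = f.multiply(BigInteger.valueOf(i));
        return f;
    }

    public static void main(String[] args) {
        System.out.println(factLong(20));  // 2432902008176640000, the last correct one
        System.out.println(factLong(21));  // negative: the wrap-around
        System.out.println(factLong(100)); // 0, with no error or warning
        System.out.println(factBig(100).bitLength() + " bits needed for the real 100!");
    }
}
```

Nothing in the language tells the student the result is wrong; only knowing about finite-width two's-complement arithmetic explains it.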
Don't blame C/C++; blaming the tool is generally a bad idea (now if your school teaches cooking using flamethrowers that's a different story...).
Many languages don't make promises about tail call optimization. A lot of languages don't have "big numbers" (as in Lisp) support - even "modern" ones. Very few languages will accept recursive data structure definitions without using pointers (or references if you want, but the label you put on this can of worms doesn't make much difference). Some will actually give the right-ish result because they use floating point numbers internally, but then you might have to explain why the hell 1/10 does not give exactly 0.1 (which is why, in some algorithms, you compare not-so-real numbers using <= instead of ==).
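The `1/10` point above can be shown in a few lines of Java (names are illustrative): `0.1` has no exact binary representation, so exact equality on doubles is fragile and a tolerance-based comparison is the usual workaround.

```java
// Why you compare not-so-real numbers with a tolerance instead of ==:
// 0.1 and 0.2 are binary approximations, and their error accumulates.
public class FloatCompare {
    static boolean nearlyEqual(double a, double b, double eps) {
        return Math.abs(a - b) <= eps;
    }

    public static void main(String[] args) {
        double sum = 0.1 + 0.2;
        System.out.println(sum);                         // 0.30000000000000004
        System.out.println(sum == 0.3);                  // false
        System.out.println(nearlyEqual(sum, 0.3, 1e-9)); // true
    }
}
```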
You cannot escape the fact that computers are mechanical machines, not abstract math machines (unless you load a program that does precisely that). Part of "the Art of Programming" is about overcoming that limitation.
Abstractions are lies by definition. Lies are sometimes a necessary evil, but abstraction abuse is a poison.
I’m neutral on whether knowing how to compile from terminal is essential or not, but I disagree with this:
> Compiling code and setting up a proper coding environment is a solved problem that isn't worth the headspace of a young and aspiring programmer. In the modern day, you can open up the console on your browser and start writing and running code just like that.
No one should get into writing more complex programs without understanding the basics of the tooling; at most we have a few auto-generators that will create a skeleton, which can and will break with even the slightest change.
In Romanian high schools they learn C++, and maybe some high schools might still teach Pascal. This is the reality. Before high school I helped him with some Lua scripting in Garry's Mod, so he is already a bit familiar with concepts such as variables, ifs, comparisons etc.
>The reason why it is not "junk" is that it will teach him a bunch of important things: that you can "talk" to a computer using different languages, that those languages have different capabilities, different properties and different grammars (syntax), and that languages are all about grammar and semantics.
My point is not that he should not know about the low level stuff eventually, but keep that for the second or third year.
But in the end, the reality is that we write text and some tool makes it run and print stuff on screen. It is not important that everyone understands that the compiler has 3 parts/stages and that the linker searches for stuff in some weird paths using some weird algorithm... For the first years of teaching, I think it's OK to focus on understanding the fundamentals and not the tools; we would waste our time teaching them about the C++ linker or the Java classpath if they will use Python or JS in the real world.
I identified 2 hard parts:
1. Creating the solution for the problem, like say "add all numbers from 1 to n". My son can't invent the solution; I have to work through similar problems with him over and over again until things connect (it's the same for math and physics, we repeat the same type of problem until it makes sense and he can solve it himself).
2. The syntax. I can work with him, make some simple programs, and the next week he fails to use "cout" and "cin" properly. This means I failed to notice that he did not properly understand how this crap works, and I have to do more examples with cout and cin too.
I think you hit the nail on the head. For a beginner, learning the language is so much more important than the compiler, and there are only so many things you can pay attention to at a time. When you're starting out, it's very important to learn only the minimal subset needed. Later you can expand if you need to, but I have been programming for close to 2 decades and I can count on one hand how many times I had to call a compiler directly. I didn't code much Java though, so maybe there it's different?
I have been programming for 3 decades, the last 2 as a day job. It seems to me that there is a deep cultural chasm around the turn of the century. Before 1990 you had next to no Internet, and computers shipped with book-sized printed manuals that tried to explain how the thing worked, not just how to operate it. After 2000 they began to come with a single sheet showing how to plug them in, and a www address for perverts who want to download manuals.
However it's not entirely a matter of when you started to touch the keyboard. Around me I see "people who understand" and "people who use" of various ages. Guess who helps who more often.
The point we are discussing is teaching in the first years. I agree that you don't want to hire a C++ programmer who does not know what "linking" means, or a Java developer who does not know what the classpath is, or a JS developer who does not know that you can leak memory in JS too and should sometimes clean up manually.
In my experience the people who are helping are the ones who use tools that let them work more efficiently. They have more time to learn new things, while the ones who use less efficient tools spend much of their mental effort on the tools themselves.
I’ve been programming in Java for about 10 years now. I would consider myself an expert - I’m not just some guy who built CRUD apps, I’ve gotten messy with Java. I came up programming for Minecraft servers - lots of interesting challenges there.
The only time I have ever used javac was for fun. Not once professionally.
I think I'm in your camp. When you're a programmer, there's a lot of things you rely on that have been done by someone else and you don't really understand, neatly packaged.
There's not much point in teaching specific things, because there are thousands of specific things that make a program work. Yesterday my code failed because the whitelist at the server was checking my ipv6 address instead of the ipv4 on the website interface. The day before I was stumped on a Rust borrow checker thing.
You end up learning the things that you run into. If you've never needed compiler flags, it's actually not that easy to learn without a motivation. Same goes with memory barriers, garbage collection, and so on.
There are just so many rabbit holes. Telling people they need to know this or that is futile; what they need to know is how to figure out whatever their issue is. Some common tools, some comments about their pros and cons, warnings about where the caves are and what's more trivial than it looks.
My wife is starting a CS degree, and I'll be looking over her shoulder providing this kind of advice, rather than recommending anything in particular.
It's not that you need to use it. It's that you need to understand the classpath and classloaders to solve some problems. And if you always use IDE automation, there's no incentive until it's too late. Stuff is autoimported and autoinstalled. There are dozens of places where you can add paths to the classpath depending on which button you press to run stuff. It's all magic when it works, and when it doesn't work it's random changes in random places to fix it. There are many more variables than if you just run the compiler from the command line.
Good education should give you an understanding of the basics, not just the higher-level stuff. It's easier to learn high-level stuff when you've got the basics down than to unlearn the "magic black box" approach and dive into the basics when you need them.
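As a small sketch of the classloader point (class name is illustrative): every loaded class can tell you which loader resolved it, which is often the first question to ask when "magic" imports break. The `getClassLoader()` behavior shown here is standard: bootstrap-loaded classes like `String` report `null`.

```java
// Asking the JVM who loaded what: the starting point for debugging
// classpath and classloader problems that IDE automation hides.
public class LoaderDemo {
    public static void main(String[] args) {
        // Core classes come from the bootstrap loader, reported as null.
        System.out.println("String loaded by: " + String.class.getClassLoader());
        // Application classes come from the application classloader,
        // which searches the classpath entries in order.
        System.out.println("LoaderDemo loaded by: " + LoaderDemo.class.getClassLoader());
    }
}
```

When two copies of the "same" class are on the classpath, printing who loaded which one usually resolves the mystery faster than random changes in random places.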
> And if you always use IDE automation there's no incentive until it's too late.
Too late for what, though? At any point in time there's a certain subset of knowledge that's most relevant to what you're trying to do. If that's learning overall programming concepts, not the internals of a language which you may or may not use, then NOT using the IDE automation would be detrimental.
Furthermore, in every single project that i've worked on professionally, attempting to manually manage dependencies and the way things are compiled would earn me odd looks, since that's far more tedious and less productive than just using Maven/Gradle. Not only that, but manually attempting to manage imports as opposed to letting my IDE do everything that's needed in the background would provide me precisely 0 benefit.
In my eyes, "It just works", isn't a bad thing at all, as long as you:
- can look into the lower level mechanisms (but only once actually needed)
- can find information about all of it, to make the above easier
Why would you "unlearn" something if you'll still use it in 95% of the cases, the lower level approaches being the outlier?
> in every single project that i've worked on professionally, attempting to manually manage dependencies and the way things are compiled would earn me odd looks, since that's far more tedious and less productive than just using Maven/Gradle
Of course, we're not talking about doing javac manually when you're working, we're talking about learning the basics before going on to use the automation. BTW Maven and Gradle have plenty of gotchas of their own. If you don't understand what exactly happens you won't solve "magical" problems like old builds of non-snapshot versions remaining in the repo for example.
> manually attempting to manage imports as opposed to letting my IDE do everything that's needed in the background would provide me precisely 0 benefit.
you say that, and just last month I've had to help a guy manually remove an import of Token class from a wrong package that "magically broke his code" :)
> Why would you "unlearn" something if you'll still use it in 95% of the cases, the lower level approaches being the outlier?
Because I've seen a lot of people decide something that always worked is not their responsibility to fix when it doesn't work. That's how you end up with 1 guy in the company that understands the build system :)
> you say that, and just last month I've had to help a guy manually remove an import of Token class from a wrong package that "magically broke his code" :)
Hmm, the IDE should offer you multiple classes to import from in a dropdown if there's any ambiguity, as well as highlight any of the methods that aren't present or whose signatures differ and provide popups with essentially the same information that you'd get from the compiler, without even having to compile the project. In that regard, i'd say that IDEs are actually superior. A single ctrl+click that'd take you to the source of the class that you've imported should also clear up any confusion about the import.
Importing the wrong thing and not bothering to figure out why their code doesn't work honestly feels like not being familiar with the language itself (as opposed to the tooling) and for some reason not trying to figure it out on their own by reading what the IDE is telling them - to me, that does sound more like learnt helplessness than anything else, which would also manifest itself when writing code and running into design patterns that they don't understand, as well as any number of other situations.
> Because I've seen a lot of people decide something that always worked is not their responsibility to fix when it doesn't work. That's how you end up with 1 guy in the company that understands the build system :)
To me, this also feels like more of an organizational issue: if companies don't have a RACI matrix (https://en.wikipedia.org/wiki/Responsibility_assignment_matr...) and have a low bus factor (https://en.wikipedia.org/wiki/Bus_factor), then that's on them. If no one takes initiative and the company fails to find employees who will, that's also on them. If only one person understands how things work and doesn't document it, nor are there processes in place to ensure any sort of knowledge transfer, then that's also on them.
That does sound like a dysfunctional environment and at that point builds will be the least of your worries - figuring out how to get code to prod, how to work with the DB and run tests (if any are present), as well as any other steps that haven't been formalized with an "Infrastructure as Code" approach will cause far worse problems down the road. Good luck if any of your servers ever go down and you haven't used something like Ansible for configuring them, but instead it was done N years ago by someone who's long gone along with their knowledge about how to do that.
If you work in such an environment and care about personal success, consider either changing the company or changing companies.
> Importing the wrong thing and not bothering to figure out why their code doesn't work honestly feels like not being familiar with the language itself (as opposed to the tooling) and for some reason not trying to figure it out on their own by reading what the IDE is telling them - to me, that does sound more like learnt helplessness than anything else
To be fair they have 30 files open in the IDE and 100 lines are highlighted cause of spelling errors, it's easy to miss the line highlighted in red as opposed to yellow or black. If you look at the console output you will know the reason build failed immediately, but that's hidden behind 5 different things begging for your attention.
It is learned helplessness, but there's a reason people learn it - they went straight for the industrial tools with dozens of variables that were never explained. They never learnt how it works under the hood and what all the knobs do. They were taught to follow the tutorial, ignore the warnings, leave the defaults as is and click the big green triangle to run the program. 99% of the time it works and when it doesn't maybe start from the beginning again and follow the tutorial more carefully :)
It's a generalization of course, there are people who problem solve better and worse in any demographic, but in my opinion the tendency is - depending on tools and automation early in your education makes you a worse problem-solver.
> If no one takes initiative and the company fails to find employees who will, that's also on them. If only one person understands how things work and doesn't document it, nor are there processes in place to ensure any sort of knowledge transfer, then that's also on them.
There's organizational goals and processes, and there's company culture. The latter is far more important in practice. And it's mostly shaped by the attitudes and expectations of people, not by putting a document on company's wiki or talking about it in a meeting.
> To be fair they have 30 files open in the IDE and 100 lines are highlighted cause of spelling errors, it's easy to miss the line highlighted in red as opposed to yellow or black. If you look at the console output you will know the reason build failed immediately, but that's hidden behind 5 different things begging for your attention.
Then why not address those things first? Most IDEs should allow you to set up inspection profiles or customize the spellchecker and version this configuration so all developers would get a consistent and noise free experience, just like you should be doing anyways when not using an IDE but instead using a code editor together with a linter/formatter and Git hooks.
JetBrains products are actually pretty good in that regard; personally i've sometimes enabled almost all of the warnings, apart from some of the conflicting ones, just to learn about concerns that i would otherwise not have thought about myself. Not something most people should do when they just want to get things done, but being able to is nice.
> They were taught to follow the tutorial, ignore the warnings, leave the defaults as is and click the big green triangle to run the program.
That sounds like that joke about everyone ignoring 99 warnings within their project because the code compiles and runs. Once again, i do believe that this is a bit orthogonal to the IDE vs code editor debate, because if there are warnings within your project you should address them regardless of the tools that you use. If it's a bad tutorial, then why are you using it?
Of course, one can also talk about how defaults should be sensible and the default configurations/examples should never have warnings, but i guess that's just what you get when people don't pay attention to the quality of the things that they make, which is at the very least understandable in our current world.
> It's a generalization of course, there are people who problem solve better and worse in any demographic, but in my opinion the tendency is - depending on tools and automation early in your education makes you a worse problem-solver.
This is an interesting argument. I don't necessarily agree that the automation is the cause for this, merely a canary of sorts. Perhaps the people who don't have the patience to struggle with the "traditional/old" way of doing things would simply not stick around in the industry for long (this probably applies to situations where they'd have to write ASM instead of Python, as an example) due to the frustration they'd experience with all of it and not getting any demonstrable results early on, which is extremely discouraging, as opposed to the more patient and persistent people? In that regard, IDEs indeed enable a wider variety of people to stay within the industry.
I cannot comment on whether that's a bad thing or a good thing, much like some people said that we'd not always have calculators with us and yet almost everyone does have a smartphone in their pocket.
> There's organizational goals and processes, and there's company culture. The latter is far more important in practice. And it's mostly shaped by the attitudes and expectations of people, not by putting a document on company's wiki or talking about it in a meeting.
This is debatable. Those two are not mutually exclusive. Look for environments where both live up to your standards and contribute to both in a positive way.
I certainly have, for example, in one of our projects now all of the configuration lives in Git and is managed through Ansible, so we know when, why and who changed things. I've also made running services more consistent, have containerized apps to get rid of the dependency hell and have written plenty of scripts for automating the bits where people made mistakes, e.g. long DB migration names that need a particular format like V20211107.5.4.2.21.1.0_Alter_Some_Table_Do_Something
At the same time i've also onboarded new people, helped them get started, and explained things about both the projects and the industry in general, as well as pointed them towards useful learning resources. Being pleasant to work with and having a healthy environment doesn't need to come at the expense of anything else, apart from maybe people's egos sometimes (including mine).
Of course, no one wants to do things for just putting a checkbox in some corporate form, but in practice most of the meaningful ways to minimize risks and make people's lives easier down the line are worth it and there are no excuses not to implement them in any mature company. That's why having documentation and enforcing that it's present is a good idea, especially if you do IaC and most of it is actually code, that's commented as well as the rest of your codebase should be.
You realize the IDE relies on the compiler to do all that magic stuff, right? So when you get to the point where you have a build error because some odd/unexpected import/target all of a sudden everything in the project is an error and the IDE is suddenly not so helpful.
> So when you get to the point where you have a build error because some odd/unexpected import/target all of a sudden everything in the project is an error and the IDE is suddenly not so helpful.
This has not been my experience, at least with IntelliJ in the context of Java projects. Bad imports result in IDE highlighting the erroneous usages of the imported class, by showing which methods aren't available. The red squiggles provide immediate feedback and let you know whether any problem lies within the method parameters, or whether the method itself isn't found, before you try to compile/run anything manually. Tooltips about the error contents are also pretty useful, no need to manually go to the file/line based on some text output in a terminal.
Bad language targets/JDK choices are far "louder" as far as the amount of error output goes, but they're easy to figure out: if you ignore everything being red in the IDE and instead try to compile the project, you'll get the exact same output that the compiler generates. It's just a matter of Googling those errors, regardless of how you compile your projects.
The only problems of such sort that i've run into have all been because of the Spring framework and its configuration mappings not being detected properly because some developer decided to initialize it in a non-standard way which confuses the IDE. Then again, using a code editor instead of an IDE wouldn't really provide any useful functionality/hints for Spring either, so i guess that ended up being roughly equivalent.
I should preface that my experience primarily comes from Scala and dotnet, I don't really use Java.
Sure, you can play Google whack-a-mole, but a lot of the time the compiler message alone isn't going to help you, and when you Google the message there are 20 questions on SO where the top answer has 1000 votes and says something like "delete the bin folder and restart".
But that's not the issue. The issue is you're importing two projects that target different dotnet frameworks.
The dependency project builds fine on its own.
The project using the dependency just loudly proclaims that the imported type doesn't exist.
Rider gives you a few squiggly lines and tells you "Computer says no"
You yourself have to figure out that one of the projects has changed its target framework while the other one hasn't.
I spent about 3 years in the 90s messing with Java code. I would not dream of going through all those giant piles of enterprise code without an IDE.
I see no point in wasting time. I learned programming on my own, starting with machine code. But I grabbed every tool that could help (including IDEs) as soon as it appeared. Programming to me was / is a tool I use to make products. I see no point in suffering for the heck of it. Instead of fucking with a 10-mile-long command line and remembering every squiggly thingy, I think it is better to explain to students how computers really work.
It's nice to learn how tools work internally, so you can fix them when they break. Once you know how to do that, it's fine to graduate into something that does the work for you.
I have yet to write Java bytecode manually, but a high-level understanding of it does make me a better developer, in my opinion. It’s not about the number of times one uses it, but about knowing the underlying abstraction.
Yes, what a pity. They will have to read the compilation error in the IDE's "compile" window instead of the console. Or just see the red squiggly lines in the code. Apparently that is a problem for some reason.
And you know what happens when people use a black box and that stops working? They start learning how it works then. When it has actually any relevance for them. Until then it's just a more complicated way of doing things that doesn't benefit them in any way whatsoever.
> people use a black box and that stops working? They start learning how it works then.
No, they don't. Because they don't even know where to start. They never learned anything outside of the compile button and the project settings.
I know because I was in that boat once. Nothing ever boosted my programming skills more than learning the basics or another layer of what's actually going on behind the abstractions.
You also need to consider hardware. School computers are not very good in general. The one I had to use for Eclipse took at least 2 minutes to start the IDE, until it reached the point of actually reacting to mouse clicks. You can guess how fun the rest was. But I didn't even consider using a lightweight tool back then. Why? Because the only editor I knew was Notepad, and nobody would use that voluntarily. And nobody taught anything other than fat IDEs. There was a time when I'd been programming for over a year and could not code in languages like C, because the program didn't have a Compile button. It was confusing and I gave up.
You underestimate how bad an IDE-only approach to teaching programming is. It comes with batteries but also learned helplessness included.
>No, they don't. Because they don't even know where to start. They never learned anything outside of...
This is fixable though, instead of just telling them to use the tools you can teach students how to use them well. It also doesn't hurt to inculcate a mindset that shows them how to look for help when they are stuck. Discarding the IDE entirely is like throwing the baby out with the bathwater in my opinion.
I'm not saying they should ignore IDEs entirely. But teaching everything in the context of IDEs is limiting; it hides a lot of things that students don't even know exist. I went through 5+ years of education without knowing what e.g. the classpath is. Guess how much fun it was to fix classpath issues for the first time. One single lesson outside the IDE would have filled this gap.
> instead of just telling them to use the tools you can teach students how to use them well
Yes, that would be great. If only I'd ever met a teacher able and motivated enough to do that. Coincidentally, it's easier to teach how to run a couple of CLI commands and read the documentation than to explain a behemoth that includes everything from version control to debuggers, all thrown in your face at once.
> One single lesson outside the IDE would have filled this gap.
Hmm, what would that lesson look like? I doubt there is a way to teach everything about out-of-IDE/advanced Java development in just an hour or two. You mentioned CLASSPATH, but there are also JARs with manifest files, a multitude of build systems with their own configuration languages/plugins/conventions, ClassLoaders, Java Compiler Plugins... Just too much.
You either list terms for two hours straight without any details or exercises, or you choose some semi-random subset of topics, which may very well not include CLASSPATH. Alternatively, you teach for ten hours instead of two.
This part stays intact either way though. You're not wrong that teaching students to run a couple of CLI commands is easier, but they still need to understand what they're doing, otherwise it can be just about as opaque as clicking a button in a graphical program.
Considering the vast majority of them will be using IDEs if they are coding in Java professionally, personally I feel it is worth taking some extra time to at least give them a starting-off point and then letting them access the documentation that the IDEs themselves provide. Whether this comes after teaching them the basics of the language through the barebones text editor + CLI tools approach or is what they are taught from the ground up is a matter of choice.
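As a concrete sketch, the classpath part of such a lesson could be as small as this (class name and commands are illustrative, not a curriculum): compile and run one class by hand, then have the program print the classpath the JVM actually resolved, instead of whatever the IDE configured invisibly. The `java.class.path` system property is the standard way to read it.

```java
// Lesson sketch: compile and run without the IDE, then inspect
// what the JVM actually searched.
//   javac WhereAmI.java
//   java -cp . WhereAmI
public class WhereAmI {
    public static void main(String[] args) {
        // The effective classpath: directories and JARs searched for
        // classes, in order. Wrong entries here are the usual cause of
        // NoClassDefFoundError / ClassNotFoundException.
        System.out.println("classpath = " + System.getProperty("java.class.path"));
        System.out.println("java.version = " + System.getProperty("java.version"));
    }
}
```

Re-running it with different `-cp` values makes the cause-and-effect visible in a way no project-settings dialog does.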
So you are saying they don't know where to start, and as an example of someone not knowing where to start you present... yourself, doing the exact thing you claim they can't do.
> yourself, doing the exact thing you claim they can't do
I'm confused. Where did I describe doing something as example of not being able to do it? Maybe I worded something wrong.
My point was just that I was completely lost and confused in many areas for years until I went out of my way to learn things properly, despite everyone teaching me to not do it like that.
"No, they don't. Because they don't even know where to start. They never learned anything outside of the compile button and the project settings. I know because I was in that boat once."
The original statement was that people can learn about lower-level details when stuff breaks. You quoted this statement, and then add the claim they can't learn [when stuff breaks], "because they don't even know where to start". You support that by adding personal experience: "I know because I was in that boat once".
However, for other people in that proverbial boat, it seems to me that they can do the same thing that you apparently did, which is to learn what they need to know - something you deny is possible: "no, they don't."
Also, since we are quibbling about text interpretation, I'm fairly confident that nobody is _teaching_ you not to use a non-IDE approach to development. Maybe they are advising it (on the basis of IDEs being more approachable to a student) or demanding it (on the basis of an entire team working with the same tools), but _teaching_? I don't think that's right.
That ship sailed long ago. There’s honestly no need for majority of programmers these days to go cli, vim/emacs way or whatever the one true way is these days. It’s just not needed.
And the ones who need it for their work learn it, know it, and do it.
Or for the fun and satisfaction of maintaining, running, and keep oiling something antique and/or analogue.
Jokes aside, it’s a decent part of my day job and never faced any problem. Neither do my younger colleagues who never ever were even exposed to those esoteric programming thingies.
The problem is the belief that the CLI, or mildly esoteric CS practices and skills, have to be some kind of mandatory initiation.
> There’s honestly no need for majority of programmers these days to go cli
I never did CS classes (I taught a couple once).
I agree that you don't need to learn CLI in the first year.
The idea that the majority of "programmers" can tackle production code without ever being introduced to a CLI is terrifying, though. I'm astonished to learn that some people think you can get a Computer Science degree without learning processor architecture, IO, operating systems and so on.
FWIW, I hate git. It's the devil to learn, and it solves a load of problems I've never had. I wouldn't inflict it on a first-year student.
Of course you need to know what a file is to use git; but you need to know what a file is to use any VCS. I don't see how you can teach someone programming before they know what a file is.
Is a CS course now little more than a way to get an introductory programming qualification?
Everything does seem like a mystical black box with no way forward when you're just getting started. Does not matter if it's GUI, CLI, IDE or a raw compiler.
What is `SyntaxError: invalid syntax` for `print(hello world)`? What is `NameError: name 'Print' is not defined`? I don't have any names, I just want to show a string.
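Both of the quoted errors are real Python behaviour and easy to reproduce. A minimal sketch (the exact message wording varies by Python version):

```python
# 1) Missing quotes: the parser rejects the source before anything runs.
try:
    compile("print(hello world)", "<example>", "exec")
except SyntaxError as e:
    print(type(e).__name__)  # prints: SyntaxError

# 2) Capitalised name: Python looks up a name 'Print' that was never defined.
try:
    eval("Print('hello world')")
except NameError as e:
    print(e)  # e.g. name 'Print' is not defined
```

To a beginner, neither message obviously says "you forgot the quotes" or "Python is case-sensitive", which is exactly the point being made.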
The only differences are: a) amount of new things to learn (e.g. students already know how to use GUI, but learning CLI is a whole separate skill); b) quality of the tool (e.g. how relevant/readable error messages and configuration are); c) your ability to get help with the specific tool you're using.
You have to stop digging into details somewhere. You can always go deeper, if you want.
In my opinion, it's perfectly reasonable to learn one thing at a time. You can start with learning CLI and then learn Java using javac. You can also start with learning some IDE and its build facilities and use Java completely inside it.
> What sort of compilation problem would that be that's only solvable by manually running javac outside your IDE?
Figuring out compilation errors when the IDE's configuration and the system environment variables hold different values. Notably, the resolution of classpaths.
All these things are exposed by the IDE if you are willing to look for them. This kind of purism is pointless in my opinion. I did start off with just vi and javac and never complained, but I feel like it's just a matter of knowing the tools you use, even if some of them are an abstraction over the barebones stuff.
We're talking about how you should learn to use the tools you are using.
Talking about implicitly learning from your IDE is a lot like learning to program with React versus the browser console (or a sandbox). I'm not sure how 'purism' applies.
> We're talking about how you should learn to use the tools you are using.
I don't think there's any difference between using javac and using an IDE. They're both tools that you configure and then press a button to run. What's the difference?
And the danger with saying "you should understand it at the lower level" is that anyone can challenge whether you understand the tools you use at a lower level, and at some point you won't, because nobody understands it all, and then they can point and laugh at you just as you're pointing and laughing at others.
An IDE and a command line are just different interfaces to the same thing, the compiler. One is not somehow superior to the other, in principle they are interchangeable.
I'm not saying it has to be implicit. If we are talking about a student environment there's no reason why educators can't do a deep dive to explicitly learn the features of the tools they are using.
I'm viscerally in this camp. I just cannot fathom using a black box. I went to college to engineer, not to reuse, at least not blindly. And tools like Eclipse have just too many layers and principle-less helpers. Even as an intern, when we had to make tiny Eclipse wizards (which didn't add anything of value, just cute Visual Studio-style helpers to set up some files), I was physically unhappy, because it's the opposite of how I'm wired. Whenever something doesn't go your way, you're back to square one, having to understand Java, javac, the JVM AND the Eclipse stack and logic on top.
Do you have any clue how many layers there are below the java/javac layer? What made you decide that that particular layer is where it's at, and that just one layer up everything is bad?
Good point. Somehow the compiler is the critical tool; for me, Eclipse failed to provide value while getting in the way of my work too early. I was happier with a makefile most of the time, because I had the information in short form in front of me. With the JVM you have a spec, and it's somehow meaningful; the Eclipse APIs were quite horrendous, and if you need to go that low to understand why your project fails to run, it's not a good tool.
I completely understand that. I'm not a Java programmer; my experience is limited to fixing one tiny bug, in one Java program, and I struggled a bit with Eclipse before realising that building from the command line was also possible and pretty simple.
But I do most of my work in Visual Studio, which I prefer over vim+make+gdb because a powerful IDE does give you a significant advantage over a 'mere' text editor. I also find that working in vim imposes a much higher mental load than doing the same in MSVC (and don't get me started on vi without the 'm'...).
I have no conclusion on the matter to be honest. Like many I wasted time trying to setup emacs for language L and had a broken support system (especially regarding debugging). But I often felt that IDEs brought as much cost as value. And up until very recently, linting and ergonomics were below what I'd got with emacs (and if you've seen the regular posts about magit you have an idea of what I mean).
I pity the HN commenter who only knows how to hit the "power on" button on his computer and not how the CPU will actually power up and start running instructions.
The whole reason computers work at all is because of the many many layers of abstraction. At some point everything seems like a mystical black box to you too.
Sure it might be at a lower level. But there's always a point where you get to "I just do this and it does what it says it should. I don't understand how it works." Do you feel like a terrible thing has been inflicted on you because you don't know how the Linux kernel works internally, or how `idiv` is implemented, or which SRAM cells your CPU uses?
You have to stop somewhere, and running javac manually is something almost no Java developers do so learning it definitely seems like a terrible thing to inflict on beginners.
Hard to disagree with that! Though it does have some redeeming features: it's statically typed, runs pretty fast, has absolute rock solid code intelligence (the best of any language in my experience) and a ton of learning resources.
You have written exactly what I think about this, but more clearly than I could write it down.
Every rebuttal and complaint against this seems to centre around how the moment someone touches an IDE they become a gibbering idiot who is unable to ever learn anything else, as if it is impossible that the next course could be about how the programs they've been writing are all processed and put together by the tool chain.
I once spent a very frustrating and miserable day in the computer labs at university unable to make a program work, and got zero points on that homework, because of a missing bracket. Did I learn anything about programming? Or how computers work? Did the application of anything I'd previously learned about computer science or programming work? No. I learned nothing. I had no time for that, because none of it was relevant. I'd simply missed a bracket very early on in the program, wasn't experienced enough to see it, and that was that.
I did learn something critical though, later on, from a friend: "here, use this NetBeans program and turn on 'syntax highlighting' and check out how it points out 'syntax errors' just like how a word processor highlights spelling mistakes". And from then on I could actually learn programming, and later on that semester we learned about javac and putting stuff together on the command line and big O notation, and all the rest of that good stuff that people have moral panics about not starting with.
It's like everyone's decided to have absolutely zero concept of time, or parallelism, or that students spend more than 15 minutes in their entire life studying before being ejected into the workforce to spam tech debt everywhere. They're taking multiple courses in parallel per semester, followed by more semesters of more courses, followed by more years or more semesters. There's time to learn more than one thing, and space to not have to learn it all in the same place, and a student studying one thing at a time when they're starting from zero is not mutually exclusive with them becoming a skilled programmer who enters the workforce and writes quality code to create useful programs.
The first pass: Something pure like Structure and Interpretation of Computer Programs, but more approachable for freshmen. And a more beginner-friendly language like Python is preferable. The key is to: 1) Focus on the idea of computation without any distractions. 2) Be less scary. I remember how I gave up C++ several times when I tried programming all by myself as a kid. A decade later I decided to try again, and it was fun, because it was JavaScript and I could make useful stuff from day 1.
The second pass: Something like Computer Systems: A Programmer's Perspective. It goes down the implementation rabbit hole, doesn't skip the essential details and tools, and equips students to deal with real-world problems.
I took a look, because I thought: "Do they even touch on recursion and all the things that SICP does using Scheme, which are usually inelegant or cumbersome in Python? Let's check recursion."
They do have the examples at https://composingprograms.com/pages/17-recursive-functions.h... but nowhere on the whole page is the word "overflow" or "stack" even mentioned. Almost as if one does not want to mention that recursion is a bad idea in Python whenever you do not know the recursion depth in advance, because of the limitations of Python itself. So the book teaches an approach that often might not be a viable solution, simply ignoring the problems of the chosen tool. On the surface the book looks like good teaching material, but these problems need to be stated, so that beginners know about them.
Don't get me wrong. This is already a much better learning experience than with most C/C++ tutorials, but it definitely leaves things to be improved.
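The limitation in question is easy to demonstrate: CPython caps the call stack (the default limit is commonly 1000 frames), so unbounded recursion raises `RecursionError` rather than recursing indefinitely. A quick sketch:

```python
import sys

def depth(n):
    # Naive unbounded recursion: each call adds a Python stack frame.
    return 1 + depth(n + 1)

print(sys.getrecursionlimit())  # the frame cap, commonly 1000 by default

try:
    depth(0)
except RecursionError:
    print("maximum recursion depth exceeded")
```

This is exactly the caveat a Scheme-flavoured curriculum should flag when it is ported to Python, since Python does not do tail-call optimisation.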
This post resonates with me: I never really learnt much about javac and haven't needed it in the ~5 years I've been working with Java daily. For the most part, working with Ant/Maven/Gradle is enough and is actually more important to know, whereas javac errors during a build can mostly be addressed with a quick StackOverflow search.
Knowledge about the lower level stuff is probably nice to have, yet not really needed to be productive in a professional capacity within the industry, unless you're not using any sort of a build system (a red flag, if you expect to manage dependencies easily), or are doing something more advanced and specific. It's like using a Linux distro daily but not really knowing how to or caring for compiling your own software - when the Wi-Fi drivers fail to work in new hardware that'll probably become relevant, but otherwise not necessarily.
However, I'd also like to go in the opposite direction and warn against the dangers of not introducing students to higher-level tools like IDEs and linters properly. I was one of the unfortunate few who were also taught Java in university and had to write it on paper, as well as in Eclipse/NetBeans. The problem was that none of the staff ever introduced us to the actual capabilities of the IDE, or told us about alternatives like the JetBrains products, or talked about readability or refactoring at all (neither in Bachelor's nor Master's courses in CS).
Thus, I saw some students write code without readable indentation, comments or understandable variable names, treating the entire Java language like some pseudo-assembler and happily ignoring any and all of the IDE's suggestions to fix errors or improve the way their code was written. Not only that, but they were also not familiar with any of the refactoring tools, even basic ones like extracting a variable or a method, or renaming variables, or even code generation capabilities like getters/setters, equals/hashCode, toString, constructors/builders and so on.
To them, these IDEs were just like using Vim in a very basic capacity: knowing how to enter insert mode and not much more, thus gaining no actual benefit from the tool in the first place. Actually working in the industry would probably teach this to them, though, given the necessity of following code style guides and refactoring code in legacy codebases.
Strongly disagree with "full teams can and have built huge java projects just by using the intellij project wizard and maybe a tiny bit of Gradle from stack overflow." I wouldn't want to work with someone who doesn't understand how build systems work. Specifics are irrelevant but knowing how to work with one build system means you don't have some vague textbook picture of how it works but you actually had to see it how it glues together. IDEs are nowhere near good enough to abstract that complexity away and it's a big part of day to day work.
> I don’t agree with this perspective: why do entry-level students have to understand the internals behind compiling their code, and weird edge-cases like misspelling import?
TFA doesn’t say that. It says to teach entry level students about control structures and such, and then, after two semesters or so, to teach them about those internals.
I think the key problem here is that most people here are programmers and have forgotten what it means to NOT be a programmer.
Moreover, most are assuming that everyone who is taught programming is going to become a professional programmer, as if a class success rate of 100% were a given.
But unfortunately, no. Most intro-to-programming courses are dealing with the issue of how to get these people over the "hump" so they don't quit, and nothing more.
I disagree but from a different angle than the other person: IME a major barrier to students being able to complete assignments was basic mastery of syntax. Now, you might think, but of course, they're students.
But I saw this as the main barrier at ALL levels and in all classes. People who could understand algorithms and linear algebra being unable to write a counting for loop that iterates over an array without looking it up and carefully copy pasting it.
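For concreteness, the kind of "counting for loop over an array" in question is nothing exotic. Sketched here in Python rather than the Java the thread is about, purely for brevity:

```python
# A counting loop over an array: the sort of basic construct the
# comment says students could not write without looking it up.
values = [3, 1, 4, 1, 5]
total = 0
for i in range(len(values)):
    total += values[i]
print(total)  # → 14
```

The claim is that this level of syntax fluency has to be automatic before algorithmic thinking can get any traction.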
There's just an innate familiarity with the text of your program, and how it translates to an AST, that you MUST command. And having tools that can "did you mean" auto-fix things doesn't help, especially with imports, when almost all languages have non-trivial import systems.
I obviously encourage teaching modern IDEs but I think we would do students a disservice if we didn't make them use a fairly plain text editor for most learning. I started with BlueJ which provided a "compile" button and syntax highlighting and that was about it, and I was then later (after having stopped using bluej of course) confused by people who were using eclipse but couldn't get their imports working right.
Maybe that's my main point actually: none of these tools are good enough that they let you actually not understand the underlying system. They're just time-savers/auto-completes, which can confuse newbies.
What you are calling "esoteric details" are the basic building blocks. Understanding them gives you a mental model that helps you solve the problems when they arise. And it's really not that difficult to grasp.
My entire batch and I vehemently hated how we had to configure confusing development environments before actually writing a line of code. That is a huge deterrent, and only a few of us have liked this profession since.
All those misspelled-import errors and command-line tools are still far easier to deal with than configuring an intimidating tool like Eclipse or a JetBrains IDE.
By all means, introduce students to coding using the easiest, most engaging method possible. To me, that's seeing something big, visual, and getting right into the code as soon as possible, like using an IDE with a mostly working sample project they just need to finish something easy in.
Thing is, a good course should then, at some point, force the learner to understand what's happening under the hood, or at the very least, be interesting enough to draw them to want to understand it.
I don't really use eclipse, but the Visual Studio example, it would be as simple as looking at the output window after hitting "run" and walking through what the IDE is actually doing in the background.
> When I saw the title my first thought was literally “yeah stop making students use eclipse, they should be using IntelliJ!”
I disagree with this point. Universities should stop driving students into the arms of proprietary software vendors. Eclipse does the job and works well enough, with a license, which does not limit students later on. If they learned Eclipse, they will be able to learn other IDEs later on, if the job requires it.
Of course it would be good, if somehow for a low cost in frustration and time, students could learn about what it all translates to on the command line.
We dance around it all the time, but there is no way around inherent technical complexity of this field.
I started out with Python and Project Euler, went through JS and web apps "ELO hell", picked up some C to understand lower levels, ended up with a microcontroller - then went all the way up to category theory and Haskell / Scala coding.
If you ask me if I know how to code, my answer is all the more resolute - It depends.
More practical personalities will definitely manage better in current landscape.
> Actually students can solve most of the author’s problems by spending 5 seconds on Stack Overflow.
The problem is that those students will end up in the workplace one day, and someone on the team will have to help them solve problems that are out of the scope of the project and could be worked around by knowing the proper tools.
An anecdotal example: I spend at least 4 hours a week helping colleagues fix JetBrains shenanigans that would be non-blocking if only they knew the basic mvn or git CLI.
The big problem seems to be that universities are running into this iPad generation of children who do not know the basics of how computers work and are organized, because apps have veneered over all of that detail.
It's not uncommon to see astronomy and biology professors lamenting that their students don't understand file directories, or realize that a file can be opened by different programs.
My first real uni programming class was Java and we used BlueJ. I tried to set up Eclipse but I was so freaking lost. And I've been a computer nerd most my life, just not a programmer at that time. I am so glad we were set up for the simpler IDE, learning your first couple languages is hard enough as it is.
>Like writing code by hand and deducting points for syntax errors
We like to discuss the gap between university and enterprise, but for once both are (ironically) totally aligned: interviews at big companies are like this, with a whiteboard instead of a sheet of paper.
IntelliJ looks better but does less for you. Eclipse by default compiles the whole project, for example, and by default shows all compilation errors. With IntelliJ you need to look for these.
IntelliJ also does on-the-fly compilation and checking, at least if you enable it - which may be a performance problem for huge projects, but definitely not for students.
In Eclipse, you don't have to press anything. You have a prominent "problems view" where errors are all listed essentially as you type. You don't have to expand nodes, and errors are all sorted to the top. (And you can easily filter them in a million ways, and make multiple copies of the view with different filters if you want.)
Then you see them in the package/project view too. Each package, project, and file with an error gets a red icon.
IntelliJ does not show them in the project view at all. It shows them only after you press "build", so you need to develop the habit of pressing it constantly. Oh, and the console shows exactly one error. The others are hidden in a separate view, inside collapsed nodes, and you have to click multiple times to see them.
> IntelliJ does not show them in project view at all. It shows them only after you pressed "build"
It does better than that: it shows them directly inline with your code, with all the tips and hints and refactorings to make it work.
When you change something to be incompatible with existing code, it will immediately show a handy link inline to display all the places across your code that depend on this code and are now problematic.
There's a problems view that has an overview of all problems in the current file and in the entire project (but true, a single overview of the entire project only appeared sometime in 2020)