A problem I've noticed in my degree is that for a certain percentage of students Java just doesn't stick. I'm in my fourth year now and there are plenty of people who just can't program, because they've only been taught Java and it never clicked. It's far too big a language to teach to a previous non-programmer in a semester or two.
Even if it does sink in they rapidly find that the simplified Java being taught at first year undergraduate level is far below the standard required for real world Java programming.
Java is definitely a terrible language for teaching, and as a student's first introduction to arrays, memory handling and flow control it's downright terrible. However, Java is incredibly effective as a language for large teams. Since most of the companies coming to hire CS grads are big teams in need of extra hands on their software projects, they are looking for CS grads who know Java. So Java in universities is generally the ivory tower being forced to adapt to outside business interests.
My university down here taught in Scheme for the introduction and theory classes, and in Java for algorithms and networking. Seemed to work well as a balance, although I'm not sure if they still do that.
> Java is definitely a terrible language for teaching, and as a student's first introduction to arrays, memory handling and flow control it's downright terrible
That's not really a bad thing. First, no beginner is going to be learning about coding OSes, so teaching the basics isn't a problem. Secondly, Java OSes are far more common than you'd think, although this was more true 5 years ago when many phones/smartcards/etc used Java in their OS internals. A lot of the topics in OS concepts would probably be useful if you wanted to hack on OSS Android too. So I don't think there is anything wrong with that book, which sounds more like an example and reference book on using Java in OS internals (rather than a shotgun approach).
I don't see any problem with teaching CS concepts in Java. The concepts of message passing or a heap are the same everywhere. I was referring more to the basics of programming, where you should either teach at a very low level (Pascal is great), or at a very high level (Python can work). I think people taught originally in something low level like Pascal or C often have the best understanding of memory management and what their instructions are doing.
If I was teaching someone coding, I'd probably start with Pascal (incl inline ASM), then any language based on lambda calculus to get a good understanding of recursion, and then Java in a large team environment so they could understand why encapsulation, delegation and code quality are important. It's a shame more classes don't actually just have all 100 students work together on a single code base - it would be incredibly beneficial. The code would likely be terrible with 100 newbies to Java all hacking in their own required feature to the detriment of everyone else. That would drive the lesson home quick!
> It's a shame more classes don't actually just have all 100 students work together on a single code base - it would be incredibly beneficial
I up-voted your comment as a whole, but I really disagree with this statement. It's reality in many situations, but I think it's a shame that anyone has to work together with a team of 100 on anything, ever. You deal with it, and you solve the problems associated with it, but those are problems of management and process.
The students would learn a lot though. Most CS courses give theory and practice for algorithms etc, but only theory for software design. Trying to teach a student how and why it's important to encapsulate logic is generally a losing battle until you put them in a position where it makes sense. Avoiding global state is just a theoretical topic (that sounds like a religion) until you dump the student in with 100 others all declaring global state. Same for namespaces and interfaces.
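To make that lesson concrete, here's a minimal Java sketch (all names are mine, not from any actual course) of the difference between everyone sharing one mutable global and each feature owning its own state:

```java
// Hypothetical sketch: the global-state lesson in miniature.
public class GlobalStateDemo {
    // The "100 newbies" pattern: every feature reads and writes the same global.
    static int counter = 0;

    // The encapsulated alternative: each feature owns its own state.
    static class Counter {
        private int value = 0;
        void increment() { value++; }
        int value() { return value; }
    }

    public static void main(String[] args) {
        counter = 10;  // feature A sets up the global for its own purposes...
        counter = 0;   // ...and feature B silently clobbers it.

        // With encapsulation, the two features can't step on each other.
        Counter a = new Counter();
        Counter b = new Counter();
        a.increment();
        System.out.println(counter + " " + a.value() + " " + b.value()); // prints "0 1 0"
    }
}
```

Trivial on its own, but with 100 people writing to the same `counter` the first version falls apart in exactly the way the lecture slides claim it will.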
It would be a really interesting experiment even: divide up a big problem into 100 or so features and assign each feature to a random person. Give the students some time to work out who they need their code to communicate with and draw up some simple interfaces. Use github or similar with an account for every student. Make the TAs in charge of accepting pull requests. It could be the main focus of a software engineering course.
EDIT: The students wouldn't necessarily need to work with all 100 others - only the ones whose feature interacts with theirs. eg, one student's feature might need to receive input from 5 other features and calculate some statistics - he would only need to talk to the 5 students feeding him data, and then the 1 student he feeds data to.
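That wiring could be agreed up front as a couple of simple Java interfaces (these names are hypothetical, just to illustrate the shape of the contracts):

```java
import java.util.List;

// Hypothetical contracts for the feature-wiring exercise described above.
// Each student implements one piece and agrees only on these interfaces.
interface DataSource {
    double reading();            // one of the 5 upstream features
}

interface StatsConsumer {
    void accept(double average); // the 1 downstream feature
}

// The statistics student's feature: it depends only on the interfaces,
// never on the other students' implementations.
class AverageFeature {
    private final List<DataSource> inputs;
    private final StatsConsumer output;

    AverageFeature(List<DataSource> inputs, StatsConsumer output) {
        this.inputs = inputs;
        this.output = output;
    }

    void run() {
        double sum = 0;
        for (DataSource s : inputs) sum += s.reading();
        output.accept(sum / inputs.size());
    }
}
```

A nice side effect: while waiting on classmates, a student can stub the interfaces with lambdas to test their own feature in isolation - which is the real-world lesson anyway.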
Up until about a decade ago, most CS graduates from MIT were never taught C or Java in any class. They still seemed to do pretty well in the job market. (Mostly it was assumed that if you really liked software development, you would go out and learn one of those languages outside of class, such as from a student-led seminar.)
It's just a business thing. Some large local companies come and talk to the university and say that they need a bunch of CS grads to fit into their business. The HR drone says they need to know Java to be able to integrate into the company quickly. The lecturers get upset. The finance department gets promised a donation, the marketing department gets promised easy job searching for CS students. The ivory tower crumbles a bit and the lecturers teach Java.
Well, yes, it is easily explained by some love for Java - why would that be so unlikely?
Computer Science is a diverse field, and not every CS professor considers mutable global variables an evil to be purged. (In fact, the majority of CS professors can't be bothered to think about such matters. They're busy tuning their neural networks, photorealistic simulations of snowflakes, improvements to TCP or whatever their particular research interest is. The answer to "Java or Haskell?" for them is "whatever suits my needs.")
Ah, yes, I remember those days (different university though). Our first class was programming in Pascal, and all subsequent programming classes were "deriving algorithms and proving them correct"-classes (a la Dijkstra). In other classes, like user interfaces, OO-programming, mathematics, computer systems, all kinds of projects, AI, and so on, the assignments had to be done in a particular language and it was up to us to learn it ourselves in addition to the classwork. So I had to learn and use Java, C, C++, Delphi, PHP, Prolog, Bash, Mathematica, MATLAB, Haskell, ASM, Python, and LaTeX (although I could be forgetting some DSLs). It was fun though :-)
I think there are 2 ways of teaching programming at the beginning, although I'm not sure which is best.
1. Teach structure and organisation of programming. - Python is good for this, as it's object oriented but has very little boilerplate and syntax getting in the way of learning. A few universities have done well using Haskell for this instead, and I think that's quite a good option, particularly as it introduces types.
2. Teach from the perspective of how a computer works - C is probably the best one for this as it helps students understand how computers work at a more basic level, being closer to assembly and machine code. It also allows for memory to be thought of as a much more basic resource of the machine that users must manage.
Java would be good for teaching data structures and algorithms, but I don't think that should come until after an introductory class using one of the above methods.
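As one example of why Java suits a data-structures course, here's a minimal generic singly linked list (a sketch, not from any particular curriculum): the structure is fully explicit, but the garbage collector handles the memory, so students can focus on the pointers-and-nodes logic.

```java
// Minimal generic singly linked list - the kind of exercise Java handles well,
// since the links are explicit but memory management is automatic.
class SimpleList<T> {
    private static class Node<T> {
        T value;
        Node<T> next;
        Node(T value) { this.value = value; }
    }

    private Node<T> head;
    private int size;

    // Prepend in O(1): new node points at the old head.
    void addFirst(T value) {
        Node<T> n = new Node<>(value);
        n.next = head;
        head = n;
        size++;
    }

    // Walk the chain in O(n) to reach the i-th element.
    T get(int index) {
        Node<T> cur = head;
        for (int i = 0; i < index; i++) cur = cur.next;
        return cur.value;
    }

    int size() { return size; }
}
```

The same exercise in C would also force students to write the `free` logic - which is exactly the low-level perspective of option 2, and why I'd keep the two courses separate.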
That's a problem regardless of the specific language taught first, and I say that having personally witnessed the struggles of students who learned in pseudocode (my first semester), Java, Python, and Scheme. Programming requires you to structure your thinking a particular way. It's natural for some people, it's unnatural for others. I tutored a couple dozen people of varying aptitudes through the courses as an undergrad and was surprised at how little a difference Java vs Python made.
I do think that there are better ways and worse ways to teach programming. Bret Victor put out an overview [1] of the topic last year and while that essay in itself isn't the answer, it's at least a better start than anything I ran into as an undergrad.
I already had math research experience with C when I finally took my school's introduction to CS course, and had wrapped my head around pointers and recursion. Java just felt like memorizing a bunch of strings and exceptions.