Hacker News

I get what you're saying, but learning C teaches you about the C memory model instead of "real machines". C was designed for portability across different architectures and, believe it or not, was considered high-level and abstract at one time.

But I agree that learning C is valuable because I believe that learning about manual memory management is valuable.



> But I agree that learning C is valuable because I believe that learning about manual memory management is valuable.

You need to do it in Ada too :-)!


And that's fair. I tend to view C as high-level assembler which does mean some loss of fidelity from pure assembly. It seems like a case of being "close enough", I suppose, because switching to pure assembly would make it a lot harder to cover the subject matter in a standard semester.


> I get what you're saying, but learning C teaches you about the C memory model instead of "real machines".

No. C doesn't do that, not at all; see https://queue.acm.org/detail.cfm?id=3212479 ("C Is Not a Low-Level Language"). If you want to get down and dirty, and quickly, without nearly all the gotchas inherent in C, FORTH is the way to go.

> C was designed for portability across different architectures

Absolute bullshit. This is a claim that has been going on for far too long, the whole "portable assembler" myth. IEEE 694-1985 is a portable assembler; hell, read "Using a high level language as a cross assembler" (DOI: 10.1145/954269.954277), for that matter.

> and, believe it or not, was considered high-level and abstract at one time.

The "gotcha!"-factor was increasingly ignored when I was learning about computer languages, but I had an advantage in that my first language was Turbo Pascal, self-taught from the manual and compiler.

> But I agree that learning C is valuable because I believe that learning about manual memory management is valuable.

That's doable much more easily in Ada, Forth, or BLISS. C is becoming more and more of a liability, to the point that I would say it has no real value in any professional setting, and an incredibly limited one in the academic.


How can you say C has no real value in a professional setting? Look at the amount of C code in any Linux distribution, even ignoring the kernel itself.


> How can you say C has no real value in a professional setting?

Fairly easily. My professional career has been mainly in maintenance and, as such, I get to see the gritty back-end of things, the end-result of all the technical debt... and being more correctness- and security-minded than most, I often note how a good design could help prevent problems, both in the language chosen for implementation and in the project itself.

From that perspective, most defenses of C as productive or useful fall flat on their faces, especially in recent years as security becomes an ever more important concern. About the only place that C makes any sense anymore is micro-controllers, because "all the micro-controllers have a C compiler." But let's not make the mistake of thinking that "having a C compiler" means that C is a good (or even appropriate) language for the task.

Forth, Assembly, and Ada all exceed C's capabilities in many of what have traditionally been claimed as C's strong suits:

* Ada: much, MUCH, better as a systems-language. Any project of medium or large size should seriously consider Ada instead of C.

* Assembly: very fine control, especially important for severely-constrained controllers.

* Forth: very fast, very low-level; would recommend for small/medium-small projects on small controllers. (Doesn't have calling-conventions, doesn't manipulate stack-frames; this makes it faster than C.)

> Look at the amount of C code in any Linux distribution, even ignoring the kernel itself.

And? That says NOTHING about having value in a professional setting, only that it (a) was chosen by a project that got big, and (b) enjoys popularity.

Nearly EVERYTHING that C is claimed to be [very] good at is done much better by some other language. C is particularly bad at large systems, given the complete lack of modules, and is entirely unsuitable for many things it's commonly used for, like multithreaded applications (honestly, take a look at Ada's TASK construct and consider how it might be used in [e.g.] a game-engine).
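For the curious, a minimal sketch of what that looks like (the names Worker and Render are my own invention, not from any real engine): a task type gives you a thread, a typed message-passing interface, and clean termination, all as language constructs rather than library calls.

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Task_Demo is
   -- Each object of this type is its own thread of control.
   task type Worker is
      entry Render (Frame : in Natural);
   end Worker;

   task body Worker is
   begin
      loop
         select
            -- Rendezvous: the caller and this task synchronize here.
            accept Render (Frame : in Natural) do
               Put_Line ("rendering frame" & Natural'Image (Frame));
            end Render;
         or
            terminate;  -- exit cleanly when the enclosing scope ends
         end select;
      end loop;
   end Worker;

   Renderer : Worker;  -- the thread starts here, automatically
begin
   for I in 1 .. 3 loop
      Renderer.Render (I);
   end loop;
end Task_Demo;
```

The equivalent in C means pthreads, hand-rolled queues, and manual shutdown logic.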


I think you and I see "value" differently. C is literally everywhere. Other people know it and are able to work with it. Tons of existing code is written in C. Most other languages can interface with it. This combination makes it incredibly valuable in a professional setting.

Is it used inappropriately? Sometimes. Can there be better alternatives? Yes. But to say it has "no real value" is a bit extreme.


Pretty much all of my career has been in maintenance, so I tend to see the mess that gets made, and while bad design is always a killer, I notice there are languages that encourage or discourage it to varying degrees.

In the case of C, and other C-like languages, my most recent example is four or five custom-made programs [some requiring specialized tools] that have little/nothing in the way of documentation: what they do, the "why-for"/motivation, any sort of high-level architecture plan, or design documents.

Fortunately for me, most of the programs actually do have documentation, thanks to a true hero who left before I arrived.


I did some maintenance work on a fairly large C++ system that called into a custom lower level library written in C. Much of my work involved extending that lower level library. It processed several billion dollars a year in financial transactions, and was barely documented at all. It was intended to be "portable" so it had bizarre #ifdefs all over the place in case you happened to compile it on an ancient Unix with a K&R compiler from 1986. This was the early 2000's, so nobody ever did that. I doubt it would've worked, yet those #ifdefs remained "just in case."

The lower level C library was pretty clever. The original developer of most of it left about a year into my tenure there. He was one of the smartest guys I ever worked with.

The C++ "app" layer was a different story. The worst part was a 3,000-line switch/case block with about 100 different cases, chock-full of copy-and-pasted code. It went on... and on... and on... I still have nightmares about it.


> The C++ "app" layer was a different story. The worst part was a 3,000-line switch/case block with about 100 different cases, chock-full of copy-and-pasted code. It went on... and on... and on... I still have nightmares about it.

Ouch. That sounds brutal. If I had to do something similar, or maintain that, in Ada I'd leverage nested subprograms, local type/subtype definitions, and mandatory case-coverage (I've done similar with VM opcode dispatch), so you get something like:

    Type Opcode is ( NOP, Add_A, Sub_A, ..., Rem_D );

    Procedure Execute_Instruction( State : in out Machine_State; Instructions : in Instruction_Stream ) is
       Subtype A_Series is Opcode range Add_A..Sub_A;
       Subtype B_Series is Opcode range Add_B..Sub_B;
       Subtype C_Series is Opcode range Add_C..Sub_C;
       Subtype D_Series is Opcode range Add_D..Sub_D;

       Procedure Do_Add_A;
       -- other subprograms.

       Current : Opcode renames Decode( Next_Token( Instructions ) );
       --...
    Begin
       Case Current is
         when A_Series =>
           case A_Series'(Current) is
             when Add_A => Do_Add_A;
             when Sub_A => Do_Sub_A;
             -- ...the rest of the A-series; coverage is mandatory,
             -- so the compiler rejects this case statement if any
             -- opcode in the subtype is missed.
           end case;
         -- other series.
       end case;
    End Execute_Instruction;
Of course you could structure it so that all the Do_OPCODE subprograms are local to the top-level switch, or local to the nested switches, as best suits the design; or decompose along 'families' of operation (Add_A, Add_B, Add_C, Add_D), but the important thing there is keeping things local/nested for maintainability.



