I just wanted to confirm from our meeting just now, did you want me to (some crazy shit that could cause problems)?

Er… sort of. He brings some of it up towards the end:

There is a common myth in software development that parallel programming is hard. This would come as a surprise to Alan Kay, who was able to teach an actor-model language to young children, with which they wrote working programs with more than 200 threads. It comes as a surprise to Erlang programmers, who commonly write programs with thousands of parallel components. It’s more accurate to say that parallel programming in a language with a C-like abstract machine is difficult, and given the prevalence of parallel hardware, from multicore CPUs to many-core GPUs, that’s just another way of saying that C doesn’t map to modern hardware very well.

I would add Go to that, with its channel model of concurrency, which I quite like, and numpy, which in my experience does an excellent job of giving you fast parallelized operations over big array structures while still keeping a simple imperative model for quick, simple operations. There are also languages like Erlang or ML that try to do things in a totally different way, which in theory can lend itself to much better use of parallelism, but I'm not really familiar with them and have no idea how well the theoretical promise works out in terms of real-world results.
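For anyone who hasn't seen the channel model in practice, here's a minimal toy sketch of the style I mean (my own example, not anything from the article; the worker count and the squaring job are arbitrary): a few goroutines pull work from one channel and push results onto another, and all the coordination is just reads, writes, and closes on those channels.

```go
// Toy sketch of Go's channel model: a few worker goroutines pull
// numbers from a jobs channel, square them, and push the results back.
// The point is how little coordination code is needed, not performance.
package main

import (
	"fmt"
	"sync"
)

func main() {
	jobs := make(chan int)
	results := make(chan int)

	var wg sync.WaitGroup
	for w := 0; w < 4; w++ { // four workers, chosen arbitrarily
		wg.Add(1)
		go func() {
			defer wg.Done()
			for n := range jobs { // each worker drains jobs until the channel closes
				results <- n * n
			}
		}()
	}

	// Close results once every worker has finished.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Feed the work, then signal that there is no more.
	go func() {
		for i := 1; i <= 10; i++ {
			jobs <- i
		}
		close(jobs)
	}()

	sum := 0
	for r := range results {
		sum += r
	}
	fmt.Println("sum of squares:", sum) // prints 385
}
```

Scaling the worker count up to the thousands doesn't change the shape of that code at all, which is roughly the point the quote is making about Erlang-style designs.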

I’d be interested to see someone with this guy’s level of knowledge talk about how well any of that actually maps onto well-parallelized operations when solving real problems on real-world CPUs (in the specific sense he means when he criticizes how well C maps to the hardware), because personally I don’t really know.


ITT: People who didn’t understand the article

OP: You should not be bothered. The author’s arguments are perfectly valid IMO, but they’re way way beyond a beginner level. C is already a fairly challenging language to get your head around, and the author is going way beyond that into arguments about the fundamental theoretical underpinnings of C and its machine model, and the hellish complexities of modern microcode-and-silicon CPU design. You don’t need to worry about it. You can progress your development through:

  • Basic computer science: data structures, Python, and the like
  • C and the byte-for-byte realities <- You are here
  • Step 3
  • Step 4
  • Microcode realities like this guy is talking about

… and not worry about step 5 until much much later.