I'm kicking around this idea that, when learning a new programming language, it is more helpful to start with examples of things that one should *not* do, rather than what one *should* do.

My logic here is that the things you *can* do in a language are effectively infinite, whereas what you *can't* (or shouldn't) do is finite and relatively manageable.

Obviously at some point you learn how to do things correctly, but I'm curious if this method lessens the initial pre-grok haze.


Does this seem logical? Would love some feedback from my fedi friends with more programming experience (virtually all of you ha).

How are you supposed to know that a given concept is bad (or rather, how it is bad), if you have no better case/concept to compare it to? I think showing bad practices to weed them out early does make sense, but the learner needs to understand the basic concepts and good practices first, in order to reflect on why a bad practice is considered bad.

@ScumbagDog For example, watching a video like this: youtube.com/watch?v=j0_u26Vpb4

It not only shows you bad patterns (some obvious and universal), it also helps you grok why things aren't done a certain way. I always hate the feeling, when learning something new, of just repeating a pattern because "it's just done that way". You obviously can't absorb the entire history of every convention and pattern when you're new, and that blind following of things is what I'm trying to minimize.

@ScumbagDog I don't know if that video was the best example, but I just grabbed it quickly. Hopefully the idea comes across ha.

That video also assumes that you know the basics of C/C++. Don't get me wrong, I think looking at bad code is a great way to deepen your knowledge of a given language, but an absolute novice (at least within the paradigm) would be left looking like a question mark once you start throwing a little terminology around.
For example, you have no chance at knowing why gotos are bad unless you know what control structures such as if and while are.
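To make that concrete, here's a minimal sketch (my own example, not from the video): both functions below compute the same sum, but without already knowing structured control flow like while, a novice has no basis for seeing why the first version is considered worse.

```c
#include <assert.h>

/* The same sum (0 + 1 + 2 + 3) written two ways. */

/* With goto: the reader must trace the jumps to discover
   that there is a loop here at all. */
int sum_with_goto(void) {
    int i = 0, sum = 0;
top:
    if (i >= 4) goto done;
    sum += i;
    ++i;
    goto top;
done:
    return sum;
}

/* With while: the loop's shape is visible at a glance. */
int sum_with_while(void) {
    int i = 0, sum = 0;
    while (i < 4) {
        sum += i;
        ++i;
    }
    return sum;
}
```

Both return 6; the difference is purely in how much structure the reader has to reconstruct in their head.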

Or alternatively, the example with macros vs. functions. There is no guarantee that a novice knows what a preprocessor directive is, so showing why a #define might sometimes yield unexpected results may just end up confusing them rather than giving them that "eureka!" moment
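For instance (a hypothetical sketch of the classic pitfall): a function-like macro is substituted as raw text, so operator precedence applies to the expanded text rather than to the argument as a whole.

```c
#include <assert.h>

/* A function-like macro: the body is pasted in textually. */
#define SQUARE_MACRO(x) x * x

/* The equivalent function evaluates its argument first. */
static int square_fn(int x) { return x * x; }
```

SQUARE_MACRO(1 + 2) expands to 1 + 2 * 1 + 2, which is 5, while square_fn(1 + 2) returns 9 - exactly the kind of surprise that's baffling if you don't yet know the preprocessor exists. (Writing the macro as ((x) * (x)) avoids this particular trap.)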

@ScumbagDog I agree with everything you said, but I am assuming this is a person's N+1 language being learned - so general concepts would already be grok'd. Do you think that makes a significant difference? I'm also dyslexic, so I always have a hard time learning anything new, and I tend not to learn well with the traditional methods that work for most people. So what works for me here may not be anything close to universal, ha

I don't think the number of languages known matters - paradigms do. A Haskell and a C program are two very different beasts to master, and knowing the practices of one does not imply knowing the practices of the other. Compare that to a Haskell and a Lisp program, or a C and a Pascal program, and the similarities are much more pronounced (but not the same, of course)

@ScumbagDog @self Haskell is a declarative language, which in my experience is dramatically different from an imperative language. In Haskell, you tell the computer what the situation is then say “So, uh, go solve that somehow okay?” whereas in C, Lisp, or Pascal you tell the computer what to do to solve it. So Haskell, Caml, certain pattern matching mini-languages, Sympy, those are what I see as similar.

Though certain C optimizations can be seen as declarative, like where you tell it for(i=0;i<4;++i) and the computer goes to solve the problem “somehow”, either with a loop or an unrolled loop depending. So there are even similarities between C and Haskell. Ultimately they all boil down to a sequence (or sequences) of machine code instructions being fed to a CPU core.

My point is that they both share concepts (not fully, just some) that a C programmer would likely not know. Sure, Lisp and its dialects are procedural and Haskell is declarative, but if you know one, you likely also know what lambdas, declarations and tail recursion are, all of which are present in the other. For a pure C programmer, these would be foreign concepts.
