Mark Guzdial gives a snapshot look at a research study done by several of his graduate students on how programmers who learn on their own actually program. A lot of interesting things there. Things like these programmers use a lot more FOR loops than WHILE loops. They use more TRY-CATCH blocks than WHILE loops as well. Who knew that TRY-CATCH was either easier to learn or more important to learn (unclear which) than WHILE loops.
Oh and self-taught programmers seldom write their own classes. What does that say about the naturalness of classes? That one doesn't surprise me that much BTW. I have come to the conclusion that the pendulum towards objects (object first, last and always) has swung too far in one direction. I think a swing back to somewhere in the middle (use classes where they make things easier but don't use them when they make things harder and more confusing) is inevitable.
I hope this paper is published somewhere I can get a copy of it. I suspect that there is a lot we can learn by studying how people learn to program on their own.
Which reminds me - I wonder how often the GOTO statement showed up in self-taught programmers' programs? Did they even find it? Did they find it useful?
As someone who learned programming primarily on their own, I find this very interesting. But I must confess, I don't understand the distinction between a class and an object. Aren't they synonymous (one being the abstract term, and one the tangible code of the abstract concept)?
one being the abstract term, and one the tangible code of the abstract concept
Actually that is the difference more or less, but that is why they are not synonymous. A class describes the attributes and functions. An object is an actual instance. When one changes a class one changes something about all objects of that class, but it does not go both ways. By that I mean when you change one object you do not change all objects of the same class.
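To make that concrete, here is a minimal sketch in Python (the `Counter` class is my own illustration, not anything from the post):

```python
class Counter:
    """The class describes the attributes and functions."""

    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1


# Each object is an actual instance of the class.
a = Counter()
b = Counter()

a.increment()    # changing one object...
print(a.count)   # 1
print(b.count)   # ...does not change the other object: still 0
```

Changing the class (say, adding a `reset` method) would affect every `Counter` object, but incrementing `a` leaves `b` untouched - which is the one-way relationship described above.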
Ok, so I understand the distinction correctly. But then I don't understand what you mean when you write that the pendulum has swung more towards objects than classes. Do you mean that people are using already created objects (such as the ones provided in the .Net Framework) as opposed to redesigning their own from scratch?
Ah, I see the confusion. There is a school of thought that says one should teach creating and using classes early and that students should use objects from the very beginning. They should be thinking and designing object oriented and only look at imperative programming later - if at all. There is another school of thought that is moving back away from that and using simple imperative programs that may not be using objects at all. If they do use objects they are only created from library classes. My own thought is that using objects, especially from libraries, should be taught early but that some programs are imperative by nature and forcing them into an OOP paradigm is artificial and makes things more complex than necessary. Objects/classes however can still be useful tools in that context. But get students through all that before making them design and create their own classes.
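That middle ground might look like this short Python sketch - a plainly imperative program that still uses objects created from library/built-in classes (`str`, `dict`) without the student ever defining a class of their own (the word-count task is my own example, not from the study):

```python
text = "the quick brown fox jumps over the lazy dog the end"

counts = {}                   # a dict object, created from a built-in class
for word in text.split():     # str objects come with their own methods
    counts[word] = counts.get(word, 0) + 1

print(counts["the"])          # 3
```

The program's structure is imperative (a loop mutating state), yet the student is using objects and methods throughout - no class design required.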
I see. It's interesting that you concede that some applications might not be appropriate for OOP. Although I see the benefit of it, I struggle to program in a proper OOP style, and often just regress to my old habits. However, maybe it's not so bad after all! Maybe these applications really are better suited to a non-OOP style!