Codea Enhancements: Learnable Programming

edited September 2012 in General Posts: 13

Summary

I found a great article tonight that outlines some great approaches to making programming more learnable, approaches that could and should be leveraged by Codea to make it even easier to use and to encourage broader adoption of Codea as a teaching tool. I've recently started to teach my 7 yr old daughter to code, so I've been thinking a lot about such things lately, and it's an area that many people could benefit from.

Please check out the following article and give feedback on what you think! :)

http://worrydream.com/LearnableProgramming/

Details

I'm a long-time programmer who started to learn programming at the ripe old age of 8 with a Commodore 64, back around 1982. I've started to teach my very bright 7 year old daughter how to program, and so far she and I are having a blast working together using Codea to make a game she's titled "Super Kitty Girl!". :)

I've been thinking a lot about how to teach programming so I can help my daughter learn the ropes as I did so long ago. Unfortunately, it's next to impossible for her to learn the same way I learned, because modern computers are so incredibly powerful that there are few constraints on what you can do. Back then, on a C64, I had 16 colors to work with. I had 8 sprites. I had 3 audio channels.

I couldn't be endlessly distracted by the game art, or music, or 3D projection modes. It was pretty easy to focus because of the constraints. There was enough to learn without modern-day distractions and without the bar being set so very, very high, especially for game programming. Thankfully, simpler games are making a resurgence via mobile platforms. Angry Birds could have been done on a C64, a NES, an Amiga, a SNES, etc. The mechanics of some of these mobile games are very clean and lovely; they aren't a supreme technical challenge, but they have been very well executed by companies leveraging good developers and artists to make a polished game.

I believe it's an incredible challenge for new programmers to know what to focus on in a way that sets them up for success, so I've been thinking hard about how to approach the issue with my daughter and other new coders. Codea is one of the best applications I've found so far when it comes to teaching my daughter how to program. :) Touch and gravity inputs make it fairly trivial to map cause and effect. The first thing we did was map the gravity / tilt sensors to moving a sprite around. It's very simple and intuitive and offers a fine degree of control. I'm teaching her about geometry, coordinate systems, the fact that the little sprites have "personal space" (collision boxes), functions, code commenting, etc. :) (We're also reading the book Lauren Ipsum right now, introductory CS material for kids that reads like Alice in Wonderland and is pretty funny.)
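
For anyone curious, the tilt experiment really is just a handful of lines. Here's a minimal sketch of the sort of thing we started with (the sprite name is just an example from the Planet Cute pack; swap in whatever you like):

    -- Tilt-to-move: a minimal Codea sketch
    function setup()
        x = WIDTH/2
        y = HEIGHT/2
        speed = 10
    end

    function draw()
        background(40, 40, 50)
        -- Gravity.x and Gravity.y swing roughly between -1 and 1 as the iPad tilts
        x = x + Gravity.x * speed
        y = y + Gravity.y * speed
        -- keep the character on screen
        x = math.min(math.max(x, 0), WIDTH)
        y = math.min(math.max(y, 0), HEIGHT)
        sprite("Planet Cute:Character Princess Girl", x, y)
    end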

With all this in mind, I came across a fantastic article tonight, posted on Hacker News, that is directly relevant to the goal of making modern programming easier to learn, and there are several suggestions in it directly applicable to Codea itself:

http://worrydream.com/LearnableProgramming/

There are many concepts that could be applied from this article, but I think the low-hanging fruit is in the "Make meaning transparent" and "Explain in context" sections. I, an expert programmer, definitely find it cumbersome to go into the help to look up the parameters for all of the functions, so having something that even just tells me the name of a parameter would be helpful for jogging my memory or helping me figure out how to use a method for the first time. My daughter has no mental model for navigating help / documentation systems, so she has a very challenging time finding the right help - even I do sometimes.

I think Codea is an awesome tool; I've been impressed by the progress so far, and I think there is room for it to get even better. I hope our discussions and support can help drive that forward. :) In the meantime, please take a look at the article on Learnable Programming and give feedback on your impressions. :)

-Xavian

Comments

  • Posts: 182

    Wow, thanks for that. I've been trying to teach my brother the little programming I know, but he automatically decides lotsOfTyping = hard. That website really helped.

  • Simeon Admin Mod
    Posts: 4,459

    Thanks for linking Bret Victor's fantastic article. I'm glad to have read it.

    Several of his ideas are things that I have been wanting to put into Codea for some time now. As he says in his essay, the underlying concepts are far more important than any specific feature based on them.

    I'm thinking hard about where his philosophy aligns with my own, and how that can be represented in Codea.

  • Great, Simeon - I'm glad you liked it! As you said, the underlying concepts are the most useful, and his way of outlining them is immediately accessible. It occurred to me after I posted that you might have some of these things outlined in the roadmap already. I think Codea is a great platform to build upon and I'm looking forward to seeing how things develop. :)

  • Posts: 502

    I recommend this video on the same topic by the same person. :)

  • Thanks @xavian for this excellent link. I have a similar background to yours: started with C64 BASIC, then assembler on the Amiga. In the next few years I will also face the challenge of teaching my kids some coding; I hope some of Bret Victor's ideas will be reality by then :-) Sometimes I think it is harder to learn new languages today than it was to learn assembler back in the eighties. So I can only agree with your explanation that the possibilities were so limited that we could focus more on the code, and therefore learned more quickly. I can see it now with Codea: I started 10 projects at the same time, and it is hard to focus on one as there are so many ideas and possibilities.

    Projecting the article onto Codea, I think there is one thing which would make things a lot easier for beginners: just showing a label with a function's arguments when you open the bracket. A second thing would be a debugger, but I understand that's much harder to develop. Of course, what Bret Victor shows in his videos as a debugger is incredible, but a simple debugger would be a start :-)

  • @tnlogy, great video, thanks! @KilamMalik Ah yes, I had a similar journey: BASIC and machine code on the C64 (I didn't have an assembler), then eventually an Amiga, then PCs for years, and finally Macs.

    I've started so many game dev projects, but eventually I tend to get paralyzed or bogged down a bit by all of the possibilities, especially since I really like the media side of the work as well as the programming side. But I discovered something on a lark that makes it easier. I was playing a game called "Harvest Moon: Save the Homeland" on the PlayStation 2 when I noticed the actual game mechanics were simple enough that they could work on something old like a C64. I was between jobs at the time, so I spent some time with a C64 emulator and a C cross-compiler and started porting a PlayStation 2 game backwards to the C64. It was great fun, but I got a new job and never had the time to complete it. The insight I had, however, is that it was much easier to focus on the core game since everything else was so very limited. I got pretty far despite the limitations:

    Harvest Moon - Back-Port

    Codea, with the built-in sprite() support coupled with the PlanetCute artwork, really takes me much closer to such an experience. I have visually attractive artwork right out of the box, so I can focus on mechanics like spinning stars or making a touch joystick. I love how trivial it was for us to make this scene for 'Super Kitty Girl!':

    Super Kitty Girl Game

    Give us artwork that looks decent enough that we can play around with it and focus on the basics without getting distracted. My daughter loves changing the sprite around, and that's a very cause-and-effect sorta thing. I couldn't have engaged her as easily if I had moved a circle around the screen and said "OK, now imagine that circle is Super Kitty Girl..." This is much, much easier. :)

  • Posts: 2,160

    Hogwash.

    I grew up with the CBM64 and BBC Basic as well. What I did was copy code from magazines, often without a clue what it did. Then I got ambitious and started changing things, and finally I ported a CBM64 fractal program to Fortran on our shiny new 286 (with a Hercules monochrome graphics card).

    The article makes some good points, but what is missing is the role of the teacher in all this. A well-designed IDE (if that's the right term) certainly helps but more important is a guide who can engage the child in the process of learning.

    And if you change Codea to be more "learning friendly" I'll complain so loudly you'll look back on the furore about number spinners as a mild disagreement.

  • @Andrew_Stacey, your opening gambit is combative and uninformative. What exactly are you refuting? The role of the teacher is a critical one, but that doesn't nullify the usefulness of offering visual affordances, especially when someone will be engaged in independent learning. Outside of Run or Compute! magazine articles, most of us didn't have the benefit of a teacher. If you are balking at the idea that I'm advocating making Codea more "kiddy", I am not. An interface that is more informative should benefit everyone, experts and novices alike. Just because I use Emacs for development doesn't mean I'd suggest other coders, especially new coders, should take that approach and forgo graphical IDEs or things that have IntelliSense. What are your actual objections to a more learnable interface? What are the downsides? I didn't see you list any, outside the threat of your protestations.

  • edited September 2012 Posts: 580

    I am all for getting more people interested in learning how to program, and I think some of the ideas in that video might be helpful. I am a little skeptical, given the way I (and pretty much every other programmer I know) learned how to program, but that doesn't mean that something like this couldn't be helpful for teaching programming concepts.

    If Codea featured a "learnable interface", I would have no objections as long as it could be switched off. If everyone is forced to use it, it would seriously get in the way of those who already understand programming and aren't using Codea to learn how to program. Perhaps that's some of the sentiment @Andrew_Stacey might be trying to express.

  • John Admin Mod
    Posts: 459

    I don't think we would ever make those interface elements mandatory, because we are seasoned programmers ourselves and enjoy using Codea as much as anyone. That said, I think it's important to bridge the gap between the absolute beginner and the intermediate programmer - something I feel is very much missing right now.

  • edited September 2012 Posts: 580

    Well, part of me feels like this could become a question of focus: Is Codea meant to be a tool primarily for prototyping (and even publishing games), or is it a tool primarily for teaching? Of course there's no reason that it could not be both to some extent, but trying to be everything to everyone will likely result in something that truly pleases no one. I for one would rather you guys not spend your time implementing a "learnable interface", because it would take away from the already limited time you guys have to maintain and add other useful features to Codea.

    I feel like an environment like the one suggested in the video would best be pulled off with a tool that was solely meant for teaching.

  • @toadkick, I understand your skepticism, and I would like more control in preferences over which things are enabled. It could work similarly to games like Civilization V or SimCity, which have popup tips that you can disable at will. I'd be mighty annoyed if anyone forced me to use a particular IDE other than what I am comfortable with, but as long as those aspects are configurable, I do not see the harm.

    @John Yeah, that's along the lines of how I was thinking about it. Even from the standpoint of a new Codea user it could be helpful. I'm not new to programming by a long shot, but it's been years since I've touched Lua (I used to use it for WoW UI plugins) and I am not familiar with all the non-Lua, Codea-specific functionality. If it took me a little while to become familiar with what can be done with Lua, I can only imagine what it is like for someone brand new to programming altogether. I have a friend who is interested in learning to program for the first time at the age of 40; he is no child, but he could definitely benefit from assistance until he gets the hang of things.

  • edited September 2012 Posts: 13

    @toadkick, I think Codea is most useful as a tool for rapid prototyping, which provides access to a wider demographic of experts and novices alike. I find it a bit easier to concentrate on prototyping on the iPad with Codea or Pythonista than on my laptop, with its instant messengers and email and Facebook, et al. I find it a great tool for rapidly banging out ideas. I have very little free time at this point in my life, so I really enjoy a tool that lets me prototype at will.

    I don't think it has to be solely one thing or the other. Television remote controls were designed to help people with disabilities control the TV without getting up. Guess what? Able-bodied people loved them too. Now people don't even KNOW they were intended for handicapped users. Same with OXO brand kitchen utensils, initially intended for arthritic users. Bam, people without arthritis found them much easier to use.

    A good tool is helpful to have whether you are a novice or an expert. A good tool does not get in the way of what you are trying to accomplish. In that way I think Codea can be an excellent prototyping tool that can be both powerful and easy to work with.

  • Simeon Admin Mod
    edited September 2012 Posts: 4,459

    My goal with Codea has always been to allow creative coders to get their ideas on the screen with minimal hassle.

    Bret Victor's Learnable Programming essay focuses a lot on learning core programming concepts — something that I'm not interested in teaching (though I wouldn't want to deter anyone from using Codea as a teaching tool).

    However, there are lots of good points in Victor's essay. The absolute key issue for me right now is the following:

    The API can't be opaque. I run into this myself when using Codea — sometimes I have to check the documentation to remember whether saveGlobalData accepts a key as its first or second parameter. Sometimes I forget return values. I'm experimenting with a solution for this.
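
    (Case in point - I had to think twice even while writing this. If I have the order right, the key comes first:)

        -- store a value under a key (key first, if I remember correctly)
        saveGlobalData("highScore", 9000)

        -- and read it back later
        local score = readGlobalData("highScore")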

  • Simeon Admin Mod
    Posts: 4,459

    .@Andrew_Stacey I have no intention of making Codea about learning how to program. But I feel that a well designed tool shouldn't hinder learning, and should still be able to handle advanced uses. Codea could do with (and is getting) some upgrades in this regard.

  • Simeon Admin Mod
    Posts: 4,459

    By the way @xavian, that Super Kitty Girl scene you rendered looks fantastic.

  • @Simeon Aw, thanks! :) I'll post a completed version of the game as we get further along. :) Right now we're working on the hero gathering the treasure when they intersect and getting points. I am considering going back and writing up a tutorial on the steps involved once we get to the end of this.

  • (btw, how are you getting the @username to turn into links? I cannot find it in docs)

  • edited September 2012 Posts: 580

    Damn, I was in the process of writing a long post about the difference between a tool that is "easy to use" and a tool that "teaches", but @Simeon pretty much summed up everything I was going to say, and much more succinctly.

    .@Xavian The point I was trying to make is that I would hate to see TLL's focus (and limited time) shifted away from making Codea more powerful for people who can already program, and toward making Codea a tool intended to teach programming concepts to a non-programmer. I believe that doing so would achieve neither, and would be detrimental to Codea.

  • @toadkick, OK, I think I gotcha; that is fair enough. I know all too well about the finite amount of time and effort that can be invested into something, so I grok where you are coming from. There are a lot of features folks already want, and the learnability of the tool is not really a limiting factor for the active community of folks already using it.

  • Simeon Admin Mod
    edited September 2012 Posts: 4,459

    .@xavian I think the learnability could be improved, especially by following the philosophy expressed in Victor's essay. But improving learnability should be achieved by making Codea a better tool for all users (e.g. better autocomplete, better documentation).

    (.@username links don't work properly at the start of a line, so just add a dot in front of the "@" symbol to ensure they link.)

  • I agree completely @Simeon, and better autocomplete and documentation would be a boon to all.

  • Bortels Mod
    edited September 2012 Posts: 1,557

    You know, I tried to get thru that article - really. And I wasn't going to say anything, but since the thread is here... If I tried to rebut everything on the internet that I see as "Hogwash" (good word, there, Andrew!), well - I'd never get anything else done. ("An article I disagree with, Robin - To the Batcave!")

    He has some valid points - and some points I disagree with 100%. And those points poison the rest of the article. Like a proof - if one of your postulates is false, the whole thing goes out the window. He may be correct for the concept of "learning to program" in some fuzzy, out-of-focus generic way - but what he suggests is exactly the opposite of what you unfortunately need to do to actually learn a programming language you can use in the real world - what you need to know to actually get something done. It is unfortunately a fact that learning to program requires learning a language, and learning languages, even artificial ones, can be hard - you simply must memorize a vocabulary and rules of grammar, and if some IDE spoon-feeds things to you, you're unlikely to memorize things nearly as quickly or thoroughly.

    The way to learn a language, natural or otherwise, is immersion. Jump in feet first. Make mistakes. Learn from them.

    Right off the bat, in the first section: " JavaScript and Processing are poorly-designed languages that support weak ways of thinking, and ignore decades of learning about learning."

    That's lovely - but you know what? If you want to program for the web, you NEED TO LEARN SOME JAVASCRIPT. Plus, you could say that about most real languages nowadays - they're not designed to be easy to learn, they're designed to be powerful/expressive/stable/etc. If you want to do an iPad app - even using Codea - to get it into the App Store, you're going to have to learn Xcode and a smidgen of Objective-C, and those aren't easy, and if they '... ignore decades of learning about learning', well - Boo Hoo. In reality, few languages are designed for "learning" - BASIC and Pascal come to mind. And neither of those are considered reasonable languages today for general programming (I always hark back to the fact that Wizardry was written in Apple Pascal - to be fair, Pascal extended with the language features that were missing and necessary to write useful graphics code). "Learning" a language you can't then use for the real world is almost useless - I guess once you had the basic concepts, you could move on to a real language, but I still say immersion is better.

    But that's not what put me off the rest of the article. His first example (lovingly done as a graphic, so you can't cut and paste it, sigh) of an ellipse, had this commentary:

    "Likewise, guessing the third argument of the "ellipse" function isn't "learning programming". It's simply a barrier to learning. In a modern environment, memorizing the minutia of an API should be as relevant as memorizing times tables."

    I would say this is fundamentally wrong. It's like saying learning the rules of grammar and vocabulary aren't part of learning a natural language. If you're going to use a real language and do real programming, the meanings of the parameters for function calls are something you either need to KNOW - without an IDE spoon-feeding them to you - or at least know how to look up. And a good way to memorize them is to change them - tweak the example, see what happens, without knowing ahead of time. The element of "what if...", the element of surprise, the learning to give it a shot without knowing for sure if it will do what you want, is FUNDAMENTAL to programming.

    By making his "learning environment" do things like show what parameter changes what on a hot-mouseover, he is doing the exact opposite of having you learn - what he's teaching you here is that you don't need to know what those parameters do, because something will show you. He's teaching you to be crippled, to be unable to function outside of an IDE. Don't get me wrong - that stuff would be fine in an IDE that's not intended to teach, but is intended to assist - it's ok to use a calculator once you already know and understand the math. But the first time Joe learner is caught outside of the IDE, they'll be largely helpless - very little memorized, and they don't even know for sure where to go to look for answers, because their learning environment always held them by the hand.

    Maybe I'm old. There's a part of me that says the best way to learn to program is to type in code from magazines, by hand, and puzzle out what the hell all of it means - to make the inevitable typos, and have to read the error messages, and work out what you did wrong, and why what you did had the effect it did. I know - those days are gone, good riddance, it was hard work - but it's how I learned BASIC (Atari 400 with a membrane keyboard, baby), and it taught me more than the language - I learned troubleshooting skills, and that I am smart enough to be able to do whatever I want to on a computer if I apply myself - and I don't buy that the "new learning" will be nearly as effective as jumping in with both feet.

    Codea already does the one thing it needs to do for people wanting to learn to code: it provides them the opportunity to do so, lowering the barrier to entry to something that a total noob may still feel OK with. It brings back the best parts of old 8-bit programming - the immediacy of being able to sit down and code NOW without a tremendous IDE and application framework being in the way. My daughters don't know a desktop computer from Adam - to them, the iPad is the computer, and by the time they get older, I suspect that for personal computing a tablet will be the rule rather than the exception. If we told them "To learn to code, you'll need to download 3 gigs of Xcode, and we'll start with this giant Objective-C framework" - we're done before we begin. Codea provides them the opportunity to learn to code without such a giant barrier to entry. Codea does not, and should not, try to TEACH them to code. That's what school, the internet, tutorials, and their dear old father are for. And you know what? Maybe we'll sit down together and make a thing where they slide sliders around to change the ellipse, and then they'll KNOW it - not because someone handed it to them - because they BUILT it.

    Enough of the rant - you kids get off of my lawn.

  • edited September 2012 Posts: 580

    .@Bortels I first saw that video a couple of days ago on HN, and it's been bothering me ever since, and I couldn't quite put my finger on why. I think you've nailed it for me. Thanks.

  • Bortels Mod
    edited September 2012 Posts: 1,557

    I have to admit - I didn't watch the video (was at work). And - the article is NOT completely without merit; the only thing I object to are some of the conclusions.

    I'm gonna bitch about Java for a bit. There's a point - I promise.

    At work, Java programmers are a dime-a-dozen - and the VAST majority of them can't program their way out of a wet paper sack. There are exceptions, but for every competent Java programmer, we have 10 that aren't worth the space they take up.

    I've been trying to figure out why - and the conclusion I've come to is that they didn't learn "computer programming" in college - they learned Java. Moreover - they learned, usually, the Eclipse IDE, and some common Java libraries and hosting environments (JBOSS). They have little or no knowledge of the environment the JVM is running in, or how to debug said environment (or frankly even the JVM itself), so - they're ultra fragile. They can handle it if nothing goes wrong, but the real world isn't like that, and when something in the library they're using doesn't do what they need it to do - they're totally lost. Zero troubleshooting skills. Zero knowledge of the OS (it's freaking Red Hat Linux! It's free! Go install it!). Zero knowledge of our environment. And zero interest in learning any of that.

    Case in point - the default Java HTTPS library (or the one they're using, at least) will fail a web GET call if the certificate isn't 100% perfect - correct, not expired, etc. This is normally desirable. That is - until they're trying to connect to a test server, one we don't control, that has a self-signed cert (expired, to boot). We said "oh, well, it's QA - just ignore the fact that the cert is self-signed and expired. With wget, use --no-check-certificate; with curl it's -k - Java must have some way to do that, right?" I suspect it does - but our "Java programmers" were totally high and dry. Everything was a black box; all they knew was how to plug in the parts - they weren't even able to diagnose the issue, they had ME look at the logs to find out what was wrong.

    Apparently, the way to "fix" this is to subclass the HTTPS class in that library and then ignore the cert issue - but this was completely beyond even the comprehension of these guys. They had so little real-world experience with programming, they couldn't even google for and implement a solution. (The final "solution": we paid a couple hundred bucks and got a "real" certificate for that box, and the next time this happens they'll be right in the same boat, because that's how the real world works.)

    Needless to say - even explaining the issue to them was almost impossible; SSL is non-trivial, and even really smart guys can get confused over certificates and signers and keys and such and so. Java (or JBOSS, or both) makes it worse by using a "keystore", which adds a level of indirection on top of things, and as far as I can tell adds almost no benefit.

    (PS for those in the know: We use exactly zero EJB features; this all could run on tomcat just as well. But - they "know JBOSS", so they get JBOSS, and you get this amusing lost kitten look when you ask them about Enterprise Beans. Which is good - they're heinously non-performant. Still.)

    Why did I tell you all of this? here's why:

    I fear people who "learn" with an IDE that holds their hands too much are going to learn the IDE - not how to program. I know these boys did. My advice, to management, was to replace them with old-school C systems programmers, and have them spend a few months learning Java. (no - they didn't do this, unfortunately).

    Disclaimer: I was a "professional" Java programmer for just under a year - went in knowing nothing but "hello world", learned enough to see just how shoddy everything was in about 60 days. This was at a prior employer - I won't say who, PM me if you're curious. I was hired for RADIUS ( http://en.wikipedia.org/wiki/RADIUS - I know it backwards and forwards from my 8 years running PacificNet, a local ISP ) knowledge - my "let's learn java" proof-of-concept server did 8000 auths per second (object reuse! as opposed to 6000 in my perl 'reference' version), and the "real" version I was supposed to help them troubleshoot? 10 to 12. per second. Goal was 40. I pointed out why it wasn't working (they completely re-built all of the data structures on every call; zero concept of what "initialization" means, or of how badly they were thrashing the GC), and they transferred me to another department rather than have someone point out the emperor has no clothes. Oh - and to top it off, because they didn't use real-world data (real world data is messy) when they coded, they were dumping stack almost every call, multiple times. Idiots, the lot of em.

  • Simeon Admin Mod
    Posts: 4,459

    .@Bortels wrote:

    I would say this is fundamentally wrong. It's like saying learning the rules of grammar and vocabulary aren't part of learning a natural language. If you're going to use a real language and do real programming, the meanings of the parameters for function calls are something you either need to KNOW - without an IDE spoon-feeding them to you - or at least know how to look up.

    Actually this is a place I very much agree with Bret Victor. I hate inefficiency. If a computer can do something so that I don't have to, then the computer better do it for me. And it better do it well.

    This includes memorisation: English grammar, spelling, and coding.

    For example, ever since it was introduced a decade ago, I have always relied upon having a dictionary built into every single text view in Mac OS X. I can hover over any word and see the English definition. This has made me a better writer and increased my vocabulary. The harder it is to look up words, the less I will do it (whether consciously or not, it doesn't matter.)

    I never want to look up API documentation. But I have to because it's obviously not all going to fit in my head. So the quicker and easier that is, the better. If it's right on top of the words I just typed, then perhaps that is best of all.

    Memorising this sort of stuff is not useful to me — this stuff should exist outside of my head. The things inside my head are the concepts I am creating and expressing, the rules should be clear enough that they never distract from creation.

    .@Bortels wrote:

    But the first time Joe learner is caught outside of the IDE, they'll be largely helpless - very little memorized, and they don't even know for sure where to go to look for answers, because their learning environment always held them by the hand.

    Bret Victor is actually thinking a little ahead here. I think he understands that a lot of his concepts aren't applicable to existing languages and technologies. His argument is that our whole way of thinking is outmoded.

    His point is that we should never even consider building a system, or a data structure, that doesn't encourage visual use. He doesn't want us to think about how we can apply these concepts to Javascript, he wants us to think about how we can apply these concepts to creating in general. He is targeting programming because it is so held-back by this way of thinking right now.

    In his world, a language and an IDE are one and the same. One doesn't exist without the other (e.g., Smalltalk). To him, the only things that should exist inside your head are the things you want to create — anything else is noise to be eliminated. Victor believes that computers are capable, and should deal with all the rest. We just need to be smart in our approach to tool development.

    --

    That said, Victor's article is focused on teaching. So you can see why he gets caught up in designing an IDE that holds your hand. However I am most certainly a believer in the same philosophy as him: to get what's in your head onto the screen in the most efficient manner possible. I've been interested in this way of thinking for many years now.

  • Bah, sorry, I don't know why I keep referring to the article as a video. Perhaps because I am tired and/or delirious at the moment. I read the article, not watched the video. Maybe I kept referring to it as a video because of all of the little videos embedded in the article. Maybe I'm just brain damaged. Anyhoo, good story! :)

  • Bortels Mod
    edited September 2012 Posts: 1,557

    That may explain the disconnect - he wasn't talking about teaching programming, then; he was talking about teaching a nonexistent ideal of programming.

    I would suggest a 100% required first step is to make a usable language that fits his criteria. I suspect such an effort is doomed to failure.

    Everyone here fluent in a computer language knows that the really difficult parts have zero to do with the grammar and API, generally. Yes - you can have issues with those, but that's paperwork. I think he's focusing on the wrong thing.

    By way of comparison - to deeply analyze the novels of Flaubert, you gotta know French, and if you don't know it, you gotta learn it. And the process of learning French will teach you very little about "Madame Bovary" - it's simply a necessary prerequisite. "Learning French" and "Learning about characterization in Madame Bovary" are two completely different things. (Disclaimer - I neither know French nor have read "Madame Bovary", even in translation - I just like saying "Flaubert")

    Andrew mentioned quaternions earlier - for me, a common stumbling block beyond language is the MATH, which is arguably a second-order part of the language (or, perhaps, "the language from which all other languages derive"). In 3d programming, there is a problem called "gimbal lock" that arises when you use Euler angles (yaw, pitch, roll) - it happens in the real world with a compass gimbal as well. Using quaternions is the "right" solution to this, and that fact transcends language - the 3d programming solution has absolutely nothing to do with the underlying language. But - you can't pretend to understand them (I'm still not sure I do) without representing them in a language, even if that language is "mathematics". Languages are a necessary pre-requisite to be able to move on to the higher-level thought processes.

    There is a school of thought, which I'm not sure I agree with 100%, that suggests the development of language was a necessary prerequisite for higher-level human activity, i.e. civilization. I dunno - maybe I just convinced myself.

    Point is - he wants to teach "programming", but he's pointing out issues with "programming languages", and I think maybe those are not the same level of things. It's as if he wants to teach about "civilization" and is trying to do so by complaining about "the English language". About the closest I can think of to learning "programming", as opposed to a language, were the only computer courses in college that actually taught me anything useful - one was "Data Structures" (where we did things like linked lists, B-trees, and so on - again, generic, above the level of a particular implementation), and the other was "Files and Databases", which could be language-independent, but was very grounded in the real-world, when-the-rubber-hits-the-road issues of performance and so on.

    And - perhaps that's my real issue. He is perhaps TRYING to talk about that 2nd level of programming - and mistakenly using the first level of language as an example, and a lot of what he says simply does not apply there. I should perhaps look again in that light.

    One other thing - I understand you don't want to have to memorize an API - nobody does. But there's a big difference between being familiar and memorizing details. If I provide you with "ellipse(100, 200, 20)" - you KNOW the numbers describe the ellipse being drawn - if you have to look up which is what, ah well. The key to making good use of an API isn't in the trivia - it's in knowing what's there. So long as you can say "I want to draw an oval, and I recall the API had an ellipse function - and I know where to look to see what the parameters are", I'd argue you know the API well enough. The JavaScript example presented again mixed up the levels; you don't need to memorize which parameter is which - you need to know where to find out. And that's nothing he touched on. Better, perhaps, than pretty moving graphics would be a popup of the API docs - something you could bookmark, something with a "back to top" pointer, something you can browse.

    I would argue that most knowledge of computer language and API isn't memorizing details - which parameters mean what in what order - it's knowing what's there. The rest is paperwork.

  • Simeon Admin Mod
    edited September 2012 Posts: 4,459

    .@Bortels Absolutely, I think his points generally align with yours.

    For example, what you refer to as "knowing what's there" he calls "dumping the parts bucket onto the floor." He is saying make your tools obvious and easily explorable. That's never a bad thing.

    About memorising the API, Bret Victor believes it should either be encoded in the language (drawEllipseAtX:0 Y:0 withWidth:100 andHeight:100) or easily inferred. I generally agree with this. After a few years of using Objective-C, I am convinced having the argument names right up-front is a good thing. I don't believe languages need to be designed to do this, though — I'm experimenting in this area with Codea.

    As Bret Victor says at the start of his essay: the features are not the point.

    This essay will present many features! The trick is to see through them -- to see the underlying design principles that they represent, and understand how these principles enable the programmer to think.

    He is absolutely trying to talk about a higher level of creative tools by applying feature examples to existing tools. Sometimes those examples don't work (or you don't agree with them), but the ideas he is trying to describe through them are pretty sound.

  • Bortels Mod
    edited September 2012 Posts: 1,557

    Re: named arguments - I actually thought of that as a possible "solution", but there are practical issues, and it largely just pushes back the issue. Was that "withWidth"? Or "width" or "Width"? Or are the axes X and Y, not width and height? You've traded one set of things to memorize or look up for another (albeit, hopefully, an easier one to remember). Or maybe you take both "width" and "Width" - but what if I pass both? Which overrides? There are enormous practical issues with named parameter passing, which is why it's fairly uncommon (I pass "width" but you're looking for "Width", and the call looks good but is failing, and I'm right back to looking it up in the API). Apple and Objective-C are perhaps an exception, but remember - Apple doesn't care about the developer or about making your job easier. Sad, but true.

    Now - named arguments do help with readability - perhaps, slightly, for those unfamiliar with the API. I'm not sure those are my target audience. (My target audience is me 6 months from now trying to remember what I did - and for that, I use comments.)

    But - remember I mentioned I learned on the Atari 400, with the membrane keyboard? It has that, at least, in common with the iPad, and I'd suggest the following rule trumps named arguments in this environment:

    eschew verbosity

    You know what? "ellipse(10,20,50)" pretty clearly tells me we're drawing an ellipse, and if I want to know what those parameters mean, I can tap the "eye" to bring up help. (Uh - and I just tried it, and it's not context sensitive? Wah. Shows you how often I try to look up an API function.) What I don't want to do is type "drawEllipseAtX:0 Y:0 withWidth:100 andHeight:100" on my iPad, ever, even with autocomplete. (Side note: CamelCase is a pain. I would argue "drawEllipseAtX" yadda yadda has the same semantic content as "ellipse{x=5, etc}", but doesn't force me to hit the shift key a gazillion times.)

    I would also say that sticking to Lua trumps that as well; Lua has a syntax for named arguments, even if it is a bit "hacky"... there's nothing that says "ellipse(args)" and "ellipse{width=10, height=20, angle=5}" [1] can't both be legal. But - even with a named-argument API, I'd keep the names as short as possible while still disambiguating, owing to the platform (i.e. I'd strongly consider X and Y rather than height and width, simply because if your angle is 90 they swap anyway).
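
    To make that concrete - here's a rough sketch of a wrapper that accepts both calling styles. (ellipse2 is hypothetical, not part of Codea's API; I'm just leaning on Lua's f{...} sugar.)

        -- One function, two calling conventions.
        function ellipse2(a, b, c, d)
            if type(a) == "table" then
                -- named style: ellipse2{x=10, y=20, width=50}
                local t = a
                ellipse(t.x, t.y, t.width, t.height or t.width)
            else
                -- positional style: ellipse2(10, 20, 50)
                ellipse(a, b, c, d or c)
            end
        end

        ellipse2(10, 20, 50)                  -- terse, once you know the API
        ellipse2{x = 10, y = 20, width = 50}  -- self-documenting, while you learn it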

    [1] For readers unaware of the syntax: http://www.lua.org/pil/5.3.html - Now THAT'S learning!

  • Posts: 2,160

    Bortels (welcome back! We've missed you) has said most of what I wanted to say, and more. The article is wrong in its basic assumptions. Let's start at the beginning.

    Here's a trick question: How do we get people to understand programming?

    Agree. It's a trick question. The real question should be: How do we get people to understand computers?

    Stuff about the Khan Academy

    • Programming is a way of thinking, not a rote skill. Learning about "for" loops is not learning to program, any more than learning about pencils is learning to draw.

    Of course learning about pencils is learning to draw. It's absolutely amazing to see what a real artist can do with a humble pencil because they've learnt all about its capabilities and limitations. A calligrapher can produce an amazing piece of work with some ink and a stick of wood, but only because they really understand pens and so know how much a stick of wood is and isn't like a pen.

    Yes, you can get a long way in drawing without understanding pencils. But you'll get a whole lot further if you do. Of course, you don't need to start by explaining all about graphite and so forth, but you sculpt the environment so that that information seeps in by itself.

    But what does "is a way of thinking" actually mean? It's just newspeak.

    People understand what they can see. If a programmer cannot see what a program is doing, she can't understand it.

    Rubbish. People understand when they have done it lots of times. Can you see a quaternion? No, but you understand it once you've used it lots of times and seen what it is for.

    Thus, the goals of a programming system should be:

    • to support and encourage powerful ways of thinking

    See my remark above. This is so generic as to be worthless.

    • to enable programmers to see and understand the execution of their programs

    Again, I object to the word "see". Why do I want to see it? If I could see it, I probably didn't need the computer to do it.

    Stuff about live-coding

    Alan Perlis wrote, "To understand a program, you must become both the machine and the program." This view is a mistake, and it is this widespread and virulent mistake that keeps programming a difficult and obscure art. A person is not a machine, and should not be forced to think like one.

    Why is it a mistake? Programming is telling a computer what to do. Ever tried telling someone or something what to do? You need to enter their world and understand how your instructions will be interpreted.

    Here's my fun anecdote on this. My kids wear glasses and have done for most of their lives. Once, when she was about 2, my daughter lost her glasses somewhere in the house. We spent a long time looking for them, and every five minutes we were asking "Where did you put your glasses?", which just got a "Don't know" response. Eventually, I tried a different tack: I told her "Put your glasses on", whereupon she went to where she'd left them and put them on.

    My goal is always to get the computer to do what I want it to do and not the other way around. But to do this, I need to understand how the computer will react to what I tell it and for that then yes, I need to "become the machine".

    How do we get people to understand programming?

    We change programming. We turn it into something that's understandable by people.

    Whereupon it no longer is programming. Not that "being understandable by people" and "getting a computer to do what it is told" are incompatible, but if you focus on the first exclusively you are in danger of losing the second.

    There are lots of vague statements in that first section with nothing to back them up. I can't say for sure that they are out-and-out wrong, but with no evidence to say that they are right - and a huge Bayesian prior against it - I'm not convinced.


    When I say, "I want my kids to learn to program" what do I mean? I'm not totally sure. Here's what I think I mean:

    1. I had a lot of fun as a kid programming my BBC Micro and CBM64. My kids are a bit like me and would probably enjoy it as well.

      But the environment had a lot to do with that as well. My dad built stuff for us to control from the CBM64 so we could write programs that interacted with our toys. How easy is that these days? Also, as I mentioned, I subscribed to programming magazines where I could copy out some code and type it in. Nowadays, you'd download it off their website (I was going to say, you'd get a disc attached to the magazine, but then realised that not even that is true anymore).

      How do we recreate the effect of that environment now?

    2. I'm having a lot of fun programming now - thanks in no small part to Codea. I'd like to be able to share that time with my kids.

      This is where it gets difficult, though. I'm having a lot of fun because I know the basics of programming. They don't. So at the moment, the sharing is along the lines of me writing a program that they play with (Anagrams, Hangman). I've tried writing a program with my eldest, but it didn't get anywhere because ... wait for it ... he doesn't have a clue what is possible with a computer because he doesn't understand computers.

    3. I think that if they learn to program at a young age it will make it easier for them to learn mathematics later (Whoa! You didn't spot that one coming, did you? See What does mathematics have to do with programming? for more details), and will give them a very valuable skill for later in life.

      With this, what I really want is for them to understand computers: namely, what computers are good for and what they aren't good for. When it's a reasonable task to give to a computer, and when it's better to do it yourself. Then, I'm sorry to say, "for loops" are everything.

    So my learning objectives are simple; I just don't - at the moment - know how to meet them.

    1. That the student learn that a computer is a tool, and be familiar with its uses and limitations.
    2. That the student have a lot of fun doing it.
  • Posts: 2,160

    Different point (so different post). Simeon made a point about dictionaries and being able to look up words easily and how that enriches his language skills. Sure, but you're talking about a language that you already know. Ever tried learning a new language from a dictionary? It can't be done. You need a native speaker around to tell you that the correct word for that concept in that situation is this and not the other. Dictionaries don't tell you that. Dictionaries are for when you already know the language.

  • Simeon Admin Mod
    edited September 2012 Posts: 4,459

    .@Andrew_Stacey interesting that you bring up mathematics — one of Bret Victor's passions is to rethink how maths is taught. Have a look here: http://worrydream.com (his articles under "kill math"). It would be interesting to know what you think.

    My point was not that you learn a language entirely from a dictionary, it was that having an inline dictionary stops you from having to memorise vocabulary. Same way that I don't memorise every API I use. I agree with Bret Victor on this — rote memorisation is not something you want occupying your mind when you have far more interesting things to fill it with.

    I'd also disagree that learning about pencils has much to do with learning to draw. Learning to see is more important, and learning to see the pictures in your mind is one of the hardest things I have ever had to learn. The technical process of transcribing the image in your head onto paper is pleasant, and can be assisted by tools, but it doesn't happen to your satisfaction unless you have learnt how to visualise things.

    Programming is a way of thinking: it involves abstraction, encapsulation, connections, logic, spatial relationships and a whole bunch of other problem solving abilities. Bret Victor's point is that thinking like a computer (i.e. simulating the instruction pointer in your head) should not be one of those requirements. If you use the former to control a computer without requiring the latter, you are doing a more efficient job (in terms of expending useful mental effort).

    Most of my programming now takes place in my head, away from the computer. I'm thinking about data structures, connections, algorithms. Problem solving. Most of the hardest problems end up being solved by drawing pictures and visualising. If I didn't then have to go and write the code to translate those solutions into a form the computer understood, I would be very happy.

    It's the same with drawing — if I didn't have to physically draw to make the things I see appear in the world, I would use that instead.

    Anything that makes it easier and faster to get an idea from your head and into reality is a good thing.

    In the end that's all it's about, expressing your ideas. Bret Victor believes that there is a better way to do that than the technical skill of programming — his attempts are still a bit nebulous, but I feel like it's something worth exploring.

    .@Bortels it's only context sensitive if you select the word "ellipse" and choose "Lookup" from the context menu.

  • Initial spark

    Bret Victor shows off some really fancy IDE capabilities for interactive, computer-aided programming, where you can understand everything without prior knowledge, just by doing and exploring.

    Proposal

    Codea should adopt as many of Bret's ideas as possible.

    My take on it

    We cannot change the way Lua looks and works (should we if we could?) and transforming Codea into a Scratch-like IDE (which was proposed in another thread) is something completely different (and shouldn't be done).

    So we're talking about the wide field of computer-aided programming. Why isn't Codea more like the things that Bret Victor shows us, and how far can Codea get down that path? Well, I don't know exactly what Bret has shown. Either it is a fully capable IDE that probably works with small and very specific tasks, or it is a special-purpose IDE for every single example. I'm not sure, but that's what I think, because doing it in a universal way is a lot of work, and if you are not a programmer you may not even realize how much work it is.

    The easiest kind of computer-aided help is probably code intelligence. You type something, and the IDE guesses what you are going to type and suggests completions. You open a bracket, and the IDE shows parameter lists it knows from scanning existing code and again makes suggestions. If it knows the types of variables, it can also suggest specific variables. Simeon has already promised to improve the way of getting help from Codea, which is a very good (and doable) thing.
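
    Even the simple version is little more than prefix matching against a list of known names. A toy sketch (the API list here is obviously abbreviated):

        -- Toy code intelligence: suggest known names for a typed prefix.
        local api = { "ellipse", "ellipseMode", "line", "lineCapMode", "rect" }

        local function suggest(prefix)
            local matches = {}
            for _, name in ipairs(api) do
                if name:sub(1, #prefix) == prefix then
                    table.insert(matches, name)
                end
            end
            return matches
        end

        -- suggest("elli") --> { "ellipse", "ellipseMode" }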

    Interactive editing is not so easy. You have drawn a scene with a flower, and now you turn the knob for the petal color. How does the IDE know what to change? The best guess is to re-run everything. This may work for some isolated code if it is simple and self-contained. Or it is part of a large code sequence, but it is safe to just re-run the function you are playing with at the moment. Or not. What if you change the position of the petals? The background must be redrawn, or else you will fill the screen with stale impressions of outdated drawings.
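
    That said, Codea gets part of the way there already, because draw() re-runs every frame: hook the value up to a parameter slider and you get a crude version of Bret's knob for free. A sketch (assuming the parameter() slider call; my "flower" is just six ellipses):

        function setup()
            -- creates a slider in the sidebar, bound to the global petalHue
            parameter("petalHue", 0, 255, 200)
        end

        function draw()
            -- redrawing the background each frame avoids the stale-drawing problem
            background(30)
            fill(petalHue, 100, 150)
            for i = 1, 6 do
                pushMatrix()
                translate(WIDTH/2, HEIGHT/2)
                rotate(i * 60)
                ellipse(0, 60, 40, 90)   -- one petal
                popMatrix()
            end
        end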

    Instant code replay (or whatever we should call it) is equally difficult. I mean the punch card where you can point to an instruction inside a loop and see its intermediate result in time and space. How is the IDE going to do that for you? I mean: from the point of view of Simeon, who needs to code it?

    See, what Bret has shown us is a presentation of design principles, not a fully working product, because it would be a lot of work to blow it up to the size of Codea.

    Interlude: While talking about Bret Victor, Brett Terpstra comes to mind (because of the Christian names; silly, isn't it?). He writes a lot of scripts to help himself (and others) do his work. Why does he write scripts? Because it is enough for him, and because writing a graphical tool that everybody could use without even reading the first chapter of the manual would not make him Brett Terpstra, the guy who probably writes ten tools a day to improve his workflow and then tells you about the best one of them. He would be Brett Terpstra, the once-a-month super-polished tool presenter. End of interlude.

    I appreciate any help a computer can provide, but if I had a wish list of Codea's future capabilities, adjusted for Simeon's limited time, then only very few of Bret's ideas would make it into the "realistic" edition. Nevertheless, Simeon may try and surprise us with things nobody has done before.

  • Posts: 2,160

    (This bit is somewhat silly.)
    Learning to see is easy. I took a book away with me on holiday when I was 13 and after a month I'd learnt to see and could draw. It's learning to make something new that is the interesting bit, and that's where you really need to understand the capabilities of your tools.

    In what you (@Simeon) say, there's still a bit of conflation between initial learning and onward development. Bret's article reads like "Here's how to help someone who has no idea", whereas what I get from what you write is "Here's what would help someone who is some way along the road." Then dictionaries and so forth help. But first you have to learn how to use them. The peril of Wikipedia is not that it is there, but that it makes us lazy. If we knew how to use it properly - read: if we knew how to teach how to use it properly - then we wouldn't have all this angst about students "just" quoting from Wikipedia.

    Maybe I misunderstood something. "thinking like a computer (i.e. simulating the instruction pointer in your head)". That isn't what I mean by "thinking like a computer". I mean thinking about data structures, control instructions, and so forth. That's "thinking like a computer".

    But here's where I really, really, really disagree:

    If I didn't then have to go and write the code to translate those solutions into a form the computer understood, I would be very happy.

    It's the same with drawing — if I didn't have to physically draw to make the things I see appear in the world, I would use that instead.

    Anything that makes it easier and faster to get an idea from your head and into reality is a good thing.

    In the end that's all it's about, expressing your ideas.

    This makes me quite sad. It reduces the medium of expression to a secondary role. If I draw, then my choice of charcoal versus pencil versus pen versus iPad makes a huge difference to what I draw and how it will turn out. Similarly, it's only when I write a proof down on paper that I really, truly have proved it. I have all manner of "proofs" zooming round in my head, but only when I make them concrete do they actually coalesce into something real. That process of getting it down on paper is not nothing. That's the "pushing the paint around on the canvas" part. That's where the vague and hazy idea becomes a reality, and that's the magic.

    As for rote learning, well, I will admit that when Bret said something disparaging about learning times tables (at least, I read it that way), my tolerance level went down quite a bit. One fun thing about watching everyone play with Codea is seeing how much maths everyone is using (and seeing how many "I don't know any maths" comments there are!). How much easier would life be if everyone here remembered their times tables? Or their trigonometry relations? SOHCAHTOA, anyone?

  • First off, thank you all for the great discussion! Good stuff. :)

    There are so many points to respond to that I think I'll just try to address a couple of specific ones. It is my perspective that computers and technology work for US, not vice versa. I am not a parser, nor a compiler, nor a memory manager (not recently, anyway). There is great benefit in understanding how things work at the lower level, but if I am trying to get something done, the underlying mechanics of the thing should not impede my focus on capturing and implementing the actual concept I am working on. We do not, by and large, spend a lot of time futzing with our pens or mechanical pencils trying to get them properly configured to write something. We grab them and we write with them.

    I started with BASIC, moved on to machine code (didn't have an assembler), then assembly, C, C++, Perl, Java, C#, PHP, Python, Jython, Ruby, Groovy, ObjC, and Clojure. I spend most of my time writing Python nowadays and I absolutely do not miss the other languages. If I need performance for something that is CPU bound (most work I do now is IO-bound client/server stuff) I will write some Java.

    Most of the difficulty I ever have is learning the damn frameworks for each language platform. Each platform has differences that really do not go hand in hand with the general concept of knowing how to program. Any tool that helps inform me about the specifics of platform frameworks saves me time; it is not NECESSARY for me to have memorized the .NET GDI or DirectX calls, or Swing (GridBag!), or Spring.

    If I switch to do some Java programming, help me remember the UI methods and parameters so I can get the damn thing done that I am trying to do. My goal is not to "Learn Java Swing"; my goal is to "develop a front-end for some analytics tools". I view these activities largely as "Yak Shaving": any apparently useless activity which, by allowing you to overcome intermediate difficulties, lets you solve a larger problem.

    I guess the genesis of this whole discussion, on my part, is making it easier to discover the functionality available to me in a new platform/tool, such as Codea. Once I get the hang of the specific dialects, it's easier to remember. If I can't even get out of the starting gate because I don't know how someone else has categorized a concept, it's less useful to me. Affordances are helpful there: don't get in my way, and make it easy for me to see what tools and functionality are available to me.

    .@Codeslinger, I like your breakdown, but I need to clarify a misstatement of my original reason for posting this. I'll quote myself here:

    Xavian Said:

    All this in mind, I came across a fantastic article tonight posted on Hacker News that is directly relevant to reaching goals of making modern programming easier to learn and there are several suggestions directly applicable to Codea itself:

    http://worrydream.com/LearnableProgramming/

    There are several suggestions directly applicable to Codea itself, which I outlined as:

    "Make meaning transparent" and "Explain in context" sections

    I am not advocating the stance of:

    Codea should adapt as much of Bret's ideas as possible.

    As many people have discussed, a lot of what he is saying is difficult if not impossible to apply to existing approaches. It's the underlying concepts that are most interesting.

    If I were to go back and state my observation more simply, I might just say:

    I like Codea; it's pretty easy to use, and my only minor nitpick is that I wish Codea's context-sensitive documentation feature were a little better.

    In any case, this spurred a great discussion and I have a much better understanding of other folks' perspectives. I think there was a lot of reaction to the concept of adopting all of Bret's approaches, and how doing that could cause other development items on Codea to grind to a halt or lessen Codea's appeal to existing folks.

    Thanks for the copious and detailed feedback! :)

    -Xavian

  • Posts: 2,160

    Just read Kill Math. Sure, kill whatever that is, it sounds horrible. But it ain't what I do, nor what I try to teach.

  • SimeonSimeon Admin Mod
    edited September 2012 Posts: 4,459

    .@Andrew_Stacey

    The peril of Wikipedia is not that it is there, but that it makes us lazy.

    Students quoting from Wikipedia is not really relevant to what I had in mind. The fact that Wikipedia exists means that I can offload things from my head — I know that I can reference them when the time comes. This frees up my mind for real problem solving.

    There is the stuff inside your head and the stuff outside. I would prefer, as much as possible, for the things not directly involved with creative expression to exist outside my head.

    This makes me quite sad. It reduces the medium of expression to a secondary role.

    To me the medium doesn't matter as long as the expression is consistent with what is in my head. The tools are there to make it happen. Sometimes I find I can't get the idea out of my head fast enough; the tools feel like a limitation at that point.

    As for pushing paint around the canvas — I do that too. Sometimes you randomly draw or paint and then your visual system takes over and you end up constructing a model from your random shapes, lines and whatever emotions you were feeling at the time. It's a great experience, but I still place the importance on visualising.

    Edit: In the above case, where the medium can help you experiment and trigger you into discovering good ideas, I agree that it's important.

    Maybe I misunderstood something. "thinking like a computer (i.e. simulating the instruction pointer in your head)". That isn't what I mean by "thinking like a computer".

    I don't consider design, data structures, abstraction, and spatial problem solving to be "thinking like a computer." I find that I apply the same mental tools when forming an argument, solving a puzzle, building a model, and so on. Those mental tools should be used often and well.

    By "thinking like a computer" I refer to when you observe a code listing and mentally simulate what happens when the code runs (most often without realising it). The argument is that the computer should do this for you, immediately and often.

    That's where the vague and hazy idea becomes a reality, and that's the magic.

    I agree with this. And I don't think anything that Bret Victor wrote goes against this idea. Transcribing an idea is extremely important, but if you can do it faster, more directly, and more efficiently, then I think you should do that.

    Because then you can move onto the next idea even faster.

  • Posts: 1,211

    My wife has been a public school teacher for 25 years, is very skilled at her work, and recognizes that different people need very different approaches to understand and make use of information. There is no perfect approach that works for everyone. One person's inspiring insight is another's plodding frustration. The tool or technique that looks glowingly obvious to you may be a muddy mess to someone else.

    Speaking for myself, I like nothing more than examples. If I can find an example that's close to the problem at hand, I can twist it, turn it, learn from it, and build what I need. If I'm thrown back on just the bare documentation, things move very slowly.

    Give me examples, and I'll move... maybe not the world, but at least a few pixels.

  • BortelsBortels Mod
    Posts: 1,557

    So - I got even less far into "kill math" before my spine said "rubbish" and I moved on.

    He seems to have a bias against symbol manipulation - and at the very bottom, symbol manipulation is all there is.

    I think Andrew hit on a giant point - the computer is a tool, a means to achieve a goal, and programming is learning to use that tool. To use any tool, you need to understand both the tool itself, and the goal you're using the tool to work toward.

    There's a balance there - but the original article seems to be saying "we need to teach people to understand the goal they're working toward" without teaching them about the tools - and that's not going to work.

    I took some art classes in college (I took lots of different stuff in college) - you do need to understand the tools, and how they work, before you can use them toward a goal. I got pretty good with charcoals, surprisingly enough - there are effects you can achieve with them very easily that are quite difficult in other mediums. Now - understanding light and shadow and making the drawing is the real goal, but doing it well requires you understand the tools - if only to know which ones are appropriate to do the job.

    There are a lot of "programming" classes that focus on the tool - perhaps he's right that the emphasis shouldn't be on "Learning Java" but rather on "learning to program". But I think he goes too far, in that you cannot "learn to program" without knowing at least some of the tools. Necessary Prerequisite.

    Now - Math. math math math. You know - Pythagoras came up with his famous theorem (or rather, the proof for it) sometime around 500 BC - this significantly predates computers, and computer visualization. Mathematicians have been doing some pretty neat things for a long time without having something show it to them. They use their minds. We can too. Indeed - I think an important part of learning is actually using your mind, being forced to think a bit. You learn better by doing than by simply being shown.

  • SimeonSimeon Admin Mod
    edited September 2012 Posts: 4,459

    .@Bortels Bret Victor definitely has a bias towards the concrete — he likes to create with a concrete, visual structure and then abstract and encapsulate it. Adding and linking symbols as he goes.

    He likes to begin with an immediate visual representation, which requires real values that are later abstracted. This is one part I disagree with for non-learning applications. I think starting in the abstract is often a good skill to use.

    Regarding the art analogy — I'm not talking about achieving specific effects with a certain tool. I'm just talking about getting the idea that's in your head onto the page. Your choice of medium will affect that process, but the process itself should not be dependent on your choice of medium. And that process is the one I'd like to speed up — so you can iterate on more ideas in the same time span.

    Like you say, @Bortels:

    Now - understanding light and shadow and making the drawing is the real goal, but doing it well requires you understand the tools - if only to know which ones are appropriate to do the job.

    Bret Victor's point is exactly that if the tools are more approachable, more discoverable, and more instantaneous, then they will get in the way less. And you can spend more time developing your internal understanding of light and shadow — surely this is more important than your understanding of how to create certain effects using specific tools.

  • BortelsBortels Mod
    Posts: 1,557

    You say "Your choice of medium will effect that process, but the process itself should not be dependent on your choice of medium"

    Thing is - this is patently untrue, and moreover almost cannot be true. If my goal is to, say, draw a face - the process is fundamentally different when I choose to use pencil and paper versus, say, doing it in Codea. I'm not even sure where "should not" can come from. Indeed - it should! The process should and must vary depending on the tools you use. This is actually part of the point of tools, and of having an assortment - each tool has strengths and weaknesses, and knowing and using an assortment lets you learn to use the best tool for the job at hand. Saying that the tool shouldn't matter is saying all tools should be the same - that a hammer is good enough for any job. It ain't so, and it's not desirable for it to be so.

    As for tools "not getting in the way" - that's lovely, as far as it goes, but the simple fact is that sometimes, for a complex task, you may need a complex tool. This isn't something to shy away from - complexity can give you power. Someone who knows photoshop backwards and forwards can work magic - magic you simply couldn't do with other tools. More approachable is fine until it starts to actually break functionality.

    This guy needs less pie-in-the-sky and more hands-on-the-tools.

  • Lovely discussion. I read @simeon and tend to agree, then I read @bortels and @andrew_stacey and like the argument they are making too.

    My 2c. For several months now I've been developing a Lua IDE (http://studio.zerobrane.com/) that supports live coding. I've taught an intro computer science class to middle- and high-school students (13-17) using this IDE and formed my opinions on what seems to work and what doesn't.

    How much help do you need from the environment you work in? Watching people who are not familiar with programming or IDEs learn, I'd say: as much as you can get, as long as it doesn't get in the way. There are two benefits in not having to worry about the API and the results of various calls: (1) you avoid the context switching that kills the flow of your work, and (2) you start seeing connections where you saw none previously, because you take in information through more channels and involve centers that may have been dormant before. Seeing the immediate effect of your changes also shortens your REPL cycle, which, again, helps keep you in the flow.

    What you may be losing (to some degree) is the learning that happens when you think about what should happen before it happens and then test your theory. This is very powerful, and one of the qualities that I believe sets good programmers apart from average ones. When you fix or change something, there is always a tendency to make the change and see the result, and the availability of immediate feedback from the environment makes it almost a reflex. But not so fast. If you give yourself just a tad of time to think about what should happen and why you think so, and then check what really happened, that brief interval is where most of the learning happens.
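
    One cheap way to keep that habit, even in a live environment, is to write the prediction down as code before running anything. A minimal Lua sketch (the clamp function here is just an illustration, not something from my IDE):

        -- a small function under study
        local function clamp(x, lo, hi)
            return math.min(math.max(x, lo), hi)
        end

        -- state the theory first, then let the run confirm or refute it
        assert(clamp(15, 0, 10) == 10, "values above the range should pin to hi")
        assert(clamp(-3, 0, 10) == 0,  "values below the range should pin to lo")
        assert(clamp(5, 0, 10)  == 5,  "in-range values should pass through")
        print("all predictions held")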

    Bret is very explicit: "These are not training wheels." People who complain about the applicability of what's shown to a small set of graphical applications are largely missing the point. Bret talks about visibility of the effects of changes in programs. When I monkeypatch a small function in a complex system that involves distributed servers with layers of caching, templating, and various backend systems, knowing the potential impact of that change could save hours of my and someone else's time. And I've been doing this for almost 20 years. What saves me is the ability to think about possible consequences based on my knowledge of the system, but even that knowledge is limited, and my environment could go a long way in helping me assess those consequences.
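
    For anyone unfamiliar with the term, the kind of Lua monkeypatch I mean looks roughly like this (the templates module and its render function are made up purely for illustration):

        -- a stand-in module, just so the sketch runs on its own
        local templates = { render = function(name, data) return "<" .. name .. ">" end }

        -- the patch: wrap the existing function to observe its behaviour
        -- without touching the code that defines or calls it
        local original_render = templates.render
        templates.render = function(name, data)
            print("render called for " .. tostring(name))  -- visibility into the change's effects
            return original_render(name, data)
        end

        print(templates.render("home", {}))  -- logs the call, then behaves exactly as before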

    TL;DR: I've been working on enabling my IDE to help my users better see what happens with their programs. For those interested, there is a demo for live coding with Lua (http://notebook.kulchenko.com/zerobrane/live-coding-in-lua-bret-victor-style) and one for live coding with love2d (http://notebook.kulchenko.com/zerobrane/live-coding-with-love).

  • BortelsBortels Mod
    edited September 2012 Posts: 1,557

    This is sort of the area where you can have wildly divergent viewpoints - and both have validity, and be complete hokum, at the same time.

    I actually think this divides into the "but it could be so much better" camp, and the "if it was good enough for me, it's good enough for you" camp. Both sides have issues. Am I interested in making learning less painful? No - not really; it was painful for me, and I think the pain helps. Is it good to strive to teach better? Sure; but at some point, they have to put their money where their mouth is, and stop bagging on JavaScript. :-)

    I actually think there's one other thing Codea provides, getting back down to nuts and bolts, that's heinously important - I referred to it obliquely before, when I talked about typing in programs from magazines: Working Examples.

    More so than some fancy IDE with context-sensitive help or hours spent poring over the API, real-world working examples of real code doing real, non-trivial things will teach you - with some study, and ideally some tweaking and experimentation. Again - I liken it to a natural language. Class is fine, and a dictionary is fine, but if you want to learn French, go listen to native speakers talk; interact with them, find out what they say, and what they say when you try. Likewise - you want to learn Codea, or Lua-in-the-large? Go look at the examples. Understand what they do, and how. Change them, improve them, ideally break them, then fix them. I think this activity, more so than API issues, gets to the very heart of "learning to program".
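
    To be concrete, the kind of working example I mean is small enough to type in and obvious enough to tweak. Something like this little Codea sketch (my own throwaway, so treat the details as approximate):

        -- a complete program: a ball that bounces off the screen edges
        function setup()
            x, y = WIDTH/2, HEIGHT/2   -- start in the middle of the screen
            vx, vy = 5, 7              -- velocity in pixels per frame
        end

        function draw()
            background(40, 40, 50)
            x, y = x + vx, y + vy
            if x < 0 or x > WIDTH  then vx = -vx end   -- bounce horizontally
            if y < 0 or y > HEIGHT then vy = -vy end   -- bounce vertically
            fill(255, 200, 0)
            ellipse(x, y, 50)
        end

    Break it (flip a sign, drop the background call), see what happens, then fix it. That's the lesson.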

    I have a friend who teaches English as a Second Language - fairly advanced, 3rd or 4th year, so the people in class already have the basics: an OK vocabulary and a grasp of the grammar. It is apparently almost entirely a study of idiom and custom. The "oomph" of English isn't the words, or the grammar - it's the deeper understanding that certain phrases ("the sky's the limit!") aren't to be taken literally. You're not expected to re-invent the idioms (I doubt you could) - you learn them from example. I suspect all specialized vocabularies - computer languages included - have these sorts of things - they are extra-language, extra-IDE, meta-programming juiciness.

    It is unreasonable - in any science - to expect anyone to push the envelope if they have to spend their time re-inventing the wheel: we see farther because we stand on the shoulders of the giants who went before. A good grounding in the "cultural literacy" of any science/language/pastime is essential if you want to excel. And examples - good examples - are basically the only way I can think of to get that grounding.

  • Posts: 115

    Interesting article and great discussion here. Some of the videos in that article were quite intriguing.

    As a trial-and-error programmer (AKA a hack), I am all for options that make it easier for me to cut down on the time it takes to get my ideas working as expected. Learning the language is important, but the older I get, the less my memory works. My mother has Alzheimer's disease and I am beginning to think it is genetic.

    So any tips, hints, etc. that may be built in to Codea (with an option to turn it off for those not wanting it) would get my vote.

    .@paulclinger, I just wanted to thank you for posting your link. Up until downloading your ZeroBrane Studio, I hadn't fiddled with Lua on my PC. I am very much liking your interface along with the help, examples, and the real-time Scratchpad function. Kudos and good luck in your future endeavors.

  • Posts: 83

    I've checked Khan Academy's videos; I see they're written in JavaScript, and I've ended up with a lot of errors. They're still very good videos and easy to understand, which is a plus for a beginner...
