My plan is to create several different programs that perform specific Algebraic

operations. My boys are learning Algebra 2 and I thought it might be a fun way

to help us all learn Algebra and programming together. Python seems to be a

good language for learning how to program.

If you are a programmer and want to start doing some 'real' math stuff, your approach is fine.

Conversely, if you are a mathematician (or at least someone whose math fundamentals are well established), then starting by getting your hands dirty with coding up some paper-and-pen/chalk-and-blackboard math into a system is a good goal, and Python is as good a language for this as any.

The system Sage (http://www.sagemath.org/) does some serious math with Python as glue.

However if your boys are new to both math and programming, you are doing them a disservice by mixing the two using python.

The problem is that python is an imperative language and uses the '=' sign for assignment. In math of course '=' stands for equality.

Now these two usages of '=' are both center-stage and completely different:

- the math = is by definition symmetric -- we can replace x=y by y=x. Whereas in programming we can never replace x=1 by 1=x.

- More significantly and dangerously, the programming var=rhs has a timing element and in fact introduces a basic notion of a 'unit of computation' -- the statement -- a notion completely absent from the math something=somethingElse.
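The asymmetry is easy to demonstrate in Python itself; a minimal sketch:

```python
# '=' in Python is assignment, not equality -- and it is not symmetric.
x = 1    # binds the name x to the value 1

# The reversed form '1 = x' is not even a legal statement.
# We can check this without crashing the script by compiling it:
try:
    compile("1 = x", "<demo>", "exec")
except SyntaxError:
    print("1 = x is a syntax error")

# Equality testing uses '==', which IS symmetric:
print(x == 1, 1 == x)  # True True
```

So Python keeps the two notions apart syntactically (`=` vs `==`), but the visual pun with the math `=` remains.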

Now one word or signifier meaning different things -- a pun -- is not necessarily bad. To the extent that we humans have more entities to deal with than ready words, it's even inevitable. Just yesterday there was a discussion about whether 'python' is a TV comic character or a snake, 'pike' a fish or a poker. I think these are relatively harmless.

However with '=' in math and programming, the two are too different to be equated(!!) and too close to be separated.

Because after the programming statement var = rhs

the math predicate var = rhs is typically true.

But then what happens with something like this? x = x+1

For a programmer this is common daily fare.

For a mathematician it's an impossibility.
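A minimal Python session makes the clash concrete:

```python
# As a program statement, x = x + 1 is an update over time:
x = 5
x = x + 1    # read the current value of x, add 1, rebind the name x
print(x)     # 6

# Read as a math equation, x = x + 1 has no solution:
# subtracting x from both sides leaves 0 = 1, a contradiction.
```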

So I suggest you try it (x=x+1) on your boys.

If they think it's OK, you've damaged their mathematical acumen.

If not, they've yet to begin programming.

If they can answer to the effect that in some contexts it's natural and in others not, then of course they are very mature and/or geniuses.

I should mention that John Backus, the creator of the first high-level language (Fortran) and a Turing Award winner, more or less said that assignment is the single biggest reason why programming languages are in such a mess:

http://www.thocp.net/biographies/papers/backus_turingaward_lecture.pdf
Having said that, I also need to say that most programmers don't agree with that.

The minority that do would earlier have been called 'declarative devotees'; nowadays the fashionable term is functional programming.

My own rather fringe minority position is that Backus et al. are right in denouncing assignment. However, so far the attempts at practically realizing this smack of throwing out the baby with the bathwater.
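For a flavour of what the functional camp means, here is a sketch (standard library only) of the same running-total computation written twice: once with repeated reassignment, once with no rebinding of names at all:

```python
from functools import reduce

# Imperative style: 'total' is reassigned on every loop iteration.
total = 0
for n in [1, 2, 3, 4]:
    total = total + n    # rebind 'total' to a new value each step

# Functional style: no reassignment; each name is defined exactly once,
# and the accumulation is expressed as a fold over the list.
total_fn = reduce(lambda acc, n: acc + n, [1, 2, 3, 4], 0)

print(total, total_fn)  # 10 10
```

In the functional version there is no statement whose meaning depends on *when* it runs, which is precisely the property Backus was after.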

So for the time being, languages like python remain eminently practical.

It's just that they are not so good for building kids' theoretical foundations -- especially of a mathematical sort.