Teaching kids to program (in Java)


Arne Vajhøj

The former is a primitive, while the latter is a reference to an object of
type Integer. This is covered in any introductory tutorial.

Yes. And every first-year CS student will understand that.

But that does not mean that a 12-year-old will.
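
(For reference, the int/Integer distinction being discussed, as a minimal
Java sketch; the variable names are just for illustration:)

    int count = 42;          // primitive: just a 32-bit value
    Integer boxed = 42;      // reference to an Integer object (autoboxing)
    Integer missing = null;  // a reference may be null...
    int oops = missing;      // ...compiles fine, but unboxing null throws NullPointerException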

Arne
 

Arne Vajhøj

No, he got exactly what the language spec says he should get.

It is not obvious to anyone who has not run across it before what
is happening in this case. Leading zeroes are used in the real world, and
never mean octal (that I have ever seen).
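
(For anyone who has not run across it, the gotcha in Java, as a minimal
sketch:)

    int day = 010;                 // looks like ten, but the leading zero makes it octal
    System.out.println(day);       // prints 8
    System.out.println(010 == 8);  // prints true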

But can you find a programming language that does not require
learning some concept that is not used outside programming?

I doubt it.

Arne
 

glen herrmannsfeldt

(snip, someone wrote)
(snip, then I wrote)
(and also wrote)
My first language, mostly, was IBM Fortran IV, Fortran 66 with some
useful extensions. Much has been added since, including the most
recent 2008 standard. Fortran 66 is relatively simple, but with
some strange features left from earlier systems. Still, I didn't
have much trouble learning it during the summer before 9th grade.
(The IBM reference manual was my 8th grade graduation present.)
C is rather simple.

It is, but you have to understand pointers earlier than with
most other languages.
But explaining what is going on with incorrect programs
is not so fun.

Well, many languages have that problem, in many strange ways.

One that I remember from many years ago comes from watching others
learn Fortran. In the case of Fortran, it sometimes isn't obvious to
new programmers that you need to dimension an array in a called
subroutine. If you forget, it turns out that the compiler usually
can't detect it, and instead you die in a strange way at run time.
(The compiler instead believes that you passed the address of
a function, and jumps into your array.)

With the appropriate subset, it doesn't seem to me so bad. The base
language, as with C excluding the library, is fairly simple.

As someone mentioned, you have the complication of octal constants,
but it isn't that hard to learn about that one.
Medium-sized language with a huge standard library.
I would say fewer quirks than most languages.

The library is big, but with a small subset you can do the usual
things that beginning programmers need to do.

-- glen
 

Gene Wirchenko

On 4/13/2012 11:39 AM, Gene Wirchenko wrote:
[snip]
It is not obvious to anyone who has not run across it before what
is happening in this case. Leading zeroes are used in the real world, and
never mean octal (that I have ever seen).

But can you find a programming language that does not require
learning some concept that is not used outside programming?

I doubt it.

Not the point. The leading-zero octal issue is that something
that appears mundane actually has a special meaning. I have seen
various ways for numeric literals to be represented in code, and
almost all of them have some weird characters to clue one in.
Examples:
=F'100'    /360 & /370 Assembler fullword integer
1234H      Z-80 assembler hexadecimal constant
0x1234     C's hexadecimal representation
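
(For comparison, Java's own notations, where only octal lacks a
distinctive clue; the values are arbitrary:)

    int hex     = 0x1234;   // hexadecimal: 0x prefix
    int binary  = 0b1010;   // binary: 0b prefix (Java 7 and later)
    int octal   = 01234;    // octal: nothing but a leading zero
    int decimal =  1234;    // plain decimal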

If I had not heard of the leading-zero octal gotcha and someone
told me about it, it would sound so silly that I would want some proof.

Sincerely,

Gene Wirchenko
 

Mark

On Thu, 12 Apr 2012 04:36:19 -0700, Roedy Green

Thanks for all the posts.

[-snip useful and interesting post-]

I should have been clearer in my original post, but I am not talking
about teaching a large group of kids in a school environment. I want
to teach one or more of my own kids a bit of programming at home
because their schools don't. IT there consists of learning a bit
about Microsoft Office and making pretty models out of cardboard!

My oldest son was briefly interested in programming until he found out
it involved some work and rapidly gave up. My younger son seems more
motivated at present but may go the same way if he can't write a
Minecraft-like game in 5 minutes ;-)

The trouble is that I know little about writing games, as I have never
really done this before. I would like to start with something with a
large payback-to-effort ratio, but I don't really have a good idea for
this.

I selected Java since I know a bit about it, it is easily used within
an IDE, and it is not difficult to encourage good programming practices
with it.

Pascal is also a good language, but I haven't done any Pascal for
years.

I want to avoid learning a new language myself, so this rules out a
lot of options.

If you locate your essay I would be grateful for a copy. And I doubt
I will pour orange juice over any of the kids ;-)
 

Gunter Herrmann

Hi!
Why isn't the first element of an array at position 1?

Why is there the inconsistency between Java (0 based) and
JDBC (1 based)?
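
(The clash in one place, as a minimal sketch; the Connection, table and
column names are only placeholders:)

    import java.sql.*;

    class IndexDemo {
        static void dump(Connection conn) throws SQLException {
            String[] names = {"first", "second"};
            System.out.println(names[0]);                // Java arrays: first element is index 0

            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT name FROM users")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1)); // JDBC: first column is index 1
                }
            }
        }
    }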

Consistency looks slightly different.

Gunter
 

Arne Vajhøj

Not the point. The leading-zero octal issue is that something
that appears mundane actually has a special meaning.

But it is not particularly special. All languages have that
type of construct.

One of the more common is probably that there is a max value
for an integer.

Most people know what an integer is, but when they switch to
a (traditional) programming language, then the definition is
suddenly different. And so is the behavior of concepts like
add and multiply.
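
(A quick sketch of that redefinition for Java's int:)

    int max = Integer.MAX_VALUE;            // 2147483647: "integer" suddenly has an upper bound
    System.out.println(max + 1);            // addition wraps around: prints -2147483648
    System.out.println(1000000 * 1000000);  // multiplication overflows too: prints -727379968
    // int tooBig = 2147483648;             // and an out-of-range literal will not even compile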

Arne
 

Gene Wirchenko

But it is not particularly special. All languages have that
type of construct.

That is a particularly nasty one, since many languages flag such
literals with "warnings" (delimiters) of some sort; a bare leading
zero gives no such clue.
One of the more common is probably that there is a max value
for an integer.

Not a surprise if you learn about datatypes.
Most people know what an integer is, but when they switch to
a (traditional) programming language, then the definition is
suddenly different. And so is the behavior of concepts like
add and multiply.

Behaviour of datatypes should be a very basic part of using them.
How else do you know which one to select? (If instructing, I would
define the datatype. "An int can hold an integer value in the range
of <low> to <high>. If you try to assign a value outside of this
range, then <result>." and so on.)

I really do not like gotchas. They can waste a lot of time. I
really do not like the attitude of "Oh, well!" about them either.

Sincerely,

Gene Wirchenko
 

Arne Vajhøj

That is a particularly nasty one since many languages have
"warnings" (delimiters) of some sort.


Not a surprise if you learn about datatypes.


Behaviour of datatypes should be a very basic part of using them.
How else do you know which one to select? (If instructing, I would
define the datatype. "An int can hold an integer value in the range
of <low> to <high>. If you try to assign a value outside of this
range, then <result>." and so on.)

I really do not like gotchas. They can waste a lot of time. I
really do not like the attitude of "Oh, well!" about them either.

I cannot see a big difference between that and knowing that
an integer starting with zero is interpreted as octal.

One needs to know the tool one uses.

Arne
 

Gene Wirchenko

On 4/17/2012 10:57 PM, Gene Wirchenko wrote:
[snip]
I really do not like gotchas. They can waste a lot of time. I
really do not like the attitude of "Oh, well!" about them either.

I cannot see a big difference between that and knowing that
an integer starting with zero is interpreted as octal.

I can. One reads about the datatypes for a language, and the first
thing that comes to mind is what values each one is a collection of.
Then come the operations.

One does not expect common things to be redefined without notice.
That is what the octal notation does. There is also a good reason for
using leading zeroes (alignment).
One needs to know the tool one uses.

Certainly.

Sincerely,

Gene Wirchenko
 

Arved Sandstrom

On 4/17/2012 10:57 PM, Gene Wirchenko wrote:
[snip]
I really do not like gotchas. They can waste a lot of time. I
really do not like the attitude of "Oh, well!" about them either.

I cannot see a big difference between that and knowing that
an integer starting with zero is interpreted as octal.

I can. One reads about the datatypes for a language, and the first
thing that comes to mind is what values each one is a collection of.
Then come the operations.

One does not expect common things to be redefined without notice.
That is what the octal notation does. There is also a good reason for
using leading zeroes (alignment).
One needs to know the tool one uses.

Certainly.

Sincerely,

Gene Wirchenko

I'm with Arne on this one. I expect programmers using a language to at
least thoroughly understand the datatypes for a language. Granted,
leading zeros are a pretty crappy prefix choice, which is why a lot of
languages use something else, but a diligent _learning_ programmer should
have discovered this crappy choice when reading about literals.

AHS
 

Gene Wirchenko

On Thu, 19 Apr 2012 20:15:26 -0300, Arved Sandstrom

[snip]
I'm with Arne on this one. I expect programmers using a language to at
least thoroughly understand the datatypes for a language. Granted,
leading zeros are a pretty crappy prefix choice, which is why a lot of
languages use something else, but a diligent _learning_ programmer should
have discovered this crappy choice when reading about literals.

Odd. You are agreeing with ME.

Sincerely,

Gene Wirchenko
 

Lew

Gene said:
Arved Sandstrom wrote:

[snip]
I'm with Arne on this one. I expect programmers using a language to at
least thoroughly understand the datatypes for a language. Granted,
leading zeros are a pretty crappy prefix choice, which is why a lot of
languages use something else, but a diligent _learning_ programmer should
have discovered this crappy choice when reading about literals.

Odd. You are agreeing with ME.

Then you and Arne are in agreement.

Leading zeroes to represent octal values weren't added to the C language family "without notice" at all, but with abundant notice. Arne is simply saying that one must learn the programming language if one wishes to use it. This includes reading the documentation, wherein such notice is offered.

Computer programming uses all sorts of terms and notations in ways different from ordinary usage ("method", "call", "object", "integer", "%", "@"). It is incumbent upon one learning a programming language to learn the specific semantics and syntax, and complaints that it is unlike other languages (programming or otherwise) are feckless.
 

Gene Wirchenko

Gene said:
Arved Sandstrom wrote:

[snip]
I'm with Arne on this one. I expect programmers using a language to at
least thoroughly understand the datatypes for a language. Granted,
leading zeros are a pretty crappy prefix choice, which is why a lot of
languages use something else, but a diligent _learning_ programmer should
have discovered this crappy choice when reading about literals.

Odd. You are agreeing with ME.

Then you and Arne are in agreement.

Leading zeroes to represent octal values weren't added to the C language family
"without notice" at all, but with abundant notice. Arne is simply
saying that one must learn the programming language if one wishes to
use it. This includes reading the documentation, wherein such notice
is offered.

Not abundant notice if it has been missed by so many. It is a
violation of the Law of Least Astonishment.
Computer programming uses all sorts of terms and notations in ways different
from ordinary usage ("method", "call", "object", "integer", "%",
"@"). It is incumbent upon one learning a programming language to
learn the specific semantics and syntax, and complaints that it is
unlike other languages (programming or otherwise) are feckless.

Quite true. Learning those terms is part of the basics of
programming. How a particular language does something is not.

If I were to create a programming language, it would be
reasonable for me to expect that people would know what "method",
"call", etc. mean. It would not be so for something idiosyncratic to
my language.

If I were considering breaking with general practice on
something, perhaps I should reconsider, or document it very well.

At various times, I have had (and continue to have) considerable
difficulties with various features of various programming languages,
not because any given feature is all that difficult, but because of
the difficulty, if not impossibility, of getting a clear statement of
how the feature works. You know that programmers tend to not like
documenting, right?

I wind up having to guess, document the results, refine, and
repeat. This is very slow.

Gratuitous changing of behaviour is a gotcha. Gotchas waste
time.

Sincerely,

Gene Wirchenko
 

Arne Vajhøj

(snip, someone wrote)

(snip, then I wrote)

(and also wrote)

My first language, mostly, was IBM Fortran IV, Fortran 66 with some
useful extensions. Much has been added since, including the most
recent 2008 standard. Fortran 66 is relatively simple, but with
some strange features left from earlier systems. Still, I didn't
have much trouble learning it during the summer before 9th grade.
(The IBM reference manual was my 8th grade graduation present.)

I started with Fortran V aka 77.

Still a simple language and maybe even easier to learn than IV/66.
It is, but you have to understand pointers earlier than with
most other languages.
True.


Well, many languages have that problem, in many strange ways.

But languages that allow memory overwrites can be really nasty.
The library is big, but with a small subset you can do the usual
things that beginning programmers need to do.

java.lang, java.io and java.util could bring one a good
step forward.
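
(The kind of beginner program that subset covers, as a sketch; the file
name is just an example:)

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    class WordSort {
        public static void main(String[] args) throws IOException {
            List<String> words = new ArrayList<String>();          // java.util
            try (BufferedReader in =
                    new BufferedReader(new FileReader("words.txt"))) {  // java.io
                String line;
                while ((line = in.readLine()) != null) {
                    words.add(line);
                }
            }
            Collections.sort(words);                               // java.util
            System.out.println(words);                             // java.lang
        }
    }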

Arne
 

Arne Vajhøj

I can. One reads about the datatypes for a language, and the first
thing that comes to mind is what values each one is a collection of.
Then come the operations.

One does not expect common things to be redefined without notice.
That is what the octal notation does. There is also a good reason for
using leading zeroes (alignment).

Usually the section before or after "data types" is called
"literals". Not a good idea to skip that.

And a lot easier to understand than "behavior of data types", which
is frequently not even mentioned in a beginner book.

Arne
 

Arne Vajhøj

Gene said:
Arved Sandstrom wrote:

[snip]

I'm with Arne on this one. I expect programmers using a language to at
least thoroughly understand the datatypes for a language. Granted,
leading zeros are a pretty crappy prefix choice, which is why a lot of
languages use something else, but a diligent _learning_ programmer should
have discovered this crappy choice when reading about literals.

Odd. You are agreeing with ME.

Then you and Arne are in agreement.

Leading zeroes to represent octal values weren't added to the C language family
"without notice" at all, but with abundant notice. Arne is simply
saying that one must learn the programming language if one wishes to
use it. This includes reading the documentation, wherein such notice
is offered.

Not abundant notice if it has been missed by so many.

If you search programming fora for problems relating to:
- max int values
- integer division
- FP inaccuracy
- octal
then I think you will see that octal is not a common problem compared
to other language features.
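
(All four in one short Java sketch:)

    System.out.println(Integer.MAX_VALUE + 1);  // max int value: wraps to -2147483648
    System.out.println(1 / 2);                  // integer division: prints 0, not 0.5
    System.out.println(0.1 + 0.2);              // FP inaccuracy: prints 0.30000000000000004
    System.out.println(010);                    // octal: prints 8, not 10
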
Computer programming uses all sorts of terms and notations in ways different
from ordinary usage ("method", "call", "object", "integer", "%",
"@"). It is incumbent upon one learning a programming language to
learn the specific semantics and syntax, and complaints that it is
unlike other languages (programming or otherwise) are feckless.

Quite true. Learning those terms is part of the basics of
programming. How a particular language does something is not.

If I were to create a programming language, it would be
reasonable for me to expect that people would know what "method",
"call", etc. mean. It would not be so for something idiosyncratic to
my language.

That seems to be a rather arbitrary division.

If you look at languages weighted by current use, then I think you
will see that octal constants are used more often than a "call" keyword.

Arne
 

Martin Gregorie

java.lang, java.io and java.util could bring one a good step forward.
java.lang and java.util are fine, but java.io has always struck me as
needlessly quirky. Coming, as I did, from an assembler/C/Algol/COBOL
background, it was by far the most difficult part of Java to get my head
round.
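
(The sort of layering meant here: reading one text file line by line
takes a stack of three wrapped objects. A sketch, with an arbitrary file
name and charset:)

    import java.io.BufferedReader;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStreamReader;

    class ReadLines {
        public static void main(String[] args) throws IOException {
            // bytes -> characters -> buffered lines
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(new FileInputStream("notes.txt"), "UTF-8"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }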
 

Martin Gregorie

If you search programming fora for problems relating to:
- max int values
- integer division
- FP inaccuracy
- octal
then I think you will see that octal is not a common problem compared
to other language features.
I've always put that down to hardware changes. Way back, when machines
using 6-bit ISO characters (ICL 1900 mainframes, Elliott scientific
boxes) were the norm, I used to use octal all the time and don't recall
ever meeting hex, which I first noticed after the switch to byte-oriented
architectures. I think that made sense: the 1900 used a 24-bit word that
split into four 6-bit characters, so octal works well for bit
representations of both words and characters. Hex would be far less useful.

OTOH, octal is a bad fit with a byte-oriented architecture for exactly
the same reasons.
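
(A quick Java sketch of why the fit differs; the example values are
arbitrary:)

    int sixBitChar = 45;                                    // any 6-bit value: 0..63
    int word24 = 0x2D6A1F;                                  // a 24-bit word
    int oneByte = 0xFF;                                     // an 8-bit byte

    System.out.println(Integer.toOctalString(sixBitChar));  // "55": 6 bits = exactly two octal digits
    System.out.println(Integer.toOctalString(word24));      // "13265037": eight octal digits, three bits each
    System.out.println(Integer.toHexString(oneByte));       // "ff": 8 bits = exactly two hex digits
    System.out.println(Integer.toOctalString(oneByte));     // "377": 8 bits do not split evenly into octal digits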

So, back when C was specified, it made sense to have both hex and octal
bit representations because it was being used on both byte and word
oriented hardware (didn't some early DEC kit use word and character
lengths that were multiples of 3 rather than 4?) but now, with the almost
universal adoption of byte-oriented architectures there's little reason,
other than historic, to use octal notation.
 

Arne Vajhøj

I've always put that down to hardware changes. Way back, when machines
using 6-bit ISO characters (ICL 1900 mainframes, Elliott scientific
boxes) were the norm, I used to use octal all the time and don't recall
ever meeting hex, which I first noticed after the switch to byte-oriented
architectures. I think that made sense: the 1900 used a 24-bit word that
split into four 6-bit characters, so octal works well for bit
representations of both words and characters. Hex would be far less useful.

Same with CDC NOS.
OTOH, octal is a bad fit with a byte-oriented architecture for exactly
the same reasons.

So, back when C was specified, it made sense to have both hex and octal
bit representations because it was being used on both byte and word
oriented hardware (didn't some early DEC kit use word and character
lengths that were multiples of 3 rather than 4?) but now, with the almost
universal adoption of byte-oriented architectures there's little reason,
other than historic, to use octal notation.

True.

But the history is still there.

Arne
 
