"Mastering C Pointers"....


Richard Heathfield

Roose said:
[...] the point is communication, not rigid technical definitions.

It is to facilitate effective communication that the rigid technical
definitions exist in the first place.
 

Roose

I do wonder why K&R didn't find it important enough to include a reference
to it in the index to the 2nd Edition....

Not to be redundant, but this is probably because it is not common in
colloquial usage among C programmers. It probably exists to give compiler
writers a convenient term for a general idea, which the regular C programmer
needn't be concerned with.
While Roose's explanation of "object" wasn't quite as useful as the one
above, I was very much attracted to his ideas about anchoring the abstract
in the real world of CPU registers and such.

I'm glad you found it useful.
Perhaps it wouldn't be a bad idea to learn a little assembly language at
this point?

That might help, but you don't even need to learn a specific machine. You
can have a quite simplified model of what assembly language is and it will
be helpful in C programming. After all, it is different for each machine,
so you don't want to have _too_ specific an idea.

Basically you have a CPU which has a clock say every microsecond if you're
on a gigahertz machine. A program is a stream of instructions (pretend they
are stored in 32 bits in a binary format). Pretend it is a big array of
bytes. You start at the beginning, and just drag through the stream. They
are executed as fast as possible on these discrete clock ticks. Some
instructions take more cycles than others. Adds are cheap (say they take 1
cycle), multiplies a little slower (2 cycles), divides slower (4 cycles).
These are numbers I pulled out of my ass, of course it's different on every
machine. You also have all the bitwise operators in C as single
instructions, more or less. You have floating point and integer
instructions. Character strings are arrays of integers (ASCII). Pointers
are integers as well.
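
To make the "strings are arrays of integers" point concrete, here is a
minimal sketch (the actual codes printed assume ASCII, which -- see the
note below -- is not true everywhere):

#include <stdio.h>

int main(void)
{
    char s[] = "Hi";    /* really an array of three small integers */

    /* prints the character codes; on an ASCII machine: 72 105 0 */
    printf("%d %d %d\n", s[0], s[1], s[2]);
    return 0;
}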

There are instructions to load a word from memory into a CPU register, and
vice versa, to store a word from a CPU register to memory. Note that memory
accesses are much more expensive than arithmetic instructions (pretend 50
cycles on average). All the instructions only work on CPU registers. So to
add two integers in memory, you would have to (see the sketch after this
list):

1. load them both from memory to CPU registers
2. execute the add instruction and specify those two registers
3. store the result back to memory
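
A sketch of those three steps; the instruction names in the comments are
invented for this imaginary machine, not quoted from any real one:

int a = 2, b = 3, c;

c = a + b;  /* conceptually:
               LOAD  r1, a    ; ~50 cycles: fetch a into a register
               LOAD  r2, b    ; ~50 cycles: fetch b into a register
               ADD   r1, r2   ;  ~1 cycle:  add the two registers
               STORE c, r1    ; ~50 cycles: write the result back */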

Memory is just a big array of bytes. If you have 1 gig of memory, pretend
it is addressed 0 .. 0x3FFFFFFF: 0x3FFFFFFF is 2^30-1, so that's 2^30
addresses, or 1 gig. Likewise 0xFFFFFFFF is 2^32-1, which means 4 gigs. If
you have heard Apple's stupid advertisements that you can have more than 4
gigs of RAM in their PCs, that is because they use more than 32 bits for
pointers (i.e. the G5 is 64-bit).

Also, you know that if's and goto's can be substituted for all for and while
loops. Loops are just a convenient abstraction. On the machine, they are
all implemented in terms of if's and goto's (called jumps in assembly
language; you jump from one instruction to another that is not sequential).
So basically the CPU drags through the instructions sequentially, until it
hits a jump instruction.
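
For example, these two fragments do the same thing; the second is roughly
what the first compiles down to (bad style in real source, it's only there
to show the equivalence):

#include <stdio.h>

int main(void)
{
    int i;

    /* the ordinary loop ... */
    for (i = 0; i < 3; i++)
        printf("%d\n", i);

    /* ... and the same loop spelled out with if and goto */
    i = 0;
top:
    if (!(i < 3))
        goto done;
    printf("%d\n", i);
    i++;
    goto top;
done:
    return 0;
}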

If you want to know about I/O, you'll have to learn about interrupts and
such, and I'll stop here because that is more than enough to absorb.

(Note: nitpickers will be flamed. I _know_ for a fact these things are not
all true on every machine, but it is a convenient model, in the interest of
giving something _concrete_, again)
May I also point out that quoting the ANSI standard to a novice is about
as useful as replying in Swahili :)

Indeed.

Roose
 

Richard Heathfield

Roose said:
Give me a break.

Sure. Take a break, any time you like.
This is EXACTLY why CLC should not be just about standard C,

That appears to be your opinion, but I see no supporting evidence for your
statement.
and CLC.moderated should have that sole responsibility.

comp.lang.c.moderated is merely a moderated version of comp.lang.c.

In comp.lang.c.moderated, I suspect that many of your articles would simply
not be approved by the moderator. Feel free to try to prove me wrong.
Regardless of
whether they're right for coming here, there are TONS of newbies that do.
Agreed.

And then they get these rigid, exacting answers

Not exacting. Merely exact (modulo typographical errors).
filled with irrelevant technical details,

You may consider the details irrelevant. I do not, however, pitch my answers
at your level of understanding, but at the questions actually asked by real
people. When the details matter, I give the details. When they don't, I
don't. Now, you may have considered my answer to be overly technical, but
it's a walk in the park compared to the answer I /could/ have given. I
included as much detail as I considered the OP could reasonably be expected
to take in (in one sitting), together with a little encouragement that it
really isn't as hard as he's been led to believe. I continue to maintain
that this was an appropriate response to his question.
which you would never hear in the real world.

Well, actually, I do discuss such (relevant) technical details in the real
world, as well as on Usenet. And anyway, what makes you think that Usenet
is not part of the real world?
Then they
get turned off to C and think it's for wankers.

Speculation. I've found that people often respond well to being told the
underlying truth, rather than the helicopter-joystick method of
programming: "fiddle with the program, and watch what the computer does,
because if you want it to do that again, that is the code you have to
write" is not a very satisfactory way to learn programming.
A LOT of people actually think this.

Well, I will accept that /you/ think it. Since you have already acknowledged
that you don't exist, however, your view probably doesn't count for much.
I don't doubt that you are technically correct.

Fine, so what's the problem here?
Just like I didn't in the interview thread.
Wonderful.

But your insistence on expounding on all these
trivialities

If they're such trivialities, why did you get them wrong?
obscures the real question. Getting a job, or learning how
to write a real program that does what you want it to do.

That wasn't the real question. The real question was about pointers.
It may technically, but not in colloquial usage.

Then colloquial usage is wrong, and should be changed.
If you start blathering
about objects in C at work, some people will be confused, and others might
wonder why you're being such a pedant.

Actually, I don't think I've ever confused or offended anyone in a workplace
situation by referring to objects. As for the accusation of pedantry, I
cheerfully confess. Pedants make the best programmers.
Some people might think you're
talking about Objective-C, or C++, or emulating object-orientation in C,
etc.

Yes, it's possible, and if they did think that, they'd be wrong. People are
wrong a /lot/. Get used to it.
Maybe for understanding the C standard. It is not vital for those who
simply wish to write successful programs,

If one wishes to write /correct/ C programs, an understanding of the
principles laid out in the Standard is vital. The easiest way to achieve
this understanding is by reading and understanding the C Standard.
me being an example of that.

We have yet to see any evidence of your ability to write successful
programs.

<snip>
 

Richard Heathfield

Alan said:
The concept of "object" is much clearer. (as are local battlelines :)

I do wonder why K&R didn't find it important enough to include a reference
to it in the index to the 2nd Edition....

There is an entry for "object" in the index on page 268, which references
pages 195 and 197.
May I also point out that quoting the ANSI standard to a novice is about
as useful as replying in Swahili :)

That's right. C is indeed a foreign language when you're a novice, just as
Swahili is a foreign language to a novice who is beginning to study
Swahili. And, just as the Swahili novice eventually begins to grasp the
language, so a C novice, in time, starts to make sense of C, and of the
Standard.
Thanks again, Richard.

Any time.
 

Mark Gordon

well it sounds simple enough, but I've heard so many things about
pointers that I thought it would be a good idea to get a book just
about pointers, but I guess C books have pointer references in them as
well.

My experience is that the most common reasons for people having
difficulties understanding pointers are:

1) They think that pointers are difficult.
2) They have been told that pointers are hard to understand.
3) Pointers have been badly explained.

Of course, there are lots of complex things that can be done *with*
pointers, but the pointer itself is merely something that points
somewhere, and a pointer variable is just a variable that holds a
pointer.
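
For instance, a minimal complete sketch of a pointer variable doing its
one job:

#include <stdio.h>

int main(void)
{
    int x = 42;
    int *p = &x;        /* p holds a pointer: it points at x */

    printf("%d\n", *p); /* following the pointer finds 42 */
    *p = 7;             /* and the pointer can be used to change x */
    printf("%d\n", x);  /* so this prints 7 */
    return 0;
}
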
but I'll take your advice and ask here if I have more questions.

Questions are always welcome. However, you should check the FAQ (google
will find it for you) just in case any question is already answered
there in a manner you can understand.
 

Irrwahn Grausewitz

Roose said:
Actually, it's best done by giving someone something to _do_, rather than
telling them information. Which is what I did.

No. You provided incorrect information and got defensive when you were
corrected.
Information that isn't relevant only confuses. Besides, the term object is
just a name, i.e. your statement is a synthetic fact (as opposed to an
analytic one). If you don't know what I mean, go read some philosophy.

In order to discuss something you first have to get the terms right. If
you don't know what I mean, follow your own advice and go read some
philosophy.
I
didn't know what an object is in C until now, and I have programmed
successfully for many years, shipped many products, etc. It doesn't matter.

Hopefully I never have to use one of those products.
Nobody has ever learned anything by reading a bunch of facts. They learn by
doing. If the OP does what I told him to do successfully, he will have gone
A LOT farther toward being a professional programmer in C than if he simply
reads all of the "definitions" post.

Reread the thread: the poster asked for the definition of the term
"object" in the context of the C language. Why not answer his question
instead of gibbering?
 

Irrwahn Grausewitz

Please don't strip attributions. It's not clear to whom you are
responding.
That's better for you. Not better for a newbie. I would love to see some
beginner ask you about C programming, and you start blathering on about
lvalues.

This is /not/ alt.comp.lang.learn.c-c++. This is c.l.c. You don't like
it? Then leave it.
..."insular little world of the C standard as a bible" ...

In c.l.c the C standard /is/ the bible. The topics and conventions of
c.l.c are /not/ defined by your personal opinions.
Damn I keep hearing echoes of my earlier posts again...

Sounds serious. You should contact a doctor.
 

Lew Pitcher

Sheldon said:
Apparently you have read the book. Tell me, do the words "far pointer"
and "near pointer" appear in the book? If so, don't read the book.

"The Magic Goes Away" is worth reading.

It sounds like "Mastering C Pointers" isn't worth reading at all.

P'haps Larry should ask Jerry P about C pointers ;-)

--
Lew Pitcher

Master Codewright and JOAT-in-training
Registered Linux User #112576 (http://counter.li.org/)
Slackware - Because I know what I'm doing.
 

Roose

Give me a break.
Sure. Take a break, any time you like.

Hilarious. Back to children's games, I see.
In comp.lang.c.moderated, I suspect that many of your articles would simply
not be approved by the moderator. Feel free to try to prove me wrong.

As would many of yours.
You may consider the details irrelevant. I do not, however, pitch my answers
at your level of understanding, but at the questions actually asked by real
people. When the details matter, I give the details. When they don't, I
don't. Now, you may have considered my answer to be overly technical, but
it's a walk in the park compared to the answer I /could/ have given. I
included as much detail as I considered the OP could reasonably be expected
to take in (in one sitting), together with a little encouragement that it
really isn't as hard as he's been led to believe. I continue to maintain
that this was an appropriate response to his question.

I'm not talking about your pointer response, which was generally good, as I
already said. I'm talking about the newsgroup in general. Read some of the
responses to newbies. See how helpful they are.

Even the first one in this thread was quite unhelpful. The guy was like "I
heard it was about Turbo C and MS-DOS -- that means you shouldn't read
it." -- with no further explanation. Do you think a newbie knows what that
means, or knows what Turbo C is (or even DOS)? A much more helpful response
would be along the lines of, "This book appears to be rooted in a particular
platform, one that happens to be outdated, and thus may confuse you with a
lot of information that you don't need to know."
Well, actually, I do discuss such (relevant) technical details in the real
world, as well as on Usenet. And anyway, what makes you think that Usenet
is not part of the real world?

Stop with the sniveling wordplay already.
Speculation. I've found that people often respond well to being told the
underlying truth, rather than the helicopter-joystick method of
programming: "fiddle with the program, and watch what the computer does,
because if you want it to do that again, that is the code you have to
write" is not a very satisfactory way to learn programming.

I agree that that's a terrible way to learn. Without the concepts, you're
just blindly hacking. That's why I specifically said that you need at least
TWO perspectives. I encourage them to read your article. My claim is that
that's not sufficient. They should read a concrete explanation like mine.
Together they will help you understand pointers.

Plus my article encouraged them to learn the abstract while being rooted in
the concrete, not to focus on the concrete at the expense of the abstract.

This is exactly like the top-post thread. You're so goddamn close-minded
that you think the only way to do everything is your own. I, on the other
hand, recognize that sometimes things need multiple perspectives, and there
are multiple valid ways of doing things. Sometimes there is no right and
wrong, just personal preference. I know it's a lot to handle, since you
like rigid rules so much.
That wasn't the real question. The real question was about pointers.

OK, so do you think that the OP wanted just to study pointers for the sake
of dinner party conversation, or eventually write a real program? Again,
READ BETWEEN THE LINES. Jesus Christ.
Then colloquial usage is wrong, and should be changed.

Q.E.D. Another perfect example of why my underemployed flames were so
accurate. Everybody else is wrong, and you're right. Yes. You can insist
on using the term object however you want. However the rest of the world
will continue to use a more useful definition. I guess Stroustrup and
Ritchie and all the people who use the term OOP are wrong. Since, of
course, by default you should assume when someone says "object" in C++, they
mean something that takes storage, and not something with a constructor and
destructor and all that.

Let me suggest that you are the one who needs to change.
If one wishes to write /correct/ C programs, an understanding of the
principles laid out in the Standard is vital. The easiest way to achieve
this understanding is by reading and understanding the C Standard.

What percent of the total number of C programs in the world do you think are
correct in this sense?

Was your first C program correct in this sense? Would it have taken more
time and effort to make it correct, knowing what you knew then? What do you
think the OP is interested in doing, _at this stage in the game_? Do you
think he is interested in reading the C standard, before having written
anything substantial or understanding pointers at a practical level?

No sarcasm there, just answer honestly.

Roose
 

Alan Connor

And what's wrong with learning the technical words that help define the
language you're using? The concept of an lvalue is very useful in C.

Mark: The limit for sigs on the Usenet is 4 lines and they need to be
immediately below a "-- " line.

Fix yours or I will have to killfile you.

What Newsfeed.Com is doing amounts to nothing but spam. Ditch them.
 

Alan Connor

Not to be redundant, but this is probably because it is not common in
colloquial usage among C programmers. It probably exists to give compiler
writers a convenient term for a general idea, which the regular C programmer
needn't be concerned with.


I'm glad you found it useful.


That might help, but you don't even need to learn a specific machine. You
can have a quite simplified model of what assembly language is and it will
be helpful in C programming. After all, it is different for each machine,
so you don't want to have _too_ specific an idea.

Basically you have a CPU which has a clock say every microsecond if you're
on a gigahertz machine.

Wouldn't that be "nanosecond"?

A program is a stream of instructions (pretend they
are stored in 32 bits in a binary format). Pretend it is a big array of
bytes. You start at the beginning, and just drag through the stream. They
are executed as fast as possible on these discrete clock ticks. Some
instructions take more cycles than others. Adds are cheap (say they take 1
cycle), multiplies a little slower (2 cycles), divides slower (4 cycles).
These are numbers I pulled out of my ass, of course it's different on every
machine.

One assumes that you washed them before posting :-|

You also have all the bitwise operators in C as single
instructions, more or less. You have floating point and integer
instructions. Character strings are arrays of integers (ASCII). Pointers
are integers as well.

There are instructions to load a word from memory into a CPU register, and
vice versa, to store a word from a CPU register to memory. Note that memory
accesses are much more expensive than arithmetic instructions (pretend 50
cycles on average). All the instructions only work on CPU registers. So to
add two integers in memory, you would have to:

1. load them both from memory to CPU registers
2. execute the add instruction and specify those two registers
3. store the result back to memory

So memory is just dumb storage. I didn't realize that.

Memory is just a big array of bytes. If you have 1 gig of memory, pretend
it is addressed 0 .. 0x3FFFFFFF: 0x3FFFFFFF is 2^30-1, so that's 2^30
addresses, or 1 gig. Likewise 0xFFFFFFFF is 2^32-1, which means 4 gigs. If
you have heard Apple's stupid advertisements that you can have more than 4
gigs of RAM in their PCs, that is because they use more than 32 bits for
pointers (i.e. the G5 is 64-bit).

Don't get THAT at all, but that's okay for now.

In your previous post you recommended that I learn hexadecimal (which I
have a slight handle on: 0-9 and A-F, with one character to identify a
nibble).

By this you mean learning to convert it to decimal, right?

Also, you know that if's and goto's can be substituted for all for and while
loops.

I do?
Loops are just a convenient abstraction. On the machine, they are
all implemented in terms of if's and goto's (called jumps in assembly
language; you jump from one instruction to another that is not sequential).
So basically the CPU drags through the instructions sequentially, until it
hits a jump instruction.

Sounds like the b command in sed. (oops! The b command in sed must be
like the jump instruction in C :)

If you want to know about I/O, you'll have to learn about interrupts and
such,
and I'll stop here because that is more than enough to absorb.

Well, I knew THAT at least.
(Note: nitpickers will be flamed. I _know_ for a fact these things are not
all true on every machine, but it is a convenient model, in the interest of
giving something _concrete_, again)

Singe the hair off their balls: I'm getting a lot out of this.

It's really hard to learn the basics of anything if you get too bogged down
in exactness.

Indeed.

Roose

I read this carefully, twice, consulting K&R for basic definitions.

At present I see each piece of hardware as having a control panel with
switches and dials (card) with the program deciding which switch to flip
or dial to turn, in what order...

Thanks again Roose. There is altogether too little practical talk in
the programming world, which makes it very hard for novices.
 

Alan Connor

No. You provided incorrect information and got defensive when you were
corrected.


In order to discuss something you first have to get the terms right. If
you don't know what I mean, follow your own advice and go read some
philosophy.


Hopefully I never have to use one of those products.


Reread the thread: the poster asked for the definition of the term
"object" in the context of the C language. Why not answer his question
instead of gibbering?

It wasn't "gibbering", it was very helpful.

A lot more helpful than this.
 

Keith Thompson

Roose said:
Basically you have a CPU which has a clock say every microsecond if you're
on a gigahertz machine.
[...]

To nitpick: a gigahertz machine would have a clock cycle every
nanosecond, not every microsecond.
 

Chris Torek

So memory is just dumb storage. I didn't realize that.

Good, because there are machines on which this is not the case. :)

C's model (of "objects" and values) does require that plain old
"dumb storage" memory *exist*, but it permits "smart storage" too,
and on machines that have it, compilers are allowed to use it.

(There is an important and interesting -- but alas, off-topic in
c.l.c -- class of machines with "atomic memory" on which various
forms of lock-free parallelism are possible.)
Don't get THAT at all, but that's okay for now.

I am not sure which part you "don't get", but be aware that this,
too, is not required of systems that will run C, and indeed, there
are systems on which it is not true, or at least only partly true.
One of them was very common only a decade ago: the 80x86 CPU in
the ordinary IBM PC has non-linear ("segmented") addressing, and
in 16-bit modes -- which are no longer used for much, but were all
one had until the 80386 came out -- a memory address was made up
of segment:offset pairs.

Of course, actually using this to good effect was slow at best,
and a lot of old C compilers for the x86 used bizarre modifiers
(spelled "far" and "near" in some cases; now more like _FAR and
_NEAR in backwards compatible modes) to let you mix "fast" code
that avoided the use of segments with "slow" code that used segments
so as to be able to break the "64K barrier" (or rather, 128K, with
64K of code and 64K of data, in a split I&D design reminiscent of
some PDP-11s, where C was originally developed).

All C really requires of memory is that it be able to store values,
and that it be "addressable" in some way. Which is where the pointers
that are in the "subject:" line here come in -- taking the address
of a memory region holding a C "object" yields a pointer to that
object.

Actually, C does impose a few more conditions on memory. In
particular, C requires that arrays be (or at least seem to be)
contiguous. An array is defined by its "element type" and a numeric
size -- in C89, an integer constant like 42 -- and an array must
be stored in memory as a contiguous sequence of that many of those
elements. Each element is itself an "object" (region of storage),
but the array as a whole is an object too, just a composed one.
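
To see what "contiguous" buys you, here is a minimal sketch; pointer
subtraction between elements of one array is well defined and counts in
elements, not bytes:

#include <stdio.h>

int main(void)
{
    double a[42];   /* one object composed of 42 element objects */

    /* adjacent elements are exactly one element apart ... */
    printf("%d\n", (int)(&a[1] - &a[0]));       /* prints 1 */

    /* ... and the whole array is one contiguous block */
    printf("%lu\n", (unsigned long)sizeof a);   /* 42 * sizeof(double) */
    return 0;
}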

If all of a given computer's memory is one big blob of bytes in
sequence, which is certainly an easy way to build a computer, then
it is equally easy to divide up this one blob any way you like.
But C does allow other arrangements, which can add security
or reliability features for instance, as long as the resulting
memory can be grouped together to hold arrays. Other than those
arrays, there is no *requirement* that there be a neat numerical
sequencing of "all memory"; it just happens to be pretty common,
because it is so easy.

(As an aside: on modern operating systems, "memory" is usually an
illusion anyway. The OS will *simulate* a nice linear sequence of
bytes, but it does so with "pages" that are "paged in" and "paged
out" to/from some form of "backing store" -- typically on disk --
and shuffled and reshuffled without any individual program, whether
coded in C or any other language, being able to see what is happening.
The hardware must provide a certain amount of assistance to let
the OS know what a program is trying to do, and the OS then determines
whether the code is doing something "normal" and "allowed", and if
so, arranges to allow it. The OS can then also determine whether
the program is running amok, and if so, stop it. While large chunks
of OSes can be written in nothing other than strictly conforming
C code, parts *must* be written using stuff outside the C standard,
and often must be hand-coded in assembly. These include the parts
that talk with hardware to find out what the various user programs
are attempting.)
 

Roose

Let me preface this with some meta-comments. If your goal is to learn C, by
all means go ahead and dive right into the C language. You only learn by
making mistakes. But it is a common fact in computers that you really only
master something when you have at least learned the level below it, i.e.
_what it is abstracting_. Then you see where the abstraction came from.

See these articles:

http://www.joelonsoftware.com/articles/LeakyAbstractions.html
http://biztech.ericsink.com/Abstraction_Pile.html

So if you configure OSes, it's good to learn some scripting. If you script,
good to know some compiled programming. If you program C, good to know some
assembly/computer architecture.

The idea was to give a simple concrete model that anyone can learn, but
don't get bogged down if your real goal is to learn C. Basically what I'm
describing is a von Neumann machine with some details (a very influential
hardware architecture from which today's PCs descend, roughly).
Unfortunately with a cursory google search, I couldn't find any
good links for that, maybe someone else can help out here.
Wouldn't that be "nanosecond"?

Yes, whoops.
So memory is just dumb storage. I didn't realize that.

Yes, you can think of it as inert. The CPU controls everything, the
"brain". Memory is a separate unit which just stores bits. Same with the
hard disk. Notice that as the access time increases, so does the size of
the medium.
Don't get THAT at all, but that's okay for now.

The main point here was that a pointer is a C language abstraction (for an
address in memory). A pointer at the hardware level _is an integer_.
Memory is just a big array of bytes again. Say you have 1 meg of memory,
then you can just number the bytes 0 .. 1,048,575 (2^20-1). Suppose you
have a variable numFiles that is the number of files in a directory = 55.
That variable must be stored somewhere. Say it is stored at the 200,005th
byte. Then a pointer to numFiles would have the numeric value 200,004 (get
used to indexing from 0 in C). That's it. Nothing else. It's like an
array index if you think of the array as all of memory.

e.g.

int numFiles = 55;
int *pNumFiles = &numFiles; // pNumFiles is a variable of pointer type,
                            // with the numeric value 200,004

Of course, as always, there are details, but I'm not going to clutter up
that simple concept with them. And this is very real, you can inspect
pointers in a debugger and see this (which was my suggestion).
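
If you don't have a debugger handy, you can also just print the pointer.
The 200,004 above was invented, so expect some other implementation-defined
value:

#include <stdio.h>

int main(void)
{
    int numFiles = 55;
    int *pNumFiles = &numFiles;

    /* %p prints the address in an implementation-defined form;
       it will not be 200,004 -- that number was made up above */
    printf("numFiles lives at %p\n", (void *)pNumFiles);
    printf("and holds the value %d\n", *pNumFiles);
    return 0;
}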

Say you have a 32-bit register. After you learn binary/hex, you will see
why the number of different integers you can store in 32 bits is 2^32 = 4
gigs. A pointer can be thought of as an integer index, like I said. Thus
if a pointer is 32 bits, then you can only address 4 gigs of memory.
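
You can also ask your own implementation how wide its pointers are (the
answer varies by platform, which is the point):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* 32 here would mean at most 2^32 distinct addresses */
    printf("pointers here are %lu bits wide\n",
           (unsigned long)(sizeof(void *) * CHAR_BIT));
    return 0;
}
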
In your previous post you recommended that I learn hexadecimal (which I
have a slight handle on: 0-9 and A-F, with one character to identify a
nibble).

By this you mean learning to convert it to decimal, right?

You don't actually NEED to in order to do the exercise I suggested, since
many debuggers will display all values in decimal, but it helps to get into
the mindset. It is pretty simple if you are even mildly mathematical. Hex
is really a shorthand for binary, in some sense. There are probably plenty
of tutorials on the web, but afterwards you should be able to understand
these common equivalences:

2^8-1  = 255   = 0xFF    = 1111 1111
2^8    = 256   = 0x100   = 1 0000 0000 = 0.25 K
2^16-1 = 65535 = 0xFFFF  = 1111 1111 1111 1111
2^16   = 65536 = 0x10000 = 1 0000 0000 0000 0000 = 64 K
(spaces in binary added for clarity)

0x1 = 0001 b
0x2 = 0010 b
0x4 = 0100 b
0x8 = 1000 b

0x1 = 0001 b
0x3 = 0011 b
0x7 = 0111 b
0xF = 1111 b
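
You can check the table yourself, since printf will do the conversions
for you:

#include <stdio.h>

int main(void)
{
    /* decimal on the left, hex on the right */
    printf("%5d = 0x%X\n", 255, 255);       /* 255 = 0xFF */
    printf("%5d = 0x%X\n", 256, 256);       /* 256 = 0x100 */
    printf("%5d = 0x%X\n", 65535, 65535);   /* 65535 = 0xFFFF */
    return 0;
}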

Yes, try taking a simple for loop or while loop and doing the exact same
thing with if's and goto's, if you don't see it right away. Of course this
is very bad programming practice, since loops make your logic much
clearer to the reader.

(If you're not totally familiar with loops in general, you might want to try
a higher-level language like Python/Java/C# before attempting C.)
Sounds like the b command in sed. (oops! The b command in sed must be
like the jump instruction in C :)

Well "jump" is an assembly term for specific CPU instructions. Goto is the
equilavent in C. My point was that for and while loops compile down to
jumps and tests (instruction that return true or false, basically).
Singe the hair off their balls: I'm getting a lot out of this.

It's really hard to learn the basics of anything if you get too bogged down
in exactness.

Good, I'm glad. This confirms an observation developed from some years of
teaching. Exactness isn't necessarily the problem, but you just have to
know WHAT you need to be exact about (i.e. not irrelevant details like
rarely used terminology).

Let me close with some more high level observations. What does C add on top
of this model (besides nice syntax, I'm talking ideas here)?

1) Platform independence -- instead of writing in native assembly, you write
in C, and then compilers for every platform translate your C program into a
stream of instructions (just byte data)
2) Constructs like functions, for and while loops, if's and switches, to
organize this enormous stream of instructions
3) A strong type system -- this can be confusing at first
This is why I wanted to emphasize that pointers are just integers in
hardware; the pointer type is a C language construct. Same thing with
characters/character strings. Now floats ARE actually different in
hardware, but we won't get to that.
4) some other stuff which I don't care to think up now : )

The point of the type system is to catch mistakes at an obvious level.
Compiling a C program to see if all the types match up can catch a lot of
mistakes. It's sort of a sanity check, and it helps you structure your
program.
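
For example, even though a pointer is "just an integer" at the hardware
level, the compiler treats int and int * as different things, and will
complain if you mix them up (the broken line is left commented out):

int main(void)
{
    int n = 55;
    int *p = &n;

    /* uncommenting the next line draws a compiler diagnostic:
       an int * is not an int as far as C's type system is
       concerned, whatever the hardware thinks */
    /* int bad = p; */

    return (*p == 55) ? 0 : 1;
}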

But again, you can read all you want; if you can pull off the exercise
with the debugger I suggested, you will learn a whole lot.

Roose
 

Richard Heathfield

Roose said:
Not to be redundant, but this is probably because it is not common in
colloquial usage among C programmers.

On the contrary, it's simply a false statement. The term /is/ in the index
of K&R2.
 

Simon Biber

Richard Heathfield said:
Yes, a typo, already corrected elsethread. No, I didn't notice it myself.

Yes, sorry, the elsethread had not yet reached my news server when I replied.
 
