Why C?

Dik T. Winter

>
> If it isn't, the compiler is broken. An int is the machine's native
> word size; that's always been the meaning.
>
>
> Why do you need ints shorter than a word? They're inefficient, and I
> don't think anyone worries about memory usage anymore with simple
> variables.

Why do you think that on a machine a "word" is able to address all memory
(assuming a flat addressing space)?
 
Steve

Sorry, I'm getting tired of flames in my usual haunts. So, of course,
I feel compelled to solicit your flames here. Sometimes you just can't
win.

Keith Thompson wrote:
Why do you need ints shorter than a word? They're inefficient, and I
don't think anyone worries about memory usage anymore with simple
variables.

Sixteen-bit quantities are useful when you have lots of them. Some
programmers working outside of Microsoft are aware of this. And, hey,
you can't always rely on the optimizer.

The simple way:
#define int long long

Yes, you can redefine fundamental types - I always like
#define char unsigned char
so that char arithmetic works.

That's bound to confuse people who study your code and run into
constructs that violate the assumptions of the signed char type.
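For instance (a sketch, not from the thread), the classic getchar()
idiom quietly changes behaviour once char is forced to be unsigned:

#include <stdio.h>

#define char unsigned char        /* the macro under discussion */

int main(void)
{
    char c;                       /* now really an unsigned char */

    /* With a signed char this often appears to work; with the macro in
       place, c can never compare equal to EOF (typically -1), so the
       loop never terminates at end of input. */
    while ((c = getchar()) != EOF)
        putchar(c);
    return 0;
}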

I fell into the habit of defining my types explicitly in the top-level
header for arbitrary applications: s16 for short int, u64 for unsigned
long long, and so on. Generally, I tried to ensure that I was aware of
the size of a type as I was writing code, since I figure boundary
conditions bite harder than I want to be bitten. Assuring I was
getting what I asked for is another matter; I have not had the chance
to port much of my code to different architectures, so I can only say
that I was preparing my software for that eventuality.
Andrew Usher

There has been a lot of talk about the virtues of portability, on
abstracting language syntax from hardware dependencies, and I simply
cannot fathom why it is that programmers wish to remain entirely
insulated from the world their programs run in. I guess I'm just a
heretic, but as much as I like C, there are just some things that I
don't want the language doing with my code. I want to know what my
code looks like once it is compiled into assembly, I want my language
rigidly defined (operator overloading sucks), I want my types fixed
across all platforms, and in general, greater control over the language
features. printf() and all the rest should have never been made part
of the spec.

I think there should, in fact, be two specifications. One definition
of the grammar and syntax, and another for the features pertaining to
the interface with an actual system (i.e. like POSIX). Since we're
talking about types, I suggest it could be intelligent to abstract the
concept of types to a generalization that fits the primary language
specification. Thus, the inherent properties of numeric integer
quantities should be considered necessary concepts and embodied in the
grammar and syntax. And so on. As such, C takes too much for granted.


I realise that most people do not write software that absolutely must
work every time, and all the time. However, I seem to have this
curious obsessive compulsion to assure that my software isn't going to
turn around and kill me at some indefinite point in the future.
Perhaps I'm weird.

At any rate, languages inferior to C (and don't get me wrong; I like C
a lot) proliferate because people simply don't seem to want durability.


Regards,

Steve
 
Keith Thompson

Steve said:
I fell into the habit of defining my types explicitly in the top-level
header for arbitrary applications: s16 for short int, u64 for unsigned
long long, and so on. Generally, I tried to ensure that I was aware of
the size of a type as I was writing code, since I figure boundary
conditions bite harder than I want to be bitten. Assuring I was
getting what I asked for is another matter; I have not had the chance
to port much of my code to different architectures, so I can only say
that I was preparing my software for that eventuality.
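A sketch of what such a top-level types header might look like (the
exact mappings and the compile-time size checks here are illustrative
guesses, not the actual header from the post):

typedef signed short       s16;
typedef unsigned short     u16;
typedef signed int         s32;
typedef unsigned int       u32;
typedef signed long long   s64;
typedef unsigned long long u64;

/* Poor man's static assertion: if a width is wrong on the target
   platform, the array gets a negative size and the translation unit
   fails to compile. */
typedef char s16_must_be_2_bytes[sizeof(s16) == 2 ? 1 : -1];
typedef char u64_must_be_8_bytes[sizeof(u64) == 8 ? 1 : -1];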

C99's <stdint.h> header covers this, with typedefs like int32_t,
uint32_t, and so forth -- and the header isn't difficult to implement
in C90. But of course none of these types are guaranteed to exist.

But the number of bits isn't always the most important aspect of a
type. If you want to count something, and you know there can't be
more than 32767 of them, just use int; if you want to measure the size
of something, use size_t.

What might be more useful is a syntax for specifying the *range* of a
type.
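C has nothing like that today; about the closest you can get is a
typedef plus a checked "constructor", as in this sketch (the percent_t
name and the assert-based policy are invented for illustration):

#include <assert.h>

typedef int percent_t;            /* intended range: 0..100 */

static percent_t make_percent(int v)
{
    assert(v >= 0 && v <= 100);   /* enforce the range at run time */
    return (percent_t)v;
}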
There has been a lot of talk about the virtues of portability, on
abstracting language syntax from hardware dependencies, and I simply
cannot fathom why it is that programmers wish to remain entirely
insulated from the world their programs run in. I guess I'm just a
heretic, but as much as I like C, there are just some things that I
don't want the language doing with my code. I want to know what my
code looks like once it is compiled into assembly, I want my language
rigidly defined (operator overloading sucks), I want my types fixed
across all platforms, and in general, greater control over the language
features. printf() and all the rest should have never been made part
of the spec.

I rarely care what the generated assembly code looks like. I care
that the code works the way I intend it to, in accordance with the
language specification. Much of the time, I wouldn't be able to read
the generated assembly language even if I bothered to examine it.
 
k_over_hbarc

Dik said:
Why do you think that on a machine a "word" is able to address all memory
(assuming a flat addressing space)?

That's what a 'flat addressing space' means. Machines with a non-flat
address space are obsolete I think.

Andrew Usher
 
Walter Roberson

Andrew Usher wrote:
That's what a 'flat addressing space' means. Machines with a non-flat
address space are obsolete I think.

Counter example:

Silicon Graphics, IRIX 6.5, 'Challenge', 'Origin', 'Onyx' model
lines (and some other of their lines as well... but -not- O2).
Any particular binary might be a 32 bit binary or a 64 bit binary.
The 32 bit binaries use c1/s2/i4/l4/p4 .
The 64 bit binaries use c1/s2/i4/l8/p8 .
That is, the size of longs and pointers differ between the two.

The machines may have ... ummm, I'm not sure what the largest ever
made was... somewhere around half a terabyte of memory? The 32 bit
binaries can address 2 GB's worth of that; the 64 bit binaries
can address everything.

As far as the 32 bit binaries are concerned, it is a flat memory
architecture, a virtual memory architecture -- the 32 bit binaries
simply can't access everything (not without going into kernel mode
and getting *really* ugly.)
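A small test program (a sketch; the expected output is an assumption
that matches the c1/s2/i4/l4/p4 and c1/s2/i4/l8/p8 notation above)
makes the difference easy to see when built both ways:

#include <stdio.h>

int main(void)
{
    printf("char=%u short=%u int=%u long=%u ptr=%u\n",
           (unsigned)sizeof(char), (unsigned)sizeof(short),
           (unsigned)sizeof(int),  (unsigned)sizeof(long),
           (unsigned)sizeof(void *));
    return 0;
}

The 32 bit build should print 1 2 4 4 4, the 64 bit build 1 2 4 8 8.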
 
Dik T. Winter

> Dik T. Winter wrote:
>
>
> That's what a 'flat addressing space' means. Machines with a non-flat
> address space are obsolete I think.

Wrong. 'Flat addressing space' means that each address corresponds to an
integer. *Not* that each corresponds to a word. Consider a 32-bit
processor with 8 GB of memory. A word is naturally 32 bits, but 2^32
addresses cover only 4 GB, so a word cannot address all of memory. On
the other hand, a 64-bit long word can address all of memory but is not
the basic unit of information.
 
Mabden

Skarmander said:
jacob navia wrote:
[big discussion about memory requirements of C vs. VB.NET apps]
Another point is raw performance, which I will discuss in a later
post.

I'll opt out of that one, since I already know the results, and trust
everyone can draw their own conclusions about it.

That said, you're preaching to the choir. Why are you posting in
comp.lang.c about how C is good compared to other languages? This is
unlikely to enlighten anyone. Never mind that these topics are big and
old and a fertile ground for flamewars wherever they go -- in the past
the discussion might have been C vs. Pascal, or C vs. LISP, now it's C
vs. the unwashed .NET hordes, tomorrow it'll probably be something else.

"With C you can write small and fast programs" is not news to anyone.
Not even to the unwashed .NET hordes.

Hey! I washed last week...!! :-(

Well, face and pits, anyway.

C#!!!
 
Steve

Keith said:
C99's <stdint.h> header covers this, with typedefs like int32_t,
uint32_t, and so forth -- and the header isn't difficult to implement
in C90. But of course none of these types are guaranteed to exist.

Exactly.

But the number of bits isn't always the most important aspect of a
type. If you want to count something, and you know there can't be
more than 32767 of them, just use int; if you want to measure the size
of something, use size_t.

That's nice. What if I don't like the layer of abstraction chosen for
the definitions of [numeric] data types?

What might be more useful is a syntax for specifying the *range* of a
type.

There are probably quite a few things that various people would like.
I was merely expressing my opinion of the current approach.

I rarely care what the generated assembly code looks like. I care
that the code works the way I intend it to, in accordance with the
language specification. Much of the time, I wouldn't be able to read
the generated assembly language even if I bothered to examine it.

Yeah, well that's true too. But you must turn off certain
optimisations when you anticipate examining the assembly language.
Heavy reliance on macros can also impede the understanding of generated
code. This is by the by, however. On one level, I am simply of the
opinion that certain things about the underlying architecture ought to
be kept in mind while writing code, rather than allowing the
architecture to impose itself upon the program after the code is
written and running. Most HLLs seem to insulate the program from the
world, and hence the programmer as well.


Regards,

Steve
 
Keith Thompson

Steve said:
Keith Thompson wrote: [snip]
But the number of bits isn't always the most important aspect of a
type. If you want to count something, and you know there can't be
more than 32767 of them, just use int; if you want to measure the size
of something, use size_t.

That's nice. What if I don't like the layer of abstraction chosen for
the definitions of [numeric] data types?

Then define your own abstraction layer.

[snip]
Yeah, well that's true too. But you must turn off certain
optimisations when you anticipate examining the assembly language.
Heavy reliance on macros can also impede the understanding of generated
code. This is by the by, however. On one level, I am simply of the
opinion that certain things about the underlying architecture ought to
be kept in mind while writing code, rather than allowing the
architecture to impose itself upon the program after the code is
written and running. Most HLLs seem to insulate the program from the
world, and hence the programmer as well.

I tend to think of "the world" and the hardware as being at opposite
ends of the <handwaving>something-or-other</handwaving>, with the
programming language being a layer between them. I care about the
visible behavior of my code; I don't care so much about how the
compiler makes that happen.

Your mileage may vary, of course.
 
Michael Wojcik

Is it September again already? We have one of these stumbling about
comp.lang.cobol, too.

That's what a 'flat addressing space' means.

A "flat addressing space" means that memory locations are labelled
with consecutive integers. It says nothing about the relative sizes
of "words" and addresses.

The AS/400 has a flat addressing space - an *extremely* flat
addressing space (the "Single-Level Store"), where *every*
addressable entity has a unique address. It doesn't even use per-
process virtual address spaces.

Addresses on the AS/400 are 128 bits.

Yet the early AS/400s did not offer an integer type larger than 32
bits, and even recent ones do not offer one larger than 64 bits.
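One practical consequence (a sketch, not from the original post): on
such a machine no standard integer type is wide enough to hold a
pointer, which is part of why C99 leaves uintptr_t optional.

#include <stdint.h>
#include <stdio.h>

int main(void)
{
#ifdef UINTPTR_MAX                /* defined only where uintptr_t exists */
    void *p = &p;
    printf("pointer as integer: %ju\n", (uintmax_t)(uintptr_t)p);
#else
    puts("no integer type here can represent a pointer");
#endif
    return 0;
}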
Machines with a non-flat address space are obsolete I think.

Your knowledge is insufficient. (For one thing, embedded systems,
which make up the vast majority of computers, often use non-flat
memory spaces, particularly Harvard architectures.) Displaying your
ignorance while arrogantly challenging the regulars - which indicates
that you have not lurked here for any time before posting - is not an
ideal way to correct the situation.
 
Steve

Keith said:
Steve said:
Keith Thompson wrote: [snip]
But the number of bits isn't always the most important aspect of a
type. If you want to count something, and you know there can't be
more than 32767 of them, just use int; if you want to measure the size
of something, use size_t.

That's nice. What if I don't like the layer of abstraction chosen for
the definitions of [numeric] data types?

Then define your own abstraction layer.

Maybe I will.
[snip]
Yeah, well that's true too. But you must turn off certain
optimisations when you anticipate examining the assembly language.
Heavy reliance on macros can also impede the understanding of generated
code. This is by the by, however. On one level, I am simply of the
opinion that certain things about the underlying architecture ought to
be kept in mind while writing code, rather than allowing the
architecture to impose itself upon the program after the code is
written and running. Most HLLs seem to insulate the program from the
world, and hence the programmer as well.

I tend to think of "the world" and the hardware as being at opposite
ends of the <handwaving>something-or-other</handwaving>, with the
programming language being a layer between them.

I suppose you would, considering what your background and focus seem
to be (just going by a quick look at your home page).

I care about the
visible behavior of my code; I don't care so much about how the
compiler makes that happen.

Practical considerations are always significant, but especially so if
you intend to get useful work done. Theory is, of course, the starting
point for practical exercise. My priorities are probably different
from yours. In the context of the abstraction layer, I am finding that
the choices people have made previously when designing their
programming environments are not always helpful.

In terms of numeric data types, and their use when writing code, the
shape of, say, integer data types in a language definition like C
contributes to the overall philosophy underlying the programming
language, which in turn, shapes the language tools that people will use
to solve problems. This is not really news, though. What I am doing
is observing that the particular abstraction of data types in C
contributes towards error-prone programming constructs. I suggest this
is a result of the way the language was designed; a failure to take
into account the fact that programs do not exist independent of
reality. The idea of the virtual machine is not complete, and leads me
to provisionally conclude that developing `closed' systems (from the
perspective of information processing) in which external constraints
are masked, is representative of oversimplification in the domain of
intelligent use of computational resources.

Again, this is not necessarily absolutely critical; we *can* write
everything in LOGO if we must, but few people would want to. C is very
popular and C++ even more so because they are more efficient than
alternatives for many kinds of problem. Especially if every program is
a one-off short-lifetime solution, language choice is not so much of an
issue. In a world in which some degree of permanence as well as
correct program behavior is valued, there are ideas that are not
contained in C that mar its suitability, in my opinion.

There are mature, philosophical ideas that lend the characteristics of
reality to the tools used to interact with it. C is missing some of
them, and incidentally, I do not consider myself sufficiently learned
to identify all of them. But it doesn't help that the common languages
get in the way of such learning.

Reality is somewhat out of fashion this year, and so I do expect my
blasphemy will not go unpunished. There's no need for excessive
flaming, the priesthood will no doubt take care of that at their
convenience.

Your mileage may vary, of course.

Yeah.


Regards,

Steve
 
Steve

pete said:
abstract machine ?

Virtual machine is probably more correct. IIRC, abstract machines
are generally found in liberal arts and humanities factories...
Certainly OT for c.l.c.


Regards,

Steve
 
