Syntax for union parameter


Jorgen Grahn

"Dennis and his friends" ... multiple people involved ... a specific
purpose in mind ... committee.

I'm not sure if you're joking, but the fact that you don't live alone
in a cave in a desert doesn't imply you're part of an ISO committee.

I don't actually know what happened at Bell Labs in the early 1970s,
but I /do/ know that good programmers talk to each other, and allow
ideas from the outside to affect their designs.

/Jorgen
 

David Brown

On 02/07/2014 01:41 PM, Rick C. Hodgin wrote:
The Microsoft authors could've agreed with me, but for whatever reason
chose to code in a manner contrary to the one I prescribe.
Do you have any reason why you would have chosen to code in a manner
contrary to the one that you yourself have prescribed?
No - I mean if you were creating the assembly code yourself, either
manually or by the use of RDC, rather than relying upon either MS or GCC
to create it.
I still don't understand. How have I "chosen to code in a manner contrary
to the one that [I myself] prescribed"?

You suggested that perhaps MS agreed with you about specifying a
particular order of evaluation, and then decided to implement the
opposite order in their compiler "for whatever reason". James was
trying to ask you if you could think of a reason why /you/ might specify
one method then implement the opposite. If you can't imagine such a
reason for yourself, why do you think it is a realistic possibility for MS?

AH! Thank you.

One reason: Because it had been done a particular way previously when
Microsoft acquired some company (that wrote a better compiler), and they
then released it under the Microsoft name without taking out the few parts
they disagreed with (because they didn't deem them important enough
issues to slow down public sales).

That would be a possible logical reason. Of course, it is not true in
this particular case (since MS don't have a specific order of evaluation
in their compiler), but yes, sometimes sales folk can force the
technical folk to make illogical decisions.
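
To make the evaluation-order point concrete, here's a minimal sketch (a toy
of my own, nothing to do with anybody's real compiler or code) where the
standard leaves the order unspecified, so two conforming compilers are free
to differ:

    #include <stdio.h>

    static int f(void) { printf("f "); return 1; }
    static int g(void) { printf("g "); return 2; }

    int main(void)
    {
        /* The order in which f() and g() are called is unspecified, so
           this may print "f g = 3" or "g f = 3" - both are correct C. */
        int x = f() + g();
        printf("= %d\n", x);
        return 0;
    }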

Here's an interesting bit of history for you regarding MS's C compiler
and another unspecified point in the C standards. The C standards do
not define the bit ordering in bitfields (sometimes the ABI for a target
platform defines the ordering, but DOS/Windows on x86 did not define
it). /Most/ compilers follow the convention of LSB being the first bit
for little-endian targets and the MSB being the first bit for big-endian
targets. Some big-endian targets also use the LSB as the first bit.
But early versions of MS's C compiler used the MSB as the first bit,
even though the x86 is little-endian - I don't know of any other
compiler that does that. But at one point, MS decided that they would
follow the conventions used by others and changed over to LSB for bit 0.
They did this between two minor revisions of their software - it
wasn't even a major upgrade. Any code that relied on the bit ordering
would break across those two versions.
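
Here's a minimal sketch of the sort of code that breaks across such a
change (the names and values are made up, and the layouts in the comments
are only what typical implementations do - the standard guarantees none of
it):

    #include <stdio.h>

    union reg {
        struct {
            unsigned lo : 4;   /* low or high bits of the unit? not specified */
            unsigned hi : 4;
        } bits;
        unsigned char raw;
    };

    int main(void)
    {
        union reg r;
        r.raw = 0x12;
        /* Typically "lo=2 hi=1" with LSB-first allocation,
           "lo=1 hi=2" with MSB-first allocation. */
        printf("lo=%u hi=%u\n", (unsigned)r.bits.lo, (unsigned)r.bits.hi);
        return 0;
    }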


While I disagree about your opinion that the C standards should specify
the ordering of operations between a pair of sequence points, I would
prefer them to have specified the ordering of bit fields.
 

David Brown

I suspect that requirement is equivalent to the halting problem, and

I just want to point out that nowhere did I claim that C *should* be a safe
language. In fact, I stated just the opposite - that C is not and never
will be a safe language.

That's a fair point. But I was merely saying why I think C could not
be, and/or should not be, a "safe" language in this sense.

I also don't think D is "safe" in this sense, though it is undoubtedly a
lot "safer" than C. (I have never tried D, but I read a fair bit of the
documentation for it a few years ago.) I don't think /any/ language can
be completely "safe" - as you get "safer", you trade power, flexibility,
and efficiency for decreasing gains in safety. But there is no doubt
that one can move a good way along the path to safety from C without a
big sacrifice - both C++ and D are examples of this.
 

Robbie Brown

The idea that something can be 'standardized' and still allow
flexibility of implementation sounds like wanting your cake and eating
it to me ... the result, well I tried to count the number of times the
word 'undefined' appeared in n1256.pdf; I got to 57 in the first 100 pages,
then decided to do something more productive.

Just for fun I thought I'd try the same thing with jls7.pdf which is the
spec for Java SE 7 Edition ... the word 'undefined' appears twice in 641
pages. Don't get excited, it's just a bit of fun but it does illustrate
what I mean about trying to standardize by committee.
 

David Brown

A standard should define how things operate in all cases. It should
define specific behavior that can be relied upon no matter the platform,
no matter the implementation, no matter the circumstances. And in all
cases, for all times, and in every situation, every last one of those
standards should be allowed to be violated through specific options
which purposefully, by design, enable or disable certain features.
These options should be determined by specific hardware advantages, or
limitations, thereby requiring or lending themselves naturally toward
their ends.

The fact that C leaves things up in the air is absolutely mind blowing
to me. How could a computer language be so poorly written? It is
absolutely essential that nothing be left to chance or personal desires
for implementation. The standard should define it all, and then anyone
who wants to deviate is free also to do so.

I think one of the troubles you are having here is understanding the
scope of C and its targets. When you are limiting your language to
specific platforms, such as x86 and ARM, you can make tighter
specifications. As an example, you can say that "int" is always 32-bit,
"char" is always 8-bit, numbers are always little-endian, and parameters
are always passed on the stack right-to-left. But C has to target a
huge range of devices. There are processors that cannot address
anything smaller than 32-bit. There are processors whose "int" is
naturally 64-bit, and processors whose "int" would be most efficient at
8-bit (though C says "int" must be at least 16-bit). Then there are
processors whose register sizes are 24-bit, or 40-bit. I have worked
with chips that have no stack, and even one with no RAM (it was quite a
simple program). There is no possible way that all the details of the
language behaviour in C could be tightly specified while still making it
possible to generate efficient code on such wide ranges of platforms.
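
As a rough sketch of what that looks like from the programmer's side, the
actual properties of the current platform are reported by <limits.h> - only
the minimums (CHAR_BIT >= 8, INT_MAX >= 32767, LONG_MAX >= 2147483647) are
guaranteed:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* The printed values differ from platform to platform;
           portable code relies only on the guaranteed minimums. */
        printf("CHAR_BIT     = %d\n", CHAR_BIT);
        printf("sizeof(int)  = %u, INT_MAX  = %d\n",
               (unsigned)sizeof(int), INT_MAX);
        printf("sizeof(long) = %u, LONG_MAX = %ld\n",
               (unsigned)sizeof(long), LONG_MAX);
        return 0;
    }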


If /I/ were to define a new language today, then I would probably
specify it more tightly than C does - though I would still leave things
like evaluation order open to the implementation. And I would make
additional demands on the systems that support it - I would insist that
they support integers of size 8, 16, 32, and 64 bits, for example (using
multiple instructions as necessary). But the flexible nature of C means
that many things have to be left open in the specifications.
 

Ben Bacarisse

Robbie Brown said:
The idea that something can be 'standardized' and still allow
flexibility of implementation sounds like wanting your cake and eating
it to me

Well, I don't see that as incoherent, but I don't think there's much
value in arguing over the word.
... the result, well I tried to count the number of times the
word 'undefined' appeared in n1256.pdf; I got to 57 in the first 100 pages,
then decided to do something more productive.

You will be missing loads then, because all the implementation defined
things also meet your criterion. However, appendix j has a handy
summary.

But I wonder how far you would want to go. It seems you'd like a fixed
evaluation order. OK. What about making accesses outside of an array
defined? What about simply constructing an invalid pointer? How about
making arithmetic overflow defined? Calling a function with the wrong
prototype in scope? Modifying an object defined as const-qualified?
Using a misaligned pointer? Using the relational operators on pointers
into different aggregates? I'm not so much interested in the actual
answers, of course, but I would like to know what you think the coherent
approach to all these kinds of things would have been.
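
Just to make a few of those concrete, here is a deliberately broken sketch -
every marked line compiles, and every one is behaviour that a "fully
defined" C would somehow have to pin down for every target:

    #include <limits.h>

    int main(void)
    {
        int a[4] = {0};
        int *p = a;

        int x = a[4];           /* access outside the array: undefined      */
        int *q = p + 17;        /* merely constructing this pointer: UB     */
        int y = INT_MAX + 1;    /* signed arithmetic overflow: undefined    */

        const int c = 1;
        *(int *)&c = 2;         /* modifying a const-qualified object: UB   */

        return x + y + (q != p);
    }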

<snip>
 

Ben Bacarisse

Rick C. Hodgin said:
The fact that C leaves things up in the air is absolutely mind blowing
to me. How could a computer language be so poorly written? It is
absolutely essential that nothing be left to chance or personal desires
for implementation. The standard should define it all, and then anyone
who wants to deviate is free also to do so.

That sounds crazy. When you say "anyone", presumably you mean the
compiler writers. If they can deviate in unconstrained ways, how can you
write C at all? The language becomes 100% "defined", but you can't rely
on that definition. I must have misunderstood.
 

BartC

Robbie Brown said:
On 08/02/14 09:12, Robbie Brown wrote:

Just for fun I thought I'd try the same thing with jls7.pdf which is the
spec for Java SE 7 Edition ... the word 'undefined' appears twice in 641
pages. Don't get excited, it's just a bit of fun but it does illustrate
what I mean about trying to standardize by committee.

Yet, a typical JVM might be implemented in C (or if not C, then C++). It
seems all those hundreds of 'undefined's didn't put them off.

It is also odd how such a well-specified language as Java should rely so
much on such an apparently under-specified language as C.

Now it is possible that, when you are not obliged to directly target a real
machine, writing a solid specification is much easier. For example,
Java's primitive types are always 8 (byte), 16 (short), 32 (int) and 64-bit
(long) integers; try and find anything near as definitive in the C standard
(and in implementations, long can just as easily be 32 or 64-bits, or
probably even 91 and a half!).

Java considers itself above having to deal with unsigned types too, another
massive simplification. In a real, efficient implementation however,
unsigned types are enormously helpful.
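
A small sketch of what I mean (my own example, not anything from Java or a
JVM): unsigned arithmetic wraps modulo 2^N by definition, which is exactly
what hashing, checksum and masking code wants:

    #include <stdio.h>
    #include <stdint.h>

    /* FNV-1a: the repeated multiply deliberately overflows, and because
       the type is unsigned the wraparound is fully defined. */
    static uint32_t fnv1a(const char *s)
    {
        uint32_t h = 2166136261u;
        while (*s) {
            h ^= (uint8_t)*s++;
            h *= 16777619u;
        }
        return h;
    }

    int main(void)
    {
        printf("hash = 0x%08lx\n", (unsigned long)fnv1a("hello"));
        return 0;
    }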

Now it is possible that a new, modern, streamlined and tidied-up version of
C, minus its huge amount of historical baggage, could be created, and
targeted at least at PC-class machines with well-defined word-sizes and
capabilities; then a specification with much fewer 'undefined's could be
written.

But no-one seems interested in doing that (or they end up creating C++,
Java, or Go). (I've experimented with this but I'm also interested in
replacing its syntax. So effectively another language still, even though the
result is not much higher-level, and doesn't do anything that different,
compared with C.)
 

Robbie Brown

snip


You will be missing loads then, because all the implementation defined
things also meet your criterion. However, appendix j has a handy
summary.

But I wonder how far you would want to go. It seems you'd like a fixed
evaluation order. OK. What about making accesses outside of an array
defined? What about simply constructing an invalid pointer? How about
making arithmetic overflow defined? Calling a function with the wrong
prototype in scope? Modifying an object defined as const-qualified?
Using a misaligned pointer? Using the relational operators on pointers
into different aggregates?

Good grief I have no idea, really, I'm coming (back) to C after years of
using a language that affords me the luxury of focusing all my brain
power on solving the problem at hand rather than having to be wary of
overflowing array bounds or dereferencing a null pointer or any of the
other million or so things you apparently need to be aware of. Let's
call it culture shock.
I'm not so much interested in the actual
answers, of course, but I would like to know what you think the coherent
approach to all these kinds of things would have been.

Do you think I have the answers? Not likely, I'm just stating my
impression after a couple of weeks writing C code again.

What is the ultimate solution to the problem of supporting every
architecture past and present and every line of code written 10, 20, 30,
40 years ago and every possible interpretation of some phrase (like i++)
.... make *everything* undefined? that's not a specification is it?

If you look up the definition of the word 'specification' you see words
like 'explicit' and 'exact' and 'essential characteristics' etc etc.
Some might say that if anything in a specification is left undefined
then, technically, it isn't a specification. I'm not so rigid and
understand that C is a different animal to Java but I just get an uneasy
feeling that as soon as a difficult decision is required the default
solution is to leave whatever it is 'undefined'

A more coherent solution might be to draw a line in the sand and come up
with a far more concrete definition of the language. I can't see this
happening until you can reprogram human nature however and I don't
believe that it will ever be possible 'by committee'

This language certainly exercises the old gray matter though which can't
be bad.
 

Rick C. Hodgin

I think one of the troubles you are having here is understanding the
scope of C and its targets. When you are limiting your language to
specific platforms, such as x86 and ARM, you can make tighter
specifications. As an example, you can say that "int" is always 32-bit,
"char" is always 8-bit, numbers are always little-endian, and parameters
are always passed on the stack right-to-left.

You could say that in C also, and then for specific implementations allow
an override to alter it for the machine-specific quirks (a 9-bit word, for
example).
But C has to target a
huge range of devices. There are processors that cannot address
anything smaller than 32-bit. There are processors whose "int" is
naturally 64-bit, and processors whose "int" would be most efficient at
8-bit (though C says "int" must be at least 16-bit).

So? I'm developing code in a computer language. I EXPECT it to behave
a certain way. I don't expect to have to bend to the peculiarities of
a particular machine. In fact, I should not even care about them in
most cases.

The idea that a C developer should know the mechanics of the implementation
of the language in ALL cases is crazy. And relying upon the idea that a
particular quantity must be "at least" so many bits is insane.

Fixed, rigid requirements should be defined and adhered to. And the CPU
designers, should they wish to offer a new product, will look at their
audience and determine if their minimally-readable 32-bit quantity CPU is
actually a good idea or not. The market would sort that out pretty quick.
Then there are
processors whose register sizes are 24-bit, or 40-bit. I have worked
with chips that have no stack, and even one with no ram (it was quite a
simple program). There is no possible way that all the details of the
language behaviour in C could be tightly specified while still making it
possible to generate efficient code on such wide ranges of platforms.

So what? It is the requirement of the C language authors for those CPUs
to figure out the mechanics. I'm writing for C, not for a machine.

I'm frankly amazed C developers have tolerated this.
If /I/ were to define a new language today, then I would probably
specify it more tightly than C does - though I would still leave things
like evaluation order open to the implementation. And I would make
additional demands on the systems that support it - I would insist that
they support integers of size 8, 16, 32, and 64 bits, for example (using
multiple instructions as necessary). But the flexible nature of C means
that many things have to be left open in the specifications.

Flexibility can still exist when you have rigid specs. The flexibility
simply, at that point, exists outside of the specs, as per overrides which
render parts of it contrary to specified behavior. On a platform without
a particular feature (no integer engine), the underlying mechanics mean
that everything must be done with floating point. The C developer should
never see ANY side effect of that in their properly written C program.
The compiler should hide every ounce of that away so there are no
variations at all. Communicating between machine X and machine Y, through
C's protocols, in that way, should always yield correct results. Only
when a program is compiled outside of the C specs (as through an explicit
switch of some sort) should it exhibit alternate behavior.

This design (1) affixes a rigid spec, and (2) allows anyone to do anything
they want or need to do for a piece of hardware that doesn't operate like
other hardware due to constraints, or enhancements.

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

That sounds crazy. When you say "anyone", presumably you mean the
compiler writers.
Correct.

If they can deviate in unconstrained ways, how can you write C at all?

You have to enable their extensions. For example, use:

xcc -switchA -switchB myfile.c

And now I've thrown switches A and B which change the default integer size
from 32-bits to 16-bits, and the default character size from 8 bits to 4
bits, because the xcc compiler, written for CPU xyz, knows about these
unique requirements of the hardware. However, if I use this:

xcc myfile.c

In this case my program compiles as per the specs, with a default 32-bit
integer, and a default 8-bit character (were those values actually hard-
coded into the specs for those quantities).
The language becomes 100% "defined", but you can't rely on that definition.
I must have misunderstood.

You can rely upon that definition, it's just that when you use switches
that are specific to that compiler, then you change the default behavior.

Best regards,
Rick C. Hodgin
 

Ben Bacarisse

Rick C. Hodgin said:
So? I'm developing code in a computer language. I EXPECT it to behave
a certain way.

That's why the language standard is so important.
I don't expect to have to bend to the peculiarities of
a particular machine. In fact, I should not even care about them in
most cases.

The idea that a C developer should know the mechanics of the implementation
of the language in ALL cases is crazy.

Sure, and they don't -- they need to know the language specification.

Is the C specification complex? Yes. That's largely down to its
history. Does that make C unsuitable for a large number of programmers?
Yes, I think it does. C seems to hold a mysterious fascination for many
people -- you're not a real programmer unless you can use C -- but there
are dozens of more productive ways to write most applications these
days.

The position is actually cyclic. There's a lot C code out there written
by people who've assumed a particular implementation's behaviour is what
the language guarantees. The standard can't tighten up the
specification without breaking that code, or, more likely, having the
compiler vendors ignore the change because it would break their
customers' code.

<snip>
 

Rick C. Hodgin

Sure, and they don't -- they need to know the language specification.

Wrong. Developers need to know if on this computer the int size is 16-bits,
or 32-bits, or 64-bits, or a gazillion bits, which means they need to know
something beyond C alone, and even beyond the compiler alone; they need to
know about the mechanics of the machine's underlying architecture ... and
that's just wrong to impose upon every developer in that way.

WAY too much information for a developer to have to consider, given that
he is writing code in a computer language, and not at the machine level.

RDC will have rigid types defined for char, short, int, long, and others.
They will never change on any platform, apart from command line switches
which allow extensions not specified in the spec.

The more I learn about C, the more I realize how horrid this thing is I've
been using all these years. I am so thankful I never knew about its rusty
undersides or I never would've devoted so much time and energy into coding
for it.

Best regards,
Rick C. Hodgin
 

Ben Bacarisse

Robbie Brown said:
Good grief I have no idea, really, I'm coming (back) to C after years
of using a language that affords me the luxury of focusing all my
brain power on solving the problem at hand rather than having to be
wary of overflowing array bounds or dereferencing a null pointer or
any of the other million or so things you apparently need to be aware
of. Let's call it culture shock.


Do you think I have the answers? Not likely, I'm just stating my
impression after a couple of weeks writing C code again.

Sorry, I didn't mean to put you on the spot. I wanted to point out that
avoiding what you call incoherence is no simple matter.
What is the ultimate solution to the problem of supporting every
architecture past and present and every line of code written 10, 20,
30, 40 years ago and every possible interpretation of some phrase
(like i++) ... make *everything* undefined? that's not a specification
is it?

Oh, come on. No one is suggesting that.
If you look up the definition of the word 'specification' you see
words like 'explicit' and 'exact' and 'essential characteristics' etc
etc. Some might say that if anything in a specification is left
undefined then, technically, it isn't a specification. I'm not so
rigid and understand that C is a different animal to Java but I just
get an uneasy feeling that as soon as a difficult decision is required
the default solution is to leave whatever it is 'undefined'

Really? Based on what evidence? The people who make these decisions
are not stupid.
A more coherent solution might be to draw a line in the sand and come
up with a far more concrete definition of the language.

Or to define a new one. Java, for example. But here's the thing...
Why are you using C now? What has driven you from the world of
well-defined behaviours into this quagmire of undefinedness? What
benefits does C have that might outweigh all these negatives? Do any
of them derive from the way C is defined?
I can't see
this happening until you can reprogram human nature however and I
don't believe that it will ever be possible 'by committee'

No, and not because "a committee" is some horrid thing you don't like,
but because a good committee represents all the stake-holders in the
language, in such a way that none can be thrown under a bus for the sake
of some other interest.
 

Rick C. Hodgin

The more I learn about C, the more I realize how horrid this thing is I've
been using all these years. I am so thankful I never knew about its rusty
undersides or I never would've devoted so much time and energy into coding
for it.

It is with renewed vigor I pursue RDC. I am thankful to the members of this
group for sticking so strongly to the peculiarities of C, voicing them in
this forum. I never knew. I never knew.

Best regards,
Rick C. Hodgin
 

Ben Bacarisse

Rick C. Hodgin said:
You can rely upon that definition, it's just that when you use switches
that are specific to that compiler, then you change the default
behavior.

I see. I thought you were saying something significant. I don't think
I've seen a language that does not work exactly like that, including C.
 

BartC

Rick C. Hodgin said:
Wrong. Developers need to know if on this computer the int size is 16-bits,
or 32-bits, or 64-bits, or a gazillion bits, which means they need to know
something beyond C alone, and even beyond the compiler alone; they need to
know about the mechanics of the machine's underlying architecture ... and
that's just wrong to impose upon every developer in that way.

I used to be a lot more critical of C than now. But a couple of years ago I
took a sizeable C program (some 20kloc) running on x86/Windows, and compiled
it on ARM/Linux. It worked first time, without changing a line of code
(afaicr).

From then on I've been a lot more impressed and less critical! (But still
fighting a private battle against its syntax and everything that makes
coding harder than it need be.)
WAY too much information for a developer to have to consider, given that
he is writing code in a computer language, and not at the machine level.

RDC will have rigid types defined for char, short, int, long, and others.

You can impose rigid types on C with some effort. If the hardware can't
support a particular width, then you will have the same problem with RDC.
Worse, because it's rigid.
They will never change on any platform, apart from command line switches
which allow extensions not specified in the spec.

Command line switches are a bad dependency for your program to have. The
source code should stand by itself. Pragmas are better.
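
For what it's worth, here is a minimal sketch of imposing rigid types from
within the source itself - the project-local names are hypothetical, and it
assumes a C11 compiler for _Static_assert:

    #include <stdint.h>
    #include <limits.h>

    /* Exact-width types, pinned down in the source.  On hardware that
       cannot supply these widths, <stdint.h> does not define int32_t or
       uint8_t at all, so the build fails loudly - the same problem a
       rigid language would hit, but visible in the code itself. */
    typedef int32_t s32;
    typedef uint8_t u8;

    _Static_assert(CHAR_BIT == 8,    "this code assumes 8-bit bytes");
    _Static_assert(sizeof(s32) == 4, "this code assumes a 4-byte s32");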
 

David Brown

You could say that in C also, and then for specific implementations allow
an override to alter it for the machine-specific quirks (a 9-bit word, for
example).

If specific implementations can override standards-defined behaviour,
then the behaviour is no longer standard! You can't have a "standard"
that says "int is always 32-bit" and then say "but for /this/ particular
compiler, int is 16-bit". You have two choices - you can do as D does,
and specify that "int is always 32-bit" and therefore the language is
not suitable for smaller processors, or you can do as C does and say the
choice is "implementation dependent". A feature is /either/ fully
defined and specified, /or/ it is implementation dependent - it cannot
be both.
So? I'm developing code in a computer language. I EXPECT it to behave
a certain way. I don't expect to have to bend to the peculiarities of
a particular machine. In fact, I should not even care about them in
most cases.

The idea that a C developer should know the mechanics of the implementation
of the language in ALL cases is crazy. And relying upon the idea that a
particular quantity must be "at least" so many bits is insane.

The whole point with the C standards is that programmers know which
parts are fixed in the specs, and which are variable. They can rely on
the fixed parts. For /some/ code, you might want to rely on
implementation-specific features - not all code has to be portable.

In the particular case of bit sizes, it is often perfectly reasonable to
work with types that are defined as "at least 16 bits". If you are
counting up to 1000, you don't care if the variable has a max of 32K or
2G. If you need bigger numbers, you can use "long int" and know that it
is at least 32 bits. If you need specific sizes (I often do in my
work), you can use types like int16_t and uint32_t. The system is
clear, flexible, portable, and works well on big and small systems. Of
course, it all relies somewhat on the programmer being competent - but
that applies to all programming tasks.
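
A minimal sketch of that system in use (the variable names are mine, just
for illustration):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        int           count;               /* at least 16 bits: fine for 0..999 */
        long          big   = 100000L;     /* guaranteed at least 32 bits       */
        int16_t       reg   = -1234;       /* exactly 16 bits, where available  */
        uint32_t      crc   = 0xFFFFFFFFu; /* exactly 32 bits, where available  */
        int_least32_t total = 0;           /* smallest type with >= 32 bits     */

        for (count = 0; count < 1000; count++)
            total += count;

        printf("%ld %d %lu %ld\n",
               big, (int)reg, (unsigned long)crc, (long)total);
        return 0;
    }
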
Fixed, rigid requirements should be defined and adhered to. And the CPU
designers, should they wish to offer a new product, will look at their
audience and determine if their minimally-readable 32-bit quantity CPU is
actually a good idea or not. The market would sort that out pretty quick.

Apparently you have /no/ concept of how the processor market works. You
live in your little world of x86, with brief excursions to ARM. Did you
know that it was only a few years ago that shipments of 8-bit cores
exceeded those of 4-bit cores? And that there are still far more 8-bit
cores sold than 32-bit? As for CPUs that cannot access 8-bit or 16-bit
data, these are almost always DSPs - and there is good reason for that
behaviour. The manufacturers will continue to produce them, and
designers will continue to use them - because they give better value for
money (or power, or space) than alternative solutions. And they will
continue to program them in C, because C works fine with such cores.
That is the way the market works.


One thing that strikes me in your writing here, is that you seem to have
a belief that there is such a thing as "absolute" specifications - that
you can define your language and say /exactly/ how it will always work.
This is nonsense. You can give more rigid specifications than the C
standards do - but there are no absolutes here. There are /always/
aspects of the language that will be different for different compilers,
different options, different targets. Once you understand this, I think
you will get on a little better.

So what? It is the requirement of the C language authors for those CPUs
to figure out the mechanics. I'm writing for C, not for a machine.

It is /precisely/ because C does not define these details, that you are
able to write for C and not for the machine. If C specified
requirements tuned for a particular processor type, then you would be
writing for that processor.
I'm frankly amazed C developers have tolerated this.


Flexibility can still exist when you have rigid specs. The flexibility
simply, at that point, exists outside of the specs, as per overrides which
render parts of it contrary to specified behavior.

If you allow "overrides", you no longer have rigid specs. You have
"implementation dependent" behaviour. This is precisely what the C
standards do. Sometimes I think the C standards could have some
multiple choice options rather than wider freedom (they have multiple
choices for the format of signed integers), but it is certainly better
that they say a particular point is "implementation dependent" than if
they were to say "/this/ is how to do it", and then allow
implementations to override that specification.
On a platform without
a particular feature (no integer engine), the underlying mechanics mean
that everything must be done with floating point. The C developer should
never see ANY side effect of that in their properly written C program.

Yes, and that is what happens today with C. So what is your point?
The compiler should hide every ounce of that away so there are no
variations at all.

No, it should not hide /everything/. You should be free to develop
general portable code, and free to take advantage of particular
underlying architectures, depending on the sort of code you are writing.
C gives you that.
Communicating between machine X and machine Y, through
C's protocols, in that way, should always yield correct results.

Yes, and C gives you that. You have to stick to the things that C
defines explicitly, but that's fine.
 

Robbie Brown

Sorry, I didn't mean to put you on the spot. I wanted to point out that
avoiding what you call incoherence is no simple matter.

I'll let it go ... just this once :)
Oh, come on. No one is suggesting that.


Really? Based on what evidence?

The sheer number of things that are or appear to be undefined.
The people who make these decisions
are not stupid.

That's an emotive statement; I never suggested such a thing.
Or to define a new one. Java, for example. But here's the thing...
Why are you using C now? What has driven you from the world of
well-defined behaviours into this quagmire of undefinedness?

Because I'm bored of thinking about business. I think in 'OO'.
Everything is an object; I'm now so far from the machine that I feel
uneasy. So I thought I'd look at C again and get some challenge back into
my comfortable, overpaid, overfed life.
What
benefits does C have that might outweigh all these negatives? Do any
of them derive from the way C is defined?

Pointers. I like the whole pointer thing, pointer arithmetic,
manipulating pointers. Pointers to pointers, pointers to an array of
pointers to struct, pointers to pointers to pointers. It makes my head
spin in a way it hasn't done since the early days. Pointers.
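
Even something as small as this (a throwaway sketch) is enough to get the
effect - an array of pointers to struct, walked with a pointer to pointer:

    #include <stdio.h>

    struct point { int x, y; };

    int main(void)
    {
        struct point a = {1, 2}, b = {3, 4};

        /* arr is an array of pointers to struct; pp walks it. */
        struct point *arr[] = { &a, &b, NULL };
        struct point **pp;

        for (pp = arr; *pp != NULL; pp++)
            printf("(%d,%d)\n", (*pp)->x, (*pp)->y);

        return 0;
    }
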
No, and not because "a committee" is some horrid thing you don't like,
but because a good committee represents all the stake-holders in the
language, in such a way that none can be thrown under a bus for the sake
of some other interest.

Hmmm, I'm not sure about this. Can you think of a single successful
business that doesn't have a strong leader at its head, someone who can
make the hard decisions and not take any crap from those who might not
like that decision?

I know some pretty smart software architects. They all have one thing in
common: they have a vision and they implement it. Sure, they have people
to help and advise them, but ultimately the buck stops with them; they
take responsibility and are responsible. It's their coherent vision that
drives them, and they get things done.

Anyway, all this is very interesting but I have to get back to
understanding why snprintf changes the content of its input string.
 
