Boost process and C

Robert Latest

On Sat, 06 May 2006 14:48:30 +1200,
in Msg. said:
I fear that C is in danger of shrinking into that ever diminishing niche
where other languages can't go.

Even if that were so, why is it something to fear?

robert
 
John F

Robert Latest said:
On Sun, 7 May 2006 23:08:10 +0200,


That alone isn't enough to make it meaningless. After all, the
difference of two dates makes perfect sense although it is not a
date.

It is meaningless in context. You get a new object, which is only an
artefact of applying an operation which is not defined yet. You can't
define it by saying the converse operation yields the same result. You
will have to find a representation if you want to implement it. This
means (as Keith pointed out correctly) fixing an offset and a scale
for your representation of dates, thus applying another projection into
the field of (maybe even) the real numbers. Only there are you allowed
to use the analogy of adding dates. There it is defined (although the
back-transform will give illegal dates). You can't define date1+date2
without assuming a representation as numbers.
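
To make the distinction concrete, here is a small C sketch (the dates are
picked arbitrarily): the difference of two dates is a meaningful interval,
while their "sum" only exists on whatever numeric representation (epoch
and scale) the implementation happens to use.

#include <stdio.h>
#include <time.h>

int main(void)
{
    struct tm a = {0}, b = {0};
    time_t ta, tb;

    a.tm_year = 106; a.tm_mon = 4; a.tm_mday = 6;   /* 2006-05-06 */
    b.tm_year = 106; b.tm_mon = 4; b.tm_mday = 11;  /* 2006-05-11 */
    ta = mktime(&a);
    tb = mktime(&b);

    /* The difference of two dates is a well-defined interval. */
    printf("b - a = %.0f seconds\n", difftime(tb, ta));

    /* A "sum" of two dates exists only on the numeric representation;
       its value, and the date it would map back to, depend entirely on
       the epoch and scale the implementation chose for time_t. */
    printf("ta + tb = %.0f  (not a date)\n", (double)ta + (double)tb);
    return 0;
}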
 
Ian Collins

Robert said:
On Thu, 11 May 2006 21:24:17 +1200,



They can always learn a different language.
I did, several.

But I still hate to see my first programming language marginalised.
 
websnarf

Robert said:
Can you write 102 well tested functions

I said "the necessary functions".[/QUOTE]

I've got plenty of emails and toy programs telling me that those 102
functions are *necessary*. Remember, as I counted out for CBF, the
number of functions in Bstrlib is comparable to that in the C standard
library, and the library is meant as a complete replacement for
char *. And the key here is "well tested": writing anything
non-trivial that is well tested in 1 hour (the estimated time for
downloading it, including the files in your project, and skimming the
documentation to the point of rudimentary understanding) seems highly
unlikely.

How can they have a performance advantage over "straight C" functions if
they add functionality to them and, more paradoxically, are written in
straight C themselves?

Because it's a re-interpretation of the representation of strings versus
the char * nonsense that people generally use. There is something
preventing people from realizing that length-prefixed strings just lay
waste to char * performance pretty much always (and given the backwards
compatibility with char *, in the few cases where char * wins I can't
lose either). So it's faster because people are either unwilling to put
in the effort, or unable to write performance-optimized code, or
ignorant of the good speed paths of most compilers, or whatever else it
would take to duplicate the performance of Bstrlib.

That is a lot of the value add of it. If it were so easy, why can't I
find anything comparable? Microsoft has a bazillion employees they
could throw at the problem, and the best they could come up with is
MFC's CString and TR 24731.
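
For the curious, a minimal sketch of the length-prefix idea (these are not
Bstrlib's actual definitions, just the general shape): the length query is
O(1) bookkeeping, and concatenation is a bounds check plus one memcpy,
with no strlen() scans anywhere.

#include <stdlib.h>
#include <string.h>

struct lpstr {
    size_t len;   /* bytes in use, excluding the terminating NUL  */
    size_t cap;   /* allocated size of data[]                     */
    char  *data;  /* kept NUL-terminated for char * compatibility */
};

/* Append src to dst, growing dst's buffer if needed; 0 on success. */
static int lpstr_cat(struct lpstr *dst, const struct lpstr *src)
{
    if (dst->len + src->len + 1 > dst->cap) {
        size_t ncap = (dst->len + src->len + 1) * 2;
        char *p = realloc(dst->data, ncap);
        if (p == NULL)
            return -1;
        dst->data = p;
        dst->cap = ncap;
    }
    memcpy(dst->data + dst->len, src->data, src->len + 1); /* +1 copies the NUL */
    dst->len += src->len;
    return 0;
}

Bstrlib itself handles far more (aliasing, error states, and so on); the
point is only that the stored length removes the repeated scans that
burden typical char * code.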

There's nothing deep going on here -- Bstrlib is just written to a
fairly high standard.
True. More to the point, every C programmer builds more or less his own
C library over the years.

Right. But this just bolsters the case that there should be an STL for
C.
 
websnarf

Robert said:
Even if that were so, why is it something to fear?

C is the only really scalable programming language that has intuitively
predictable performance. Unfortunately, the C standard committee
specifically disavows this fact.
 
Richard Tobin

John F said:
It is meaningless in context. You get a new object, which is only an
artefact of applying an operation which is not defined yet. You can't
define it by saying the converse operation yields the same result.

Why not? It is perfectly reasonable to define objects in terms of the
operations that can be performed on them.
You will have to find a representation if you want to implement it.

So what? That can be completely opaque to the user. Every implementation
can do it differently.
This means (as Keith pointed out correctly) fixing an offset and a scale
for your representation of dates, thus applying another projection into
the field of (maybe even) the real numbers.

That's one possible implementation, probably the most reasonable one.
Only there are you allowed to use the analogy of adding dates.

Why?

-- Richard
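
To make that concrete, here is a hypothetical interface (the names are
invented for illustration, not taken from any existing library) in which
the representation stays completely opaque and the type is defined purely
by its operations:

typedef struct date date;          /* representation deliberately hidden             */
typedef long        date_interval; /* e.g. days or seconds; implementation's choice  */

date          *date_make(int year, int month, int day);
date_interval  date_diff(const date *later, const date *earlier); /* later - earlier */
date          *date_add(const date *d, date_interval delta);      /* d + delta       */
void           date_free(date *d);

/* Deliberately missing: date_sum(const date *, const date *).  The set of
   operations defines what is meaningful for the type, and the sum of two
   dates is not among them. */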
 
Robert Latest

On 11 May 2006 03:10:31 -0700,
in Msg. said:
Right. But this just bolsters the case that there should be an STL for
C.

Container libraries are a good idea. But they shouldn't be mandated by
the standard. It's both counterproductive and unnecessary. Let's not
confuse the debate over the usefulness or necessity of some library with
the debate about what should and should not be C.

robert
 
Ian Collins

Robert said:
On 11 May 2006 03:10:31 -0700,



Container libraries are a good idea. But they shouldn't be mandated by
the standard. It's both counterproductive and unnecessary. Let's not
confuse the debate over the usefulness or necessity of some library with
the debate about what should and should not be C.
The C++ standard library has been almost universally accepted by the
developer community. Being standard, it provides a solid base for
portable code. As such, it has become a very productive tool.

Its success has been partly due to the language committee standardising a
well-regarded library that was already widely used. It's unfortunate that
C lacks such a library.
 
Robert Latest

On Thu, 11 May 2006 23:37:21 +1200,
in Msg. said:
Its success has been partly due to the language committee standardising a
well-regarded library that was already widely used. It's unfortunate that
C lacks such a library.

C doesn't "lack" such a library in the sense that complex data
structures cannot be implemented effectively in C. Several good
container libraries exist, and you are free to choose and use the one
that best fits your application. And if you really think you can't do
without an ISO-approved library, you are free to write C wrappers around
C++'s STL.
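
As a rough illustration of what such a (deliberately non-standard) library
looks like in plain C, here is a bare-bones growable array; the names are
hypothetical and error handling is pared down to the minimum:

#include <stdlib.h>
#include <string.h>

struct vec {
    void  *data;       /* elements stored back to back  */
    size_t elem_size;  /* size of one element in bytes  */
    size_t len;        /* elements currently stored     */
    size_t cap;        /* elements the buffer can hold  */
};

/* Copy one element onto the end of the vector; 0 on success. */
static int vec_push(struct vec *v, const void *elem)
{
    if (v->len == v->cap) {
        size_t ncap = v->cap ? v->cap * 2 : 8;
        void *p = realloc(v->data, ncap * v->elem_size);
        if (p == NULL)
            return -1;
        v->data = p;
        v->cap = ncap;
    }
    memcpy((char *)v->data + v->len * v->elem_size, elem, v->elem_size);
    v->len++;
    return 0;
}

/* Usage:  struct vec v = { NULL, sizeof(double), 0, 0 };
           double x = 3.14;
           vec_push(&v, &x);                              */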

I think it is one of the *strengths* of the C standard that it requires
only minimal functionality (compared to C++ or other higher-level
languages) from the implementation, and it doesn't seem to have
stopped C from being a language in which huge numbers of applications
have successfully been coded.

robert
 
Robert Latest

On Thu, 11 May 2006 08:48:51 +0000,
in Msg. said:
Heck, the teddy-bear thing even works
when I'm wandering randomly around the kitchen!

"Clint, where on earth is the kett... oh, I'm holding it, aren't I?"

Richard, you need to get a life, not a teddy bear ;-)
As for the chattiness, well, I can't help that. I'm a naturally chatty
person.

I know. So am I. I do appreciate chattiness, so don't take my criticism
of yours too harshly.

There are things that I find worthier of criticism in CU, such as
instances of, "Here is the complete source of <whatever>. To see
examples of how it is used, look at the CD" when it should actually be
the other way round. If you like I can give you specifics.

No, I don't regret having bought the book. I like it, including the
chatty parts.

robert
 
Robert Latest

On 11 May 2006 03:15:01 -0700,
in Msg. said:
C is the only really scalable programming language that has intuitively
predictable performance. Unfortunately, the C standard committee
specifically disavows this fact.

For very good reasons, the C Standard says *nothing* about C's
performance. Why should it? Performance issues should be left to the
people who know best, the implementors.

robert
 
Richard Heathfield

Robert Latest said:
There are things that I find worthier of criticism in CU, such as
instances of, "Here is the complete source of <whatever>. To see
examples of how it is used, look at the CD" when it should actually be
the other way round. If you like I can give you specifics.

No need. Personally, I think it should *all* be in the book, where you can
still read it even if you're caught on a train without a laptop to hand.
But there are limits to the carrying capacity of the average passenger
coach, and the railway lobby got to me in advance and warned me off (a guy
with a railway sleeper called at my house to tell me precisely what he'd do
with it if it didn't outweigh the finished book).
No, I don't regret having bought the book. I like it, including the
chatty parts.

Good, because - if I ever write another - it'll be even more
chatteryierierer.
 
John F

Richard Tobin said:
Why not? It is perfectly reasonable to define objects in terms of the
operations that can be performed on them.

No. It doesn't make sense to define it that way. You can't make use of
the result if you do. (see below)
So what? That can be completely opaque to the user. Every implementation
can do it differently.

We were talking about the math behind it. Weren't we?
That's one possible implementation, probably the most reasonable one.

IMHO it is the only one to allow an addition of dates (with the same
syntax but different inherent semantics and without any result in the
domain of dates, just an intermediate result in the domain of
numbers).

Because the operation "add", as we are used to it working (e.g. for
natural numbers: successively counting all cardinalities in all
summands), is defined for numbers and not for dates. You will need a
(bijective) homomorphism between dates and numbers to be able to use
the same rules as with numbers.

Now it turns out that the back-transform into dates of a sum of two
numbers representing dates does not make any sense, so you can't apply
it to dates. It is an abstract artefact, like some effects in the
frequency domain vs. the time domain: for g(jw) = 1 you cannot perform
the back-transform within the domain of continuous functions; you need
distributions to represent the result.

You will have to abstract from "date" as you know it to something
different to give it a meaning there. You can still accept it in the
domain of numbers, but you will not be able to transform

f(date1) + f(date2) = k

into

date3 = f^(-1)(k)

because the result makes no sense when interpreted as a date.

I hope that makes things a little clearer now.
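
As a small numeric illustration (the epochs and day numbers below are
arbitrary, not taken from any real calendar library): encode each date as
"days since some epoch" via f, add the two numbers, and map the sum back
with the inverse. The answer changes with the epoch, so the "sum" is not a
property of the dates themselves, while the difference is epoch-independent.

#include <stdio.h>

static long f(long day, long epoch)  { return day - epoch; }  /* date -> number   */
static long finv(long n, long epoch) { return n + epoch;   }  /* number -> "date" */

int main(void)
{
    long date1 = 732432, date2 = 732437;  /* two arbitrary day numbers */
    long epochs[2] = { 0L, 700000L };
    int i;

    for (i = 0; i < 2; i++) {
        long k = f(date1, epochs[i]) + f(date2, epochs[i]);
        printf("epoch %ld: f^(-1)(k) = %ld\n", epochs[i], finv(k, epochs[i]));
    }
    /* Prints two different "dates" (1464869 and 764869), whereas
       f(date2, e) - f(date1, e) is 5 for every choice of epoch e. */
    return 0;
}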
 
Dave Thompson

Flash Gordon wrote:

size_t is also a similar artificial limitation.

The required properties of size_t force every C implementation to have
some limit, fixed not later than compile time, on the size of objects
and hence of arrays and strings. Given that, using size_t never
imposes any limit stricter than the implementation already does. But
int (and even unsigned int) may be too small to represent all valid
subscripts or offsets of a valid object; I think that's the point of
calling it an _artificial_ limit. In fact one of the (obscure and
rare) systems I use today does exactly that. And in fact I sometimes
write routines that use int or uint length/size for efficiency knowing
and accepting that this limits some possibly good uses of them.

The fact that arrays can only take certain kinds of scalars as index
parameters is also an artificial limitation.

Not in C. Notice that 6.5.2.1 and 6.5.6 talk about 'integer' type, not
specifically 'int'. And similarly for declarations in 6.7.5.2. The
_values_ of these subscripts and bounds can't exceed some limit which
cannot be greater than SIZE_MAX (+1 where applicable), but the type
may be long long long long long int or whatever.
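
A tiny example of that point (hypothetical function, standard C only): the
subscript expression may have any integer type; only its value is limited
by the object's size, and size_t is guaranteed wide enough for any valid
offset where int or unsigned int may not be.

#include <stddef.h>

/* Sum the bytes of an object whose size may exceed what int can count. */
unsigned long byte_sum(const unsigned char *obj, size_t size)
{
    unsigned long sum = 0;
    size_t i;                    /* safe counter for a full traversal */

    for (i = 0; i < size; i++)
        sum += obj[i];

    if (size > 0) {
        long long j = 0;         /* any other integer type is equally   */
        sum += obj[j] - obj[0];  /* legal as a subscript (a no-op here) */
    }
    return sum;
}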

FORTRAN on the other hand does have this problem. Originally it had
only one size of INTEGER and dimensions and subscripts were of that
type. When F90 added multiple 'KINDs' (meaning widths) it specified
apparently for backward compatibility that bounds and subscripts are
of the 'default INTEGER KIND', which is also required to be the same
size (in storage at least) as the default = single precision REAL and
thus in practice must usually be the machine word, at least on
machines that have a recognizable word size. And there are machines,
and (particularly) now sizes of large number-crunching problems coded
in Fortran that people want to run, that exceed that default size.

But it turns out that basically every language and every array-like or
string-like (with the notable exceptions of Lua and Python) has a similar
kind of limitation.
I'm not sure if you meant every language with any array-like or
string-like type, or every such type in every language. Although I
can't immediately think of a case where it makes a difference.

I'm pretty sure LISP allows array subscripting by bignums, but still
subject to available (virtual) memory which in reality <G> always has
some limit. I'm not sure about strings. LISPers traditionally didn't
focus much on things that would need long strings.

I don't recall what APL does here, but I think it would be worth
checking; that language (or rather its designer, Iverson) thought more
thoroughly about mathematical 'sense' than any other I know.

- David.Thompson1 at worldnet.att.net
 
