James said:
[...]
That's boost::lexical_cast. Which is probably what I'd use if I
ever needed it. I'll admit that I never saw the utility of
boost::lexical_cast either. And I'm rather sceptical of using
iostream inserters and extractors for this, too.
Consider something like:
std::complex< double > z( 1.0, 0.0 ) ;
double x = boost::lexical_cast< double >( z ) ;
The result should obviously be 1.0.
There is nothing obvious about this.
There's nothing obvious about the fact that the complex number
1+0i should become simply 1 when converted to double? I'm not a
mathematician, but I don't see what else would be reasonable.
a) Nothing else would be reasonable.
b) That does not imply that returning the real part is reasonable. You seem
to assume that there must be a reasonable conversion from complex to real
(and since all other candidates are non-reasonable, it must be this one).
That is somewhat like: we must do something; this is something; therefore
we must do this.
The only natural map is the inclusion R >--> C. There is no natural map in
the other direction. That is why conversion from complex to real should
fail but conversion from real to complex ought to work.
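For what it's worth, C++ itself mirrors that asymmetry. A minimal illustrative snippet (the `to_real` helper is hypothetical, just to make the explicit step visible):

```cpp
#include <complex>

// The inclusion R >--> C is reflected in the language: a real number
// converts implicitly to a complex one, as in
//     std::complex<double> z = 1.0;
// but there is no implicit conversion back. Going from complex to real
// requires an explicit, deliberate choice of what "real" means here:
double to_real(std::complex<double> const& z) {
    // double x = z;   // would not compile: no conversion C --> R
    return z.real();   // explicitly picking the real part
}
```

So the language agrees with the math: the round trip back from complex to real only happens when the programmer spells out which projection is meant.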
I don't know. It might be the right thing to do for something
like complex<double>(0.0,1.0), but here, I rather doubt it.
Either way, it highlights the crux of the problem: the behavior
of the conversion is essentially arbitrary, with no regard to
the relevant semantics. It's a case of doing something because
you can, not because it has any meaning.
Correct, lexical_cast<> is very much like reinterpret_cast<>. It just
chooses a different intermediate representation for the conversion (string
as opposed to bit-pattern). However, that is not a problem with the
definition of lexical_cast<>; it just shows that you have to give some
thought to whether you want to use it. I think that is the reason for
the "cast" part in the name.
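To make the string-as-intermediate-representation point concrete, here is a rough sketch of how such a cast can be built (a deliberate simplification for illustration; the real boost::lexical_cast is considerably more careful):

```cpp
#include <sstream>
#include <stdexcept>
#include <string>

// Hypothetical sketch of a lexical_cast-style conversion: push the
// source through a string via operator<<, then pull the target back
// out via operator>>. Whatever those operators happen to do defines
// the conversion -- hence the reinterpret_cast-like flavor.
template <typename Target, typename Source>
Target lexical_cast_sketch(Source const& src) {
    std::stringstream interpreter;
    Target result;
    // Fail if insertion fails, extraction fails, or characters
    // are left over after skipping trailing whitespace.
    if (!(interpreter << src) ||
        !(interpreter >> result) ||
        !(interpreter >> std::ws).eof()) {
        throw std::runtime_error("bad lexical cast");
    }
    return result;
}
```

Note that with this sketch (and, I believe, with a typical real implementation) the complex example above would actually throw rather than yield 1.0: `std::complex<double>(1.0, 0.0)` inserts as `(1,0)`, which no double extractor will consume.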
There's definitely an anti-pattern of defining generic solutions
when they're not appropriate. std::pair and boost::lexical_cast
being good examples.
I would conjecture that you are just not facing the problems that
std::pair<> solves. Defining std::pair is most definitely not an
anti-pattern. In my math programming, std::pair just corresponds to taking
cross-products. That is very useful and expresses intent very clearly. It
might be an abstraction that does little for you, but it works very well in
the problem domain that is of interest to me. The only anti-pattern in
sight would be to use std::pair where you shouldn't.
That's a bit different. One might argue that there is a design
flaw in requiring order where there isn't a logical order (e.g.
making map and set ordered). In this particular case, there are
pragmatic considerations involved as well, however.
That is not different from any other case. There are always pragmatic
considerations, and in the end the usefulness of a solution (be it generic
or not) ultimately decides.
It probably would have been cleaner if the standard containers
had only defined a specialization of less, rather than the
operator<. Arguably, < has a specific meaning which isn't
applicable here. (And that is really your argument: that
operator overloading has been abused.) But the results serve a
very definite pragmatic purpose. You may disagree with the
relationship between the operator and the semantics associated
with it (arbitrary ordering, for use in ordered containers), but
that doesn't mean that providing a generic solution for those
semantics is necessarily wrong.
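That pragmatic purpose is easy to demonstrate. A small sketch (names of my own choosing) of std::pair's lexicographic operator< doing exactly the job the ordered containers need:

```cpp
#include <set>
#include <string>
#include <utility>

// std::pair's operator< compares lexicographically: first components
// first, then second components. The ordering is arbitrary with
// respect to the data's meaning, but it is a consistent strict weak
// ordering -- which is all that std::set and std::map require.
typedef std::pair<int, std::string> Entry;

std::set<Entry> make_index() {
    std::set<Entry> s;
    s.insert(Entry(2, "a"));
    s.insert(Entry(1, "c"));
    s.insert(Entry(1, "b"));
    return s;
}
```

Iterating the resulting set visits (1,"b"), (1,"c"), (2,"a") in that order: the ordering carries no domain meaning, yet it is precisely what makes the container usable.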
I was not the one arguing that an overly generic solution is wrong. The one
calling that an anti-pattern was you. I just gave operator< as an example
for why I don't believe your claim that defining generic solutions that can
be used beyond their natural domain of applicability is an anti-pattern. I
have to admit that I do prefer useful designs to clean ones. (Now, if you
can have both, then that's even better.)
From my point of view, it is better to have a library component that can be
misused but solves the problem it is supposed to solve than a library
component that tries to prevent misuse at the cost of restricted
applicability. I think a library should be by and large trusting to the
judgement of its clients. Thus, I don't mind a certain excess-genericity.
As for the particular case of lexical_cast, I found good uses
of it. But I also have generic IO for pairs, sequences,
tuples, and so on because it makes perfect sense in the
context of my code-base to have such operations available.
Then lexical_cast provides a convenient way, e.g., to initialize
data in small test programs like so:
matrix< double > A =
lexical_cast< matrix< double > >( "[ [ 1.0 0.2 ] [ 0.0 -1.9 ] ]" );
That conveys content much better than all alternatives.
Hmmm. I'd have thought a constructor taking a string or an
istream would be more appropriate. Otherwise, a generic
conversion to/from string might be something to consider; I
don't see anything wrong with generic toString() and fromString()
functions, for example.
I find it very clean to put the code for formatting in operator<< and the
code for reading in data in operator>> and be done with it. To have
additional constructors from stream or string smells like code duplication
to me. The generic toString and fromString are very close to lexical_cast. In
either case, both ways just provide some syntactic sugar for conversion to
and from string that are available as soon as operator>> and operator<< are
defined.
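For what it's worth, a sketch of the two helpers under discussion (names and signatures are my own, not an established API) shows just how thin the layer over the stream operators really is:

```cpp
#include <sstream>
#include <string>

// Hypothetical toString/fromString: pure syntactic sugar over
// operator<< and operator>>. They become available for a type the
// moment its stream operators are defined -- exactly the situation
// described above for lexical_cast.
template <typename T>
std::string toString(T const& value) {
    std::ostringstream out;
    out << value;
    return out.str();
}

template <typename T>
T fromString(std::string const& text) {
    std::istringstream in(text);
    T value;
    in >> value;
    return value;
}
```

Neither function adds any conversion logic of its own; all the semantics live in the type's operator<< and operator>>.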
To/from just anything, however, using the string representation
as an intermediary, is unnecessary genericity.
As I said, it is just convenient syntactic sugar. With the conversion
functions you suggested, one could write
target = fromString<target_type>( toString( source ) );
I see very little difference. In my code, almost all lexical_cast go from or
to string anyway. (However, that is mostly because I prefer the crazy
reinterpretations to take place through piping on the command line.)
Best
Kai-Uwe Bux