
Rainer Weikusat

Ben Morrow said:
It explicitly does not keep the variable alive: it sets the PADSTALE
bit, which indicates that this variable is 'dead', and prevents string
evals and such from picking the variable up by mistake. It doesn't
deallocate the storage used for the variable, but that's just the same
level of optimisation as malloc uses when it doesn't reduce the brk just
because something has been freed.

It's not: Malloc usually retains whatever 'memory' (address space, actually) it
already requested from the kernel in order to satisfy future
allocation requests. This means that code like this

-------------------
use Devel::Peek;

sub yow
{
    for (0 .. 100) {
        {
            my $a = "$_ + 1";
            print(\$a, "\n");
            Dump($a);
        }

        {
            my $b = "$_ + 1";
            print(\$b, "\n");
            Dump($b);
        }
    }
}

&yow;
-------------------

would reuse the same space both for the $a and $b 'control blocks' and
their string bodies and that all meta-information associated with any
of these four 'logical' memory areas would be lost after each
block: There wouldn't be a way to 'set the PADSTALE bit' for any of
the scalars because whatever memory happened to be allocated to them
would turn into a completely generic 'free memory area' of a certain
size. That's not happening in this case (for perl 5.10.1 at least).
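A small script (not from the thread; a sketch to illustrate the point) makes the difference observable: the lexical is backed by the same pad slot, and typically the same SV, on every pass through the loop, rather than dissolving into a generic free memory area:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch: the lexical $v occupies the same pad slot on every pass
# through the loop, so its SV address usually stays constant --
# unlike a malloc'd block, which could be recycled for anything.
my %seen;
for my $i (0 .. 9) {
    my $v = "value $i";
    $seen{ sprintf "%s", \$v } = 1;    # record the stringified address
}
printf "distinct addresses: %d\n", scalar keys %seen;
```

On a typical perl this prints `distinct addresses: 1`, ie, the per-variable identity survives each block exit.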

[...]
The computer has been programmed to work around this obsession with
the 'micro-optimized lifecycle management' (And this is a
euphemism. I'd wager a bet that this is practically just 'create a
new variable whenever you need one because remembering what
variables were created two lines ago would be SO cumbersome' [and
'plan in advance which variables will be needed' an imposition beyond
any physically tolerable by man]).

We are not writing assembler, where you have a limited number of
registers and their use needs to be planned carefully.

Programming means 'construction of algorithms', these algorithms are
usually stateful, that is, they employ variables to keep track of
'past events' and constructing such an algorithm requires a certain
amount of planning/ forethought, especially if it is supposed to be a
sensibly implemented algorithm, that is, one which both solves a
certain problem efficiently and does so without unneeded complications
putting a burden on the mind of someone trying to understand the code.

You're free to believe that you're - thankfully - absolved from the
burden of thinking about what you're going to write before you write
it because "you are not writing assembler" but so far, I haven't
encountered a more ludicrous statement about 'software development'
(and I'm fairly certain that you didn't mean to express that).
A variable is just a way of giving a value a name;

This is true for so-called 'functional programming languages' but Perl
isn't one: There, a 'variable' is something like a deposit box: A
container which can be used to store 'stuff' of a certain kind
(depending on the type of box) until it is needed again, and which can
be 'addressed' in a convenient way (usually, by using an abstract name
referring to the 'function' of this variable).

[...]
The whole point of lexical variables is to avoid the problems that
occur when uncontrolled and implicit data leakage occurs between
different parts of the program; having Perl ensure that values we no
longer need are properly disposed of as soon as possible is just
common sense.

(And, again, this is not about efficiency, either of CPU or memory. It's
about making the code comprehensible.)

IMO, it is about making the code incomprehensible for the sake of
'efficiency', namely, to avoid the dreaded, mythological function call
overhead, by cramming as many different algorithms into a single run
of sequential code as seems remotely feasible instead of giving
'different things different names' and invoking them using these in
higher-level control routines. If 'possible information leakage'
becomes a problem, the constituent parts of 'the code' are way too
large and do way too many different things.
Um, it's the one in the 'my' statement just above your cursor. That's
the whole point of tight scoping: except for the various kinds of
globals, which should not be created lightly and do require planning,
the scope of a single variable should not exceed one screenful of
code.

Except if 'screenful' is supposed to refer to 80x25, it should
usually be less: Some random 'screenful of code' I just looked at (148
lines of text) contained five different complete subroutines whose
bodies were (from top to bottom) 5 lines of text, 12 lines of text,
17 lines of text, 4 lines of text and 1 line of text.

[...]
If you are having to deal with code which has multiple variables with
the same name in nested scopes, some extending over hundreds of lines of
code, you have my sympathies. That is not a code style I am
advocating.

It's rather "multiple variables of the same type with the names spelled
somewhat differently, eg mdmCommand, MdmCommand and MDMCommand, all
supposed to contain the same thing, namely, the current MDM command,
which occur (or in this case, occurred) in the same 'hundreds of lines
of code' subroutine (Java method), with the number of different
spellings presumably equal to the number of different people who added
code to this particular method" (this is slight 'abstraction' of the
actual situation, but IMO an honest one). And without the possibility
to 'declare variables on the spot' whenever one is needed, something
like this couldn't occur.
 

Rainer Weikusat

Ben Morrow said:
How is that different from 'just a way of giving a value a name'?

In simple terms, it is 'stuff the value into the named box' vs
'evaluating a term in the context of some binding', that is, with a
certain set of 'name substitution rules' in place. I don't know if
something like 'introductory text books' (or WWW texts) on
'functional programming' (vs 'programming with imperative languages')
exist but if they do, you'd find a better explanation there
(especially one you're more likely to consider than to reject 'with
a jerk of your knee' because it comes with $authority attached :->).
We were discussing these two forms:

    for (...) {
        my $tmp = "/.../$_";
        ...;
    }

    my $tmp;
    for (...) {
        $tmp = "/.../$_";
        ...;
    }

and you claimed the second was superior on grounds of efficiency.
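(For concreteness, a runnable version of the two quoted forms, with invented placeholder paths and loop values, might read:)

```perl
use strict;
use warnings;

# Form 1: lexical declared inside the loop body.
# (The list and path prefix are placeholders, not from the thread.)
my @first;
for my $name (qw(a b c)) {
    my $tmp = "/some/dir/$name";
    push @first, $tmp;
}

# Form 2: lexical declared once, assigned inside the loop.
my @second;
my $tmp;
for my $name (qw(a b c)) {
    $tmp = "/some/dir/$name";
    push @second, $tmp;
}

print "@first\n";
print "@second\n";
```

Both forms compute the same list; the dispute is only over where the declaration belongs.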

I didn't claim that, I pointed it out. Arguably for tactical reasons,
because trying to convince people who dumped $native_language with a
sigh of relief at the earliest possible opportunity during their
educational career in order to dedicate themselves to the domain of
incomprehensible strings of non-alphabetic symbols meticulously
arranged according to some set of traditional rules that they're still
writing texts and the same rules for "don't make a mess of it" apply
as to any other text (eg, don't constantly and gratuitously introduce
new stuff just because you can) is hopeless: Pointing out that the
result is lacking according to some measurable 'hard' criterion will
usually at least lead to the admission that "well, it does, but -
cunningly - we programmed the computer to work around that already".

But that doesn't really belong into a discussion of problems which might
occur in large, sequential runs of code when band aids of this kind
weren't available.
Of course.


Do you understand the meaning of 'should not exceed'?

I think 'should not exceed' and 'should usually be a lot less than'
are substantially different.
 

Tim McDaniel

I would quibble with this terminology. There are languages (LISP and
its children) that I would call "functional" because the MAJOR
emphasis is on functional programming, but they may still provide a
way to change the value of variables (setq and such) in the same way
as an imperative language like Perl. I do not see setq and such as
fatally tainting the essential functionalness of the language. Your
definitions may be different.
How is that different from 'just a way of giving a value a name'?

Because it's giving the STORAGE LOCATION a name. The value in the
storage location can change without altering the name of the
container.

If the Lochners move out of a house and the Willowbys move in,
and it remains 730 Watling Street, that's the name of the container.
If it goes from being Lochner House to Willowby House, it's the name
of the contents.
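The house analogy can be sketched directly in Perl (an editorial illustration, not part of the original post): assignment replaces the contents, while the container, ie the storage location, stays put.

```perl
use strict;
use warnings;

my $x = "Lochner";
my $before = \$x;    # reference to the container
$x = "Willowby";     # new contents, same container
my $after = \$x;

print $$after, "\n";
print(($before == $after) ? "same container\n" : "different container\n");
```

This prints "Willowby" and then "same container": the name $x named the location all along.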

I contrast this with single-assignment languages. Since I wasn't
familiar with them, I'll explain in a bit of detail, even though many
other people already know about it. I was unfamiliar with them until
I had to use DSSSL, a Lisp-like language that was designed for
transforming XML into other XML or into formatted output. XSLT is a
later version of the same concept.

DSSSL is single-assignment, and I think XSLT is too. You can define a
name for the result of a computation, but the definition is immutable
-- you cannot change the right-hand side once defined. You cannot
change the input either.

In that case, there may BE no "storage location". Wherever you use
the name, so far as I can tell, the interpreter is entirely free to
substitute the right-hand side verbatim at that point. I think that
any reasonably efficient interpreter would have to evaluate it once
and then cache the result. But ideal caching is always behind the
scenes.

In that case, I'd call it a name for the value. You may object that
it's more properly a name for the expression, and I can't argue
against that viewpoint, but I am more interested in the value and they
do go together.

(As an aside, can I exclaim about what a hideous PAIN IN THE ASS it is
to program a single-assignment pattern-matching language? So many
times I wanted to do the imperative thing (assign -> use) when I
couldn't even think of a way to do it with this single-assignment
thing. End of rant.)

Anyway, sorry if I belabor the obvious.
 

Charlton Wilbur

BM> Why are you talking about Java? Who (here) cares about Java?

I find myself caring about it for professional reasons.

BM> Everyone knows it encourages poor programmers to write
BM> incomprehensible rubbish.

I thought that was Perl. Java encourages poor programmers whose
greatest aspiration is to be an unthinking cog in a poorly-designed
machine to write incredibly verbose rubbish, most of which could more
easily be copied and pasted.

Charlton
 

Rainer Weikusat

Ben Morrow said:
Yes, I understand the difference between let and setq, and between a pad
slot called $x and the SV it points to, and I know that Perl has no real
equivalent of a let-binding.

[JFTR: That would be local]
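The remark can be illustrated with a short sketch (mine, not from the thread): 'local' temporarily rebinds the value of a package variable for the dynamic extent of the enclosing block, which is roughly the let-binding analogue being alluded to.

```perl
use strict;
use warnings;

our $greeting = "outer";

sub show { return $greeting }

sub demo {
    local $greeting = "inner";   # restored automatically on scope exit
    return show();               # callees see the rebound value
}

print show(), "\n";   # outer
print demo(), "\n";   # inner
print show(), "\n";   # outer again
```

Note this is dynamic scoping, not the lexical binding a Lisp 'let' over a lexical variable would give.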

[...]
Arguably for tactical reasons because
[...]

ESENTENCETOOLONG: Stack overflow, core dumped.

Methinks this should be 'Core overflow, stack dumped' ...

NB: This is not a sensible reply insofar as its content goes. But IMO,
everything has been said, and fighting the war of the words to the
bitter end in order to remain standing seems tiresome.
 

Rainer Weikusat

Ivan Shmakov said:
Ivan Shmakov <[email protected]> writes:
[...]
When people can't have multiple disjunct sets of variables used by
unrelated parts of the same 'aggregate subroutine', they do what
they should be doing instead, namely, structure their code.
I disagree. I deem the use of nested scopes as crucial to code
structuring. Should the "roles" of the variables (whether input,
output, or local) become apparent later, it'd be trivial to split
the function, -- and that's likely to be done exactly along the
scope boundaries previously coded in.
To quote my boss: Make it work now quickly and clean it up later :).
Which means: Take some existing code which performs a more-or-less
related task, copy'n'paste it to some part of the countryside where
no trenches have been dug yet or the old ones worn out over time, and
create a new nested scope lest all hell breaks loose because of any
accidental interaction with the surroundings, any details about them
long lost in the land of ancient lore,

[...]

... Once, I will find the patience to wait for the food
engineers out there to design a sound nutritional solution.

Meanwhile, I'm forced to rely on the off-the-shelf products,
which are known to be full of undocumented features, deviate
from the specifications every now and then, and (while I'm yet
to see one myself) are reported to contain actual bugs...

Minus some obvious misconceptions (eg, that the 'off the shelf' food is
designed by 'food engineers' to be 'soundly nutritional', ie, contain
everything fashion currently demands that it should and not contain
anything fashion demands that it currently mustn't, while less
sophisticated people like me get by somehow with vegetables, meat,
spices and tools to prepare these in some completely 'unscientific'
way), I have no idea what this was supposed to mean.
 

Ivan Shmakov

Rainer Weikusat said:
[...]
[...]
... Once, I will find the patience to wait for the food engineers
out there to design a sound nutritional solution.
Meanwhile, I'm forced to rely on the off-the-shelf products, which
are known to be full of undocumented features, deviate from the
specifications every now and then, and (while I'm yet to see one
myself) are reported to contain actual bugs...
Minus some obvious misconceptions (eg, the 'off the shelf' food is
designed by 'food engineers' to be 'soundly nutritional', ie, contain
everything fashion currently demands that it should and not contain
anything fashion demands that it currently mustn't,

The nutritional requirements of an average healthy adult are
more or less well-known (check, e. g., [1]), and do not depend
much on "fashion," whatever one's misconceptions may be.

[1] http://www.iom.edu/Global/News Anno... Files/Nutrition/DRIs/DRI_Summary_Listing.pdf
while less sophisticated people like me get by somehow with
vegetables, meat, spices

(... Except that all of the above were "engineered," one way or
the other.)
and tools to prepare these in some completely 'unscientific' way),

The "food engineers" of today have learned that they have to
make food "tasty", not "healthy," in order to succeed. Which
more or less corresponds to what I may otherwise call an
"unscientific" way.

(Not that there's much difference to "software engineers" in
this respect.)
I have no idea what this was supposed to mean.

My point is simple: if the deadline is today, one has to forget
about "science" (be it Wirth's, Borlaug's, or someone's else),
and use whatever "ingredients" available to solve the task at
hand. Be it a program, or a dinner.

And using nested scopes is as beneficial to writing software,
as washing one's hands is to preparing food.
 

Rainer Weikusat

The text below is only remotely concerned with software engineering
and some parts of it may be seriously offensive to some people or
groups of people.

Ivan Shmakov said:
Rainer Weikusat <[email protected]> writes:
[...]
To quote my boss: Make it work now quickly and clean it up later
[...]
... Once, I will find the patience to wait for the food engineers
out there to design a sound nutritional solution.
Meanwhile, I'm forced to rely on the off-the-shelf products, which
are known to be full of undocumented features, deviate from the
specifications every now and then, and (while I'm yet to see one
myself) are reported to contain actual bugs...
Minus some obvious misconceptions (eg, the 'off the shelf' food is
designed by 'food engineers' to be 'soundly nutritional', ie, contain
everything fashion currently demands that it should and not contain
anything fashion demands that it currently mustn't,

The nutritional requirements of an average healthy adult are
more or less well-known (check, e. g., [1]), and do not depend
much on "fashion," whatever one's misconceptions may be.

[1] http://www.iom.edu/Global/News Anno... Files/Nutrition/DRIs/DRI_Summary_Listing.pdf

I'm sorry if I didn't pay proper respect to your preferred
(mis-)conception, but there are simply too many of them, even when just
counting 'current' ones which include books of impressive-looking
tables. Prior to Mad Cow Disease, the nutritional requirements of cows
were already well-known. Do we really have a surge of ideologically
blinded suicide bombers nowadays? Or maybe Mad Muslim Disease caused by
an unfortunate diet combining with unfortunate circumstances?
(... Except that all of the above were "engineered," one way or
the other.)

Indeed. I remember an old joke which went roughly like this: Imagine
there's a mathematician on a strange planet wholly covered with grass
and the only other living being is a single sheep. The mathematician
is to catch this sheep; how does he proceed? Answer: He builds a
fence around himself and defines the place occupied by him as
'outside'.

With the help of a suitable set of definitions, any term can
be interpreted to mean anything, at the expense of rendering
meaningful communication impossible (which may be desired).
The "food engineers" of today have learned that they have to
make food "tasty", not "healthy," in order to succeed. Which
more or less corresponds to what I may otherwise call an
"unscientific" way.

The purpose of 'the sense of taste' is to enable distinction between
'healthy' and 'unhealthy' things one could possibly eat. It works
better for horses because these tend to approach the matter
empirically and with an unprejudiced mind, something humans, especially
humans wielding statistics, rarely do.
My point is simple: if the deadline is today, one has to forget
about "science" (be it Wirth's, Borlaug's, or someone's else),
and use whatever "ingredients" available to solve the task at
hand. Be it a program, or a dinner.

That's just a convenient justification the proverbial old poodle uses in
order to defend against the supposition of having to learn new tricks:
Whatever the benefits might be, I've got no time for this ATM, I'm too
busy performing the old ones, constantly working around their
deficiencies, and won't ever have any time for that, either.
 

Rainer Weikusat

[...]
My point is simple: if the deadline is today, one has to forget
about "science" (be it Wirth's, Borlaug's, or someone's else),
and use whatever "ingredients" available to solve the task at
hand. Be it a program, or a dinner.

I also wrote a longer reply to this which I shouldn't have posted
(which will continue to be available via servers not processing
cancels) but this is the more interesting part: According to my
experience, writing 'bad' code (for a suitable definition of 'bad')
doesn't take less time than writing 'good' code (for a suitable
definition of 'good') to begin with and 'corners cut in order to be
able to present something superficially suitable to $superior now'
will come back to haunt you, ie, when taking the time necessary for
repairs/ maintenance and the time wasted trying to puzzle out the
meaning of 'the usual thicket' into account, 'bad code' needs a lot
more time overall.
 

Ivan Shmakov

The text below is only remotely concerned with software engineering

It's still concerned with food, though, which I'm having a kind
of a lifelong interest in.

[...]
Minus some obvious misconceptions (eg, the 'off the shelf' food is
designed by 'food engineers' to be 'soundly nutritional', ie,
contain everything fashion currently demands that it should and
not contain anything fashion demands that it currently mustn't,
The nutritional requirements of an average healthy adult are more or
less well-known (check, e. g., [1]), and do not depend much on
"fashion," whatever one's misconceptions may be.
[1] http://www.iom.edu/Global/News Anno... Files/Nutrition/DRIs/DRI_Summary_Listing.pdf
I'm sorry if I didn't pay proper respect to your preferred
(mis-)conception, but there are simply too many of them, even when
just counting 'current' ones which include books of
impressive-looking tables.

While I understand the importance of doubt, I'd like to point out
that in this case, these "impressive-looking tables" are based
on the very same kind of scientific evidence as that a surgeon
or a rocket engineer rely upon.

Sadly, the conventional wisdom puts more trust in glossies than
it does in the Institute of Medicine publications.
Prior to Mad Cow Disease, the nutritional requirements of cows were
already well-known.

Isn't that stretching it a bit? It sounds as if prions were once
thought of as a constituent of a healthy cow diet, and then, -- all
of a sudden, -- were found not to be. (While in reality, the
"health benefits" of prions are as dependent on "fashion," as
are those of most cyanides, dioxins, or strychnine.)
Do we really have a surge of ideologically blinded suicide bombers
nowadays?

Somehow, it was my understanding that the family of a suicide
bomber will at times receive support from those "authorizing"
the bombing. Therefore, it indeed may have more to do with the
"diet" than with the "ideology."

[...]
Indeed. I remember an old joke which went roughly like this:
[...]

With the help of a suitable set of definitions, any term can be
interpreted to mean anything, at the expense of rendering meaningful
communication impossible (which may be desired).

Yet another joke I recall says that one doesn't ask questions to
a programmer, for the answer will be true, precise, and useless
in practice.

But the fact is: the varieties grown on the fields of today are
"better" (at least when it comes to the yield; and the
difference may easily be of an order of magnitude) than those
cultivated a century ago; and it's likely that those that will
be grown a century from now will be "better" still.

So, the choice is: to wait, or to use what's available right
now?
The purpose of 'the sense of taste' is to enable distinction between
'healthy' and 'unhealthy' things one could possibly eat. It works
better for horses because these tend to approach the matter
empirically and with an unprejudiced mind, something humans,
especially humans wielding statistics, rarely do.

The evolution (which gave the horse its "sense of taste" -- as
well as all the other senses) is as wise as it's blind. And, if
it's so easy to fool the eye with an optical illusion, shouldn't
it be at least remotely possible to fool one's sense of taste?

Why, it was my understanding that it's what at least some of the
pesticides do: use a toxin which is "tasty" to its victim.
(Something that, they argue, is the ultimate goal of the
"fast food" industry of today.)
That's just a convenient justification the proverbial old poodle uses
in order to defend against the supposition of having to learn new
tricks: Whatever the benefits might be, I've got no time for this
ATM, I'm too busy performing the old ones, constantly working around
their deficiencies, and won't ever have any time for that, either.

And it's quite natural, and happens in just every field of human
activity. To paraphrase, it takes a touch of genius to do
otherwise.

I'd like to also respond to the other argument here.
According to my experience, writing 'bad' code (for a suitable
definition of 'bad') doesn't take less time than writing 'good' code
(for a suitable definition of 'good')

... It sounds as if the time required to learn to write "good"
code was somehow assumed to be zero or negligible, while I
sincerely doubt that it really is.
 
 
R

Rui Maciel

Ivan said:
[...]

... Once, I will find the patience to wait for the food
engineers out there to design a sound nutritional solution.

Meanwhile, I'm forced to rely on the off-the-shelf products,
which are known to be full of undocumented features, deviate
from the specifications every now and then, and (while I'm yet
to see one myself) are reported to contain actual bugs...

[Cross-posting to and Just in case.]

All engineering relies on off-the-shelf products. The shelf, though, does
depend on the circumstances and requirements. Nevertheless, it's off-the-
shelf all the way down to the turtles, if you will.[1]


Rui Maciel

[1] http://en.wikipedia.org/wiki/Turtles_all_the_way_down
 

Xho Jingleheimerschmidt

On Sun, 14 Jul 2013 17:34:50 +0200 in comp.lang.perl.misc, "Dr.Ruud" wrote:


Uhm, no. When evaluating whether code is broken or not, you don't
assume perfect flawless input. You assume worst case malicious NSA
type input.

Fascinating. So no matter how much validating I've done of the data, I
have to assume that the data has not been validated. Do I get an
exemption from this if the data was consumed and validated with no
semicolons intervening? How about no newlines intervening? None of either?

Xho
 

Rainer Weikusat

David Harmon said:
On Sun, 14 Jul 2013 17:34:50 +0200 in comp.lang.perl.misc, "Dr.Ruud" wrote:


Uhm, no. When evaluating whether code is broken or not, you don't
assume perfect flawless input. You assume worst case malicious NSA
type input.

The code is 'broken' when it doesn't process the data it is supposed to
process in a way that yields the intended result. Eg, this

sub sum
{
    return $_[0] + $_[1];
}

is a function which will return the sum of its first two arguments
provided that both are numbers. It can be made to do something very much
different,

---------
package Ha;

use overload "+" => \&negate;

sub new
{
    return bless([], $_[0]);
}

sub negate
{
    return ~$_[1];
}

package main;

sub sum
{
    return $_[0] + $_[1];
}

print(sum(3, 4), "\n");
print(sum(Ha->new(), 4), "\n");
---------

but this doesn't mean 'sum is broken': The precondition 'first two
arguments are numbers' is not true for the second call, hence, the
postcondition won't necessarily be true afterwards.

Whether or not 'will be a number > 1' is a sensible precondition in a
given situation would be a different question.
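If the precondition were meant to be enforced rather than merely assumed, a checked variant (an editorial sketch, not from the thread; the name sum_checked is invented) could look like this:

```perl
use strict;
use warnings;
use Scalar::Util qw(looks_like_number);

# Hypothetical variant of sum() that turns the precondition
# 'both arguments are numbers' into an explicit runtime check.
sub sum_checked
{
    my ($x, $y) = @_;
    die "sum_checked: both arguments must be numbers\n"
        unless looks_like_number($x) && looks_like_number($y);
    return $x + $y;
}

print sum_checked(3, 4), "\n";   # 7
```

With the check in place, the overloaded-object trick above would die with a clear message instead of silently producing something very much different.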
 
