Long(er) Factorial

Kelsey Bjarnason

[snips]

Ok, so my screen is a scratchpad, and I shouldn't depend on any
information on it sticking around for long. But it's *my* scratchpad,
and if a program clears it for no good reason, I'm not going to be
pleased.

I might not be pleased, either... but I'm not going to be overly annoyed,
for the simple reason that it's a screen, not a permanent or even
semi-permanent record. If it gets zapped, well, it was a scratch in the
first place; what the heck did I expect, especially when running an app
with unknown outputs smack bang in the middle of the scratch area?

If I did that and it, say, dumped crap into my kernel files, I'd be
_really_ annoyed; those are _not_ scratch data.

If your program needs to take control of the entire screen (say, it's a
text editor), on many systems it can use a system-specific library that
allows it to save and restore the current screen contents.

Indeed, in the ideal world, if you need to "clear the screen", do it in a
"window", even where that window is, itself, the full screen - then dump
the original contents back.
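
For illustration, here's a minimal sketch of that idea using ncurses --
one such system-specific library. None of this is standard C, and the
save/restore behaviour assumes a terminal with alternate-screen support:

#include <curses.h>

int main(void)
{
    char buf[64];

    /* On capable terminals, initscr() switches to an alternate
       screen buffer, leaving the user's scratchpad untouched. */
    initscr();
    clear();                      /* clears *our* screen, not theirs */
    mvprintw(0, 0, "Please enter a number: ");
    refresh();

    echo();                       /* show the user's typing */
    getnstr(buf, sizeof buf - 1); /* bounded line read */

    endwin();                     /* hands the original screen back */
    return 0;
}

On terminals without alternate-screen support the old contents are not
restored, so even this is best-effort rather than guaranteed.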

I just find the notion of getting upset that an app clears a scratch area
about as silly as complaining that it creates temp files in the temp file
dir. It's a transitory medium, and a lot of apps do unpredictable things
with it - not just clearing screens; could be simply dumping volumes of
data out fast enough to flush the scrollback buffer before you can kill
it. Didn't clear the screen, but the data's still gone. Relying on the
screen as a storage medium, then complaining when the storage gets nuked,
is just damnfoolishness.

What's on my screen is probably the output of the last program (or
programs) I ran. Your own program's output isn't so important that you
need to erase *my* (possibly unimportant) data.

My program may or may not be that important - that's for you to decide.
And you did decide it was that important, by running it in the middle of
your storage area. In fact, in doing so, you demonstrated that your
on-screen data has absolutely _zero_ value. Why? Simple: the app you're
complaining about is clearing the screen. If you _knew_ it was going to
do this, you'd have run it in another console, or saved your existing data
somewhere, etc.

Instead, the clear screen took you by surprise. This means you have no
idea what the output of the application is, whether it clears the screen,
locks up the computer, floods the scrollback buffers, etc. Doesn't matter
the actual details, the fact that it's taking you by surprise means you
simply _do not know_. If you don't know what the output is, but you're
running it in the middle of your data store, then you, yourself, are, by
that very action, defining the value of your data in the store. The value
is zero. If it were anything else, you wouldn't be running unknown
applications in the middle of it.

Unless you can think of a good reason why any program should need to
clear the screen before printing "Please enter a number"?

In the case of some silly-ass little app that does nothing more than
ask for a number, no, there's not much point. That doesn't mean that
there aren't apps where the author feels there _is_ a point in clearing
the screen. Or dumping enormous quantities of data. Or whatever.

I understand your key concept, that apps should be "well behaved", and I,
at least, generally try to achieve that end myself. On the other hand,
having been bitten once or twice by exactly this sort of thing - buffer
floods, screen clears, etc - I simply ask myself if what I've already got
matters. If it doesn't, run the app. If it does, and I run the app and
lose the data, I've got nobody to blame but myself.
 
Richard Bos

Kelsey Bjarnason said:
He is. That's where his important data is; where he stored it.

No, he didn't. He generated it for immediate use. If you want to call
that "storing", you may; by the same token, I may call you a hideous
baboon, but that doesn't mean you are one.

[If he] doesn't want it messed with, he has two choices: store it somewhere else,
or don't run unpredictable things in the same place it's being stored.

And that's the very point. If Turbo-C wannabe programmers weren't so
orgasmic about clearing the screen, their programs would not predictably
clobber useful data we've just generated.

Richard
 
Walter Roberson

Kelsey Bjarnason said:
I just find the notion of getting upset that an app clears a scratch area
about as silly as complaining that it creates temp files in the temp file
dir.

[The below is inherently OT, as "temp file dir" is not a C concept.]

There are several reasons to grumble about temp files in the temp file
dir. Some of them include:

- Many programs assume that the temp file dir is /tmp, which is not the
case on all systems (not even all unix-like systems)

- Many programs do not use the TEMP environment variable to determine
where temporary files should go (a common unix-ism)

- It is not uncommon for programs to use invariant filenames for
their temporary files; if those files go into a common temporary directory
then the clash of filenames interferes with multiple copies of the
program running simultaneously

- The C standard routines tmpfile() and tmpnam() do not define where
the files will live, and there is no standard way to ask; this can
lead to difficulties with routines that try to avoid resource exhaustion

- When files are put into common temporary directories, a number of
security holes and race conditions arise that are at least much -reduced-
if the files do not go into a common directory

- On heavily-used multiuser systems, such as university systems,
it is not uncommon for the standard temporary directories to be
on filesystems of strictly limited size, to avoid having temporary
files use too much space. On such systems, temporary files should
go into one of the user's directories (and thus be subject to the user's
quotas) or some other authorized directory [e.g., staff might have
access to a temporary directory that students do not.] Thus users
need to be able to control where temporary files go instead of
having them automatically go into "the temp file dir".

- On heavily-used multiuser systems, it might be impractical to
have "the temp file dir" be on a filesystem large enough to accomedate
the sum of all the reasonable requests, whereas even having different
temporary directories for different sets of users might make the
resource allocation practical.


None of these reasons has to do with the transient nature of temporary
files, but they are all reasons against programs putting files
in "the temp file dir" ["the" implies there is only one such directory]
without attempting to discern where the user would like the files
to be placed.
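
To illustrate the tmpfile()/tmpnam() point above: standard C will hand
you a temporary file, but gives you no say in -- and no way to ask --
where it lives. A minimal sketch; the TMPDIR lookup is a common Unix
convention used here as an assumption, not something the standard
defines:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *fp;
    const char *dir;

    /* Portable, but the file's location is implementation-defined. */
    fp = tmpfile();
    if (fp == NULL) {
        perror("tmpfile");
        return EXIT_FAILURE;
    }
    fputs("scratch data\n", fp);
    fclose(fp);              /* the file is removed automatically */

    /* Non-portable: letting the user steer temporary files. */
    dir = getenv("TMPDIR");
    if (dir == NULL)
        dir = "/tmp";        /* a fallback assumption, not a guarantee */
    printf("named temp files would go under: %s\n", dir);

    return 0;
}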
 
Keith Thompson

Kelsey Bjarnason said:
[snips]

Ok, so my screen is a scratchpad, and I shouldn't depend on any
information on it sticking around for long. But it's *my* scratchpad,
and if a program clears it for no good reason, I'm not going to be
pleased.
[...]
What's on my screen is probably the output of the last program (or
programs) I ran. Your own program's output isn't so important that you
need to erase *my* (possibly unimportant) data.

My program may or may not be that important - that's for you to decide.
And you did decide it was that important, by running it in the middle of
your storage area. In fact, in doing so, you demonstrated that your
on-screen data has absolutely _zero_ value. Why? Simple: the app you're
complaining about is clearing the screen. If you _knew_ it was going to
do this, you'd have run it in another console, or saved your existing data
somewhere, etc.

How do you conclude that my on-screen data has zero value? Yes, if
it's at all important I should have stored it somewhere, or put it in
a separate window, or otherwise been more careful to avoid clearing
it. But if all my on-screen data had zero value, I'd run my shell in
a one-line terminal window (and I'd have room for a lot more windows
on my screen).

Instead, the clear screen took you by surprise. This means you have no
idea what the output of the application is, whether it clears the screen,
locks up the computer, floods the scrollback buffers, etc. Doesn't matter
the actual details, the fact that it's taking you by surprise means you
simply _do not know_. If you don't know what the output is, but you're
running it in the middle of your data store, then you, yourself, are, by
that very action, defining the value of your data in the store. The value
is zero. If it were anything else, you wouldn't be running unknown
applications in the middle of it.

Nonsense. By running an unknown program, I'm accepting, for the sake
of convenience, the risk that it might clear my screen. If the author
of the program was stupid enough to clear the screen for no good
reason, I'm going to be annoyed -- and I probably either won't run
that program again, or I'll modify it so it doesn't clear my screen.

Sure, it's my fault for trusting it. That doesn't excuse the author.

I understand your key concept, that apps should be "well behaved", and I,
at least, generally try to achieve that end myself.

Which is really all I'm trying to say.

On the other hand,
having been bitten once or twice by exactly this sort of thing - buffer
floods, screen clears, etc - I simply ask myself if what I've already got
matters. If it doesn't, run the app. If it does, and I run the app and
lose the data, I've got nobody to blame but myself.

I've been bitten by such things too. Since the information on my
current screen is generally not critical, it's an acceptable risk.
Since that information has *some* value, it's still annoying when it
happens.
 
SM Ryan

# How do you conclude that my on-screen data has zero value? Yes, if
# it's at all important I should have stored it somewhere, or put it in

My program can conclude your disk drive has zero value and
erase it--if the program so documents its function.
 
Keith Thompson

How do you conclude that my on-screen data has zero value? Yes, if [...]
My program can conclude your disk drive has zero value and
erase it--if the program so documents its function.

(Attribution deliberately snipped.)

Certainly. If a program is intended to erase a disk drive, that's
exactly what it should do. If a program is intended to clear the
screen, such as the "clear" command in Unix or "cls" in Windows, then
it should do so.

My objection is to programs that clear the screen with no good reason.
I'm getting tired of people pretending that I've said anything more
than that.
 
Kelsey Bjarnason

[snips]

No, he didn't. He generated it for immediate use.

Then if it goes away, it doesn't matter, so the whole discussion is moot.
It is precisely because he doesn't *want* it to go away - he's storing it
- that the issue arises.

And that's the very point. If Turbo-C wannabe programmers weren't so
orgasmic about clearing the screen, their programs would not predictably
clobber useful data we've just generated.

No, that's not the point. The point is that the user who runs
unpredictable programs in the middle of his important data has nobody but
himself to blame when the data goes away. Or are you the sort who
regularly runs random number generators over all the company financial
files and then complains when the results aren't valid data? No? Why
not? Oh, right, because running unpredictable programs in the middle of
your important data is stupid. Right, good. We're in perfect agreement.
 
Kelsey Bjarnason

[snips]

How do you conclude that my on-screen data has zero value?

As I explained: you defined that. By running unknown, unpredictable
programs in the middle of it. Since those programs may well nuke the
data, the fact you choose to run them there means that you, yourself, have
set the value of the data, and the value is zero. If it were non-zero, you
wouldn't be running unpredictable, unknown apps in the middle of it, or
you'd have saved it elsewhere.

Nonsense. By running an unknown program, I'm accepting, for the sake of
convenience, the risk that it might clear my screen.

Or flood the scrollback buffers, or lock the machine, etc, etc, etc, not
a single one of which is in the remotest degree consistent with keeping
the "important data" untouched.
If the author of
the program was stupid enough to clear the screen for no good reason,
I'm going to be annoyed

But a lockup, that's okay, flooding the buffers, that's okay, closing the
terminal, that's okay. Only clearing the screen isn't?

Rubbish. *Any* result which causes the loss of the data will have the
same, possibly even worse, annoyance factor. You know that, but guess
what? You run the program anyways. Obviously, you don't care about the
data, or you wouldn't be running the app in the middle of it.

You simply cannot have it both ways. Either the data matters, in which
case you *treat* it like it matters - not running unknown, unpredictable
apps in the middle of it - or the data doesn't matter. The very fact of
running such apps in the middle of it demonstrates that you don't care
about the data - thus the argument offered, thus far, at least against
screen clears, is complete tripe.

Why is it so many folks have this weird bug up their butt about screen
clears? You say "clear the screen", it's like pushing a magic button.
There's about 973 other ways to lose the data, all of which are equally
unacceptable, and every single one of which - including via screen clears
- can be trivially avoided, usually by *not* doing something so silly as
running unknown, unpredictable, untrusted apps in the middle of one's
important data stores.

The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important data".
If there's an argument against clearing the screen, I'd be happy to hear
it... but this isn't it. It focuses on one trivial detail to the
exclusion of the general premise - missing the forest for a single tree -
then blames the tree. It's patently absurd.
 
Keith Thompson

Kelsey Bjarnason said:
The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important data".
If there's an argument against clearing the screen, I'd be happy to hear
it... but this isn't it. It focuses on one trivial detail to the
exclusion of the general premise - missing the forest for a single tree -
then blames the tree. It's patently absurd.

Sure, there are many other ways a program can screw things up.
Unnecessarily clearing the screen is just one of them, and not even
the most important.

I mentioned it because it came up in the context of a program that
unnecessarily cleared the screen.

I'm done with this. Bye.
 
Richard Bos

Kelsey Bjarnason said:
Then if it goes away, it doesn't matter, so the whole discussion is moot.

Well, that just goes to show: never trust a programmer who isn't a user
as well.

Richard
 
CBFalconer

Kelsey said:
.... snip ...

The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important
data". If there's an argument against clearing the screen, I'd be
happy to hear it... but this isn't it. It focuses on one trivial
detail to the exclusion of the general premise - missing the forest
for a single tree - then blames the tree. It's patently absurd.

No, the lesson to be learned is "Do not unnecessarily annoy the
customer".

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson
More details at: <http://cfaj.freeshell.org/google/>
Also see <http://www.safalra.com/special/googlegroupsreply/>
 
Andrew Poelstra

Kelsey said:
[snips]

How do you conclude that my on-screen data has zero value?

As I explained: you defined that. By running unknown, unpredictable
programs in the middle of it. Since those programs may well nuke the
data, the fact you choose to run them there means that you, yourself, have
set the value of the data, and the value is zero. If it were non-zero, you
wouldn't be running unpredictable, unknown apps in the middle of it, or
you'd have saved it elsewhere.

Nonsense. By running an unknown program, I'm accepting, for the sake of
convenience, the risk that it might clear my screen.

Or flood the scrollback buffers, or lock the machine, etc, etc, etc, not
a single one of which is in the remotest degree consistent with keeping
the "important data" untouched.
If the author of
the program was stupid enough to clear the screen for no good reason,
I'm going to be annoyed

But a lockup, that's okay, flooding the buffers, that's okay, closing the
terminal, that's okay. Only clearing the screen isn't?

Rubbish. *Any* result which causes the loss of the data will have the
same, possibly even worse, annoyance factor. You know that, but guess
what? You run the program anyways. Obviously, you don't care about the
data, or you wouldn't be running the app in the middle of it.

You simply cannot have it both ways. Either the data matters, in which
case you *treat* it like it matters - not running unknown, unpredictable
apps in the middle of it - or the data doesn't matter. The very fact of
running such apps in the middle of it demonstrates that you don't care
about the data - thus the argument offered, thus far, at least against
screen clears, is complete tripe.

Why is it so many folks have this weird bug up their butt about screen
clears? You say "clear the screen", it's like pushing a magic button.
There's about 973 other ways to lose the data, all of which are equally
unacceptable, and every single one of which - including via screen clears
- can be trivially avoided, usually by *not* doing something so silly as
running unknown, unpredictable, untrusted apps in the middle of one's
important data stores.

The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important data".
If there's an argument against clearing the screen, I'd be happy to hear
it... but this isn't it. It focuses on one trivial detail to the
exclusion of the general premise - missing the forest for a single tree -
then blames the tree. It's patently absurd.

Unless you believe that all C apps should aspire to be "unpredictable",
you are shooting yourself in the foot, because obviously if a good
program should be predictable (which we know, as no one accepts UB
here), then an "unpredictable" program, or one that clears your screen,
must not be good.

Here's an analogy:
Suppose you are writing a test. Your teacher comes up to you and sets
fire to your notepaper. You look at him and ask why he did that. He
replies that by putting your looseleaf openly on your desk, you were
putting its value at 0, because you were in a room with an unpredictable
teacher holding a lighter. Anything important should have been stored in your
binders, and unless you're the sort of person who frequently dips their
binder in a tank of ink, you have no right to complain when you put your
quick notes at his disposal like so.
 
Walter Roberson

Kelsey Bjarnason said:
As I explained: you defined that. By running unknown, unpredictable
programs in the middle of it. Since those programs may well nuke the
data, the fact you choose to run them there means that you, yourself, have
set the value of the data, and the value is zero. If it were non-zero, you
wouldn't be running unpredictable, unknown apps in the middle of it, or
you'd have saved it elsewhere.

No. Your logic is insufficient.

Each time a program is run, even a well-known program, there is a risk
of malbehaviour, even if only due to hardware errors. Indeed, with
traditional electronic displays, there is a non-zero risk of loss of
information, such as if the power fails, or if the device blows a fuse,
or if a capacitor burns out. One could arrange to have the outputs
logged to a file (e.g., the unix "script" program), but the disk might
fill up, the filesystem might get corrupted, the drive assembly might
Halt And Catch Fire. The risk can never be totally removed. By your
logic, since these risks of loss are non-zero, by doing any computation
at all, the person has set the value of the computation to be zero,
which is [to me anyhow] clearly not realistic.

Instead, the person is not setting the information value to be zero:
the person is multiplying the probability of significant unfriendly
program behaviour (or other failure) times the cost of reproducing the
information, and deciding that the value of the information is lower
than that risk-weighted cost. (More correctly, the person is
integrating rather than multiplying, as there are multiple potential
risks that have different associated costs.) The modern -probability-
that a program will clear the display upon starting is not high --
except perhaps when dealing with programs written by novices.
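
A toy sketch of that risk-weighted decision; the probabilities and
recovery costs below are made-up illustrative numbers, not
measurements:

#include <stdio.h>

int main(void)
{
    /* probability of each failure mode, and the cost (in seconds of
       work) of reproducing the on-screen information if it happens */
    double p[] = { 0.02, 0.001, 0.0001 };  /* clear, flood, lockup */
    double c[] = { 5.0,  30.0,  600.0  };
    double expected_loss = 0.0;
    int i;

    for (i = 0; i < 3; i++)
        expected_loss += p[i] * c[i];      /* sum of p_i * c_i */

    /* Running the program in-place is rational whenever the value of
       the display contents is below this figure -- a figure that can
       be small without being zero. */
    printf("expected loss: %.3f seconds\n", expected_loss);
    return 0;
}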

As the "if you valued your data you wouldn't have run the program"
argument is not correct, the only excuse for gratuitously clearing
the user's screen is for the perversity of reminding users to
take more care in running unknown programs -- a reminder that
unknown programs could be even more deliberately malicious.
 
Michael Wojcik

Kelsey Bjarnason said:
Then if it goes away, it doesn't matter, so the whole discussion is moot.

What nonsense. "Importance" isn't a binary attribute of data. Some
data are more important than others at any given moment; importance
changes over time; it depends on context, cost of reproduction, and a
host of other factors; and data may have subjective value that is
independent of its objective importance (that is, its productivity in
current and future processes versus the cost of recreating it or
doing without it).

Richard's ls example is a fine one. Yes, I can run ls again, as many
times as I like; that doesn't excuse a program's removing that
information from my screen and making me run ls again, if the program
has no reason to clear the screen.

As far as I can see, your argument hinges on a false dichotomy
between "important" information which must be carefully preserved and
"unimportant" information which can be discarded with zero cost.
That's a ridiculous model.

No, that's not the point. The point is that the user who runs
unpredictable programs in the middle of his important data has nobody but
himself to blame when the data goes away.

No, that's not the point either. The argument does not depend on
whether the user knows or does not know that the program will clear
the screen. I can know that a program will clear the screen
unnecessarily and still deplore that it does so.
 
Typhonike

I suggest you change your second if into an else if { ... }, then write
an if statement for the return (n * fact(n - 1)) branch; and the last
thing I want to say is: change your function's return type to double.
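
Since the original code isn't shown in this thread, here is a sketch of
what that advice seems to describe, assuming the usual recursive
factorial; double is used so that large results lose precision instead
of overflowing an integer type:

#include <stdio.h>

double fact(double n)
{
    if (n < 0)
        return 0;               /* crude signal for a domain error */
    else if (n <= 1)
        return 1;
    else
        return n * fact(n - 1);
}

int main(void)
{
    printf("20! = %.0f\n", fact(20));  /* still exact in a double */
    printf("30! = %g\n", fact(30));    /* approximate from here on */
    return 0;
}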
 
