opening a file

G

George

You have never, ever mistyped a file name, started a script in the wrong
directory or as the wrong user? You, sir, have my utmost admiration -
or you would have if I believed you, which I don't.


Warnings don't catch mistyped filenames. Checking the return value of
open does.

hp

PS: I recall that not so long ago you started a thread with the subject
"crisis perl".

A mistyped filename is precisely the mistake that started this thread:
I had ehp instead of eph, and I was pulling my hair out.

I consider myself well-advised (by you and others) to use:

open(my $fh, '<', $filename) or die "cannot open $filename: $!";
--
George

Bring them on.
George W. Bush

Picture of the Day http://apod.nasa.gov/apod/
 
G

George

I misspoke when I said there are really no differences; there actually
are, depending on the scenario. I also should have added more detail
about indirect filehandles above. However, rather than repeat a few of
the advantages of the three-argument open and indirect filehandles
here, please refer to
perldoc -f open (on the web at
http://perldoc.perl.org/functions/open.html)

You can also read up quickly at
http://perldoc.perl.org/perlopentut.html. Specifically, refer to
"Simple Opens" and the 3-argument version described there. Directly
below that is "Indirect Filehandles", which you should find of
particular interest. These outline the differences, again depending on
the scenario, and provide all of the gory details.
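
To make that concrete, here is a minimal sketch of the two styles those
pages describe; the log filename is made up for illustration and this
is only a sketch, not the perldoc text itself:

#!/usr/bin/perl
use strict;
use warnings;

my $logfile = 'example.log';    # hypothetical name, for illustration only

# Old style: two-argument open with a package (bareword) filehandle.
# The mode is parsed out of the filename string, so a $logfile that
# happens to start with '>' or end with '|' changes what open() does.
open LOG, ">$logfile" or die "cannot open $logfile: $!";
print LOG "two-arg, bareword handle\n";
close LOG;

# Three-argument open with an indirect (lexical) filehandle.
# The mode is separate from the name, and the handle is an ordinary
# scalar that is closed automatically when it goes out of scope.
open my $log, '>>', $logfile or die "cannot open $logfile: $!";
print {$log} "three-arg, lexical handle\n";
close $log or die "cannot close $logfile: $!";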

Thanks, Tim. My Fortran buddies were asking about this, and that link
hits the spot.

As OP, I'm done with this for now and am moving on to bigger and
better mistakes. :-0
--
George

Any government that supports, protects or harbours terrorists is complicit
in the murder of the innocent and equally guilty of terrorist crimes.
George W. Bush

Picture of the Day http://apod.nasa.gov/apod/
 
T

Tim Greer

Eric said:
I can't speak for cartercc, but I obviously have difficulties

perl -wle '
    use Benchmark qw|countit cmpthese timethese|;
    my $lit = qq{abc\txyz};
    my $t = timethese 50_000, {
        die   => sub { open my $fh, q|>|, q|/dev/null| or die; },
        maybe => sub { open(my $fh, q|>|, q|/dev/null|) || die; },
        live  => sub { open my $fh, q|>|, q|/dev/null|; },
    };
    cmpthese $t;
'
Benchmark: timing 50000 iterations of die, live, maybe...

die:   4 wallclock secs ( 1.80 usr + 0.91 sys = 2.71 CPU) @ 18450.18/s (n=50000)
live:  3 wallclock secs ( 1.60 usr + 0.99 sys = 2.59 CPU) @ 19305.02/s (n=50000)
maybe: 3 wallclock secs ( 1.72 usr + 0.90 sys = 2.62 CPU) @ 19083.97/s (n=50000)

         Rate    die  maybe   live
die   18450/s     --    -3%    -4%
maybe 19084/s     3%     --    -1%
live  19305/s     5%     1%     --

         Rate    die  maybe   live
die   19608/s     --    -2%    -2%
maybe 20000/s     2%     --    -0%
live  20080/s     2%     0%     --

         Rate    die  maybe   live
die   19380/s     --    -1%    -2%
maybe 19531/s     1%     --    -1%
live  19763/s     2%     1%     --

And once I even got this (though I failed to recreate it):

         Rate   live    die
live  20000/s     --    -1%
die   20161/s     1%     --

*CUT*

I'd be curious to see the processing "costs" of the code that follows
an unchecked open that has failed, compared to skipping that code (or
dying, or gracefully moving on, whatever one wanted to do). Without a
check, the failure only shows up later, for example when reading from
or writing to a filehandle that was never opened. I'm thinking the
costs would be much higher then, depending on what the code does, and
that delayed error is the only other way you'd (maybe) see there was an
issue with the file not being opened.
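
A hedged sketch of that delayed failure, using a path that presumably
does not exist (the last line dies on purpose, to show where the
checked version reports the problem):

#!/usr/bin/perl
use strict;
use warnings;

# A path that presumably does not exist, so the open fails.
my $file = '/no/such/dir/data.txt';

# Unchecked: the failure is silent at the open itself...
open my $in, '<', $file;

# ...and only surfaces here, as warnings about reading from a
# filehandle that was never successfully opened, plus a loop that
# simply never runs -- the error shows up far from its cause, or not
# at all.
while (my $line = <$in>) {
    print $line;
}

# Checked: the failure is reported immediately, at its source.
open my $checked, '<', $file or die "cannot open $file: $!";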
 
P

Peter Scott

For another example, consider warnings. One of my tasks is to process a
file which may consist of several hundred thousand rows, totaling
different kinds of values. When I run the script with warnings, I get an
uninitialized value warning for every row printed to the screen, and the
script takes a significant amount of time to run. When I run the script
without warnings, the script runs quickly. To silence the warnings I can
either (1) initialize a hash value for each row, or (2) run without
warnings. I choose (2).
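
A minimal sketch of what option (1) can look like, assuming a made-up
two-column layout (the thread doesn't show the real file format):

#!/usr/bin/perl
use strict;
use warnings;

my %count;

while (my $line = <DATA>) {
    chomp $line;
    my ($id, $kind) = split ' ', $line;

    # Option (1): give a possibly-missing column a defined default up
    # front, so "use warnings" can stay on without a warning being
    # printed for every such row.  (//= needs perl 5.10; otherwise use
    # "$kind = 'unknown' unless defined $kind;".)
    $kind //= 'unknown';

    $count{$kind}++;
    print "$id\t$kind\n";
}

print "$_: $count{$_}\n" for sort keys %count;

__DATA__
1001 history
1002
1003 math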

Again, this isn't a big deal, and I TOTALLY AGREE that the return value
for open() should generally be checked, and that warnings should
generally be enabled. However, for the reasons I stated, I often don't
do this, and I am perfectly willing to accept the consequences.

You're an accident waiting to happen. When it does, I only hope that you
realize that it was a consequence of your poor discipline and not the
computer victimizing you.

If you don't take care with writing programs that are just for you, why
should anyone trust that you're going to get them right when you're under
pressure to deliver for someone else?
 
C

cartercc

You have never, ever mistyped a file name,

Yes, somewhat frequently.
started a script in the wrong directory

Never. My invariable habit is to copy all the data files to a task
directory and then write the script in that directory.
or as the wrong user?

Never. There is only one user on my machine: me.
Warnings don't catch mistyped filenames. Checking the return value of
open does.

True. A couple of days ago, I messed up a filehandle. The file opened
all right, there wasn't an error to catch, but nothing got written to
the file. In this case, checking the return value of open didn't catch
the error, and neither did warnings.
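
For what it's worth, here is a hedged sketch of the nets that do catch
that kind of mix-up; the mistyped $reprot and the report.txt filename
are made up for illustration:

#!/usr/bin/perl
use strict;
use warnings;

# A mistyped bareword handle is just another package filehandle, so
# depending on where the typo lands, the open and the print can each
# "work" while the data never reaches the intended file.  With lexical
# handles under strict, a mistyped variable is a compile-time error:
#
#   open my $report, '>', 'report.txt' or die "open: $!";
#   print $reprot "oops\n";   # Global symbol "$reprot" requires
#                             # explicit package name ...
#
# Checking print() and close() is another cheap net: it catches write
# failures (full disk, closed handle) that the open itself cannot see.
open my $report, '>', 'report.txt' or die "cannot open report.txt: $!";
print {$report} "row 1\n"          or die "write to report.txt failed: $!";
close $report                      or die "cannot close report.txt: $!";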
PS: I recall that not so long ago you started a thread with the subject
"crisis perl".

I did, and I have a story to tell.

My manager (a business type, not an IT type) had a project sitting on
his desk for two weeks. I knew something about it because of overheard
conversations, but I didn't know what it was because no one discussed
it with me. On Thursday, he came into my office to discuss it with me
and told me what he needed, except for the output requirements. I told
him I needed to know how the output was to be structured. Thursday
afternoon, he sent me an Excel file with no further information.

Friday morning, I told him that I thought I could finish the data
extraction part in about ten hours, as I had done something similar,
estimating the time that each piece would take. Friday afternoon he
sent me an email stating that the deadline was THAT FRIDAY (!!!) and
that the output was to be placed in Excel. The Excel file was several
hundred rows deep and about fifty columns wide. He said he wanted it
Monday morning. When I called him to tell him that I couldn't do it in
that time, I learned that he had gone home early that Friday and
couldn't be reached.

I worked until almost midnight that Friday. Saturday I worked from
8:30 a.m. to about 10:30 p.m. Sunday I worked from 3:30 p.m., and
finished around 2:00 a.m. Monday morning. Data extraction took about
ten hours, but creating the Excel file added another 14 hours, for a
total of 24 hours over the weekend.

I didn't mind working. I actually enjoyed doing the project. Also, I
got the credit for finishing the job, which was nice. However, the
code was a piece of crap, undocumented, brittle and not very stable.
When I indicated that I wanted to take a couple of days to clean it
up, I was told not to, that I should not spend any more time on it.

This particular project is an annual job. Guess what's going to happen
next year? And ... if I get tasked with the job again, I'll probably
have to rewrite most of it from scratch.

Question: Are these skills, giving professionals arbitrary and
capricious deadlines totally unrelated to the requirements of the job
at hand and forbidding them from documenting and cleaning up code,
something that's taught in management school, or is it something that
develops on the job?

CC
 
C

cartercc

I'd imagine it'd be easy enough to figure out in a lot of cases,
especially if you use unique filehandle names and such, but if you
didn't and the script grew, it could open the potential for problems
that weren't immediately obvious, though I still imagine it wouldn't
take very long to find the issue. I suppose the issue varies with the
risk: how you're using the data, how important that data is, and how
important the results are. Probably none of those things technically
increase by not checking the return value, though they could in some
situations (but again, that would be due to poor logic in the script
anyway, and if we were all perfect, we'd not need failures to be
reported, let alone return values). I'm certain people can (and have)
created scripts without good checking that work fine and may work fine
indefinitely, but in my opinion it's just easier to intentionally put
in checks and fail-safes as you go along.

It strikes me that process may have a lot to do with this as well. I
write code incrementally, starting with this:

open INFILE, "<input.txt";
open OUTFILE, ">output.txt";
while (<INFILE>) { print OUTFILE; }
close OUTFILE;
close INFILE;

I then look at the output file. If it's identical to the input file, I
start developing the processing logic. If the output file exists but
is blank, I look for the error. In any case, once this works, I never
have to think about opening or closing the files again.

Obviously, if one were writing a script to be used on a general basis
by others, one would practice defensive programming. In that case, I
would consider the failure to check the call to open as an error.
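
For comparison, a defensive version of that same copy skeleton might
look something like this; it is only a sketch, using the filenames from
the snippet above:

#!/usr/bin/perl
use strict;
use warnings;

my $in_name  = 'input.txt';
my $out_name = 'output.txt';

open my $in,  '<', $in_name  or die "cannot open $in_name: $!";
open my $out, '>', $out_name or die "cannot open $out_name: $!";

while (my $line = <$in>) {
    print {$out} $line or die "write to $out_name failed: $!";
}

close $out or die "cannot close $out_name: $!";
close $in;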

CC
 
C

cartercc

You're an accident waiting to happen.  When it does, I only hope that you
realize that it was a consequence of your poor discipline and not the
computer victimizing you.

I think you may have missed the context of my comments. When I write a
one-time throw-away script FOR ME to transform some data from one form
to another, I often don't check the return value of open. THIS IS /
NOT/ THE SAME AS PRODUCING CODE FOR OTHERS TO USE! If the script
doesn't work, then I've made an error, so I find the error and fix it.
All I'm saying is that I'm comfortable working this way and find the
consequences acceptable, in part because open is very reliable and
almost never fails.
If you don't take care with writing programs that are just for you, why
should anyone trust that you're going to get them right when you're under
pressure to deliver for someone else?

(1) Because I don't deliver scripts to others that don't include a lot
of defensive programming. This isn't a philosophical stance but an
experiential one -- I've spent too much time fixing scripts that break
in the field to deliver something whose problems could have been
prevented beforehand. (2) In this particular situation, the failure of
a call to open, the problem isn't the script but the user's failure to
move the input files into the correct directory. And yes, I've been
guilty of the following:

open IN, "<$in"
  or die "You idiot, you know you need to move $in into this directory before attempting to run this script!";

CC
 
C

Charlton Wilbur

CC> Question: Are these skills, giving professionals arbitrary and
CC> capricious deadlines totally unrelated to the requirements of
CC> the job at hand and forbidding them from documenting and
CC> cleaning up code, some that's taught in management school, or is
CC> it something that develops on the job?

It's something that happens naturally when "professionals" don't stand
up for themselves. Your boss continues to do it because you allow him
to do it, and it will stop when you stand up to him.

Charlton
 
C

cartercc

It's something that happens naturally when "professionals" don't stand
up for themselves.  Your boss continues to do it because you allow him
to do it, and it will stop when you stand up to him.

You are right, but there's another side. A 'professional' who 'stands
up' in situations like this isn't a professional.

A true professional focuses on the job at hand, not letting personal
circumstances or personalities get in the way. A professional gets the
job done regardless of the behavior of others. A professional makes
completion of the task at hand his number one priority. A professional
doesn't complain about the actions of others that make his job more
difficult.

I don't disagree with you. However, my point would be that the IT
professional is a real professional with regard to his approach to the
job. Unfortunately, business and managerial types don't seem to be
professionals. From my POV, it's not professional to fool around until
the last moment with a big project and then pass it off to someone
else, maybe with the hope that the blame for failure to meet the
deadline can be passed off as well.

Also, I would say that an IT worker that watches the clock and insists
on his rights is not a professional but merely a wage earner, and
deserves to be treated like a wage earner.

I would much rather have the reputation of someone that can be
depended on to do a job and do it right, than of someone who can't be
pushed around.

CC
 
X

xhoster

Tim Greer said:
I'd imagine it'd be easy enough to figure out in a lot of cases,
especially if you use unique filehandle names and such, but if you
didn't and the script grew, it could open the potential for problems to
crop up that weren't immediately obvious, though I still imagine it
wouldn't take very long to find the issue.

Oh, absolutely it could grow into more of a problem, if used
indiscriminately. My rule of thumb is that if the script is
important/permanent enough to save to disk and to do so with a filename
better than "foo.pl" or "asdf.pl", then I wouldn't use such shortcuts in
it. But given the number of times I use -e or have scripts named foo.pl,
that still leaves a lot of scope where I rely on -w rather than checking
each open.

But the "sin" I do often engage in, even in permanent scripts, is not
reporting the file name that failed to open, but just $!. The file name in
the open is often constructed by interpolation on the fly, and I don't feel
like having the same thing in two places where they could get out of sync,
or constructing temporary variables. Usually I can figure it out just by
going to the offending line number; occasionally I have to change the die
to include more info and re-run the program. This is where Fatal may be
handy, but last time I tried it I encountered some problem with it that I
can no longer recall.
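
For the record, the two usual workarounds; autodie here is the newer
replacement for Fatal (bundled with perl since 5.10.1), and the exact
wording of its error message is its own, not something this thread
specifies:

#!/usr/bin/perl
use strict;
use warnings;

my $dir = '/dev';               # made-up path component, for illustration

# 1. Build the name once, so open() and die() share the same variable
#    and the message cannot drift out of sync with the open itself.
my $file = "$dir/null";
open my $fh, '<', $file or die "cannot open $file: $!";

# 2. Or let autodie (the successor to Fatal) do the reporting: a failed
#    open becomes an exception whose message includes the file name and
#    $! automatically.
use autodie qw(open);
open my $fh2, '<', "$dir/null";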


Xho

--
-------------------- http://NewsReader.Com/ --------------------
The costs of publication of this article were defrayed in part by the
payment of page charges. This article must therefore be hereby marked
advertisement in accordance with 18 U.S.C. Section 1734 solely to indicate
this fact.
 
C

Charlton Wilbur

cc> You are right, but there's another side. A 'professional' who
cc> 'stands up' in situations like this isn't a professional.

Um, no. A professional says "Yes, it's *possible* to do that, but
that's the wrong way to do it."

How long do you think a structural engineer would last if he signed off
on whatever the managers wanted, without pushing back when they wanted
to do something that was unsafe?

How long do you think a doctor would last if he did everything the
patient wanted, regardless of whether it was in the patient's best
interest?

How long do you think a lawyer would last if he did everything his
client wanted, legal or not, without advising the client of the
ramifications of his actions?

cc> A true professional focuses on the job at hand, not letting
cc> personal circumstances or personalities get in the way. A
cc> professional gets the job done regardless of the behavior of
cc> others. A professional makes completion of the task at hand his
cc> number one priority. A professional doesn't complain about the
cc> actions of others that make his job more difficult.

No, a professional considers his responsibility to the client and to the
profession. A professional refuses to do things that will injure the
client or the profession.

cc> I would much rather have the reputation of someone that can be
cc> depended on to do a job and do it right, than of someone who
cc> can't be pushed around.

Alas, then, that you're getting the reputation of someone who can be
depended on to put in the hours, but who produces shoddy work.

Charlton
 
T

Tim Greer

cartercc said:
I think you may have missed the context of my comments. When I write a
one-time throw-away script FOR ME to transform some data from one form
to another, I often don't check the return value of open. THIS IS /
NOT/ THE SAME AS PRODUCING CODE FOR OTHERS TO USE! If the script
doesn't work, then I've made an error, so I find the error and fix it.
All I'm saying is that I'm comfortable working this way and find the
consequences acceptable, in part because open is very reliable and
almost never fails.

As strange as it sounds (and I would personally never intentionally
fail to check the return of a call), I can actually see how doing the
more reckless programming for yourself (only) might get you familiar
with some of the more cryptic bugs and errors that aren't obvious, if
you end up troubleshooting or fixing other people's code for your
actual job. Then again, I could see this backfiring, with that checking
logic never making it from your head into the code you write by
default. You don't have to go as far as detailed error logging or
reporting either; I always add something helpful, even in my own
scripts. Force of good habit, even in your own quick, trivial code, is
a good thing -- I don't mean that everyone needs to do this, only that
they probably should. Anyway, as long as it's just your own code that
other people won't use or aren't paying you for, I say do what you
want.
 
C

cartercc

    cc> You are right, but there's another side. A 'professional' who
    cc> 'stands up' in situations like this isn't a professional.

Um, no.  A professional says "Yes, it's *possible* to do that, but
that's the wrong way to do it."

Which is not what you implied. You implied that if the developer
refused to work on a project with an unreasonable deadline, the
manager would stop imposing unreasonable deadlines. In my case, it was
*possible* to complete the assignment, but only by working over the
weekend.
How long do you think a structural engineer would last if he signed off
on whatever the managers wanted, without pushing back when they wanted
to do something that was unsafe?

Not the same thing at all. In this case, I referred to an inconvenient
deadline, not an impossible or unsafe deadline.
How long do you think a doctor would last if he did everything the
patient wanted, regardless of whether it was in the patient's best
interest?

Not the same thing. In fact, doctors do work impossible hours. If the
doctor said, 'I'm not going to take care of this patient because it
will cause me personal inconvenience,' how long do you think he would
last? Besides, there was no question (in my case) of inconsistency
between what was best for the enterprise and best for the manager --
coding up the job had absolute priority, regardless of my convenience.
How long do you think a lawyer would last if he did everything his
client wanted, legal or not, without advising the client of the
ramifications of his actions?

Not nearly the same thing. In my case, working over the weekend was
not illegal, and I took comp time to make up the hours. Nothing about
the project was illegal.
    cc> A true professional focuses on the job at hand, not letting
    cc> personal circumstances or personalities get in the way. A
    cc> professional gets the job done regardless of the behavior of
    cc> others. A professional makes completion of the task at hand his
    cc> number one priority. A professional doesn't complain about the
    cc> actions of others that make his job more difficult.

No, a professional considers his responsibility to the client and to the
profession.  A professional refuses to do things that will injure the
client or the profession.

And I suppose you have never faced a circumstance that required your
services after hours or during scheduled off time? Do you think it
proper to refuse to work simply because the off-hours duties were
directly caused by your manager's procrastination?
 
    cc> I would much rather have the reputation of someone that can be
    cc> depended on to do a job and do it right, than of someone who
    cc> can't be pushed around.

Alas, then, that you're getting the reputation of someone who can be
depended on to put in the hours, but who produces shoddy work.

I didn't say that the end product was shoddy. In fact, the end product
was exactly what it was supposed to be -- an Excel file with rows and
columns of numbers. What I said was that the coding was crap -
inefficient, redundant, undocumented, brittle, not scalable - just
like most first versions of software. Before documentation, before
modularization, before refactoring, etc. There is a big difference
between producing a product that meets the functional requirements and
doing so with good code. How many times has your first effort been
perfect? In most cases, the first effort does little more than
validate the specification.

I have the reputation of beating deadlines, not producing shoddy
products. Obviously, I'd miss the deadline rather than produce
something that didn't work.

CC
 
C

Charlton Wilbur

cc> Not the same thing. In fact, doctors do work impossible
cc> hours. If the doctor said, 'I'm not going to take care of this
cc> patient because it will cause me personal inconvenience,' how
cc> long do you think he would last? Besides, there was no question
cc> (in my case) of inconsistency between what was best for the
cc> enterprise and best for the manager -- coding up the job had
cc> absolute priority, regardless of my convenience.

Where did you throw personal convenience into the mix?

You're doing unprofessional work because you're acceding to your
manager's demands to do things that are quick and dirty. You're
rationalizing it as a first effort that merely validates the spec, but
what it boils down to is this: you, as a *professional*, by claiming
that term, are responsible for the quality of the code you produce.
This means pushing back when the manager tells you to do something
inappropriate.

Didn't you start a thread some time back about crisis mode programming?
Do you really not see a connection between your willingness to work on
deathmarch and panic-driven schedules without pushing back, and the
frequency with which you have to deal with crises?

cc> And I suppose you have never faced a circumstance that required
cc> your services after hours or during scheduled off time? Do you
cc> think it proper to refuse to work simply because the off-hours
cc> duties were directly caused by your manager's procrastination?

I am a firm believer in the maxim "A failure to prepare or plan on your
part does not constitute an emergency on mine."

I have had managers who were walking repositories of self-created crises
and emergencies. I pushed back hard against the self-created deadlines,
and when I found managers who were impossible to train, I got out of
those jobs as quickly as I could.

I advise you to do likewise.

Charlton
 
C

cartercc

Where did you throw personal convenience into the mix?

In fact, my wife and I had made plans for that weekend, and she was
very upset when I called her Friday afternoon and told her that I had
to work over the weekend. If I had planned to work that weekend, it
wouldn't have been a big deal. In the end I didn't lose any time,
because I took three days off later.
You're doing unprofessional work because you're acceding to your
manager's demands to do things that are quick and dirty.

Let's explore this. Is 'quick and dirty' ever appropriate? Microsoft
bought QDOS (the Quick and Dirty Operating System) early on, IIRC. In my
practice I often take short cuts and ignore best practices for one-
time scripts where the output is more important than the script. I
don't think it's necessarily unprofessional to do 'quick and dirty'
and at times it may be unprofessional to follow all the recommended
practices for a very small job. Besides, it's my manager's call, and
if he wants quick and dirty, shouldn't I comply? We're not talking
about critical health or safety issues here.
 You're
rationalizing it as a first effort that merely validates the spec, but
what it boils down to is this:  you, as a *professional*, by claiming
that term, are responsible for the quality of the code you produce.  

Absolutely! But the 'quality' relates to the output, not the code. I'm
in my fifth year at my current job, and I've NEVER had a customer look
at my code to see if it's up to snuff ... but I've had customers
complain plenty of times about the output the code produces.* We are
judged by the work product, not necessarily by the means used to
produce the work. No one here gives a rat's ass about the quality of
my code, except for me.
This means pushing back when the manager tells you to do something
inappropriate.

Absolutely! But there's a difference between 'inappropriate' and an
unreasonably short deadline. I very rarely get impossible deadlines,
but when I do, I make every effort to get the work done by the
deadline. If the deadline passes without the job being finished, then
it's on the manager, not me, and that's all the 'pushing back' that's
needed.
Didn't you start a thread some time back about crisis mode programming?
Do you really not see a connection between your willingness to work on
deathmarch and panic-driven schedules without pushing back, and the
frequency with which you have to deal with crises?

Hey, a large part of my job is to produce reports, and it's common for
the customer to request the report at the very last minute. These may
be crises, but they are not of my making. At the end of the day it's
the customer that faces the consequences of the crisis. Despite that,
I make the effort to do my job in such a way that no one can fault me
for the failure to get the work done on time. (Please note that my job
is NOT writing software but producing reports -- I just produce
reports by scripting.)
I am a firm believer in the maxim "A failure to prepare or plan on your
part does not constitute an emergency on mine."

I totally agree. My version of this is, 'Procrastination on your part
does not constitute an emergency on my part.'
I have had managers who were walking repositories of self-created crises
and emergencies.  I pushed back hard against the self-created deadlines,
and when I found managers who were impossible to train, I got out of
those jobs as quickly as I could.

I only have one manager, and I've seen him once this year. He
generally leaves me alone, and I probably don't get an assignment from
him more than once a month. OTOH, I have lots of customers, some of
whom give me plenty of time but others who notoriously send in jobs at
or past their deadline.
I advise you to do likewise.

My general approach is to meet or beat the deadline without
complaining. If I can't, I explain that I can't do a 12 hour job in
three hours --- BUT I MAKE SURE THAT THIS IS SEEN AS AN EXPLANATION
AND NOT AS AN EXCUSE. People know that I try hard, and most customers
are willing to accept the consequences of their own procrastination.

CC
-----------------
* There are many reasons a customer would complain about the output,
ranging from an ambiguous specification, to garbage input, to my not
understanding what was needed. Case in point: this week, an
administrator requested a report on 'all active students.' He got a
list of tens of thousands of names when he was expecting a couple of
dozen names, and he complained. What he wanted was HIS active
students, but what he requested was ALL active students, and I gave
him what he asked for, not what he meant to ask for. Also, it's not
uncommon for me to misunderstand the request, so I take credit for a
reasonable share of the blame.
 
C

cartercc

Wrong.  So, so, so wrong.

This is like the old debate on art for art's sake. Which is more
important? The product or the tool used to produce the product?
Nobody who drives across a bridge gives a rat's ass about the details of
its structural integrity--until the day the bridge collapses.  Maybe the
critical failure of your code won't cost lives, but it's still your
professional responsibility.

The bridge /IS/ the product. If you can produce /EXACTLY/ the same
product with (1) X dollars in Y days, or with (2) X * 10 dollars in Y
* 10 days, what do you choose? If you can buy a BMW for $5,000 or the /
IDENTICAL/ BMW for $50,000, which one would you buy? If you produce
the /SAME/ output with less effort in less time, or with more effort
in more time, what do you do?

The issue is NOT the quality of the end product, because the end
product is identical. The issue is the effort used to produce the
product, and generally we like to minimize the effort. After all, the
cardinal virtues of Perl hackers are (1) laziness, (2) hubris, and (3)
impatience. Seems to me that I'm advocating all three.

CC
 
J

Jürgen Exner

cartercc said:
This is like the old debate on art for art's sake. Which is more
important? The product or the tool used to produce the product?


The bridge /IS/ the product. If you can produce /EXACTLY/ the same
product with (1) X dollars in Y days, or with (2) X * 10 dollars in Y
* 10 days, what do you choose? If you can buy a BMW for $5,000 or the /
IDENTICAL/ BMW for $50,000, which one would you buy? If you produce
the /SAME/ output with less effort in less time, or with more effort
in more time, what do you do?

Poor comparison. If you write the same program a second time, hopefully
you won't need the same amount of time again. At least, I wouldn't hire
any programmer who wouldn't take advantage of analyses already done,
lessons learned, and code already written the first time.

As a programmer you do not produce a physical product, you produce an
intellectual product. That is more like an architect producing the plans
for that bridge but not the bridge itself, or, even better, an author
producing the manuscript for a book but not the physical book. Those
intellectual products can be duplicated with much less effort the second
time.

jue
 
C

cartercc

Poor comparison. If you write the same program a second time, hopefully
you won't need the same amount of time again. At least, I wouldn't hire
any programmer who wouldn't take advantage of analyses already done,
lessons learned, and code already written the first time.

I might be in a little different environment than most people. Much of
what I do involves querying a big institutional database over which I
have no control, and which is characterized by constant change. Two
quick examples: (1) Grades were identified by characters (A, B, C)
until one day my queries failed completely. I discovered that the
powers that be decided that henceforth grades would be identified by
digits (6, 7, 8), with no sort of notice to me or anyone else. (2)
This week, I discovered that two columns had been transposed, so that
the expected thousands of students were only 76. I couldn't just un-
transpose the columns, because of the 76 'right' students, so I had to
select and merge two columns. Obviously, the script that I wrote in
December had to be rewritten again this week to account for the
change.

My scripts frequently fail for this reason, so I can't rely on static
scripts that never change. Yes, the techniques are valuable, and
revisions take much less time than the original script. Still, I have
learned to take shortcuts that save time with no diminution in
quality, and I can't understand why that seems to bother some folks so
much.
As a programmer you do not produce a physical product, you produce an
intellectual product. That is more like an architect producing the plans
for that bridge but not the bridge itself, or, even better, an author
producing the manuscript for a book but not the physical book. Those
intellectual products can be duplicated with much less effort the second
time.

Then I must not be a programmer, because most of what I do gets
printed with a physical printer on real paper. My point (and I'm
being redundant here) is that in doing a job you minimize costs while
maximizing benefits.

I /DO/ understand the rationale for 'best practices.' Using warnings,
using strict, checking the return value of open(), documenting scripts
and functions, etc., have the purpose of decreasing the cost of
failure, and as such are recommended. However, there are some tasks so
trivial that even the minimal effort of following these practices
seems costly in comparison with the benefit to be gained WITH RESPECT
TO THESE TRIVIAL SCRIPTS.

Again, I'm not advocating not checking the return value of a call to
open, or not following all the other best practices. What I said was
that in the daily grind I frequently omit some niceties. I'm an adult
and am perfectly capable of dealing with it.

CC
 
T

Tim Greer

cartercc said:
The bridge /IS/ the product. If you can produce /EXACTLY/ the same
product with (1) X dollars in Y days, or with (2) X * 10 dollars in Y
* 10 days, what do you choose? If you can buy a BMW for $5,000 or the /
IDENTICAL/ BMW for $50,000, which one would you buy? If you produce
the /SAME/ output with less effort in less time, or with more effort
in more time, what do you do?

Even if you build the bridge the same way, one of them has fail-safes
and monitoring that give the inspectors, or even the people driving and
walking across it, some warning of an existing problem that may lead to
a disaster, allowing workers to remedy the problem or close down the
bridge until it is resolved. The other bridge people happily drive
over, with no indication of a problem until it's too late. That is a
better example of the differences.
It's not always about checking the returns of calls, but about
verifying that the script is doing what it's intended to do, and not
risking it silently creating a problem. Granted, not all such problems
come from failing to check a return value alone, but some problems can
happen. Of course, you said that you'd never do that for a client, so
you might be okay, but why do it to yourself either? There's no more or
less effort in doing things the right way (there's no "best way" or
"same way" of doing the same quality of coding) -- but again, you've
said you don't cut corners for clients anyway, right?
 
J

J. Gleixner

Tim Greer wrote:
[...]
Even if you build the bridge the same way, one of them has fail-safes
and monitoring that give the inspectors, or even the people driving and
walking across it, some warning of an existing problem that may lead to
a disaster, allowing workers to remedy the problem or close down the
bridge until it is resolved. [...]


Unless, of course, you live in Minneapolis.
 
