Tim X said:
I also love the line where he says his supervisor may have asked him
to sign something, but he didn't know what it was for - imagine that,
signing things when you have no idea what you are signing away!
No, what I'm saying is that at the time I would have carefully read
what I was signing, but that would have been in 1977 (NMR) or 1984
(robot), both of which were more than 20 years ago, and after so long I
can't remember which particular papers I did or did not sign at
some particular time. I have a vague memory of having to sign something
back in 1984 in regard to publishing the robot paper, but it was so
very long ago I can't trust that memory. And I have no memory at all of
any 1977 paper-signing which would have been more than 28 years ago.
Both are so awfully long ago you can't really expect me to remember
them as distinct events I can testify to here. I do however remember
that I did sign papers in 1978, in conjunction with the technology
assessment office at Stanford, when they were agreeing to help me
patent my data-compression invention. I was signing and dating my
description so it would be a legal document in their files whenever
needed for legal purposes, such as proving that I was the first person
to invent that particular algorithm, and they were signing a
non-disclosure agreement and witnessing my signature on the description
on such-and-such a date, or something like that. I can't remember
exactly how many papers we signed 27 years ago, but it was a distinct
thing I had to make a special appointment with them to do, so that
memory is more distinct and definite than my memory of other
paper-signing events so very long ago.
My point is that CGI is NOT limited to Unix environments. You
have CGI support with most multi-purpose web servers, like Apache and
they run on multiple platforms. I never made any statement that all
web servers provided CGI, only that CGI is not limited to Unix
environments.
Nice of you to agree with me now. Your original statement was unclear
and tended to imply that CGI/Unix experience is nothing more than plain
Unix experience, because all Unix hosts support CGI and allow their
users to use CGI. My original point was that programming for the CGI
environment is quite a bit different from programming for the Unix
stdin/stdout environment in either filter (*) mode or interactive mode,
so the fact that I've written both CGI applications and regular Unix
applications gives me a better skillset than if I had written only
regular Unix
applications. But I was specifying CGI/Unix as the environment, because
I've never had access to any networked host that supported CGI except
this one that runs Unix. So if somebody is looking for somebody with
experience specifically on CGI/Windows, I don't qualify, and if I just
said I had CGI experience, somebody could easily but mistakenly believe
I had such experience on Windows.
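To show what I mean by the difference, here is a minimal sketch of a
CGI script in Perl (a made-up example with a made-up "name" field, not
one of my actual scripts): instead of reading stdin the way a filter
would, it pulls its input out of the QUERY_STRING environment variable
and has to emit an HTTP Content-type header before any output.

  #!/usr/bin/perl
  # Minimal CGI sketch: decode QUERY_STRING and emit an HTML page.
  use strict;
  my %param;
  foreach my $pair (split /&/, ($ENV{'QUERY_STRING'} || '')) {
      my ($key, $value) = split /=/, $pair, 2;
      $value = '' unless defined $value;
      $value =~ tr/+/ /;                              # '+' encodes a space
      $value =~ s/%([0-9A-Fa-f]{2})/chr(hex($1))/ge;  # decode %XX escapes
      $param{$key} = $value;
  }
  print "Content-type: text/html\n\n";   # header, blank line, then body
  print "<html><body>Hello, ", ($param{'name'} || 'world'), "</body></html>\n";

None of that header or environment-variable business exists in an
ordinary stdin/stdout Unix program, which is exactly the difference in
skillset I'm talking about.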
(*) Unix filter mode means you pipe some file or script or other data
source into stdin, and pipe stdout to another file or process, such as
with this command I wrote:
ps | grep '\- ' | awk ' {printf("kill %s\n",$1)} ' | sh
ps is the data source, while grep and awk are used in filter mode, and
sh is used in final-destination batch mode.
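For comparison, the filtering job that grep and awk do in that pipeline
could be written as a single Perl filter reading stdin and writing
stdout (just a sketch of the same idea, assuming as the awk stage does
that ps prints the PID in its first column):

  #!/usr/bin/perl
  # Filter sketch: read ps output on stdin, write "kill PID" commands to
  # stdout, doing the same job as the grep and awk stages above.
  use strict;
  while (my $line = <STDIN>) {
      next unless $line =~ /- /;         # keep only lines containing "- "
      my ($pid) = split ' ', $line;      # first whitespace-separated field
      print "kill $pid\n" if defined $pid && $pid =~ /^\d+$/;
  }

It would slot into the same place in the pipeline, something like
ps | perl killfilter.pl | sh (killfilter.pl being a made-up name).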
Note that *most* people who have written Unix applications have never
written any CGI application, so my experience there is more specialized
than the average Joe's Unix programming experience. Also, I specifically
used Unix features from CGI, rather than writing 100% pure CGI that
would run the same on Windows. In fact my first attempt at CGI/Perl ran
fine on Unix but wouldn't work at all on Tripod, because Tripod doesn't
provide the underlying Unix I was using, so I had to rewrite my CGI/Perl
script to run in pure CGI/Perl there (a rough sketch of that difference
appears below). But that very limited
CGI/Perl experience was after I wrote that particular resume, so of
course only CGI/Unix was reported there. That was also before I got my
laptop running Linux, so of course my more recent Linux experience
wasn't reported either. My very latest resume, which is after both my
CGI/PurePerl and Linux experience, has CGI and Unix split apart, and
Unix/Linux joined together as a single kind of environment, to better
represent the range of my experience to date. That part now reads:
* Platforms (programming environments): Unix/Linux shell, CGI,
Macintosh, MicroSoft Windows, and many others now obsolete
Do you have any nitpicks about that wording now?
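To make the CGI/Unix versus pure CGI/Perl distinction concrete, here is
a rough sketch of the kind of rewrite I mean (a made-up example, not my
actual Tripod script): the first part shells out to a Unix utility, so
it only works where that utility is available, while the second gets
the same information from Perl itself.

  #!/usr/bin/perl
  use strict;
  # Unix-dependent way: runs the Unix "date" command, so it fails on a
  # CGI host that doesn't let scripts reach the underlying Unix utilities.
  my $stamp_unix = `date`;
  chomp $stamp_unix if defined $stamp_unix;
  # Pure-Perl way: same information from a Perl builtin, portable to hosts
  # (like Tripod's) where no Unix commands are available to the script.
  my $stamp_pure = scalar localtime;
  print "Content-type: text/plain\n\n";
  print "via date(1):   $stamp_unix\n";
  print "via localtime: $stamp_pure\n";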
I would now say that NOTHING you say would ever convince me of
anything.
That's a good skeptical attitude, not just for me, but for everyone who
ever tries to convince you of anything, such as that if you invest only
five dollars in a chain letter you can receive millions, or that male
enhancement drugs will make you popular with women, or that some
particular religion is based on the word of the Creator and therefore
is better than all other religions put together, or that by mixing
chemicals in a flask you can cause useful nuclear fusion to occur and
thereby provide a cheap safe source of energy.
Go with evidence, not with somebody claiming authority on some topic.
Please tell me what evidence I could present to you to convince you to
hire me to write software applications for you.
I would be very reluctant to employ anyone to solve my problems who
cannot even solve the simple problem of getting hs [sic] own
reasonable computing environment together.
I actually had a reasonable computing environment from 1989 until 1999,
able to develop Lisp applications on my Macintosh Plus, and wrote
several really useful applications there. But my MacPlus died in 1999,
and I haven't had the money to get it fixed, nor the money to purchase
a comparable programming environment on my newer Macintosh Performa.
Why didn't you hire me any time from 1991 to 1999 when I was unemployed
but still had a reasonable programming environment on my MacPlus, or
from 2004.Nov to 2004.May when I had a reasonable programming
environment on my Linux laptop plus a working modem for
uploading/downloading files between it and the net?
(I still have that reasonable programming environment on my laptop, but
without any way to move files between there and any other machine it's
not of much practical use for showing you my recent work or for doing
new work for you.)
Most employers in virtually any profession provide all the tools needed
by their employees. At most a company might require an employee to
purchase a short-sleeved white shirt (as with my job at Round Table
Pizza) or a uniform (advanced from first paycheque), but not the whole
computer system where software will be developed. When I worked for
SCU, they provided the IBM 1620 computer I used. When I worked for
Four-Phase, they provided the IV/70 computer I used. When I worked for
Stanford, they provided the PDP-10 and IBM-370 computers I used for the
various jobs there. If you claim to be an employer, but can't do the
common thing of providing a company computer for your employees to use
in developing software for you, your company totally sucks. It is
*you*, the prospective employer, not me, the prospective employee, who
needs to get a decent computer system for your employees to use.
what I wrote earlier about only commercial experience being counted
Please supply a clear definition of what *you* mean by "commercial
experience". I have no idea whether you refer *only* to software that
is sold in shrink-wrapped packs (CD-ROMs for example) in stores, or
also software that is distributed invisibly over the net in return for
e-money, or also software that isn't itself sold but which is used
in-house to solve practical problems for the employer, or also software
that is used in support of research that offers prestige to the
employer, or also software that is commercial quality but which is
written only for my own use to solve real-life problems I face, or
something else entirely that I can't even guess at.
regardless of what you have done outside paid employment, it doesn't
count.
Commercial contracting isn't employment; according to the IRS it falls
in a completely different tax/income category. So it doesn't count??
Likewise all open-source software is a complete waste, because even if
you eventually get income from supporting the software, it doesn't
count because it isn't formal employment?
Work you do on your own is generally not reviewed by anyone, not used
by many people and therefore has no external evaluation, only his own
subjective opinion,
That is a good point but only for *some* non-commercial work. Work for
which I got paid, as part of a university research project, is not
commercial in the shrink-wrap sense nor in any other software-for-sale
sense, yet it was still reviewed by my supervisor. Work for which I
didn't get paid, but which was used substantially by other people, such
as my document formatting for XGP, was reviewed by its users, who chose
whether or not to use it for free. If they chose to make heavy
use of my software, as Bill Gosper and several others did, I consider
that a recommendation that my work is worthwhile.
For some of my other work, I put up free demos on the net, so that
random people could see my work, try it, and report how they liked it.
which to an employer is worth very little.
Do you speak for all employers there??
if he had spent the last 10 years working on some open source
projects ...
I already spent enough years of my life doing that in the past.
Why should I spend yet another ten years doing the same thing?
In the last 10+ years, the only things he seems to have produced are
some pretty skanky CGI apps.
You are quite ignorant of what I've done. I'm not going to waste my
time posting the information about my accomplishments that I've already
posted before. If you want to cease being ignorant, go look up what I
already posted.