Not that it matters, but as best I can tell from the definitions in
the online OED and Merriam-Webster, "ellipsis" refers to the omission
of a word or words, and also the mark *or marks* used to indicate the
omission. So one set of dots might properly be called "an ellipsis".
Point taken.
I'm not sure you've actually looked at *my* code, given the above
comments, but -- yes, that's true: Given that I'm not going to
do anything with overlapping occurrences it doesn't strike me as
useful to count them. What my code *does* need is a count of how
many times the "needle" string is going to be replaced, so I can
calculate the size of the output string.
Got it. But the ambiguity of "the number of occurrences" remains.
(1) Computing is about people and machines to a greater extent than
traditional engineering
(2) Despite obfuscation, this means mastering people as we master
nature
(3) This is always an ugly business
..
I don't quite understand how this explains your replying to a technical
point with
"(Sigh) (Eye roll) (Crotch grab)"
I take it you disapprove. Well, I disapprove of someone trying to ruin
Schildt's reputation to puff his own, and I think my recreational
sexism is nothing in comparison.
Or a recognition of the fact that code, great or not, often survives
much longer than its author(s) might have intended, and is put to
uses they might not have imagined in writing it.
In my experience, contemporary American programmers rarely code, and
when they do, it's for specific assignments intended as one-off. They
simply can't be trusted anymore, and this is because corporate life
causes them to waste too much time with silly one-upmanship.
So why make the job more difficult ....
It's not made any easier by following formulae and shibboleths. Quite
the opposite; the formulae and shibboleths lull the coder into
neglecting real problems.
I think it's mildly disrespectful to not get someone's name right.
<shrug>
Babe, it's far more than "mildly" disrespectful to coin neologisms
such as "Bullschildt" and "Nilgewater" out of patronyms, and in
response I won't only coin Dweebach: I'll write original limericks in
Peter's honor:
A lousy coder named Dweebach
Was a munching on a Zweibach
Munched he, incoherently,
I knows about C
That mistaken coder named Dweebach
(Don't compete with me). This is because verbal self-defense is a
human right, sugar. Stop enabling.
Where "considerably improved" here means -- *very* considerably,
I'd say, and "misunderstood" means -- I have no idea.
Would you like to hear the story I call "How I Discovered
the Difference Between (MS-)DOS and a Real Operating System"?
(Summary version: Buggy user program puts machine in a state from
which only physically turning it off and back on can rescue it.)
Single-user versus multi-user, remotely accessible versus not ....
I'm sure I can come up with some other differences that seem
pretty significant to me.
Remember the expert, who avoids minor errors while sweeping on to the
grand fallacy? We all know that MS-DOS is no longer viable.
But it's a mistake to believe that Linux is the answer. As the
(textbook author and professor) Andrew Tanenbaum pointed out, Linux,
as compared to the MORE MODERN technology of microkernel OSen, is
innately less secure, less reliable, and less maintainable. Tanenbaum
had a "flame war" with Torvalds over this. He graciously apologized
for some of his flames; Torvalds did not, and failed to credit
Tanenbaum for Minix, on which Linux was based. This was Maoism: the
assault on midlevel academic authority in the service of big money
and power.
Far more than Windows, Linux is in the service of big money and power
since it is the product of slave labor. That is, each coder who
contributed to any version was a time-sliced slave. He might have been
a happy slave, but these are the best kind.
This allowed IBM, a larger and more powerful (and somewhat less
principled) company than Microsoft, to lay off its proprietary OS
developers and save the big bucks. It started with Torvalds' attack
on, and expropriation of, Tanenbaum. These are, as far as I can tell,
the
objective facts even if Tanenbaum expressed respect for Torvalds,
since computer science departments are so supported by corporate
interests that most of their professors lack true intellectual
independence.
The *command shells* may be. A command shell is not an operating
system.
No it is not. So why is it important enough to try to ruin careers
over command shells?
Pays them, but -- rumor has it -- expects rather a lot of hours in
return.
Expects deliverables. And to reduce hours, expects a deliverable, on
many of its projects, at 5:00 PM every day. This deliverable may be a
simple return, or the final product.
Just out of curiosity, how many of these students seemed to you
to be motivated by genuine interest in the field, and how many
by the desire to get training and/or a credential that would lead
to a good-paying job? (Not that there's necessarily much wrong
with the latter, especially in people who didn't grow up affluent.)
They cannot afford, growing up in hell, to disambiguate the two. I
did meet a former student in the loop. He was a janitor when he was
in my classes. He'd gotten a job at seven times his former income,
which helped him to support his family. He liked programming.
(What I've observed in the students where I teach is that, oh,
ten years ago during the so-called dot-com boom, there seemed
to be a lot of the latter, while now it's more the former.
We have fewer majors now, but the ones we have are more capable,
and more interested.)
[ snip ]
It remains a data processing myth which has created several software
crises, because it's more convenient to the corporation to celebrate
idiot savants than men and women.
I don't agree that it's a myth, or that it's confined to data
processing; I first noticed this phenomenon by comparing two of
my undergraduate acquaintances, neither of them involved with
data processing.
I think it's a myth, because in my experience the defective
personalities I've had to work with are normally the sort of
personalities who manifest anti-intellectualism, haven't dealt with
their resentments against an educational system that failed to serve
them, and who make precisely the decisions that cause large software
systems to fail.
Most of these personalities are aliterate, and treat user needs, even
those grounded in law, as whims. As they age and stale, they are
usually attracted to right-wing simplifications of complexity.
Most are uninterested in Dijkstra's job one, not making a mess of it,
since they tend to like messes that form a mirror of the mess they're
in.
They are fond of laughing at anything that manifests thought in excess
of a low minimum.
If your acquaintances are in other fields, then the rot is spreading.
For example, before Reagan, GWB would not have been considered a
viable candidate, and Sarah Palin would have been locked up. But the
prototypes of GWB and Palin were forged in the corporation.
I have no idea what you mean here, but perhaps it will clarify
*my* intended meaning if I say that by "academic credentials"
I mean to include not only degrees but the entire record --
think "transcript" rather than "diploma(s)". But even the fuller
record doesn't tell the whole story. And really, I'm inclined to
have more respect for someone who's self-taught, since I think
that indicates a level of interest and commitment in a way that
academic coursework might -- *might* -- not.
Being self-taught in your sense might indicate interest and
commitment, but in a company with tuition refund, not beating your
ass on over to NYU, Stanford or even DeVry might also indicate vanity
and reluctance to be exposed to the Other. I say this because in Peter
I detect this reluctance, most strongly from his not even reading the
first email from me when I tried to resolve the "flame war". He also
acted bizarrely as regards Schildt, refusing an offer from
McGraw-Hill, possibly because this might have meant an encounter.
Also, I checked his Mom's blog (which is public) to find her
protesting the affirmative expansion of science classes to
minorities, which indicates a sort of social background of mistrust
and reluctance to engage.
I was largely self-taught but also unafraid to take a LOT of graduate
classes. And, I succeeded at them. It simply is bizarre to brag, as
does Peter, about not taking CS.
[ snip ]
It means I'm pulling your leg.
This is not apparent, and indeed I'm -- skeptical. That's
offensive, yes, but so be it.
(Why am I reminded of "can't you take a joke?" used as a defense against
accusations of various -isms?)
Because you can't? Seriously, corporate anti-sexism is an ersatz for
personal decency. I will remind you that I allied myself with, and
went to bat for, a female coworker at Bell Northern Research, and
this conduct was not at all atypical.
I know what "affectation" means in general. I don't understand what
it means in context. Well, whatever.
Leg pull?
[ snip ]
Each spelling flame is one, since you want to disprove my claim that
I'm more literate by bringing me down to the level of normalized
deviance.
I do? That seems unlikely, since I don't even know what "normalized
deviance" means. The point *I* think I'm making, in drawing attention
to spelling mistakes, is that you are not in the strongest position
to comment on others' spelling, as you sometimes do, or have done.
Cf. THE CHALLENGER LAUNCH DECISION, Diane Vaughan, University of
Chicago Press, 1996. It's a study of the 1986 explosion of the Space
Shuttle Challenger. Vaughan had to develop a theory of "normalized
deviance" because quantitative sociologists tend to accept
statistically predominant behavior as non-deviant within a community,
but it was obvious that, as a result of Reagan-era demands on NASA to
"prove" that "America was still great", engineers abandoned
nondeviant practice from immediately outside NASA and normalized bad
practice, including proactive skepticism of knowledge claims:
engineers "knew that they didn't know" how the O-rings around the
fuel tanks would perform in unusually (for Florida) low temperatures,
and this absence of knowledge was used (deviantly even with respect
to the former NASA standards applied to Apollo) to justify a
disastrously aggressive launch schedule.
The normalized deviant here doesn't make spelling errors out of
knowing a variant practice (I have to teach UK spelling with an
American education), wit, or laziness. He makes them because he
doesn't read books all that much.
I'm just different.
Only in the context of pointing out a broader ill or aliteracy, and
NEVER in some tirade of the template "you think you're so fucking
smart but you make spelling mistakes like us, therefore (derefore) you
is a dime a dozen just like us".
I'd ask how what I'm doing here is different from *your* claiming(?)
that Seebs is in no position to criticize others' code given that he
makes mistakes in his own code, but -- that's rather the same line of
reasoning, isn't it?
No, because Seebs' posted code is in each case I've seen since the
beginning of this year, when I started to look at it, much worse than
Schildt's examples: failure to replace %s, off by one in one line, and
an unstructured switch().
My argument contra Dweebach is holistic. The package doesn't make the
grade. If he were a programmer like Bacarisse, he'd be forgiven the
lack of academic qualifications. If he had a PhD like Malcolm, I'd be
more patient with him. If he hadn't stalked Schildt, I wouldn't ****
with him at all.
My reasons for concluding that Seebach is off the rails are those
three. The worst is the stalking, then the bad code, and then the lack
of academic qualifications (which could be undegreed course work)
relative to his claims.