Not that I don't believe you about your "helping John Nash"; I just
don't care. It's not something that will win credibility with me. I
don't revere Nash as a computer science genius or whatever.
Cf. The Essential John Nash. In a 1954 paper he anticipated problems and
solutions in multiprocessing. His code was far better than that of
full-time expert programmers.
Maybe you should make up a story about teaching Knuth how to
multiply or something.
However, I noted that as I tried to do anything significant or
elaborate, especially with GUIs, I was spending too much time on
infrastructure: for example, a struct seemed to me to require
additional routines to print it and inspect it for validity. I
discovered that primitive Visual Basic as of its early (1995) object-
oriented release did a better job of encapsulation, and wrote a
compiler in a later edition of Visual Basic, publishing said compiler
as the software for "Build Your Own .Net Language and
Compiler" (Apress 2004).
GDB can print the contents of structures. If I needed [and I have] to
see the contents of a structure quickly, I'd just debug: set a
breakpoint and print it out. Takes all of five seconds.
That's not what I'm talking about. I am talking about the clearly
available and permanent existence of a toString() method that converts
a structure into string data.
C isn't so much about efficiency as about getting precise control
over data. A big reason I despise languages like Perl or Python for
bit-twiddling things like ASN.1 or crypto work isn't because they
can't do the job, it's because they can't do it well.
You prefer C methods of thinking and representation, but this is not
an argument for their superiority except from a narcissistic
standpoint. Second-rate people cannot think outside their notations
and for example find Polish notation incomprehensible. I never found C
incomprehensible, by the way: I just found it unsafe at any speed.
And really, what you're missing [because you have no clue what you're
talking about], is you'd write your data engine [task that does the
heavy lifting] in C, and your front end GUI in any language that makes
you happy and just use bindings to glue the two together.
This creates an unnecessary binary opposition between "heavy lifting"
and GUI with a nasty and unexplored sexism encased within, for of
course, men who are (cf Susan Faludi's book STIFFED) unmanned by
corporate life will seek to compensate by factoring problems into hard
"male" and soft "female" components unthinkingly. Whereas this isn't
appropriate in many instances. The consequence of the factoring is
often that the GUI team, as "women" is starved for resources and the
GUI is crap.
Nothing saying you can't write your GUI in VB and your [say] DSP code
in C.
Right tool for the job.
Right, "Joe the Plumber". Look, if you want to use "tools" in other
than a metaphorical sense, get a job as a plumber.
26 minutes to get to the gym eh?
I've worked in Java before. I've hit the 1.4/1.5/etc walls before.
Part of my job when I was a software dude at AMD was to look into
performance problems in Java and it was always a different story with
different versions of the API/jvm/etc.
What probably happened:
(1) Your boss got a sales call from some Java flack
(2) He asked you to look into it
(3) You were going to be God-damned before you learned a language that
would make you feel all girlie, so you sure as hell "looked into"
"performance problems". You didn't look at anything else.
Java code is not only not 100% portable, but it's also not guaranteed
to be bug free. You can easily code in race conditions in Java such
that 99% of the time it works for you, but I open the app on my SMP
box and voila bug. So this idea that only C applications suffer from
portability issues is just a lie.
The difference being you have to work at it, whereas creating race
conditions unintentionally is easy in C.
Now you're getting delusional. I've visited many software shops and
have seen people of all shapes and sizes. Being a fat, scruffy
looking dude with long hair is not the marker of a successful software
developer.
Also, age doesn't really matter. I'm hardly some 13-year-old kid
posting ramblings from space. You could be 30 years older than me and
it still wouldn't change the fact that you're posting falsehoods and
crying persecution when nobody agrees with you.
Who's crying "persecution"? You're engaged in some pretty deep
psychological transference.
let alone computer scientist, you think regex has something to do with
math [more than compiler theory that is], you think all Java
Well, I first encountered regular expressions in "Formal Languages and
Their Relation to Automata" by John Hopcroft and Jeffrey Ullman,
published in 1969. This was a math book.
Automaton have everything to do with computer science and not strictly
I'd credit you with knowing what you were talking about more if only
you knew the plural of automaton.
math. Look at Conway's Game of Life. Sure there is a mathematical
I first programmed a version of Conway in 1973 on a machine with 8K
RAM, and was honored to meet Conway years later, so by all means, let
us look at Game of Life.
expression, but it's really an algorithm. Algorithms use math in the
No, it is not. An algorithm is a procedure for solving a problem, one
that is normally, but not always, guaranteed to succeed. Conway Life
is an ontology (a world with an unbounded number of two-state cells)
and a set of rules.
Regarding Conway Life as an "algorithm" results in the numerous buggy
Life simulators in which the "glider" pattern fails to create a new
cell beyond the boundary of the Life array and, as a result, the
glider turns into a dead 2x2 block. The fact is that Conway's Life is
not an algorithm and cannot be properly programmed, since Conway's
world is infinite in all directions.
way paragraphs use words. Building a grammar [paragraphs] and
building a vocabulary [words] are two different subjects, are they
not, even if they're umbrellaed by one subject header? I'd expect to
study sorting algorithms in a comp.sci class even though their
analysis makes use of math. Language constructs, such as those that
define a regex, are firmly founded in Compiler Theory. Where do you
think you get the concept of an LL or LR(1) parser from in the first
place?
No, Compiler Theory, Ace Ventura, is grounded in math including a pre-
existing theory of regular expressions.
For example: *A is satisfied by a substring containing the letter A
preceding the last A. To "program" this regular expression, you must
invent a new "character", $, representing the end of the line. This
has nothing to do with math and it in fact destroys the theory,
replacing it with a barbarous form of machine-language programming.
So you may use some math formalism to describe a grammar, but it's
still not a math subject.
WRONG. Regarding computer science as independent of math creates
systems which redefine mathematical truth as above.
Of course, not all Java applications run anywhere all the time. But,
far fewer C applications do so in my experience.
Then I'm sorry for your experiences. That still has nothing to do
with the C language though. What you're really bitching about [it
seems] is the lack of a C VM that can run your C applications
sandboxed like Java applications. That's not a function of the C
language though, that's a function of available C tools.
You are blind but you see a truth in a perverted way. It is that the
purpose of digital technology wasn't to glorify individual ace
programmers, but to make the rich, richer, to advance American
hegemony, and to keep the rest of us in cubicles defined by our social
position. Despite my early fascination with computers, I could see
that all the interesting work of writing assemblers and debugging
compilers, although challenging, was leading in the direction of
moronization and social rigidity.
I'd say taking away the learning experience of manipulating raw data
[like strings, structures, lists, arrays, etc] *IS* what is dumbing
down the computer society. You have "developers" nowadays that
couldn't qsort an array if their life depended on it. They couldn't
figure out how to manipulate a string, or really form proper list/tree/
etc structures, etc.
The problem being that:
(1) In C, the tyro learns that sentinels may stop strings. WRONG.
(2) In C, the tyro learns to care too much about "efficiency". WRONG.
(3) In C, the tyro learns to confuse the notion of an ordered n-tuple
with the very different notion of a buncha stuff in memory. BZZT
(4) In C, the tyro learns to buttfuck things at the last minute to
"get it done". WRONG.
(5) In C, the tyro learns that everything is Von Neumann and that
associative memory is a pipe dream. BZZT
(6) In C, the tyro learns to prize brevity over comprehension. WRONG.
(7) In C, the tyro cannot implement the common operation of summation
using a truly constant limit. WRONG.
(8) In C, the tyro learns that a program is just a text which can be
preprocessed as a text safely. WRONG.
(9) In C, the tyro is seduced by a "power" to do wrong.
There is in fact no reason why primitive operations and basic design
skills cannot be taught without having to treat the mistakes of
Ritchie as holy writ. A string object can be developed in Java or C
Sharp by assuming that there are no strings.
Sure it's nice to have tools to make common tasks simpler, but too
many people go for the end game before learning how to get there.
My first program was in machine language, Ace Ventura. What was your
first language? Yet I certainly didn't go on to think that all data
should be in the form of variable-length strings in six-bit BCD,
addressable on the left with a word mark on the right. Instead I
wrote assemblers and debugged a compiler for Fortran II to escape
these constraints.
In the 1970s, working in Cobol, I heard a lot of fashionable talk
about C and took a class in it. I thought it semicool and (it goes
without saying) better than Cobol. But then I started to use Rexx, IBM
assembler, and proprietary Northern Telecom languages in Silicon
Valley.
The C developers were moaning and whining about wanting a DEC VAX, the
fashionable machine of the early 1980s. But then, the VAX was found to
have machine instructions that nobody ever used, instructions
something like "halt, add zero, catch fire, and eject the drinks
tray".
Dijkstra was wrong about object orientation.
Net result is when they face a problem that doesn't have an immediate
API solution, they can't reason out how to solve it correctly,
properly, efficiently, etc.
That speaks volumes not about lack of training in C, but about lack of
general culture and excessive coddling by matriarchal homes.
CS education used whatever was available in the old days. Then it used
interpreters like PUFFT (the Purdue University Fast Fortran
Translator) to avoid student code bringing down the whole kit and
kaboodle. Then it used Pascal. That was fun. Then it used C ("how to
debug a C program: change your major"). Today it uses Java. Onward and
upward. The language is irrelevant, and C was the worst choice.
I have no idea what this has to do with C.
No, you don't, because you separate social issues from technology.
<snip rest of insane rant>
You still haven't really come up with a "problem" that is actually the
fault of the C language. Let's examine them:
1. Only C applications are non-portable. That's false.
That's not what I said, Reading Rainbow. I said that C is not as
portable as it should be, since its "power" results in the logical
possibility of bugs surfacing when the code is ported at a higher rate
than in other languages.
2. C has no VM like Java. That's not a C language problem
3. C developers are losers. um ... don't know what to say there
4. C encourages developers to do bad work. Again, this is one
person's opinion
etc...
Let me phrase this another way, to use Nash-like thinking: what's a
win scenario for this little game you're devising? Is it that we all
collectively agree that you're right, C is bullshit, we revoke all C
compilers from the planet, and we all switch to Windows 7 and program
in Visual Basic?
This is an isomorph to unsolved social problems (such as absence of
health insurance in the US) in which the argument is made that we've
invested so much in the nonworking solution that we might as well go
on.