off-topic: Why is lisp so weird?


Ray Dillinger

Rahul said:
Of course, the issue of a compiler's quality of code generation is just
a constant-factor improvement. Being able to redirect programmer time
towards improving the algorithm and allowing the programmer to explore
the problem and solution space to discover better algorithms might make
the problem take O(n) time instead of O(n^4) time.

This is not quite true. I've seen and used compilers that did all sorts of
nifty tricks with pure-functional routines. I've had order(n!) source code
compile to order(n) object code using ART*Lisp, for example: the compiler
memoized the function to eliminate a (stupid, mea culpa) branching recursion.

In side-effect-o-rama languages like c/c++, this is almost never worthwhile
because provably pure-functional routines are rare. But in Lisps, once people
get the hang of it and if you can keep them from using mutable objects for
everything, well over 90% of all functions are pure functional (i.e., they cause
no side effects and are unaffected by anything other than their arguments).
Compilers can aggressively memoize or even algebraically rewrite such functions,
not just to reduce by a constant factor, but to lower their order of complexity.
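
As a rough sketch of what that buys (a made-up example, in C rather than Lisp
just to keep it concrete, and nothing like what ART*Lisp actually generated):
a naive branching recursion makes an exponential number of calls, while a memo
table turns the same pure function into a linear number of calls. A compiler
that can prove the function pure could make this transformation itself.

#include <stdio.h>

/* Naive branching recursion: the number of calls grows exponentially. */
static unsigned long long fib_naive(int n)
{
    return n < 2 ? (unsigned long long)n
                 : fib_naive(n - 1) + fib_naive(n - 2);
}

/* The same pure function with a hand-written memo table: O(n) calls. */
static unsigned long long memo[94];        /* 0 means "not computed yet" */

static unsigned long long fib_memo(int n)
{
    if (n < 2)
        return (unsigned long long)n;
    if (memo[n] == 0)
        memo[n] = fib_memo(n - 1) + fib_memo(n - 2);
    return memo[n];
}

int main(void)
{
    /* Finishes instantly; the naive version would take ages at n = 90. */
    printf("fib(90) = %llu\n", fib_memo(90));
    return 0;
}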

Bear
 

Joe Marshall

Ray Dillinger said:
In side-effect-o-rama languages like c/c++, this is almost never worthwhile
because provably pure-functional routines are rare. But in Lisps, once people
get the hang of it and if you can keep them from using mutable objects for
everything, well over 90% of all functions are pure functional (i.e., they cause
no side effects and are unaffected by anything other than their arguments).

I once wrote a compiler that turned out to be pure functional. It
wasn't a very sophisticated compiler, and I didn't explicitly try to
be pure functional; it just so happened that I never had a need for
side effects.

Once it was working, I memoized the crap out of it.
 

Rahul Jain

Alan Mackenzie said:
Has anybody ever written a C compiler for a Symbolics?

It _came_ with a C compiler. ZETA-C has recently been put into the
public domain, too.
 

Christopher C. Stacy

Rahul> It _came_ with a C compiler.

Symbolics C was a layered product, and I think you had to pay for
it separately. Zeta-Soft was from a third party, and would
run on other Lisp Machines as well.
 

Christian Lynbech

Claudio> True, but that O(n) algorithm written in FastLanguage might
Claudio> run 10 times as fast as the same algorithm implemented in
Claudio> SlowLanguage. The company that implements that algorithm in
Claudio> FastLanguage can do 10 times more than their competitor that
Claudio> uses SlowLanguage or it can use a system that's 10 times
Claudio> slower to achieve the same goal.

But that factor of 10 only holds for a fixed, small range of input sizes. As
the input size goes up, no amount of FastLanguage trickery is going to
keep the SlowLanguage implementation from taking the lead.

That is why people do O() analysis and study algorithms. A good
algorithm will always end up beating even the fastest implementation
of a poor one on the fastest hardware.
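
To put made-up but representative numbers on it: suppose the poor algorithm in
FastLanguage costs n^2 steps and the good algorithm in SlowLanguage costs
10 * n * log2(n) steps, i.e. the slow environment carries the whole factor of
ten.

n = 10:      100 vs. ~332              (poor algorithm still ahead)
n = 100:     10,000 vs. ~6,644         (good algorithm ahead)
n = 10,000:  100,000,000 vs. ~1,329,000 (good algorithm ahead by ~75x)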


------------------------+-----------------------------------------------------
Christian Lynbech | christian #\@ defun #\. dk
------------------------+-----------------------------------------------------
Hit the philistines three times over the head with the Elisp reference manual.
- (e-mail address removed) (Michael A. Petonic)
 

Alex Mizrahi

(message (Hello 'Corey)
(you :wrote :on '(Tue, 02 Mar 2004 07:04:10 +1300))
(

CM> Find me an open-source Lisp compiler that produces native
CM> executables that works on Windows, Linux and Mac... maybe Solaris
CM> too, just for fun.

ECL (Embeddable Common Lisp) is written in pure C, and it uses a C compiler
(gcc) to compile its code, so it can be used to build native executables
anywhere C works (in theory).
I tried it under win32/mingw and Linux and it appeared to be working.
By the way, some benchmark code compiled with it (float math) outperformed
the Intel C Compiler (well, it was gcc's more aggressive optimizations, I
think) - so can we say Lisp is faster than C?

GCL (GNU Common Lisp) works in a similar way, too.

)
(With-best-regards '(Alex Mizrahi) :aka 'killer_storm)
(prin1 "Jane dates only Lisp programmers"))
 

blmblm

Corey Murtagh said:
I've always wondered about that. I've been hearing for years how
Fortran is faster than - or as fast as - C, and it seems strange. As
fast as, I'm not concerned about. Faster than... I'd be interested in
how and why, from a purely academic perspective.

One reason: Fortran guarantees that aliasing between arrays does not
occur. For instance, in the loop

for (i = 1; i < 99; ++i) {
    c[i] = a[i-1] + b[i+1];
}

Fortran would not allow c, a, and b to point to identical memory
locations.


Nitpick: I thought this was only disallowed when c, a, and b were
subprogram arguments? i.e., when a compiler couldn't reasonably
figure out whether the arrays overlapped? (And note that the concern
is about whether they overlap at all, not whether they're identical.
Maybe that's what you meant.)
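
For comparison (a sketch, not from the posts above, and the function name is
made up): C99's restrict qualifier lets a C programmer hand the compiler the
same no-overlap promise explicitly, at least for pointer arguments.

/* The restrict qualifiers promise that c, a and b do not overlap,
   giving the optimizer the freedom Fortran assumes for dummy
   array arguments. */
void add_shifted(double * restrict c,
                 const double * restrict a,
                 const double * restrict b,
                 int n)
{
    for (int i = 1; i + 1 < n; ++i)
        c[i] = a[i - 1] + b[i + 1];
}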
 

Paavo P

MSCHAEF.COM said:
The syntax of Lisp is that way primarily because of its regularity. Every

Regularity is not the same as being easy to learn or simple to use (this is
not an argument against Lisp).
 

Claudio Puviani

Christian Lynbech said:
Claudio> True, but that O(n) algorithm written in FastLanguage might
Claudio> run 10 times as fast as the same algorithm implemented in
Claudio> SlowLanguage. The company that implements that algorithm in
Claudio> FastLanguage can do 10 times more than their competitor that
Claudio> uses SlowLanguage or it can use a system that's 10 times
Claudio> slower to achieve the same goal.

But that factor of 10 only holds for a fixed, small range of input sizes. As
the input size goes up, no amount of FastLanguage trickery is going to
keep the SlowLanguage implementation from taking the lead.

That is why people do O() analysis and study algorithms. A good
algorithm will always end up beating even the fastest implementation
of a poor one on the fastest hardware.

Doh. I'm not arguing against finding a better algorithm. I'm saying that for a
given algorithm, including the most efficient one, there's an additional
advantage to be had by running in a more efficient environment. No one, least of
all I, ever said that you select a faster environment to the detriment of a
better algorithm. Read what I actually write.

Claudio Puviani
 

Christopher C. Stacy

On Wed, 3 Mar 2004 08:18:20 +0000, Alan Mackenzie (" Alan") writes:

Alan> Christopher C. Stacy said:
Rahul> It _came_ with a C compiler.

Alan> I didn't know that.

Alan> Did anybody do any comparative benchmarks on it between C and Lisp?

This is the "Lisp Machine" -- all the languages compiled down to the
same instruction set, which was designed for running, guess what?
 

Alan Mackenzie

Rahul> It _came_ with a C compiler.
Alan> I didn't know that.
Alan> Did anybody do any comparative benchmarks on it between C and
Lisp?
This is the "Lisp Machine" -- all the languages compiled down to the
same instruction set, which was designed for running, guess what?

Hah! I know (or knew) that instruction set well indeed. Twenty years
ago, I was working for a firm called Racal-Norsk in England, which was
producing a lisp-machine, first by emulation, then by microcoding, on a
Norsk-Data 500 minicomputer. We succeeded technically, but deadlines were
such a joke that the project was pulled around the time that the first
(emulated) version was about to ship. :-(

But I didn't know there was a C-compiler for it.
 

David Kastrup

Claudio Puviani said:
Doh. I'm not arguing against finding a better algorithm. I'm saying
that for a given algorithm, including the most efficient one,
there's an additional advantage to be had by running in a more
efficient environment. No one, least of all I, ever said that you
select a faster environment to the detriment of a better
algorithm. Read what I actually write.

One problem to be noted is that the amount of complexity a human can
handle is limited. A language in which an efficient algorithm can be
most conveniently expressed will make the programmer tend to produce
efficient code.

If I can manage to produce an O(n^1.3 lg n) algorithm based on some
weird recursive arithmetic in modular rings in Lisp, but will fail the
same task in C before I finish it because I get drawn into complicated
storage size and index calculations for the intermediate results, the
program will run faster if I write in Lisp. Even though it can then
be automatically or manually translated into C. Because writing in C
I would never get to finish the algorithm, having my brain falling
apart before that time, and would need to use an O(n^1.6) algorithm
instead.
 

David Kastrup

for (i = 1; i < 99; ++i) {
    c[i] = a[i-1] + b[i+1];
}

Fortran would not allow c, a, and b to point to identical memory
locations.


Nitpick: I thought this was only disallowed when c, a, and b were
subprogram arguments? i.e., when a compiler couldn't reasonably
figure out whether the arrays overlapped?


So what? All high-performance numerical work is done in libraries.
You don't have highly qualified people open-code such stuff at the
flick of a wrist.
 

Richard Harter

[snip]
One problem to be noted is that the amount of complexity a human can
handle is limited. A language in which an efficient algorithm can be
most conveniently expressed will make the programmer tend to produce
efficient code.

If I can manage to produce an O(n^1.3 lg n) algorithm based on some
weird recursive arithmetic in modular rings in Lisp, but will fail the
same task in C before I finish it because I get drawn into complicated
storage size and index calculations for the intermediate results, the
program will run faster if I write in Lisp. Even though it can then
be automatically or manually translated into C. Because writing in C
I would never get to finish the algorithm, having my brain falling
apart before that time, and would need to use an O(n^1.6) algorithm
instead.

But are you speaking only for yourself re the difficulties of using C
or are you claiming this for all programmers using C?


Richard Harter, (e-mail address removed)
http://home.tiac.net/~cri, http://www.varinoma.com
I used to do things that were good for me because they were fun.
Now I do them because they are good for me. Fun was better.
 

David Kastrup

[snip]
One problem to be noted is that the amount of complexity a human can
handle is limited. A language in which an efficient algorithm can be
most conveniently expressed will make the programmer tend to produce
efficient code.

If I can manage to produce an O(n^1.3 lg n) algorithm based on some
weird recursive arithmetic in modular rings in Lisp, but will fail the
same task in C before I finish it because I get drawn into complicated
storage size and index calculations for the intermediate results, the
program will run faster if I write in Lisp. Even though it can then
be automatically or manually translated into C. Because writing in C
I would never get to finish the algorithm, having my brain falling
apart before that time, and would need to use an O(n^1.6) algorithm
instead.

But are you speaking only for yourself re the difficulties of using
C or are you claiming this for all programmers using C?

I can send you mathematical descriptions of algorithms which I have
finally abandoned trying to cast into C. The mathematics behind
them is trivial, just addition and multiplication. You just drown in
index calculations, storage allocation and so on.

And don't underestimate me: I am able to program complete standalone
projects including operating systems and applications; I have written
BIOSes and systems and target compilers in a number of different
languages including assembly, and I know C by heart.

Again, I can send you the descriptions of the algorithms concerned here
if you want.
 

Camm Maguire

Greetings! Just to follow up, gcl has been providing native
executables for maxima, acl2, and axiom on all 11 Debian platforms,
Mac OS X, Windows, and some BSDs for quite a while now. It is closely
linked to gcc, but much of it is written in Lisp.

Take care,
 

Rahul Jain

Claudio Puviani said:
Doh. I'm not arguing against finding a better algorithm. I'm saying that for a
given algorithm, including the most efficient one, there's an additional
advantage to be had by running in a more efficient environment. No one, least of
all I, ever said that you select a faster environment to the detriment of a
better algorithm. Read what I actually write.

Then you've been ignoring what we've said. By choosing an environment
that allows you to explore your algorithm and its behaviors more easily,
as well as express your desires more abstractly so that the exact
implementation is separated from the interface, you can spend more time
finding a faster algorithm instead of being forced to micro-optimize some
algorithm that you just want to play around with.
 

Claudio Puviani

Rahul Jain said:
Then you've been ignoring what we've said. By choosing an environment
that allows you to explore your algorithm and its behaviors more easily,
as well as express your desires more abstractly so that the exact
implementation is separated from the interface, you can spend more time
finding a faster algorithm instead of being forced to micro-optimize some
algorithm that you just want to play around with.

Finding an algorithm is an analytical exercise that's completely independent
of the programming environment -- and of programming. Unless one doesn't
understand algorithm analysis, there's no need to "explore" the behavior. In
those sadly too numerous cases where the programmer doesn't understand
algorithm analysis, both time and money would be better spent sending the
individual to an algorithmics class at the local university than on giving
that person a tool that lets them experiment. Let me repeat this: you do not find better
algorithms more easily with one language than you do with another.

Claudio Puviani
 

Willem

Claudio wrote:
) Let me repeat this: you do not find better
) algorithms more easily with one language than you do with another.

I beg to differ, and I suggest the language PROLOG as proof.


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
 
