Empty list as default parameter

  • Thread starter Alex Panayotopoulos

Jay O'Connor

Bengt said:
What languages are you thinking of? A concrete example for comparison would
clarify things.

One example that had not occurred to me until reading this thread has to
do with how Smalltalk methods are built. Since Smalltalk doesn't allow
default parameters, one very common idiom is for a developer to create a
sort of 'cascade' of methods. The full method requires all the parameters,
but one or more shorter methods covering the common ways of calling it are
also provided; these just turn around and call the main method, supplying
defaults for the parameters that were not specified.

An example:

Canvas>>drawShape: aShape color: aColor pen: aPen

    "Do all the drawing in this method."
    ...

Now, if the developer decides that the color or pen are often going to
be pretty standard, he may provide some 'convenience methods' that other
developers can call, which in turn just call the main method with defaults.

Canvas>>drawShape: aShape

    "Use default color of black and a solid pen"
    ^self drawShape: aShape color: #black pen: #solid

Canvas>>drawShape: aShape color: aColor

    ^self drawShape: aShape color: aColor pen: #solid

Canvas>>drawShape: aShape pen: aPen

    ^self drawShape: aShape color: #black pen: aPen

This can be a bit awkward to write sometimes, but it makes life pretty
nice for the other developers. Which convenience methods get provided is
not fixed; it is just based on what the developer thinks are going to be
likely patterns of use.

What came to mind, of course, is that this allows the defaults to be
dynamic.

Canvas>>drawShape: aShape

    ^self drawShape: aShape color: self currentColor pen: self currentPen

You are still providing defaults, but the defaults are based on the
current state of the system at execution time, not at compile time. This
is actually a fairly common idiom in Smalltalk, and Smalltalk's rather
unusual mechanism for method signatures happens to support this
approach well.
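
For comparison, here is a rough Python sketch of the same idiom; the Canvas
class and its attribute names are made up for illustration, with None serving
as a "not supplied" marker so the real defaults come from the object's state
at call time:

=====================
class Canvas:
    def __init__(self):
        self.current_color = "black"
        self.current_pen = "solid"

    def draw_shape(self, shape, color=None, pen=None):
        # None means "not supplied": fall back to the canvas's current
        # state at the moment of the call, like the Smalltalk version.
        if color is None:
            color = self.current_color
        if pen is None:
            pen = self.current_pen
        return "drawing %s in %s with a %s pen" % (shape, color, pen)

c = Canvas()
print(c.draw_shape("circle"))        # uses the current defaults
c.current_color = "red"
print(c.draw_shape("circle"))        # the default reflects the new state
=====================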
 

Paul Rubin

What languages are you thinking of? A concrete example for
comparison would clarify things.

Common Lisp is the most obvious one.
Would you have default expressions effectively passed in as the
bodies of lambdas (which might mean creating closures, depending on
what was referenced) and then executed to create the local bindings
prior to the first line in a function or method? It would certainly
be inefficient for all the cases where you just wanted a static
default ...

The compiler can do the obvious things to make efficient code in the
normal cases.
(unless you special cased those to work as now -- but
remember, only bare names and constant literals could be special
cased that way. An expression like os.RD_ONLY (yes that is an
expression!) would have to be passed as lambda: os.RD_ONLY). So you'd
have to counter that by making bare-name bindings prior to calls,
like tmp_mode=os.RD_ONLY; os.open('foo.txt', tmp_mode); #etc

Rather than have the programmer go through such contortions, it's better
to fix the compiler to generate the obvious code inline, and then rely
on the compiler to get these things right.
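
For concreteness, a rough sketch of what the "default expression passed as a
lambda" idea looks like when written out by hand today; open_for_read is an
invented name, the constant is spelled os.O_RDONLY (its actual name in the os
module), and this is an emulation, not an existing or proposed language feature:

=====================
import os

# The default is a zero-argument lambda; the body evaluates it at call
# time, so the expression os.O_RDONLY is looked up on each call rather
# than once when the def statement runs.
def open_for_read(path, mode=lambda: os.O_RDONLY):
    if callable(mode):
        mode = mode()          # evaluate the deferred default now
    return os.open(path, mode)

# usage (assuming 'foo.txt' exists):
# fd = open_for_read('foo.txt')
=====================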
 

Peter Hansen

Stian said:
* Robin Munn spake thusly:

Wouldn't it be more logical for a programmer that x should evaluate
to '3' inside f()?

I can't see what is the purpose of binding default variables at
definition time instead of runtime.

Purpose? Who needs a purpose? ... "def" is a *statement* in Python,
so naturally the code in the argument list is most easily handled
at definition time, when the def statement is being executed, rather
than at run time.
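
A small demonstration of the definition-time evaluation being described
(x and f are just illustrative names):

=====================
x = 3

def f(arg=x):      # x is evaluated here, while the def statement executes
    return arg

x = 5
print(f())         # still prints 3; the later rebinding of x is not seen
=====================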

It also means that the default arguments don't have to be evaluated
dynamically each time the function is called, which would in some
cases be a performance nightmare...

-Peter
 

Bengt Richter

Common Lisp is the most obvious one.
You are referring to initforms, I assume. I wonder how often macros are used
IRL to defeat the deferral and plug in pre-computed static default values that
the compiler can't infer its way to at compile time?
The compiler can do the obvious things to make efficient code in the
normal cases.


Rather than have the programmer go through such contortions, it's better
to fix the compiler to generate the obvious code inline, and then rely
on the compiler to get these things right.
I wasn't really advocating programmer contortions ;-)
But the trouble is that the compiler can't guess what you _mean_, except
for the obvious cases of bare names and constant literals, so otherwise you have
to code explicitly in any case. E.g., is it obvious that getattr(os, 'RD_ONLY')
should be done at call time or optimized away to def time in os.open('foo.txt', os.RD_ONLY)?
I don't think you can optimize it away without telling the compiler one way or another,
or changing the dynamic nature of the language.

In any case it would be a semantic change, and I'd hate to have the job of finding breakage ;-)

Regards,
Bengt Richter
 

Bengt Richter

On 23 Nov 2003 16:33:37 GMT, (e-mail address removed) (Bengt Richter) wrote:
[...]
to code explicitly in any case. E.g., is it obvious that getattr(os, 'RD_ONLY')
should be done at call time or optimized away to def time in os.open('foo.txt', os.RD_ONLY)?
I meant during os.open('foo.txt'), assuming def open(...) had a default mode expressed as an attribute
expression. But that's bad as a real example, so please assume a customized file opener that uses
os.open and has a default mode parameter of os.RD_ONLY ;-)
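
A minimal sketch of such a customized opener; my_open is an invented name,
and the actual constant in the os module is os.O_RDONLY (there is no
os.RD_ONLY):

=====================
import os

def my_open(path, mode=os.O_RDONLY):   # attribute looked up once, at def time
    return os.open(path, mode)

# Rebinding os.O_RDONLY afterwards would not change my_open's default,
# which is exactly the def-time versus call-time distinction at issue.
=====================
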
I don't think you can optimize it away without telling the compiler one way or another,
or changing the dynamic nature of the language.

In any case it would be a semantic change, and I'd hate to have the job of finding breakage ;-)
I'll leave that as is ;-)

Regards,
Bengt Richter
 

Paul Rubin

You are referring to initforms, I assume. I wonder how often macros
are used IRL to defeat the deferral and plug in pre-computed static
default values that the compiler can't infer its way to at compile time?

I can't think of any times I ever did that, but I've never been a real
hardcore CL hacker.
But the trouble is that the compiler can't guess what you _mean_,
except for the obvious cases of bare names and constant literals, so
otherwise you have to code explicitly in any case. E.g., is it
obvious that getattr(os, 'RD_ONLY') should be done at call time or
optimized away to def time in os.open('foo.txt', os.RD_ONLY) ? I
don't think you can optimize it away without telling the compiler
one way or another, or changing the dynamic nature of the language.

I think some of that dynamicness should be toned down as Python matures.
In any case it would be a semantic change, and I'd hate to have the
job of finding breakage ;-)

Better to do it sooner then, so that there's less stuff to break ;-).
 

Jay O'Connor

Paul said:
(e-mail address removed) (Bengt Richter) writes:



I think some of that dynamicness should be toned down as Python matures.

One aspect of Python's dynamic nature that has always intrigued me is how
it handles instance variables. Python allows you to dynamically add
instance variables as you need them, which is pretty cool, but it seems to
require a lookup for every instance variable reference, which can be
pretty slow.

Consider the following test case, using Python 2.3 on Win95 in IDLE:
=====================
class A:
    def __init__(self, val=1):
        self.value = val

    def setval(self, v):
        self.value = v

    def getval(self):
        return self.value

    def test(self):
        for x in range(1, 1000000):
            self.value = 1
            y = self.value

import time
a = A()
print "start"
t1 = time.time()
a.test()
t2 = time.time()
print t2 - t1
print "done"
=====================

This routinely returns values of over 6.3 seconds.

Switching the implementation of test() to use the accessor methods kicked
the times up to over 20 seconds.
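
For reference, the accessor-based variant of test() referred to here might
look like the following sketch (test_accessors is an invented name, and it
assumes the class A from the listing above):

=====================
def test_accessors(self):
    for x in range(1, 1000000):
        self.setval(1)
        y = self.getval()

A.test_accessors = test_accessors   # attach it to the class defined earlier
=====================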

By contrast, the equivalent Smalltalk* code (VisualWorks 7.1) on the same
machine gave me consistent results of a little over 600 milliseconds
when using direct variable access, and roughly 1.5 seconds when using
accessors.

I think a portion of the difference is that in Smalltalk, instance
variables are specified in the class metadata (in terms of ordering)
when the class is compiled, so an instance variable reference is
just a pointer offset from the start of the object data in memory, and
the compiler can optimize instance variable references by compiling
the offset directly into the method code.


This is a case where Python is more dynamic than Smalltalk, in that
Python allows easy addition of instance variables to objects, but it
comes at a price in terms of performance. The Smalltalk solution to
adding instance variables dynamically is just to carry around a
dictionary (similar to Python), but most experienced Smalltalkers know
that "instance variables are much faster than dictionary lookups, so
consider whether this is really the right design and consider the
performance/flexibility tradeoff".
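
To make the dictionary point concrete: a Python instance normally keeps its
attributes in __dict__, so each attribute reference is effectively a
dictionary lookup (reusing the class A from the timing example above):

=====================
a = A()
print(a.__dict__)               # {'value': 1}
a.extra = 42                    # adding an attribute just adds a dict entry
print(a.__dict__['extra'])      # 42 -- the same data the dotted access reads
=====================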

This is one case where I think Python's flexibility hurts it in the long
run, and perhaps as it goes forward it will adapt to a more rigid style
that provides better performance without too much of a tradeoff.
After all, Python is still very young; Smalltalk has a twenty-year head
start on it.
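
One mechanism Python already offers in that direction is __slots__ on
new-style classes, which trades away dynamic attribute addition for a fixed
attribute layout; a minimal sketch (class B is illustrative):

=====================
class B(object):
    __slots__ = ('value',)      # fixed set of instance variables, no __dict__

    def __init__(self, val=1):
        self.value = val

b = B()
b.value = 2                     # fine: 'value' is declared in __slots__
# b.extra = 3                   # would raise AttributeError
=====================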


*Smalltalk code
======================

A>>test

    | y |

    10000000 timesRepeat: [
        self value: 1.
        y := self value ]
-----------
| a |

a := A new.
Time millisecondsToRun: [a test].
======================
 
