@decorator syntax is sugar, but for what exactly?


Bengt Richter

ISTM that
@limited_expression_producing_function
@another
def func(): pass

is syntactic sugar for creating a hidden list of functions. (Using '|' in place of '@'
doesn't change the picture much (except for people whose tools depend on '@' ;-)).

I.e., (not having the source or time to delve) the apparent semantics of the above
is something roughly like

__funclist__ = []
__funclist__.append(limited_expression_producing_function)
__funclist__.append(another)
def func():pass
while __funclist__: func = __funclist__.pop()(func)
del __funclist__
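In current Python that reading checks out directly; a quick sketch (the decorator bodies below are illustrative stand-ins):

```python
def limited_expression_producing_function(f):
    def wrapped():
        return ('outer', f())
    return wrapped

def another(f):
    def wrapped():
        return ('inner', f())
    return wrapped

@limited_expression_producing_function
@another
def func():
    return 'base'

# The manual expansion from the post:
__funclist__ = [limited_expression_producing_function, another]
def func2():
    return 'base'
while __funclist__:
    func2 = __funclist__.pop()(func2)

assert func() == func2() == ('outer', ('inner', 'base'))
```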

Is this a special case of a more general idea? E.g., could it apply
right after ANY name is bound, in general, not just a name bound by def?

thus (untested)

def censor(cls):
    cls.__repr__ = lambda self: '<CENSORED>'
    return cls
...
@censor
class C(object): pass

could have the expected effect (after metaclass effects, if any, presumably, BTW)
(note that censor could instead e.g. wrap selected methods or add class variable data etc.,
though IIRC __metaclass__ can create some things that are read-only later)
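For what it's worth, the answer turned out to be yes: class decorators were added in a later Python (2.6), and in the meantime the same effect is available as a plain call, C = censor(C):

```python
def censor(cls):
    # Replace the class's repr wholesale; returns the same class object.
    cls.__repr__ = lambda self: '<CENSORED>'
    return cls

@censor   # legal on classes in Python 2.6+; in 2.4, write C = censor(C)
class C(object):
    pass

assert repr(C()) == '<CENSORED>'
```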

This is still very narrowly defined by prefix context. Is this context also
a special case default of something more general? IOW the default choice for
namespace is the lexically enclosing one. What about, e.g., being able to specify
decoration in one place at the top of a module and decorate (in the same way, using
the same function list) all methods of a specified (by name) list of classes?

I.e., a more general control over what to do when what names are bound in what namespace
could be envisaged. This begins to feel like event-driven processing. Could @deco1 @deco2
be sugar for the special case (don't take this literally, just illustrating semantics ;-)

when(event='next_binding', namespace='immediate', symbols='any', funclist=(deco1,deco2))
def foo(): pass

of something general that could also include other and possibly repeated events like on completion
of an arg list for a particular function or method being called or any rebinding of particular
symbols in a specified namespace (not just the immediate one), e.g., for debugging or profiling etc.

Since I am copying this to python-dev, I'll re-ask whether @decorator has any peculiar
thread safety pitfalls that should be warned against, just in case.

Please take this as a probe into the intended semantics, not as a proposal for any particular
functionality ;-)

Regards,
Bengt Richter
 

Tristan Seligmann

@limited_expression_producing_function
@another
def func(): pass

is equivalent to:

def func(): pass
func = limited_expression_producing_function(another(func))
--
mithrandi, i Ainil en-Balandor, a faer Ambar

 

Robert

I have read a few Python blogs and not a one is taking the decorator syntax
in a good way.

The Python of 1.5.2 simplicity will be long gone.
 

Scott David Daniels

Robert said:
I have read a few Python blogs and not a one is taking the decorator syntax
in a good way.
classmethod and staticmethod were introduced some time ago, to provide
a mechanism for getting to such effects. At the time there was no clear
syntax to use that "felt right." In the meantime people have used these
features to good effect despite the clunky way you used them. At this
point there would probably be a small riot (or at least a large whine)
if these two were removed.

When I first came to Python, I was delighted to see that Idle gave me
function clues not only for the system code, but for the code _I_ wrote.
Since it takes discipline to write comments that may never be read, it
was delightful to finally see an immediate reward for doing a little
documentation work -- I could help myself on my own utility functions.

However, there are more and more uses of the docstring for things other
than documentation. At this point, I see more and more docstring uses
as "cybercrud" -- The only easy function annotation is the docstring so
all ambitious program annotation schemes use it. There are good reasons
for wanting to be able to annotate functions and methods, but precious
few good reasons for polluting the document strings in order to do so.

Decorators are a way out. I don't know about the syntax, it looked bad
when I first saw it, but like some others, I welcome _almost_any_syntax_
for decorators. Not, so much, because I want to use decorators. I just
want others to stop using docstrings for non-documentation purposes. In
fact, I don't even really like the unittest convention of using method
names to identify test methods -- I prefer a language where what you
call a thing does not affect how it works.
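A hypothetical sketch of the kind of annotation Scott means -- metadata hung on a function attribute, leaving __doc__ alone (annotate and meta are invented names, not any real library's API):

```python
def annotate(**meta):
    """Attach metadata to a function without touching its docstring."""
    def deco(f):
        f.meta = dict(meta)   # hypothetical attribute name
        return f
    return deco

@annotate(author='scott', is_test=False)
def util():
    """Real documentation stays here."""
    return 42

assert util.meta['author'] == 'scott'
assert util.__doc__ == "Real documentation stays here."
```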
The Python of 1.5.2 simplicity will be long gone.
Well, there was a lot to like even then, but I'd hate to give up what
we now have -- you can do things in a much more functional style now,
with nested scopes. Would you really like to go back to three-strikes
and you are out symbol lookup? This language has changed at a good
pace, and (quite surprisingly) slowly to a more consistent model (a
major coup in language design). The aesthetic that has driven that
change suggests that this might be the right syntax for decorators.
I know I wouldn't have done nearly as well as Guido and gang at making
language decisions.

I certainly intend to take this alpha period as a time to experiment
with decorators and see if the syntax grows on me; I suggest others
do the same. Give it its best chance; you may become a fan. Several
of the py-dev people claim they have already gone from distaste to
support.
 

Anthony Baxter

I have read a few Python blogs and not a one is taking the decorator syntax
in a good way.

Most of the posts I read seemed to be from people having a visceral
response to the syntax. I think it's fair to say that many of the people
complaining about the syntax have not actually downloaded the alpha
and tried out the new decorators in actual code.
The Python of 1.5.2 simplicity will be long gone.

The "Python of 1.5.2 simplicity" is long, long gone. I don't agree that
newer Pythons are somehow worse because new things have been
added. A short list:

new style classes
foo(*arg, **kwarg)
iterators
generators
list comprehensions

In many cases, these new features actually lead to smaller, simpler
code. I challenge _anyone_ to tell me that
apply(func, args, kwargs)
is better than
func(*args, **kwargs)
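The two spellings are interchangeable (and in Python 3 only the second survives; apply() is gone):

```python
def func(a, b, key=None):
    return (a, b, key)

args = (1, 2)
kwargs = {'key': 'k'}

# apply(func, args, kwargs), spelled the modern way:
result = func(*args, **kwargs)
assert result == (1, 2, 'k')
```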
 

Martin v. Löwis

Anthony said:
The "Python of 1.5.2 simplicity" is long, long gone. I don't agree that
newer Pythons are somehow worse because new things have been
added. A short list:

new style classes
foo(*arg, **kwarg)
iterators
generators
list comprehensions

My favourite one: string methods.

Regards,
Martin
 

Dan Sommers

In many cases, these new features actually lead to smaller, simpler
code. I challenge _anyone_ to tell me that
apply(func, args, kwargs)
is better than
func(*args, **kwargs)

Okay, I will. The old way is better than the new way.

Explicit is better than implicit, after all.

Given your second example, is func the name of a function or an object
that references a function? Can I grep my source code to find a
function named func? (Okay, I'll find the variable named func, so at
least I'll have some clue as to what's going on.) Is there a "from
module import func" statement in sight? What happens when I look up
"func" in the Python documentation to see if it's a built in name?

C did the same thing recently.

Old C: (*f)( argument );

New C: f( argument );

With the old syntax, I knew immediately that f was a pointer to a
function and that the function to which it pointed was being called
indirectly. With the new one, I have to track down a definition or
declaration of f to see that. Yes, this used to be much more important
than it is now. Perhaps my years of assembly language applications in
extremely tight situations are showing through.

Yes, I know that Python functions are first class objects and should be
treated as such. Yes, I know that Python is not C. But with function
application being built right into the syntax of the language (i.e., an
identifier followed by a parenthesized argument list), I would like to
see a visual difference between calling a named function vs. calling a
function indirectly through some sort of reference. Yes, I also know
that in most cases "it's obvious from context," but with the old way, it
was obvious without any context and evident to the simplest of automated
tools.

That all said, when I use the new syntax (okay, you got me, I do use the
new syntax on occasion), I always leave a comment nearby to explain
what's going on. I agree: the *code* is smaller and arguably simpler,
but the *program* and the *software* are not.

Regards,
Dan
 

Duncan Booth

Okay, I will. The old way is better than the new way.

Explicit is better than implicit, after all.

Given your second example, is func the name of a function or an object
that references a function? Can I grep my source code to find a
function named func? (Okay, I'll find the variable named func, so at
least I'll have some clue as to what's going on.) Is there a "from
module import func" statement in sight? What happens when I look up
"func" in the Python documentation to see if it's a built in name?

Why do you ask this about the second form only? func is a name that
references a function in both cases. It doesn't matter whether that name
was assigned directly with a def statement, or is the result of a
subsequent binding.
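Duncan's point in miniature: however the binding happened, the name just references a function object:

```python
def greet():
    return 'hi'

shout = greet   # a subsequent binding, not a def

# Both names reference the same function object and call identically.
assert shout is greet
assert shout() == greet() == 'hi'
```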
C did the same thing recently.

Old C: (*f)( argument );

New C: f( argument );

With the old syntax, I knew immediately that f was a pointer to a
function and that the function to which it pointed was being called
indirectly. With the new one, I have to track down a definition or
declaration of f to see that.

For 'recently' read 1987. In fact most C compilers probably implemented
this feature before the ISO standard came out, but it was a feature of the
first standardised version of C. I remember around that time being
extremely glad that I could finally omit those extraneous parentheses.
 

Roy Smith

Anthony Baxter said:
The "Python of 1.5.2 simplicity" is long, long gone. I don't agree that
newer Pythons are somehow worse because new things have been
added. A short list:

new style classes
foo(*arg, **kwarg)
iterators
generators
list comprehensions

Perhaps I'm just a luddite, but I don't actually use most of those
features. I have started playing around with iterators/generators, and
find them very cool.

The single biggest improvement I see in the language since 1.5.2 is
string methods! After that, maybe augmented assignments (or whatever
you call them; the ability to write "x += 1").

Most of the big improvements I've seen are in the library. When did
unittest get added? I can't live without unittest. I like the logging
module, even if I think it's about twice as complicated as it should be.

A lot of my own personal growth in how I use the language is discovering
modules which, while not new to the language, are new to me because I'd
never noticed them before.

Speaking of libraries, Dan Bishop posted some interesting examples of
@memoize and @printreturns utility wrappers. This leads me to think
that a good way to leverage the idea of decorators would be a module of
common utility functions which could be used as decorators by anybody.
I'll call the module martha (since it supplies things used for
decorating). Does the proposed mechanism support something like (to use
one of Dan's examples, written with two different syntaxen):

import martha

@martha.memoize
def fibonacci(n):
    if n in (0, 1):
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

def fibonacci(n):
    @martha.memoize
    if n in (0, 1):
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)
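Only the first spelling fits the proposed syntax (decorators attach to a def, not to arbitrary statements). A minimal memoize of the sort a hypothetical martha module might export:

```python
import functools

def memoize(f):
    """Cache results by positional-argument tuple (a sketch)."""
    cache = {}
    @functools.wraps(f)
    def wrapper(*args):
        if args not in cache:
            cache[args] = f(*args)
        return cache[args]
    return wrapper

@memoize
def fibonacci(n):
    if n in (0, 1):
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

# Without memoization this call would take exponential time.
assert fibonacci(30) == 832040
```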

Some of these things might even be usefully re-written in C for improved
performance.

I'm even wondering if somehow decorators could be used for i18n? The
obvious problem there is that print is a statement, not a function, and you
can't decorate statements (or can you???).
 

Anthony Baxter

Perhaps I'm just a luddite, but I don't actually use most of those
features. I have started playing around with iterators/generators, and
find them very cool.

Fair enough. It usually took me a while to start using a new feature,
but the ones I listed I find utterly invaluable - to the point where coding
against Python 2.1 I find annoying. I'd hate to have to code against
Python 1.5.2, now that I'm used to having all the new tools.
The single biggest improvement I see in the language since 1.5.2 is
string methods! After that, maybe augmented assignments (or whatever
you call them; the ability to write "x += 1").

Good points on both - I forgot them in my list, but use them all the time.
Most of the big improvements I've seen are in the library. When did
unittest get added? I can't live without unittest. I like the logging
module, even if I think it's about twice as complicated as it should be.

There have been an enormous number of improvements to the stdlib,
true, but we're talking about changes to the language in this case.
Speaking of libraries, Dan Bishop posted some interesting examples of
@memoize and @printreturns utility wrappers. This leads me to think
that a good way to leverage the idea of decorators would be a module of
common utility functions which could be used as decorators by anybody.
I'll call the module martha (since it supplies things used for
decorating). Does the proposed mechanism support something like (to use
one of Dan's examples, written with two different syntaxen):

That might happen, but it's unlikely for 2.4. Give people time to use the new
tools, find out what works well and not so well, and standardise those.
I'm even wondering if somehow decorators could be used for i18n? The
obvious problem there is that print is a statement, not a function, and you
can't decorate statements (or can you???).

Huh? You can only decorate functions or methods. These can contain
statements or expressions. And no, you can't decorate lambda. That
would be an obscenity wrapped in a heresy and smacked with the
ugly stick.

Anthony
 

Avner Ben

Scott said:
...
classmethod and staticmethod were introduced some time ago, to provide
a mechanism for getting to such effects. At the time there was no clear
syntax to use that "felt right." In the meantime people have used these
features to good effect despite the clunky way you used them. At this
point there would probably be a small riot (or at least a large whine)
if these two were removed.
...

The "property" call resembles classmethod, staticmethod and
instancemethod, but cannot be eliminated using the new function
decorator syntax, because of its m:1 nature - one property binds
together a getter, a setter etc., where staticmethod etc. change the
status of one function in one way.

So, if the problem is to rid class definitions of bizarre function
calls, stuck in the middle of nowhere, that actually add to the
structure of the class (and which other OO languages solve by legitimate
syntax), I am disappointed to observe that function decorators do not
do a complete job after all.

Talking about properties, I like the C# way of defining them, which is
straightforward and readable. The property begins like a method, but has
no argument list and includes a getter function with no arguments and a
setter function with one argument. Adapted to Python, it would look
something like:

class hasProperty:
    def __init__(self, aProperty=''):
        self.aProperty = aProperty
    def AProperty:
        def get(self):
            return self.aProperty
        def set(self, value):
            self.aProperty = value

obj = hasProperty()
obj.AProperty = 'test'
print obj.AProperty
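For comparison, the property() call Avner wants to avoid -- one call binding getter and setter together, the m:1 shape that per-function decorators don't cover (class and method names here are illustrative):

```python
class HasProperty(object):
    def __init__(self, value=''):
        self._value = value
    def _get(self):
        return self._value
    def _set(self, value):
        self._value = value
    # One property() call binds the getter and setter together --
    # the "m:1" shape described above.
    AProperty = property(_get, _set)

obj = HasProperty()
obj.AProperty = 'test'
assert obj.AProperty == 'test'
```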
 

Roy Smith

Speaking of libraries, Dan Bishop posted some interesting examples of
@memoize and @printreturns utility wrappers. This leads me to think
that a good way to leverage the idea of decorators would be a module of
common utility functions which could be used as decorators by anybody.
I'll call the module martha (since it supplies things used for
decorating). Does the proposed mechanism support something like (to use
one of Dan's examples, written with two different syntaxen):

That might happen, but it's unlikely for 2.4. Give people time to use the new
tools, find out what works well and not so well, and standardise those.

I think you missed the gist of my question, which is probably my fault
for wrapping it up (decorating it?) with a lot of other peripheral
points.

The key question is whether the decorator mechanism would allow such a
thing? All of the examples I've seen have the decorator defined right
before it's used, and having a simple name. I'm guessing that the real
syntax is @<callable>, and that any expression that evaluates to a
callable object is kosher after the "@"? So, any of the following would
be syntactically correct:

---------------------

import martha
@martha.memoize
def foo (x):
    return x

---------------------

# don't know why you would want to do this, but I'm
# exploring the edges of the envelope.

def decorator1 (f):
    return f

def decorator2 (f):
    return f

decorators = [decorator1, decorator2]

def getDecorator (i):
    return decorators[i]

@decorators[0]
@getDecorator (1)
def myFunction ():
    pass


---------------------
 

Andrew Durdin

The key question is whether the decorator mechanism would allow such a
thing? All of the examples I've seen have the decorator defined right
before it's used, and having a simple name. I'm guessing that the real
syntax is @<callable>, and that any expression that evaluates to a
callable object is kosher after the "@"?

There was some discussion about this on python-dev, and the BDFL's
conclusion was that arbitrary expressions were not allowed, but only
dotted names (with optional parentheses), i.e.:

@decorator
@module_or_object.decorator
@func_returning_decorator(args)
@module_or_object.func_returning_decorator(args)

(With presumably multiple dotted levels allowed, e.g. module.object.decorator)

See http://www.python.org/dev/doc/devel/ref/function.html for the
grammar definition.
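Each of the allowed forms is easy to exercise. In this sketch functools.lru_cache stands in for a real dotted decorator name, and func_returning_decorator is an invented example:

```python
import functools

def func_returning_decorator(tag):
    # A function that, when called, returns the actual decorator.
    def deco(f):
        f.tag = tag
        return f
    return deco

@functools.lru_cache(maxsize=None)   # dotted name, with a call
def square(n):
    return n * n

@func_returning_decorator('demo')    # plain name, with a call
def g():
    return 1

assert square(4) == 16
assert g.tag == 'demo'
```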
 

Roy Smith

Andrew Durdin said:
There was some discussion about this on python-dev, and the BDFL's
conclusion was that arbitrary expressions were not allowed, but only
dotted names (with optional parentheses), i.e.:

@decorator
@module_or_object.decorator
@func_returning_decorator(args)
@module_or_object.func_returning_decorator(args)

(With presumably multiple dotted levels allowed, e.g. module.object.decorator)

See http://www.python.org/dev/doc/devel/ref/function.html for the
grammar definition.

Wow, I'm glad I asked. That answer is quite surprising, and somewhat
disconcerting. Lots of special cases going on here, which is bad.

BTW, I don't understand one of the examples in the grammar. It says:
If there are multiple decorators, they are applied in reverse order. For
example, the following code:

@f1
@f2
def func(): pass

is equivalent to:

def func(): pass
func = f2(f1(func))

I don't see why that's described as "reverse order". To my eye,
they're applied in the order they're specified. First you apply f1,
then you apply f2. The code above is the same as:

func = f1 (func)
func = f2 (func)

Reverse order to me would imply:

func = f1(f2(func))

-or-

func = f2 (func)
func = f1 (func)

i.e. you apply the last one first, as if you had pushed the decorators
onto a stack and processed them by popping the stack.
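Recording decorators settle it empirically. In released Pythons the expansion is func = f1(f2(func)) -- the decorator nearest the def runs first -- so the alpha documentation's f2(f1(func)) example appears to have the names swapped:

```python
order = []

def f1(f):
    order.append('f1')
    return f

def f2(f):
    order.append('f2')
    return f

@f1
@f2
def func():
    pass

# f2, the decorator nearest the def, is applied first.
assert order == ['f2', 'f1']
```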

I'm not arguing that the semantics should be changed, but that using the
phrase "reverse order" to describe the semantics is confusing.
 

Andrew Durdin

Wow, I'm glad I asked. That answer is quite surprising, and somewhat
disconcerting. Lots of special cases going on here, which is bad.

Well, I think I agree with the principle of limiting the expressions
to dotted names. Some of the possibilities if you allow arbitrary
expressions are pretty hairy:

"""
Things someone might want to do, ordered roughly from most reasonable
to least reasonable ;)
@foo().bar()
@foo or bar
@mydecorators['foo']
@lambda f: foo(f) or bar(f)
"""
(from http://mail.python.org/pipermail/python-dev/2004-August/046673.html)
 

Roy Smith

Andrew Durdin said:
Wow, I'm glad I asked. That answer is quite surprising, and somewhat
disconcerting. Lots of special cases going on here, which is bad.

Well, I think I agree with the principle of limiting the expressions
to dotted names. Some of the possibilities if you allow arbitrary
expressions are pretty hairy:

"""
Things someone might want to do, ordered roughly from most reasonable
to least reasonable ;)
@foo().bar()
@foo or bar
@mydecorators['foo']
@lambda f: foo(f) or bar(f)
"""
(from http://mail.python.org/pipermail/python-dev/2004-August/046673.html)

You can always do lots of hairy things all over the place. Shall we
limit functions to no more than 4 arguments? Assignment statements to
no more than 5 operators? Function calls to nesting no more than 3
deep? I'm not in favor of hair, but the way to contain it is with good
style and practice, not with arbitrary constraints in the grammar.

The most simple thing is the most general thing. The thing after the @
gets called, therefore it must be a callable object. I don't see any
need to specify anything further.

It's not good style to crassly split infinitives either, but it sure
would be annoying if my news client prevented me from doing it.
 

Dan Sommers

Why do you ask this about the second form only? func is a name that
references a function in both cases. It doesn't matter whether that
name was assigned directly with a def statement, or is the result of a
subsequent binding.

Technically, that is correct. Optimally, I can treat any application of
any function like a black box that takes some inputs and returns a value
(and possibly changes the inputs or has other (desirable) side effects,
but that's another topic entirely). I shouldn't have to care where it
is, or who defined it, or how they defined it, or whether it's always
the same function throughout the execution of the program, as long as it
does what it's supposed to do when I invoke it. Maybe that's my
problem. I want to know a little more when I'm looking at code that
calls a function.

When I see f( x ), I think that f is a function bound by def (or an
extremely close relative, such as class or staticmethod), and that I can
grep for it somewhere, either in the source code or the library
reference. I also think, rightly or wrongly, that f *was* bound,
*before* I needed it, and *does not change over time*.

When I see apply( f, x ), I think that f varies over time, and is some
sort of a callback or plugin or whatever, and is *not* the name of an
actual function bound by def. I also accept (and expect) that a
different function may be called the next time this particular code
executes; those functions have their own names in their own contexts and
namespaces.

Yes, I know that all functions are free to redefine anything, but
functions that do are usually labelled pathological and then blamed for
the lack of optimizing python compilers.

ObDecorator Question: Is there some decorator that specifies that a
function is not knowingly rebound within its own namespace, such that
f( x ) now means the same as f( x ) later, assuming that x is the same? What about
at least that f is the same (and need not be looked up again), even if x
isn't? This question lurks in the minds of every potential Python
optimizing compiler author.

If the two forms of function application are truly interchangeable, then
why do we have both of them in the first place?

[snip my reasons for not liking that bit of New C]
For 'recently' read 1987. In fact most C compilers probably
implemented this feature before the ISO standard came out, but it was
a feature of the first standardised version of C. I remember around
that time being extremely glad that I could finally omit those
extraneous parentheses.

I must be getting old. :-/

My Second Edition K&R confirms that you are correct on all counts except
that bit about you being happy, but I'll take your word for that <wink>.
It seems more recent than that, or it might be that I spent an awful lot
of time in environments (office/political and target system) where the
difference remained important for design, documentation, extensibility,
performance, resource usage, testing, and/or debugging purposes.

Regards,
Dan
 

Mark Bottjer

Avner said:
> So, if the problem is to rid class definitions of bizarre function
> calls, stuck in the middle of nowhere, that actually add to the
> structure of the class (and which other OO languages solve by
> legitimate syntax), I am disappointed to observe that function
> decorators do not do a complete job after all.

While true, I find this less of a disappointment than you. There are two
large differences between class/staticmethods and properties: methods
act like functions, while properties act like variables; and properties
have multiple associated code blocks (the getter and setter), while
methods have only one. @decorator as designed applies only to functions,
and is simply inappropriate for properties. Disappointing, perhaps, but
not surprising--variables and functions *are* fundamentally different.
> Talking about properties, I like the C# way of defining them, which
> is straightforward and readable. The property begins like a method,
> but has no argument list and includes a getter function with no
> arguments and a setter function with one argument. Adapted to Python,
> it would look something like:
>
> class hasProperty:
>     def __init__(self, aProperty=''):
>         self.aProperty = aProperty
>     def AProperty:
>         def get(self):
>             return self.aProperty
>         def set(self, value):
>             self.aProperty = value
>
> obj = hasProperty()
> obj.AProperty = 'test'
> print obj.AProperty

I, personally, don't like the idea of overloading def in this way. To
me, 'def' defines something that looks and acts like a function, just
like 'class' defines something that looks and acts like a class, or
'while' defines something which looks and acts like a loop. AProperty
does not act like a function, so using def would be misleading.

I agree, though, that the best way to handle properties is through some
sort of extended syntax. After all, we already have *a* way of doing it,
we just don't like how it looks. To fix this, we need to compress the
various pieces (name, storage, getter, setter, etc.) into a single
declarative construct. (In fact, this is a large part of my problem with
the proposed @ syntax: it isn't part of the function it modifies, but
rather some sort of odd prefix. OTOH, the prefix notation would work
with variables as well as functions, which is not possible with an infix
notation: @global; x=10. Whether or not this is actually of any benefit
is left as an exercise for the reader.)

I've thought for a while now that Python is skirting a breakthrough in
how it treats statements. Python took a step in the right direction with
generalizing iteration via generators, and I see utility in generalizing
the idea of statements as well. In particular, what I'd like to see is
the ability to define new *types* of statement.

Consider again the classmethod, staticmethod, and property. If these
were defined as extended statements, we could code them more naturally:

class C:
    c = 'Hi there!'

    class_def cm( c, a, b): # class method
        return 'C(%r).cm( %r, %r) -> %r' % (id(c), a, b, c.c)

    static_def sm( a, b): # static method
        return 'C.sm( %r, %r) -> %r' % (a, b, a % b)

    def im( s, a, b): # instance method
        return 'C(%r).im( %r, %r) -> %r' % (id(s), a, b, (a % b) * s.a)

    def __init__( s, a): # constructor
        s.a.__init__( a)

    property a:
        def __init__( s, v):
            __set__( s, v)
        def __get__( s):
            print 'C(%r).a -> %r' % (s, s.__a)
            return s.__a
        def __set__( s, v):
            print 'C(%r).a <- %r' % (s, v)
            s.__a = v
C(<C instance at 0x98765432>).a -> 3.1415
3.1415

In the above, 'class' is as we know them. 'class_def' and 'static_def'
are statements similar to 'def', but define a class method or static
method, respectively. Finally, 'property' is a statement which defines a
new property, and takes a code block expected to contain functions with
predefined names. Each of these could be implemented natively, or the
way we would do it manually today.
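For comparison, in the syntax that was actually adopted, the first two of these read almost as directly, while the property still needs the separate m:1 property() call:

```python
class C(object):
    c = 'Hi there!'

    @classmethod
    def cm(cls, a, b):        # class method
        return (cls.c, a, b)

    @staticmethod
    def sm(a, b):             # static method
        return a % b

    def __init__(self, a):    # constructor
        self._a = a

    def _get_a(self):
        return self._a
    def _set_a(self, v):
        self._a = v
    a = property(_get_a, _set_a)   # the m:1 call decorators don't cover

obj = C(3.1415)
assert obj.a == 3.1415
obj.a = 2.71
assert obj.a == 2.71
assert C.cm(1, 2) == ('Hi there!', 1, 2)
assert C.sm(7, 3) == 1
```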

Taking this to the extreme, all existing statement types (class, def,
print, del, etc.) could be recast as these generalized statements. We
could even allow subclassing of existing statements to modify their
behavior (for example, class_def subclassing def to make the new
function a class method). Strong Kung-Fu indeed. Scary, but strong.

Obviously, creating new control constructs is not something we'd want to
do every day, as it can be a great way to obfuscate code beyond all hope
of understanding--but the same is true of meta-classes. Just because it
*could* be abused doesn't mean that it would be.

On the flip side, this doesn't address some of the more creative uses of
decorators that people have been proposing. A new statement type, like
class_def, effectively applies a fixed set of decorators (in the case of
class_def, it applies only classmethod); if we want more variety, we
still need decorators. They are independent ideas, even though they can
overlap a bit.

I have absolutely no idea as to *how* any of this would be accomplished,
mind you, but it would be *terribly* nifty.

-- Mark
 

Bengt Richter

is equivalent to:

def func(): pass
func = limited_expression_producing_function(another(func))
[...]
I've seen that transformation before ;-)

I'd be interested if someone with 2.4 would do the following (untested):

def A(f): return f
def B(f): return f

def foo():
    @A
    @B
    def bar(): pass
    return bar

def baz():
    def bar(): pass
    return A(B(bar))

import dis
dis.dis(foo)
dis.dis(baz)

Regards,
Bengt Richter
 
