Aspect Programming Module


John Roth

Daniel Dittmar said:
AOP is not about patching existing programs/libraries. This is just the
way it is usually implemented in Java.
Other approaches used for languages like C++ use code generation. Code
generation isn't as nifty, and it doesn't work as well for Java, as Java
hasn't got a #line directive to guide the debugger. But it has the
advantage that you can see what you get.

Code generation is an interesting beast. Code generation where I
never have to look at the generated code for normal development
is something I appreciate. Code generation where I have to maintain
my own code around the generated code is an abomination. Since
I don't know how the tools work in C++, I can't comment on which
approach it takes.
AOP grew out of the observation that an ever-growing percentage of the
code of the average method is taken up by parts that are not related
to the main purpose of the method, but to logging, synchronization,
transaction handling and other such concerns.

Bad design is bad design. Having methods with more
than one responsibility is poor cohesion and high coupling.
Spending a bit of time thinking about how to encapsulate
this type of crud will pay big dividends. Unfortunately, it
won't get academic publication credits.
This kind of code is often of the boilerplate kind, which lends itself
easily to code generation. And it often requires some context
information, which must then be passed either through parameters or
through objects, adding to the line noise that distracts from the 'real
code' of the method.

Duplicated code is duplicated code, and should be ruthlessly
eliminated wherever found. It doesn't matter whether it's
generated by some tool or inserted by the developer.
And the 'we do this through refactoring' argument ends where you use
external libraries unless you want to fork them.

Now you're contradicting yourself. First you say that it's not
about patching to get around stupid libraries, now you say it
is about patching to get around stupid libraries.

John Roth
 

Daniel Dittmar

John said:
Bad design is bad design. Having methods with more
than one responsibility is poor cohesion and high coupling.
Spending a bit of time thinking about how to encapsulate
this type of crud will pay big dividends. Unfortunately, it
won't get academic publication credits.

So if you want a feature (= public method) that supports logging +
synchronization + transaction handling + 'the main feature', you'll write
four methods, each one doing its own thing and then calling the next layer.
This is certainly doable, but if you do this to several methods in several
classes, you'll probably be asking yourself if there isn't a better way. AOP
tries to be such a way.
Duplicated code is duplicated code, and should be ruthlessly
eliminated wherever found. It doesn't matter whether it's
generated by some tool or inserted by the developer.

It depends on your programming language whether every kind of duplication
can be factored out. Assume that you want a variant of a class where several
methods are synchronized. The code for the methods in the new class will
always look like

lock.acquire ()
call_original_method (args)
lock.release ()

I do not see how the duplication in the structure can be factored out in
languages like Java or C++. In Python, I could do the following

class SynchronizedCall:
    def __init__ (self, lock, method):
        self.lock = lock
        self.method = method

    def __call__ (self, *args, **keywargs):
        self.lock.acquire ()
        try:
            return self.method (*args, **keywargs)
        finally:
            self.lock.release ()

obj.append = SynchronizedCall (obj.lock, obj.append)
# the following calls to obj.append () will be synchronized
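In more recent Python the same wrapper is usually spelled as a decorator; here is a minimal sketch of that spelling (the `synchronized`, `shared_lock` and `append_item` names are invented for illustration, not from the thread):

```python
import threading

def synchronized(lock):
    # Decorator factory: every call to the wrapped function
    # runs while holding `lock`.
    def decorate(method):
        def wrapper(*args, **kwargs):
            lock.acquire()
            try:
                return method(*args, **kwargs)
            finally:
                lock.release()
        return wrapper
    return decorate

shared_lock = threading.Lock()

@synchronized(shared_lock)
def append_item(items, x):
    # The function body worries only about its own concern;
    # the locking concern lives entirely in the decorator.
    items.append(x)
```

The structure (acquire / call / release) is written exactly once, and woven onto any number of functions.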
Now you're contradicting yourself. First you say that it's not
about patching to get around stupid libraries, now you say it
is about patching to get around stupid libraries.

It's not *only* about patching existing libraries, it's a different way to
structure the code. In addition, it has the potential advantage that you can
adapt existing libraries without having to change them.

Daniel
 

Will Stuyvesant

[Daniel Dittmar]
class SynchronizedCall:
    def __init__ (self, lock, method):
        self.lock = lock
        self.method = method

    def __call__ (self, *args, **keywargs):
        self.lock.acquire ()
        try:
            return self.method (*args, **keywargs)
        finally:
            self.lock.release ()

obj.append = SynchronizedCall (obj.lock, obj.append)
# the following calls to obj.append () will be synchronized

That is a nice example. People (Peter Hansen, me, ...) asked earlier
in this and the previous thread for AOP usage examples/patterns/...
and this is one.

The academic AOP examples I saw could not convince me, but perhaps
somebody who uses AOP-like techniques a lot in practice (not me) can
collect such code examples like the above for something like a Python
aop-module? I think that would be interesting and possibly highly
useful.

the-size-of-the-python-aop-module-would-convert-more-java-peeps-ly
y'rs - will
 

Jacek Generowicz

Peter Hansen said:
I would be very interested to hear some real-world and real useful
use cases for AOP as well.

AOP is about separation of concerns. Let me repeat that: AOP is about


SEPARATION OF CONCERNS.


Forget all the Java-based AOP mumbo jumbo, forget about loggers and
think about separation of concerns. Try to think of some situations
where there are two (or more) different aspects to the code you are
writing, but you only want to think of one of them at a time. I wanted
to talk about this in an abstract way (in order to avoid replies which
nitpick the specific example I might choose, and thus divert the
discussion from the real issue: what the point of AOP is) ... but I
feel I need a concrete example as a crutch to illustrate the
point. Please don't attack my example, try to see the bigger picture
that it is trying to demonstrate. So, imagine you are writing some
code which perform some operations in some loops ... and you also want
some loop unrolling. Wouldn't it be nice if you could write your loops
in a clear concise way which expresses what it is that they are trying
to do ... and SOMEWHERE ELSE write a loop unrolling mechanism. You
_separate concerns_ by being concerned about functionality in one
place, and being concerned by optimization somewhere else. You then
use some "magic" to weave the two concerns together. (Of course,
optimizing compilers do just this kind of magic for you in this
specific case, so they do provide separation of concerns in some
sense, but the aspects provided by the compiler are fixed, while AOP
is supposed to allow you to create all your aspects yourself.)

(Note, I'm not saying anything about _how_ AOP aims to achieve this;
I'm merely trying to show _what_ it is trying to achieve: separation
of concerns.)

As for real world examples ... if you think of AOP as separation of
concerns which you want to be weaved together, then I am sure that you
will be able to come up with plenty of excellent real world examples
yourself.

Now, is AOP something that solves problems which are absent in Python?
Yes and no. Just because AOP is being formalized and popularized in the
context of Java (which makes its presentation obfuscated by Java's
problems) does not mean that AOP (i.e. separation of concerns) has no
value outside of Java.

Indeed, the roots of AOP can be traced back to the CLOS MOP (it's no
accident that Gregor Kiczales is a key figure in both); it seems that
AOP is an effort to bring the concern separating abilities discovered
in the MOP, to a wider audience.
 

Peter Hansen

Jacek Generowicz wrote:
(a lengthy reply, and thank you for taking the time!)
Peter Hansen said:
I would be very interested to hear some real-world and real useful
use cases for AOP as well.

AOP is about separation of concerns. Let me repeat that: AOP is about
SEPARATION OF CONCERNS. [snip]
As for real world examples ... if you think of AOP as separation of
concerns which you want to be weaved together, then I am sure that you
will be able to come up with plenty of excellent real world examples
yourself.

Unfortunately, that's just my point. So far I haven't been able
to do that, it seems. The example posted previously with the
"synchronization wrapper" approach is, however, quite understandable
and comes from real-world experience that many of us share. Is
that AOP? Is it not AOP? If it's an application of AOP principles,
then is AOP anything other than a good practice which some of us
have been doing for years, prettied up with a nice name? In other
words, Just Another Pattern?

Jacek, I appreciate the attempt to clarify, but so far it seems to
me that *everyone* who talks about AOP just talks in abstract terms
and isn't willing (or able?) to pin it down with *real* real-world
examples, just "imagine this" or "wouldn't it be nice if". You
seem to be someone who understands AOP well: do you use it? Could
you please provide an example from your own uses which demonstrates
the effectiveness of AOP versus "not AOP" in the same way that the
synchronization example posted earlier is clearly better when done
as a wrapper than with the code duplicated everywhere (i.e. with
the "concerns" not separated)?

-Peter
 

John Roth

Peter Hansen said:
Jacek Generowicz wrote:
(a lengthy reply, and thank you for taking the time!)
Peter Hansen said:
I would be very interested to hear some real-world and real useful
use cases for AOP as well.

AOP is about separation of concerns. Let me repeat that: AOP is about
SEPARATION OF CONCERNS. [snip]
As for real world examples ... if you think of AOP as separation of
concerns which you want to be weaved together, then I am sure that you
will be able to come up with plenty of excellent real world examples
yourself.

Unfortunately, that's just my point. So far I haven't been able
to do that, it seems. The example posted previously with the
"synchronization wrapper" approach is, however, quite understandable
and comes from real-world experience that many of us share. Is
that AOP? Is it not AOP? If it's an application of AOP principles,
then is AOP anything other than a good practice which some of us
have been doing for years, prettied up with a nice name. In other
words, Just Another Pattern?

Jacek, I appreciate the attempt to clarify, but so far it seems to
me that *everyone* who talks about AOP just talks in abstract terms
and isn't willing (or able?) to pin it down with *real* real-world
examples, just "imagine this" or "wouldn't it be nice if". You
seem to be someone who understands AOP well: do you use it? Could
you please provide an example from your own uses which demonstrates
the effectiveness of AOP versus "not AOP" in the same way that the
synchronization example posted earlier is clearly better when done
as a wrapper than with the code duplicated everywhere (i.e. with
the "concerns" not separated)?

Oh, absolutely. "Separation of concerns" is all very nice, but it's
simply another way of looking at program composability. And
composability has turned out to be a very hard problem: there are
no, and I mean no, examples of composable programs where there
was no planning on making it happen.

The other thing to note is that "separation of concerns" imposes
significant overhead, and I'm not just talking about performance.
It cuts down on communication, code sharing and any possibility
of finding structural congruence that might lead to deeper understanding.

John Roth
 

Jacek Generowicz

Peter Hansen said:
Jacek Generowicz wrote:
(a lengthy reply, and thank you for taking the time!)
AOP is about separation of concerns. Let me repeat that: AOP is about
SEPARATION OF CONCERNS. [snip]
As for real world examples ... if you think of AOP as separation of
concerns which you want to be weaved together, then I am sure that you
will be able to come up with plenty of excellent real world examples
yourself.

Unfortunately, that's just my point. So far I haven't been able
to do that, it seems.
The example posted previously with the "synchronization wrapper"
approach is, however, quite understandable and comes from real-world
experience that many of us share. Is that AOP? Is it not AOP?

Well, that depends on what you think AOP is. If you think that AOP is
Kiczales' work in Java, then clearly the synchronization wrapper isn't
AOP. If you think of AOP as separation of concerns, then the
synchronization wrapper is AOP ... but then so are optimizing
compilers. I am partial to a view somewhere in between those two
extremes ... maybe viewing AOP as a pattern.

But "AOP" is a buzzphrase and it's probably best not to get too hung
up on buzzphrases. I would prefer to see the ideas behind it and to
try to apply any that have merit where they may help me. "Separation
of concerns" summarizes something towards which I've been striving,
even unconsciously, before I ever heard of AOP, but seeing it as an
active field of study encourages me to believe that it might become
easier to separate concerns in my programs, either through language
support, or people discovering neat ways of achieving it in various
languages.
If it's an application of AOP principles, then is AOP anything other
than a good practice which some of us have been doing for years,
prettied up with a nice name? In other words, Just Another Pattern?

Yes, I think that you are justified, to some extent, to view it as
just another pattern. Of course, some languages make it easier to do
AOP as a pattern, while others require external help, while in others
still it's so trivial that it doesn't even merit being called a
pattern:

Consider multiple dispatch in Common Lisp, C++ and C; in C++ it's a
pattern (Visitor), in CL it just is, and in C ... ?

How about the State Pattern? "foo.__class__ = bar" is how you do it
in Python, is that a pattern?
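(A toy sketch of that one-liner State pattern, with invented Open/Closed door classes, just to make the idiom concrete:)

```python
class Closed:
    def push(self):
        # Reassigning __class__ switches the object's behaviour in
        # place: the whole State pattern in a single assignment.
        self.__class__ = Open
        return "opening"

class Open:
    def push(self):
        self.__class__ = Closed
        return "closing"

door = Closed()
```

Each `push` moves `door` to the other state; no dispatch table, no delegate object, no Visitor-style scaffolding.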

How about OOP in C++ and C? in C++ it just is, and in C it's a
pattern (or requires external help: ie somebody coming along and
implementing C++ or Objective C).

So, maybe AOP is a technology in Java, a pattern in Python, and just
is in the CLOS MOP ?

(Hmm, I recall a pertinent anecdote about this ... let's see if I can
find it ... here it is

Message id: <[email protected]> :

I am reminded of Gregor Kiczales (sp?) at ILC 2003 displaying some
AspectJ to a silent crowd, pausing, then plaintively adding, "When
I show that to Java programmers they stand up and cheer."
)
You seem to be someone who understands AOP well:

Oh lawd, I never intended to create such an illusion!

I like to think that one of my merits is trying (and sometimes even
succeeding) to see past all the noise and hype surrounding something
and understanding, in as simple terms as possible, what the essence of
it is. It is from this perspective that I offered my article: I know
essentially bugger all about AOP in Java, but I believe that AOP is
about separation of concerns. If you look at it that way, next time
you come across the need to separate concerns, a little bell might
ring and you might think of AOP. How you go about separating those
concerns is strongly dependent on the language you happen to be using.
do you use it?

Certainly not in the Java sense.

Unfortunately I'm stuck mostly in C++ hell, but I suspect that
separation of concerns creeps into my Python code (when I get to write
some), probably even without me explicitly identifying it as such.
Could you please provide an example from your own uses which
demonstrates the effectiveness of AOP versus "not AOP" in the same
way that the synchronization example posted earlier is clearly
better when done as a wrapper than with the code duplicated
everywhere (i.e. with the "concerns" not separated)?

Not offhand. If I notice some neat separation of concerns in my work
I'll try to remember to post it.


Sorry, this ended up much longer than it should have been.
 
J

Jacek Generowicz

John Roth said:
there are no, and I mean no, examples of composable programs where
there was no planning on making it happen.

Aaah, yes, Proof by Ignorance[*], one of my favourite logical tools :)


[*] I don't know a programming language whose name starts with "P" and
ends with "n", therefore none exist. QED[+]


[+] Best used in conjunction with Proof by Projection: None exist now,
therefore none have existed in the past, and none will exist in
the future. Ignorance is a powerful tool.
 

Daniel Dittmar

Jacek said:

C moved the I/O from the language to a library. This is separation of
concerns, but I wouldn't call it AOP.

My idea of AOP vs. OOP is about hooks or variation points: ways to customize
existing code.

OOP added two kinds of variation points: interfaces and calls of virtual
methods (yes, the two are closely related). When I have a method a, I can
- self.b (): add a call to a virtual method b of self: the behaviour of
method a can be changed by implementing a subclass and overriding b
- x.c (): add a call to a virtual method c of variable x: the behaviour of
method a can be changed by passing a different kind of x into the method

Both of these hooks must be provided by the implementor of a; he/she must
add the calls self.b () and x.c () by hand.

AOP tries to add hooks that are provided by the structure of the program so
that the programmer doesn't have to add them by hand:
- method entry and exit
- creation of objects
- calls to methods
- module import
- ...
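A rough Python sketch of the first of those hooks, method entry and exit, woven in after the fact (the `weave` helper, `Account` class and `trace` list here are invented for illustration):

```python
import functools

def weave(cls, before, after):
    # Wrap every public method of `cls` so that before(name) runs on
    # entry and after(name) on exit. The hooks come from the structure
    # of the class; the implementor of the methods adds nothing by hand.
    for name, attr in list(vars(cls).items()):
        if callable(attr) and not name.startswith('_'):
            def make(name, method):
                @functools.wraps(method)
                def hooked(*args, **kwargs):
                    before(name)
                    try:
                        return method(*args, **kwargs)
                    finally:
                        after(name)
                return hooked
            setattr(cls, name, make(name, attr))

class Account:
    def __init__(self):
        self.balance = 0
    def deposit(self, amount):
        self.balance += amount

trace = []
weave(Account, trace.append, lambda n: trace.append(n + ":exit"))
```

After the weave, every call to a public `Account` method leaves an entry/exit record in `trace`, while `Account` itself stays oblivious to the tracing concern.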

Those of you using XML might want to visualize the program as a DOM tree and
AOP as applying an XSLT stylesheet. (Now visualize applying several different
stylesheets in arbitrary order.)

This probably works best if each short rule matches many individual hooks.
This is why the logging example is so compelling. If many rules match only
one individual hook each, then you have 'mincemeat of concerns'.

Daniel
 

Michele Simionato

Peter Hansen said:
Could
you please provide an example from your own uses which demonstrates
the effectiveness of AOP versus "not AOP" in the same way that the
synchronization example posted earlier is clearly better when done
as a wrapper than with the code duplicated everywhere (i.e. with
the "concerns" not separated)?

-Peter

I think Jacek made a beautiful post, or maybe he just summarized the way I
see AOP: nothing really different from "regular" programming, just another
way to help separation of concerns. Since you (Peter) always ask for
examples, here is a little script I posted a few days ago on the mailing
list, which solves the issue of separating the concern of checking the
arguments of a constructor from the definition of the constructor.

==== quoting from a previous post of mine =====

class _WithConstructorChecked(type): # helper metaclass
    def __call__(cls, *args, **kw):
        assert len(args)<=2, "%s called with more than 2 args" % cls
        assert kw.has_key("kw"), "%s needs a 'kw=' argument" % cls
        return super(_WithConstructorChecked,cls).__call__(*args,**kw)

class WithConstructorChecked(object): # mixin class
    __metaclass__ = _WithConstructorChecked

class C(WithConstructorChecked):
    def __init__(self, *args, **kw):
        pass

c=C(1,2,kw=3) # ok; try different signatures to get assertion errors

In this example the mixin class WithConstructorChecked ensures that C is
called with at most two positional arguments and a keyword argument named
'kw'. The code of class C is not touched at all; you just add
WithConstructorChecked to the list of its bases.

=== end quote ===

I tend to avoid the use of AOP techniques in my production code, since I feel
they are too magical, and most of the time I can do what I want in "regular"
Python. Personally, I hate the hype about AOP, but I do like the concept.


Michele Simionato
 

Michael Hudson

[...]
Well, that depends on what you think AOP is.
[...]

Sorry, this ended up much longer than it should have been.

Nevertheless, thank you for it. I was at the PythonUK conference last
week, which was part of the ACCU spring conference, and *lots* of
people were asking "can you do Aspect Oriented Programming in Python?"
and my responses tended to be much less articulate versions of that
post.

Cheers,
mwh
 

Bryan

Those of you using XML might want to visualize the program as a DOM tree and
AOP as applying an XSLT stylesheet. (Now visualize applying several different
stylesheets in arbitrary order.)


Daniel


Thank you for this explanation... it's the best explanation I've heard yet, and now I have a grasp and can visualize what
AOP is.

bryan
 

Will Stuyvesant

[Peter Hansen ]
... I appreciate the attempt to clarify, but so far it seems to
me that *everyone* who talks about AOP just talks in abstract terms
and isn't willing (or able?) to pin it down with *real* real-world
examples ...

You got it.

AOP somehow did get scientific impact in computing
science. I had to read scientific papers about AOP and
I had to listen to AOP conference presentations. It's
all abstract chitchat about the benefits, and then they
usually introduce yet another formalism or even
mini-language to reason about. Formulas in ad-hoc
formalisms. Forget about *real* real-world examples;
these people just want to get papers published.
Usability is considered of minor importance. They are,
with very few very rare exceptions, NOT able to show
useful code in a high level language like Python. What
they DO show is code in Java that solves Java problems,
usually to do with the limitations imposed by the static
typing system. It is a bad sign that so many crappy AOP
papers get accepted. I have come to the conclusion that
AOP is nothing more than what I expect from a decent
programmer: a good, or at least reasonable, design of
software in the first place.
 

Jacek Generowicz

Daniel Dittmar said:
Jacek said:
SEPARATION OF CONCERNS. [snip]

C moved the I/O from the language to a library. This is separation of
concerns

Not really. By separation of concerns I mean:

Where previously you had to keep two orthogonal goals in mind in a
single chunk of code, you now deal with each goal separately.

Stupid example:

def foo(a):
    print "before", a
    a.sort()
    print "after", a
    ...
    print "before", a
    a.reverse()
    print "after", a

foo(bar)

In the above you are manually mixing the concern of modifying an
object, with the concern of tracking its changes. In the following you
separate them.

# Worry about the modifications you want to make
def foo(a):
    a.sort()
    ...
    a.reverse()

# Worry about tracking changes to an object
class mutation_reporter:
    def __init__(self, obj):
        self.obj = obj
    def __getattr__(self, name):
        if name in mutating_methods:
            print "before", self
            self.blah(name)
            print "after", self
    def __setattr__(...):
        ...

# Weave together the separate concerns.
foo(mutation_reporter(bar))


Is this AOP? Should I care?

Is it useful? Should I care?

(I guess the way one answers questions 2 and 4, is ... err
.... interesting.)
 

John Roth

Jacek Generowicz said:
John Roth said:
there are no, and I mean no, examples of composable programs where
there was no planning on making it happen.

Aaah, yes, Proof by Ignorance[*], one of my favourite logical tools :)


[*] I don't know a programming language whose name starts with "P" and
ends with "n", therefore none exist. QED[+]


[+] Best used in conjunction with Proof by Projection: None exist now,
therefore none have existed in the past, and none will exist in
the future. Ignorance is a powerful tool.

Thus sayeth the Zealot, who, when he no longer has a leg
to stand on, resorts to cutsi-poo mindreading. If you will
come up with an example, I will stand by to show you exactly
where the planning happened that allowed it to be composed.

John Roth
 

Jacek Generowicz

What they DO show is code in Java that solves Java problems, usually
to do with the limitations imposed by the static typing system.

You might enjoy the first bullet in the conclusions of the following:

http://www.cs.uni-bonn.de/~costanza/dynfun.pdf

(Those of you not allergic to parentheses might find it interesting
reading throughout.)
I have come to the conclusion that AOP is nothing more than what I
expect from a decent programmer: a good, or at least reasonable,
design of software in the first place.

I think there is _potentially_ more to it than that. I think that
languages can provide direct assistance with this goal. The quality,
weight, usefulness and hype of this support varies greatly with the
language in question, of course.
 

Daniel Dittmar

Jacek said:
Not really. By separation of concerns I mean:

I know you meant 'separation of concerns as defined by AOP'. But saying
'AOP allows separation of concerns as defined by AOP' is somewhat circular.
Stupid example:

Why not use an intelligent example instead? Lisp has had before and after
methods for ages, so there should be plenty of examples.

Daniel
 

Jacek Generowicz

Jacek Generowicz said:
Stupid example:

Let's give a better one, which is still short

# Mix concerns of calculating Fibonacci numbers, and avoiding
# recalculating the same answer repeatedly

def fib(n):
    if n<2: return 1
    try:
        return fib.cache[n]
    except KeyError:
        return fib.cache.setdefault(n, fib(n-1)+fib(n-2))
fib.cache = {}

###############################################################

# Deal with avoiding recalculating the same answer repeatedly
def memoize(fn):
    cache = {}
    def proxy(*args):
        try:
            return cache[args]
        except KeyError:
            return cache.setdefault(args, fn(*args))
    return proxy

# Deal with calculating Fibonacci numbers
def fib(n):
    if n<2: return 1
    return fib(n-1) + fib(n-2)

# Weave
fib = memoize(fib)
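(A present-day footnote, not from the original thread: the standard library later grew exactly this aspect. `functools.lru_cache`, available since Python 3.2, is `memoize` as a ready-made decorator:)

```python
from functools import lru_cache

# The caching concern comes off the shelf; only the
# Fibonacci concern is written here.
@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return 1
    return fib(n - 1) + fib(n - 2)
```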
 

Terry Reedy

Jacek Generowicz said:
# Mix concerns of calculating Fibonacci numbers, and avoiding
# recalculating the same answer repeatedly

def fib(n):
    if n<2: return 1
    try:
        return fib.cache[n]
    except KeyError:
        return fib.cache.setdefault(n, fib(n-1)+fib(n-2))
fib.cache = {}

###############################################################

# Deal with avoiding recalculating the same answer repeatedly
def memoize(fn):
    cache = {}
    def proxy(*args):
        try:
            return cache[args]
        except KeyError:
            return cache.setdefault(args, fn(*args))
    return proxy

# Deal with calculating Fibonacci numbers
def fib(n):
    if n<2: return 1
    return fib(n-1) + fib(n-2)

# Weave
fib = memoize(fib)

The funny thing about this example is that its success critically depends
on fib *not* being 'optimized' by having the recursive calls either
compiled as, or replaced by (as with Hettinger's recent proposal),
efficient local calls.

In my opinion, functions defined on counts (0, 1, 2,...) are better
memoized with a list. For instance:

def listify(f_of_n):
    cache = []
    maxstored = [-1]
    def proxy(n):
        maxn = maxstored[0]
        while n > maxn:
            maxn += 1
            cache.append(f_of_n(maxn))
        maxstored[0] = maxn
        return cache[n]
    return proxy

Hmmm. When I started this reply, I was going to emphasize that 'separating
concerns' is much less efficient than more directly writing

def fibinit():
    cache = [1, 1]
    maxstored = [len(cache) - 1]
    def _fib(n):
        maxn = maxstored[0]
        while n > maxn:
            maxn += 1
            cache.append(cache[maxn-2] + cache[maxn-1])
        maxstored[0] = maxn
        return cache[n]
    return _fib

fib = fibinit()

which is true - there is a lot of fluff in having the memoized proxy
repeatedly call itself twice to look up cached values, after unnecessary
testing to see if the cache has the needed values. It is definitely faster
to look them up directly. On the other hand, having written and modified
listify, and thinking of possible edits, or even making it a class, I can
see the point of having it cleanly separated from the function definition.

Terry J. Reedy
 

Peter Hansen

Jacek said:
Let's give a better one, which is still short

# Mix concerns of calculating Fibonacci numbers, and avoiding
# recalculating the same answer repeatedly

<chuckle> There's a large contingent of folks in the world,
not trained as computer scientists, who would question the
claim that anything involving Fibonacci sequences makes
a good "real world" example. ;-)

-Peter
 
