Replacement for keyword 'global' good idea? (e.g. 'modulescope' or 'module' better?)


Paolino

Terry said:
The specialness of globals and locals was part of Python's original simple
namespace design, and is still reflected in exec statements and eval
functions, as well as in nested functions.




Accessing such names is already possible, even *after* the outer function
returns and the rest of its execution context is deleted.




This is rebinding, rather than merely accessing. Similar, but even more
problematical would be initial binding in outer from inner:

def enclosing():
    def enclosed():
        outer var = 4




Actually, it is the reinvention of classes:

class enclosing(object):
    def __init__(self):
        self.var = 2
    def enclosed(self):
        self.var = 4
This is using some really 'featured' namespace called class to do things
which are useful to all namespaces. So it's not a solution but a hack.
Also, 'self' is something really far from my example; a cloned namespace
of the class, probably.
Working hard with Python namespaces is hacking: probably this makes us
masters of Python, when using namespaces should be basic knowledge.
There was a long discussion on the pydev list a couple of years ago re
adding rebinding in addition to access (of outer variables). I think, in
the end, Guido concluded that there was no compelling reason, as of then,
to add another general mechanism for private, sharable, rebindable
variables.




There were lots of proposals for both the exact syntax and semantics of
outer binding/rebinding.
You cut the phrase and the meaning of it.

The only solution I'm talking about is this:
Given 'outer var' in a namespace, jumping out of the namespace
we find ourselves in another namespace.
If 'var' is bound there, that resolves the 'outer';
if not, we jump out again. If 'var' is never found, we
can probably raise an UnboundOuter error.


Regards Paolino


 

Mike Meyer

Dennis Lee Bieber said:
For a schizoid view... read both comp.lang.ada and
comp.lang.python together <G>

Will you settle for comp.lang.eiffel + the smalleiffel list and
comp.lang.python?

<mike
 

Bengt Richter

The specialness of globals and locals was part of Python's original simple
namespace design, and is still reflected in exec statements and eval
functions, as well as in nested functions.
Ok ;-) But that's historical, and tells us something about how we got to the status quo.
ISTM someone wants to discuss possibilities for the future (unless they are just complaining ;-)
Accessing such names is already possible, even *after* the outer function
returns and the rest of its execution context is deleted.


This is rebinding, rather than merely accessing. Similar, but even more
problematical would be initial binding in outer from inner:

def enclosing():
    def enclosed():
        outer var = 4


Actually, it is the reinvention of classes:

class enclosing(object):
    def __init__(self):
        self.var = 2
    def enclosed(self):
        self.var = 4

There was a long discussion on the pydev list a couple of years ago re
adding rebinding in addition to access (of outer variables). I think, in
the end, Guido concluded that there was no compelling reason, as of then,
to add another general mechanism for private, sharable, rebindable
variables.


There were lots of proposals for both the exact syntax and semantics of
outer binding/rebinding.
No doubt ;-)
It might be interesting to compare attribute names with bare names. For attributes
we also have a legacy from the historical evolution of current functionality. E.g.,
classic vs new classes, descriptors, and most relevant to the current discussion,
mro and super.

Bare names do not (yet ;-) have an explicit "nro" (name resolution order), and there
is no "super" for names (though someone recently proposed an "unshadow" operator here
on c.l.p, so the subject is alive ;-)

Nor do we have any correspondence for bare names analogous to

setattr(obj, "name", value) <=> obj.name = value

e.g.,

setname(<namespace_selection>, "name", value)

There is nowhere near the kind of control the programmer has over attribute namespace
use for bare-name use, even though by various means we are able to select from a limited
set of namespaces. (BTW, if <namespace_selection> were a call to a suitable builtin function
that could return objects whose attribute namespace were the desired name space, then setname
could be effected with setattr, e.g.,

setattr(get_ns_obj(), "name", value) # name = value (defaulting to local, same as get_ns_obj(0))
setattr(get_ns_obj(1), "name", value) # name = value (in lexically immediately (1) enclosing scope)
setattr(get_ns_obj(-1), "name", value) # name = value (in global module scope)
)
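For what it's worth, most of the read side of this can already be approximated today with frame introspection. Below is a minimal sketch: `get_ns_obj` is the hypothetical name from the post, implemented here (purely as an illustration) with `sys._getframe`. Note it returns *snapshots*, since CPython does not promise that writes to a running function frame's `f_locals` take effect, so a real `setname` would still need interpreter support.

```python
import sys
from types import SimpleNamespace

def get_ns_obj(depth=0):
    """Hypothetical helper (name taken from the post): return an object
    whose attribute namespace is a snapshot of a caller's scope.
    depth=0 is the caller's local scope, depth=n walks n frames out,
    depth=-1 is the caller's module-global scope."""
    frame = sys._getframe(1)
    if depth == -1:
        return SimpleNamespace(**frame.f_globals)   # snapshot, not live
    for _ in range(depth):
        frame = frame.f_back
    return SimpleNamespace(**frame.f_locals)        # snapshot, not live

x = 'module x'

def demo():
    x = 'local x'
    # reading works; a writable version is what the proposal asks for
    return get_ns_obj().x, get_ns_obj(-1).x

assert demo() == ('local x', 'module x')
```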

I am thinking that there is a subliminal discussion under a layer of red herrings ;-)
I refer to the subject of unification/generalizing/orthogonalizing by removing special
legacy restrictions (or not introducing special restrictions in a new feature).

E.g., recently decorators were introduced, and part of the discussion (which became explicit ;-)
was whether to allow the expression following the '@' to be a fully general expression, or
whether to restrict it to names, dotted names, and function calls. The latter won out.

Some regard this kind of restriction as paternalistic, and protest that "we are adults here"
(even though we are not all, and we others not all the time ;-)

The BDFL has introduced many new ideas, yet has retained or introduced restrictions on fully
orthogonal functionality that might otherwise be allowed if e.g. names in certain contexts
were allowed to be full expressions. It goes the other way too. IIRC the list of bases for
a class will be allowed to be an empty "()" soon.

Bottom line, I think Python is eminently usable and very pleasant to use, but I think bare name
mechanisms could be improved.

<another HOTTOMH idea ;-)>
What about more namespace control in the definition of functions, along the lines of
"nro" for bare name lookup in the function body? This could be in the form of a sequence
of ordinary objects whose attribute name spaces should be searched before looking in globals().
Also using := rather than = to override local namespace binding with extended name space lookup,
e.g., (discuss **=<name space object sequence> syntax later ;-)

def fun(a, b=2, *args, **kw, **=(nso, nso2, etc)):
    print x  # looks for local x, nso.x, nso2.x, etc.x, and then globals()['x']
             # i.e., not a used-before-bound error if **= spec exists
    x = 123  # binds local x in any case
    print x  # finds the local x first
    del x    # unshadow **= namespace sequence
    x := 456 # not local binding unless pre-existing, but if so rebind.
             # Also rebind if found in enclosing closure scope, BTW.
             # Otherwise find the first object nsofound with x attr in **= object attribute name space sequence,
             # and then do nsofound.x=456 -- where x could be a normal property defined by type(nsofound)

The property possibility should be interesting ;-)

</another>

Be kind, I've suggested := for find-and-rebind before, but I haven't thought very far about this combination
with extended function namespace ;-)
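The proposed `**=` syntax doesn't exist, but the lookup order it describes (local, then a sequence of namespace objects, then globals) can be roughly emulated today with `collections.ChainMap`. The names `nso` and `nso2` below are invented stand-ins for the post's namespace objects; this is a sketch of the *lookup semantics* only, not the syntax.

```python
from collections import ChainMap
from types import SimpleNamespace

# Invented stand-ins for the post's "nso, nso2" namespace objects.
nso = SimpleNamespace(x=111)
nso2 = SimpleNamespace(y=222)

def fun(a, b=2, ns_seq=(nso, nso2)):
    # Emulated extended lookup ("nro"): a fresh local dict first, then
    # each namespace object's __dict__, then globals().
    scope = ChainMap({}, *(ns.__dict__ for ns in ns_seq), globals())
    seen = [scope['x']]   # found on nso -- not a used-before-bound error
    scope['x'] = 123      # binds "locally": writes go to the first map
    seen.append(scope['x'])
    del scope['x']        # unshadow: the nso.x underneath reappears
    seen.append(scope['x'])
    return seen

assert fun(1) == [111, 123, 111]
```

Writes going only to the first mapping is exactly the `=` behaviour in the proposal; a `:=` that rebinds wherever the name is found would need a search-and-assign loop instead.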

Regards,
Bengt Richter
 

Terry Reedy

Mike Meyer said:
Um - I see no mention of "AST" in that article at all. He's mostly
talking about "Language Oriented Programming" (seems to be another
term to describe DSLs) and "Language Workbenches".

AST means Abstract Syntax Tree and there are lots of mentions of ASTs and
'abstract syntax' in the collection of pages. Moreover, elevating abstract
representations is the key idea of Fowler's essay. Near the beginning he
says

"This is a good moment to introduce a common distinction that you run into
in programming language circles - the distinction between abstract and
concrete syntax. The concrete syntax of a language is its syntax in its
representation that we see. The XML and custom language files have
different concrete syntaxes. However both share the same basic structure:
you have multiple mappings, each with a code, a target class name, and a
set of fields. This basic structure is the abstract syntax. When most
developers think about programming language syntax they don't make this
separation, but it's an important one when you use DSLs. You can think of
this in two ways. You can either say we have one language with two concrete
syntaxes, or two languages that share the same abstract syntax."

Roth (following Fowler) talked about taking the first view of seeing the
abstract syntax as the language with multiple possible concrete syntaxes.
Fowler later introduces a section with the summary sentence

"One of the strongest qualities of language workbenches is that they alter
the relationship between editing and compiling the program. Essentially
they shift from editing text files to editing the abstract representation
of the program."

Again, this is what John proposed for Python (if defined by its abstract
syntax (high-level, I presume)). The multiple-paragraph explanation of the
above includes

"For the purposes of this discussion we can break the
[traditional]compilation process into two steps. The first step takes the
text from the file foo.cs and parses it into an abstract syntax tree (AST).
The second step walks this tree generating CLR byte codes that it puts into
an assembly (an exe file)."

His companion article Generating Code for DSLs
http://martinfowler.com/articles/codeGenDsl.html
expands the sentence above and includes this:

"What's the advantage of separating the two stages? It does cost us a bit
of complexity - we have to add the AST classes. If we were only reading and
writing to a single format it's arguable whether the AST is worth the
effort - at least for this simple case. The real advantage in the AST lies
when we want to read or write multiple formats."

That is three "AST"s in three sentences, albeit on a different page of his
multipage essay. Anyway, back where we were, Fowler continues that in a
Language Workbench (by contrast to the traditional system)

"The key difference here is that the 'source' is no longer the editable
textual files. The key source that you manipulate is the abstract
representation itself. In order to edit it, the language workbench projects
the abstract representation into some form of editable representation. But
this editable representation is purely transient - it's only there to help
the human. The true source is the persistent abstract representation. "

So it seems that John Roth proposes that Python become at least in part a
'Language Workbench' language, as Fowler defines his new term. It also
seems that ASTs or some sort of abstract representation are central to such
a system.

Terry J. Reedy
 

Terry Reedy

John Roth said:

This clarified your proposal for Python considerably. So I note that now
and especially once the AST compiler is completed, you are quite free to
start a Python AST Extension (PASTE) project quite independently of Guido
and the PSF developers. Build an AST-based editor like Fowler described,
with transient text presentations. (And pick your preferred GUI for doing
so.) Or design a system for translating domain-specific languages into
PyASTs, from whence they can be compiled to bytecode and run.

Terry J. Reedy
 

Bengt Richter

This clarified your proposal for Python considerably. So I note that now
and especially once the AST compiler is completed, you are quite free to
start a Python AST Extension (PASTE) project quite independently of Guido
and the PSF developers. Build an AST-based editor like Fowler described,
with transient text presentations. (And pick your preferred GUI for doing
so.) Or design a system for translating domain-specific languages into
PyASTs, from whence they can be compiled to bytecode and run.
(Not implying that you need this for your edification ;-)

I think the relationship of abstract entities and their concrete representations
is very interesting. And it is useful to note that the representation of an AST
in computer memory with a python interpreter looking at the AST node representations
involves another layer of concrete representation. And the interpreter is an abstraction
with a concrete representation, etc., down to the CPU as interpreter of instructions
etc., and the CPU being an abstraction made concrete by aggregating concrete elements
chosen for their physical nature as representations of abstractions and behaving
so as to transform concrete states in ways that reflect the transformations of the
corresponding abstractions. And so forth ;-)

IOW, a compiled python program AST in the abstract, would be the same abstraction
even if both compiler and tree representation were done in lisp and you had lispython
instead of cpython etc. Or whether we kept track of everything scribbling in beach
sand with our toes.

BTW, maybe this is a place to mention the concept of an AST decorator, that works like
a function decorator except that it is prefixed with @@ instead of @ and it operates
at compile time when the AST becomes available, but before it gets translated to code,
and what gets passed to the decorator is the AST and the node of its own call (which it would
typically eliminate from the AST as it does whatever else it is programmed to do). Which means
that the decorator must already be compiled and available at that point, so it can be looked
up somewhere by the name. The idea is that this form of decoration could transform the
AST arbitrarily before code generation, and be a very flexible tool for mischief of course,
but also useful tricky things IWT.
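The core of what such an `@@` decorator would do (transform the tree between parsing and code generation) can be shown with the modern `ast` module, the successor to the 2005-era `compiler` package. The transformation below is a toy one invented for illustration; the source is fed in as a string to keep the sketch self-contained.

```python
import ast

class DoubleInts(ast.NodeTransformer):
    """Toy compile-time transformation: double every integer literal."""
    def visit_Constant(self, node):
        if isinstance(node.value, int) and not isinstance(node.value, bool):
            return ast.copy_location(ast.Constant(node.value * 2), node)
        return node

# What an '@@' decorator would see: the AST after parsing, before code
# generation -- free to be rewritten arbitrarily at that point.
src = "def f():\n    return 10\n"
tree = ast.parse(src)
tree = ast.fix_missing_locations(DoubleInts().visit(tree))
ns = {}
exec(compile(tree, "<ast-deco>", "exec"), ns)
assert ns["f"]() == 20
```

A real `@@` would additionally remove its own call node from the tree before compiling, as the post describes.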

Regards,
Bengt Richter
 

Terry Reedy

Bengt Richter said:
I think the relationship of abstract entities and their concrete
representations
is very interesting.
ditto

BTW, maybe this is a place to mention the concept of an AST decorator, that works like
a function decorator except that it is prefixed with @@ instead of @ and it operates
at compile time when the AST becomes available, but before it gets translated to code,
and what gets passed to the decorator is the AST

One can do this much today:

import compiler

new_ast = ast_transformer(compiler.parse('''\
<code here>
'''))

However, I can't see any way in the docs to get a code object from the AST.
I believe the AST-to-code compiler is currently being worked on. When it
is, @@ would be nice syntactic sugar but not really necessary.
The idea is that this form of decoration could transform the
AST arbitrarily before code generation, and be a very flexible tool
for mischief of course, but also useful tricky things.

At the moment, we are limited to manipulating concrete text before
compiling it.
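As it later turned out, this gap was closed: in today's Python the `ast` module replaces the old `compiler` package, and `compile()` accepts an AST directly.

```python
import ast

tree = ast.parse("x = 6 * 7")
code = compile(tree, "<ast>", "exec")   # a code object straight from the AST
ns = {}
exec(code, ns)
assert ns["x"] == 42
```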

Terry J. Reedy
 

Paolino

Bengt said:
Nor do we have any correspondence for bare names analogous to

setattr(obj, "name", value) <=> obj.name = value

e.g.,

setname(<namespace_selection>, "name", value)

Probably this parallelism is a better approach to what I commented
before. Thanks for the clarity.
There is nowhere near the kind of control the programmer has over attribute namespace
use for bare-name use, even though by various means we are able to select from a limited
set of namespaces. (BTW, if <namespace_selection> were a call to a suitable builtin function
that could return objects whose attribute namespace were the desired name space, then setname
could be effected with setattr, e.g.,

setattr(get_ns_obj(), "name", value) # name = value (defaulting to local, same as get_ns_obj(0))
setattr(get_ns_obj(1), "name", value) # name = value (in lexically immediately (1) enclosing scope)
setattr(get_ns_obj(-1), "name", value) # name = value (in global module scope)
)
Yes, namespaces should be unified in their base behaviour.
I am thinking that there is a subliminal discussion under a layer of red herrings ;-)
I refer to the subject of unification/generalizing/orthogonalizing by removing special
legacy restrictions (or not introducing special restrictions in a new feature).

E.g., recently decorators were introduced, and part of the discussion (which became explicit ;-)
was whether to allow the expression following the '@' to be a fully general expression, or
whether to restrict it to names, dotted names, and function calls. The latter won out.

Some regard this kind of restriction as paternalistic, and protest that "we are adults here"
(even though we are not all, and we others not all the time ;-)
Uhmpf, adults == theorists and children == experimentalists?
The BDFL has introduced many new ideas, yet has retained or introduced restrictions on fully
orthogonal functionality that might otherwise be allowed if e.g. names in certain contexts
were allowed to be full expressions. It goes the other way too. IIRC the list of bases for
a class will be allowed to be an empty "()" soon.

Bottom line, I think Python is eminently usable and very pleasant to use, but I think bare name
mechanisms could be improved.
A good dose of humility could be good for re-engineering something, if
necessary. Python is very usable, and has an almost perfect surface layer.
But this is not enough. It needs to be strong and elegant on the inside
to survive. More, isn't "Namespaces are one honking great idea -- let's
do more of those!" part of the Zen of Python?

Thanks again for putting things in a saner and more open way than I did.

Regards Paolino





 

Bengt Richter

One can do this much today:

import compiler

new_ast = ast_transformer(compiler.parse('''\
<code here>
'''))

However, I can't see any way in the docs to get a code object from the AST.
I believe the AST-to-code compiler is currently being worked on. When it
is, @@ would be nice syntactic sugar but not really necessary.


At the moment, we are limited to manipulating concrete text before
compiling it.
Have we gone backwards from this?

http://groups.google.com/group/comp...read/thread/5fa80186d9f067f4/7a2351b221063a8c

I've been meaning to do something with that, to implement @@ decoration, I think probably in the context
of a customized importer, where I would be able to control the whole source conversion process, anticipating
usage something like (ut is my hodgepodge utility package ;-)

from ut.astdecoimport import astdecoimport
amodule = astdecoimport('amodule') # searches like import for amodule.py and does its thing

Regards,
Bengt Richter
 

Ron Adam

I've heard two people complain that the word 'global' is confusing.

Perhaps 'modulescope' or 'module' would be better?

Am I the first person to have thought of this and suggested it?

Is this a candidate for Python 3000 yet?

Chris


After reading through some of the suggestions in this thread (but not
all of them), how about something a bit more flexible but not too different.

For python 3000 if at all...

Have the ability to define names as shared that only live while the
function that declared them has not exited.

The new statements could be called *share* and *shared*.

def boo():
    shared x,y,z   # Used names predefined in shared name space.
    return x+1,y+2,z+3

def foo():
    x,y,z = 1,2,3
    share x,y,z    # These would be visible to sub functions
                   # but not visible to parent scopes once the
                   # function ends. [*1]

    boo()          # modify shared x, y and z in foo.


[*1.] Unless they have also declared the same names as share. (See below.)

'Share' is used to define names to be visible in child scopes, and
'shared' allows access to shared names declared in parent scopes.

Having two keywords is more explicit, although this may work with a
single keyword pretty much as it does now.

A single shared name space would still be used where 'share' adds names
to be 'shared' and those names are deleted when the function that
declared them exits. They don't need to live past the life of the
function they were first declared in.

In recursive functions, (or when a name is reshared), declaring a name
as shared could just increment a reference counter, and it wouldn't be
removed from shared until it reaches zero again.

Using 'share' twice with the same name in the same function should cause
an error. Using 'shared' with a name that is not in shared name space
would cause an error.
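The proposed `share`/`shared` statements are hypothetical syntax, but their lifetime semantics can be sketched with an explicit shared namespace object (the `_shared` name below is invented for the illustration):

```python
from types import SimpleNamespace

# Invented stand-in for the proposed shared name space.
_shared = SimpleNamespace()

def boo():
    # 'shared x, y, z' -- use names predefined in the shared space
    s = _shared
    return s.x + 1, s.y + 2, s.z + 3

def foo():
    # 'share x, y, z' -- publish names for sub-functions
    _shared.x, _shared.y, _shared.z = 1, 2, 3
    try:
        return boo()
    finally:
        # shared names die when the declaring function exits
        del _shared.x, _shared.y, _shared.z

assert foo() == (2, 4, 6)
```

The `try`/`finally` plays the role of the proposed automatic cleanup; a refcounting scheme for recursive re-shares would wrap the `del` in a counter check.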


Just a few thoughts.

Cheers,
Ron
 

Antoon Pardon

Op 2005-08-06 said:
You can't "fix" this. This code (in some python-like language that
isn't python):

x = 23

def fun():
    x = 25
    # Rest of code

has two possible interpretations.

Either the occurrence of x in fun references the global, or it
references a local that shadows the global. There are reasons for
wanting both behaviors. So you have to have some way to distinguish
between the two, and you want it to happen per variable, not per
function. The method with the fewest keywords is to have one be the
default, and some keyword that triggers the other.

So the only way to remove the global statement would be to have some
way to mark the other interpretation, with say a "local"
declaration. I think that would be much worse than "global". For one
thing, most variables would be local whether or not they are
declared. Second, having an indication that you need to check module
globals in the function is better than not having that clue there.

I disagree here. The problem with "global", at least how it is
implemented in python, is that you only have access to module
scope and not to intermediate scopes.

I also think there is another possibility. Use a symbol to mark
the previous scope. e.g. x would be the variable in local scope.
@.x would be the variable one scope up. @.@.x would be the
variable two scopes up etc.
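The `@`-notation is hypothetical, but the two baseline interpretations under discussion, plus the conflict case, can be pinned down in current Python:

```python
x = 23

def shadow():
    x = 25              # default: binds a fresh local; global x untouched
    return x

def rebind():
    global x            # the keyword selects the other interpretation
    x = 25
    return x

def conflict():
    y = x               # x is local to the *whole* body because of the
    x = 25              # binding below, so this read raises UnboundLocalError
    return y

assert shadow() == 25 and x == 23
assert rebind() == 25 and x == 25
try:
    conflict()
except UnboundLocalError:
    pass
else:
    raise AssertionError("expected UnboundLocalError")
```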
 

Antoon Pardon

Op 2005-08-06 said:
Paolino said:
(e-mail address removed) wrote:
def enclosing():
    var = [2]
    def enclosed():
        var[0] = 4
which is like saying python is not working.

It's ok to mark non-locals, but why is var=4 not searched for outside
while var[0]=4 is?

Because "var=4" rebinds the name "var", while "var[0]=4" does not. It's
exactly the same issue with using "global", where you don't need it if
you aren't rebinding the name.

This doesn't answer the question at the appropriate level IMO.

Why has one made a difference in search policy for finding a
variable based on whether the variable is rebound or not
in the first place.
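For reference, the mechanics being debated, including `nonlocal`, which Python 3 later added (PEP 3104) for exactly the rebinding case:

```python
def enclosing():
    var = [2]
    def enclosed():
        var[0] = 4      # no rebinding of 'var': plain lookup finds the
                        # enclosing scope's list, whose item is mutated
    enclosed()
    return var[0]

assert enclosing() == 4

def enclosing2():
    var = 2
    def enclosed():
        nonlocal var    # added later, in Python 3 (PEP 3104); without
        var = 4         # it, 'var = 4' would just bind a new local
    enclosed()
    return var

assert enclosing2() == 4
```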
 

Peter Hansen

Antoon said:
Why has one made a difference in search policy for finding a
variable based on whether the variable is rebound or not
in the first place.

Do you really not understand the reason, or do you simply disagree with
it? It's a choice with rational thought behind it. Whether it's the
best choice is a matter of opinion.

-Peter
 

Bengt Richter

I'm not saying 'modulescope' and 'module' are the only alternatives or even
the best anyone can come up with.

'global' has the connotation of being visible *EVERYWHERE*
where in Python it is just visible in one module's space.

One way is to spell this with a dotted name (e.g., "shared" or "SHARED_GLOBALS")
and use a module dedicated for this purpose, e.g.,

import SHARED_GLOBALS

def foo():
    SHARED_GLOBALS.x = 'shared x may be (re)bound from anywhere'
Can you think of a better alternative or do you believe
'global' is the best possible?
I don't think 'global' would be best if we were starting out with
the name spaces we have now, but as you know it effectively means 'modulescope'
and derives from a time where that was the only unqualified alternative
to local.
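A self-contained version of the same trick, using a synthesized module object in place of an actual `SHARED_GLOBALS.py` file (in real use you would just create the empty module and import it everywhere):

```python
import sys
import types

# Stand-in for a dedicated empty module on disk.
SHARED_GLOBALS = types.ModuleType('SHARED_GLOBALS')
sys.modules['SHARED_GLOBALS'] = SHARED_GLOBALS

def foo():
    SHARED_GLOBALS.x = 'shared x may be (re)bound from anywhere'

def bar():
    import SHARED_GLOBALS as sg   # every importer sees the same object
    return sg.x

foo()
assert bar() == 'shared x may be (re)bound from anywhere'
```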

There's been a lot of discussion about this, periodically ;-)

Regards,
Bengt Richter
 

Bengt Richter

The point is not to understand obvious technical things, but having a
coherent programming framework. If I can modify an out of scope object

Seems coherent to me:

a) names /BIND/ locally unless declared global (at which point they
bind within the file)

b) name /lookup/ is local first, then global

c) conflict occurs when a name lookup potentially could find a
global [clause b: name not found in local space, found in global], but
later in the same function that same name is bound locally [clause a: no
global declaration was seen so binding is to a local]. However, the
static language parse will have flagged the name as reserved for a
local, and then complains because one is attempting to use a local
before it has been bound to a value.

If you aren't changing the binding of the name, you don't need
to worry about "global"

And, in Python, this concept of BINDING is a core language
feature -- it is NOT something compatible to other languages, and
removing it will mean creating a new language that is NOT Python.

In other languages, a "name" is a synonym for a memory address
(call it a box), and it will always be the same box. Assignment copies
box contents from source to destination box.

In Python, a "name" is a movable label that is stuck to a box,
and the name can be moved to other boxes. "Assignment" in Python moves
the label from the "destination" (the old box) TO the "source" box --
the source box now has multiple labels (names) bound to it. Both names
refer to the same box.

var[i] is a two-step process: first find the box with the label
"var", THEN open the box and find the i'th item /in/ the box... You can
change the item /in/ the box without changing the label on the box.

I find the above label/box metaphor a bit misleading, because the "box"
surfaces where "labels" may be stuck are name spaces, and IMO are more like
cork bulletin boards than the containers suggested by "box" (although
admittedly a bulletin board could be viewed as a kind of container for labels ;-)

I prefer the name-tags-with-strings metaphor, where name tags may be
pinned on any namespace/bulletin board and the strings from tags on
many different bulletin boards may be tied to (bound) the same object.

But to carry this a little further, name tags aren't really the only
things that have strings that lead to objects. Name tags' strings
are accessed via name space objects' lookup mechanisms, which we program
with various name lookup syntax, but other objects can also have strings
leading to objects, e.g. lists, where you retrieve a string originating
from the nth string-tying-point instead of finding a string-tying-point by name
amongst a bunch of labels pinned to a bulletin board.

IOW, "...open the box and find the i'th item /in/ the box..." is not really
finding the i'th item _itself_ "/in/" the box. It is finding one end of a string
tied to some point /in/ the box, but the actual item/object is at the other end
of the string, not /in/ the box, and many other strings may potentially also
be leading to the same object, whether originating from anonymous structural
binding points in other objects, or named binding points in name-tag-containing
objects/namespaces.
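The name-tags-with-strings metaphor is easy to make concrete:

```python
# Two name tags on different "bulletin boards" (namespaces), their
# strings tied to the same object:
a = [1, 2]

def elsewhere():
    b = a           # pin a second tag; nothing is copied
    b.append(3)     # follow the string and mutate the one shared object
    return b is a

assert elsewhere()
assert a == [1, 2, 3]

# A list's slots are string-tying points too: the object at the other
# end is not "inside" the list, and two slots can lead to one object.
pair = [a, a]
assert pair[0] is pair[1] is a
```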

Regards,
Bengt Richter
 

Ron Adam

Antoon said:
I disagree here. The problem with "global", at least how it is
implemented in python, is that you only have access to module
scope and not to intermediate scopes.

I also think there is another possibility. Use a symbol to mark
the previous scope. e.g. x would be the variable in local scope.
@.x would be the variable one scope up. @.@.x would be the
variable two scopes up etc.

Looks like what you want is easier introspection and the ability to get
the parent scope from it in a simple way. Maybe something like a
builtin '__self__' name that contains the information, then a possible
short 'sugar' method to access it. '__self__.__parent__' would become
@ in your example and '__self__.__parent__.__self__.__parent__' could
become @.@.

Something other than '@' would be better I think. A bare leading '.' is
another possibility. Then '..x' would be the x two scopes up.

This isn't the same as globals. Globals work the way they do because if
they weren't automatically visible to all objects in a module you
wouldn't be able to access any builtin functions or classes without
declaring them as global (or importing them) in every function or class
that uses them.

Cheers,
Ron
 

Terry Reedy

Bengt Richter said:
IOW, "...open the box and find the i'th item /in/ the box..." is not really
finding the i'th item _itself_ "/in/" the box. It is finding one end of a string
tied to some point /in/ the box, but the actual item/object is at the other end
of the string, not /in/ the box, and many other strings may potentially also
be leading to the same object, whether originating from anonymous structural
binding points in other objects, or named binding points in name-tag-containing
objects/namespaces.

The way I think of it is that Python's collective objects are like club
rosters: one person (object) can be on many rosters. A container would be
like a room, and a person could only be in one room at a time.

Terry J. Reedy
 

Bengt Richter

The way I think of it is that Python's collective objects are like club
rosters: one person (object) can be on many rosters. A container would be
like a room, and a person could only be in one room at a time.
Yes, the roster model works for me too, but I'm not sure I understand your
concept of "container/room" ;-) I.e., at the Python object level, object representations
themselves don't contain each other in the sense of the memory layout of nested
C structs, UIAM? Obviously there are C structs for basic object representation information
layout, but object-level parts are aggregated by reference via pointers rather than
by memory adjacency (again UIAM). E.g., a list of floats is not an array of doubles in memory.
It's not even a list of pointers to doubles in cpython, I believe, even though
one could conceive of a form of object reference handle/pointers with type clues
in the LSBs that could make pointing to a float object be represented exactly
identically to a C pointer to double. (Since ints are much more commonly used,
that would be a waste though, I think, since you would want to use LSB bits to
encode differentiation between the most common primitive types for special handling,
and it would probably be nice to do simple integer arithmetic without needing to mask,
but this is getting to other topics ;-)

Regards,
Bengt Richter
 

Antoon Pardon

Op 2005-08-16 said:
Do you really not understand the reason, or do you simply disagree with
it?

How can I understand something that was never explained to me? Each time
I saw this coming up people answered the technical question about the
difference between rebinding and accessing or modification. I haven't seen
anyone answer the question at this level.
It's a choice with rational thought behind it.

Then please explain this rational thought instead of
just asserting that it is present.
 

Terry Reedy

Bengt Richter said:
Yes, the roster model works for me too, but I'm not sure I understand your
concept of "container/room" ;-)

I only meant that if collective objects were containers of objects like
rooms are containers of people, then an object could be in only 1
collective at a time. But that is importantly not true. Therefore
collectives are not containers.

I once mistakenly thought of mathematical sets as being like boxes. Don't
know if someone else said so or if I just thought up that error on my own.
But then I realized that the box model leads to the same counterfactual
conclusion. Therefore 'box' is a bad metaphor. Sets are rosters. The
very term 'member of' is a clue that I missed for years ;-) I hope to help
others avoid the same mistake.

The roster idea also explains how a set can be a 'member' of itself, and
how a list can include itself. Weird, perhaps, but easily possible.

The underlying problem is that 'contains' has two meanings: a room
contains people by actual presence and hence is a container. A club roster
metaphorically contains people by name (reference) as members, but not
actually, and hence is not a container even though we may speak of it as
'containing'.
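Both points, one object on many rosters, and a roster that lists itself, are directly observable in Python (no physical container could do either):

```python
person = "Alice"
roster_a = [person]          # one object...
roster_b = [person]          # ...on two rosters at once
assert roster_a[0] is roster_b[0]

lst = []                     # a list that 'contains' itself: the slot
lst.append(lst)              # holds a reference, not the list's body
assert lst[0] is lst
```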

Terry J. Reedy
 
