Pythonification of the asterisk-based collection packing/unpacking syntax


Steven D'Aprano

This is really going to be the last time I waste any words on this:

If only that were true, but after sending this post, you immediately
followed up with FIVE more posts on this subject in less than half an
hour.

The sentence 'collection unpacking is a type constraint' is entirely
nonsensical.

How true. Given that you have now acknowledged this fact, can you please
stop insisting that collection unpacking is a type constraint?

A type constraint is a linguistic construct that can be
applied in many ways; typically, to narrow down the semantics of use of
the symbol to which the type constraint is applied.

Traceback (most recent call last):
RuntimeError: maximum recursion depth exceeded

In case of python, collection PACKING (not unpacking) is signaled by a
construct that can be viewed as a type constraint.

Only by doing sufficient violence to the concept of type constraint that
it could mean *anything*.

Earlier, I insisted that a constraint is a rule that applies to input,
not output. I haven't seen anyone, including yourself, dispute that.

Normally we would say that in the statement:

    y = list(x)

there is a constraint on x, namely that it is some sort of iterable
object (otherwise an exception will be raised), but it would be an abuse
of language to say that there is a type constraint on y. y ends up being
a list, true, but that isn't a constraint on y, it is an outcome.

In normal usage, "constraint" refers to pre-conditions, not post-
conditions. There are no pre-conditions on y in the above. It may not
even exist. Contrast it with this example:

    y += list(x)

where there are constraints on y: it must exist, and it must be a list,
or something which can be added to a list.
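The distinction can be seen directly in plain Python. The sketch below (assuming ordinary CPython scoping semantics; the function names are illustrative only) shows that `y = list(x)` never reads `y`, while `y += list(x)` does, and so carries a genuine pre-condition:

```python
# A minimal sketch of the pre-condition distinction:
# 'y = list(x)' never reads y, while 'y += list(x)' does.

def plain_assignment(x):
    # No pre-condition on y: it need not exist, and whatever it was is rebound.
    y = list(x)
    return y

def augmented_assignment(x):
    # Pre-condition on y: it is read before it is written, so it must
    # already be bound (and must support '+=' with a list).
    y += list(x)
    return y

print(plain_assignment("abc"))   # ['a', 'b', 'c']

try:
    augmented_assignment("abc")
except UnboundLocalError:
    print("pre-condition violated: y was never bound")
```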
 

Steven D'Aprano

I would like to be able to write something like:

a, middle::tuple, b = ::sequence

Where I would like the extra :: before the sequence to explicitly signal
collection unpacking on the rhs, to maintain the symmetry with
collection unpacking within a function call.

The :: on the right-hand side is redundant, because the left-hand side
already explicitly signals collection unpacking of the RHS. Requiring ::
on the RHS above is as unnecessary as it would be here:

n = len(::sequence)
 

Eelco

Take it from me Eelco. Once Alex drops into your thread and starts
name calling, it's over my friend.

Yes, he has quite worn out my patience; what's over is our (attempts
at) two-sided communication, but I hope to continue the constructive
lines of argument in this thread.
 

Eelco

Not in Python, where it is a very common idiom.

I know we are talking about python; it was me that put that in the
title, after all. I know python makes more use of this than some
languages (and less than others; I wouldn't suggest such a verbose
syntax for a functional language, for instance). Anyway, braces are
used at least an order of magnitude more than collection packing/
unpacking in typical code.
 

Eelco

Not apart from the trivial case of two identifiers separated by newlines.

What's your point?

My point is as I originally stated it: that this construct, of two
identifiers separated by non-newline whitespace, as in 'list tail',
does not occur anywhere else in python, so introducing that syntax,
while I suppose technically possible, would be a break with existing
expectations. Normally speaking, if two identifiers interact, they are
explicitly 'joined' by an infixed operator of some sort, as in 3*4,
rather than * 3 4. That seems a sensible rule to me, and I see no
compelling reason to depart from it.
 

Eelco

If only that were true, but after sending this post, you immediately
followed up with FIVE more posts on this subject in less than half an
hour.

Did I waste any more words on collection packing and type constraints?
No, I did not. (though I am about to, and am willing to do so for
every question that seems genuinely aimed at engaging me on the
matter)

Did I intend to say that I was going to let a single troll shut down
my entire topic? No, I did not.
Only by doing sufficient violence to the concept of type constraint that
it could mean *anything*.

Earlier, I insisted that a constraint is a rule that applies to input,
not output. I haven't seen anyone, including yourself, dispute that.

Normally we would say that in the statement:

    y = list(x)

there is a constraint on x, namely that it is some sort of iterable
object (otherwise an exception will be raised), but it would be an abuse
of language to say that there is a type constraint on y. y ends up being
a list, true, but that isn't a constraint on y, it is an outcome.

In normal usage, "constraint" refers to pre-conditions, not post-
conditions. There are no pre-conditions on y in the above. It may not
even exist. Contrast it with this example:

    y += list(x)

where there are constraints on y: it must exist, and it must be a list,
or something which can be added to a list.

Yes, indeed it would be abuse of language to call this a type
constraint, since the fact that y is a list is indeed an outcome of
whatever happens to pop out at the right hand side. One could redefine
the identifier list to return any kind of object.

How is 'head, *tail = sequence', or semantically entirely equivalently,
'head, tail::list = sequence', any different then? Of course after
interpretation/compilation, what it boils down to is that we are
constructing a list and binding it to the identifier tail, but that is
not how it is formulated in python as a language (all talk of types is
meaningless after compilation; machine code is untyped). We don't have
something of the form 'tail = list_tail(sequence)'. Rather, we
annotate the identifier 'tail' with an attribute that unquestionably
destines it to become a list*. It is no longer that 'tail' will just
take anything that pops out of the expression on the right hand side;
rather, the semantics of what goes on at the right hand side are
coerced by the constraint placed on 'tail'.
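For reference, this is what Python 3's existing extended unpacking actually does today: the starred target is always bound to a list, whatever the type of the right-hand side.

```python
# Python 3 extended unpacking: the starred name always receives a list,
# regardless of the right-hand side's type.

sequence = (1, 2, 3, 4)          # a tuple on the right...
head, *tail = sequence
print(head, tail)                # 1 [2, 3, 4]

head, *tail = "abcd"             # ...or a string: tail is still a list
print(head, tail)                # a ['b', 'c', 'd']
assert type(tail) is list
```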

But again, if you don't wish to view this as a type constraint, I won't
lose any sleep over that. In that case this line of argument was
simply never directed at you. It was directed at people who would
reasonably argue that 'tail::tuple is a type constraint and that's
unpythonic / type constraints have been considered and rejected'. If
you don't think it looks like a type constraint: fine. The simpler
argument is that whatever it is, it's just a more verbose and flexible
variant of a construct that python already has.

*(I call that a 'type constraint', because that is what it literally
is; if you can make a case that this term has acquired a different
meaning in practice, and that there is another term in common use for
this kind of construct, please enlighten me. Until that time, I'm going
to ask you to take 'type constraint' by its literal meaning: a
coercion of the type of a symbol, rather than whatever particular
meaning it has acquired for you (it might help if you explained that).
I'm not sure if it was you that brought that up, but let me reiterate
that I don't mean a 'type cast', which is a runtime concept. A 'type
constraint' is purely a linguistic construct that will be 'compiled
out'.)
 

Eelco

The :: on the right-hand side is redundant, because the left-hand side
already explicitly signals collection unpacking of the RHS. Requiring ::
on the RHS above is as unnecessary as it would be here:

Yes, it is redundant; hence the word 'extra' in my post.

Explicit and implicit are not well-defined terms, but I would say that
at the moment the signal is implicit, in the sense that one cannot see
what is going on by considering the rhs in isolation. Normally in
python, an assignment just binds the rhs to the identifiers on the
lhs, but in case of collection (un)packing, this rule that holds
almost all of the time is broken, and the assignment statement implies
a far more complicated construct, with a far more subtle meaning, and
non-constant time complexity.

That's not a terrible thing, but a little extra explicitness there
would not hurt, and, like I argued many times before, it is a nice
unification with the situation where the unpacking cannot be
implicit, like inside a function call rather than an assignment.

    n = len(::sequence)

Now you are just discrediting yourself in terms of having any idea
what you are talking about.
 

Steven D'Aprano

Anyway, braces are used at
least an order of magnitude more than collection packing/ unpacking in
typical code.

That's a wild and unjustified claim. Here's a quick and dirty test, using
the standard library as an example of typical idiomatic code:

[steve@orac ~]$ cd /usr/lib/python2.6
[steve@orac python2.6]$ grep "[*]args" *.py | wc -l
270
[steve@orac python2.6]$ grep "{" *.py | wc -l
550

Doesn't look like a factor of 10 difference to me.


And from one of my projects:

[steve@orac src]$ grep "[*]args" *.py | wc -l
267
[steve@orac src]$ grep "{" *.py | wc -l
8
 

Chris Angelico

Until that time, im going
to ask you to take 'type constraint' by its literal meaning; a
coercion of the type of a symbol, rather than whatever particular
meaning it has acquired for you (it might help if you explained that).
Im not sure if it was you that brought that up, but let me reiterate
that I dont mean a 'type cast', which is a runtime concept. A 'type
constraint' is purely a linguistic construct that will be 'compiled
out')

The dictionary definition of constraint is "a limitation or
restriction", and you're right that it can be "compiled out". In fact,
that is probably the best definition. Assuming everything is written
correctly, you should be able to eliminate all constraints and the
code will still function correctly*; but having the constraints means
that certain illegal operations will throw errors.

Here's two examples of tuple unpacking, one with a type constraint,
the other without:

a, b = ('hello', [1,2,3] )
a, b::list = ('hello', [1,2,3] )

The constraint on the second line means that, if the second element is
not a list, the interpreter should throw an error. It does NOT mean to
completely change the meaning of the statement to _make_ the last
argument into a list. That is not the job of a constraint.
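Since '::' is hypothetical syntax, the intended behaviour can only be sketched as a runtime stand-in (the function name is illustrative, not a proposal): the constraint *checks* the unpacked value's type; it never converts it.

```python
# Runtime sketch of the hypothetical 'a, b::list = pair' constraint:
# verify the type of the second element, never convert it.

def unpack_with_list_constraint(pair):
    a, b = pair
    if not isinstance(b, list):       # the 'b::list' part
        raise TypeError("second element must be a list")
    return a, b

print(unpack_with_list_constraint(('hello', [1, 2, 3])))    # passes

try:
    unpack_with_list_constraint(('hello', (1, 2, 3)))       # tuple: rejected
except TypeError as e:
    print("constraint violated:", e)
```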

ChrisA

* In databasing, it's not uncommon to have code depend on error
responses for correct operation; for instance, one implementation of
UPSERT is to attempt an INSERT, and if it fails due to a unique key
constraint, do the UPDATE instead. The same is also done in Python -
eg using an exception to terminate a loop - but in the context of this
discussion, assume that errors indicate errors.
 

Steven D'Aprano

Did I waste any more words on collection packing and type constraints?
No, I did not. (though I am about to, and am willing to do so for every
question that seems genuinely aimed at engaging me on the matter)

Did I intend to say that I was going to let a single troll shut down my
entire topic? No, I did not.

Ah, well whatever you *intended* wasn't clear from your comment. At least
not clear to *me*.


[...]
Yes, indeed it would be abuse of language to call this a type
constraint, since the fact that y is a list is indeed an outcome of
whatever happens to pop out at the right hand side. One could redefine
the identifier list to return any kind of object.

So far, we agree on this.
How is 'head, *tail = sequence' or semantically entirely equivalently,
'head, tail::list = sequence' any different then? Of course after
interpretation/compilation, what it boils down to is that we are
constructing a list and binding it to the identifier tail, but that is
not how it is formulated in python as a language

I'm afraid it is.

Here's the definition of assignment in Python 3:
http://docs.python.org/py3k/reference/simple_stmts.html#assignment-statements

(all talk of types is
meaningless after compilation; machine code is untyped).

What does machine code have to do with Python?

We dont have
something of the form 'tail = list_tail(sequence)'.

I'm afraid we do. See the definition of assignment again.

Rather, we annotate
the identifier 'tail' with an attribute that unquestionably destinates
it to become a list*. It is no longer that 'tail' will just take
anything that pops out of the expression on the right hand side;

Of course it will. Python is a dynamically typed language. It doesn't
suddenly develop static types to ensure that 'tail' becomes a list;
'tail' is bound to a list because that's what the assignment statement
provides.

rather,
the semantics of what will go on at right hand side is coerced by the
constraint placed on 'tail'.

But it isn't a constraint placed on 'tail'. It is a consequence of the
definition of assignment in Python 3. 'tail' becomes bound to a list
because that is what the assignment statement is defined to do in that
circumstance, not because the identifier (symbol) 'tail' is constrained
to only accept lists. 'tail' may not even exist before hand, so talking
about constraints on 'tail' is an abuse of language, AS YOU AGREED ABOVE.


[...]
*(I call that a 'type constraint', because that is what it literally is;


No. It is literally a name binding of a dynamically typed, unconstrained
name to an object which happens to be a list.

if you can make a case that this term has acquired a different meaning
in practice, and that there is another term in common use for this kind
of construct; please enlighten me. Until that time, im going to ask you
to take 'type constraint' by its literal meaning; a coercion of the type
of a symbol,

But THERE IS NO COERCION OF THE TYPE OF THE SYMBOL.

I am sorry for shouting, but you seem oblivious to the simple fact that
Python is not statically typed, and the symbol 'tail' is NOT coerced to a
specific type. 'tail' may not even exist before the assignment is made;
if it does exist, it could be *any type at all* -- and after the
assignment takes place, there are no restrictions on subsequent
assignments.

'head, *tail = sequence' is no more a type constraint than 'x = 1' is.

Whatever the virtues of your proposal, you are doing it incalculable harm
by your insistence on this incorrect model of Python's behaviour.
 

Eelco

Anyway,  braces are used at
least an order of magnitude more than collection packing/ unpacking in
typical code.

That's a wild and unjustified claim. Here's a quick and dirty test, using
the standard library as an example of typical idiomatic code:

[steve@orac ~]$ cd /usr/lib/python2.6
[steve@orac python2.6]$ grep "[*]args" *.py | wc -l
270
[steve@orac python2.6]$ grep "{" *.py | wc -l
550

Doesn't look like a factor of 10 difference to me.

Now try it without changing the subject from round braces to
everything but round braces.
 

Eelco

Until that time, im going
to ask you to take 'type constraint' by its literal meaning; a
coercion of the type of a symbol, rather than whatever particular
meaning it has acquired for you (it might help if you explained that).
Im not sure if it was you that brought that up, but let me reiterate
that I dont mean a 'type cast', which is a runtime concept. A 'type
constraint' is purely a linguistic construct that will be 'compiled
out')

The dictionary definition of constraint is "a limitation or
restriction", and you're right that it can be "compiled out". In fact,
that is probably the best definition. Assuming everything is written
correctly, you should be able to eliminate all constraints and the
code will still function correctly*; but having the constraints means
that certain illegal operations will throw errors.

Here's two examples of tuple unpacking, one with a type constraint,
the other without:

a, b = ('hello', [1,2,3] )
a, b::list = ('hello', [1,2,3] )

The constraint on the second line means that, if the second element is
not a list, the interpreter should throw an error. It does NOT mean to
completely change the meaning of the statement to _make_ the last
argument into a list. That is not the job of a constraint.

Thank you for providing clarification on what a 'type constraint'
means to you. That clears things up a bit.

What you are talking about goes by the name of a 'dynamic type CHECK';
some kind of syntactic sugar for something like
'assert(type(obj)==sometype)'. Like a 'type cast', this is also a
runtime concept. How you manage to confuse that with what I am talking
about, given that I've stated many times I am not talking about a
runtime construct but a compile-time construct, is quite beyond me.
(Not to mention that I've quite explicitly stated what I mean by 'type
constraint' many times now.)

By contrast, here is the first google hit for 'type constraint'.

http://msdn.microsoft.com/en-us/library/d5x73970.aspx

Note that this is a different application of the concept of a type
constraint, but nonetheless, the concept is as I stated it: a
constraint on the type of a symbol that modifies its compile-time
semantics. To cite from its first paragraph:

"...you can apply restrictions to the kinds of types ... by using a
type that is not allowed by a constraint, the result is a COMPILE-TIME
ERROR" (emphasis mine)
 

Chris Angelico

Now try it without changing the subject from round braces to
everything but round braces.

Around here, the term "braces" means the curly ones - { and } - that
delimit blocks of code in C, and dictionaries/sets in Python.
"Brackets" may be what you're looking for, if you mean all of ()[]{}.
Or if you just mean (), they're called "parentheses".

If your point is that parens are used more often than
packing/unpacking, that's almost certainly true, since function calls
(including method invocations) are so prevalent in pretty much any
code. But what does that prove?

ChrisA
 

Chris Angelico

What you are talking about goes by the name of a 'dynamic type CHECK';
some kind of syntactic sugar for something like
'assert(type(obj)==sometype)'. Like a 'type cast', this is also a
runtime concept...

By contrast, here is the first google hit for 'type constraint'.

http://msdn.microsoft.com/en-us/library/d5x73970.aspx

"...you can apply restrictions to the kinds of types ... by using a
type that is not allowed by a constraint, the result is a COMPILE-TIME
ERROR" (emphasis mine)

A constraint can be applied at compile time or at run time. It'd be
valid to apply them at edit time, if you so chose - your editor could
refuse to save your file until you fix the problem. Doesn't mean a
thing. Python, by its nature, cannot do compile-time type checking.
Under no circumstances, however, does this justify the use of the term
"constraint" to mean "utterly different semantics of the same code".

ChrisA
 

Eelco

Ah, well whatever you *intended* wasn't clear from your comment. At least
not clear to *me*.

Always glad to help.
So far, we agree on this.
Good.


I'm afraid it is.

Here's the definition of assignment in Python 3:
http://docs.python.org/py3k/reference/simple_stmts.html#assignment-statements

Que?

'head, *tail = sequence'

Is how one currently unpacks a head and tail in idiomatic python

This is semantically equivalent to

'head = sequence[0]'
'tail = list(sequence[1:])'

But these forms are linguistically different, in too many different
ways to mention.
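The claimed equivalence can be checked directly (it holds for indexable sequences; note that a bare iterator cannot be sliced, yet extended unpacking still accepts it):

```python
# Verifying the equivalence of 'head, *tail = sequence' with explicit
# indexing and slicing, for an indexable sequence.

sequence = (10, 20, 30)

head, *tail = sequence
assert head == sequence[0]
assert tail == list(sequence[1:])

# Extended unpacking also works on a plain iterator, which has no slicing:
head2, *tail2 = iter(sequence)
print(head2, tail2)              # 10 [20, 30]
```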
I'm afraid we do. See the definition of assignment again.

Que?

My claim is that the two semantically identical formulations above do
not have isomorphic linguistic form. As far as I can make sense of
your words, you seem to be disputing this claim, but it's a claim as
much worth debating as that the sun rises in the east.
Of course it will. Python is a dynamically typed language. It doesn't
suddenly develop static types to ensure that 'tail' becomes a list;
'tail' is bound to a list because that's what the assignment statement
provides.

How python accomplishes any of this under the hood is entirely
immaterial. The form is that of a compile-time type constraint,
regardless of whether the BDFL ever thought about it in these terms.
But it isn't a constraint placed on 'tail'. It is a consequence of the
definition of assignment in Python 3. 'tail' becomes bound to a list
because that is what the assignment statement is defined to do in that
circumstance, not because the identifier (symbol) 'tail' is constrained
to only accept lists. 'tail' may not even exist before hand, so talking
about constraints on 'tail' is an abuse of language, AS YOU AGREED ABOVE.

'tail' is (re)declared on the spot as a brand-new identifier (type
constraint included); whether it exists before has no significance
whatsoever, since python allows rebinding of identifiers.
No. It is literally a name binding of a dynamically typed, unconstrained
name to an object which happens to be a list.

Let me take a step back and reflect on the form of the argument we are
having. I claim the object in front of us is a 'cube'. You deny this
claim, by countering that it is 'just a particular configuration of
atoms'.

'Look at the definition!', you say; 'it's just a list of coordinates,
no mention of cubes whatsoever.'

'Look at the definition of a cube', I counter. 'This particular list
of coordinates happens to fit the definition, whether the BDFL
intended it to or not.'

You are correct, in the sense that I do not disagree that it is a
particular configuration of atoms. But if you had a better
understanding of cubes, you'd realize it meets the definition; the fact
that people might not make a habit of pausing at this fact (there is,
generally speaking, not much of a point in doing so in this case) does
not diminish the validity of this perspective. But again, if this
perspective does not offer you anything useful, feel free not to share
in it.
But THERE IS NO COERCION OF THE TYPE OF THE SYMBOL.

I am sorry for shouting, but you seem oblivious to the simple fact that
Python is not statically typed, and the symbol 'tail' is NOT coerced to a
specific type. 'tail' may not even exist before the assignment is made;
if it does exist, it could be *any type at all* -- and after the
assignment takes place, there are no restrictions on subsequent
assignments.

'head, *tail = sequence' is no more a type constraint than 'x = 1' is.

Whatever the virtues of your proposal, you are doing it incalculable harm
by your insistence on this incorrect model of Python's behaviour.

Whatever the virtues of your critique of my proposal, you might end up
wasting less time doing so by reading a book or two on compiler theory.
 

Eelco

Now try it without changing the subject from round braces to
everything but round braces.

Around here, the term "braces" means the curly ones - { and } - that
delimit blocks of code in C, and dictionaries/sets in Python.
"Brackets" may be what you're looking for, if you mean all of ()[]{}.
Or if you just mean (), they're called "parentheses".

If your point is that parens are used more often than
packing/unpacking, that's almost certainly true, since function calls
(including method invocations) are so prevalent in pretty much any
code. But what does that prove?

That proves the original point of contention: that the below* is
suboptimal language design, not because terseness always trumps
verbosity, but because commonly-used constructs (such as parentheses,
or round brackets, or whatever you wish to call them) are more
deserving of the limited space in both the ascii table and your
reflexive memory than uncommonly used ones.


*original mock code by steve:

class MyClass superclasslist A, B C:
def method argumentlist self, x, y:
t = tuple 1, 2 tuple 3, 4 endtuple endtuple
return group x + y endgroup * group x - y endgroup
 

Eelco

A constraint can be applied at compile time or at run time. It'd be
valid to apply them at edit time, if you so chose - your editor could
refuse to save your file until you fix the problem. Doesn't mean a
thing.

A constraint in the sense that I have explained many times now can in
no way, shape or form be applied at run time. You'd have better luck
applying a consternation to a squirrel. Perhaps you meant 'type check'
again? But then again, that makes no sense whatsoever at compile-
time... I'm starting to doubt if there is any sense to be found here at
all.

Anyway, I'll take your further silence on the matter as a 'sorry I
derailed your thread with my confusion of terminology'.
Python, by its nature, cannot do compile-time type checking.

Python can do whatever its designers have put into it. In this case,
that includes the emission of different code based on a (type)
annotation at the point of declaration of an identifier (only in the
particular circumstance of collection unpacking though, as far as I am
aware).
Under no circumstances, however, does this justify the use of the term
"constraint" to mean "utterly different semantics of the same code".

Thank you for your theory on justice. I'm sure it's fascinating, but
until you get around to actually explaining it, I'm going to have to be
conservative and stick with the jargon in common use, sorry.
 

Chris Angelico

That proves the original point of contention: that [Steve's demo code] is
suboptimal language design, not because terseness always trumps
verbosity, but because commonly-used constructs (such as parenthesis
or round brackets or whatever you wish to call them) are more
deserving of the limited space in both the ascii table and your
reflexive memory, than uncommonly used ones.

In Magic: The Gathering R&D, they have a term (the article reporting
which I can't actually find at the moment) called "spread complexity"
or "fan complexity" - the idea being that as you fan out a booster
pack, you see a certain amount of complexity in front of you. The
designers can afford to put more complex cards in as rares than they
can as commons, because you see ten commons for every rare - so a
common factors ten times as much as a rare in spread complexity. (Mark
Rosewater, my apologies if I'm misremembering here!)

The same applies here. When you cast your eye over a program, you're
going to see certain syntactic elements a lot. Assignment, arithmetic,
blocks of code (ie indent/dedent), and function calls are all
extremely common; lambdas, the use of decorators, and exception
handling are somewhat uncommon; and metaclasses, the writing of
decorators, and reloading of modules are all quite rare.

The elements that occur frequently should be:
a) Readable and grokkable;
b) Easily typed on a regular keyboard - no using ASCII character 126
to mean negation, tyvm!
c) Make sense.

Rarer elements (and I'm not talking about xenon and plutonium here)
are allowed to have long names, obscure syntax, or even be shoved away
in odd modules (the way module reloading is in Python 3). If 0.1% of
your code is suddenly twice as large as it used to be, will you
notice? But if a new syntax adds even 5% to the mindspace requirement
of basic assignment, your code will majorly suffer.

In summary: Terseness trumps verbosity primarily for common
operations, and only when doing so does not violate rules a and c
above.

ChrisA
 

Chris Angelico

A constraint in the sense that I have explained many times now, can in
no way, shape or form be applied at run time. Youd have better luck
applying a consternation to a squirrel. Perhaps you meant 'type check'
again? But then again, that makes no sense whatsoever at compile-
time... Im starting to doubt if there is any sense to be found here at
all.

A constraint failure causes an error at the time it's discovered.
1) Your editor pops up a message the instant you type something with
such an error, and forces you to correct it before going on.
2) Your compiler refuses to produce byte-code for the module.
3) When the line of code is executed, an exception is thrown.
All of these are valid ways of handling a constraint. Here is a Python
example of a type constraint:

def foo(arg):
    if not isinstance(arg, list):
        raise TypeError("This won't work.")

If you call it with something that's not a list, you get an error.
Call it with a list, and execution continues normally. That's a
constraint. Of course, it might not be a _type_ constraint:

def foo(arg):
    if arg > 5:  # and yes, that's oddly truer in Python 3
        raise ValueError("This won't work either.")


(Aside: In Pike, I can actually put that into a type constraint
(declare the argument to be int(5..) for instance). This, however, is
not germane to the conversation.)
Anyway, ill take your further silence on the matter as a 'sorry I
derailed your thread with my confusion of terminology'

Like Mary Poppins, I never apologize for derailing threads :)
Python can do whatever its designers have put into it. In this case,
that includes the emission of different code based on a (type)
annotation at the point of declaration of an identifier (only in the
particular circumstance of collection unpacking though, as far as I am
aware).

Of course, but how can Python, without a completely different
structure, do compile-time checking of object types? Everything can be
monkey-patched at run-time. Anything could be affected by any function
call. It's impossible to say for certain, at compile time, what data
type anything will have.
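Chris's point can be illustrated with a deliberately contrived sketch: even a builtin can be rebound at run time, so no compile-time analysis of this function could pin down what 'len' will do when it is finally called.

```python
import builtins

def measure(x):
    # 'len' is looked up at call time, not compile time.
    return len(x)

print(measure([1, 2, 3]))          # 3

# Rebind the builtin at run time: the same bytecode now behaves differently.
original = builtins.len
builtins.len = lambda x: -1
print(measure([1, 2, 3]))          # -1
builtins.len = original            # restore
```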

ChrisA
 

Eelco

That proves the original point of contention: that [Steve's demo code] is
suboptimal language design, not because terseness always trumps
verbosity, but because commonly-used constructs (such as parenthesis
or round brackets or whatever you wish to call them) are more
deserving of the limited space in both the ascii table and your
reflexive memory, than uncommonly used ones.

In Magic: The Gathering R&D, they have a term (the article reporting
which I can't actually find at the moment) called "spread complexity"
or "fan complexity" - the idea being that as you fan out a booster
pack, you see a certain amount of complexity in front of you. The
designers can afford to put more complex cards in as rares than they
can as commons, because you see ten commons for every rare - so a
common factors ten times as much as a rare in spread complexity. (Mark
Rosewater, my apologies if I'm misremembering here!)

The same applies here. When you cast your eye over a program, you're
going to see certain syntactic elements a lot. Assignment, arithmetic,
blocks of code (ie indent/dedent), and function calls are all
extremely common; lambdas, the use of decorators, and exception
handling are somewhat uncommon; and metaclasses, the writing of
decorators, and reloading of modules are all quite rare.

The elements that occur frequently should be:
a) Readable and grokkable;
b) Easily typed on a regular keyboard - no using ASCII character 126
to mean negation, tyvm!
c) Make sense.

Rarer elements (and I'm not talking about xenon and plutonium here)
are allowed to have long names, obscure syntax, or even be shoved away
in odd modules (the way module reloading is in Python 3). If 0.1% of
your code is suddenly twice as large as it used to be, will you
notice? But if a new syntax adds even 5% to the mindspace requirement
of basic assignment, your code will majorly suffer.

In summary: Terseness trumps verbosity primarily for common
operations, and only when doing so does not violate rules a and c
above.

ChrisA

Good to see there is something we agree upon completely.

Not that I mean to say the question as to how verbose a syntax is
appropriate for collection (un)packing is settled; one could
reasonably argue they find tail::tuple too verbose. But parentheses
are not a terribly good example to compare to, since they are in fact
so much more used that they are clearly in another category.

*args and **kwargs are debatable in the appropriateness of their
terseness (but I personally like to err on the side of verbosity), but
extended collection unpacking, as in 'head,*tail=sequence', is quite a
rare construct indeed, and here I very strongly feel a more explicit
syntax is preferable. That is, as a seasoned python 2 user, I wouldn't
have been willing to gamble on what this does when I'd come across it
for the first time in python 3. It could as well be a completely new use
of the asterisk. But if collection packing/unpacking had been
presented as a more general construct from the start,
'head,tail::tuple=sequence' would be hard to miss.
 
