Re: PEP 3107 Function Annotations for review and comment

Discussion in 'Python' started by Björn Lindqvist, Dec 30, 2006.

  1. On 12/29/06, Tony Lownds <> wrote:
    > Rationale
    > =========
    >
    > Because Python's 2.x series lacks a standard way of annotating a
    > function's parameters and return values (e.g., with information about
    > what type a function's return value should be), a variety of tools
    > and libraries have appeared to fill this gap [#tailexamp]_. Some
    > utilise the decorators introduced in "PEP 318", while others parse a
    > function's docstring, looking for annotations there.
    >
    > This PEP aims to provide a single, standard way of specifying this
    > information, reducing the confusion caused by the wide variation in
    > mechanism and syntax that has existed until this point.


    I think this rationale is very lacking and too weak for such a big
    change to Python. I'd definitely like to see it expanded.

    The reference links to two small libraries implementing type checking
    using decorators and doc strings. Neither of them seems to be very
    popular in the Python community. Surely those two libraries *alone*
    can't be enough motivation for this? To me, it is far from
    self-evident what purpose function annotations would serve.

    I also wonder why a very obtrusive syntax addition is needed when it
    clearly is possible to annotate functions in today's Python. Why is
    syntax better than just adding a function annotation decorator to the
    standard library?

    @annotate(a=int, b=dict, c=int)
    def foo(a, b, c=5):
        ...

    Are decorators too ugly?
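    For concreteness, such an annotate decorator could be sketched in a few
    lines. This is hypothetical (no such decorator exists in the standard
    library); it simply records its keyword arguments on the function
    object, using the same __annotations__ attribute the PEP proposes:

    ```python
    # Hypothetical "annotate" decorator: stores its keyword arguments
    # on the decorated function, mimicking what PEP 3107 syntax does.
    def annotate(**kwds):
        def decorator(func):
            func.__annotations__ = kwds
            return func
        return decorator

    @annotate(a=int, b=dict, c=int)
    def foo(a, b, c=5):
        ...

    print(foo.__annotations__)
    # {'a': <class 'int'>, 'b': <class 'dict'>, 'c': <class 'int'>}
    ```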

    --
    mvh Björn
    Björn Lindqvist, Dec 30, 2006
    #1

  2. Björn Lindqvist wrote:

    > On 12/29/06, Tony Lownds <> wrote:
    >> Rationale
    >> =========
    >>
    >> Because Python's 2.x series lacks a standard way of annotating a
    >> function's parameters and return values (e.g., with information about
    >> what type a function's return value should be), a variety of tools
    >> and libraries have appeared to fill this gap [#tailexamp]_. Some
    >> utilise the decorators introduced in "PEP 318", while others parse a
    >> function's docstring, looking for annotations there.
    >>
    >> This PEP aims to provide a single, standard way of specifying this
    >> information, reducing the confusion caused by the wide variation in
    >> mechanism and syntax that has existed until this point.

    >
    > I think this rationale is very lacking and too weak for such a big
    > change to Python. I'd definitely like to see it expanded.
    >
    > The reference links to two small libraries implementing type checking
    > using decorators and doc strings. Neither of them seems to be very
    > popular in the Python community. Surely those two libraries *alone*
    > can't be enough motivation for this? To me, it is far from
    > self-evident what purpose function annotations would serve.
    >
    > I also wonder why a very obtrusive syntax addition is needed when it
    > clearly is possible to annotate functions in today's Python. Why is
    > syntax better than just adding a function annotation decorator to the
    > standard library?
    >
    > @annotate(a=int, b=dict, c=int)
    > def foo(a, b, c=5):
    >     ...
    >
    > Are decorators too ugly?


    I prefer the proposed syntax - it is much more concise and modeled after
    well-known declaration syntaxes in other languages. Additionally, it spares
    us the doubled parameter list of your example above - and that is important
    here, I'd say.

    Typing is a difficult and controversial subject. However, sooner or later
    Python will grow JIT compilers that will take advantage of such
    declarations, and I think it is better to have one accepted way of doing it
    hardwired than several competing home-grown implementations.

    Diez
    Diez B. Roggisch, Dec 30, 2006
    #2

  3. John Roth

    Björn Lindqvist wrote:
    > On 12/29/06, Tony Lownds <> wrote:
    > > Rationale
    > > =========
    > >
    > > Because Python's 2.x series lacks a standard way of annotating a
    > > function's parameters and return values (e.g., with information about
    > > what type a function's return value should be), a variety of tools
    > > and libraries have appeared to fill this gap [#tailexamp]_. Some
    > > utilise the decorators introduced in "PEP 318", while others parse a
    > > function's docstring, looking for annotations there.
    > >
    > > This PEP aims to provide a single, standard way of specifying this
    > > information, reducing the confusion caused by the wide variation in
    > > mechanism and syntax that has existed until this point.

    >
    > I think this rationale is very lacking and too weak for such a big
    > change to Python. I'd definitely like to see it expanded.
    >
    > The reference links to two small libraries implementing type checking
    > using decorators and doc strings. Neither of them seems to be very
    > popular in the Python community. Surely those two libraries *alone*
    > can't be enough motivation for this? To me, it is far from
    > self-evident what purpose function annotations would serve.
    >
    > I also wonder why a very obtrusive syntax addition is needed when it
    > clearly is possible to annotate functions in today's Python. Why is
    > syntax better than just adding a function annotation decorator to the
    > standard library?
    >
    > @annotate(a=int, b=dict, c=int)
    > def foo(a, b, c=5):
    >     ...
    >
    > Are decorators too ugly?
    >
    > --
    > mvh Björn


    The problem I have with it is that it doesn't solve the problem
    I've got, and I can see some user requests to use it rather than
    the metadata solution I've got now in Python FIT. Neither do
    decorators, by the way.

    So, what are the problems I see?

    First, it only handles functions/methods. Python FIT needs
    metadata on properties and assignable/readable attributes
    of all kinds. So in no sense is it a replacement. Parenthetically,
    neither is the decorator facility, and for exactly the same reason.

    Second, it has the potential to make reading the function
    header difficult. In the languages I'm familiar with, static type
    declarations are a very few, somewhat well chosen words.
    In this proposal, it can be a general expression. In Python
    FIT, that could well turn into a full blown dictionary with
    multiple keys.

    Third, it's half of a proposal. Type checking isn't the only use
    for metadata about functions/methods, classes, properties
    and other objects, and the notion that there are only going to
    be a small number of non-intersecting libraries out there is
    an abdication of responsibility to think this thing through.

    I should note that there are quite a few packages out there
    that use some form of annotation, be they comments
    (like Ned Batchelder's coverage analyzer and the two
    lint packages I'm aware of), docstrings, decorators or
    auxiliary dictionaries (like Python FIT, and a possible
    Python version of Naked Objects). They include a
    fair number of documentation packages.

    On a positive note, what I'd like is something similar to
    Java's Javadoc, but a bit looser. It could be a comment
    convention like Javadoc, but one that the compiler recognizes
    and stashes in the compiled .pyc / .pyo file. Or it could have
    different syntax. What it SHOULDN'T have is a mandatory tie
    to function/method syntax.

    Combined with a convention to identify which annotation
    belongs to whom, it could be a quite useful mechanism.
    I, for one, have no difficulty with the notion of using someone
    else's annotations if I can identify them unambiguously.

    John Roth
    Python FIT
    John Roth, Dec 30, 2006
    #3
  4. Tim Smith

    Here's a potentially nifty way of adding decorators to input args for Python:

    def a(int(arg1), arg2, tuple(arg3)):
        # arg1 is an int (or was converted to an int)
        # arg2's type is not known (i.e. this decoration is optional)
        # arg3 is a tuple (may have been a list coming in, but is now a tuple)
        pass

    this would add optional conversion of input arguments to desired types
    (you could drop the parens, making it more like standard type syntax, but I put them there to signal that int() will be called on every incoming arg1, and so on)

    this would also add ability to write your own conversion functions to handle type checking as arguments come into a function

    should add little to no overhead (as you are likely doing this manually, like so, if desired):

    def a(arg1, arg2, arg3):
        arg1 = int(arg1)
        arg3 = tuple(arg3)
        pass

    addendum:
    any type conversion method should throw ValueError on failure; this would allow Python to catch the error and raise a new exception (InputArgError or something similar)

    so this:
    def a(int(arg1), arg2, tuple(arg3)):
        pass

    would more or less translate to this:
    def a(arg1, arg2, arg3):
        try:
            arg1 = int(arg1)
            arg3 = tuple(arg3)
        except ValueError:
            raise InputArgError("what went wrong")
        pass

    it would likely be desirable to create some extra builtin functions like
    convdict, convlist and convtuple that, if the input is already of that type,
    return it unmodified (as opposed to calling the dict, list or tuple
    constructor, which creates a whole new object and copies all the data)

    another nice side effect of this is it adds the ability to call by value
    instead of by reference:

    def a(list(b)):
        pass  # call by value

    def a(convlist(b)):
        pass  # call by reference (unless input type wasn't list)
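    For what it's worth, this conversion scheme can be prototyped in today's
    Python with a decorator. A hypothetical sketch - convert_args, convtuple
    and InputArgError are all made-up names, not existing library code:

    ```python
    import functools

    class InputArgError(ValueError):
        """Hypothetical exception raised when an argument fails conversion."""

    def convert_args(*converters):
        # Apply each converter to the matching positional argument;
        # a converter of None means "leave that argument unchanged".
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args):
                converted = list(args)
                for i, conv in enumerate(converters):
                    if conv is not None and i < len(args):
                        try:
                            converted[i] = conv(args[i])
                        except (ValueError, TypeError) as exc:
                            raise InputArgError(str(exc))
                return func(*converted)
            return wrapper
        return decorator

    def convtuple(x):
        # Return x unchanged if already a tuple; otherwise copy into one.
        return x if isinstance(x, tuple) else tuple(x)

    @convert_args(int, None, convtuple)
    def a(arg1, arg2, arg3):
        return arg1, arg2, arg3

    print(a("5", "anything", [1, 2]))   # (5, 'anything', (1, 2))
    ```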

    -- Tim



    -- On 12/30/06 "John Roth" <> wrote:

    > Björn Lindqvist wrote:
    > > On 12/29/06, Tony Lownds <> wrote:
    > > > Rationale
    > > > =========
    > > >
    > > > Because Python's 2.x series lacks a standard way of annotating a
    > > > function's parameters and return values (e.g., with information about
    > > > what type a function's return value should be), a variety of tools
    > > > and libraries have appeared to fill this gap [#tailexamp]_. Some
    > > > utilise the decorators introduced in "PEP 318", while others parse a
    > > > function's docstring, looking for annotations there.
    > > >
    > > > This PEP aims to provide a single, standard way of specifying this
    > > > information, reducing the confusion caused by the wide variation in
    > > > mechanism and syntax that has existed until this point.

    > >
    > > I think this rationale is very lacking and too weak for such a big
    > > change to Python. I'd definitely like to see it expanded.
    > >
    > > The reference links to two small libraries implementing type checking
    > > using decorators and doc strings. Neither of them seems to be very
    > > popular in the Python community. Surely those two libraries *alone*
    > > can't be enough motivation for this? To me, it is far from
    > > self-evident what purpose function annotations would serve.
    > >
    > > I also wonder why a very obtrusive syntax addition is needed when it
    > > clearly is possible to annotate functions in today's Python. Why is
    > > syntax better than just adding a function annotation decorator to the
    > > standard library?
    > >
    > > @annotate(a=int, b=dict, c=int)
    > > def foo(a, b, c=5):
    > >     ...
    > >
    > > Are decorators too ugly?
    > >
    > > --
    > > mvh Björn

    >
    > The problem I have with it is that it doesn't solve the problem
    > I've got, and I can see some user requests to use it rather than
    > the metadata solution I've got now in Python FIT. Neither do
    > decorators, by the way.
    >
    > So, what are the problems I see?
    >
    > First, it only handles functions/methods. Python FIT needs
    > metadata on properties and assignable/readable attributes
    > of all kinds. So in no sense is it a replacement. Parenthetically,
    > neither is the decorator facility, and for exactly the same reason.
    >
    > Second, it has the potential to make reading the function
    > header difficult. In the languages I'm familiar with, static type
    > declarations are a very few, somewhat well chosen words.
    > In this proposal, it can be a general expression. In Python
    > FIT, that could well turn into a full blown dictionary with
    > multiple keys.
    >
    > Third, it's half of a proposal. Type checking isn't the only use
    > for metadata about functions/methods, classes, properties
    > and other objects, and the notion that there are only going to
    > be a small number of non-intersecting libraries out there is
    > an abdication of responsibility to think this thing through.
    >
    > I should note that there are quite a few packages out there
    > that use some form of annotation, be they comments
    > (like Ned Batchelder's coverage analyzer and the two
    > lint packages I'm aware of), docstrings, decorators or
    > auxiliary dictionaries (like Python FIT, and a possible
    > Python version of Naked Objects). They include a
    > fair number of documentation packages.
    >
    > On a positive note, what I'd like is something similar to
    > Java's Javadoc, but a bit looser. It could be a comment
    > convention like Javadoc, but one that the compiler recognizes
    > and stashes in the compiled .pyc / .pyo file. Or it could have
    > different syntax. What it SHOULDN'T have is a mandatory tie
    > to function/method syntax.
    >
    > Combined with a convention to identify which annotation
    > belongs to whom, it could be a quite useful mechanism.
    > I, for one, have no difficulty with the notion of using someone
    > else's annotations if I can identify them unambiguously.
    >
    > John Roth
    > Python FIT
    >
    > --
    > http://mail.python.org/mailman/listinfo/python-list
    Tim Smith, Dec 30, 2006
    #4
  5. Tony Lownds

    > First, it only handles functions/methods. Python FIT needs
    > metadata on properties and assignable/readable attributes
    > of all kinds. So in no sense is it a replacement. Parenthetically,
    > neither is the decorator facility, and for exactly the same reason.
    >


    I can't argue against docstrings and maybe annotations on attributes;
    I'd like them myself. That should be a separate PEP, because the scope
    of this one is Function Annotations.

    The syntax for function annotations has been much more thoroughly
    discussed than for annotations on attributes. See Guido's blog and
    other references in the PEP.

    > Second, it has the potential to make reading the function
    > header difficult. In the languages I'm familiar with, static type
    > declarations are a very few, somewhat well chosen words.
    > In this proposal, it can be a general expression. In Python
    > FIT, that could well turn into a full blown dictionary with
    > multiple keys.


    Any code can be hard to read. Having the annotation be a general
    expression lets authors use normal code factoring to make the
    function header more readable. For instance, one can change this:

    def f(x: some_really_long_expression):
        ...


    to this:

    t_X = some_really_long_expression
    def f(x: t_X):
        ...
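    Either spelling yields the same runtime data, since under the PEP the
    annotation expression is evaluated once at definition time and stored
    in the function's __annotations__ dict. A sketch, using int as a
    stand-in for the long expression:

    ```python
    # The annotation expression is evaluated at definition time and
    # recorded in the function's __annotations__ mapping.
    t_X = int  # stand-in for some_really_long_expression

    def f(x: t_X) -> str:
        return str(x)

    print(f.__annotations__)
    # {'x': <class 'int'>, 'return': <class 'str'>}
    ```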

    > Third, it's half of a proposal. Type checking isn't the only use
    > for metadata about functions/methods, classes, properties
    > and other objects, and the notion that there are only going to
    > be a small number of non-intersecting libraries out there is
    > an abdication of responsibility to think this thing through.
    >


    That comes from this paragraph from the PEP:

    There is no worry that these libraries will assign semantics at
    random, or that a variety of libraries will appear, each with
    varying semantics and interpretations of what, say, a tuple of
    strings means. The difficulty inherent in writing annotation
    interpreting libraries will keep their number low and their
    authorship in the hands of people who, frankly, know what they're
    doing.

    Perhaps you are right and intersecting libraries will become an issue.
    Designing a solution in advance of the problems being evident seems
    risky to me. What if the solution invented in a vacuum really is more
    of a hindrance?

    There is a clear intersection between documentation packages and
    type-checking or type-coercing libraries. Documentation libraries can
    just use repr(annotation), possibly with a little bit of special casing
    to represent classes and types better.

    I'm not sure there will be an important use for overlap of different
    type-checking or type-coercing libraries within the same module. What
    else could intersect, and why can't the intersecting pieces develop a
    solution when it arises?

    More feedback from the community on this point (whether the PEP needs to
    take responsibility for interoperability) would be nice.

    Thanks for the feedback from everyone so far,
    -Tony
    Tony Lownds, Dec 30, 2006
    #5
  6. John Roth

    Tony Lownds wrote:
    > > First, it only handles functions/methods. Python FIT needs
    > > metadata on properties and assignable/readable attributes


    > ...
    >
    > > Third, it's half of a proposal. Type checking isn't the only use
    > > for metadata about functions/methods, classes, properties
    > > and other objects, and the notion that there are only going to
    > > be a small number of non-intersecting libraries out there is
    > > an abdication of responsibility to think this thing through.
    > >

    >
    > That comes from this paragraph from the PEP:
    >
    > There is no worry that these libraries will assign semantics at
    > random, or that a variety of libraries will appear, each with
    > varying semantics and interpretations of what, say, a tuple of
    > strings means. The difficulty inherent in writing annotation
    > interpreting libraries will keep their number low and their
    > authorship in the hands of people who, frankly, know what they're
    > doing.
    >
    > Perhaps you are right and intersecting libraries will become an issue.
    > Designing a solution in advance of the problems being evident seems
    > risky to me. What if the solution invented in a vacuum really is more
    > of a hindrance?


    Vacuum? What vacuum? Annotating programs for various tools
    has a history that goes back almost to the beginning of programming
    languages. It goes back even farther if you want to look at scholarly
    use of natural language.

    The only vacuum I see here is the focus on a solution
    rather than a focus on a problem.

    > There is a clear intersection between documentation packages and
    > type-checking or type-coercing libraries. Documentation libraries can
    > just use repr(annotation), possibly with a little bit of special
    > casing to
    > represent classes and types better.
    >
    > I'm not sure there will be an important use for overlap of different
    > type-checking or type-coercing libraries within the same module. What
    > else could intersect, and why can't the intersecting pieces develop a
    > solution when it arises?
    >
    > More feedback from the community on this point (whether the PEP needs to
    > take responsibility for interoperability) would be nice.


    At the moment the project I'm working on has annotations
    from three packages: Python FIT (which is the actual package),
    Ned Batchelder's code coverage tool, and one of the code
    standards tools. None of these are either documentation
    or type checking / coercion.

    The point I'm making is that there are other uses for
    annotations than type checking and documentation.
    I specifically referenced Python FIT because it is neither
    of these: it is an executable requirements test tool that
    can be used as an acceptance test tool as well.

    I also mentioned Naked Objects because it's got the
    same issue: it needs type and other metadata information.
    There isn't a current Python implementation that I'm
    aware of, although I'm thinking of it as a next project.

    > Thanks for the feedback from everyone so far,


    I stripped the comment about syntax from this
    response because I want to address it separately.

    John Roth
    > -Tony
    John Roth, Dec 31, 2006
    #6
  7. John Roth

    Tony Lownds wrote:
    > > First, it only handles functions/methods. Python FIT needs
    > > metadata on properties and assignable/readable attributes
    > > of all kinds. So in no sense is it a replacement. Parenthetically,
    > > neither is the decorator facility, and for exactly the same reason.
    > >

    >
    > I can't argue against docstrings and maybe annotations on attributes,
    > I'd like them myself. That should be a separate PEP because the scope
    > of this one is Function Annotations.
    >
    > The syntax for function annotations has been much more thoroughly
    > discussed than for annotations on attributes. See Guido's blog and
    > other references in the PEP.


    Syntax is always an issue. Looking at the history of annotations
    shows that people seem to prefer using a comment mechanism.
    This goes along with the notion of supplying minimal mechanism
    until you see what the actual usage is going to be.

    As far as I'm concerned, an annotation mechanism has to have
    several properties:

    1. The annotation data needs to be available at run time without
    having the source available.

    2. It needs to be syntax checkable by a mechanism supplied
    by the author of the annotation schema. I'd suggest a hook
    at the back end of import, since this doesn't get in the
    way of the compiler.

    The converse of this, of course, is that neither the language
    nor the compiler needs to have any idea of the actual syntax.
    This provides maximal freedom to experiment.

    3. It needs to have a convention that will allow authors of
    different schemas to stay out of each other's way.

    Docstrings almost manage this. While they're certainly
    available at run time (at least if you don't compile in a
    way that strips them out), you can only have one in any
    module, class or method. This means you can't always
    put them where you want them, that is, close to the
    item that they're annotating. Parenthetically, I'd note that
    adding docstring capabilities to properties was a definite
    step forward.
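    That last point can be seen directly in current Python: a property picks
    up its getter's docstring, so the annotation sits right next to the item
    it describes. A minimal sketch:

    ```python
    class Point:
        def __init__(self, x):
            self._x = x

        @property
        def x(self):
            "The x coordinate."  # the getter's docstring becomes the property's
            return self._x

    p = Point(3)
    print(p.x)              # 3
    print(Point.x.__doc__)  # The x coordinate.
    ```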

    John Roth

    >
    > Thanks for the feedback from everyone so far,
    > -Tony
    John Roth, Dec 31, 2006
    #7
  8. Tony Lownds

    On Dec 31, 2006, at 7:54 AM, John Roth wrote:
    > Tony Lownds wrote:
    >> Perhaps you are right and intersecting libraries will become an
    >> issue.
    >> Designing a solution in advance of the problems being evident seems
    >> risky to me. What if the solution invented in a vacuum really is more
    >> of a hindrance?

    >
    > Vacuum? What vacuum?


    No libraries use the new syntax yet, hence no libraries can currently
    be intersecting in their use of it.

    [...]
    > At the moment the project I'm working on has annotations
    > from three packages: Python FIT (which is the actual package),
    > Ned Batchelder's code coverage tool, and one of the code
    > standards tools. None of these are either documentation
    > or type checking / coercion.
    >


    Let me see if I can guess how those tools will use function annotations.
    You've decided Python FIT can't use function annotations. The code
    coverage tool won't even use function annotations. It's possible
    packages like pylint will learn to interpret function annotations to
    provide better static analysis. Right?

    The problem is, pylint may only understand annotation scheme A while
    the module author has written the annotations in scheme B. Is this the
    kind of "intersection" issue you had in mind? Do you have any more
    specific cases to consider?

    Thanks
    -Tony
    Tony Lownds, Dec 31, 2006
    #8
  9. Paul Boddie

    Tony Lownds wrote:
    >
    > It's possible packages like pylint will learn to interpret function
    > annotations to provide better static analysis. Right?


    It's true that for the area to be explored, which I know you've been
    doing, one first has to introduce an annotation scheme that can then be
    used by things like pylint. I'd like to see assertions about the
    usefulness of such annotations verified by modified versions of tools
    like pylint before changes to the language are made, mostly because
    such assertions seem to be more conjecture than prediction. In other
    words, the changes should be advocated, implemented and tested in a
    closed system before being introduced as wider language changes,
    especially given that Python has already seen a number of speculative
    changes which were made in anticipation of certain needs that
    subsequently appeared to be less significant than first thought.

    Another thing I find worrying about function annotations is the
    ambivalence around their purpose: the feature is supposedly great for
    static typing, but when confronted over the consequences of having
    developers spray type declarations everywhere, we're told that they
    aren't really meant for such things and that type declarations are only
    an example of what annotations could do. Here, the sales department and
    the engineering department really have to get together and get their
    story straight.

    Paul
    Paul Boddie, Jan 1, 2007
    #9
  10. Tony Lownds

    On Jan 1, 2007, at 1:53 PM, Paul Boddie wrote:
    > It's true that for the area to be explored, which I know you've been
    > doing, one first has to introduce an annotation scheme that can
    > then be
    > used by things like pylint. I'd like to see assertions about the
    > usefulness of such annotations verified by modified versions of tools
    > like pylint before changes to the language are made, mostly because
    > such assertions seem to be more conjecture than prediction. In other
    > words, the changes should be advocated, implemented and tested in a
    > closed system before being introduced as wider language changes,
    > especially given that Python has already seen a number of speculative
    > changes which were made in anticipation of certain needs that
    > subsequently appeared to be less significant than first thought.
    >


    I can understand reluctance to change the language, but keep in mind
    this is Python 3000, not Python 2.x, and even accepted features do get
    removed sometimes, like the access statement. I'm sure that feature
    qualifies as a speculative change -- I'd be curious to know what other
    ones you were thinking about.

    At least one such modified tool exists: pydoc, which will display
    annotations. Does that count?
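    For what it's worth, tools like pydoc read the annotations straight off
    the function object; in later Python versions the standard inspect
    module exposes them too. A minimal sketch:

    ```python
    import inspect

    def f(x: int, y: "metadata") -> str:
        return str(x)

    # inspect.signature (Python 3.3+) renders the stored annotations.
    sig = inspect.signature(f)
    print(sig)                             # (x: int, y: 'metadata') -> str
    print(sig.parameters['x'].annotation)  # <class 'int'>
    ```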

    It's not easy to develop syntax in a closed system that ISN'T Python
    itself, especially when part of the point is to consolidate the ways in
    which function arguments and return types get annotated, and to do so
    as readably as possible.


    > Another thing I find worrying about function annotations is the
    > ambivalence around their purpose: the feature is supposedly great for
    > static typing, but when confronted over the consequences of having
    > developers spray type declarations everywhere, we're told that they
    > aren't really meant for such things and that type declarations are
    > only
    > an example of what annotations could do. Here, the sales department
    > and
    > the engineering department really have to get together and get their
    > story straight.


    Can you point out any specific text from the PEP you derived this from?
    Then "sales" can work from something specific :)

    Thanks
    -Tony
    Tony Lownds, Jan 1, 2007
    #10
