Which one to use: generate_tokens or tokenize?

André Roberge

According to the Python documentation:

18.5 tokenize -- Tokenizer for Python source
....
The primary entry point is a generator:
generate_tokens(readline)
....
An older entry point is retained for backward compatibility:
tokenize(readline[, tokeneater])
====
Does this mean that one should preferably use generate_tokens? If so,
what are the advantages?

André Roberge
 
Tim Peters

[André Roberge]
According to the Python documentation:

18.5 tokenize -- Tokenizer for Python source
...
The primary entry point is a generator:
generate_tokens(readline)
...
An older entry point is retained for backward compatibility:
tokenize(readline[, tokeneater])
====
Does this mean that one should preferably use generate_tokens?
Yes.

If so, what are the advantages?

Be adventurous: try them both. You'll figure it out quickly. If you
have to endure "an explanation" first, read PEP 255, where
tokenize.tokenize was used as an example motivating the desirability
of introducing generators.
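To make the difference concrete, here is a minimal sketch (assuming Python 3, where the old callback-style tokenize(readline, tokeneater) entry point has since been removed): generate_tokens takes any readline callable and yields tokens lazily, one at a time, instead of pushing every token through a callback.

```python
import io
import tokenize

source = "x = 1 + 2\n"

# generate_tokens is a generator: tokens are produced lazily on demand,
# so you can stop early or feed them into any iterator pipeline.
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

for tok in tokens:
    print(tokenize.tok_name[tok.type], repr(tok.string))
```

Because it is a generator, you can also break out of the loop after the first interesting token without tokenizing the rest of the file, which the callback style could not do cleanly.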
 
