Use of generators and efficiency


John O'Hagan

Hi,

I'm writing a Python program using combinatorial algorithms to generate music.
It was originally of this general form:


First, a combinatorial function producing a list of sub-lists, then:

For each sub-list in the list:
Filter/modifier A, then append the modified sub-list to a new list.

For each sub-list in the new list:
Filter/modifier B, then append the modified sub-list to a new list.

...and so on for C, D, etc.

Finally, for each sub-list in the final list, output the sub-list
(print/play as notes).
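
In rough outline it looked something like this (combinations(), modifier_a()
and modifier_b() are just placeholders standing in for my real combinatorial
function and filters, and the note numbers are made up):

def combinations():
    # stand-in for the combinatorial function: returns the full list of sub-lists
    return [[0, 4, 7], [2, 5, 9], [0, 3, 7, 10]]

def modifier_a(sub):
    # stand-in for a simple filter: drop sub-lists that aren't the right length
    return sub if len(sub) == 3 else None

def modifier_b(sub):
    # stand-in for a transformation: e.g. transpose up an octave
    return [note + 12 for note in sub]

candidates = combinations()

stage_a = []
for sub in candidates:
    modified = modifier_a(sub)
    if modified is not None:
        stage_a.append(modified)

stage_b = []
for sub in stage_a:
    stage_b.append(modifier_b(sub))

# ...and so on for C, D, etc.

for sub in stage_b:
    print(sub)  # or play as notes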


Because some of the combinatorial operations were slow
to produce results, I changed the program to be of the form:


A combinatorial generator producing one small list per call, then:

Filter/modify the list through A, B, etc.
Output the modified list.
Call the generator for the next list.
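
The rewritten version is roughly like this (again with placeholder names;
combinations_gen() stands in for my real combinatorial generator, and the
modifiers are the same stand-ins as in the previous sketch):

def combinations_gen():
    # stand-in for the combinatorial generator: yields one sub-list per call
    for sub in ([0, 4, 7], [2, 5, 9], [0, 3, 7, 10]):
        yield sub

def modifier_a(sub):
    # stand-in for a simple filter: drop sub-lists that aren't the right length
    return sub if len(sub) == 3 else None

def modifier_b(sub):
    # stand-in for a transformation: e.g. transpose up an octave
    return [note + 12 for note in sub]

for sub in combinations_gen():
    sub = modifier_a(sub)
    if sub is None:
        continue              # filtered out; fetch the next sub-list
    sub = modifier_b(sub)
    # ...and so on for C, D, etc.
    print(sub)                # or play as notes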


To my great surprise, this approach was often considerably _slower_ to
complete than the original program (up to ~40% depending on which modifiers
were used), despite producing initial results more quickly.

Some possibly relevant details: each sub-list consists of a short sequence of
small numbers representing musical notes, but there may be millions of them;
the modifiers range from simple tests (e.g. "if len(sublist) != n: continue")
to relatively elaborate rearrangements and transformations.

My question is: how can it be quicker to unpack and rebuild a list for each
modifier than to run all modifications on each element in turn?

Any explanations, comments or advice?

Thanks,

John O'Hagan
 
