Deleting from a list while iterating

Rhamphoryncus

The problems with this are well known, and a suggestion for making it
easier was recently posted on python-dev. However, I believe this can
be done just as well without a change to the language. What's more,
most of the suggested methods (in my search results as well as in the
suggestion itself) do not scale well, a problem my approach solves.

My approach is to make a set of indexes to be removed while iterating,
then use a list comprehension to filter them out afterwards. Timings of
this and two other common approaches follow:

setapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    remove = set()
    for index, x in enumerate(items):
        #...do something...
        if x < 0.5:
            remove.add(index)
    items = [x for index, x in enumerate(items) if index not in remove]
"""

copyapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    for x in items[:]:
        if x < 0.5:
            items.remove(x)
"""

reverseapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    for index in range(len(items) - 1, -1, -1):
        if items[index] < 0.5:
            del items[index]
"""
0.0011308193206787109
0.038264989852905273
2.2382969856262207

As you can see, although reverse iteration is somewhat faster at
smaller sizes, a set is substantially faster at larger sizes, and I
believe it is more readable anyway. Copying shouldn't even be considered
unless you know the size will always be trivial (< 1000).

I'm sure there are a few other approaches that would do even better under
certain conditions. One is a generator, if your input and output
should both be iterators. Another is using slicing to move contiguous
sections of retained items over the removed items. I leave both of
these as an exercise for the reader.
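
A minimal sketch of the generator variant (the function name, the 0.5 test,
and the input size are illustrative assumptions, not from the original post):

from random import random

def keep_items(source):
    """Yield only the elements that survive the 'do something' phase."""
    for x in source:
        #...do something...
        if not x < 0.5:
            yield x

items = (random() for i in xrange(100000))
filtered = keep_items(items)  # both input and output remain lazy iterators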
 
Marc 'BlackJack' Rintsch

Rhamphoryncus said:
My approach is to make a set of indexes to be removed while iterating,
then use a list comprehension to filter them out afterwards. Timings of
this and two other common approaches follow:

setapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    remove = set()
    for index, x in enumerate(items):
        #...do something...
        if x < 0.5:
            remove.add(index)
    items = [x for index, x in enumerate(items) if index not in remove]
"""

Why do you make it that complicated? If you are going to build a new list
anyway, this can be done without the `set()` and just one listcomp:

items = [x for x in items if not x < 0.5]

No need to iterate twice over the `items`. The two other approaches you
gave are just needed if it's important that the elements are deleted "in
place", i.e. that you don't rebind `items` to a new object.

Ciao,
Marc 'BlackJack' Rintsch
 
Fredrik Lundh

Rhamphoryncus said:
As you can see, although reverse iteration is somewhat faster at
smaller sizes, a set is substantially faster at larger sizes, and I
believe is more readable anyway.

your set approach doesn't modify the list in place, though; it creates
a new list, in a rather roundabout way. if modification in place isn't
important, the normal way is of course to create a *new* list:

items = [i for i in items if not i < 0.5]

on my machine, that's about two orders of magnitude faster than your
"fast" approach for n=100000.

(or twice as fast, if I don't factor out the time it takes to *create*
the original list from the benchmark).

</F>
 
Fredrik Lundh

Fredrik said:
on my machine, that's about two orders of magnitude faster than your
"fast" approach for n=100000.

oops. forget that; it's three times faster, if you're actually creating
the entire list, and not just a generator that will create it on demand ;-)

</F>
 
Fredrik Lundh

Marc said:
No need to iterate twice over the `items`. The two other approaches you
gave are just needed if it's important that the elements are deleted "in
place", i.e. that you don't rebind `items` to a new object.

and even when you do, that can often be written as, e.g:

items[:] = (i for i in items if not i < 0.5)

</F>
 
Martin v. Löwis

Rhamphoryncus said:
setapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    remove = set()
    for index, x in enumerate(items):
        #...do something...
        if x < 0.5:
            remove.add(index)
    items = [x for index, x in enumerate(items) if index not in remove]
"""

This is different from the other approaches in that it doesn't
modify items. If you wanted a new list, you could incrementally
build one already in the first pass, no need to collect the
indices first (as BlackJack explains).

If you wanted in-place modification, I'd do

newitems = []
for x in items:
    if not (x < 0.5):
        newitems.append(x)
items[:] = newitems

Rhamphoryncus said:
copyapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    for x in items[:]:
        if x < 0.5:
            items.remove(x)
"""

This happens to work for your example, but is incorrect
in the general case: you meant to write

del items[i+removed]

here, as .remove(x) will search the list again, looking
for the first value to remove. If your condition for
removal happens to leave some items equal to x in the
list, it would remove the wrong item. Because the numbering
changes while the iteration is in progress, you have
to count the number of removed items also.
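
A minimal sketch of that index-and-offset correction (variable names are
placeholders; the offset is subtracted because each deletion shifts the
remaining elements one position to the left):

from random import random

items = [random() for i in xrange(1000)]

removed = 0
for i, x in enumerate(items[:]):   # iterate over a copy so the indices stay stable
    #...do something...
    if x < 0.5:
        del items[i - removed]     # adjust for the elements already deleted
        removed += 1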

Regards,
Martin
 
Rhamphoryncus

Marc said:
Why do you make it that complicated? If you are going to build a new list
anyway, this can be done without the `set()` and just one listcomp:

Fredrik said:
your set approach doesn't modify the list in place, though; it creates
a new list, in a rather roundabout way. if modification in place isn't
important, the normal way is of course to create a *new* list:

items = [i for i in items if not i < 0.5]

on my machine, that's about two orders of magnitude faster than your
"fast" approach for n=100000.

Sorry, I should have clarified that the original post assumed you
needed info from the "do something" phase to determine if an element is
removed or not. As you say, a list comprehension is superior if that
is not necessary.
 
Fredrik Lundh

Rhamphoryncus said:
Sorry, I should have clarified that the original post assumed you
needed info from the "do something" phase to determine if an element is
removed or not. As you say, a list comprehension is superior if that
is not necessary.

that's spelled

out = []
for i in items:
    # ... do something ...
    if i > 0.5:
        out.append(i)

in Python, and is only a little slower than a list comprehension, as
written above. if performance is really important, move the method
lookup out of the loop:

out = []
append = out.append
for i in items:
    # ... do something ...
    if i > 0.5:
        append(i)

</F>
 
Rhamphoryncus

Martin said:
Rhamphoryncus said:
setapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    remove = set()
    for index, x in enumerate(items):
        #...do something...
        if x < 0.5:
            remove.add(index)
    items = [x for index, x in enumerate(items) if index not in remove]
"""

This is different from the other approaches in that it doesn't
modify items. If you wanted a new list, you could incrementally
build one already in the first pass, no need to collect the
indices first (as BlackJack explains).

I didn't feel this distinction was worth mentioning, since "oldlist[:]
= newlist" is so trivial. The only solution that really avoids making
a copy is the reverse-iteration one.

If you wanted in-place modification, I'd do

newitems = []
for x in items:
    if not (x < 0.5):
        newitems.append(x)
items[:] = newitems

I agree, that does seem simpler.

copyapproach = """\
def func(count):
    from random import random
    items = [random() for i in xrange(count)]
    for x in items[:]:
        if x < 0.5:
            items.remove(x)
"""

This happens to work for your example, but is incorrect
in the general case: you meant to write

del items[i+removed]

here, as .remove(x) will search the list again, looking
for the first value to remove. If your condition for
removal happens to leave some items equal to x in the
list, it would remove the wrong item. Because the numbering
changes while the iteration is in progress, you have
to count the number of removed items also.

I agree that the example I gave here sucks. However, I copied it from
another posting as a recommended method to get removal to work *right*
(without noticing how slow it is).

There seems to have been many distinct approaches to this problem over
the years. Hopefully we can converge on a single ideal solution (or a
few situational ones) as TOOWTDI.
 
Martin v. Löwis

Rhamphoryncus said:
This is different from the other approaches in that it doesn't
modify items. If you wanted a new list, you could incrementally
build one already in the first pass, no need to collect the
indices first (as BlackJack explains).

I didn't feel this distinction was worth mentioning, since "oldlist[:]
= newlist" is so trivial. The only solution that really avoids making
a copy is the reverse-iteration one.

The distinction is relevant for performance, as the slice assignment
has a complexity linear in the number of elements.

Whether making a copy is expensive depends on the precise data: the
solution that avoids the copy by deleting in reverse may have quadratic
complexity (if the number of removed elements grows with the list size);
the version that appends all remaining elements to a new list and
then does a slice assignment should behave better than quadratically
(in recent versions, it should give you linear performance).
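
A rough timing sketch of that contrast (function names and list sizes are
arbitrary choices for the example):

from random import random
from timeit import timeit

def reverse_delete(items):
    # each `del` can be O(n), so the whole loop can be quadratic
    for index in range(len(items) - 1, -1, -1):
        if items[index] < 0.5:
            del items[index]

def rebuild_in_place(items):
    # one linear pass plus one linear slice assignment
    items[:] = [x for x in items if not x < 0.5]

for count in (10000, 100000):
    data = [random() for i in xrange(count)]
    t_reverse = timeit(lambda: reverse_delete(data[:]), number=1)
    t_rebuild = timeit(lambda: rebuild_in_place(data[:]), number=1)
    print count, t_reverse, t_rebuild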

Regards,
Martin
 
