How to copy multi object array contents into single object arrays?


Scott Sauyet

I found a little time to run a number of tests, but I don't have time
now to analyze them. Anyone interested in seeing the raw data from
JSLitmus tests in five browsers running against lists with 10, 100,
1000, and 10000 elements can see my results, and run their own tests
here:

http://scott.sauyet.com/Javascript/Test/LoopTimer/3/

And in a slightly updated page with the exact same data:

http://scott.sauyet.com/Javascript/Test/LoopTimer/4/

The early part of this thread gives the context for these test
results, most especially the following three messages:

http://groups.google.com/group/comp.lang.javascript/msg/dcc830996b8afcef
http://groups.google.com/group/comp.lang.javascript/msg/73c40b2b284d970a
http://groups.google.com/group/comp.lang.javascript/msg/6c56a2bac08daaa4


These are, respectively, the initial question posted by Tuxedo, a
response by me, and an alternative posted by Thomas Lahn. Note that
generally the comparisons that I'm looking at are between different
algorithms for the same browser at the same array size, although there
are some interesting questions when looking at a single algorithm in
one browser as the array size varies. At the end, I'll briefly
discuss the differences between browsers. Note that all tests are
performed on the same Windows XP SP2 box with dual 3GHz processors and
3.25 GB RAM.

Thomas' alternative made three distinct changes to my original
algorithm, which was, in essence:

var photos = [], captions = [], len = library.length;
for (var i = 0; i < len; i++) {
    photos.push(library[i]["img"]);
    captions.push(library[i]["caption"]);
}

The first change was to replace the calls to push with direct
assignment to the array indices. A new version might look like this:

var photos = [], captions = [], len = library.length;
for (var i = 0; i < len; i++) {
    photos[i] = library[i]["img"];
    captions[i] = library[i]["caption"];
}

The second change was to replace the repeated array lookups with a
temporary variable:

var photos = [], captions = [], len = library.length;
for (var i = 0; i < len; i++) {
    var o = library[i];
    photos[i] = o.img;
    captions[i] = o.caption;
}

And the final one was to run the iteration backward toward zero,
rather than forward:

var photos = [], captions = [], len = library.length;
for (var i = len; i--;) {
    var o = library[i];
    photos[i] = o.img;
    captions[i] = o.caption;
}
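
These variants can also be timed outside JSLitmus. The following is a
minimal hand-rolled sketch, not the actual test page; the `makeLibrary`
helper, its field values, and the run count are my own illustration:

```javascript
// Hypothetical stand-in for the thread's data: an array of
// {img, caption} objects, as used in all the variants above.
function makeLibrary(size) {
  var library = [];
  for (var i = 0; i < size; i++) {
    library.push({ img: "img" + i + ".jpg", caption: "Caption " + i });
  }
  return library;
}

// One of the eight variants: set + new var + decrement.
function setNewVarDecrement(library) {
  var photos = [], captions = [];
  for (var i = library.length; i--;) {
    var o = library[i];
    photos[i] = o.img;
    captions[i] = o.caption;
  }
  return { photos: photos, captions: captions };
}

// Crude wall-clock timing; JSLitmus calibrates run counts adaptively,
// which this sketch does not attempt.
function time(fn, library, runs) {
  var start = Date.now();
  for (var r = 0; r < runs; r++) fn(library);
  return (Date.now() - start) + "ms for " + runs + " runs";
}

console.log(time(setNewVarDecrement, makeLibrary(1000), 100));
```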

Each of these changes is logically independent, so I tested eight
different versions of the algorithm (calling them
"pushLookupIncrement", "pushLookupDecrement", "pushNewVarIncrement",
"pushNewVarDecrement", "setLookupIncrement", "setLookupDecrement",
"setNewVarIncrement", and "setNewVarDecrement") in each browser I
tested. The browsers I used were

Chrome 3.0.195.27
Firefox 3.5.7
Internet Explorer 8
Opera 9.64
Safari 4.0.3

When I refer to a browser by name below, my results are specific to
the version tested. I tested each of them a few times for each
configuration. The results (except in Firefox) were fairly
consistent, and the result reported for each is a representative run.

Because the size of the array to be copied has a likely impact on the
speed of the test, I ran the tests in every browser with initial
arrays of size 10, 100, 1000, and 10000. While it might be
interesting to analyze larger sizes, that didn't seem to have any
practical implication for most browser-based scripting, so it wasn't
pursued further.

One subsequent post [1] took me to task for not making my array
larger, noting that there would be significant noise from the test
harness for small array sizes. While this is true, the real question
was to see how these differing algorithms performed at different array
sizes, so we needed to test them at small array sizes as well as
larger ones. This necessarily involves some significant testing
error.

The first thing definitely gleaned from this is that setting the array
elements directly is clearly faster than pushing new elements onto the
array. In practically every instance, set is faster than push. As
the code is also cleaner, this leaves little doubt that in a loop like
this, where we're constructing new values for the end of the array and
have the index handy, it's best to set them directly. The only places
where this difference is less clear are Chrome with 10000 elements and
Firefox with 1000 or 10000 elements, and there the differences might
be attributable to noise.

The question of looking up each element via index versus a temporary
variable is much less clear. In most of these tests, the new variable
is faster by a small amount. I have not run the test with only one
array to set (that is, with, say, only "photos" and not "captions"),
but it wouldn't surprise me if that reversed this. I think in the end
there is not enough of a performance difference to matter much. The
project's coding style should probably dictate this choice.
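
The single-array case mentioned above would collapse the two styles
into nearly the same code; a sketch, with sample data of my own:

```javascript
// Hypothetical two-element library, just for illustration.
var library = [
  { img: "a.jpg", caption: "A" },
  { img: "b.jpg", caption: "B" }
];

// With a single target array there is only one library[i] lookup per
// iteration, so hoisting it into a temporary variable cannot help:
var photos = [];
for (var i = 0, len = library.length; i < len; i++) {
  photos[i] = library[i].img;
}
console.log(photos.join(","));  // "a.jpg,b.jpg"
```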

The most interesting question was one of incrementing or decrementing
the array iterator. In Chrome, there are only minor differences for
array size 10, but at 100 and 1000, decrement is significantly
faster. But by 10000, increment is twice as fast. In Firefox, 10 is
too inconsistent to be useful; at 100 decrement is somewhat faster,
but at 1000 and 10000, increment is noticeably faster. In Internet
Explorer, for 10, decrement is a bit faster, but for 100, 1000, or
10000, increment is substantially faster. In Opera, decrement is
consistently a little faster than increment. In Safari, decrement is
noticeably faster for array sizes 10 and 100, but just barely faster
for the larger sizes.

Jorge pointed out [2] that this changes drastically if we up the ante
to 20,000 array elements. In fact in Safari with 20000 array
elements, the setLookupIncrement and setNewVarIncrement functions
are, over a number of tests, between 5 and 25 times as fast as their
*Decrement counterparts. I followed this up in the other browsers
[3], although I haven't put it on the results page yet, and there is a
clear turn-around for all tested browsers -- except Opera -- somewhere
between 10000 and 20000 elements, although in no other browser is it
as drastic as it is in Safari.

The upshot is that decrement is generally better at smaller array
sizes, but you're probably better off with increment as the array gets
larger. Where the cut-off is will depend on the ES implementation
most used for your script. In IE, it's fairly low, between 10 and
100; in Chrome, it's between 1000 and 10000; in Firefox, between 100
and 1000; and in Safari, between 10000 and 20000. And in Opera, I
haven't found any such cut-off.

This was not the point of the tests, but it is interesting to note in
the raw numbers just how different the various ECMAScript
implementations are in speed. The reasons are probably well known to
this group, but the raw numbers can be quite astounding. At 100
elements, using setNewVarDecrement, we get the following number of
operations per second:

Chrome: 417,142
Firefox: 9,981
IE: 8,861
Opera: 32,789
Safari: 238,953

That's a factor of 47 between speeds in Chrome versus IE!

With all that said, I want to make it clear that performance is only
one consideration in choosing an algorithm. Unless there's a known
performance problem, I would not sacrifice code clarity to gain a
performance boost. But if you're having performance problems
assigning array elements inside a loop, I hope this data may be of
help.

-- Scott
____________________
[1] http://groups.google.com/group/comp.lang.javascript/msg/2b7a059a6c672e7e
[2] http://groups.google.com/group/comp.lang.javascript/msg/59e22efd4f3425df
[3] Chrome: http://tinyurl.com/ycrp3hp
Firefox: http://tinyurl.com/ybqlxd6
IE: http://tinyurl.com/yckh6vn
Opera: http://tinyurl.com/y9s3u8v
Safari: http://tinyurl.com/y9lk62j
 

Jorge

(...)
Jorge pointed out [2] that this changes drastically if we up the ante
to 20,000 array elements.  In fact in Safari with 20000 array
elements, the setLookupIncrement and setNewVarIncrement functions
are, over a number of tests, between 5 and 25 times as fast as their
*Decrement counterparts.  I followed this up in the other browsers
[3], although I haven't put it on the results page yet, and there is a
clear turn-around for all tested browsers -- except Opera -- somewhere
between 10000 and 20000 elements, although in no other browser is it
as drastic as it is in Safari.

The upshot is that decrement is generally better at smaller array
sizes, but you're probably better off with increment as the array gets
larger.  Where the cut-off is will depend on the ES implementation
most used for your script.  In IE, it's fairly low, between 10 and
100; in Chrome, it's between 1000 and 10000; in Firefox, between 100
and 1000; and in Safari, between 10000 and 20000.  And in Opera, I
haven't found any such cut-off.
(...)

This thread might interest you:
http://groups.google.com/group/comp...7b2de1edad1/f140443fc7dbc8a8#d45575d71a6c117a
 

Scott Sauyet

Jorge pointed out [2] that this changes drastically if we up the ante
to 20,000 array elements.  [ ... ]

This thread might interest you: http://groups.google.com/group/comp...7b2de1edad1/f140443fc7dbc8a8#d45575d71a6c117a

Thanks. I did see that one first time through. Although it's
interesting in its own right, I don't think it's relevant to this
discussion. We are not pre-allocating the arrays, unless, in one of
the decrement algorithms, an ES implementation does a pre-allocation
when faced with

var photos = [];
photos[10000] = //...

But it's interesting re-reading!

Cheers,

-- Scott
 

Jorge

Jorge pointed out [2] that this changes drastically if we up the ante
to 20,000 array elements.  [ ... ]

Thanks.  I did see that one first time through.  Although it's
interesting in its own right, I don't think it's relevant to this
discussion.(...)

Safari stores array elements in fast storage "slots" up to a limit,
but no more: "Our policy for when to use a vector and when to use a
sparse map. For all array indices under MIN_SPARSE_ARRAY_INDEX, we
always use a vector. When indices greater than MIN_SPARSE_ARRAY_INDEX
are involved, we use a vector as long as it is 1/8 full. If more
sparse than that, we use a map."

MIN_SPARSE_ARRAY_INDEX happens to be 1e4.
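
Although the vector/map switch is internal to Safari, the sparseness
that can trigger it is visible from any engine's script; a sketch:

```javascript
// Filling backward: the very first write lands at the highest index,
// so for a moment the array holds one element out of 20000 slots.
var backward = [];
backward[19999] = "x";
console.log(backward.length);  // 20000
console.log(0 in backward);    // false: indices 0..19998 are holes

// Filling forward never creates holes; each write extends the dense
// prefix by one element.
var forward = [];
forward[0] = "x";
console.log(forward.length);   // 1
```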
 

Lasse Reichstein Nielsen

Scott Sauyet said:
The most interesting question was one of incrementing or decrementing
the array iterator. In Chrome, there are only minor differences for
array size 10, but at 100 and 1000, decrement is significantly
faster. But by 10000, increment is twice as fast. In Firefox, 10 is
too inconsistent to be useful; at 100 decrement is somewhat faster,
but at 1000 and 10000, increment is noticeably faster.

The reason for this is that Chrome (and some of the other browsers
too) has special support for sparse arrays (which are slower than
non-sparse arrays, but save significant amounts of memory if the
array stays sparse). By starting the assignment from the end, the
array starts out looking very sparse.

If instead you allocate the array with its full size from the start,
i.e.:
var photos = new Array(library.length);
var captions = new Array(library.length);
then the arrays will be non-sparse and won't even need to grow
while filling. That should reduce the difference between increment
and decrement to just the handling of the loop variable.

The upshot is that decrement is generally better at smaller array
sizes, but you're probably better off with increment as the array gets
larger.

Try decrement with a pre-allocated array (although I'm not sure which
browsers apart from Chrome allow using new Array(number) to allocate
a non-sparse array).
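
In terms of the earlier variants, that suggestion might be sketched
like this (sample data mine; whether new Array(len) really yields a
non-sparse backing store is engine-dependent, per the caveat above):

```javascript
// A small stand-in for the thread's library of {img, caption} objects.
var library = [
  { img: "a.jpg", caption: "A" },
  { img: "b.jpg", caption: "B" },
  { img: "c.jpg", caption: "C" }
];

// setNewVarDecrement with pre-allocated result arrays: both arrays
// start at their final length, so the backward loop never grows them
// and the first write is no longer far beyond the current length.
var len = library.length;
var photos = new Array(len);
var captions = new Array(len);
for (var i = len; i--;) {
  var o = library[i];
  photos[i] = o.img;
  captions[i] = o.caption;
}
console.log(photos.length);  // 3
```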

/L
 

Scott Sauyet

Try decrement with a pre-allocated array (although I'm not sure which
browsers apart from Chrome allow using new Array(number) to allocate
a non-sparse array).

Thank you. That is definitely worth checking out, and I will in a day
or two when I'm not swamped with work.

-- Scott
 
