Too many (small) vs. too large linked script files in a document...

Dag Sunde

Just wondering if anyone has looked into this.

How should one split up one's JavaScript library?
A lot of very specific (and small) .js files, or
a few larger files?

I'm thinking about load-time here...

I have a gut feeling that it will be better to use
(a lot of) smaller (very specific) files, giving you
much better granularity...

Any thoughts?
Has anyone benchmarked this?
Am I way too far down in my wine bottle?
Do you care?

;-)
 
RobG

Dag said:
Just wondering if anyone has looked into this.

How should one split up one's JavaScript library?
A lot of very specific (and small) .js files, or
a few larger files?

I'm thinking about load-time here...

I have a gut feeling that it will be better to use
(a lot of) smaller (very specific) files, giving you
much better granularity...

Any thoughts?
Has anyone benchmarked this?
Am I way too far down in my wine bottle?
Do you care?

;-)

This is so easy to test that it probably took longer to write the
question.

Loading 30 script files from the local drive took 468 ms; loading one
file that contained the combined content of the 30 files took 16 ms.

Is *anyone* surprised that reading 30 references, requesting 30 files,
then opening, reading, parsing, and closing each one is slower than
reading one reference and requesting one file... you know the rest.

Anyone who has tried to transfer lots of files over a network knows it
is much, much faster to make one big file, send it, then unpack it at
the other end - ever heard of CPIO or its friend, SCPIO? Is UNIX
really that dead? Or has Linux changed the name to something presumably
more sexy but nonetheless awfully geeky?

Damn, that wine bottle is really on empty, eh? ;-p
 
Grant Wagner

RobG said:
Anyone who has tried to transfer lots of files over a network knows it
is much, much faster to make one big file, send it, then unpack it at
the other end - ever heard of CPIO or its friend, SCPIO? Is UNIX
really that dead? Or has Linux changed the name to something presumably
more sexy but nonetheless awfully geeky?

Anyone who has downloaded content from a Web site knows it's much faster
to let the browser do GET requests on several images at once than it is
to do a GET, wait for an image to download, do another GET, let that
image download, and so on.

If you put all your JavaScript in one large file, it must all download
synchronously, as a single serial request.

By splitting it into several small files, the browser can perform up to
4 GETs (HTTP 1.0) or 2 GETs (HTTP 1.1) simultaneously.

Anyone who has ever FTPed several large files over the Internet will
have also seen this. For example, when I FTP FreeBSD ISOs, I get
approximately 60-80Kb/s on each of the 4 downloads. As downloads end and
I have one left, that ISO downloads at approximately 100Kb/s, not the
240-320Kb/s you might expect.

When it comes to the Internet, one large pipe is not the same as several
smaller ones.
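Grant's FTP observation can be put into a toy throughput model. This is my own illustration, not anything from the thread, with the ceilings guessed from his figures: each connection has a per-stream ceiling, and the link as a whole has a lower total ceiling, so n parallel streams don't deliver n times the single-stream rate.

```javascript
// Toy model (assumed numbers) of why "one large pipe is not the same
// as several smaller ones": throughput is limited both per stream and
// by the total link capacity.
function aggregateThroughput(streams, perStreamCap, linkCap) {
  return Math.min(streams * perStreamCap, linkCap);
}

// Roughly matching Grant's FTP numbers: a lone download tops out near
// 100 KB/s, but 4 downloads together share ~280 KB/s (~70 KB/s each),
// not the 400 KB/s a naive multiplication would suggest.
const one = aggregateThroughput(1, 100, 280);  // 100
const four = aggregateThroughput(4, 100, 280); // 280, i.e. 70 per stream
console.log(one, four, four / 4);
```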
 
Dr John Stockton

JRS: In article <[email protected]>, dated Thu, 16 Dec
2004 16:57:04, Grant Wagner wrote:
By splitting it into several small files, the browser can perform up to
4 GETs (HTTP 1.0) or 2 GETs (HTTP 1.1) simultaneously.

But be aware that that can be very annoying for users with limited
bandwidth who are trying to load a big page in the background while
doing other work, perhaps interactive work.

An author should not presume that his work is all that is of interest to
his readers.
 
RobG

Grant Wagner wrote:
[...]
If you put all your JavaScript in one large file, it must all download
synchronously, as a single serial request.

By splitting it into several small files, the browser can perform up to
4 GETs (HTTP 1.0) or 2 GETs (HTTP 1.1) simultaneously.

Anyone who has ever FTPed several large files over the Internet will
have also seen this. For example, when I FTP FreeBSD ISOs, I get
approximately 60-80Kb/s on each of the 4 downloads. As downloads end and
I have one left, that ISO downloads at approximately 100Kb/s, not the
240-320Kb/s you might expect.

Next time my browser tries to download 4 JavaScript files of 300KB
each, maybe I'll remember your advice.

A JS file of 300 lines is perhaps 11KB. The overhead of doing the GET
is likely greater than the effort of downloading the file itself, so it
is better to download one 11KB file than, say, four of 3KB each.
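That arithmetic can be sketched as a rough cost model. The overhead and bandwidth figures below are assumptions for illustration only, but with any fixed per-request cost the same shape emerges: one 11KB file beats four ~3KB files even though the total payload is about the same.

```javascript
// Rough cost model (assumed numbers, not measured) for the point above:
// every GET pays a fixed round-trip overhead before any bytes arrive,
// so request count dominates for small files.
function fetchTimeMs(files, totalKB, overheadMsPerRequest, kbPerMs) {
  return files * overheadMsPerRequest + totalKB / kbPerMs;
}

// Assume ~100 ms of overhead per request and ~5 KB/ms of bandwidth.
const oneBig = fetchTimeMs(1, 11, 100, 5);    // ~102 ms
const fourSmall = fetchTimeMs(4, 12, 100, 5); // ~402 ms
console.log(oneBig, fourSmall);
```

Parallel connections shrink the gap but don't remove it: even with two simultaneous GETs, the four-file case still pays at least two rounds of overhead.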

You also assume that the browser is only downloading the JS files - it
probably isn't. The other streams are probably downloading other page
content, so why use them for JS when they may be better employed
fetching images or other content?

When it comes to the Internet, one large pipe is not the same as several
smaller ones.

Quite true, but it can't be claimed that lots of small files are
*always* better than one (or a smaller number of) bigger file(s).

I'll bet there are some frustrated network engineers lurking who could
argue this one for weeks on end...
 
