Speeding up URLConnection

mark13.pl

Hello,

I want to save an HTML file as a string. My code looks like this:

URL url = new URL(fileName);
URLConnection conn = url.openConnection();
conn.setRequestProperty("Cookie", myCookieCode);
conn.connect();
BufferedReader dis = new BufferedReader(
        new InputStreamReader(conn.getInputStream()));
String inputLine = null;
for (;;) {
    inputLine = dis.readLine();
    if (inputLine == null) break;
    htmlCode += inputLine;
}

It works perfectly (htmlCode contains the whole page, just as I wanted),
but it has one very big disadvantage: it is VERY slow. In both of my
browsers (Firefox, IE 6.0, cache cleared) the page takes about 2 seconds
to load, while in Java it takes about 14 seconds. Do you know where the
problem is and how I can speed it up?

Regards, mark
 
Oliver Wong

> I want to save an HTML file as a string. My code looks like this:
>
> URL url = new URL(fileName);
> URLConnection conn = url.openConnection();
> conn.setRequestProperty("Cookie", myCookieCode);
> conn.connect();
> BufferedReader dis = new BufferedReader(
>         new InputStreamReader(conn.getInputStream()));
> String inputLine = null;
> for (;;) {
>     inputLine = dis.readLine();
>     if (inputLine == null) break;
>     htmlCode += inputLine;
> }
>
> It works perfectly (htmlCode contains the whole page, just as I wanted),
> but it has one very big disadvantage: it is VERY slow. In both of my
> browsers (Firefox, IE 6.0, cache cleared) the page takes about 2 seconds
> to load, while in Java it takes about 14 seconds. Do you know where the
> problem is and how I can speed it up?

If you have a profiler, you should use it to measure where your code is
actually spending its time. From a quick glance, though, my guess is that
the slowest part is the string concatenation: every += builds a brand-new
String and copies everything accumulated so far into it, so the total work
grows roughly with the square of the page size. Use a StringBuilder
instead:

URL url = new URL(fileName);
URLConnection conn = url.openConnection();
conn.setRequestProperty("Cookie", myCookieCode);
conn.connect();
BufferedReader dis = new BufferedReader(
        new InputStreamReader(conn.getInputStream()));
StringBuilder tempHtmlCode = new StringBuilder();
String inputLine;
while ((inputLine = dis.readLine()) != null) {
    tempHtmlCode.append(inputLine);
}
htmlCode = tempHtmlCode.toString();

- Oliver
 
Chris Uppal

> BufferedReader dis = new BufferedReader(
>         new InputStreamReader(conn.getInputStream()));

In general, the buffering should be immediately around the lowest-level
stream (the one that does the real I/O). There may be circumstances where
you need buffering at the outer levels too, but that's not so common.

Reader reader = new InputStreamReader(
        new BufferedInputStream(conn.getInputStream()));

BTW, /don't/ forget about the character encoding -- in this context it is very
unlikely to be an issue you can safely ignore.
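
Something along these lines is what I mean (just a sketch, untested; it
pulls the charset out of the Content-Type header when the server declares
one, and falls back to ISO-8859-1, the HTTP default, otherwise):

// e.g. "Content-Type: text/html; charset=UTF-8"
String charset = "ISO-8859-1";              // HTTP's default when none is declared
String contentType = conn.getContentType();
if (contentType != null) {
    int idx = contentType.toLowerCase().indexOf("charset=");
    if (idx >= 0) {
        charset = contentType.substring(idx + "charset=".length());
        int semi = charset.indexOf(';');
        if (semi >= 0) {
            charset = charset.substring(0, semi);
        }
        charset = charset.trim();
    }
}

Reader reader = new InputStreamReader(
        new BufferedInputStream(conn.getInputStream()), charset);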

-- chris
 
Babu Kalakrishnan

> I want to save an HTML file as a string. My code looks like this:
>
> [code snipped]
>
> It works perfectly (htmlCode contains the whole page, just as I wanted),
> but it has one very big disadvantage: it is VERY slow. In both of my
> browsers (Firefox, IE 6.0, cache cleared) the page takes about 2 seconds
> to load, while in Java it takes about 14 seconds. Do you know where the
> problem is and how I can speed it up?

While I agree with Oliver that your string concatenation code is pretty
inefficient, I wouldn't think it could cause that kind of delay (unless
the number of lines was very, very large).

One probable cause (a guess, really) is that the HttpURLConnection class
tries to keep the connection open in anticipation of further traffic to
the same URL, and so the reader returning null (indicating the end of the
output) is delayed until the connection is eventually dropped. Browsers,
on the other hand, also keep the connection open, but they look at the
headers to see the length of the expected response and terminate the read
once the expected number of bytes has been received.
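
If you want to test that theory without switching libraries, something
along these lines might do it (only a rough sketch, untested; it assumes
the server actually sends a Content-Length header, and it hard-codes
ISO-8859-1 rather than dealing with the charset question Chris raised):

int length = conn.getContentLength();       // -1 if no Content-Length header
if (length >= 0) {
    InputStream in = new BufferedInputStream(conn.getInputStream());
    byte[] body = new byte[length];
    int read = 0;
    while (read < length) {
        int n = in.read(body, read, length - read);
        if (n < 0) break;                   // server closed the connection early
        read += n;
    }
    htmlCode = new String(body, 0, read, "ISO-8859-1");
}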

I suggest that you try out another client (say, Jakarta's HttpClient) and
see if it helps; there is a rough sketch below. Sun's implementation has
had quite a few issues like this in the past - it was really never meant
to be a full-fledged HTTP client, but rather a simplistic mechanism for
loading Jar files from a remote source.
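
A rough equivalent with Commons HttpClient 3.x might look like this
(untested sketch; it assumes the same fileName and myCookieCode you
already use, and that the commons-httpclient jar is on your classpath):

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;

// ... inside the method that currently uses URLConnection:
HttpClient client = new HttpClient();
GetMethod get = new GetMethod(fileName);
get.setRequestHeader("Cookie", myCookieCode);
try {
    int status = client.executeMethod(get);      // sends the request
    if (status == 200) {
        // reads the whole body, respecting Content-Length / chunking
        htmlCode = get.getResponseBodyAsString();
    }
} finally {
    get.releaseConnection();                     // hand the connection back
}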

BK
 
mark13.pl

Hello,

Thank you for your answers. I have changed the code to use a StringBuilder
and now everything is several times faster. It is perfectly acceptable
now, so I will stop here without making additional experiments with other
HTTP clients. Thank you very much!

Regards, mark
 
