johnhurt
Hi,
Is there any way to speed up reading the contents of a URL from the
web? I tried doubling and tripling the buffer size of 8192, but that
doesn't seem to increase the speed much. It's only doing about 10 KB/sec, I think.
URL url = new URL(retrieveUrl);
URLConnection connection = url.openConnection();
BufferedInputStream in = new BufferedInputStream(connection.getInputStream());
byte[] buffer = new byte[8192];
StringBuffer strbuf = new StringBuffer(8192);
FileOutputStream destination = new FileOutputStream(new File("file"));
while (true)
{
    int bytes_read = in.read(buffer);
    if (bytes_read == -1)
    {
        break;
    }
    destination.write(buffer, 0, bytes_read);
    strbuf.append(new String(buffer, 0, bytes_read));
}
destination.close();
in.close();
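For what it's worth, a download like this is usually network-bound, so growing the buffer past 8 KB rarely helps; the one avoidable per-iteration cost in the loop is `new String(buffer, 0, bytes_read)`, which copies and decodes every chunk (and can split a multibyte character across two chunks). A sketch of the same loop that accumulates raw bytes and decodes the body once at the end, assuming the response is UTF-8 text (the class name `UrlFetch` and the hard-coded "file" destination are just illustrative):

```java
import java.io.*;
import java.net.*;
import java.nio.charset.StandardCharsets;

public class UrlFetch {
    // Copy the stream to a file, accumulating the raw bytes, and
    // decode the whole body exactly once at the end.
    static String download(InputStream source, File dest) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream(8192);
        try (BufferedInputStream in = new BufferedInputStream(source);
             FileOutputStream destination = new FileOutputStream(dest)) {
            byte[] buffer = new byte[8192];
            int bytes_read;
            while ((bytes_read = in.read(buffer)) != -1) {
                destination.write(buffer, 0, bytes_read);
                // No per-chunk new String(...): just keep the bytes.
                bytes.write(buffer, 0, bytes_read);
            }
        }
        // Single decode; a chunk boundary can no longer split a
        // multibyte UTF-8 sequence.
        return bytes.toString(StandardCharsets.UTF_8.name());
    }

    public static void main(String[] args) throws IOException {
        URLConnection connection = new URL(args[0]).openConnection();
        String contents = download(connection.getInputStream(), new File("file"));
        System.out.println("downloaded " + contents.length() + " chars");
    }
}
```

This won't make the network faster, but it removes the repeated copy/decode work from the loop and fixes the character-boundary issue; if the server supports it, requesting compressed transfer would be the next thing to try.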