java.nio problem

Henry

I have a NIO problem.
I tried to open a socket channel and grab an HTML page with an HTTP GET.
It works if I open and close the socket channel every time I send a
request and read the response.
However, if I keep the socket channel open and send multiple requests,
I get an IOException.
Does anybody know how to solve this problem? Thanks in advance!
Note: I don't want to fire requests in parallel, but one by one.
sample code:
=============================================================
private Charset charset = Charset.forName("ISO-8859-1");
private SocketChannel channel;

try
{
    // do connection
    InetSocketAddress socketAddress =
        new InetSocketAddress("www.mydomain.com", 80);
    channel = SocketChannel.open(socketAddress);

    // send request #1
    channel.write(charset.encode("GET /page1.html HTTP/1.0\r\n\r\n"));
    // read response #1
    ByteBuffer buffer = ByteBuffer.allocate(1024);
    while (channel.read(buffer) != -1)
    {
        buffer.flip();
        System.out.println(charset.decode(buffer));
        buffer.clear();
    }

    // send request #2
    channel.write(charset.encode("GET /page2.html HTTP/1.0\r\n\r\n"));
    // read response #2
    // program threw "java.io.IOException: Read failed" error
    while (channel.read(buffer) != -1)
    {
        buffer.flip();
        System.out.println(charset.decode(buffer));
        buffer.clear();
    }
}
catch (IOException e)
{
    e.printStackTrace();
}
finally
{
    if (channel != null)
    {
        try
        {
            channel.close();
        }
        catch (IOException e) {}
    }
}
=============================================================
 
Dave Schaumann

Henry said:
channel.write(charset.encode("GET /page1.html HTTP/1.0\r\n\r\n") );

Disclaimer: I am not an expert on HTTP, but

-- you should include a "Host: www.mydomain.com" header (modified for
the host of each request, of course).

-- you'll need to check the RFC to be sure, but I think HTTP/1.0
closes the connection by default after each request. Try adding
a "Connection: keep-alive" header to your request.
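A minimal sketch of the request Dave describes, with the Host and Connection headers added (the class and method names are only illustrative; the header names themselves come from the HTTP specification):

```java
// Builds an HTTP/1.0 GET request that asks the server to keep the
// connection open after the response, instead of closing it.
public class RequestBuilder {
    public static String keepAliveGet(String host, String path) {
        return "GET " + path + " HTTP/1.0\r\n"
             + "Host: " + host + "\r\n"
             + "Connection: keep-alive\r\n"
             + "\r\n";           // blank line terminates the headers
    }

    public static void main(String[] args) {
        System.out.print(keepAliveGet("www.mydomain.com", "/page1.html"));
    }
}
```

Note that even if the server honors keep-alive, you can no longer read until end-of-stream to find the end of a response; you would have to use the Content-Length header to know where one response stops and the next begins.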
 
VK

Actually, this is how HTTP is supposed to work: a page with 2 pictures on it,
for example, requires 3 separate requests (one for the page itself and one for
each of the 2 pictures). "Serve request/close connection".
The "keep-alive" option is a server-side setting. If the server doesn't
support it, you cannot force it to. (IMHO)

If you are dealing with your own server, maybe you want to use FTP instead?
 
Robert Olofsson

VK ([email protected]) wrote:
: Actually, this is how HTTP is supposed to work: a page with 2 pictures on
: it, for example, requires 3 separate requests (one for the page itself and
: one for each of the 2 pictures). "Serve request/close connection".
: The "keep-alive" option is a server-side setting.

Not true anymore. HTTP/1.0 closes the connection after each served
request by default. Some clients and servers have keep-alive
implemented.

For HTTP/1.1 (RFC 2616), persistent connections are the default, and a
client that issues an HTTP/1.1 request does not need to send any
keep-alive header (the server already knows that the client supports
it).

Since all compliant HTTP/1.1 devices support the chunked
encoding, nearly all requests can actually be served over one
connection.

/robo
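To see why chunked encoding lets one connection carry many responses: each chunk is prefixed with its size in hex, so the client knows exactly where a body ends without waiting for the server to close the socket. A minimal decoder sketch (ChunkedDecoder is an illustrative name; a real client would also have to handle trailers and validate the CRLFs more strictly):

```java
// Decodes a complete HTTP/1.1 chunked body held in a String.
// Format per chunk: <hex size>CRLF <data>CRLF, ending with a 0-size chunk.
public class ChunkedDecoder {
    public static String decode(String body) {
        StringBuilder out = new StringBuilder();
        int pos = 0;
        while (true) {
            int eol = body.indexOf("\r\n", pos);
            // Chunk size is hexadecimal; drop any chunk extension after ';'
            String sizeLine = body.substring(pos, eol);
            int semi = sizeLine.indexOf(';');
            if (semi >= 0) sizeLine = sizeLine.substring(0, semi);
            int size = Integer.parseInt(sizeLine.trim(), 16);
            pos = eol + 2;                 // skip past the size line's CRLF
            if (size == 0) break;          // 0-size chunk marks the end
            out.append(body, pos, pos + size);
            pos += size + 2;               // skip chunk data and its CRLF
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(decode("5\r\nhello\r\n6\r\n world\r\n0\r\n\r\n"));
    }
}
```

After the final 0-size chunk the client knows the response is complete and can immediately write the next request on the same channel.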
 
