Gordon Beaton
How come I have to specify the size of the input buffer when
receiving UDP DatagramPackets? I basically want to receive as many
bytes as the sent datagram contains.
You have to provide a buffer large enough for the largest expected
packet, and you have to tell the DatagramPacket how much of the buffer
it may fill. I suppose it could use buffer.length itself, but there
may be cases when you don't want to fill the entire buffer.
After receiving a datagram, the length is updated to tell you how many
bytes were actually copied into the buffer.
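For example, a minimal receive loop might look like this (the port
number and message are made up for the illustration; here the sender
and receiver run in the same program over loopback):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpReceive {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket();   // ephemeral local port
             DatagramSocket sender = new DatagramSocket()) {
            byte[] msg = "hello".getBytes("US-ASCII");
            sender.send(new DatagramPacket(msg, msg.length,
                    InetAddress.getLoopbackAddress(), receiver.getLocalPort()));

            // buffer must be large enough for the largest expected packet
            byte[] buf = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            receiver.receive(packet);                  // blocks until a datagram arrives

            // getLength() now reports how many bytes were actually copied
            System.out.println(packet.getLength() + " bytes: "
                    + new String(packet.getData(), 0, packet.getLength(), "US-ASCII"));
        }
    }
}
```

Note that you must use getLength() when converting the buffer to a
String; the rest of the buffer beyond that point is garbage.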
It is extremely unfortunate that the same "length" is used for both
purposes, since it is far from obvious that you need to reset the
value (with setLength()) before reusing the same DatagramPacket to
receive additional packets (otherwise you will receive successively
smaller and smaller packets, regardless of what is actually sent).
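A sketch of the pitfall and the fix, again over loopback with made-up
messages (the first datagram is shorter than the second, so without
the reset the second receive would be truncated to 2 bytes):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpReuse {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket();
             DatagramSocket sender = new DatagramSocket()) {
            InetAddress lo = InetAddress.getLoopbackAddress();
            int port = receiver.getLocalPort();

            byte[] a = "hi".getBytes("US-ASCII");               // 2 bytes
            byte[] b = "a longer message".getBytes("US-ASCII"); // 16 bytes
            sender.send(new DatagramPacket(a, a.length, lo, port));
            sender.send(new DatagramPacket(b, b.length, lo, port));

            byte[] buf = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);

            receiver.receive(packet);                // length is now 2
            System.out.println(packet.getLength());

            packet.setLength(buf.length);            // reset before reusing the packet!
            receiver.receive(packet);                // without the reset, at most 2 bytes
                                                     // of the 16-byte datagram would be copied
            System.out.println(packet.getLength());
        }
    }
}
```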
/gordon