Daisy
I'd like to hear others' experience with object serialization at
high input rates. I implemented object serialization between a server
and a client. It works great, but I'm concerned that serialization
consumes a lot of CPU at high rates. While we load test, I'm thinking
through alternatives. I see two:
1. override the default serialization of Serializable (custom writeObject/readObject)
2. implement Externalizable
Either way, we're going to rewrite some code.
BACKGROUND
The server sends small objects to the clients. The client sends almost
nothing to the server. In the near future the server may transmit 3000
to 5000 small objects per second. I see two potential problems
currently:
A. default serialization is expensive
B. transmission of reference objects along with the small objects
What's a small object? It has 3 to 5 long and String fields.
class SmallObjectWithEventInfo extends AbstractEvent implements Serializable {
    long time = 0;
    String label = "label";
    long eventId = 0;
    long measurement = 0;
    String anotherString = "another string";
    ContextInfo reference = null;

    SmallObjectWithEventInfo(long time, long eventId, long measurement,
                             ContextInfo context) {
        this.time = time;
        this.eventId = eventId;
        this.measurement = measurement;
        this.anotherString = "another string";
        this.reference = context;
        this.label = "Event type " + eventId + " for " + context;
    }
}
It also has some references to other common objects. In other words,
the small object contains information about a specific event, while the
common object describes the context of that event. There are several
contexts, but they are mostly static. These references are handled
transparently for us by Serializable. Since we only transmit 3 or 4
object types, we can easily override their default serialization.
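For option 1, here's a rough sketch of what I have in mind: keep Serializable, but replace the default reflective field walk with hand-written writeObject/readObject. (Class and field names below are invented for illustration, not our actual types.)

```java
import java.io.*;

// Sketch: Serializable with custom writeObject/readObject, so the
// primitives are written directly instead of via defaultWriteObject().
class Event implements Serializable {
    private static final long serialVersionUID = 1L;

    long time;
    long eventId;
    long measurement;
    String label;

    Event(long time, long eventId, long measurement) {
        this.time = time;
        this.eventId = eventId;
        this.measurement = measurement;
        this.label = "Event type " + eventId;
    }

    // Write each field directly; the default field-by-field walk is skipped.
    private void writeObject(ObjectOutputStream out) throws IOException {
        out.writeLong(time);
        out.writeLong(eventId);
        out.writeLong(measurement);
        out.writeUTF(label);
    }

    // Read the fields back in the same order they were written.
    private void readObject(ObjectInputStream in) throws IOException {
        time = in.readLong();
        eventId = in.readLong();
        measurement = in.readLong();
        label = in.readUTF();
    }
}

public class OverrideDemo {
    // Serialize an event to a byte array and read it back.
    static Event roundTrip(Event e) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(buf)) {
            oos.writeObject(e);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(buf.toByteArray()))) {
            return (Event) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Event copy = roundTrip(new Event(42L, 7L, 99L));
        System.out.println(copy.eventId + "/" + copy.measurement);  // 7/99
    }
}
```

The write and read order must match exactly, which is the main maintenance cost of this approach.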
To implement Externalizable, we would have to send the reference objects
separately, then use some unique identifier/lookup (such as a hash code)
to re-associate the reference objects at the client. We already have a
binary protocol in another part of our system, so we could easily reuse
that in Externalizable. However, my sense is that there are some
hidden gotchas.
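For option 2, here's roughly what I imagine it would look like: the event carries only a small context ID on the wire, and the client re-associates the context from a table it populated earlier. (All names below are invented for illustration.)

```java
import java.io.*;
import java.util.HashMap;
import java.util.Map;

// The mostly-static context object; sent to the client once, out of band.
class ContextInfo {
    final int id;
    final String description;
    ContextInfo(int id, String description) {
        this.id = id;
        this.description = description;
    }
}

// Client-side lookup table: contexts are registered once, then events
// reference them by ID instead of serializing them repeatedly.
class ContextRegistry {
    private static final Map<Integer, ContextInfo> CONTEXTS = new HashMap<>();
    static void register(ContextInfo c) { CONTEXTS.put(c.id, c); }
    static ContextInfo lookup(int id) { return CONTEXTS.get(id); }
}

class ExternalizableEvent implements Externalizable {
    long time;
    long eventId;
    ContextInfo context;

    public ExternalizableEvent() { }  // Externalizable needs a no-arg ctor

    ExternalizableEvent(long time, long eventId, ContextInfo context) {
        this.time = time;
        this.eventId = eventId;
        this.context = context;
    }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeLong(time);
        out.writeLong(eventId);
        out.writeInt(context.id);  // only the ID crosses the wire
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException {
        time = in.readLong();
        eventId = in.readLong();
        context = ContextRegistry.lookup(in.readInt());  // re-associate
    }
}

public class ExternalizableDemo {
    static ExternalizableEvent roundTrip(ExternalizableEvent e) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(buf)) {
            oos.writeObject(e);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(buf.toByteArray()))) {
            return (ExternalizableEvent) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        ContextInfo ctx = new ContextInfo(1, "assembly line A");
        ContextRegistry.register(ctx);  // client learned this context earlier
        ExternalizableEvent copy =
                roundTrip(new ExternalizableEvent(42L, 7L, ctx));
        System.out.println(copy.context.description);
    }
}
```

One gotcha I can already see in this sketch: readExternal silently gets null if an event arrives before its context has been registered, so the client would need some guarantee about ordering.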
QUESTIONS
I'm leaning towards starting with overriding default serialization. If
that doesn't get me enough performance, then I could go for the
reference lookup.
1. Is Java's default Serializable really CPU-intensive/slow? Has anyone
used it at high rates?
2. If you wanted to send 3000-5000 small objects per second, would you
override Serializable, use Externalizable, or something else?
3. Does inheritance complicate serialization?
Thanks