W3C's "Binary XML" - dropped?

Ivan_G_S

Hello everyone!

The W3C did excellent work, publishing documents on the possible
properties and use cases of "binary XML" ( http://www.w3.org/XML/Binary/
). That was in 2005.
What is the current state of this project? Was it dropped?

I would really like to see such a thing as binary XML. It would make a
lot of things more interoperable and easier.

Quotes from the documents that made me assume the project was dropped:

http://www.w3.org/XML/Binary/ : "The XML Binary Characterization
Working Group is closing on April 1st, 2005"

http://www.w3.org/TR/xbc-characterization/ : "The XML Binary
Characterization Working Group has ended its work. This document is
not expected to become a Recommendation later. It will be maintained
as a WG Note."
 
Richard Tobin

Ivan_G_S said:
The W3C did excellent work, publishing documents on the possible
properties and use cases of "binary XML" ( http://www.w3.org/XML/Binary/
). That was in 2005.
What is the current state of this project? Was it dropped?

See www.w3.org/TR/exi
I would really like to see such thing as binary XML. It would make a
lot of things more interoperable and easier.

I suspect it will make things a lot less interoperable!

-- Richard
 
Guest

What is the current state of this project? Was it dropped?

Dropped.

XML's strengths are interoperability and toolability. A binary format
would break both, for negligible gains -- a good XML parser can read
XML at nearly the same speed that one could deserialize from a binary
representation.

If you're worried about verbosity, XML compresses wonderfully when put
through any of the standard adaptive algorithms (zip and its cousins),
and decompressing those doesn't add much overhead.
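For illustration, here's a minimal Python sketch of that point, using the standard library's gzip (a DEFLATE-family adaptive compressor) on some made-up, repetitive sample XML:

```python
import gzip

# Hypothetical sample data: a small, repetitive XML document of the kind
# that adaptive compressors handle very well.
xml = ("<orders>"
       + "".join(f"<order id='{i}'><qty>5</qty></order>" for i in range(200))
       + "</orders>").encode("utf-8")

compressed = gzip.compress(xml)

# Repetitive markup typically shrinks to a small fraction of its size.
print(f"{len(xml)} bytes -> {len(compressed)} bytes "
      f"({len(compressed) / len(xml):.0%} of original)")
```

The exact ratio depends on the data, but tag-heavy, repetitive documents routinely compress by an order of magnitude.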

While there are certainly things a binary format would be better for,
those tend to be a huge mixed bag of goals, many of which directly
conflict in the requirements they would impose on the data format.

If you want a binary format, by all means invent one that suits your
needs, and import/export it to XML. But don't expect it to be
interoperable. Remember, the whole reason XML took off was that binary
formats WEREN'T easily interoperable, even those explicitly intended
to be so.
 
Ivan_G_S

What is the current state of this project? Was it dropped?

Richard Tobin pointed out this link, which (as far as I understand) is
the future specification of such a binary XML format:
http://www.w3.org/TR/exi/ , called EXI.
XML's strengths are interoperability and toolability. A binary format
would break both, for negligible gains -- a good XML parser can read
XML at nearly the same speed that one could deserialize from a binary
representation.

The performance of good parsers is impressive, indeed. Well, it's
better than one would expect. (I've done a few benchmarks with huge
amounts of data.)
While there are certainly things a binary format would be better for,
that tends to be a huge mixed bag of goals, many of which directly
conflict in the requirements they would want to impose on the data
format.

Then, it would be good to have the "choice" between XML and EXI.
Well, it's even better than a "choice": both formats will be
interconvertible.
If you want a binary format, by all means invent one that suits your
needs, and import/export it to XML. But don't expect it to be
interoperable. Remember, the whole reason XML took off was that binary
formats WEREN'T easily interoperable, even those explicitly intended
to be so.

Inventing your own format is something XML has freed us from.
Binary XML (read: EXI) would do the same for binary formats.
Inventing, implementing, and testing your own format costs time and
requires a lot of knowledge.

Consider what XML is already used for: even such things as databases
(as in Apache Xindice), and even stranger examples exist (nothing
against Apache Xindice). Time to have a binary counterpart ;)

Have a look at the EXI proposal; very interesting aspects are
mentioned there, and many properties are considered. What do you think
about it?
 
Pavel Lepin

Richard Tobin said:

The first WDs were published just two days ago. I've watched the EXI
WG's activities for a few months now with mild interest, and even
considered asking the group for comments upon seeing the drafts, but
then thought better of it, since it's likely too early to discuss how
it affects us humble application developers, which is all I'm really
interested in.
I suspect it will make things a lot less interoperable!

That's all too true, but I believe EXI is still a good thing. Of
course, it's by no means a silver bullet among binary formats.
Fixed-length records and direct marshalling/unmarshalling of PODs are
altogether too convenient sometimes, but a compact, standard,
well-defined binary representation of the XML Infoset just can't be
wrong.
 
mathieu

Dropped.

XML's strengths are interoperability and toolability. A binary format
would break both, for negligible gains -- a good XML parser can read
XML at nearly the same speed that one could deserialize from a binary
representation.

Correction: 'can read' -> 'can parse'. I am dealing with huge binary
blobs (up to 4 GB), and there is no way I am turning those into any
kind of text encoding. So no, XML is not suited to such cases;
serialization/deserialization is much faster in binary, since I can
simply fread into a memory buffer. It is a lot less interoperable, I
agree.

2 cents
-Mathieu
 
edday2006


It has not been dropped. It is now Efficient XML Interchange (EXI),
and the group just published three new documents in the past week.
XML's strengths are interoperability and toolability. A binary format
would break both, for negligible gains -- a good XML parser can read
XML at nearly the same speed that one could deserialize from a binary
representation.

I think if you bothered to read the group's test report, you would
find you are wrong. A binary format can be parsed *much* faster than
XML. And that is not even taking into account the networking gains
from pushing smaller compressed packets across the network.
If you're worried about verbosity, XML compresses wonderfully when put
through any of the standard adaptive algorithms (zip and its cousins),
and decompressing those doesn't add much overhead.

Do you have any evidence to support your assertions? Test data shows
that there are many situations where standard compression does not
work very well.
 
Guest

It has not been dropped. It is now Efficient XML Interchange (EXI)
and the group just published three new documents in the past week.

I'll admit error on that one. But it surprises me, given what I was
hearing from IBM's participant in the study that preceded the EXI
group, and given IBM's apparent decision not to stay involved in the
EXI effort.

I'll have to find time to read through this and crosscheck it. I
remain extremely skeptical, but I don't blame you for not taking my
word for it.
Do you have any evidence to support your assertions?

Experience over the past nine years. IBM has reviewed multiple "binary
XML" proposals internally. So far, the improvements achieved have not
matched their authors' original expectations, and have not been strong
enough to overcome the downside of making a breaking change to XML.

I grant that certain kinds of data compress poorly compared to
re-expressing them in binary. As far as I know, in most applications
of XML that's a minority of the actual information. I'll have to look
at their report in more detail to see why they're concluding otherwise.

I'm not opposed to the proposal of a serialized binary representation
of the XML infoset, if it really can be proven to have significant
value. Certainly there's legitimate use for binary representations of
the XML infoset within applications. But part of XML's strength as an
interchange format is its toolability, and some of that is
legitimately linked to its being a textual representation. The concept
of the "desperate Perl hacker" remains valid -- the fact that you can,
if necessary, visually debug and textually manipulate XML really does
have significant value.

As I say, I'll make time to review this proposal; thanks for calling
it to my attention. If it turns out that they've got something really
viable and valuable, more power to 'em. They may have found an angle
that actually works; new invention in this area *is* still going on,
after all.

For me, this is definitely in the "that trick never works!" category.
But who knows; maybe they'll find the right hat this time.

Your mileage will vary.
 
schneider.agiledelta

Hi Ivan,

I'm one of the editors of the emerging W3C binary XML standard and
have to report that our demise has been greatly exaggerated. ;-)

I can understand the confusion, though, because the W3C's work in
this area has been split across two groups. The first group (XBC) was
responsible for looking at use cases, defining requirements, and
ultimately deciding whether the W3C should create the standard. The
second group is responsible for selecting a binary XML format to start
with (there were several) and creating a standard based on it. The
information tying all the events together is spread all over the
place, so we've created a very concise run-down of the relevant
events, dates and links at http://www.agiledelta.com/technology_binaryxml.html
(admittedly, focused on our contributions).

The EXI WG just released the second public draft of the EXI standard,
along with a primer that describes EXI for mere mortals and a best
practices document that describes, among other things, ways to deploy
EXI without disrupting interoperability with systems that don't yet
implement it.

If you want to try it out and provide feedback, you can download free
evaluations of EXI implementations for most popular server, desktop
and mobile platforms from http://www.agiledelta.com/efx_download.html.
My company developed the Efficient XML technology on which the EXI
standard is based and has been creating Efficient XML products for
many years, including web-service plug-ins, HTTP proxies, SDKs,
runtimes and more to come :). The current release implements
everything in the latest draft EXI standard and some features
scheduled for the next draft (e.g., strict mode). And the HTTP
products follow the W3C best-practices, implementing standard HTTP
content negotiation to maintain seamless compatibility with systems
that are not yet using EXI.

Hope this helps!,

John
 
schneider.agiledelta

I suspect it will make things a lot less interoperable!

This is a very natural and legitimate concern; however, we've done
quite a lot of work to significantly reduce this risk. For example, we
use standard HTTP content negotiation to maintain seamless
interoperability with clients and servers that don't yet support EXI.
With content negotiation, clients and servers that don't yet support
EXI never receive it and continue to operate as normal (albeit more
slowly, and using more bandwidth than the EXI clients/servers). Also,
when debugging you can selectively turn EXI on/off at various clients/
servers, so text XML becomes a very nice, well-understood "debugging
mode" for EXI, one supported by a huge array of tools.

In addition, EXI was designed to work with all the existing XML
technologies, tools and products. It plugs in at the lowest layers of
the application, beneath the standard DOM, SAX, StAX, etc. APIs. So,
applications that operate on DOM trees, SAX events, StAX streams, etc.
(and most do) are insulated from EXI and get the same view of XML
they've always had. Of course, if those applications present a parsed
view of the XML to the user (e.g., a collapsible tree view in a
browser or IDE), the user will see and edit exactly the same text XML
view of EXI that they get with text XML. For applications that must
operate on XML text streams, our EXI products also offer transcoding
APIs that emit/create standard text XML streams, and HTTP proxies that
automatically detect and transcode EXI to XML for applications that
don't yet support EXI.
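The insulation argument can be illustrated with a small sketch: application code written against SAX events never touches the wire format, so a decoding layer beneath it can be swapped without changing the application. The `decode_to_text_xml` function here is a hypothetical stand-in, not any vendor's API:

```python
import xml.sax

# Application code sees only SAX events, regardless of wire format.
class OrderCounter(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.count = 0

    def startElement(self, name, attrs):
        if name == "order":
            self.count += 1

# HYPOTHETICAL stand-in for a transcoding layer: an EXI stack would
# either emit SAX events directly or produce text XML like this.
def decode_to_text_xml(payload: bytes) -> bytes:
    return payload  # here the payload is already text XML

handler = OrderCounter()
xml.sax.parseString(
    decode_to_text_xml(b"<orders><order/><order/></orders>"), handler)
print(handler.count)  # -> 2
```

Swapping the stand-in for a real binary decoder would leave `OrderCounter` untouched, which is exactly the layering claim being made.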

Interestingly, one of the primary objectives of EXI is actually to
*increase* interoperability. In particular, there are many systems out
there that find it impractical or impossible to use XML because they
are performance sensitive (e.g., stock-trading systems), have limited
resources (bandwidth, CPU, power-battery, storage, etc.) or they are
cost-sensitive and the additional resources would price them out of
their market (e.g., mass-market mobile devices, set-top devices, ...).
The objective is to expand the XML community to include these systems
so they can interoperate with the existing XML community and start
using all the great XML standards, technologies, communities, tools,
etc.

BTW: Efficient XML is very different from previous binary XML
technologies in that it approaches the theoretical optimum encoding,
so it achieves and sometimes surpasses the efficiency of hand-optimized
binary message formats. As such, EXI is efficient enough for the most
demanding applications and enables almost any application to take
advantage of XML technologies. Take a look at http://www.agiledelta.com/efx_perffeatures.html
for some impressive compactness and parsing performance examples. The
parsing speeds here are compared to Sun's JAXP parser in JDK 1.5. The
fastest XML parsers are about 2-3 times faster than JAXP, but nowhere
close to as fast as Efficient XML.

Hope this helps!,

John
 
