Joshua said:
> Shortening of names, attribute names/values, etc.: <o b="d" />. I
> have also seen people dump binary blobs into XML as base64 instead
> of restructuring them for XML.
> It strikes me that such things do nothing (or near enough) to
> optimize XML processing.
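Both shapes are easy to demonstrate. A minimal sketch in Python (the `point` record and its fields are invented for illustration):

```python
import base64
import struct
import xml.etree.ElementTree as ET

# A hypothetical binary record: two little-endian 32-bit integers.
blob = struct.pack("<ii", 10, 20)

# Blob style: the payload is opaque; XML tools can't see inside it.
opaque = ET.Element("point")
opaque.text = base64.b64encode(blob).decode("ascii")

# Restructured style: every field is addressable by any XML-aware tool
# (XPath, XSLT, an editor, a pair of eyes).
structured = ET.Element("point")
ET.SubElement(structured, "x").text = "10"
ET.SubElement(structured, "y").text = "20"

print(ET.tostring(opaque).decode())      # <point>CgAAABQAAAA=</point>
print(ET.tostring(structured).decode())  # <point><x>10</x><y>20</y></point>
```

The blob round-trips fine, but only code that already knows the packing can do anything with it; the restructured form is what "restructuring them for XML" means.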
This falls into the categories of micro-optimization and sheer
foolishness that pertain to other areas of programming as well. I
wouldn't blame XML for that one: it's a case of sacrificing
human-readability (whether you disparage it or not) for no benefit.
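The "no benefit" claim is easy to check, at least for document size: generic compression squeezes out the redundancy of verbose names anyway. A sketch with invented element names:

```python
import gzip

# A repetitive payload, the kind that tempts people to shorten names.
verbose = ("<readings>"
           + '<reading sensor="kitchen" temperature="21.5"/>' * 200
           + "</readings>")
short = "<r>" + '<a b="kitchen" c="21.5"/>' * 200 + "</r>"

v_gz = len(gzip.compress(verbose.encode()))
s_gz = len(gzip.compress(short.encode()))

print(len(verbose), len(short))  # raw: the short names "win"
print(v_gz, s_gz)                # gzipped: the gap all but disappears

# The readable document, compressed, is far smaller than the
# hand-shortened document sent uncompressed.
print(v_gz < len(short))         # True
```

So if size actually matters, compression on the wire beats mangling the vocabulary, and you keep the readable names.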
> I think human-readability is a poor goal to strive for, since it
> generally misses the point.

Why not, if the cost is negligible? It is extremely convenient and
useful to be able to read through an XML document without tools more
special than an editor.

> What you rather need is a simple, clear structure that is
> well-supported by tools (which, to be fair, XML more or less
> accomplishes).

Yes, it does, without sacrificing human-readability.
> Human-readable is most useful if what you really want is
> human-editable, or human-verifiable, or human-explicable, or
> human-debuggable, ... e.g., configuration files. Otherwise, it is
> merely a proxy for two related goals: the ability to read the
> contents of transfer data,

Nothing to sneeze at.

> or the ability to search for contents in a data stream.

Nothing to sneeze at.

> Neither of these necessarily requires a plain-text format; they just
> require tools that know how the format works.

The range of tools for working with human-readable formats is much
wider.

> Wireshark, for example, is easily able to let you poke around data
> in every major network protocol, most of which is binary.

Right, ask my manager to use Wireshark. I'm a geek, and I have a hard
time reading Wireshark output, even with the fancy tools. Can you
imagine how hard it is for normal people? But if the format is text
and human-readable, then you don't need to mess with Wireshark.
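The "search a data stream" point is easy to make concrete (the record layout here is invented for illustration):

```python
import struct

# The same reading serialized two ways.
binary = struct.pack("<if", 42, 21.5)
text = '<reading id="42" temperature="21.5"/>'

# The text form can be searched with nothing smarter than substring
# matching: an editor, grep, your eyes.
print('temperature="21.5"' in text)        # True

# The binary form can be searched too, but only by a tool (or person)
# that already knows the exact encoding and byte order.
print(struct.pack("<f", 21.5) in binary)   # True
```

Both searches succeed, but the second one presupposes exactly the format-specific tooling under discussion.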
Nothing is necessary. We could program all this with toggle
switches. But XML sure is convenient and labor-saving.