Java’s Future Lies In Mobile?

  • Thread starter Lawrence D'Oliveiro

Arne Vajhøj

Dirk said:
> It's probably a neat idea, but I am not used to xml [sic]

It is a little strange to lay out graphics with XML - it's like drawing
a picture with words.

Adobe MXML, MS XAML, and Mozilla XUL are pretty widely used.

Arne
 

Lawrence D'Oliveiro

> ... I am not used to xml

What have you been using instead?

I have used it as a transfer format and as a configuration file format. I
was even able to train a non-computer-savvy former client to use KXMLEditor
to create templates for driving my code with it. I have written code that
parses SVG, which is an application of XML. I’ll be using XML again in the
project for which I have already posted code snippets elsewhere.

In short, it’s a format of 1001 uses.
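For instance, reading a small configuration file like the ones described above takes only a few lines with the JDK's built-in DOM parser. The `<config>`/`<template>` document and its `name`/`output` attributes below are invented purely for illustration:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class ConfigDemo {
    // Parse a config document and collect the "name" attribute of each
    // <template> element (element and attribute names are hypothetical).
    static List<String> templateNames(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                        xml.getBytes(StandardCharsets.UTF_8)));
        NodeList nodes = doc.getElementsByTagName("template");
        List<String> names = new ArrayList<>();
        for (int i = 0; i < nodes.getLength(); i++) {
            names.add(((Element) nodes.item(i)).getAttribute("name"));
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<config>\n"
                   + "  <template name=\"invoice\" output=\"pdf\"/>\n"
                   + "  <template name=\"report\" output=\"html\"/>\n"
                   + "</config>";
        System.out.println(templateNames(xml));  // [invoice, report]
    }
}
```

Since the parser and the DOM classes ship with the JDK, no third-party libraries are needed.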
 

Dirk Bruere at NeoPax

Dirk said:
> It's probably a neat idea, but I am not used to xml [sic]

> It is a little strange to lay out graphics with XML - it's like drawing
> a picture with words.

But no Java programmer can afford to be unfamiliar with XML. Seems to
be the standard for information exchange. Hopefully the last such.
 

Dirk Bruere at NeoPax

> What have you been using instead?

I have avoided it. I previously used NetBeans and its GUI layout. As
for files, I have exported and imported in XML format without actually
learning the details of it.

> I have used it as a transfer format and as a configuration file format. I
> was even able to train a non-computer-savvy former client to use KXMLEditor
> to create templates for driving my code with it. I have written code that
> parses SVG, which is an application of XML. I’ll be using XML again in the
> project for which I have already posted code snippets elsewhere.
>
> In short, it’s a format of 1001 uses.

So I am discovering.
 

Dirk Bruere at NeoPax

> And almost half of those are appropriate uses too.

> XML is a fine meta-format. But it's less often the right choice than
> is often thought.

At least it's human-readable.
 

Paul Cager

On 11/03/2011 11:00, Leif Roar Moldskred wrote:

> At least it's human-readable

True, and that's a big bonus. But I share Leif's concern: sometimes we
think "we would like a human-readable representation; XML is human
readable; therefore we'll use XML".

Other formats (such as DSLs) are also human-readable (and sometimes
much _more_ readable).
 

Lawrence D'Oliveiro

> Seems to be the standard for information exchange.
> Hopefully the last such.

Another handy convention is constructing more complex document structures as
ZIP archives.

Not sure who was first; Sun seems to have been one of the early ones with JAR
files. Another common one now is the Open Document Format.

The latter adds another useful convention, where the first item in the
archive is called “mimetype” and its contents are always uncompressed. That
way, file-format-sniffing code can pick up the string “mimetype” at offset
30 and the actual MIME type string at offset 38.
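That sniffing trick can be sketched in a few lines of Java. The example below builds a minimal ODF-style archive in memory and then reads the MIME type straight from the raw bytes. It assumes the writer puts no extra field in the local file header (java.util.zip omits it by default), so the stored data really does begin at offset 38:

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class MimetypeSniff {
    // Build a minimal ODF-style archive whose first entry is an
    // uncompressed (STORED) "mimetype" entry. STORED entries must have
    // their size and CRC set before being written.
    static byte[] buildArchive(String mime) throws Exception {
        byte[] data = mime.getBytes(StandardCharsets.US_ASCII);
        ZipEntry entry = new ZipEntry("mimetype");
        entry.setMethod(ZipEntry.STORED);   // uncompressed: readable in place
        entry.setSize(data.length);
        CRC32 crc = new CRC32();
        crc.update(data);
        entry.setCrc(crc.getValue());
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bos)) {
            zip.putNextEntry(entry);
            zip.write(data);
            zip.closeEntry();
        }
        return bos.toByteArray();
    }

    // Sniff the MIME type without a ZIP library: the fixed local file
    // header is 30 bytes, so the entry name sits at offset 30 and, with
    // no extra field, the stored data at offset 38. The data length is
    // the little-endian "compressed size" field at offset 18.
    static String sniff(byte[] zip) {
        String name = new String(zip, 30, 8, StandardCharsets.US_ASCII);
        if (!name.equals("mimetype")) return null;
        int len = (zip[18] & 0xff) | (zip[19] & 0xff) << 8
                | (zip[20] & 0xff) << 16 | (zip[21] & 0xff) << 24;
        return new String(zip, 38, len, StandardCharsets.US_ASCII);
    }

    public static void main(String[] args) throws Exception {
        byte[] odf = buildArchive("application/vnd.oasis.opendocument.text");
        System.out.println(sniff(odf));  // prints the MIME type back
    }
}
```

Real sniffers (such as the Unix `file` command) apply essentially this check to decide whether a ZIP file is an OpenDocument package.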
 

Joshua Cranmer

> At least it's human-readable

It's human-readable in theory. At least, until you want to try to
improve the performance, at which point you get something roughly
resembling a cat sitting on the keyboard. The end result is a
poor-efficiency encoding that is unreadable without passing it through a
dump tool translator.
 

Lew

Leif said:
> A well-designed XML format will reduce those problems, of course, but
> as one of the great draws of XML is that you don't have to spend a lot
> of thought on designing your data-format ... well; it's pretty far between
> well-designed XML formats.

It's also pretty far between well-designed Java programs.

That's why the world needs competent programmers.
 

Lew

Joshua said:
> [XML]'s human-readable in theory. At least, until you want to try to
> improve the performance, at which point you get something roughly
> resembling a cat sitting on the keyboard. The end result is a
> poor-efficiency encoding that is unreadable without passing it through a
> dump tool translator.

I can conceive of no ways to improve "performance", however you
construe the term for XML processing, that would reduce readability of
an XML document. Would you please elaborate on what sorts of
performance adjustments damage XML readability, and what the expected
benefit is for such adjustments?

I can think of one: ZIPping the XML. That reduces readability.

But I figure you meant something more pertinent, and less impertinent.
 

Joshua Cranmer

Joshua said:
> [XML]'s human-readable in theory. At least, until you want to try to
> improve the performance, at which point you get something roughly
> resembling a cat sitting on the keyboard. The end result is a
> poor-efficiency encoding that is unreadable without passing it through a
> dump tool translator.

Lew said:
> I can conceive of no ways to improve "performance", however you
> construe the term for XML processing, that would reduce readability of
> an XML document. Would you please elaborate on what sorts of
> performance adjustments damage XML readability, and what the expected
> benefit is for such adjustments?

Shortening of names, attribute names/values, etc.: <o:p b="d" />. I have
also seen people dump binary blobs into XML before under base64 instead
of restructuring them for XML.

I think human-readability is a poor goal to strive for, since it
generally misses the point. What you rather need is a simple, clear
structure that is well-supported by tools (which, to be fair, XML more
or less accomplishes).

Human-readable is most useful if what you really want is human-editable,
e.g., configuration files. Otherwise, it is merely a proxy for two
related goals: the ability to read the contents of transfer data, or the
ability to search for contents in a data stream. Neither of these
necessarily requires plain text format; they just require tools that
know how the format works. Wireshark, for example, is easily able to let
you poke around data in every major network protocol, most of which is
binary.
 

Lew

Joshua said:
> Shortening of names, attribute names/values, etc.: <o:p b="d" />. I have
> also seen people dump binary blobs into XML before under base64 instead
> of restructuring them for XML.

It strikes me that such things do nothing (or near enough) to optimize
XML processing.

This falls in the categories of micro-optimization and sheer
foolishness that pertain to other areas of programming also. I
wouldn't blame XML for that one. It's a case of sacrificing
human-readability (whether you disparage it or not) for no benefit.

> I think human-readability is a poor goal to strive for, since it
> generally misses the point.

Why not if the cost is negligible?

It is extremely convenient and useful to be able to read through an
XML document without tools more special than an editor.

> What you rather need is a simple, clear structure that is
> well-supported by tools (which, to be fair, XML more or less
> accomplishes).

Yes, it does, without sacrificing human-readability.

> Human-readable is most useful if what you really want is human-editable,

or human-verifiable, or human-explicable, or human-debuggable, ...

> e.g., configuration files. Otherwise, it is merely a proxy for two
> related goals: the ability to read the contents of transfer data,

Nothing to sneeze at.

> or the ability to search for contents in a data stream.

Nothing to sneeze at.

> Neither of these necessarily requires plain text format; they just
> require tools that know how the format works.

The range of tools to work with human-readable formats is much wider.

> Wireshark, for example, is easily able to let you poke around data in
> every major network protocol, most of which is binary.

Right, ask my manager to use Wireshark. I'm a geek, and I have a hard
time reading Wireshark output, even with the fancy tools. Can you
imagine how hard it is for normal people?

But if the format is text, and human-readable, then you don't need to
mess with Wireshark.

Nothing is necessary. We could program all this with toggle
switches. But XML sure is convenient and labor-saving.
 

Lawrence D'Oliveiro

> In short, it’s a format of 1001 uses.

What, nobody asked “is that binary or decimal”?

Geeks aren’t what they used to be...
 

Arne Vajhøj

> It strikes me that such things do nothing (or near enough) to optimize
> XML processing.

In some cases it can have a significant effect.

> This falls in the categories of micro-optimization and sheer
> foolishness that pertain to other areas of programming also. I
> wouldn't blame XML for that one. It's a case of sacrificing
> human-readability (whether you disparage it or not) for no benefit.

Except in those cases where it has an important benefit.

> Why not if the cost is negligible?

You may think it is negligible.

But in some cases it has been measured to be important.

Arne
 

Arne Vajhøj

Joshua said:
> [XML]'s human-readable in theory. At least, until you want to try to
> improve the performance, at which point you get something roughly
> resembling a cat sitting on the keyboard. The end result is a
> poor-efficiency encoding that is unreadable without passing it through a
> dump tool translator.

Lew said:
> I can conceive of no ways to improve "performance", however you
> construe the term for XML processing, that would reduce readability of
> an XML document. Would you please elaborate on what sorts of
> performance adjustments damage XML readability, and what the expected
> benefit is for such adjustments?
>
> I can think of one: ZIPping the XML. That reduces readability.
>
> But I figure you meant something more pertinent, and less impertinent.

Just by:
* use default namespace
* remove indentation
* use short names
* prefer attributes over elements
the readability of an XML document can be somewhat reduced.

And then there are:
* Fast Infoset
* EXI
and other binary XML formats.

Lots of things are being done for performance.
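As a sketch of the first kind of tuning, the two documents below (invented for this example) carry exactly the same information; the compacted form trades nearly all of its readability for a fraction of the bytes:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class CompactXml {
    static Document parse(String xml) throws Exception {
        return DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                        xml.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws Exception {
        // Readable form: long names, child elements, indentation.
        String readable =
            "<customer>\n" +
            "  <name>Ada</name>\n" +
            "  <balance>42</balance>\n" +
            "</customer>";
        // "Optimized" form: short names, attributes, no whitespace.
        String compact = "<c n=\"Ada\" b=\"42\"/>";

        Document r = parse(readable);
        Document c = parse(compact);
        String name1 = r.getElementsByTagName("name").item(0).getTextContent();
        String name2 = c.getDocumentElement().getAttribute("n");
        System.out.println(name1.equals(name2));  // same data either way
        System.out.println(readable.length() + " vs "
                + compact.length() + " bytes");
    }
}
```

Binary encodings such as Fast Infoset and EXI push the same trade further still, to the point where a dump tool is required to read the document at all.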

Arne
 

Arne Vajhøj

Joshua said:
> It's human-readable in theory. At least, until you want to try to
> improve the performance, at which point you get something roughly
> resembling a cat sitting on the keyboard. The end result is a
> poor-efficiency encoding that is unreadable without passing it through a
> dump tool translator.

Just SOAP with some of the additional standards on top can
be a bit unreadable.

Arne
 

Arne Vajhøj

> It _can_ be, but usually it ends up being human _decodeable_ rather than
> readable (and sometimes not even that.)
>
> The trouble is that often all the "fluff" of an XML format will serve to
> obscure the actual contents from the browsing eye. All the information is
> _there_, but it can be a pain and slow going to actually read it and
> quickly scanning a file or comparing two files quickly turns into a
> mental game of whack-the-mole in the text-processing part of your brain.

That is not my experience.

Most XML (the exception typically being design-by-committee formats)
seems rather readable to me.

start element with name for what it is + attributes with
additional information + data + end element

seems pretty logical to me.

It is also very similar to structured control structures
in programming languages.

Arne
 

Arne Vajhøj

> And almost half of those are appropriate uses too.

> XML is a fine meta-format. But it's less often the right choice than
> is often thought.

The support for XML in libraries and developer skill sets
often makes it the best choice even though the data itself
does not require the features of XML.

Arne
 
