I still think, IMHO, that most of the difference is just semantics
Most of the arguments are about semantics, certainly. There are
plenty of folk who use the term "standard" to mean whatever they want
it to mean. There's no "service mark" to guarantee that the term is
used correctly. All along, I've known the term "industry standard" to
mean "whatever the dominant vendor's dirty tricks department can
manage to come up with to prevent effective interworking with products
from their competitors". At different times that dominant vendor has
been a different company, but there was usually a DoJ or equivalent on
hand to set a limit to the relevant vendor's worst excesses.
Meantime, organisations like ISO, and the IETF with its standards-track
interworking specifications, are making rules for avoiding unnecessary
incompatibilities between products, and plenty of honest vendors are
doing their best to adhere to them.
The W3C, on the other hand, isn't a standards-making body as such, but
an industry consortium, funded by its members' subscriptions. It
has some well-intentioned folks in its fold, no mistake about it, but
when push comes to shove, they can't go against the common will of
their influential members. So they politely publish a low-profile
usage note hinting that "some implementations" (unnamed) are failing
to conform to this or that requirement of the specification, instead
of naming and shaming the dominant vendor who is doing it.
Sure, they lost control of "HTML" at a time when the majority of its
users came to believe that "HTML" was defined by whatever the
"Netploder" vendors chose to implement, no matter how (in)appropriate
to the aims of the WWW.
I guess the difference, to me, is: which is most useful? I haven't
delved deeply into XHTML, so I find HTML most useful.
Well, HTML 4.01 is "most useful" in the actual field, for composing
web pages. But there's a wide range of different use profiles of
HTML 4.01, so by saying that, one isn't pinning down the options by
very much.
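For what it's worth, the syntactic gap between the two isn't huge; a minimal sketch of the same trivial page under each grammar (using the DOCTYPE identifiers as published by the W3C) might look like this:

```html
<!-- HTML 4.01 Strict: tag names are case-insensitive,
     and some end tags (e.g. </p>) may be omitted -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">
<HTML>
<HEAD><TITLE>Example</TITLE></HEAD>
<BODY>
<P>A paragraph whose end tag the SGML rules let you omit.
</BODY>
</HTML>

<!-- XHTML 1.0 Strict: XML rules apply, so names are lowercase,
     every element is explicitly closed, and the namespace is declared -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><title>Example</title></head>
<body>
<p>Every element is explicitly closed.</p>
</body>
</html>
```

The practical catch, of course, is less the syntax than how the "Netploder" class of browsers handles (or mishandles) XHTML served as XML, which is a large part of why plain HTML 4.01 remains the pragmatic choice.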