Jargons of Info Tech industry


John Bokma

Rich Teer said:
I know this is entirely inappropriate and OT, but am I the only person
who reads that sentence with a grin? The idea of my wife checking her
email while I'm "doing her" over my computer is most amusing! :)

Aargh :-D.
 

John Bokma

Mike Meyer said:
Neither one is installed by default on the systems in question. Both
are available via the system packaging tools.

fetch is installed on FreeBSD, but all it does is download the
contents of a URL - it doesn't render them.

My brain does :-D
 

John Bokma

Mike Meyer said:
By the time nn was out, there were a number of radically different
alternatives. The original news client (not NNTP - it predated that)
was readnews. rn was the first alternative to gain any popularity. By
the time it came out, there were alternative curses-based readers like
notes and vnews. By the time nn came out, there were even X-based news
readers available like xrn and xvnews.

I recall something like pine? (or was that mail, and was there something
pine related for usenet?)
It may be that the site you were at only offered a few readers.

Probably more correct: I now and then used telnet to connect, so uhm.. no
X. And more importantly, I didn't look further :) But that's a different
issue.

All of this is from memory, of course - and may well be wrong.

Those were the days, thanks.
 

Andrew Thompson

....
...The idea of my wife checking her
email while I'm "doing her" over my computer is most amusing! :)

It does raise the question though. Is it mundane sex,
...or riveting email, that causes this phenomenon? ;-)

[ F'Ups set to c.l.j.p. only ]
 

Ulrich Hobelmann

Mike said:
Definitely with Ajax. That's one of the things it does really well.

But then you're probably limited to the big 4 of browsers: MSIE,
Mozilla, KHTML/Safari, Opera. Ok, that should cover most desktop users,
but you might run into problems on embedded.

I've also noticed that especially web forums and dynamic websites take
up looots of memory on my machine (but then I have loooots).
Because you can - if you know how to use HTML properly - distribute
your application to platforms you've never even heard of - like the
Nokia Communicator.

If the NC has software that can properly interpret all that HTML, CSS,
JavaScript plus image formats, yes. But who guarantees that? I'd
rather develop a native client for the machine that people actually WANT
to use, instead of forcing them to use that little-fiddly web browser on
a teeny tiny display.

And again: connections might be slow, a compact protocol is better than
loading the whole UI every time. And while Ajax might work, despite the
UI being maybe too big for the little browser window, and even if it
works, it's still probably more work than a simple, native UI. First of
all it needs to load all the JS on first load, secondly sometimes for a
flexible UI you'd have to replace huge parts of the page with something
else. Native UIs are more up to the task.
I started writing web apps when I was doing internal tools development
for a software development company that had 90+ different platform
types installed inhouse. It was a *godsend*. By deploying one

If that's 90+ GUI platforms, then I agree. I just wonder who wrote
fully standards compliant web browsers for those 90 platforms. If you
have one Windows GUI (maybe C#), one Mac GUI (Cocoa), one Gtk GUI for X,
you're done. A GUI should be the smallest bunch of work on any given
application, so it's not prohibitive to write a couple of them, IMHO.
But then I've only ever used Swing and Cocoa and the latter *is* really
convenient, might be that the others are a PITA, who knows...
well-written app, I could make everyone happy, without having to do
versions for the Mac, Windows, DOS (this was a while ago), getting it
to compile on umpteen different Unix version, as well as making it
work on proprietary workstation OS's.

Well, stick to POSIX and X APIs and your stuff should run fine on pretty
much all Unices. I never understood those people who write all kinds of
weird ifdefs to run on all Unices. Maybe that was before my time,
during the Unix wars, before POSIX. And if it's not Unix, what's a
prop. workstation OS?
Of course, considering the state of most of the HTML on the web, I
have *no* idea why most of them are doing this.

Yep. Maybe it would be best to reengineer the whole thing as ONE UI
spec+action language, incompatible with the current mess, compact, so it
can be implemented with minimum fuss. And most of all, I wouldn't use a
MARKUP language, as a real application is not text-based (at least not
as characteristic #1).
 

Ulrich Hobelmann

Mike said:
Try turning off JavaScript (I assume you don't because you didn't
complain about it). Most of the sites on the web that use it don't
even use the NOSCRIPT tag to notify you that you have to turn the
things on - much less use it to do something useful.

I had JS off for a long time, but now so many websites expect it, and
even make browsing more convenient, that I grudgingly accepted it. ;)
Sturgeon's law applies to web sites, just like it does to everything
else.

Yep. Filtering is the future in the overloaded world.
 

axel

And workplaces. Some people have more than one computer in the house. My
partner can check her email when I had her over the computer. When I
want to check my email when she is using it, I have to change the
session, fire up Thunderbird (which eats away 20M), and change the
session back.

Not a Windows solution, but I find the 'screen' utility invaluable as
I can have my email, news, and an editor open in different screens
and then when I need to move to a different machine, I can simply
detach and reattach screen without disturbing anything that
might be running.

Axel
 

Mike Meyer

Ulrich Hobelmann said:
But then you're probably limited to the big 4 of browsers: MSIE,
Mozilla, KHTML/Safari, Opera. Ok, that should cover most desktop
users, but you might run into problems on embedded.

True - using Ajax definitely defeats what I consider to be the best
feature of the web.
If the NC has software that can properly interpret all that HTML, CSS,
JavaScript plus image formats, yes. But who guarantees that?

You don't need that guarantee. All you need is a reasonable HTML
renderer. The folks at W3C are smart, and did a good job of designing
the technologies so they degrade gracefully. Anyone with any
competence can design web pages that will both take advantage of
advanced technologies if they are present and still work properly if
they aren't. Yeah, the low-end interface harks back to 3270s, but IBM
had a *great* deal of success with that technology.
I'd rather develop a native client for the machine that people
actually WANT to use, instead of forcing them to use that
little-fiddly web browser on a teeny tiny display.

You missed the point: How are you going to provide native clients for
platforms you've never heard of?
And again: connections might be slow, a compact protocol is better
than loading the whole UI every time. And while Ajax might work,
despite the UI being maybe too big for the little browser window, and
even if it works, it's still probably more work than a simple, native
UI. First of all it needs to load all the JS on first load, secondly
sometimes for a flexible UI you'd have to replace huge parts of the
page with something else. Native UIs are more up to the task.

I'm not arguing that native UI's aren't better. I'm arguing that web
applications provide more portability - which is important for some
applications and some developers.
If that's 90+ GUI platforms, then I agree.

Why do you care if they are GUI or not? If you need to provide the
application for them, you need to provide the application for
them. Them not being GUI just means you can't try and use a standard
GUI library. It also means you have to know what you're doing when you
write HTML so that it works properly in a CLUI. But your native app
would have to have a CLUI anyway.
I just wonder who wrote fully standards compliant web browsers for
those 90 platforms.

Nobody. I doubt there's a fully standards compliant web browser
available for *any* platform, much less any non-trivial collection of
them. You write portable web applications to the standards, and design
them to degrade gracefully. Then you go back and work around any new
bugs you've uncovered in the most popular browsers - which
historically are among the *worst* at following standards.
If you have one Windows GUI (maybe C#), one Mac GUI (Cocoa), one Gtk
GUI for X, you're done.

You think you're done. A lot of developers think you can stop with the
first one or two. You're all right for some applications. For others,
you're not. Personally, I like applications that run on all the
platforms I use - and your set doesn't cover all three of those
systems.
Well, stick to POSIX and X APIs and your stuff should run fine on
pretty much all Unices.

You know, the same kind of advice applies to writing portable web
apps. Except when you do it with HTML, "portability" means damn near
any programmable device with a network interface, not some relatively
small fraction of all deployed platforms.
I never understood those people who write all kinds of weird ifdefs
to run on all Unices. Maybe that was before my time, during the
Unix wars, before POSIX.

There were standards before POSIX. They didn't cover everything people
wanted to do, or didn't do them as fast as the OS vendor wanted. So
Unix vendors added their own proprietary extensions, which software
vendors had to use to get the best performance out of their
applications, which they had to do if they wanted people to buy/use
them.

That's still going on - people are adding new functionality that isn't
covered by POSIX to Unix systems all the time, or they are adding
alternatives that are better/faster than the POSIX version, and there
are lots of things that applications want to do that simply aren't
covered by POSIX. And not all implementations are created equal. Some
platforms' mallocs provide - to be polite - less than optimal
performance under conditions real applications encounter, so those
applications conditionally use different malloc implementations. The
same thing applies to threads, except such code typically includes a
third option of not using threads at all. And so on.
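The thread has C in mind, but the conditional-threading pattern Mike describes can be sketched in a few lines of Python, purely as an illustration (the `run_tasks` helper is hypothetical, not from any real project):

```python
# Sketch of the "third option" pattern: use threads where the platform
# provides them, fall back to sequential execution where it doesn't.
try:
    import threading
    HAVE_THREADS = True
except ImportError:  # e.g. an interpreter built without thread support
    HAVE_THREADS = False

def run_tasks(tasks):
    """Run a list of zero-argument callables, threaded if possible."""
    results = [None] * len(tasks)
    if HAVE_THREADS:
        def worker(i, task):
            results[i] = task()
        workers = [threading.Thread(target=worker, args=(i, t))
                   for i, t in enumerate(tasks)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
    else:
        for i, task in enumerate(tasks):
            results[i] = task()
    return results
```

C code of the era did the same thing with #ifdef blocks around the two code paths.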

And we haven't even started talking about the build process...

Basically, deciding to write to POSIX is a decision to trade away
performance on/to some platforms for portability to more
platforms. It's the same decision as deciding to write a web app,
except the tradeoffs are different. Each of the three solutions has a
different set of costs and benefits, and the correct choice will
depend on your application.
And if it's not Unix, what's a prop. workstation OS?

They've mostly died out since then. At the time, there were things
like Domain and VMS.
Yep. Maybe it would be best to reengineer the whole thing as ONE UI
spec+action language, incompatible with the current mess, compact, so
it can be implemented with minimum fuss. And most of all, I wouldn't
use a MARKUP language, as a real application is not text-based (at
least not as characteristic #1).

You mean most of the applications I run aren't real applications?
Right now, my desktop has exactly two GUI applications open on it - a
mixer and gkrellm. Everything else is character based. Hell, even my
window manager is character based.

I think you're right - a web standard designed for writing real
applications probably wouldn't start life as a markup for text. The
only thing I can think of that even tries is Flash, but it's
proprietary so I don't know much about it.

Care to tell me how you would design such a format if the goal were to
*not* lose any portability - which means it has to be possible to
design interfaces that work properly on character devices, things like
Palm's three-color greyscale displays, and devices without pointers or
without keyboards, or even in an audio-only environment.

<mike
 

Rich Teer

I think you're right - a web standard designed for writing real
applications probably wouldn't start life as a markup for text. The
only thing I can think of that even tries is Flash, but it's

What about Java?

--
Rich Teer, SCNA, SCSA, OpenSolaris CAB member

President,
Rite Online Inc.

Voice: +1 (250) 979-1638
URL: http://www.rite-group.com/rich
 

Chris Head


John said:
It can be made much faster. There will always be a delay since
messages have to be downloaded, but with a fast connection and a good
design, the delay will be very very small and the advantages are big.

What advantages would those be (other than access from 'net cafes, but
see below)?


And workplaces. Some people have more than one computer in the house. My
partner can check her email when I had her over the computer. When I
want to check my email when she is using it, I have to change the
session, fire up Thunderbird (which eats away 20M), and change the
session back.

[ .. ]

Hmm. That would just be a matter of preference. Personally I moved my
Thunderbird profile into a shared directory and pointed everyone at it.
Now only one login session can run Thunderbird at a time, but any login
can see everyone's mailboxes.
John said:
Most people who use Thunderbird, yes. Different with OE, I am sure. With
a thin client *everybody*.

True. As a programmer I don't usually think about the people who never
download updates. The way I look at it, if somebody doesn't have the
latest version, they shouldn't be complaining about a bug. I guess thin
clients could be taken to mean you have a very light-weight auto-update
system ;)
John said:
But technically the UI (whether bloated or not) can be cached, and with
Ajax/Frames, etc. there is not really a need to refresh the entire page.
With smarter techniques (like automatically zipping pages), and
techniques like transmitting only deltas (Google experimented with this
some time ago) and better and faster rendering, the UI could be as fast
as a normal UI.

Isn't the UI in Thunderbird and Firefox created using JavaScript and
XML? Isn't that how future UIs are going to be made?

I believe it is. I'm not sure if it's a good idea, but that's neither
here nor there.
John said:
Maybe because a lot of users aren't really heavy users. A nice example
(IMO) of a web client that works quite well: webmessenger (
http://webmessenger.msn.com/ ). It has been some time since I used it
the last time, but if I recall correctly I hardly noticed that I was
chatting in a JavaScript pop up window.

Haven't ever needed to use that program.
John said:
I'd rather have my email stored locally :-) But several webmail services
offer a form to download email.

I've not seen a service that allows that. Sounds nice.
John said:
I read some time ago that about 1/3 of traffic consists of bittorrent
traffic... If the Internet gets congested, new techniques are needed,
like mod_gzip on every server, a way to transfer only deltas of webpages
if an update occurred (like Google did some time ago). Better handling of
RSS (I have the impression that there is no "page has not been
modified" thing like with HTML, or at least I see quite some clients
fetch my feed every hour, again and again).

Eventually you reach the point where it's not bandwidth any more, it's
server load. All these things like mod_gzip, deltas, and so on add
server load.
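That tradeoff can be seen in a toy sketch (Python, purely illustrative; the page contents are made up): the server burns extra CPU diffing and compressing, and in exchange the bytes on the wire shrink dramatically.

```python
import difflib
import zlib

# A repetitive "page" of 200 lines, with one line changed in the update.
old = "\n".join(f"<li>message {i}</li>" for i in range(200))
new = old.replace("<li>message 42</li>", "<li>message 42 (edited)</li>")

# Option 1: compress and send the whole updated page (mod_gzip style).
full = zlib.compress(new.encode())

# Option 2: compress and send only a delta against the client's cached
# copy, which the client then applies locally.
diff_text = "\n".join(
    difflib.unified_diff(old.splitlines(), new.splitlines(), lineterm=""))
delta = zlib.compress(diff_text.encode())

# The compressed delta is a small fraction of the compressed full page,
# at the cost of the diff computation on the server.
```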

As to the point about "page not modified", it's not in the HTML spec,
it's in the HTTP spec. RFC2616 (HTTP1.1) defines an "If-Modified-Since"
header a client may send to the server indicating that it has a cached
copy of the page at that date. If the page has not changed, the server
should send HTTP 304 (not modified) with no content. For best results
(due to clock mismatches etc), the client should set the
If-Modified-Since header to the value of the Last-Modified header sent
by the server when the page was first requested and cached.
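The server side of that handshake fits in a few lines. A sketch in Python (the `conditional_get` helper is hypothetical; a real server would do more, but the decision logic is this):

```python
from email.utils import parsedate_to_datetime

def conditional_get(if_modified_since, last_modified):
    """Decide between 200 and 304 per RFC 2616 section 14.25.

    Both arguments are HTTP-date strings; if_modified_since is None
    when the client sent no such header.
    """
    if if_modified_since is not None:
        try:
            cached = parsedate_to_datetime(if_modified_since)
            current = parsedate_to_datetime(last_modified)
        except (TypeError, ValueError):
            return 200  # unparseable date: ignore the header
        if current <= cached:
            return 304  # Not Modified: send headers only, no body
    return 200          # send the full page
```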

I think we can agree that in some cases, Webmail is better, and in
others, clients are better. Much of this will be personal preference,
and I would like to see ISPs offering both methods of accessing e-mail
(as mine in fact does - POP3 and Webmail).

Chris
 

Mike Meyer

Ulrich Hobelmann said:
I had JS off for a long time, but now so many websites expect it, and
even make browsing more convenient, that I grudgingly accepted it. ;)

I've turned it on because I'm using an ISP that requires me to log in
via a JavaScript-only web page. They have a link that claims to let
non-JS browsers log in, but it doesn't work. My primary browser
doesn't support JavaScript or CSS, and is configured with images
turned off, mostly because I want the web to be fast. What it does
have is the ability to launch one of three different external browsers
on either the current page or any link on the page, so those facilities
are a few keystrokes away. The default browser on my Mac has all that
crap turned on, but I don't have anything I consider either important or
sensitive on it, and the only client who has data on it considers such
things an acceptable risk.
Yep. Filtering is the future in the overloaded world.

And to the best of my knowledge, Google filters out content that my
primary desktop browser can't display.

<mike
 

Mike Meyer

Not a Windows solution, but I find the 'screen' utility invaluable as
I can have my email, news, and an editor open in different screens
and then when I need to move to a different machine, I can simply
detach and reattach screen without disturbing anything that
might be running.

For a more portable solution, check out VNC.

<mike
 

Xah Lee

previously i've made serious criticisms of Python's documentation
problems.
(see http://xahlee.org/perl-python/re-write_notes.html )

I have indicated that an exemplary documentation is Wolfram Research
Incorporated's Mathematica language. (available online at
http://documents.wolfram.com/mathematica/ )

Since Mathematica is a proprietary language costing over a thousand
dollars and most people in the IT industry are not familiar with it, i
like to announce a new discovery:

this week i happened to read the documentation of Microsoft's
JavaScript. See
http://msdn.microsoft.com/library/en-us/script56/html/js56jsconjscriptfundamentals.asp

This entire documentation is a paragon of technical writing. It has
clarity, conciseness, and precision. It does not abuse jargons, it
doesn't ramble, it doesn't exhibit author masturbation, and it covers
its area extremely well and completely. The documentation set is very
well organized into 3 sections: Fundamentals, Advanced, Reference. The
tutorial section “Fundamentals” is extremely simple and to the
point. The “Advanced” section gives a very concise yet easy-to-read
treatment of some fine details of the language. And its language
reference section is complete and exact.

I would like the IT industry programmers and the OpenSource fuckheads to
take note of this documentation so that you can learn.

Also, this is not the only good documentation in the industry. As i
have indicated, Mathematica's documentation is equally excellent. In
fact, the official Java documentation (the so-called Java API by Sun
Microsystems) is also extremely well-written, even though Java the
language is unnecessarily complex and involves far more technical
concepts that necessitate the use of proper jargons, as can be seen in
their doc.

An additional note i like to tell the OpenSource coding morons in the
industry is that, in general, the fundamental reason that the Perl,
Python, Unix, Apache etc. documentations are extremely bad in multiple
aspects is OpenSource fanaticism. The fanaticism has made it so that
OpenSource people simply became UNABLE to discern quality. This
situation can be seen in the responses to criticisms of OpenSource
docs. What makes the situation worse is OpenSource's mantra of
“contribution” — holding hostile any negative criticism unless
the critic “contributed” without charge.

Another important point i should point out is that the OpenSource
morons tend to attribute “lack of resources” as an excuse for their
lack of quality (when they are kicked hard enough to finally admit that
they do lack quality in the first place). No, it is not lack of
resources that made the OpenSource docs criminally incompetent.
OpenSource has created tools that take far more energy and time than
writing manuals. Lack of resources can of course be a contributing
reason, along with OpenSource coders' general lack of ability to write
well, among other reasons, but the main cause, as i have stated above,
is OpenSource fanaticism. It is that which has made them blind.

PS just to note that my use of OpenSource here does not include the
Free Software Foundation's Gnu's Not Unix project. The GNU project in
general has very excellent documentation. GNU docs are geeky in
comparison to the commercial entities' docs, but do not exhibit jargon
abuse, rambling, author masturbation, or hodgepodge as do the
OpenSource ones mentioned above.

Xah
(e-mail address removed)
∑ http://xahlee.org/
 

Rich Teer

On Sat, 27 Aug 2005, Xah Lee wrote:

His usual crap.

[ASCII art: a figure beside a sign reading "Please do NOT feed the
trolls" ]

 

Mike Meyer

Rich Teer said:
What about Java?

Using HTML, I can build applications that work properly on anything
from monochrome terminals to the latest desktop box. Is there a
UI toolkit for Java that's that flexible?

<mike
 

Ulrich Hobelmann

Mike said:
You missed the point: How are you going to provide native clients for
platforms you've never heard of?

Who says I have to? With open protocols, everybody can. I know many
platforms that STILL don't have a browser that would work with most
websites out there. They all have NNTP, SMTP and POP clients.
Text-mode, GUI-mode, your choice.
I'm not arguing that native UI's aren't better. I'm arguing that web
applications provide more portability - which is important for some
applications and some developers.

Like Java provides more portability. Unless you ran NetBSD in 2003
(there was no Java back then that worked for me), hm, IRIX?, Plan9,
BeOS, the list goes on... LOTS of platforms don't have the manpower to
develop a client that renders all of the huge bloated wagonload of W3C
tech that was only designed for *markup* from the beginning.
Why do you care if they are GUI or not? If you need to provide the
application for them, you need to provide the application for
them. Them not being GUI just means you can't try and use a standard
GUI library. It also means you have to know what you're doing when you
write HTML so that it works properly in a CLUI. But your native app
would have to have a CLUI anyway.

Ok, UI then ;)
I don't care what UIs people like and use.
Nobody. I doubt there's a fully standards compliant web browser

Nobody, huh? Then how could you run just ANY web application on those
platforms?
available for *any* platform, much less any non-trivial collection of
them. You write portable web applications to the standards, and design
them to degrade gracefully. Then you go back and work around any new

Oh right, they degrade gracefully. So without JavaScript or cookies
(the former is often not implemented) you get an HTML page with an error
notice -- if you're lucky.

A server AND client for a simple protocol designed for its task (i.e.
not FTP, for instance) can be implemented with much less work than
designing even part of a web application backend that does that kind of
stuff. Plus you're not bound by the HTTP request structure; you can use
publish/subscribe or whatever communication style you want for efficiency.
bugs you've uncovered in the most popular browsers - which
historically are among the *worst* at following standards.


You think you're done. A lot of developers think you can stop with the
first one or two. You're all right for some applications. For others,
you're not. Personally, I like applications that run on all the
platforms I use - and your set doesn't cover all three of those
systems.

Ok, I'd be interested to hear what those are. VMS, RiscOS, Mac OS 9...?
You know, the same kind of advice applies to writing portable web
apps. Except when you do it with HTML, "portability" means damn near
any programmable device with a network interface, not some relatively
small fraction of all deployed platforms.

Only that even years ago lots of even small platforms would run X, but
even today MANY platforms don't run a browser with XHTML/HTML4+JS+CSS
(well, okay, the CSS isn't that important).
There were standards before POSIX. They didn't cover everything people
wanted to do, or didn't do them as fast as the OS vendor wanted. So
Unix vendors added their own proprietary extensions, which software
vendors had to use to get the best performance out of their
applications, which they had to do if they wanted people to buy/use
them.

Performance? Hm, like epoll/kqueue vs select? Can't think of examples
here, but I pretty much only know BSD/POSIX.
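The epoll/kqueue-versus-select split is exactly this kind of case, and it also shows the usual cure: a portability layer that uses the vendor extension where it exists. Python's selectors module (used here only as an illustration of the pattern) resolves DefaultSelector to epoll on Linux, kqueue on the BSDs, and poll or plain select() elsewhere:

```python
import selectors
import socket

# DefaultSelector picks the best readiness API the platform offers
# (epoll, kqueue, poll, ...) and falls back to POSIX select().
sel = selectors.DefaultSelector()

a, b = socket.socketpair()
sel.register(b, selectors.EVENT_READ, data="peer")

a.sendall(b"ping")               # make b readable
ready = sel.select(timeout=1.0)  # epoll/kqueue/select underneath
key, events = ready[0]
msg = key.fileobj.recv(4)

sel.unregister(b)
a.close()
b.close()
```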
That's still going on - people are adding new functionality that isn't
covered by POSIX to Unix systems all the time, or they are adding
alternatives that are better/faster than the POSIX version, and there
are lots of things that applications want to do that simply aren't
covered by POSIX. And not all implementations are created equal. Some
platforms' mallocs provide - to be polite - less than optimal
performance under conditions real applications encounter, so those
applications conditionally use different malloc implementations. The

Well, OF COURSE malloc is ONE general-purpose function that HAS to carry
some overhead. I routinely use my frontend(s) for it, to cluster
allocations locally (for caching and alloc performance). A matter of
100 LOC, usually. No problem at all.

If a system's scheduler, or select implementation sucks, though, I'd
complain to the vendor or simply abandon the platform for another.
Competition is good :)
same thing applies to threads, except such code typically includes a
third option of not using threads at all. And so on.

Well, whoever doesn't do threads after several years of POSIX IMHO can't
be taken seriously. Ok, the BSDs didn't until recently, but those are
volunteer projects.
And we haven't even started talking about the build process...

If the libraries are installed, just build and link it (if you use
standard C, POSIX + 3rd party libs that do the same). If not, then
tough luck -- it couldn't even run in theory then.
Basically, deciding to write to POSIX is a decision to trade away
performance on/to some platforms for portability to more
platforms. It's the same decision as deciding to write a web app,
except the tradeoffs are different. Each of the three solutions has a
different set of costs and benefits, and the correct choice will
depend on your application.

I'd like to hear about those performance problems... If someone can't
make the standard calls efficient, they should leave the business and
give their customers Linux or BSD.
They've mostly died out since then. At the time, there were things
like Domain and VMS.

Never heard of Domain, but VMS is called NT/2000/XP/2003/Vista now (with
some enhancements and a new GUI). ;)
You mean most of the applications I run aren't real applications?
Right now, my desktop has exactly two GUI applications open on it - a
mixer and gkrellm. Everything else is character based. Hell, even my
window manager is character based.

I meant not using text elements. Of course it includes text, in your
case predominantly. But even most curses clients have other elements
sometimes, like links. A standard spec language could cater easily for
text clients, but a text language like HTML has a harder time to cater
for good GUI clients. Most apps I use have buttons and menus that I
wouldn't want to express with markup (and web pages that try to do that
almost invariably suck).
I think you're right - a web standard designed for writing real
applications probably wouldn't start life as a markup for text. The
only thing I can think of that even tries is Flash, but it's
proprietary so I don't know much about it.

Java has been mentioned in the other response, but there's also all
other kinds of application frameworks. Only XUL is markup based, with
the effect that there's almost no text at all between the markup tags I
guess ;)
Care to tell me how you would design such a format if the goal were to
*not* lose any portability - which means it has to be possible to
design interfaces that work properly on character devices, things like
Palm's three-color greyscale displays, and devices without pointers or
without keyboards, or even in an audio-only environment.

Colors can be sampled down. Even the new Enlightenment libs do that
(they say). For mapping a GUI client to a text client, ok, tough. Face
it, lots of things just can't be expressed in pure text. Images, PDF
viewing, video, simulation with graphical representations...

Pointers could be added to any kind of machine, and even without it, you
could give it a gameboy-style controller for cursor movement (i.e. arrow
keys).

I'm just not talking about a language for audio- and text-mode clients ;)
 

Mike Meyer

Ulrich Hobelmann said:
Who says I have to? With open protocols, everybody can. I know many
platforms that STILL don't have a browser that would work with most
websites out there. They all have NNTP, SMTP and POP
clients. Text-mode, GUI-mode, your choice.

The people who are distributing applications via the web. You want to
convince them to quit using web technologies, you have to provide
something that can do the job that they do.
Like Java provides more portability. Unless you ran NetBSD in 2003
(there was no Java back then that worked for me), hm, IRIX?, Plan9,
BeOS the list goes on... LOTS of platforms don't have the manpower to
develop a client that renders all of the huge bloated wagonload of W3C
tech that was only designed for *markup* from the beginning.

I'm still waiting for an answer to that one - where's the Java toolkit
that handles full-featured GUIs as well as character cell
interfaces. Without that, you aren't doing the job that the web
technologies do.
Nobody, huh? Then how could you run just ANY web application on those
platforms?

The same way you write POSIX applications in the face of buggy
implementations - by working around the bugs in the working part of
the implementation, and using conditional code where that makes a
serious difference.
Oh right, they degrade gracefully. So without JavaScript or cookies
(the former is often not implemented) you get an HTML page with an
error notice -- if you're lucky.

You left off the important part of what I had to say - that the
application be written by a moderately competent web author.
A server AND client for a simple protocol designed for its task
(i.e. not FTP, for instance) can be implemented with much less work than
designing even part of a web application backend that does that
kind of stuff.

Well, if it's that easy (and web applications are dead simple), it
should be done fairly frequently. Care to provide an example?
Ok, I'd be interested to hear what those are. VMS, RiscOS, Mac OS 9...?

FreeBSD, OS X and a Palm Vx.
If a system's scheduler, or select implementation sucks, though, I'd
complain to the vendor or simply abandon the platform for
another. Competition is good :)

Complaining to the vendor doesn't always get the bug fixed. And
refusing to support a platform isn't always an option. Sometimes, you
have to bite the bullet and work around the bug on that platform.
Well, anyone who doesn't do threads after several years of POSIX IMHO
can't be taken seriously. Ok, the BSDs didn't until recently, but those
are volunteer projects.

Not all platforms are POSIX. If you're ok limiting your application to
a small subset of the total number of platforms available, then
there's no advantage to using web technologies. Some of us aren't
satisfied with that, though.
If the libraries are installed, just build and link it (if you use
standard C, POSIX + 3rd party libs that do the same). If not, then
tough luck -- it couldn't even run in theory then.

You have to have the right build tool installed. Since you use BSD,
you've surely run into typing "make" only to have it blow up because
it expects gmake.
I meant not using text elements. Of course it includes text, in your
case predominantly. But even most curses clients have other elements
sometimes, like links. A standard spec language could cater easily
for text clients, but a text language like HTML has a harder time
catering for good GUI clients. Most apps I use have buttons and menus
that I wouldn't want to express with markup (and web pages that try to
do that almost invariably suck).

Well, marking up text is a pretty poor way to describe a UI - but
anything that is going to replace web technologies has to have a
media-independent way to describe the UI. One of the things that made
the web take off early was that anyone with a text editor could create
web pages. I think that's an important property to keep - you want the
tools that people use to create applications be as portable/flexible
as the applications. Since most GUIs are written in some programming
language or another, and most programming languages are still flat
text, a GUI description as flat text exists for most GUIs, so this
requirement isn't a handicap.
Java has been mentioned in the other response, but there's also all
other kinds of application frameworks. Only XUL is markup based, with
the effect that there's almost no text at all between the markup tags
I guess ;)

You don't have to guess - finding examples of XUL isn't hard at all. I
think XML gets used in a lot of places where it isn't appropriate. One
of the few places where it is appropriate is where you want a file
format that lots of independent implementations are going to be
reading. This could well be one of those times.
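For reference, a XUL UI description is exactly that kind of flat,
hand-editable XML text. A minimal fragment might look like this (the
element names and namespace are real XUL; the ids and labels are made
up for illustration):

```xml
<?xml version="1.0"?>
<window id="demo-window" title="Demo"
        xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul">
  <button id="ok-button" label="OK"/>
  <button id="cancel-button" label="Cancel"/>
</window>
```

As Ulrich notes below, there is almost no character data between the
tags -- the markup is carrying structure, not text.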
Colors can be sampled down. Even the new Enlightenment libs do that
(they say). For mapping a GUI client to a text client, ok, tough.
Face it, lots of things just can't be expressed in pure text. Images,
PDF viewing, video, simulation with graphical representations...

Applications aren't one of those things. Even applications that work
with those things don't need GUI interfaces.
Pointers could be added to any kind of machine, and even without it,
you could give it a gameboy-style controller for cursor movement
(i.e. arrow keys).

Yeah, if you're willing to tell your potential users "Go out and buy
more hardware". If you're Microsoft, you probably do that with the
addendum "from us". Not being Microsoft or a control freak, I want
applications that work with whatever the users already have.
I'm just not talking about a language for audio- and text-mode clients ;)

Then you're not talking about replacing HTML et al.

<mike
 
U

Ulrich Hobelmann

Mike said:
I'm still waiting for an answer to that one - where's the Java toolkit
that handles full-featured GUIs as well as character cell
interfaces. Without that, you aren't doing the job that the web
technologies do.

Where is the text-mode browser that would even run part of the web apps
I use, like home-banking, all web forums, server configuration
interfaces, etc.? I think we should leave both these questions open.
(In fact, as to UIs using Java, blech! Haven't seen a really good one...)
The same way you write POSIX applications in the face of buggy
implementations - by working around the bugs in the working part of
the implementation, and using conditional code where that makes a
serious difference.

But as soon as some user of platform 54 tries your website, she'll
encounter some weird behavior without even knowing why. And maybe so
will you, especially if you don't have that platform there for testing.
I don't understand how this web thing changes anything... With POSIX
at least you have a real bug-report for the guy responsible for it. If
a platform keeps being buggy, with no fixes coming, screw them. Every
user will see that sooner or later, and these platforms die. Even
Windows is quite stable/reliable after 10+ years of NT!
You left off the important part of what I had to say - that the
application be written by a moderately competent web author.

But if you can cater for all kinds of sub-platforms, then why not just
provide a CLI as well as those GUI interfaces, when we're duplicating
work to begin with? ;)

If it doesn't run without JS, then you lock out 90% of all alive
platforms (and maybe 1% of all alive users :D) anyway.
Well, if it is that easy (and web applications are dead simple), it
should be done fairly frequently. Care to provide an example?

We have all the web standards, with various extensions over the years.
Some FTP clients don't even crash if they see that some server doesn't
yet support the extension from RFC XY1234$!@. Then there's tons of
inter-application traffic in XML already, growing fast. Then there are
s-expressions (Lisp XML if you want). Then probably thousands of ad-hoc
line-based text protocols, but I don't know how well they can be
extended. There's CORBA. Most web standards are simple, at least if
you would subtract the weird stuff (and IMHO there should be new
versions of everything with the crap removed). XML is somewhat simple,
just hook libxml.

There's NNTP. There's RSS. There's Atom. The latter two emerged quite
painlessly, even though you could maybe use some website for what they
provide. But this way you have lots of clients for lots of platforms
already.
FreeBSD, OS X and a Palm Vx.

Didn't I say, a GUI for the Mac, for X11, and Windows? That only leaves
out the Palm. I heard they aren't too hard to program for, either. But
I haven't heard of a really decent browser for pre-OS5 PalmOS (not sure
about OS5).
Complaining to the vendor doesn't always get the bug fixed. And
refusing to support a platform isn't always an option. Sometimes, you
have to bite the bullet and work around the bug on that platform.

Sure, but you can tell your customers that unfortunately their system
vendor refuses to fix a bug and ask THEM to ask that vendor. Boy, will
they consider another platform in the future, where bugs do get fixed ;)
Not all platforms are POSIX. If you're ok limiting your application to
a small subset of the total number of platforms available, then
there's no advantage to using web technologies. Some of us aren't
satisfied with that, though.

Sure. You have to look where your users are. Chances are that with
obscure systems they can't use most web-apps either.
You have to have the right build tool installed. Since you use BSD,
you've surely run into typing "make" only to have it blow up because
it expects gmake.

With 3rd-party stuff, yes. The little that I've written so far compiled
immediately (with pmake and gmake), except for C99 (the FreeBSD I tried
had gcc 2.95 installed). But now I write all my C as pre-C99 anyway; it
looks cleaner, IMHO.
Well, marking up text is a pretty poor way to describe a UI - but
anything that is going to replace web technologies has to have a
media-independent way to describe the UI. One of the things that made
the web take off early was that anyone with a text editor could create
web pages. I think that's an important property to keep - you want the
tools that people use to create applications be as portable/flexible
as the applications. Since most GUIs are written in some programming
language or another, and most programming languages are still flat
text, a GUI description as flat text exists for most GUIs, so this
requirement isn't a handicap.

That's true, though I think the future of development lies in overcoming
that program-code-as-text thing (NOT visual programming, just
tool-based, structured). Smalltalk did it decades ago.
You don't have to guess - finding examples of XUL isn't hard at all. I
think XML gets used in a lot of places where it isn't appropriate. One
of the few places where it is appropriate is where you want a file
format that lots of independent implementations are going to be
reading. This could well be one of those times.

Maybe, but for applications that aren't predominantly concerned with
text, I'd really rather use a structured data type (like s-expressions),
not text markup like XML. For hypertext, XHTML is fine, though, if a
bit verbose.
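To illustrate the contrast, here is the same trivial dialog described as
an s-expression instead of markup (the dialect is invented for this
example; no real toolkit is implied):

```lisp
;; Hypothetical s-expression UI description: pure structure,
;; no closing tags, attributes as keyword/value pairs.
(dialog :title "Save changes?"
  (button :id ok     :label "Save")
  (button :id cancel :label "Discard"))
```

The XML equivalent would need an opening and closing tag per element;
for structure-heavy, text-light data like a UI tree, the s-expression
form carries the same information with less syntactic overhead.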

[follow-up set to comp.unix.programmer]

(I just noticed I replaced my sig with something web-related yesterday.
This is pure coincidence :D)
 
A

axel

For a more portable solution, check out VNC.

I know... but it is a bugger to set up and I believe it is no longer
freeware (if it ever was), and it does not have the stark simplicity
which screen has... I only need to have a compiled version of screen
on the machine on which I do most of my work and be able to ssh/telnet
to that machine without involving any additional software installations
on other machines.

Axel
 
H

Henry Law

I wonder could you guys stop cross-posting this stuff to
comp.lang.perl.misc? The person who started this thread - a
well-known troll - saw fit to post it there, and now all your posts
are going there too.
 
