Jargons of Info Tech industry


Denis Kasak

T Beck said:
[QUOTE]
Wasn't the point... I never said they were. HTML is at version 4.0 (I
think?) now, AND we've added extra layers of stuff you can use
alongside of it. The internet is a free-flowing, evolving place... to
try to protect one little segment like Usenet from ever evolving is
just ensuring its slow death, IMHO.

That's all...[/QUOTE]

HTML is at version 4.01 to be precise, but that is precisely off-topic.
This discussion has been going on for long enough. It is held in a
large crosspost to newsgroups that have nothing to do with HTML or the
evolution of Usenet. The bottom line is that most Usenet users like it
the way it is now, and it certainly serves its purpose. The Web and
Usenet should not mix, as they are two distinct entities, and merging
them would erase some of their distinctive qualities, making the
Internet a poorer place.

I suggest letting the matter rest or taking it to a more appropriate
newsgroup.

-- Denis
 

Ulrich Hobelmann

John said:
[QUOTE]
So what do you want? An error page for every site that wants to set a
cookie?[/QUOTE]

No, the few sites where I actually have to log in to do anything useful,
when they're well-coded, tell me that they need cookies, and if I think
I like that website I make an exception entry for that site, allowing
cookies. Most sites just bombard you with useless, crap cookies (maybe
advertising), so they are silently ignored by my browser.

The only thing I hate is when I am directed to some website that needs
cookies but doesn't tell me. A couple of times I did a survey, wasting
maybe 10 minutes of my life for a good cause, and then there was an
error. Great! I guess that page needed cookies but didn't bother to
tell me. The Back button didn't work either, so I just left that website.

OTOH, people who can't code can be fun, too, such as when you visit a
website and there are lots of PHP, Java, SQL, or ASP errors ;)
 

Gordon Burditt

Ulrich Hobelmann said:
[QUOTE]
The only thing I hate is when I am directed to some website that needs
cookies but doesn't tell me. A couple of times I did a survey, wasting
maybe 10 minutes of my life for a good cause, and then there was an
error. Great! I guess that page needed cookies but didn't bother to
tell me. The Back button didn't work either, so I just left that website.[/QUOTE]

Some sites do much worse than that. If you have cookies off, they
cause an infinite redirect loop. Sometimes my browser manages to
detect this after a few minutes and shuts it off, and sometimes it
doesn't (it varies by site). I think I can manually get out of this
with the STOP button, but until I do, it likely causes a lot of
useless load on the website.
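For illustration, a minimal sketch of the anti-pattern Gordon describes,
using only Python's standard library: the server sets a cookie and
redirects back to itself, expecting the cookie on the next request, so a
cookie-less browser loops forever (the cookie name, value, and port are
invented):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class CookieCheckHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if "session=" in (self.headers.get("Cookie") or ""):
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(b"Welcome back!\n")
            else:
                # No cookie came back: set one and redirect to the same
                # URL, assuming the next request will carry it. With
                # cookies off, the browser repeats this step forever.
                self.send_response(302)
                self.send_header("Set-Cookie", "session=abc123")
                self.send_header("Location", self.path)
                self.end_headers()

    HTTPServer(("localhost", 8080), CookieCheckHandler).serve_forever()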

Gordon L. Burditt
 

John W. Kennedy

CBFalconer said:
[QUOTE]
Chris Head wrote:

.... snip ...

Because the Lord High PoohBah (Bill) has so decreed. He has
replaced General Bullmoose.[/QUOTE]

Not particularly his doing. SGI was using a Netscape plugin to
distribute and install operating-system patches when Billionaire
"Intelligent Design" Billy was still denying that TCP/IP had a future.

And there are places for web forums: public feedback pages, for example.
(Add RSS and/or e-mail and/or NNTP feeds for more advanced users.)
 

Chris Head


John said:
[QUOTE]
It can be made much faster. There will always be a delay since messages
have to be downloaded, but with a fast connection and a good design, the
delay will be very very small and the advantages are big.[/QUOTE]

What advantages would those be (other than access from 'net cafes, but
see below)?

[snip]
[QUOTE]
if A -> B, it doesn't say that B -> A :) I.e. that it works via HTML
doesn't mean it doesn't work with a dedicated client ;-).

I live in Mexico; most people here rely on so-called Internet cafes for
their connection, and even for the use of a computer. For them
Thunderbird *doesn't work*.[/QUOTE]

This point I agree with. There are some situations - 'net cafes
included - where thick e-mail clients don't work. Even so, see below.

[QUOTE]
Each has its place. A bug in a thick client means each and every one
has to be fixed. With a thin one, just one has to be fixed :-D.[/QUOTE]

True. However, if people are annoyed by a Thunderbird bug, once it's
fixed, most people will probably go and download the fix (the
Thunderbird developers really only need to fix the bug once too).

[QUOTE]
Depends on where your mailbox resides. Isn't there something called
MAPI? (I haven't used it myself, but I recall something like that.)[/QUOTE]

IMAP. It stores the messages on the server. Even so, it only has to
transfer the messages, not the bloated UI. I concede that Webmail might
be just as fast when using a perfectly-designed Javascript/frames-driven
interface. In the real world, Webmail isn't (unfortunately) that perfect.

As I said above regarding 'net cafes:

If the Internet cafe has an e-mail client installed on their computers,
you could use IMAP to access your messages. You'd have to do a bit more
configuration than for Webmail, so it depends on the user I guess.
Personally I doubt my ISP would like me saving a few hundred megs of
e-mail on their server, while Thunderbird is quite happy to have 1504
messages in my Inbox on my local machine. If I had to use an Internet
cafe, I would rather use IMAP than Webmail.
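For illustration, a minimal sketch of the IMAP approach Chris describes,
using Python's standard imaplib (the host and credentials are
placeholders):

    import imaplib

    conn = imaplib.IMAP4_SSL("imap.example.com")
    conn.login("user", "password")
    conn.select("INBOX", readonly=True)

    # Fetch only the headers of the most recent message: the messages
    # stay on the server, and no web UI has to be transferred at all.
    typ, data = conn.search(None, "ALL")
    last = data[0].split()[-1]
    typ, msg = conn.fetch(last, "(BODY.PEEK[HEADER])")
    print(msg[0][1].decode(errors="replace"))

    conn.logout()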
[QUOTE]
Ah, yeah, wasn't that predicted to happen in like 2001?[/QUOTE]

Wasn't what predicted to happen? Congestion? It happens even today
(maybe it's the Internet, maybe it's the server, whatever...). Hotmail
is often pretty slow.

[QUOTE]
Also, unless you have some program that kills spam on the server, you
have to download it all with Thunderbird. I remember a funny day when I
got 2000 messages/hour due to a virus outbreak :-( With Hotmail, if you
have 100 new messages you download them only when you read them. Or kill
them when you don't want to read them.[/QUOTE]

Fortunately I'm not plagued by spam. I get around 150 messages per day.
Of those, about 140 are from a mailing list, 5 are personal, and 5 are
spam. I used to get about 100 messages per day of which 90 or so were
spam, but it suddenly stopped. To this day, I have not figured out why.
Nevertheless, I agree that not having to download all those messages is
one place where Webmail blows POP out of the water (but IMAP, which
could be a sort of "middle ground", doesn't suffer from this).

Chris
 

John Bokma

Denis Kasak said:
[QUOTE]
Agreed. This is actually the first post of yours whose content I agree
with totally. From your other posts I got the impression that you are
one of those people who are trying to make Usenet and the WWW more
similar to one another.[/QUOTE]

No, not at all. My point is what I wrote above. But also, a lot of the
functionality available on Usenet is also available on the www. You don't
*have* to convert people to Usenet, or warn them against a forum on the
www. For day-to-day usage I experience hardly any difference. I mean, I am
not struggling when I post a message on a web board. And if things take
time to download (for example a thread with many, many pictures), I switch
to another tab and do something else (tab browsing is fun :). Each has its
place. I don't want www on Usenet, and I don't want Usenet on www.
 

John Bokma

Rich Teer said:
[QUOTE]
Heh. Quite the opposite, I reckon: it would get much better (higher
SNR)! :)[/QUOTE]

:-D. I recently contributed to a thread in which someone was afraid that
Usenet was going to die, because he had the impression there were fewer
people on it. There are more people on it now than 20 years ago, when it
was also about to die (as it is every 2 years).
 

John Bokma

Denis Kasak said:
[QUOTE]
Yes, they can, provided they are not properly coded. However, those
things only interact locally with the user and have no, or very
limited, interaction with the user on the other side of the line. As
such, they can hardly be exploitable.[/QUOTE]

Uhm... one post can affect a number of clients, hence quite exploitable.

[QUOTE]
On a good newsreader the memory use difference should be irrelevantly
small, even if one does not use the features. I would call that a
nitpicky argument.[/QUOTE]

Xnews - 10 M
Thunderbird - 20 M

There was a time I had only 128M in this computer :-D. And there was a
time I read news on a RISC OS machine. I guess the client was about
300 K (!).

[QUOTE]
Also, the risk in question is not comparable because of the reasons
stated above. The kind of risk you are talking about happens with /any/
software.[/QUOTE]

True. The more code, the more possible holes.

[QUOTE]
To stay away from that we shouldn't have newsreaders (or any other
software, for that matter) in the first place.[/QUOTE]

telnet :p.

[QUOTE]
No, you missed the point. I am arguing that HTML is completely and
utterly /useless/ on Usenet.[/QUOTE]

But I beg to differ :). I can think of several *good* uses of HTML on
Usenet. But like I said, it will be abused. And you can't enforce a
subset of HTML.

[QUOTE]
Time spent for writing HTML in Usenet posts is comparable to that spent
on arguing about coding style or writing followups to Xah Lee.[/QUOTE]

But you are not going to *write* HTML; you let your client hide that. I
mean, it's not that hard to have a client turn *bold* into
<strong>bold</strong>. As for arguing about coding style: agreed, I have
learned things from those arguments, and have even adjusted my style
based on them. Writing followups to Xah Lee, though - ok, now there is
something one shouldn't spend time on :)

[QUOTE]
It adds no further insight on a particular subject,[/QUOTE]

Yes, it does. That's why, for example, figures, tables, and now and then
colours are used in scientific publications. ASCII art, now that's a
huge waste of time.

[QUOTE]
but _does_ add further delays, spam, bandwidth consumption, exploits,
and is generally a pain in the arse. It's redundant.[/QUOTE]

I have to disagree. Mind, I am not saying that HTML *should* be used on
Usenet - I am happy with Usenet as it is - but I wouldn't call it
useless or redundant.
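For illustration, a minimal Python sketch of the client-side conversion
John has in mind - turning *bold* markers into HTML before a post is
sent. A toy example, not a feature of any real newsreader:

    import re

    def markup_to_html(text: str) -> str:
        """Turn *word* emphasis, as typed in a newsreader, into HTML."""
        # Non-greedy match, so "*a* and *b*" becomes two separate tags.
        return re.sub(r"\*(.+?)\*", r"<strong>\1</strong>", text)

    print(markup_to_html("it's *not* that hard"))
    # -> it's <strong>not</strong> that hard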
 

John Bokma

Denis Kasak said:
[QUOTE]
And that is, in your opinion, completely comparable to running your
own, private client?[/QUOTE]

Oh, but if you want your own private client, feel free to come up with
one. I for one would welcome an XML interface to, for example, phpBB.
(Not sure if such a thing exists.) So I agree with you in part.

However, how many people *need* a kill file? Most boards have active
moderators. Also, in my experience, most boards are a tighter-knit
crowd with less need for kill filing.

[QUOTE]
Is the admin obliged to install the mod?[/QUOTE]

No, but in my experience, they listen.

[QUOTE]
Is the admin even reachable?[/QUOTE]

Of course.

[QUOTE]
You seem to be forgetting that we are mainly talking about end users
here, who most probably will not have sufficient expertise to do all
that. And even if they do, it's still time consuming.[/QUOTE]

No, I am not. Most end users of those boards don't *require* what you
want; you look at a board from a programmer's point of view. And hence,
as a programmer, you *can* do such things. Moreover, the average end
user doesn't care.

If I am not happy with my Usenet client, I have the same problem. I like
Xnews for example, but AFAIK it's closed source. I don't know of any
open source Windows client that comes close to Xnews.

It's time consuming because there is (yet) no need for it. When I
started to use Usenet there were only a handful of clients (IIRC); nn
and another one (rn?) are the only ones that I can recall.

Like I said, it's not that hard to create a SOAP/XML-RPC interface to,
for example, phpBB. Maybe it's already there; the tools certainly are.
And a next step could be to create a wrapper: one end acts as a local
NNTP server, the other end talks to phpBB using XML.

Once that's written, you could use (probably within limits) a Usenet
client :)
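For illustration, a sketch of the kind of interface John is imagining.
Note that this is hypothetical: phpBB ships no such XML-RPC API, and the
endpoint and method names below are invented:

    import xmlrpc.client

    # Invented endpoint: a phpBB board exposing an XML-RPC service.
    board = xmlrpc.client.ServerProxy("http://forum.example.com/xmlrpc.php")

    # Invented methods: list the topics in forum 42, then fetch one
    # thread. A local NNTP-gateway wrapper would map group/article
    # requests onto calls like these.
    for topic in board.forum.listTopics(42):
        print(topic["id"], topic["title"])

    thread = board.forum.getThread(1337)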

Now, this seems to be crossposted to several comp.lang groups. Anyone?
 

John Bokma

T Beck said:
[QUOTE]
John said:
[QUOTE]
They are not additions to HTML, like PNG is no addition to HTML, or
wav, mp3, etc.[/QUOTE]
[snip]

Wasn't the point... I never said they were.[/QUOTE]

"HTML itself has grown. We've also added Javascript"

I read that as: JavaScript is an addition to HTML.

[QUOTE]
HTML is at version 4.0 (I think?)[/QUOTE]

4.01? And I think it will stay there, since XML seems to be the future.

[QUOTE]
now, AND we've added extra layers of stuff you can use alongside of it.
The internet is a free-flowing, evolving place... to try to protect one
little segment like Usenet from ever evolving is just ensuring its slow
death, IMHO.[/QUOTE]

And if so, who cares? As long as people hang out on Usenet, it will stay.
Does Usenet need all those extra gimmicks? To me, it would be nice if a
small set were available. But need? No.

The death of Usenet has been predicted for ages. And I see only more and
more groups, and maybe more and more people on it.

As long as people who have something sensible to say keep using it, it
will stay.
 

John Bokma

Ulrich Hobelmann said:
[QUOTE]
No, the few sites where I actually have to log in to do anything
useful, when they're well-coded, tell me that they need cookies, and
if I think I like that website I make an exception entry for that
site, allowing cookies. Most sites just bombard you with useless,
crap cookies (maybe advertising), so they are silently ignored by my
browser.[/QUOTE]

Delete them after each session automatically, except the ones on the
exception list. You are clearly not an average user, so your usage
pattern probably only messes up the stats they obtain via cookies anyway.

I have long ago given up on manually accepting each and every cookie
and trying to guess its purpose.
 

John Bokma

Chris Head said:
[QUOTE]
John said:
[QUOTE]
It can be made much faster. There will always be a delay since
messages have to be downloaded, but with a fast connection and a good
design, the delay will be very very small and the advantages are big.[/QUOTE]

What advantages would those be (other than access from 'net cafes, but
see below)?[/QUOTE]

And workplaces. Some people have more than one computer in the house. My
partner can check her email when I had her over the computer. When I
want to check my email while she is using it, I have to change the
session, fire up Thunderbird (which eats up 20M), and change the
session back.

[ .. ]
[QUOTE]
True. However, if people are annoyed by a Thunderbird bug, once it's
fixed, most people will probably go and download the fix (the
Thunderbird developers really only need to fix the bug once too).[/QUOTE]

Most people who use Thunderbird, yes. Different with OE, I am sure. With
a thin client, *everybody* gets the fix at once.
[QUOTE]
IMAP. It stores the messages on the server. Even so, it only has to
transfer the messages, not the bloated UI.[/QUOTE]

But technically the UI (whether bloated or not) can be cached, and with
Ajax, frames, etc. there is not really a need to refresh the entire
page. With smarter techniques (like automatically zipping pages),
techniques like transmitting only deltas (Google experimented with this
some time ago), and better and faster rendering, the UI could be as
fast as a normal UI.

Isn't the UI in Thunderbird and Firefox created using JavaScript and
XML? Isn't that how future UIs are going to be made?
[QUOTE]
I concede that Webmail
might be just as fast when using a perfectly-designed
Javascript/frames-driven interface. In the real world, Webmail isn't
(unfortunately) that perfect.[/QUOTE]

Maybe because a lot of users aren't really heavy users. A nice example
(IMO) of a web client that works quite well: webmessenger (
http://webmessenger.msn.com/ ). It has been some time since I last used
it, but if I recall correctly I hardly noticed that I was chatting in a
JavaScript pop-up window.
[QUOTE]
As I said above regarding 'net cafes:

If the Internet cafe has an e-mail client installed on their
computers, you could use IMAP to access your messages. You'd have to
do a bit more configuration than for Webmail, so it depends on the
user I guess. Personally I doubt my ISP would like me saving a few
hundred megs of e-mail on their server, while Thunderbird is quite
happy to have 1504 messages in my Inbox on my local machine. If I had
to use an Internet cafe, I would rather use IMAP than Webmail.[/QUOTE]

I'd rather have my email stored locally :-). But several webmail
services offer a way to download email.
[QUOTE]
Wasn't what predicted to happen? Congestion? It happens even today
(maybe it's the Internet, maybe it's the server, whatever...). Hotmail
is often pretty slow.[/QUOTE]

I read some time ago that about 1/3 of Internet traffic consists of
BitTorrent traffic... If the Internet gets congested, new techniques are
needed, like mod_gzip on every server, or a way to transfer only deltas
of webpages when an update has occurred (like Google did some time ago).
Better handling of RSS, too (I have the impression that there is no
"page has not been modified" mechanism like with HTML, or at least I see
quite some clients fetch my feed every hour, again and again).
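For illustration, a minimal sketch of the two mechanisms John mentions,
using Python's standard urllib: a conditional GET (so the server can
answer 304 Not Modified instead of resending the whole feed), plus an
Accept-Encoding header inviting the gzip compression that mod_gzip
provides. The URL and date are placeholders:

    import urllib.request
    import urllib.error

    req = urllib.request.Request(
        "http://example.com/feed.xml",
        headers={
            # Only send the body if it changed since the last fetch.
            "If-Modified-Since": "Sat, 27 Aug 2005 12:00:00 GMT",
            # Let the server compress the response if it can.
            "Accept-Encoding": "gzip",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            body = resp.read()  # changed: full (possibly gzipped) download
    except urllib.error.HTTPError as e:
        if e.code == 304:
            pass  # unchanged: only the status line crossed the wire
        else:
            raise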
 

Ulrich Hobelmann

John Bokma wrote:
[QUOTE]
[cookies]
Delete them after each session automatically, except the ones on the
exception list.[/QUOTE]

But why? I simply don't even take them, except those on my exception
list ;) Some people have all cookies turned off.

[QUOTE]
You are clearly not an average user, so your usage pattern probably
only messes up the stats they obtain via cookies anyway.

I have long ago given up on manually accepting each and every cookie
and trying to guess its purpose.[/QUOTE]

Exactly. That's why I don't do that. I just block them all, except for
some good sites where I want to be auto-logged in.
 

Rich Teer

John Bokma said:
[QUOTE]
And workplaces. Some people have more than one computer in the house. My
partner can check her email when I had her over the computer. When I[/QUOTE]

I know this is entirely inappropriate and OT, but am I the only person
who reads that sentence with a grin? The idea of my wife checking her
email while I'm "doing her" over my computer is most amusing! :)

--
Rich Teer, SCNA, SCSA, OpenSolaris CAB member

President,
Rite Online Inc.

Voice: +1 (250) 979-1638
URL: http://www.rite-group.com/rich
 

Brian Raiter

Rich Teer said:
[QUOTE]
I know this is entirely inappropriate and OT, [...][/QUOTE]

Yeah -- unlike the rest of this misbegotten thread, which is right
bang on-topic for all five newsgroups and is not suffering at all from
topic drift, no not in the least.

b
 

Mike Meyer

John Bokma said:

Neither one is installed by default on the systems in question. Both
are available via the system packaging tools.

fetch is installed on FreeBSD, but all it does is download the
contents of a URL - it doesn't render them.

<mike
 

Mike Meyer

[QUOTE]
And links. And cookies. And any kind of external site or local
file access. And browser history.[/QUOTE]

That depends on whether you're trying to keep an HTML message from
doing anything nasty (like revealing that you read it) when you render
it, or to make sure it *never* does anything nasty, no matter what you
do with the message.

If all you want is the former - which is what the OP asked for, and
what I was replying to - then nothing on the list you gave is required.
Some of the things you list are a danger even without HTML; most modern
news/mail readers will follow links in flat ASCII.

<mike
 

Mike Meyer

Ulrich Hobelmann said:
[QUOTE]
No, the few sites where I actually have to log in to do anything
useful, when they're well-coded, tell me that they need cookies, and
if I think I like that website I make an exception entry for that
site, allowing cookies. Most sites just bombard you with useless,
crap cookies (maybe advertising), so they are silently ignored by my
browser.[/QUOTE]

I believe (but I'm not sure) that some releases of Apache could be
configured in such a way that they would start using cookies without
you having to turn them on.
[QUOTE]
The only thing I hate is when I am directed to some website that needs
cookies but doesn't tell me. A couple of times I did a survey, wasting
maybe 10 minutes of my life for a good cause, and then there was an
error. Great! I guess that page needed cookies but didn't bother to
tell me. The Back button didn't work either, so I just left that website.[/QUOTE]

Try turning off JavaScript (I assume you don't because you didn't
complain about it). Most of the sites on the web that use it don't
even use the NOSCRIPT tag to notify you that you have to turn the
things on - much less use it to do something useful.

Sturgeon's law applies to web sites, just like it does to everything
else.

<mike
 

Mike Meyer

John Bokma said:
[QUOTE]
This can be designed much better by using iframes, maybe even Ajax.[/QUOTE]

Definitely with Ajax. That's one of the things it does really well.

[QUOTE]
Because it works?[/QUOTE]

Because you can - if you know how to use HTML properly - distribute
your application to platforms you've never even heard of, like the
Nokia Communicator.

I started writing web apps when I was doing internal tools development
for a software development company that had 90+ different platform
types installed in-house. It was a *godsend*. By deploying one
well-written app, I could make everyone happy without having to do
versions for the Mac, Windows, and DOS (this was a while ago), get it
to compile on umpteen different Unix versions, and make it work on
proprietary workstation OSes.

Of course, considering the state of most of the HTML on the web, I
have *no* idea why most of them are doing this.

<mike
 

Mike Meyer

John Bokma said:
[QUOTE]
It's time consuming because there is (yet) no need for it. When I
started to use Usenet there were only a handful of clients (IIRC); nn
and another one (rn?) are the only ones that I can recall.[/QUOTE]

By the time nn was out, there were a number of radically different
alternatives. The original news client (not NNTP - it predated that)
was readnews. rn was the first alternative to gain any popularity. By
the time it came out, there were alternative curses-based readers like
notes and vnews. By the time nn came out, there were even X-based news
readers available, like xrn and xvnews.

It may be that the site you were at only offered a few readers. But
that's a different issue.

All of this is from memory, of course - and may well be wrong.

<mike
 
