Future reuse of code


Roedy Green

I prefer assembly language to everything...what does that mean?

The great appeal of writing the core of my Forth/Abundance interpreter
in assembler was that I knew exactly what was going on inside, down to
the bit level. Nothing was happening that I did not know about. This
desire for microcontrol and perfection is best satisfied by writing in
Assembler. The only idiots you have to swear at are the folk who
designed the instruction set.
 

Dr Engelbert Buxbaum

Paul Hsieh wrote:

COBOL and Pascal (the other groups you crossposted this message to)
will decrease in usage over time, not increase. There is absolutely
no new serious development being done in either language. In 15
years, Pascal will probably be completely dead, and the COBOL
community will be reduced even from the size of today's community
(human mortality alone will guarantee this.)

This may be true for COBOL, but Pascal is very much alive and kicking,
in the form of Delphi/Kylix. I am currently writing Kylix software; most
of the cutting-edge routines (those that do the real work rather than the user
interface) are straight plug-ins of 15-year-old Turbo Pascal code. Now
that Borland is going for cross-platform (Windozze/Unix) compatibility,
there is no reason why Pascal should die in the foreseeable future.
 

goose

Howard Brazee said:
Greatest number of machines.

I don't think someone writing a business application cares about, say,
stoplights. But stoplights are machines with computer programs in them.

So the question should be - which language gives me an advantage in reaching
more prospective paying customers for my product with the least cost to me?

the original statement (which was snipped) was
----
OK, I exaggerated. But it runs on a lot more platforms than anything else
without costing more.
----

I have already pointed out that this is not true.
If my application runs best on big iron, that may be COBOL. (Good for me: that
is my native programming language.)
If my application is to show me on my handheld which golf club I need for my
next shot (according to my past history, a map of the course, and the GPS
satellite), then COBOL isn't a good choice.

But if I want to create a program that all of the students in a university
can use to interface with the campus's computers, I can assume most of them can
already run my XML and Java code.

And yet creating a std C program would not only get you that, it would also
get you a fairly snappy application *and* leave you open in the future
to supporting those people who have machines that are not
capable of running Java (certain designer palmtop types), so that they can *also*
interface with the campus machines.

Java doesn't *buy* you anything extra in terms of portability.
The only relatively *portable* use I can think of is writing
applets for web pages (note: /relatively/), as long as the browser
has a Java runtime environment, of course.

Java does have its advantages. Portability isn't one of them.

hth
goose,
I feel very strongly about the "while" loop. I suggest we take
it hostage to demand the release of the "goto" ;-)
 

Marco van de Voort

There are 400,000,000 reasons why ALL procedural languages (including COBOL
and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)

Really? Please name and discuss them.
They are the number of people who access the internet every day. (For the
sake of this argument, I'll call them the "user base"...) They are not about
to become "computer programmers".
Indeed.

Instead, they will demand better interfaces, smarter software,
True

and MUCH better ways of developing computer systems than sequential Von
Neumann code.

On the contrary: especially for these kinds of users, sequential jobs are a
way of thinking that is normal to them.
Most of them are "smarter" and more "computer literate" than their
predecessors of even 10 years ago.

Yes. They are not scared anymore. OTOH the requirements on them have severely
increased as well. I sometimes doubt whether increased computer literacy has actually
kept up with the added computer tasks for the average person.
They are not intimidated by computer technology, will happily interact
with smart software to achieve a result, and are not prepared to rely on
and wait for, remote, faceless, technocrats to provide them with computer
solutions to business problems.

Yes, they want smug, buzzword-talking con-men to take advantage of them?
We may have our own favourite Languages and we can poddle away in a corner
somewhere cutting code for the fun of it, but the real world demands that it
get solutions.

Exactly. So as long as my solution is good, and I can justify using a language,
what is the problem?
By 2015 a new generation of development software will see "programmers"
removed from the loop and end users interacting and iterating with smart
software until they get what they want.

Sure. The telepathic kinds.
Procedural code is already into Gotterdammerung.

It takes too long, requires too much skill,

Programming is what requires the skill, not the language. If you studied programming
more closely, you'd know that.
is too inflexible (the accelerating rate of change in the Marketplace and
in technology is another reason why it is doomed to extinction) and,
overall, costs far too much.

And where are your references for that? You don't even say what it is up
against, except some vague references to software which is going to
emerge as a winner in 2015 (and which I assume is telepathic, at least judging
by your description).
skills... Why bother? Why should an Insurance company spend $50,000,000 a
year on in-house IT when they could buy the service for $10,000,000?

Ah, but could they, and with the same secondary securities? Price is not the only
point of competition.
The only thing that COULD save procedural coding of solutions would be if
it priced itself back into the market. This MIGHT happen with offshore
outsourcing, but it is unlikely.

Bottom Line: Don't get smug about COBOL dying and PASCAL surviving; they are
on the same parachute and the ground is coming up....

Bottom Line: I think we can safely award you the "troll of the week" award, with
"don't panic" in nice friendly letters.
 

Karl Heinz Buchegger

Peter E.C. Dashwood said:
There are 400,000,000 reasons why ALL procedural languages (including COBOL
and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)

.... and replaced by what?

In the early '80s there was a lot of hype about PROLOG: the Japanese were working
with PROLOG, and within 10 years PROLOG would replace traditional procedural
computer languages completely. So, where is PROLOG today, 20 years later?

[snip a lot of interesting thoughts]
Bottom Line: Don't get smug about COBOL dying and PASCAL surviving; they are
on the same parachute and the ground is coming up....

Procedural languages will be around for a long time. The languages may be different,
but they will still use the same principle. Knowing how to program in this paradigm will
still be the entry key to programming those languages. The rest is syntactic
sugar (simplified).
 

Peter E.C. Dashwood

Karl Heinz Buchegger said:
Peter E.C. Dashwood said:
There are 400,000,000 reasons why ALL procedural languages (including COBOL
and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)

... and replaced by what?

In the early '80s there was a lot of hype about PROLOG: the Japanese were working
with PROLOG, and within 10 years PROLOG would replace traditional procedural
computer languages completely. So, where is PROLOG today, 20 years later?

[snip a lot of interesting thoughts]

Yes, I remember the Japanese PROLOG push and the drive to develop the first
AI Operating System.

It certainly failed.

So did attempts to build a lacemaking machine in the late 18th century in
England. The received wisdom was that it was impossible because the process
of making lace was just too intricate.

It took countless attempts, ruined families, suicides, and 30 years, but the
machine is viewable today in the Lace museum in Nottingham.

To answer your very fair question (... and replaced by what?), I believe
that new methodologies for system development will arise in response to the
pressure from the Marketplace. I have already seen interesting departures
from traditional methods that achieved much faster results and were much
more flexible. The key to these approaches is a more RAD-like process with
iteration and interaction by users. Currently, we have programmers and
"Quick Build" tools in the loop, but it is only a matter of time before
smarter software will take on these functions. Eventually, end-users will
interact with smart software to achieve what they want, and there will be no
programmer in the loop at all.

There is far too much on this to go into here (sorry, I know that sounds
like a cop out, but I have been writing on this subject for some years now
and have been using alternative approaches in the real world in industry
with results that are very encouraging.), but I will close by saying that
everything I am saying is simply extrapolation from what is happening NOW. I
claim no psychic powers, just good observation and a lifetime of experience
in IT.
Procedural languages will be around for a long time. The languages may be different,
but they will still use the same principle. Knowing how to program in this paradigm will
still be the entry key to programming those languages. The rest is syntactic
sugar (simplified).

Well, time will tell...<G>

Pete.
 

Howard Brazee

I wonder why your response is so vitriolic?
Really?

I didn't set out to attack you.
Agreed.

Could you be a little sensitive to the truth of what I'm saying?

Isn't that human nature - when the truth hurts?
 

Howard Brazee

Well, I always enjoyed the Hitchhiker's Guide to the Galaxy, but I have never
been a troll. You have no idea who you are dealing with <G>.

The set of trolls includes a large number of troublemakers. So we tend to
deny that what we're doing is trolling, when our purpose is noble.

But a statement designed to gain a response still qualifies. We need more
intelligent, useful trolling.
 

Scott Moore

Peter E.C. Dashwood said:
There are 400,000,000 reasons why ALL procedural languages (including COBOL
and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)

Pascal is not any more purely procedural than C++. Last time I checked, C++
still had functions. If you want to insist that Pascal has not evolved since
1973, then you are going to insist on being wrong.
 

Roedy Green

Procedural code is already into Gotterdammerung. It takes too long, requires
too much skill, is too inflexible (the accelerating rate of change in the
Marketplace and in technology is another reason why it is doomed to
extinction) and, overall, costs far too much.

What other options do we have?

1. OO -- also requires considerable skill.

2. FORTH where you create a language for solving problems in a
particular domain. The users of the language just string words
together. Usually only a handful of people understand how it works
under the covers.

3. Spreadsheets, where the emphasis is on relationships, not on
precise order of computation. The complexity is added gradually with
real life data used to test at every stage.

4. wizards, where you configure some generic application into a custom
app.

5. query by example.

6. training neural nets.


Spreadsheet logic is the one with the lowest threshold of technology
required to integrate it into Java. It should be possible to write
generic apps, e.g. a retail sales package, and have the customer, or
someone with minimal skill, customise it with bits of spreadsheet
logic.
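
A minimal sketch of what that might look like (nothing here comes from an
actual retail package: the Sheet class, the cell names and the 15% tax rate
are all invented for illustration): named cells hold either constants or
formulas over other cells, values are recomputed on demand, and the
"customisation" is just a handful of formula definitions. In a real product
the formulas would presumably be parsed from text the customer types in,
rather than written as Java lambdas.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.ToDoubleFunction;

    // Hypothetical sketch: named cells hold a constant or a formula over
    // other cells; asking for a value re-evaluates its dependencies, so the
    // emphasis is on relationships rather than order of computation.
    public class Sheet {
        private final Map<String, ToDoubleFunction<Sheet>> cells = new HashMap<>();

        public void set(String name, double constant) {
            cells.put(name, sheet -> constant);
        }

        public void define(String name, ToDoubleFunction<Sheet> formula) {
            cells.put(name, formula);
        }

        public double get(String name) {
            ToDoubleFunction<Sheet> cell = cells.get(name);
            if (cell == null) throw new IllegalArgumentException("no such cell: " + name);
            return cell.applyAsDouble(this); // naive: recomputes dependencies every call
        }

        public static void main(String[] args) {
            Sheet s = new Sheet();
            // Values the generic package would fill in:
            s.set("quantity", 3);
            s.set("unitPrice", 9.95);
            // "Customisation" supplied spreadsheet-style:
            s.define("subtotal", sh -> sh.get("quantity") * sh.get("unitPrice"));
            s.define("salesTax", sh -> sh.get("subtotal") * 0.15);
            s.define("total",    sh -> sh.get("subtotal") + sh.get("salesTax"));
            System.out.printf("total = %.2f%n", s.get("total"));
        }
    }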
 

Roedy Green

If you studied programming
more closely, you'd know that.

This is an interesting conversation. If you would spare us the nasty
barbs, it would also be enjoyable. Take your bitterness out on someone
who deserves it. I could suggest some politicians, but that would
start a flame war.
 

Marco van de Voort

Well, Marco, I wonder how long it is since you looked?

Well, I think I went to work today.
Software tools are already emerging that substitute iteration and
interaction for sequential processes.

Sure, for certain limited domains, the actual engineering is done,
and there is a nice tool to customize that in several ways.

Useful? Certainly. Timesaver? Sure. Potential to be universal? No way.
SQL Server for example has a "drag and drop" tool that allows processing
streams to be built in minutes.

I've used laboratory-control software which was nice in a lot of ways
too. It was truly useful, productive and, IMHO most important,
it avoided a lot of errors.

But I would never claim that such a thing could be generalized and replace
software engineering. They are problem-domain specific solutions, nothing
more.
These same streams using procedural code would take days.

Sure. But that doesn't spell the end of procedural programming.
What's more, if you get it wrong you can simply go to a graphic interface
and change it.

I do that with sequential programming (Delphi) too. Tools for a specific
domain. It saves time and, equally important, makes the product somewhat
easier to maintain.
I have seen at least one Graphic design package that uses a
similar principle. Non computer literate designers can easily manipulate
these tools, interact with them, iterate their processes, until they
achieve what they want.

Within very simple, limited borders. There is not one such tool that
replaces general programming (regardless of which paradigm you use)
Programming knowledge is NOT a requirement.

It is. Those environments are extended using normal programming; tackling
large projects still needs skill.

Those tools are just that: extra tools. Pretty comparable to fancy
runtime (and later class) libraries. None of them spelled the end
of programming either.

Hmm, I think that is a good description. Some extra tools to aid software
development, and allow a _user_ some customization.
Currently, tools like this are in their infancy. In 15 years we can expect
significant improvement.

Well, then exactly this is our point of disagreement. Please explain
why you think that will happen (and how you envision it happening): how do you get
from domain-specific solutions (math, laboratory handling, DB handling)
to general-purpose programming?

This is also what annoyed me about your previous message. It is a message
of a believer. There is no reasoning behind it. You only "get" it when
you are a believer.
The computer skills of the Business are rising very rapidly.

IT skills in using their own applications. Not in programming and
customization. In that department it got worse, especially in companies with
highly skilled (non-IT) technical people.

In the old days any hard-science student or graduate could do some
general-purpose programming; usually they learned it so they could
program calculations. Nowadays they use Matlab, which is
absolutely great, but combined with faster machines it requires less
programming skill to achieve the same result.
Which is good for them, but not for the level of programming knowledge
in a company.
No, they're getting pretty wise to that one too...in fact, most of us are.

If that were the case, I wouldn't get +/- 50-100 spams a day.
It isn't enough just to provide a solution; it has to be an acceptable
solution.
That means using tools and methods the users are comfortable with.

I don't see why that would be the case. It's like a car mechanic who has to
fix a car with the tools an average person has in his house.
In 15 years they WON'T be comfortable with some old academic cobbling code
together for a solution... By then they will have bypassed the need for
coding and will be implementing their own solutions.

Amen. :) Still a little thin on reasons, though.
That was the whole point of my argument. They are doing it already... More
and more Business departments are gaining enough computer literacy to
build their own systems using standard solutions like spreadsheets and
databases.

That kind of limited hobbying has always happened.
The last place I worked (a major utility in the Midlands of England) there
were more people in the Business with Computer Science degrees, than I had
on my IT staff.

Hmm. My former employer, which was IT related, had more chemists (including
me) than people with CS degrees.
There is no problem. I never suggested there was one. You can go ahead and
use procedural code for the rest of your natural life. (I intend to...). You
just won't make a lot of money at it. It'll become a "cottage industry" by
2020...<G>

Ah well. We have religious freedom. :)
The process of iteration, as you would know if you had ever worked in a RAD
environment, does not require telepathy.

Very interesting. Why wouldn't I have worked in a RAD environment? Telepathy
again?
Your scorn is misplaced. Interaction and iteration enable HUMAN intelligence
to get in the loop, but does NOT require specific technical (i.e.
programming) skill.

While I think in retrospect that my tone might have been misplaced,
your second post confirmed my suspicions. You have a firm belief in
something, and really want to advocate it. However, I don't find
much evidence, not even shallow evidence.

Except maybe that one story of a business department full of people
with CS degrees. Now, what a surprise that they were more likely
to get something working using minimal and standard tools. :)
LOL! While I note that you are at a very reputable University (apparently
learning to use procedural code...)

I'm still an honorary member of my former university's computer club.
I can assure you I have studied programming for the whole of my working
life (some 38 years - I started programming computers in 1965. What were
YOU doing then <G>?), not behind cloisters but in the real world.
Leaving aside your intended slight, I agree
that programming does require skill, but it was you who turned my statement
into a separation between programming as a skill and programming as an art.
I said that "Procedural Coding" is in decline. That includes the Language
and the Art...

Yet, apart from a firm belief that ordinary users with a few standardised
tools will replace them, you don't reveal many reasons.
I wonder why your response is so vitriolic?
I didn't set out to attack you.

I've been on news for over a decade now. And on Fidonet before that. While
trolling and wild speculation presented as "truth" might seem innocent to
you, it doesn't to me. It poisons a group, and creates an unequal position
for discussion.

The problem is that you don't have to justify yourself in 15 years if you
are totally wrong, like you would in a company. It is nearly anonymous, easy
and safe, yet it still does damage.

(and btw, keep in mind that nearly all medium- to long-term IT forecasts have
been wrong till now)

As said before, maybe I was too harsh, yet I still stand behind the original
intentions of that message.
Could you be a little sensitive to the truth of what I'm saying?

Please don't degrade to amateur psychology. It's seriously flawed enough
already.
The typical response of the student.

Again a belittling comment. Try to argue with more substantial
arguments.
Are you saying that, without a
reference, you would question whether there is an accelerating rate of
change in computer technology?

No. I question whether it goes in the direction that you say it does. So not
IF there will be change, just whether it is going to be the change you proclaim.
OK, Alvin Toffler, Moore's Law, and the fact that I have to get a new
computer every 18 months...

Relates to programming how?
As for my knowledge of the Market place, I have worked in industry IT
services all my life. It is axiomatic to me that the Business needs are
accelerating and greater flexibility in response to changing and new
Markets is required in IT today than was the case even 5 years ago. I
don't need a text book to tell me this; my users are drumming it into me
every day... I can SEE the need for flexibility in system design and
implementation.

Sure, but you practically argue that this will replace software engineering.
Thank goodness there are tools and systems that are addressing this need.
(Client/Server, distributed networks, OOD and OOP are all paradigms that
are much more flexible than the traditional mainframe Waterfall
methodology, and coincidentally, none of them is tied to Procedural
Coding...)

We'll see. I consider OOP to be procedural programming too btw.
My figures are based on a real case. The Company concerned sold their IT and
leased it back. They did this when they had a bad year due to claims for
floods and droughts.

That's organisational detail.
It is interesting that in the "good" years they took no
action. Try telling a Board of Directors faced with a huge cash flow
requirement, that "Price is not the only point of competition". Even if
you're right (and I don't disagree with the statement) you will not help
your career...

Mine is practice too. Every time I argue on price, they come back with "support"
(which they never use, and won't get), "security" (a large company is better
than a small one), etc.
award, with

Well, I always enjoyed the Hitchhiker's Guide to the Galaxy, but I have never
been a troll. You have no idea who you are dealing with <G>.

One of the joys of Usenet. :)
 

jce

Marco van de Voort said:
Well, I think I went to work today.
Did you ask people whether "sequential jobs are a way of thinking that is
normal to them" or are you telepathic too?
But I would never claim that such a thing could be generalized and replace
software engineering. They are problem-domain specific solutions, nothing
more.
Maybe not software engineering as it evolves...but as it exists now and in
parts, probably. Software will be around for a while...so there will be
software engineers.....but I was told my grandfather kept saying "machines
will build cars....humbug!".
Within very simple, limited borders. There is not one such tool that
replaces general programming (regardless of which paradigm you use)
But it will replace large sections of general programming....in every
paradigm. I haven't written a math library recently.....I don't write gui
components much ...I don't write messaging software...I don't even have to
write my own storage/retrieval system...but I thought we were talking about
Software Engineering, which is not exactly programming, is it?
In 1903 they flew for the first time...in 1969 came the 747 and we are still
using the 747....Time brings improvement but it only grows with demand or
reward in the risk-reward stakes. If there is no reward then no one will do
anything. The world is littered with tools - there are tens of commercial
vendors with profiling, generating, visual assistance and yet not one of
them lets you get by without understanding what it is you are doing...many
are listed as pre-reqs for jobs because they aren't just <pick em up and use
em>.
Without uniform acceptance, tools will improve but not replace software
skills. The skills may evolve and get better tuned or apt to deal with new
products - be faster, more flexible.....My car is essentially a souped-up
Model T ;-)
I believe in the escalator principle...
Tools get developed to replace complex manual type function......
Those doing the manual function are replaced with the more technical
developers to setup the automated process.
Tools get developed to replace the complex automated type setup with a neat
gui tool
Those setting up the complex automated type function are replaced by the gui
experts who had special training....
ad infinitum...

The people at the bottom get off....the higher paid get on the top and ride
down...
The key is to make the journey last until you're 55.
are.
If that were the case, I wouldn't get +/- 50-100 spams a day.
And you read them? or are you pretty wise to that one and delete them?
The escalator already started there then ....

While I think in retrospect that my tone might have been misplaced,
your second post confirmed my suspicions. You have a firm belief in
something, and really want to advocate it. However, I don't find
much evidence, not even shallow evidence.
I don't find much in the way of evidence that you've presented *against* it
either.
The idea of a generalized tool is way out there (2015 isn't that long).
There are sure to be major inroads into large chunks of the IT industry. If
people latch onto a successful tool and it gains support then it could be
looked at in other areas. It depends on how the rich would benefit.
I've been on news for over a decade now. And on Fidonet before that. While
trolling and wild speculation presented as "truth" might seem innocent to
you, it doesn't to me. It poisons a group, and creates an unequal position
for discussion.
I don't see the Troll here. He provides way too much useful input to groups
to be a "troll". When you crosspost you get input from all the
groups...most people are too busy being useful contributors in *all* groups.
The problem is that you don't have to justify yourself in 15 years if you
are totally wrong, like you would in a company. It is nearly anonymous, easy
and safe, yet it still does damage.
What damage does it do? More or less than offshoring...more or less than
war...more or less than Enron...more or less than pension decreases...more
or less than the rising cost of insurance...more or less than the wealthy
becoming more so.....It's an opinion he has...Let him share it. It's
interesting...we can discuss it and decide for ourselves if it's crap. I
don't need you to protect me from anything.
As said before, maybe I was too harsh, yet I still stand behind the original
intentions of that message.
Just be nicer about it...else you get *plonked* and no one hears you then.
Your perfectly valid and salient points become valid and silent.
Please don't degrade to amateur psychology. It's seriously flawed enough
already.
He's not degrading to amateur psychology..he's telepathic remember...:)
Again a belittling comment. Try to argue with more substantial
arguments.
You started it....ha ha
No. I question whether it goes in the direction that you say it does. So not
IF there will be change, just whether it is going to be the change you
proclaim.
OK - so your defense is that nothing has worked before? What if we put the
date at 2050...does that change anything? The only major flaws I see in his
argument are (a) the scale with which Peter sees this occurring - ubiquitous -
and (b) the time frame....I have no evidence or justification for this
position.
Sure, but you practically argue that this will replace software
engineering.
It will replace LARGE aspects of software engineering....the need for PMs,
the RA role will change, therefore the SE position will have to go with it
and most of the tasks will be automated.
Mine is practice too. Every time I argue on price, they come back with "support"
(which they never use, and won't get), "security" (a large company is better
than a small one), etc.
That's organizational detail ;-)

JCE
 

Peter E.C. Dashwood

Marco,

a good and fair response.

You have me down as a "believer"; I'm not. Neither am I trying to evangelise
ONE point of view.
(Been in this game too long...seen it all come and go...however, that does
not blind me to emerging trends and the fact that there is no requirement
for the future to be exactly like the past; because something failed in the
past doesn't mean it will not succeed the next time someone tries it (with
more knowledge and experience)).

I really don't mind if people disagree with what I'm saying. (At worst, the
ideas presented will have made them think; at best, they will have enjoyed
my post.)

But I am capable of extrapolating from observation and I have a track record
of being fairly right about it.

My comments are sincere but they are intended to stimulate, not to wound.
And if I am wrong at the end of the day, then I'll be embarrassed and glad
about it.

I am not seeking to "poison" this or any other group. The free exchange of
ideas (even where it is from "Trolls" who are seeking simply to "stir"
things) can only be beneficial to groups of people who have the intelligence
and vision to recognise what is important, and are capable of making their
own judgements on what is posted.

Unfortunately, the reasoning and observation behind my arguments are more
lengthy than can easily be accommodated in this particular forum. Also, my
comments are confined to commercial computer programming and not other
specialised areas of cyber development (like Chemical Engineering...<G>).

I had a look at your web site and see you are a proponent of Delphi and
PASCAL. (Both excellent languages and I have programmed in both of them,
although not extensively.) I guess this explains your reaction to my post.
It is not a comforting thought that the Languages we love have a limited
commercial lifetime, but that should not blind us to what is happening in
the Marketplace. (There are many COBOL programmers who are dismayed and
bewildered as they see the erosion of their traditional power base, too. My
advice has been to extend their skill set, but perhaps I should have said:
"Get an Accountancy or Business Management qualification...".)

The fact is that there are forces at work in the Marketplace that are
driving the "traditional" methods of developing commercial computer systems
into the ground. The Market wants computing "de-skilled" to the point where
end users can get the results they need without necessity for detailed
technical expertise. (My bet is that they will get it...). The Business
Functionality and the ability to support it in a rapidly changing
environment are paramount. Tools and Methods are emerging that have the
capability to deliver this within a reasonable (say, 15 years...) timeframe.

I respect your right to disagree, but I maintain my position.

Pete.

TOP POST - nothing further below here.
 

Thomas A. Li

Don't forget Java reflection. It is possible to pull out function/method
signatures from the binary.
Java and C# include reflection, which makes them self-contained.
Both Java and C# binary files are targeted at a virtual machine.
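
A minimal sketch of that point (the class inspected, java.util.ArrayList, is
an arbitrary stand-in; any compiled class on the classpath would do):
reflection recovers the full method signatures from the binary alone, with no
source code involved.

    import java.lang.reflect.Method;

    public class SignatureDump {
        public static void main(String[] args) throws Exception {
            // Load a class from its compiled form only...
            Class<?> cls = Class.forName("java.util.ArrayList");
            // ...and list every declared method with its full signature,
            // e.g. "public boolean java.util.ArrayList.add(java.lang.Object)".
            for (Method m : cls.getDeclaredMethods()) {
                System.out.println(m);
            }
        }
    }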

I think that for reuse of code, OO-based Java or C# code will be the first
option.

Thomas
 

Georgie

The fact is that there are forces at work in the Marketplace that are
driving the "traditional" methods of developing commercial computer systems
into the ground. The Market wants computing "de-skilled" to the point where
end users can get the results they need without necessity for detailed
technical expertise. (My bet is that they will get it...). The Business
Functionality and the ability to support it in a rapidly changing
environment are paramount. Tools and Methods are emerging that have the
capability to deliver this within a reasonable (say, 15 years...) timeframe.

I respect your right to disagree, but I maintain my position.

Pete.
I agree those market forces are busy. But being quite new to the traditional
mainframe COBOL/SE business (6 years) and also to programming in VB.NET (1
year), I believe that achieving that goal will be very difficult. It means
users will need to be skilled at their usual job and also able to configure
their IT tools. I've worked in seven companies and I can't imagine them
doing that now. Perhaps in 15 years, but it'll require a new approach to
training.
Btw, I've got that accountancy and business degree and moved on to IT (my
hobby since the days of the C64 and the Amiga). My last projects were not
maintenance and/or change projects, but completely new applications
developed in COBOL II running under CICS. The one I'm applying for within 2
hours is also completely new. The company is a world leader in its
business.

Georgie.
 

Peter E.C. Dashwood

Georgie said:
I agree those market forces are busy. But being quite new to the traditional
mainframe COBOL/SE business (6 years) and also to programming in VB.NET (1
year), I believe that achieving that goal will be very difficult.

Yes, it will. However, we have been working on it for nearly 50 years now...

(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design
the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills. It is
interesting to me that COBOL was one of the first attempts to achieve this,
with the Conference on Data Systems Languages in 1959 even foregoing
commercial advantage on the part of some of the contributors, for the
greater good of the Business community. This is why it rankles me so much
that the Language has since been hijacked by ANSI for commercial gain,
despite the protestations that this is a non-profit organization...Don't
start me...<G>)

There are indications that it CAN be achieved. However, you are correct that
it will be difficult and many Mainframe COBOL sites will be dragged kicking
and screaming into it (or will find themselves outsourced to India...) We
had some interesting threads recently in comp.lang.cobol where the reasons
for what I call "Fortress COBOL" were explored. Adoption of new technology
is probably hardest on the mainframe sites, where there is a very long
tradition of doing things a certain way. (The fact that this way has NEVER
worked satisfactorily, has left Users disappointed and disheartened with IT,
and that there are now better ways, seems to be lost on some IT
departments...)

If you would like to see the background for my thoughts on this please take
this link:

www.aboutlegacycoding.com/Archives/V3/V30501.asp

It means
users will need to be skilled at their usual job and also able to configure
their IT tools.

Well, I see it as our job (as IT professionals) to make sure the tools are
so user friendly, the Users can concentrate on their usual job without
having to become "computer programmers" (It would not be possible for the
expanding User base to all become computer programmers anyway, even if they
had the inclination to, which they don't...)

I guess what I'm saying is that if we do our job properly, we won't have a
job in 15 years...<G>

Actually, the nature of our work will change so that it isn't strictly
true, but the elements of truth are there...

I've worked in seven companies and I can't imagine them
doing that now. Perhaps in 15 years but it'll require a new approach on
training.

Yes, absolutely. There are some very innovative approaches to training and
self-education in the pipeline. The advent of technologies like DVD and its
interactive extensions will certainly change auto-education.
Btw, I've got that accountancy and business degree and moved on to IT (my
hobby since the days of the C64 and the Amiga). My last projects were not
maintenance and/or change projects, but completely new applications
developed in COBOL II running under CICS. The one I'm applying for within 2
hours is also completely new. The company is a world leader in its
business.
Glad to hear you are productively employed and enjoying it, George. Hope it
stays that way for you. It looks like you have a "fall back" position
already established if it comes to it. A wise move...

Pete.
 

Peter E.C. Dashwood

Karl Heinz Buchegger said:
Do you have some links on that subject?

Here are just two articles which give some insight into the technology which
I believe will enable the Future I have been describing:

www.aboutlegacycoding.com/Archives/V3/V30501.asp (This is about what's wrong
with what we do now, in terms of development methodology...It was the
attempt to break out of this way of working which suggested to me how things
might work in the future.)

www.aboutlegacycoding.com/Archives/V3/V30201.asp (This is about the
direction I see programming technology going in. The emerging component
based systems will provide the basis for the User interaction I have been
postulating. Components are platform independent, small, with encapsulated
functionality and consistent and robust interfaces. These attributes are
just what is needed to respond to rapid change and provide flexibility.)

I'm sorry, these are links to just two of the articles I have written which
are pertinent. I don't normally promote my own work, but you did ask for
some links and these will at least give you an idea of where I'm coming
from.

I am personally interested in that subject. As you say: the old, traditional
textual representation of procedural programming is something which requires
skills, skills we cannot expect from ordinary users. I have thought about
some sort of graphical programming. The problem seems to be: it is relatively
easy to come up with such a thing for a specific topic (e.g. image manipulation),
but as I see it, it is hard to generalize this to general programming.
Yes, that's right. There are models where we can achieve a general result if
we use a Human in the loop. (Basically, the Human is providing the
intelligence and discernment to decide whether a given proposed result is
acceptable or not.) By the key processes of ITERATION and INTERACTION, a
better and better solution can be arrived at. (If you like, the solution is
"evolved" towards...this varies markedly from the "traditional" IT approach
where the solution is "designed" from scratch, then built.)

Consider this...

Small Businessman goes to the "computer shop" and purchases a "small
business computer".

(Let's assume he is a very competent Businessman and knows his trade
extremely well, but he isn't big on computer technology, apart from the
basic "computer literacy" that children are growing up with today...).

He gets home, unpacks the computer, uses an interactive DVD (or similar) to
connect everything up, and switches on.

B'man: "Print me an invoice."
Computer: "What's an invoice?".
(This is a contrived example because the machine would certainly "know" what
an invoice is...but bear with me a little longer...)
B'man: "An invoice is a document that records the details of a sale
transaction."
Computer: "Ah, I know about Sales. That is a transaction where a CUSTOMER
purchases a PRODUCT."
(The machine comes equipped with the "concept" of a CUSTOMER and a PRODUCT,
among others. It also recognises the transactions we would expect to be
associated with a small business.)
Computer: "So you will want details of the CUSTOMER and the PRODUCT on
this INVOICE?"
B'Man: "Yes."
Computer: "Like this...?"
(It produces a document...)
B'Man: "Yes, but I need to see how many and what the unit price was.
Then calculate the total and add Sales Tax."
Computer: "How's this...?"
B'Man: "That's right. Put the totals for each product in a separate
column. And move the name and address details to here." (He indicates on the
screen with his finger or a pointing device.) "Print the overall total
payable in blue."
Computer: "Like this..?"
B'Man: "Yes, exactly like that."

There are some assumptions in the above whimsy (the computer has certain
inbuilt "concepts" - [you could think of this as a set of components with
all the attributes and Methods of a CUSTOMER, for instance], a natural
language interface, and a "test" database for each of its concepts; however, all
of these things are possible with today's technology, and what is currently
"bleeding edge" will be passé in 15 years...), but I don't think anyone with
a programming background would say it was "impossible" or "infeasible".
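
As a rough illustration only (this is not a real product: the Customer and
Product records, the layout options and the tax rate are invented for the
example, and it assumes a recent Java with records), the "inbuilt concept"
plus iteration idea might be sketched like this - the concepts ship as small
components, and each remark from the Businessman simply adjusts options
before the invoice is regenerated for his approval:

    import java.util.List;

    public class InvoiceSketch {
        // Hypothetical "inbuilt concepts" shipped with the machine.
        record Customer(String name, String address) {}
        record Product(String description, int quantity, double unitPrice) {}

        // One possible rendering of an INVOICE; the flags stand in for the
        // choices the user makes during the dialogue.
        static String render(Customer c, List<Product> items,
                             boolean showQuantities, double taxRate) {
            StringBuilder sb = new StringBuilder("INVOICE\n");
            sb.append(c.name()).append(", ").append(c.address()).append('\n');
            double total = 0;
            for (Product p : items) {
                double line = p.quantity() * p.unitPrice();
                total += line;
                sb.append(p.description());
                if (showQuantities)
                    sb.append("  x").append(p.quantity())
                      .append(" @ ").append(p.unitPrice());
                sb.append("  = ").append(line).append('\n');
            }
            sb.append("Total incl. tax: ").append(total * (1 + taxRate)).append('\n');
            return sb.toString();
        }

        public static void main(String[] args) {
            Customer c = new Customer("A. Smith", "12 High St");
            List<Product> items = List.of(new Product("Widget", 3, 9.95));
            // Iteration 1: first attempt - no quantities, no tax.
            System.out.println(render(c, items, false, 0.0));
            // Iteration 2: "I need to see how many... then add Sales Tax."
            System.out.println(render(c, items, true, 0.15));
        }
    }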

Of course this demonstrates a solution to just one class of problem. There
are many others. But there are many other solutions also...

The bottom line is that "general" solutions ARE obtainable (note that in the
example above, we never know what particular "business" our "Businessman" is
in...the solution works for ALL small businesses, or can be "tailored"
easily to accommodate exceptions to the "norm".)

The keys to success in this are Interaction and Iteration. The Human
provides the "intelligence". (It is really "discernment"...)

But a time will come when the software will be capable of evaluating its own
results, matching them against stated requirements, rebuilding an entire
system in seconds to ensure that what is required is delivered. (Then doing
it all again, without complaint, when the User changes his mind <G>).

"Evolved" systems show every indication of being at least as good as
designed ones.

And they don't require computer skills on the part of the User.

Pete.
 

docdwarf

[snip]
(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design
the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills.

This, to me, screams for the implementation for the DWIM (Do What I Mean)
command.

DD
 

Jirka Klaue

[snip]
(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design
the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills.

This, to me, screams for the implementation for the DWIM (Do What I Mean)
command.

And this immediately leads to the need of the DWI_S_M command. :)

Jirka
 
