Who gets a higher salary, a Java Programmer or a C++ Programmer?


James Kanze

It is much easier to learn Java than C++ as a first language.

I don't know about that. I think it depends on how they are
taught. Neither is really designed as a pedagogic language, but
both can be used successfully if the instructor is willing to
insist that the students accept some magic incantations at the
start.
But it is also much easier to go C++ -> Java than Java -> C++.

I don't know about that. Knowing one will make learning the
other easier, because they share a number of very low level
structures (control flow, expression syntax, etc.). Beyond
that, they tend to have very different idioms for a lot of
fundamental programming tasks, and are really two separate
languages, where what you know about one doesn't help that much
in the other.
So I am not convinced that learning Java first and C++ later
is, in total, easier than learning C++ first and Java later.

If you're teaching yourself, you should probably start with some
more pedagogic language. If you're taking a course, you should
start with whatever the course starts with (and which language
it uses is really not as important in choosing a course as any
number of other things).
 

James Kanze

Sanny wrote:
If you want to try for the big bucks, I hear COBOL is coming
back in vogue.

I don't know about being in vogue, but it's sure that skilled
Cobol programmers can pull down more than C++ or Java
programmers. There's a lot of legacy (and some new) Cobol out
there, and there are very few people who would admit to Cobol
skills, even if they have them. So even if the actual demand
isn't as great as for C++ or Java skills, the gap between
supply and demand is enormous. (It helps, too, that Cobol
applications are typically mainframe---for whatever reasons, the
bigger the machine you work on, the more you get paid---and that
they are often the critical central piece of the IT
infrastructure; if something goes wrong with them, nothing else
works.)
 

James Kanze

Sanny wrote:
If you are applying for jobs where English is a relevant
skill, you will improve your earning power by increasing your
mastery of that language.
Non-programming skills often count for more than one's
technical abilities when climbing the corporate rungs.
Some might look at your random capitalization of different
words in English and wonder if you are sensitive to case
sensitivity in Java. It is a shame, perhaps, that your
command of English might block someone's ability to perceive
your command of programming, but that is a reality in the work
world.

The ability to understand other people's ideas is essential if
you are to be productive. The ability to communicate your own
ideas is essential if you are to be productive in anything but
the lowest level positions. If the working language of the
project is English, the ability of a candidate to communicate in
English is very important.

[...]
This is not parochialism, but a necessity when one is forced
to communicate in any language. It is vitally necessary in
written communications; face to face, people will forgive
accents and unusual constructions, but in written
communication there is little tolerance for fundamental
errors, and less reason for there to be any.

Expressing yourself clearly and concisely is an essential skill,
whether it be in a human language or in a programming language.
In general, people unable to avoid spelling, punctuation and
grammar mistakes in their native language will find it equally
difficult to do so in C++ or Java.
 

James Kanze

IME that's nothing to do with the manager who needs the new
hire: the initial hiring task gets given to HR who know
nothing about programming or programming skills but do know
how to match acronyms and names on the manager's skills list
with those on a CV. The same applies to recruitment agencies.
The result is that the candidates who get interviewed are
simply those whose CVs get the most hits from what's little
more than a clerical matching exercise.
IOW the manager may know what he wants in the way of
transferable skills but this gets dropped on the floor by the
agency and HR people because they don't understand IT. The
current habit of condensing CVs to one or two pages and
concentrating only on recent experience just exacerbates the
problem.

In this regard, see http://www.idinews.com/keywordskills.html.
(For that matter, the site http://www.idinews.com/ is one that I
would highly recommend.)
 

James Kanze

To me that would be well down on my list of considerations. I
ask questions like this:
1. Which language do I enjoy coding more? What counts is how
much I enjoy my life. I spend a LOT of it coding.

On the other hand, a lot of people would like doing something
other than coding.

In the end, I think most people are like me, and strike a
compromise. There are a lot of things I like more than
programming, regardless of the language, but I can't make a
decent living from them. And there are a lot of things which
pay more than programming, but I can't bring myself to do them.
Programming is a nice compromise (for me): it pays reasonably
well, and is somewhat agreeable.
2. Which language will let me tackle more interesting
projects. For that reason COBOL is out. I have no interest
in maintaining payroll programs.

One of the reasons Cobol pays so well is that no one wants to do
it. (For a long time, my sig was "Conseils en informatique
industrielle/Beratung in industrieller Datenverarbeitung"
(roughly, "consulting in industrial data processing"). When
asked what I meant by "informatique industrielle"/"industrielle
Datenverarbeitung", I responded that it meant that I didn't do
Cobol. On the other hand, I once had a colleague who worked on
the Cobol part of the project; he continued working on it mainly
because the company agreed to pay him a full-time salary for
20 hours a week, so he had more spare time for playing bass
guitar.)
If I wanted to make money, I would learn the arcane art of
Unix system administration.

Try IBM mainframe administration. It's even more arcane, and
even better paying.
3. Which language will leave my options open where I work. I
don't want to get stuck in some place I hate. I want to be
able to go anywhere. Which languages are becoming more
accepted? Which are becoming obsolete?

I see what you're getting at, but IMHO, it's a dangerous
attitude, carried to extremes. Of course, once-"in" languages
never die, but they do fall from grace, leaving a glut of
programmers in them, and making it hard to change jobs.
(Remember, at one time, Cobol was the most accepted language for
business applications.)
4. Which languages offer work from home?

Been there, done that. It doesn't work. In practice, some
physical presence in the office is necessary, in order to
maintain effective communications.
 

James Kanze

I've got to ask.. I've tried this alternative using places
like Guru.com to get jobs and I've found I just can't make any
money and the clients are not the kind of clients you want.
They want a lot of work for very little money.. even work for
free in some cases. I want to make my clients happy, but I
need to get paid for my time and this just doesn't seem to
work out well.
How do you get jobs where you get to work from home without
having to charge the same rates a person in Delhi will charge?

That part's easy. Do a contract for them on site, first, so
they know what you can do. Then make working from home a
condition for the next contract. If you've done your job right,
then they'll likely prefer using you, working from home, to
someone else, working in their office.

Be prepared to be disappointed, however. Unless you are more or
less regularly present in the office, the quality of your work
will go down considerably. Quality programming is a team
activity, and good teamwork requires physical presence. (This
doesn't mean that 100% of your time must be spent in the office.
But judging from my experience, at least 2 days a week.)
 

James Kanze

Spoken like someone who is not an engineer, and not like
someone who engenders confidence in their programming skills.

Also like someone who has never authored a book. They may not
call it engineering, but most of the best books are carefully
organized, very much like an engineering project.
 

Lew

public class Foo {
   public Foo() {
     bitBucket.add(this);  // every instance registers itself, forever
   }
   private static final java.util.LinkedList<Foo> bitBucket =
     new java.util.LinkedList<Foo>();
}

That's a memory leak (so long as bitBucket is not otherwise impacted by
any code path coming from a public API).

Strictly speaking it's not a leak but a plug - memory is held by the
program, not leaked from it. It is called a "memory leak" in Java,
but similarly to the term "reference", its meaning in Java is not the
same as in another language like C++.
 

James Kanze

public class Foo {
   public Foo() {
     bitBucket.add(this);  // every instance registers itself, forever
   }
   private static final java.util.LinkedList<Foo> bitBucket =
     new java.util.LinkedList<Foo>();
}
That's a memory leak (so long as bitBucket is not otherwise
impacted by any code path coming from a public API).
Strictly speaking it's not a leak but a plug - memory is held
by the program, not leaked from it.  It is called a "memory
leak" in Java, but similarly to the term "reference", its
meaning in Java is not the same as in another language like
C++.

In no sense of the word is it a leak. A leak isn't a couple of
drops spilling over the edge; it is a continuous loss. A
typical example of a leak in Java (or in C++) would be if in
reaction to some external event, you created a new, unique
identifier, stored some data (perhaps a functional object) under
that identifier in a map, and didn't remove it from the map once
the identifier was no longer in use. Without the identifier,
you can no longer access the memory, and you "leak" memory each time
a specific event occurs.
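
A sketch of that kind of leak, for the record (the names here
are my invention, not anything from the posts above); the map
grows by one entry per event, and nothing ever shrinks it:

import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration: data is stored under a freshly
// generated identifier, and never removed once the identifier
// goes out of use.
class EventRegistry {
    private final Map<Long, Runnable> handlers =
        new HashMap<Long, Runnable>();
    private long nextId = 0;

    // Called in reaction to some external event.
    long onEvent(Runnable handler) {
        long id = nextId++;
        handlers.put(id, handler);
        return id;        // if the caller loses or ignores this id...
    }

    // ...and never calls this, the entry can never be reclaimed by
    // the garbage collector: a little more memory lost per event.
    void done(long id) {
        handlers.remove(id);
    }
}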
 

Tom Anderson

I offered 40 years' experience. A young squirt just out of school can't
solve problems the way I could. He could code to a spec, but he could
not write the specs or the docs in idiomatic English. He could not
suggest 10 totally different ways to approach a problem. You need
experience to do that.

One of my bosses expounds the same basic idea. The strategy is that we
compete with the outsourcers on a cost-per-project basis. We can't compete
on cost-per-head, because we live in London (or even worse, live in the UK
and commute to London), not Bangalore, and we like to work 40-hour weeks
and earn enough to live comfortable lives [1]. However, if we're good
enough - skilled, experienced, and well-organised enough - we can deliver
a project with a fraction of the man-hours of an outsourcing shop, and
thus be competitive on total cost.

This strategy seems to be working. It works for us for several reasons:

- We're a small company - two software gurus, a business/product/customer
guru, a training and tech writing guy, one permanent programmer (and
jobbing sysop), the intern, and zero to a handful of contractors. That
means we have very low communication overhead, and can coordinate
effectively. We don't waste time with meetings, emails queued in inboxes,
duplication of effort, misunderstood instructions, etc. We don't spend any
time doing anything that isn't concrete, productive work. Apart from
ten-minute stand-up meetings twice a day. And the crossword at lunch.

- We do agile development. I know not everyone's a convert, but honestly,
when you do it right (or near enough to right), it works like a charm. You
spend more of your working time on producing deliverables, and more of
that time is spent effectively.

- We do a lot of work with an e-commerce framework that's powerful but
arcane. We know it well enough to work with it smoothly, which few people
do. That means we can build big, complex e-commerce apps using it more
quickly than non-adepts could, with or without it (probably). It also
means that we're very well placed to pick up maintenance and extension
work on apps that have already been built with it - fortunately, there are
a lot of other shops using it, but using it badly, supplying a crop of
opportunities for us to harvest! That said, we are interested in doing
stuff not using this framework, as it's a rather harsh mistress; it'd be
interesting to see how much competitive edge we lose by stepping away from
it.

- We're not a straight development shop. Indeed, the gurus claim that we
aren't a development shop - apparently we're primarily a consultancy, but
one which can also do implementation should the need arise. A good slice
of the company's work (handled almost entirely by the gurus) is stuff like
proposing an architecture or outline of a solution to company A, or
reviewing such a proposal that company B have got from company C, or going
over company D's existing systems and telling them how to do what they
want, etc. I don't fully grok how this works - i understand how they can do
that, but i don't see how it synergises with the programmers in the back
office. It's not like you can waltz in and say "oh, clearly the right
solution is to hire us to build this for you", can you. Can you? Maybe
it's about building trust and connections, and establishing ourselves as
people who can be called in to sort things out and get things done. Sort
of a cross between Red Adair and the SAS.

- I would say this, but we're bloody good programmers! The gurus are wise
and crafty (both the technical guys are ex-Smalltalkers, FWIW), and they
make a policy of only hiring people who are good. Our mercenaries^W
contractors are drawn from a smallish pool of people we know and trust,
who have worked with us on and off for ages. We don't have people sitting
around not being on top of their game or not pulling their weight. Apart
from me, obviously - i'm still quite shocked that i haven't been fired.

Anyway, as i said, it seems to be working.

One of the difficulties we run into is that clients' purchasing
departments don't always see things our way. Perversely, they don't look
at the price of the thing they're buying, they actually look at the price
of the programmers. We've had clients offering contracts which did things
like specify caps for salaries, and even annual increases in salaries. I
guess some clients are used to thinking in terms of parts-and-labour
costing, rather than fixed-price. This is shortsighted when the actual
value of labour can vary wildly (or so we claim).

It'll be interesting to see how this strategy pans out in the long run. At
the moment, it works because the competition, domestic and outsourced, are
not so hot. As the Indian software industry matures, we'll see companies
every bit as good as us emerge - and in huge numbers, of course. Will we
all be out of a job then? Or will salaries in India have inflated to the
point where they've lost their cost edge by then? Oh well, chances are the
whole of Europe'll be bankrupt by then anyway.

tom

[1] At least, the others do. I'm part-time, and technically an intern, so
my salary is correspondingly spartan!
 

Tom Anderson

Try IBM mainframe administration. It's even more arcane, and
even better paying.

At one point, i heard Lotus Notes app development was paying like mad - a
thousand pounds a day for contracting work. And that was back when the
pound was worth something!

tom
 

Ben Voigt [C++ MVP]

James said:
In no sense of the word is it a leak. A leak isn't a couple of
drops spilling over the edge; it is a continuous loss. A
typical example of a leak in Java (or in C++) would be if in
reaction to some external event, you created a new, unique
identifier, stored some data (perhaps a functional object) under
that identifier in a map, and didn't remove it from the map once
the identifier was no longer in use. Without the identifier,
you can no longer access the memory, and you "leak" memory each time
a specific event occurs.

That's a pretty apt description of the example given. The memory is still
reachable, but only from inside an internal data structure, so it's not
*useful*. And C++ has basically the same thing: orphaned memory is still
reachable, but only via the heap's internal data structures (there is
actually a public API for walking those, though all type information is
lost), so it's effectively unusable at that point.
 

Lew

Tom said:
We've had clients offering contracts which did things
like specify caps for salaries, and even annual increases in salaries.

How singularly arrogant of them!

I've heard of this before; in fact, I've been on the short end
of it, and it never ceases to shock me. How big must the
client think their balls are that they can dictate employment policies
to another company?

Worse yet is when the consulting firm accepts these types of
restrictions. Oh, please, spare me the defense of the practice; I'm
aware of the considerations. It's still the height of chutzpah to
dictate to another company their business practices so that you can be
their customer.

It's the social equivalent of using reflection to access private
members of another class. There are occasional, very restricted
scenarios where this kind of thing can be justified, but in the common
case it's just awful.
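
For anyone who hasn't seen the trick, here's roughly what it
looks like in Java; the Account class and its private field are
invented purely for illustration:

import java.lang.reflect.Field;

public class PrivatePoke {
    static class Account {
        private double balance = 100.0;  // private: not our business
    }

    public static void main(String[] args) throws Exception {
        Account account = new Account();
        Field balance = Account.class.getDeclaredField("balance");
        balance.setAccessible(true);            // bypass the access check
        balance.setDouble(account, 1000000.0);  // rewrite another class's internals
        System.out.println(balance.getDouble(account));  // prints 1000000.0
    }
}

That one setAccessible call is the entire violation.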
 

Tim Roberts

Sanny said:
At http://www.GetClub.com/Experts.php you can tell your expertise and
get work-from-home work. You only get small orders in such places. For
large work, people choose already established companies instead of
giving work to strangers who may spoil the work.

I have a hard time using the word "expert" anywhere around a web site that
thinks the word "doctor" is actually spelled "docter".

I won't be going back to GetClub.com.
 

Tim Roberts

Lew said:
Spoken like someone who is not an engineer, and not like someone who engenders
confidence in their programming skills.

Software engineering has its place in every software project, and that place
is everywhere in the software.

I have a speech about that.

We are in the middle of a revolution in software development. Today, with
few exceptions, software development is FAR more art than engineering, and
most of us are "software artists" instead of "software engineers". Programs
are carefully hand-crafted one line at a time, like an amateur building a
bridge out of balsa wood, and not engineered, like an engineering firm
building a highway bridge.

If we are ever to build reliable large software systems, software
development needs to transition to a true engineering discipline. The
tools are on their way to being able to enable that transition, but they
aren't there yet.

The tools that my partner uses to create chips are much better suited to
"engineering" than the tools I use to create software. Compare a Pentium
IV processor to Windows, for example. I've had many debates, usually under
the influence of intoxicants, over which one of those is more
"complicated". I would argue that Windows is much less reliable than the
Pentium IV, because the tools aren't as good.

When programming does transition to an engineering discipline, it will be a
much better thing for the world at large, but it won't be nearly as much
fun.
 

Roedy Green

3. Which language will leave my options open where I work. I don't
want to get stuck in some place I hate. I want to be able to go
anywhere. Which languages are becoming more accepted? Which are becoming
obsolete?

There was a news item today that having a boss you can't stand leads
to heart disease.
--
Roedy Green Canadian Mind Products
http://mindprod.com
"Humanity is conducting an unintended, uncontrolled, globally pervasive experiment
whose ultimate consequences could be second only to global nuclear war."
~ Environment Canada (The Canadian equivalent of the EPA on global warming)
 

LR

Lew said:
Spoken like someone who is not an engineer, and not like someone who engenders
confidence in their programming skills.

Shouldn't that rather be t'other way round? Isn't it that people who
understand that software isn't, and cannot be, an engineering
discipline inspire the confidence that comes from having an
appreciation of one's tools?

In the beginning, middle and end, good software is engineered, like a bridge
and, arguably, like a well-written book. If it is not well engineered, it's
not going to be good software.

I've seen some bridges come with documentation that says things like "No
vehicles over 5 [sic] tons." But I can't say I can recall ever seeing
documentation for a program that says "Do not enter numbers over
1,000,000.00 or your computer will break." Sometimes the program itself
will tell you after you enter a bad number. I suppose that would be
more like driving a six ton truck on the aforementioned bridge, without
the sign present, with perhaps obvious consequences. Although I suspect
the bridge sign was probably written with some safety factor in mind,
whereas the program probably won't work if 1,000,000.01 is entered.
There are sound principles behind sound programming decisions. These
principles come down to providing desired functionality with provable
management of risk of defects or system failures, within budgetary
constraints.

Can I use those to prove that my program will halt at some point?
Some of the more formally-expressed principles in software
engineering turn out to have the most significant pragmatic relevance, just
like in bridge-building.

"Software engineering", properly applied, is no buzzword at all, but a precise
description of proper software design and construction.

When I was young, or at least younger than I am now, I was taught that
engineering is the application of scientific principles. This makes me
curious to know, what scientific principles are being applied in the
development of software?

LR
 

James Kanze

On Mon, 24 Nov 2008 14:14:01 -0800, James Kanze wrote:
Just goes to show, you can have this argument in any
programming newsgroup, with or without a garbage collector.
:)
I have never considered "continuous loss" (that is,
continually increasing over time) to be part of the criteria
for defining a "leak". Even a single block of data for which
no reference remains, and which is therefore unreachable, is a
"leak" in my book.

Well, you can define it any way you want, but any definition not
implying continuously increasing memory use has no practical
implications. And of course, in non-technical English, a leak
is also more or less permanent: you don't say a bucket leaks
because a few drops spill over the top; you would only speak of
a leak if there was a continuous loss.

But by your definition, his code didn't leak either, so I'm not
sure what you're arguing about. I'm aware of this definition,
and considered it in my "no sense of the word".
You may feel free to disagree, but I find it pointless for you
to write something like "in no sense of the word".

OK. "In no reasonable sense of the word", then. Or "in no
practically usable sense of the word." Or "in no sense of the
word I've ever seen."
There are many "senses of the word" when it comes to the word
"leak", and lots of people use senses of the word that
support Lew's and my interpretation, not yours.

The actual example code didn't leak, in any sense of the word.
It corresponded to an established and widely used pattern.

Actually, on looking at the code closer, I think it is a leak.
I'd missed the fact that each constructor added the object to
the list. He has, effectively, created a situation where
objects of type Foo can never be recovered by garbage
collection; this is not fundamentally different from my
classical example of a class registering itself somewhere. So
either:

1) The program needs a record of all instances of Foo; once
created, an instance lives forever; and the program never
creates more than a bounded, finite set of Foo. In this case,
the code is perfectly correct; if the cardinality of that
bounded, finite set is 1, we even have the established idiom of
a singleton. (In this case, the constructor really should be
private.)

2) The program needs a record of all instances of Foo; once
created, an instance lives forever; and the number of
instances which may be created is not bounded. In this case,
he has a real problem, since his application requires a
machine with infinite memory in order to run. He probably
needs to review the requirements.

Note that in some cases, this may be more or less
inevitable, and the requirements will end up having to be
formulated in terms of "within its resource limits, the
program will...", with a definition of the behavior when the
resource limits are exceeded. Consider the symbol table in
a compiler, for example. If the program is supposed to run
24 hours a day, 7 days a week, however, this could be a
killer problem.

3) The program only needs a record of the active instances of
Foo, and the programmer has forgotten to provide a means of
"deactivating" an instance (or user code has forgotten to
call it). In this case, the code does leak. At least by my
"useful" definition.

[...]
As far as this unimportant disagreement goes, here's my
stance: garbage-collected systems cannot have true "leaks",
except for a bug in the GC itself. On the other hand,
imperative memory management systems, such as C++'s
"malloc/free/new/delete" can. And the way those "leaks"
happen is that you _do_ remove the "identifier" (i.e. the
pointer/reference to the memory) without calling the
appropriate function to actually release the memory block
back to the memory manager.

That's a nice definition for commercial purposes. It sounds
nice to be able to say that your language cannot have a leak (or
that your product detects all leaks). But it really is a sort
of commercial newspeak to redefine "leak" in order to be able
to say that. It's sort of like Java (and C++ for the unsigned
integral types) redefining arithmetic "overflow", in order to
say that integral arithmetic can't overflow. With the
difference that there are a few exotic cases (at least with
unsigned values) where the new definition is definitely useful
(and even in the case of Java, it allows "defined" behavior at
no cost on most machines---and even incorrect defined behavior
is better than undefined behavior).
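
For instance, the following Java compiles and runs without
complaint; under that redefinition nothing has "overflowed", the
addition simply wraps:

public class Wrap {
    public static void main(String[] args) {
        int max = Integer.MAX_VALUE;  //  2147483647
        System.out.println(max + 1);  // -2147483648: wrapped, not "overflowed"
    }
}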
 
