Motivation of software professionals

  • Thread starter Stefan Kiryazov

Seebs

The more channels you have available, the better communication
works.

Not so. Some channels can swamp others. If you're busy picking up
facial expressions, instead of properly processing the raw data, the
extra channel has HARMED your quality of communication.
There are probably some special exceptions, but other people's
expressions and gestures are a vital part of communications.

They may well be -- but my experience has been that you can communicate
some things much better without them.
Not to mention the informal communications which occur when you
meet at the coffee pot. I've worked from home, and in the end,
I was frustrated by it because I was missing so much of the
informal communications which make things go.

I would miss that, except that in my workplace (which spans several
continents), the "coffee pot" is IRC.
That sort of thing is essential for any review. You do it
before the face-to-face meeting. But the reviewer isn't God,
either; the purpose of the meeting is to discuss the issues, not
to say that the coder did it wrong.

If you do it well enough, I don't think the face-to-face meeting does
anything but cater to superstition.
Almost universally. Ask any psychologist. We communicate
through many different channels.

I do, in fact, have a psych degree. And what I can tell you is that, while
there are many channels, sometimes you get better or more reliable
communication by *suppressing* the non-analytic channels. Say, if you
were trying to obtain accurate data about a thing subject to pure analysis,
rather than trying to develop a feel for someone else's emotional state.

The goal is not to have the largest possible total number of bits
communicated, no matter what those bits are or what they communicate about;
it's to communicate a narrowly-defined specific class of things, and for
that plain text can have advantages.

Most people I know have had the experience of discovering that a particular
communication worked much better in writing than it did in speech. Real-time
mechanisms can be a very bad choice for some communications.

You get more data per second if you are watching ten televisions than if
you're watching only one. That doesn't mean that, if you want to learn a
lot, the best way to do it is to watch multiple televisions at once. For
that matter, while a picture may be worth a thousand words, sometimes it's
only worth the exact thousand words it would take to describe the picture.
Why would we read code when we could watch a movie of someone reading it,
complete with facial expressions, tone, and gestures?

Because facial expressions, tone, and gestures swamp our capacity to
process input, and leave us feeling like we've really connected but with
a very high probability of having completely missed something because
we were too busy being connected to think carefully. It's like the way
that people *feel* more productive when they multitask, but they actually
get less done and don't do it as well.

-s
 

Alf P. Steinbach

* Seebs:
[snippety]
Why would we read code when we could watch a movie of someone reading it,
complete with facial expressions, tone, and gestures?

You might notice the person picking his/her nose all the time, and goodbye.

Or, you might notice that hey, that's [insert name of Really Attractive Lady
here], and that's who I'll be working with via the net? Hah. Better invite
her to a working lunch, or something.

Otherwise, if learning about the code is your interest, skip the video.


Cheers & hth.,

- Alf
 

Nick Keighley

I've got mixed opinions on this. The real review takes place offline.
Explanation and discussion of possible solutions (I know, a code
walkthrough isn't supposed to consider solutions - a daft idea if you ask
me [1]) happen at a meeting.

Design meetings can work.

[1] my review comments would then read "I know a much better way to do
this! Can you guess what it is {he! he!}?"


Sometimes he *is* wrong! Some things get discussed/argued to death.
There is nothing more tedious than going through every written
comment on a long list. I'd rather skip along: "OK, you accept items
1-9 but you don't understand what 10 is about - let's discuss item 10."

If you do it well enough, I don't think the face-to-face meeting does
anything but cater to superstition.

I find it easier to communicate with someone if I've met him in layer
1 at least once. I like a mental picture of who I'm talking to. Um
except here...


Most people I know have had the experience of discovering that a particular
communication worked much better in writing than it did in speech.  Real-time
mechanisms can be a very bad choice for some communications.

Try dictating hex patches down a phone line. Or Unix commands. You
rapidly discover Italian vowels aren't the same as ours. "ee? Do you
mean e or i?"

Imagine if all programming specifications had to be delivered in a
speech. Or chanted in iambic pentameter (no, spinoza, that's not a
challenge).
 

Arved Sandstrom

James said:
Did you actually try using any free software back in the early
1990's [sic]?
Seebs said:
Same here.
That's pure fantasy.
I used a couple of Linux distributions in the early nineties,
and they worked better than commercial UNIX variants.

And I tried to use them, and they just didn't stop crashing.
Even today, Linux is only gradually approaching the level of the
Unixes back then.
[ SNIP ]

I have to agree with you here. My earliest use of Linux was 1993, side
by side with IRIX and SunOS. I don't remember frequent crashing of Linux
but there was no question but that the UNIX systems were more stable,
more polished and had more capability. Granted, everyone back then was
throwing Linux on old PCs, which probably didn't help, but still...

AHS
 

Arved Sandstrom

Seebs said:
[ SNIP ]
I do, in fact, have a psych degree. And what I can tell you is that, while
there are many channels, sometimes you get better or more reliable
communication by *suppressing* the non-analytic channels. Say, if you
were trying to obtain accurate data about a thing subject to pure analysis,
rather than trying to develop a feel for someone else's emotional state.

The goal is not to have the largest possible total number of bits
communicated, no matter what those bits are or what they communicate about;
it's to communicate a narrowly-defined specific class of things, and for
that plain text can have advantages.

Most people I know have had the experience of discovering that a particular
communication worked much better in writing than it did in speech. Real-time
mechanisms can be a very bad choice for some communications.
[ SNIP ]

There is absolutely no question but that some things - many things -
work better in written form than in speech. Requirements specifications,
design documents, test plans and code itself are good examples.

As for code reviews I believe those can go either way. It depends on
skill levels overall, skill level differences, personalities, and
problems (or lack thereof) with prerequisite artifacts like design and
requirements. A code review that involves dysfunctional prerequisites,
dubious skill levels amongst the coders, and lots of ego - sort of a
common situation actually - is probably best handled f2f. IMHO.

But I've certainly seen code reviews that were handled nicely with no
personal interaction other than email or chat. This usually happened
when good requirements and design informed the whole discussion, all the
programmers were skilled, and the egos weren't too large.

A lot of times in a real code review you most definitely are managing
emotional state. That requires developing a feel for it, which you can't
do over chat. Seeing those blank expressions, or looks of anger, is
quite helpful in steering the review towards a somewhat productive
conclusion.

AHS
 

Seebs

Arved Sandstrom said:
A lot of times in a real code review you most definitely are managing
emotional state. That requires developing a feel for it, which you can't
do over chat. Seeing those blank expressions, or looks of anger, is
quite helpful in steering the review towards a somewhat productive
conclusion.

You make a good point here. With some experience, you can learn to preempt
a lot of that by paying attention to wording. At $dayjob, we have a couple
of techniques that apply to this:

1. Code review is done on a list everyone sees. (We're still small enough
to get away with that, for now.)
2. Everyone's code is reviewed, even the "senior" people.
3. Over time, anyone watching the list will see enough people, some of them
quite senior, caught in mistakes or oversights, that they'll develop a
good feel for how to handle that.

It works surprisingly well. When you've seen a couple of the senior staff
say "whoops, yeah, I totally missed that, nevermind, I'll submit a V2 in a
day or two", it becomes much less worrisome to be asked to fix something.

-s
 

Lew

Branimir said:
I admire you ;)

I used to walk secretaries through swapping out or otherwise repairing
motherboard components over the phone back in the early 90s as part of my
tech-support role. Our customers also often had very weird software issues
that we'd help with.

They would only call my employer after having at least one other "expert" make
the problem worse.

The key was to make them take a break of at least an hour before I would help
them, usually out of their office at a park or some other nice place.

People aren't usually stupid, and if they're highly motivated to solve a
problem you can get them through almost anything if you are kind, empathetic
and very, very patient. There was a science to it also - my techniques were
not random. Starting with assuming competence on the part of the customer.
There's a difference between ignorance and stupidity; the former is curable.

I compare it to being the guy in the airport control tower who talks young
Timmy through how to land the plane after the pilot has a heart attack.
 

Nick Keighley

The phone line wasn't good. As well as the Alpha, Bravo, Charlie stuff
it seemed to help if you spoke like a boxing referee: "one-a, two-a,
three-a".

That guy was pretty techie, then, because that contradicts my experience.
For example, one guy was not capable of just finding and changing a
parameter in a config file, and I mailed him about it. The problem was
that the config file had about 1000 lines and he didn't know how to
apply a case-insensitive search. In my experience a lot of them don't
even know what a text editor is, let alone a debugger. So that guy was
techie, for sure...

My mum had problems with instructions dictated by some sort of
technical support because her interpretation of the terms
"forward-slash" and "backward-slash" was the exact opposite of most
(technical) people's.
 

Nick Keighley

Lew said:
Richard Heathfield wrote:
People aren't usually stupid, and if they're highly motivated to solve a
[problem]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
I think that this is the most important factor. Customer support where
I live is not very well motivated...

Another guy who claimed to be a C/C++ expert (he worked in Portugal for
the military) asked whether C has function pointers. When we showed him
a question from Stroustrup's book (Duff's device), he didn't know what
it was all about.

Well, it looks pretty weird the first time you see it. "Can you really
do /that/ in a switch?!" was my reaction. Is Duff's device important?
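
For reference, a minimal sketch of the device in its usual
memcpy-style form (Tom Duff's original wrote to a fixed output
register; this variant increments both pointers and assumes count > 0):

#include <stddef.h>

/* Duff's device: copy count bytes, unrolled eight ways. The switch
 * jumps into the middle of the do-while loop to dispose of the
 * count % 8 leftover iterations up front. Assumes count > 0. */
void duff_copy(char *to, const char *from, size_t count)
{
    size_t n = (count + 7) / 8;
    switch (count % 8) {
    case 0: do { *to++ = *from++;
    case 7:      *to++ = *from++;
    case 6:      *to++ = *from++;
    case 5:      *to++ = *from++;
    case 4:      *to++ = *from++;
    case 3:      *to++ = *from++;
    case 2:      *to++ = *from++;
    case 1:      *to++ = *from++;
               } while (--n > 0);
    }
}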

Not one of fifty computer science masters recognized Duff's device
(not even the guy who was born in '63).

'63 was a good year? Programmers are like wines?

So I concluded that if I see a university diploma saying "master of
computer science", that's a sure sign of ignorance or something similar,
in the country where I live.

I didn't learn Duff's device at university.

One guy who claimed he wrote software for robots didn't know how much 2^32 is ;)

Nor do I, if you want the exact value. I'd look it up if I needed it (I
just use hex!).
I think there are very few people who know how to program computers
these days.

"The Earth is degenerating these days. Bribery and corruption abound.
Children no longer mind their parents ... and it is evident that the
end of
the world is fast approaching."
-- Assyrian stone tablet, c.2800bc

Blame the education system, because "C is not safe" and
"stay away from assembler". Soon no one will know how to program,
and the older guys will earn a lot of money, but there won't be
enough of them...

sounds good to me!
 

Nick Keighley

Nick Keighley wrote:
What is the point?
Average Joe makes memory leaks in Java no problem...
these days...
Software gets more bloated, more and more bugs, ...

I was noting the fixed point in the human experience. Things are
degenerating and were always better in the past.

<snip>
 

Richard Bos

Lew said:
You say that like the developers were at fault. I cannot tell you how many
times I've seen management overrule developers who wanted to make things
right. It's been the overwhelming majority, though. I recall a manager in
1982 refusing to let a team fix the Y2K bug in the project.

I've seen that - _my_ manager, in _my_ fix in _my_ program - in 1995.
Three years later he thought that it would be a good idea for me to
start paying attention to this Y2K thing he'd just heard about.

And then there's the users. Don't get me started on the users.

Richard
 

Richard Bos

Flash Gordon said:
I know there is software flying around today that is running on Z80
processors (well, the military variant of them) and the plan in the late
90s was for it to continue for another 20 years (I don't know the
details, but a customer signed off on some form of ongoing support
contract). Admittedly the software I used was not doing date processing
(apart from the test rigs, which used the date on printouts, which I
tested to "destruction", which turned out to be 2028).

Single signed byte?
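
(That would explain the arithmetic: a year held as a signed char
offset from 1900 tops out at 1900 + 127 = 2027, making 2028 the first
year to fail. A hypothetical sketch:)

#include <stdio.h>

/* Hypothetical: a Z80-era program storing the year as a signed
 * 8-bit offset from 1900 runs out at 1900 + 127 = 2027; on typical
 * two's-complement machines the next increment wraps to -128. */
int main(void)
{
    signed char year_offset = 127;                       /* maximum */
    printf("last good year: %d\n", 1900 + year_offset);  /* 2027 */
    year_offset = (signed char)(year_offset + 1);        /* wraps to -128
                                                            (implementation-defined) */
    printf("after overflow: %d\n", 1900 + year_offset);  /* 1772 */
    return 0;
}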

Richard
 

Lew

Richard said:
I've seen that - _my_ manager, in _my_ fix in _my_ program - in 1995.
Three years later he thought that it would be a good idea for me to
start paying attention to this Y2K thing he'd just heard about.

And then there's the users. Don't get me started on the users.

Yeah. Our jobs would be so much easier if only we didn't have customers!

Don't dis the customers, man. Having a derogatory attitude toward "users"
(there are only two industries that call their customers "users") is a major
arrogance. Shame on you.
 

Lew

Branimir said:
One guy who claimed he wrote software for robots didn't know how much 2^32 is ;)

Actually, a correct answer to that is "2^32". So you gave him the answer in
the question.

Another correct answer is "100000000 base 16".

If you are disparaging the guy for simply not knowing the expansion to decimal
digits, well, Albert Einstein didn't bother memorizing his home phone number
on the basis that he could simply look it up in the phone book on those rare
occasions when he needed it.
I think there are very few people who know how to program computers
these days.

That's only a problem if those people who don't know how to program are paid
based on a claim that they do.

Unfortunately that happens a lot.
 

Lew

Nick said:
I was noting the fixed point in the human experience. Things are
degenerating and were always better in the past.

To paraphrase /Dilbert/: "Back in my day, we carved our bits out of wood."

The problem with those good old days is you had to measure memory in barqs
rather than bytes, and everyone knows that the barq is worse than the byte.
 

Lew

Branimir said:
To be honest things were always simpler in the past.

That's not honesty, that's nostalgia.

Which is simpler, dealing with rush-hour traffic or a dire wolf trying to eat
your child?

And human interactions are neither observably simpler nor more complex
than at any time since the evolution of /homo sapiens/.
 

BruceS

Nick Keighley wrote:
Well, a 4gb answer should be enough; I don't know the exact figure either ;)

Maybe I'm just being overly pedantic, but that seems like a bad
answer. I don't fault IT people for not knowing the powers of 2,
though the approximation 2^(10n) ~= 10^(3n) makes it easy. I do fault
people who seem overly critical for not being precise.
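
To spell the trick out (a minimal sketch - the arithmetic is the
point, not the code): 2^32 = 2^2 * 2^30 ~= 4 * 10^9, i.e. "about 4
billion", which is where the 4 GB answer comes from.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* exact value, computed in 64-bit arithmetic to avoid overflow */
    uint64_t exact = (uint64_t)1 << 32;        /* 4294967296 */
    double approx = 4.0e9;                     /* 2^2 * (2^10 ~= 10^3)^3 */
    printf("exact:  %llu\n", (unsigned long long)exact);
    printf("approx: %.0f (off by %.1f%%)\n", approx,
           100.0 * ((double)exact - approx) / (double)exact);
    return 0;
}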

Now *that* I can agree with, aside from taking issue with the "these
days" part. It seems to me that for most activities, the majority of
participants are not very competent, and this certainly includes
software development.
What is the point?
Average Joe makes memory leaks in Java no problem...
these days...
Software gets more bloated, more and more bugs, ...

Ditto. As a member of a very small niche, it's nice to set terms (to
an extent). I get all sorts of shiny trinkets to prove my value and
further inflate my already healthy ego.
Well, actually, if you spend enough time lurking on Usenet, you can
learn enough ;)

Just be sure to learn from the right folks, or you may well learn
wrong.
I don't have an objective picture, since my perspective is from this
country where the sw industry is practically non-existent (btw).

Greets

My perspective is from a country where the sw industry is pretty
large, but there's still plenty wrong with it.
 

James Kanze

I guess it depends on which unixes, and which Linux. When I
went from SVR4 Unix to NetBSD, though, I had a LOT less
downtime.

I've never used NetBSD, but from what I understand, it does
seem like it would have been a lot better than Linux.

Note that the problem is more one of being new, and of not having a
decent development process; but that problem was shared by many
commercial OS's as well. Up until the late 1990's, I used SunOS 4
professionally. Early Solaris wasn't that great, either.
The version I used (nvi) was nearly-rock-solid. Which is to
say, I found and reported a bug and it was fixed within a day.
And I've been using the same version of nvi that I was using
in 1994 ever since, and I have not encountered a single bug in
15 years.

The two aspects are probably connected. Stable software doesn't
change versions that often.
I said gcc, not g++. And while, certainly, it has bugs, so
has every other compiler I've used. I had less trouble with
gcc than with sun cc. I used a commercial SVR4 which switched
to gcc because it was noticably more reliable than the SVR4
cc.

I believe that gcc was pretty stable by then. But by the early
1990's, we'd moved on to C++.
evaluations back then, and I can assure you that g++ was a real
joke.
I do not think it is likely that implying that anyone who
disagrees with you is being dishonest will lead to productive
discussion. My experiences with free software were apparently
different from yours -- or perhaps my experiences with
commercial software were different.

My experiences with commercial software are not universally
positive. But realistically, anytime before the mid-1990's,
most of the free software was simply not acceptable. It didn't
have a good enough process to ensure stability, and was too new
for most of the bugs to have been worked out.
Whatever the cause, the net result is that by the mid-90s, I
had a strong preference for free tools and operating systems,
because they had consistently been more reliable for me.

The turning point was some time in the mid-1990's, depending on what
you were doing.
 

James Kanze

[...]
I have to agree with you here. My earliest use of Linux was
1993, side by side with IRIX and SunOS. I don't remember
frequent crashing of Linux but there was no question but that
the UNIX systems were more stable, more polished and had more
capability. Granted, everyone back then was throwing Linux on
old PCs, which probably didn't help, but still...

Today, the problem is that everyone is throwing it on new
PCs :). Before the drivers for the latest cards are fully
stable. (Other than that, there still seem to be some problems
in XFree, and I've generally had to more or less hack some of
the boot scripts to get them to work.)

With the exception of the problems in XFree, however, I don't
think you can compare them with the commercial offerings.
Solaris always installed like a charm for me, but that was on a
Sun Sparc---the two were literally made for each other, and Sun
made sure that any new Sun hardware would work with Solaris.
Trying to cover generic hardware, including chips that haven't
been invented yet, is a lot more difficult.
 
