Thread synchronization

Babu Kalakrishnan

Thomas said:
YES. But I stress the /synchronized(this)/ idiom because interesting things
happen when the /synchronized method()/ idiom is used:

1. The developer is far less likely to shift over to
using a different lock should the need arise. He
feels "locked in" :) to using the current object
instance.

Also, I've seen a lot of developers (esp. fresh out of school) who don't even
know about synchronized blocks (as opposed to synchronized methods) because
(almost) all code they have seen is where methods are declared as synchronized.
So in my opinion, schools ought to first teach synchronization following the
synchronized(object) {... } paradigm, and then go on to explain the "shortcut"
form available in the special situation where the monitor object happens to be
"this" and the critical section of the code happens to be an entire method body.

BK
 
Thomas G. Marshall

Babu Kalakrishnan coughed up:
Also, I've seen a lot of developers (esp. fresh out of school) who
don't even know about synchronized blocks (as opposed to synchronized
methods) because (almost) all code they have seen is where methods
are declared as synchronized. So in my opinion, schools ought to
first teach synchronization following the synchronized(object) {... }
paradigm, and then go on to explain the "shortcut" form available in
the special situation where the monitor object happens to be "this"
and the critical section of the code happens to be an entire method
body.


YES.
 
Lee Fesperman

Thomas said:
Lee Fesperman coughed up:

Ok. And if you are not protecting the program at large, then
synchronization is meaningless. There is no useful case where this is not
true. Keep wandering up, like I pointed out elsethread.

Asinine. You are confusing the technique with its purpose. Synchronization is a
technique used for the purpose of restricting access to shared state.
Larger contexts always exist. But there are /lines of code/ in
synchronization blocks and methods. Saying that you are protecting an
object is of no help in teaching someone how the particular /lines of code/
might need to be atomic to an algorithm.

You are misleading your students if you refuse to tell them that synchronization is used
to restrict access to shared state (including the state of external resources.)

In fact, you haven't given a useful example where it is not true. Your one vague example
of an algorithm that can't be interrupted yet doesn't have shared state doesn't hold
water. Provide more details to support your example.
And it is those lines of code /within/ such synchronization structures that
are protected. Lines exterior to those simply are not.

What is a synchronization structure? Your statement is vague and sounds wrong.
 
Lee Fesperman

Chris said:
Well, the data to be protected is either a subset of the state of one object,
the entire state of one object, or spread across more than one object -- it's
not immediately obvious what your fourth possibility is ?

Your paragraph was somewhat vague, but here goes...

You list 'some' cases (2 actually):

1) spread across more than one object,
2) subset of state of one object.

You then state that *only* in those cases is a lock object appropriate.

You also list the 'normal' case:

3) entire state of one object.

You imply that this case would use the object as the monitor.

This is a fourth case:

4) entire state of one object but using a separate monitor to protect its state.

You left this one out of your *only* assertion.
 
Thomas G. Marshall

Lee Fesperman coughed up:

Careful. You want people to ignore you? Then go and throw unwarranted
insults like that around some more. I don't remember saying that your
notions were asinine.

You are confusing the technique with its purpose.

Actually, I was pointing out that /you/ were.

Synchronization is a technique used for the purpose of restricting
access to shared state.

Nail down any purpose you like. It still comes down to making sure that
lines of code are protected from concurrent execution.

You are misleading your students if you refuse to tell them that
synchronization is used to restrict access to shared state (including
the state of external resources.)

You are misleading everyone you talk to if you try to convince them that the
lines within the synchronization blocks /aren't/ what's important. Your
explanations simply do /not/ help a student to understand where and how to
apply synchronization, and why entire methods do not need to be
synchronized, for example.
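
As a concrete illustration of that last point (a minimal sketch; Tally and
its method are invented names), only the lines that touch shared state need
the lock:

public class Tally
{
    private int total = 0;

    public void record(int value)
    {
        int adjusted = value * 2;      // purely local work: no lock needed

        synchronized (this)            // only the shared-state update is protected
        {
            total += adjusted;
        }
    }
}
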
In fact, you haven't given a useful example where it is not true. Your
one vague example of an algorithm that can't be interrupted yet
doesn't have shared state doesn't hold water. Provide more details
to support your example.

Provide /one/ detail as to why it doesn't hold water.

What is a synchronization structure? Your statement is vague and
sounds wrong.

You never heard of a synchronization block? That's not a synchronization
structure?
 
Lee Fesperman

Thomas said:
Lee Fesperman coughed up:

Careful. You want people to ignore you? Then go and throw unwarranted
insults like that around some more. I don't remember saying that your
notions were asinine.

You twisted my words to mock them as foolish. You didn't add anything substantive to the
discussion.
Actually, I was pointing out that /you/ were.

More word play? You're saying that protecting state is a technique used for the purpose
of synchronizing threads?
Nail down any purpose you like. It still comes down to making sure that
lines of code are protected from concurrent execution.

Yep, I realize that nothing anyone can say will change your mind. OTOH, all it would
take to convince me is a useful case where synchronization is needed for something
other than protecting external state ... you may need to provide details, though.
You are misleading everyone you talk to if you try to convince them that the
lines within the synchronization blocks /aren't/ what's important.

Again, you are twisting my words for no reasonable purpose. I haven't said that
synchronization is unimportant in protecting state ... it is a *very* important
technique. You're not just saying that mentioning state is unimportant in discussing
synchronization but that it would do no good.
Your explanations simply do /not/ help a student to understand where and how to
apply synchronization, and why entire methods do not need to be
synchronized, for example.

I'm explaining why you use synchronization which naturally leads to where and how to
apply it. If you don't know the purpose of synchronizing, you won't know where to apply
it. Avoiding concurrent execution is too vague a goal. Why do you need to avoid
concurrent execution?
Provide /one/ detail as to why it doesn't hold water.

You've refused to provide details. That makes it rather difficult to pick one.
Nevertheless, I will respond ---

If you can't interrupt (delay) this thread and the thread doesn't affect external state,
who would ever know if you did interrupt it? If it has no effects, interrupting it would
have no effect.
You never heard of a synchronization block? That's not a synchronization
structure?

Ok. You meant synchronization block. Perhaps you could have said 'synchronization
control structure'. Further imprecision in your statements made it necessary to clarify.

To continue: By /within/, do you mean --- lines of code textually inside the braces of a
synchronization block (or synchronized method)?
 
Chris Uppal

Lee said:
You list 'some' cases (2 actually):

1) spread across more than one object,
2) subset of state of one object.

You then state that *only* in those cases is a lock object appropriate.

You also list the 'normal' case:

3) entire state of one object.

You imply that this case would use the object as the monitor.

This is a fourth case:

4) entire state of one object but using a separate monitor to protect its
state.

You left this one out of your *only* assertion.

No, because I consider it misguided. That was (part of) the point I was trying
to make in my first post. (More on this, btw, in my other post in this thread
today).

-- chris
 
Chris Uppal

Thomas said:
I read through this post, and I'm not sure that you do. Maybe "mostly".

Nope, I'm afraid it is "completely" ;-)

Pondering this, I'm trying to work out where our disagreement lies.

But before getting into it, I want to mention that I don't think your example
of "protecting" access to an external resource, like writing unscrambled lines
to System.out, is relevant here. I'll come back to why later, but I think
it'll be easier to understand my position if you know in advance that I don't
consider such cases to be of any great importance for the point in question.

In another post you mentioned a hierarchy which I'll paraphrase (and extend) as
something like:

I am protecting these lines of code,
in order to protect this aspect of shared state,
in order that my program will work correctly,
.. skipping a level or two here...
in order that I may enjoy enhanced wealth and status...

where, at each level, the goal can be seen as a tactic that is intended to
contribute to achieving the goal of the next level.

Looked at in this way, one could think of the disagreement between us as merely
a matter of which of the first two levels we see as critical in day-to-day
programming. A question of which way of thinking about it worked best for the
individual.

I take a very object-centric view of OO programming, which can be summed up in
a slogan: "the objects matter, the code doesn't". (A POV, by the way, that has
been greatly strengthened by my more recent experience with Smalltalk and
Smalltalk IDEs.) From that you'd expect that I would prefer to put the
emphasis on level two in the above list (the "state" here being understood as
the objects). Whereas it seems that you prefer to take the first level as the
"real" one.

I do think that some part of our disagreement may come from that sort of
difference in attitude, but I want to go further and say that the first level
is actively misleading. The rest of this post will be an unstructured ramble
through my reasons.

(This is probably as good a point as any to mention that if I were interviewing
prospective junior programmers, and they said that synchronisation was to
protect passages of code, then I'd very likely fail them on that basis alone --
at the very least it would ring loud alarm bells and I would put even more
effort into trying to discover if they really understood what objects were.
It's up to you, of course, to decide how much weight to give to my opinion, but
such facts /in general/ should surely inform how you train your students. (I
am right in thinking you teach ?))

The first point is that (as you know, of course) synchronised blocks (and
methods) don't /really/ protect passages of code -- not unless you
conceptualise the code as being duplicated "in" each object. I think that if
you put the emphasis on the code in this way, then you are actively inviting
the mistake of forgetting that the synchronisation is bound to the object. As
a result you need a counter-balancing way of forcing a sense of /individual/
objects back into the picture; which you try to do by encouraging explicit
synchronisation and using separate lock/monitor objects. That seems to me to
be wrong -- if you hadn't lost sight of the objects in the first place, then
the issue wouldn't arise. You are, in a sense, treating the symptoms, not the
disease.

BTW, if I were teaching concurrent Java programming -- which, thank God, I'm
not -- I'd be tempted to try the experiment of ensuring that /every single/
example the students ever saw (or created) had at least two instances of the
class(es) in question. I don't know if you've ever tried that, maybe you do it
as a matter of course, but it'd be interesting to know if it helped. It might
even be worth trying in teaching ordinary, non-threaded, programming... (I'd
also postpone teaching "static" for as long as I possibly could -- but that's
another story...)

A further digression: the thing I dislike most about Java as a language is that
it does little to encourage the object-centric viewpoint, and does little to
challenge -- indeed makes it hard to escape -- the code-centric viewpoint.
Perhaps we could see "synchronized" as another opportunity to teach /why/ it's
important to think about the objects. "Objects matter, code doesn't"...


Anyway, returning to the question in hand; my second point is about
responsibility in OO. Another slogan:

IT IS AN OBJECT'S RESPONSIBILITY TO MANAGE ITS OWN STATE.

(I'm sorry about the shouting, and I hope I didn't hurt anyone's ears, but I do
think it is important enough to warrant a little noise.) In a concurrent
program that may include maintaining the integrity of the state in the face of
concurrent access (some objects assume that responsibility, some -- e.g. Java
Collections -- don't). Nearly all uses of Java's concurrency features are
directly linked to allowing an object to satisfy the responsibilities it has
assumed for permitting concurrent use. I think it's important to describe and
think about synchronisation in a way that puts the emphasis on what it's (normally)
being used for. That's to say that I think the "synchronisation protects
objects" viewpoint is, or should be, the most natural and fruitful for the OO
programmer.

In fact I'm strongly tempted to go further and say that if that /isn't/ what
it's being used for then there's something wrong with the design of the
program. I'm not certain that there aren't counter-examples, though.

I similarly think it's important to program in a way that reflects what
synchronisation is for. I have no particular preference between synchronised
methods and synchronized (on 'this', whether implicitly or explicitly) blocks,
but using a separate (and un-necessary) lock object is just obfuscating the
design.

(But I have to be honest and admit that in Smalltalk -- where objects don't all
have the Gosling-given ability to act as monitors -- it doesn't seem to be
significantly more difficult to read or write threadsafe code, even though you
/have/ to use separate lock objects. It's partly a matter of language-specific
idiom, and following familiar patterns where possible.)


Imagine a variant of Java where synchronisation was only allowed on 'this' (so
it only had synchronised methods, and unqualified "synchronized { ... }"
blocks). I think that nearly all concurrent code would be unaffected by the
change, and that the "feel" of concurrent programming in that language would be
just the same as in Java itself. Of course some expressions and designs
wouldn't carry over unchanged, but most of them would. Yet in that language it
would be inescapably true that synchronisation protects the objects -- that's
all it /could/ do. Since that language would be essentially the same as Java,
I think that supports the idea that "synchronisation protects the objects" in
Java too.

Incidentally, the above is why I don't think that your examples of cases where
"synchronized" /isn't/ protecting any obvious object, or anything that would
naturally be called "state", are very relevant (although perfectly valid). If
such cases are rare (as they are) and if they can be handled gracefully in the
modified Java (as they could be) then (IMO) it's clear that they are not
central for understanding either what Java synchronisation /is/ or what it's
/for/. In short, they are a distraction (and a source of confusion for
students).

In the Java variant, your example of writing ungarbled lines to System.out
could not be solved directly. Two possible approaches would be to implement a
Semaphore class (with the classic P and V operations), or to create a new
object that "knew" its job was to accept data piecemeal and write it out in
coherent lines. In either case, what you have done is create a new object that
explicitly accepts responsibility for "managing" access to the resource. To me
that seems like a better design than unstructured "ad-hoc" synchronisation.
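
(A minimal sketch of that second approach, with invented names; the point is
that the new object's own monitor guards the resource:)

public class LineGatherer
{
    private final StringBuffer pending = new StringBuffer();

    // Accept data piecemeal...
    public synchronized void append(String piece)
    {
        pending.append(piece);
    }

    // ...and write it out only as a coherent line.
    public synchronized void endLine()
    {
        System.out.println(pending.toString());
        pending.setLength(0);
    }
}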

Returning to the case where several objects need to coordinate so as to
maintain the coherency of their shared state. This is, of course, something
that can be handled by using shared lock (or "monitor") objects. But I wonder
how many such cases would be better handled by finding higher level ways of
expressing the coordination. Using higher-level abstractions seems better to
me than "synchronization spaghetti".

To finish with yet another digression: this is why I wouldn't be particularly
bothered if a junior programmer, who did understand objects and "synchronized"
correctly, did not know about "synchronized(someOtherObject) {". There are
simply too few legitimate places to use it, and it is so easy to explain if the
basics are solid, that I can't see that it matters much. (Mind you, I would
worry about what /other/ gaps there were in that programmer's training...)

-- chris
 
Usenet

I similarly think it's important to program in a way that reflects what
synchronisation is for. I have no particular preference between synchronised
methods and synchronized (on 'this', whether implicitly or explicitly) blocks,
but using a separate (and un-necessary) lock object is just obfuscating the
design.

Then you haven't thought this through. Suppose that objects didn't all come
equipped with monitors, and you had to declare a Monitor-valued field for
each object you wanted to protect. Would you declare this field public or
private? Silly question, isn't it? Synchronized methods or
synchronized(this) blocks are the moral equivalent of a public monitor field
that any code which can access your object can use to destroy your carefully
constructed synchronization logic. (It's true that in your hypothetical
world where synchronization is limited to "this", no such problem exists,
but in the real world it does.)
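
(A small sketch of the danger, with invented names: any client that can see
the object can seize the very monitor its synchronized methods rely on.)

public class Account
{
    private int balance = 0;

    public synchronized void deposit(int amount)    // implicitly locks on "this"
    {
        balance += amount;
    }
}

class HostileClient
{
    static void stall(Account account) throws InterruptedException
    {
        synchronized (account)       // the same monitor deposit() uses
        {
            Thread.sleep(60000);     // every deposit() now blocks for a minute
        }
    }
}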

Getting synchronization right in a complex system, avoiding both races and
deadlocks, is hard work. It isn't as simple as having each class protect its
state. It requires a high-level design which is *implemented* at the class
level. One of the things that bothers me about Java is how easy it is to
throw in a few "synchronized"s and think you're done. I've observed this
happen many times, occasionally even to people who should know better.
 
Chris Uppal

Usenet wrote:

[replying to passages out of order]
Getting synchronization right in a complex system, avoiding both races and
deadlocks, is hard work. It isn't as simple as having each class protect
its state. It requires a high-level design which is *implemented* at the
class level. One of the things that bothers me about Java is how easy it
is to throw in a few "synchronized"s and think you're done. I've
observed this happen many times, occasionally even to people who should
know better.

Yes, I agree entirely with this, but...

Then you haven't thought this through. Suppose that objects didn't all
come equipped with monitors, and you had to declare a Monitor-valued
field for each object you wanted to protect. Would you declare this
field public or private? Silly question, isn't it? Synchronized methods
or synchronized(this) blocks are the moral equivalent of a public monitor
field that any code which can access your object can use to destroy your
carefully constructed synchronization logic. (It's true that in your
hypothetical world where synchronization is limited to "this", no such
problem exists, but in the real world it does.)

I have no idea what point you are trying to make here. After three readings, I
still cannot understand what you are trying to say (except for the first
sentence -- I understand that all right, but am less than charmed by it).

-- chris
 
Mike Schilling

I have no idea what point you are trying to make here. After three readings, I
still cannot understand what you are trying to say (except for the first
sentence -- I understand that all right, but am less than charmed by it).

You've snipped the context. I was replying to your point that you prefer

synchronized(this) {}

to

private Object lock = new Object();
synchronized(lock) {}

and explaining the dangers of the former.

This idea isn't peculiar to me, by the way. Bloch makes the same point in
_Effective Java_.
 
Thomas G. Marshall

This post is huge. I'm going to try to respond to as few of the salient
points as I can, just those where I believe we are at odds. If I snipped
something too important to miss, then I apologize, and please let me know.


Chris Uppal coughed up:
Nope, I'm afraid it is "completely" ;-)

Pondering this, I'm trying to work out where our disagreement lies.

But before getting into it, I want to mention that I don't think your
example of "protecting" access to an external resource, like writing
unscrambled lines to System.out, is relevant here. I'll come back to
why later, but I think it'll be easier to understand my position if
you know in advance that I don't consider such cases to be of any
great importance for the point in question.

In another post you mentioned a hierarchy which I'll paraphrase (and
extend) as something like:

I am protecting these lines of code,
in order to protect this aspect of shared state,
in order that my program will work correctly,
.. skipping a level or two here...
in order that I may enjoy enhanced wealth and status...

where, at each level, the goal can be seen as a tactic that is
intended to contribute to achieving the goal of the next level.

Looked at in this way, one could think of the disagreement between us
as merely a matter of which of the first two levels we see as
critical in day-to-day programming. A question of which way of
thinking about it worked best for the individual.

I take a very object-centric view of OO programming, which can be
summed up in a slogan: "the objects matter, the code doesn't". (A
POV, by the way, that has been greatly strengthened by my more recent
experience with Smalltalk and Smalltalk IDEs.) From that you'd
expect that I would prefer to put the emphasis on level two in the
above list (the "state" here being understood as the objects).
Whereas it seems that you prefer to take the first level as the
"real" one.

Mutexes, semaphores, and all notions of synchronization and concurrency
control that I can think of exist in computer science with or without object
orientation.

Do you agree?

I do think that some part of our disagreement may come from that sort
of difference in attitude, but I want to go further and say that the
first level is actively misleading. The rest of this post will be an
unstructured ramble through my reasons.

(This is probably as good a point as any to mention that if I were
interviewing prospective junior programmers, and they said that
synchronisation was to protect passages of code, then I'd very likely
fail them on that basis alone

That would be a mistake.

-- at the very least it would ring loud
alarm bells and I would put even more effort into trying to discover
if they really understood what objects were. It's up to you, of
course, to decide how much weight to give to my opinion, but such
facts /in general/ should surely inform how you train your students.
(I am right in thinking you teach ?))

The first point is that (as you know, of course) synchronised blocks
(and methods) don't /really/ protect passages of code -- not unless
you conceptualise the code as being duplicated "in" each object.

Not exactly. The code /execution/ is duplicated "in" each thread. When I
say that I am protecting lines of code from simultaneous execution, it means
just that: The threads that want to simultaneously execute the lines of code
are forced to do it "one at a time", assuming of course that the
synchronization is done correctly.

I think that if you put the emphasis on the code in this way, then you
are actively inviting the mistake of forgetting that the
synchronisation is bound to the object.

And what of non-object oriented languages?

As a result you need a
counter-balancing way of forcing a sense of /individual/ objects back
into the picture; which you try to do by encouraging explicit
synchronisation and using separate lock/monitor objects. That seems
to me to be wrong -- if you hadn't lost sight of the objects in the
first place, then the issue wouldn't arise. You are, in a sense,
treating the symptoms, not the disease.

BTW, if I were teaching concurrent Java programming -- which, thank
God, I'm not -- I'd be tempted to try the experiment of ensuring that
/every single/ example the students ever saw (or created) had at
least two instances of the class(es) in question. I don't know if
you've ever tried that, maybe you do it as a matter of course, but
it'd be interesting to know if it helped. It might even be worth
trying in teaching ordinary, non-threaded, programming... (I'd also
postpone teaching "static" for as long as I possibly could -- but
that's another story...)

A further digression: the thing I dislike most about Java as a
language is that it does little to encourage the object-centric
viewpoint, and does little to challenge -- indeed makes it hard to
escape -- the code-centric viewpoint. Perhaps we could see
"synchronized" as another opportunity to teach /why/ it's important
to think about the objects. "Objects matter, code doesn't"...


Anyway, returning to the question in hand; my second point is about
responsibility in OO. Another slogan:

IT IS AN OBJECT'S RESPONSIBILITY TO MANAGE ITS OWN STATE.

This is precisely how students get confused. Take the following object:

public class Thing
{
    public void one() { ..... }
    public void two() { ........... }
    public synchronized void three() { ....... }
    public void four()
    {
        ....
        synchronized(this) { ..... }
        ....
    }
}

Once a thread has wandered into three(), what is the state of the object?
Is it protected from concurrency? Nope. one() and two() have no such
protection whatsoever. There is no identifiable state for this object.
What /has/ been protected are the lines within three(), and the sync block
in four(), assuming that the threads are all accessing the same instance.

Ah ha, you might say. An /instance/ with a state. Not useful: it is an
instance in which the lock is "kept". Which is why I strongly suggest that
newbies make the lock instance a separate object entirely, devoted to
nothing but holding the lock, for all the reasons I've pointed out several
times in this thread alone.
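
(A minimal sketch of that arrangement; Tracker and its lock field are just
illustrative names:)

public class Tracker
{
    // An object devoted to nothing but holding the lock.
    private final Object lock = new Object();

    private int count = 0;

    public void increment()
    {
        synchronized (lock)
        {
            count++;
        }
    }

    public int current()
    {
        synchronized (lock)
        {
            return count;
        }
    }
}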

The lock is what matters. The lines of code protected by the lock are what
matter. Extrapolate upward and refer to that as object state, application
functionality, etc., all you like.

Your explanation of object state, while an enormously important notion in
OOP in general, is simply not going to stop the kind of confusion we see in
the OP. Once I show my simple class examples (I haven't posted them yet) to
folks with similar confusion, they then understand the basics.

(I'm sorry about the shouting, and I hope I didn't hurt anyone's
ears, but I do think it is important enough to warrant a little
noise.) In a concurrent program that may include maintaining the
integrity of the state in the face of concurrent access (some objects
assume that responsibility, some -- e.g. Java Collections -- don't).
Nearly all uses of Java's concurrency features are directly linked to
allowing an object to satisfy the responsibilities it has assumed for
permitting concurrent use. I think it's important to describe and
think about synchronisation in a way that puts the emphasis on what it's
(normally) being used for. That's to say that I think the
"synchronisation protects objects" viewpoint is, or should be, the
most natural and fruitful for the OO programmer.

In fact I'm strongly tempted to go further and say that if that
/isn't/ what it's being used for then there's something wrong with
the design of the program. I'm not certain that there aren't
counter-examples, though.

I similarly think it's important to program in a way that reflects
what synchronisation is for. I have no particular preference
between synchronised methods and synchronized (on 'this', whether
implicitly or explicitly) blocks, but using a separate (and
un-necessary) lock object is just obfuscating the design.

(But I have to be honest and admit that in Smalltalk -- where objects
don't all have the Gosling-given ability to act as monitors -- it
doesn't seem to be significantly more difficult to read or write
threadsafe code, even though you /have/ to use separate lock objects.
It's partly a matter of language-specific idiom, and following
familiar patterns where possible.)


Imagine a variant of Java where synchronisation was only allowed on
'this' (so it only had synchronised methods, and unqualified
"synchronized { ... }" blocks). I think that nearly all concurrent
code would be unaffected by the change, and that the "feel" of
concurrent programming in that language would be just the same as in
Java itself. Of course some expressions and designs wouldn't carry
over unchanged, but most of them would. Yet in that language it
would be inescapably true that synchronisation protects the objects
-- that's all it /could/ do. Since that language would be
essentially the same as Java, I think that supports the idea that
"synchronisation protects the objects" in Java too.

Incidentally, the above is why I don't think that your examples of
cases where "synchronized" /isn't/ protecting any obvious object, or
anything that would naturally be called "state", are very relevant
(although perfectly valid). If such cases are rare (as they are) and
if they can be handled gracefully in the modified Java (as they could
be) then (IMO) it's clear that they are not central for understanding
either what Java synchronisation /is/ or what it's /for/. In short,
they are a distraction (and a source of confusion for students).

In the Java variant, your example of writing ungarbled lines to
System.out could not be solved directly. Two possible approaches
would be to implement a Semaphore class (with the classic P and V
operations), or to create a new object that "knew" its job was to
accept data piecemeal and write it out in coherent lines. In either
case, what you have done is create a new object that explicitly
accepts responsibility for "managing" access to the resource. To me
that seems like a better design than unstructured "ad-hoc"
synchronisation.

Returning to the case where several objects need to coordinate so as
to maintain the coherency of their shared state. This is, of course,
something that can be handled by using shared lock (or "monitor")
objects. But I wonder how many such cases would be better handled by
finding higher level ways of expressing the coordination. Using
higher-level abstractions seems better to me than "synchronization
spaghetti".

To finish with yet another digression: this is why I wouldn't be
particularly bothered if a junior programmer, who did understand
objects and "synchronized" correctly, did not know about
"synchronized(someOtherObject) {". There are simply too few
legitimate places to use it,

This is fundamental to the understanding of where the locks are kept. Once
the student fully understands that the lock is actually "kept" somewhere,
either in an instance or in a class for statics, then the rest is far less
likely to confuse him.
 
xarax

Thomas G. Marshall said:
This post is huge.

/snip huge post/

mutual exclusion provides an atomic view of
the state of a set of data. mutual exclusion
is used when there is the possibility of
multiple distinct units of work viewing or
altering discrete pieces of data such that
a consistent state is presented to the clients
of that data. each "client" is typically a
thread, which represents a "context", but
can also be owned by a distinct process.

protecting lines of code is the wrong way
to think about concurrent programming and
mutual exclusion. protecting the consistency
of a data state is the whole point.

in the old days, mutual exclusion was called
"serialization", because concurrent work units
were "serialized" one at a time through a
piece of program logic that required a consistent
view of the data being operated upon. now it's
called "synchronization"; "serializing" now
means something entirely different (what I
would call "data persistence").

a programmer can apply "synchronized" blocks
to any piece of code. whether such synchronization
is effective depends on the design of the
application.

public synchronized int getFubar()
{
    return 12;
}

the above instance method is uselessly synchronized.
the code is protected against multiple threads
executing concurrently, but there is no point to
that. there is no data state to protect. synchronization
in the absence of protecting data state consistency
is pointless. therefore, it is pointless to focus
on the effect of synchronization upon pieces of code.
it's all about the consistency of the data state.
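
contrast that with a case where there *is* data state to protect (a minimal
sketch; the names are made up):

public class HitCounter
{
    private int hits = 0;

    // without "synchronized", two threads calling bump() can interleave
    // the read-increment-write of hits and lose updates; with it, the
    // shared data state stays consistent.
    public synchronized void bump()
    {
        hits++;
    }

    public synchronized int getHits()
    {
        return hits;
    }
}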


--
----------------------------
Jeffrey D. Smith
Farsight Systems Corporation
24 BURLINGTON DRIVE
LONGMONT, CO 80501-6906
http://www.farsight-systems.com
z/Debug debugs your Systems/C programs running on IBM z/OS for FREE!
 
Thomas G. Marshall

Lee Fesperman coughed up:
You twisted my words to mock them as foolish.

It was a direct quote, without snipping. If your unnecessary usage of the
word /asinine/ makes you appear foolish, then it is your own fault.

You didn't add anything
substantive to the discussion.

Oh, and using the word "asinine" did?

More word play? You're saying that protecting state is a technique
used for the purpose of synchronizing threads?


Yep, I realize that nothing anyone can say will change your mind.
OTOH, all it would take to convince me is a useful case where
synchronization is needed except for protecting external state ...
you may need to provide details, though.


Again, you are twisting my words for no reasonable purpose. I haven't
said that synchronization is unimportant in protecting state ... it
is a *very* important technique. You're not just saying that
mentioning state is unimportant in discussing synchronization but
that it would do no good.

What comes first in understanding the "state" to which you refer is how to
preserve it. To do that properly you need to understand precisely what
lines of code can damage it.

I'm explaining why you use synchronization which naturally leads to
where and how to apply it. If you don't know the purpose of
synchronizing, you won't know where to apply it. Avoiding concurrent
execution is too vague a goal. Why do you need to avoid concurrent
execution?

A "goal"?

Controlling concurrent execution is /precisely/ the technique. "Preserving
external state" is FAR too vague a goal. How do you preserve such state
(with regards to multi-threading)? By carefully analyzing and controlling
the concurrent execution.

You've refused to provide details. That makes it rather difficult to
pick one. Nevertheless, I will respond ---

If you can't interrupt (delay) this thread and the thread doesn't
affect external state, who would ever know if you did interrupt it?
If it has no effects, interrupting it would have no effect.


Ok. You meant synchronization block.

No, I said:

ME:
And it is those lines of code /within/ such synchronization
structures that are protected.

The synchronization structures would include the explicit synchronization
blocks and synchronization methods. The structure of a program is a fairly
common CS term.

Perhaps you could have said
'synchronization control structure'.

A synchronization control structure /is/ a synchronization structure.

Further imprecision in your
statements made it necessary to clarify.

You sure you want to say this? Or am I twisting your words again?

To continue: By /within/, do you mean --- lines of code textually
inside the braces of a synchronization block (or synchronized method)?

The execution of such lines of code. Yep.
 
Thomas G. Marshall

xarax coughed up:
"Thomas G. Marshall"


/snip huge post/

mutual exclusion provides an atomic view of
the state of a set of data. mutual exclusion
is used when there is the possibility of
multiple distinct units of work viewing or
altering discrete pieces of data such that
a consistent state is presented to the clients
of that data. each "client" is typically a
thread, which represents a "context", but
can also be owned by a distinct process.

protecting lines of code is the wrong way
to think about concurrent programming and
mutual exclusion. protecting the consistency
of a data state is the whole point.

You cannot protect any such state without an understanding of the lines of
code that can damage it.

That's like saying that in this case:

public class Thing
{
    public void method()
    {
        ...broken algorithm...
    }
}

that protecting the object state is the whole point. No. Supplying lines
of code within method() are the point. You cannot have a properly behaving
object without it.

Again, refer to the abstraction ladder I posted elsethread.

in the old days, mutual exclusion was called
"serialization", because concurrent work units
were "serialized" one at a time through a
piece of program logic that required a consistent
view of the data being operated upon. now it's
called "synchronization"; "serializing" now
means something entirely different (what I
would call "data persistence").

a programmer can apply "synchronized" blocks
to any piece of code. whether such synchronization
is effective depends on the design of the
application.

public synchronized int getFubar()
{
    return 12;
}

This is an example of misusing the synchronization keyword. So what?

the above instance method is uselessly synchronized.
the code is protected against multiple threads
executing concurrently, but there is no point to
that. there is no data state to protect. synchronization
in the absence of protecting data state consistency
is pointless. therefore, it is pointless to focus
on the effect of synchronization upon pieces of code.
it's all about the consistency of the data state.

The consistency of the data state /cannot/ be maintained in a multithreaded
situation unless you analyze what part of the object's executable code can
damage it.
 
Thomas G. Marshall

Thomas G. Marshall coughed up:
xarax coughed up:

You cannot protect any such state without an understanding of the
lines of code that can damage it.

That's like saying that in this case:

public class Thing
{
    public void method()
    {
        ...broken algorithm...
    }
}

that protecting the object state is the whole point. No. Supplying
....working...


lines of code within method() are the point. You cannot have a
properly behaving object without it.

Again, refer to the abstraction ladder I posted elsethread.



This is an example of misusing the synchronization keyword. So what?



The consistency of the data state /cannot/ be maintained in a multithreaded
situation unless you analyze what part of the object's executable code can
damage it.
 
Thomas G. Marshall

Chris Uppal coughed up:

....[snip]...

To finish with yet another digression: this is why I wouldn't be
particularly bothered if a junior programmer, who did understand
objects and "synchronized" correctly, did not know about
"synchronized(someOtherObject) {". There are simply too few
legitimate places to use it,

I was going to let this go, but I just cannot. This last statement of yours
is particularly troubling. It's because I've had to fix a particular
mistake time and time again, and not just from junior engineers. From
senior engineers not taught properly, or some other reason.

Given this simple simple thread-safe object (I'll use the sync method idiom
for abbreviation, instead of "synchronized(this){}" ) :

public class Thing
{
    private int count = 0;
    public synchronized int get() { return count; }
    public synchronized void set(int val) { count = val; }
}

I've seen *over and over again* the following type of code:

//somewhere this sucker is created...
static Thing ourThing = new Thing();

// elsewhere...
// move our thing up a notch...
ourThing.set(ourThing.get() + 1);

This is a disaster waiting to happen in a multithreaded context.
The idiom that the interviewing engineer should be very comfortable with is:

// move our thing up a notch...
synchronized(ourThing)
{
    ourThing.set(ourThing.get() + 1);
}

Now again, in my particular teaching technique, the notion of synchronizing
on /this/ is taught after using objects dedicated to synchronization (and
then while stressing the block idiom, and not the method), but you need to
be aware that this kind of error I've illustrated is made /very/ often.

Particularly when the user grabs a thread safe list using
Collections.synchronizedList().
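
(The classic illustration: the synchronizedList() documentation itself says
the user must manually synchronize on the returned list when iterating over
it. A minimal sketch with invented names:)

import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;

public class Names
{
    private final List names = Collections.synchronizedList(new ArrayList());

    public void add(String name)
    {
        names.add(name);            // each individual call is thread safe
    }

    public int totalLength()
    {
        int length = 0;
        synchronized (names)        // but the compound iteration is not,
        {                           // unless the caller locks the list
            for (Iterator it = names.iterator(); it.hasNext(); )
            {
                length += ((String) it.next()).length();
            }
        }
        return length;
    }
}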

....[snip]...
 
Mike Schilling

Thomas G. Marshall said:
Chris Uppal coughed up:

...[snip]...

To finish with yet another digression: this is why I wouldn't be
particularly bothered if a junior programmer, who did understand
objects and "synchronized" correctly, did not know about
"synchronized(someOtherObject) {". There are simply too few
legitimate places to use it,

I was going to let this go, but I just cannot. This last statement of yours
is particularly troubling. It's because I've had to fix a particular
mistake time and time again, and not just from junior engineers. From
senior engineers not taught properly, or some other reason.

Given this simple simple thread-safe object (I'll use the sync method idiom
for abbreviation, instead of "synchronized(this){}" ) :

public class Thing
{
    private int count = 0;
    public synchronized int get() { return count; }
    public synchronized void set(int val) { count = val; }
}

I've seen *over and over again* the following type of code:

//somewhere this sucker is created...
static Thing ourThing = new Thing();

// elsewhere...
// move our thing up a notch...
ourThing.set(ourThing.get() + 1);

This is a disaster waiting to happen in a multithreaded context.
The idiom that the interviewing engineer should be very comfortable with is:

// move our thing up a notch...
synchronized(ourThing)
{
    ourThing.set(ourThing.get() + 1);
}

or
synchronized(CreatingClass.class)

which is a general way of locking the class's static state. (Though I
personally dislike this for the same reason as I do synchronized methods,
and would prefer:

private static final Object static_lock = new Object();
...
synchronized(static_lock) {..}
 
xarax

"Thomas G. Marshall" <[email protected]>
wrote in message /snip/
It is incorrect to think of synchronization as
protecting "code". It protects data. If there
were no code, there would be no processing and
no need for computers. Code is a "given".

public class Fubar
{
    protected int gonk;

    public synchronized void snafu()
    {
        gonk++;
    }

    public synchronized void gorko()
    {
        gonk--;
    }
}

Will the "synchronized" keyword prevent multiple
threads from concurrently executing the code? Say,
thread#1 calls snafu() and thread#2 calls gorko().

The answer is, "it depends". Maybe yes, maybe
no. Why? Because snafu() and gorko() are instance
methods, and the synchronization depends on which
instance each thread is referring to at the time
the method is called. it's all about the data, not
about the code.
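
a minimal sketch of that "it depends" (FubarDemo is just an invented driver
around the Fubar class above):

public class FubarDemo
{
    public static void main(String[] args)
    {
        final Fubar shared = new Fubar();
        final Fubar other = new Fubar();

        // same instance: both threads contend for one monitor,
        // so the gonk updates are serialized.
        new Thread(new Runnable() { public void run() { shared.snafu(); } }).start();
        new Thread(new Runnable() { public void run() { shared.gorko(); } }).start();

        // different instances: different monitors, so there is
        // no mutual exclusion between these two calls at all.
        new Thread(new Runnable() { public void run() { shared.snafu(); } }).start();
        new Thread(new Runnable() { public void run() { other.gorko(); } }).start();
    }
}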

Another way to describe it is synchronization
governs access to a resource. Data in memory can
be a resource. Files on a harddrive can be a
resource. There are synchronization mechanisms
to ensure the consistency of data and proper
access to resources. Any such synch mechanism
is a "protocol" that must be observed by all
threads that access the *same* resource. With
proper design and implementation, the resource
is protected from corruption.

mutex protects data, not code.
 
Thomas G. Marshall

Mike Schilling coughed up:
"Thomas G. Marshall"
Chris Uppal coughed up:

...[snip]...

To finish with yet another digression: this is why I wouldn't be
particularly bothered if a junior programmer, who did understand
objects and "synchronized" correctly, did not know about
"synchronized(someOtherObject) {". There are simply too few
legitimate places to use it,

I was going to let this go, but I just cannot. This last statement
of yours
is particularly troubling. It's because I've had to fix a particular
mistake time and time again, and not just from junior engineers.
From senior engineers not taught properly, or some other reason.

Given this simple simple thread-safe object (I'll use the sync method idiom
for abbreviation, instead of "synchronized(this){}" ) :

public class Thing
{
    private int count = 0;
    public synchronized int get() { return count; }
    public synchronized void set(int val) { count = val; }
}

I've seen *over and over again* the following type of code:

//somewhere this sucker is created...
static Thing ourThing = new Thing();

// elsewhere...
// move our thing up a notch...
ourThing.set(ourThing.get() + 1);

This is a disaster waiting to happen in a multithreaded
context. The idiom that the interviewing engineer should be very
comfortable with is:

// move our thing up a notch...
synchronized(ourThing)
{
    ourThing.set(ourThing.get() + 1);
}

or
synchronized(CreatingClass.class)

which is a general way of locking the class's static state.

Yep. As long as the internals are doing that as well, then there is no
worry about /what/ is holding the lock.

I've discovered that part of what confuses students is that in order to
externally synchronize on an object, that object needs to be 1. internally
synchronizing on /this/ (or on the class, or something else /known/, etc.),
and 2. well documented that it is doing so.

Not sure that there is a way around that in any OO language...

(Though I
personally dislike this for the same reason as I do synchronized
methods, and would prefer:

private static final Object static_lock = new Object();
...
synchronized(static_lock) {..}

Ditto (as if I had to say that :p )...
 
