ThreadPoolExecutor with blocking execute?


castillo.bryan

I thought I could use a ThreadPoolExecutor for a producer/consumer
relationship. I wanted a fixed queue size for the pool, with the
producer side blocking when the queue was full until a slot opened up.
I can see that a RejectedExecutionHandler is called when the queue is
full, and there are some pre-existing handlers that drop the Runnable
or run it in the current thread, but none that wait until a slot is
free. Running the Runnable in the current thread is pretty close, but
if multiple slots open up while the current thread is busy with a
Runnable, it can't hand more tasks to the waiting threads.

So I wrote this simple class to block until a slot is free. Does this
seem reasonable? Does something like this already exist in the JDK that
I missed?



import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;


public class BlockingThreadPoolExecutor extends ThreadPoolExecutor {

    // Invoked when the bounded queue rejects a task: instead of throwing,
    // do a blocking put() so the submitting thread waits for a free slot.
    private static class BlockingQueuePut implements RejectedExecutionHandler {
        public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
            try {
                executor.getQueue().put(r);
            } catch (InterruptedException ie) {
                throw new RejectedExecutionException(ie);
            }
        }
    }

    public BlockingThreadPoolExecutor(int coreThreadSize, int maxThreadSize, int queueSize) {
        super(coreThreadSize,
              maxThreadSize,
              5,                 // keep-alive (seconds) for threads above the core size
              TimeUnit.SECONDS,
              new ArrayBlockingQueue<Runnable>(queueSize),
              new BlockingQueuePut());
    }

}
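
Used roughly like this (just a quick sketch of the intended usage; the
numbers are arbitrary):

BlockingThreadPoolExecutor executor = new BlockingThreadPoolExecutor(2, 4, 10);
for (int i = 0; i < 100; i++) {
    // Once the queue holds 10 tasks and the pool is at its maximum of 4
    // threads, execute() blocks here until a worker frees a queue slot.
    executor.execute(new Runnable() {
        public void run() {
            // do some work
        }
    });
}
executor.shutdown();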
 
W

wesley.hall

I thought I could use a ThreadPoolExecutor for a producer/consumer
relationship. I wanted a fixed queue size for the pool, with the
producer side blocking when the queue was full until a slot opened up.

<snip code>

What's wrong with this?:

BlockingQueue<Runnable> fixedSizeQueue = new ArrayBlockingQueue<Runnable>(size);
Executor executor = new ThreadPoolExecutor(........., fixedSizeQueue);

Just add tasks to the fixedSizeQueue, which will block if the queue
is full?

Seems much simpler to me. Why doesn't this solve your problem?
 

castillo.bryan

What's wrong with this?:

BlockingQueue<Runnable> fixedSizeQueue = new ArrayBlockingQueue<Runnable>(size);
Executor executor = new ThreadPoolExecutor(........., fixedSizeQueue);

Just add tasks to the fixedSizeQueue, which will block if the queue
is full?

Seems much simpler to me. Why doesn't this solve your problem?

No, by default ThreadPoolExecutor does not block when the queue is
full. It throws a RejectedExecutionException.

If you run the code below you will see that happen.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;


public class TestExecutorService {

    public static void runTest(ExecutorService executor, final long sleepTime, int itemsToRun)
            throws InterruptedException
    {
        System.err.println("Starting test.");
        for (int i = 0; i < itemsToRun; i++) {
            final int id = i + 1;
            System.err.println("enqueing item " + id + ".");
            executor.execute(new Runnable() {
                public void run() {
                    System.err.println("Running " + id);
                    try {
                        Thread.sleep(sleepTime);
                    } catch (InterruptedException ie) {}
                    System.err.println("Finished " + id);
                }
            });
        }
        System.err.println("Waiting for shutdown.");
        executor.shutdown();
        while (!executor.awaitTermination(30, TimeUnit.SECONDS)) {
            ; // do nothing
        }
    }

    public static void main(String[] args) {
        try {
            ExecutorService executor = new ThreadPoolExecutor(1, 10, 5,
                    TimeUnit.SECONDS, new ArrayBlockingQueue<Runnable>(10));
            //ExecutorService executor = new BlockingThreadPoolExecutor(1, 10, 5);
            runTest(executor, 1000, 50);
        }
        catch (Exception e) {
            e.printStackTrace();
            System.exit(1);
        }
    }

}
 

wesley.hall

No, by default ThreadPoolExecutor does not block when the queue is
full. It throws a RejectedExecutionException.

If you run the code below you will see that happen.

<snip code>

I see what you mean. What you need to do is use a second queue to
manage your flow control. I wrote a quick example...

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executor;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.LinkedBlockingQueue;

public class ThreadPoolExecutorBlockTest
{
    public static void main(String[] args)
    {
        // Bounded hand-off queue that the producer puts tasks into.
        final BlockingQueue<Runnable> queue = new ArrayBlockingQueue<Runnable>(20, true);
        final Executor executor = new ThreadPoolExecutor(10, 10, 1000,
                TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>());

        // Bridge thread: takes from the bounded queue and feeds the executor.
        new Thread(new Runnable()
        {
            public void run()
            {
                while (true)
                {
                    try
                    {
                        executor.execute(queue.take());
                    }
                    catch (InterruptedException e)
                    {
                        // Ignore and repeat loop
                    }
                }
            }
        }).start();

        for (int i = 0; i < 30; i++)
        {
            try
            {
                queue.put(new Printer(i));
            }
            catch (InterruptedException e)
            {
                e.printStackTrace();
            }
        }
    }

    private static class Printer implements Runnable
    {
        private int number;

        public Printer(int number)
        {
            this.number = number;
        }

        public void run()
        {
            System.out.println("Running task: " + number);
            try
            {
                Thread.sleep(10000);
            }
            catch (InterruptedException e)
            {
                e.printStackTrace();
            }
        }
    }
}
 

castillo.bryan

No, by default ThreadPoolExecutor does not block when the queue is
full. It throws a RejectedExecutionException.

If you run the code below you will see that happen.

<snip code>

I see what you mean. What you need to do is use a second queue to
manage your flow control. I wrote a quick example...

<snip code>



Yeah, but that code will basically be in a busy wait loop. It will
constantly have an exception thrown, recaught and retried. By using a
blocking put on the queue (my first post), the thread can yield itself
until it can actually do something. I think your example would eat the
CPU and is more complex than the first example I had.
 

Daniel Dyer

Yeah, but that code will basically be in a busy wait loop. It will
constantly have an exception thrown, recaught and retried. By using a
blocking put on the queue (my first post), the thread can yield itself
until it can actually do something. I think your example would eat the
CPU and is more complex than the first example I had.

You probably need to write your own BlockingQueue implementation to get
the behaviour you want. SynchronousQueue would be a good place to start;
it kind of does what you want, except it returns immediately from its
offer() method instead of blocking, and thus leads to the task being
rejected. The overloaded offer method that takes a timeout is closer
still. With a long enough timeout, it would effectively block.
Unfortunately, the ThreadPoolExecutor class does not seem to make use of
this offer method. However, it would be trivial to write a wrapper for
SynchronousQueue that implements offer(E) by delegating to the other
offer method with a suitably long timeout.
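
A rough, untested sketch of the sort of thing I mean (extending
SynchronousQueue rather than wrapping it, purely for brevity; the class
name is made up):

import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.TimeUnit;

public class BlockingOfferSynchronousQueue<E> extends SynchronousQueue<E>
{
    private final long timeout;
    private final TimeUnit unit;

    public BlockingOfferSynchronousQueue(long timeout, TimeUnit unit)
    {
        super(true); // fair ordering
        this.timeout = timeout;
        this.unit = unit;
    }

    // Delegate the plain offer to the timed offer, so that
    // ThreadPoolExecutor.execute() waits (up to the timeout) for a worker
    // to accept the task instead of rejecting it straight away.
    public boolean offer(E e)
    {
        try
        {
            return offer(e, timeout, unit);
        }
        catch (InterruptedException ie)
        {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}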

Dan.
 

castillo.bryan

Daniel said:
You probably need to write your own BlockingQueue implementation to get
the behaviour you want. SynchronousQueue would be a good place to start;
it kind of does what you want, except it returns immediately from its
offer() method instead of blocking, and thus leads to the task being
rejected. The overloaded offer method that takes a timeout is closer
still. With a long enough timeout, it would effectively block.
Unfortunately, the ThreadPoolExecutor class does not seem to make use of
this offer method. However, it would be trivial to write a wrapper for
SynchronousQueue that implements offer(E) by delegating to the other
offer method with a suitably long timeout.

The first example, in my first post, has something that works. But it
does this by accessing the queue directly when an item is rejected. So
I have something that works; what I'm wondering is whether there is
anything wrong with the way I'm doing it.

I thought about trying it the way you are talking about and overriding
ArrayBlockingQueue's offer method to actually call put. For some
reason that gave me shivers... I know I could set it up in a way that
no other class could use that subclass though.
 

Daniel Dyer

The first example, in my first post, has something that works. But it
does this by accessing the queue directly when an item is rejected. So
I have something that works; what I'm wondering is whether there is
anything wrong with the way I'm doing it.
I thought about trying it the way you are talking about and overriding
ArrayBlockingQueue's offer method to actually call put. For some
reason that gave me shivers... I know I could set it up in a way that
no other class could use that subclass though.

I'll admit that my suggestion is not particularly elegant in terms of
implementation. However, I think there is something to be said for
encapsulating this behaviour within the queue implementation, particularly
if you want to re-use it, since you won't have to worry about setting a
RejectedExecutionHandler on every thread pool you create. So I'm
convinced that the custom queue implementation is the right approach, but
the question remains as to what form that implementation should take:
whether you should write something from scratch or adapt one of the
existing implementations (presumably SynchronousQueue or
ArrayBlockingQueue).

I'd overlooked the put method but, now that you mention it, I think this
has to be better than my suggestion of fudging the issue by calling offer
with a long timeout (unless, of course, you want a timeout). There is
nothing in the API documentation that suggests that a blocking
implementation of offer would be a bad thing.
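
In code, I imagine it would look something like this (just an untested
sketch; the class name is invented):

import java.util.concurrent.ArrayBlockingQueue;

public class PutOnOfferQueue<E> extends ArrayBlockingQueue<E>
{
    public PutOnOfferQueue(int capacity)
    {
        super(capacity);
    }

    // offer() delegates to the blocking put(), so ThreadPoolExecutor.execute()
    // blocks instead of rejecting when the queue is full. Note that because
    // offer() never returns false, the pool will never grow beyond its core size.
    public boolean offer(E e)
    {
        try
        {
            put(e);
            return true;
        }
        catch (InterruptedException ie)
        {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}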

Dan.
 

wesley.hall

Yeah, but that code will basically be in a busy wait loop. It will
constantly have an exception thrown, recaught and retried. By using a
blocking put on the queue (my first post), the thread can yield itself
until it can actually do something. I think your example would eat the
CPU and is more complex than the first example I had.

Huh?

There is no exception thrown, recaught and retried. Not sure where you
got this idea from. My code uses the blocking methods of the
BlockingQueue (which the executor does not). The CPU will not run hot
and an exception is not thrown unless the thread is interrupted (which
it is not under normal operation).

Personally, I don't like the idea of managing rejected execution
retroactively (as per your example). If you are happy with this and
prefer this approach then great. It is fairly subjective after all.

Dan's solution is nice. I was under the (incorrect) impression that the
offer method was documented as failing when the queue is full, but it
seems it isn't.
 

castillo.bryan

Huh?

There is no exception thrown, recaught and retried. Not sure where you
got this idea from. My code uses the blocking methods of the
BlockingQueue (which the executor does not). The CPU will not run hot
and an exception is not thrown unless the thread is interrupted (which
it is not under normal operation).


Sorry, I misread the code, I thought one of the comments indicated that
"more code should be written..." (And I didn't mean to sound like a
jerk - sorry)

So I ran your code, but it still doesn't actually limit the main
producing thread. The queue you used for the ThreadPool (not the other
one) is unbounded, so that's why you didn't get any exceptions. The
problem I had with an unbounded queue is that I could fill up memory.
If you put a print statement right after your for loop that puts items
on, you will see it gets through the loop very fast. The extra thread
and ArrayBlockingQueue don't help in limiting throughput.

Personally, I don't like the idea of managing rejected execution
retroactively (as per your example). If you are happy with this and
prefer this approach then great. It is fairly subjective after all.

I don't know that I like it either, but it seems like that was the
model intended by the API.
Dan's solution is nice. I was under the (incorrect) impression that the
offer method was documented as failing when the queue is full, but it
seems it isn't.

It is documented:
http://java.sun.com/j2se/1.5.0/docs/api/java/util/Queue.html#offer(E)
 

wesley.hall

Sorry, I misread the code, I thought one of the comments indicated that
"more code should be written..." (And I didn't mean to sound like a
jerk - sorry)

So I ran your code, but it still doesn't actually limit the main
producing thread. The queue you used for the ThreadPool (not the other
one) is unbounded, so that's why you didn't get any exceptions. The
problem I had with an unbounded queue is that I could fill up memory.
If you put a print statement right after your for loop that puts items
on, you will see it gets through the loop very fast. The extra thread
and ArrayBlockingQueue don't help in limiting throughput.


You are absolutely right - sorry, I must have been experiencing a mental
haywire when I wrote that code yesterday. It is probably best ignored.
I don't know that I like it either, but it seems like that was the
model intended by the API.

I don't think it was the model intended for implementing blocking
executors. It probably is the model intended to inform a process that a
task it provided cannot be executed due to heavy load (all threads
busy, and the pool at maximum).

Sure, but there is nothing there that says explicitly that 'offer'
cannot block - Dan mentioned this before. The ThreadPoolExecutor uses
queue.offer rather than queue.put when execute is called, so you need
offer to block. You could write an ArrayBlockingQueue subclass that
redirects all calls to 'offer' to 'put', which would have the desired
effect. I think I would prefer something like this....

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.Collection;

public class DefaultedOfferTimeoutBlockingQueue<E> extends ArrayBlockingQueue<E>
{
    private long defaultOfferTimeoutDuration;
    private TimeUnit defaultOfferTimeUnit;

    public DefaultedOfferTimeoutBlockingQueue(int capacity,
            long defaultOfferTimeoutDuration, TimeUnit defaultOfferTimeUnit)
    {
        super(capacity);
        this.defaultOfferTimeoutDuration = defaultOfferTimeoutDuration;
        this.defaultOfferTimeUnit = defaultOfferTimeUnit;
    }

    public DefaultedOfferTimeoutBlockingQueue(int capacity, boolean fair,
            long defaultOfferTimeoutDuration, TimeUnit defaultOfferTimeUnit)
    {
        super(capacity, fair);
        this.defaultOfferTimeoutDuration = defaultOfferTimeoutDuration;
        this.defaultOfferTimeUnit = defaultOfferTimeUnit;
    }

    public DefaultedOfferTimeoutBlockingQueue(int capacity, boolean fair,
            Collection<? extends E> initialElements,
            long defaultOfferTimeoutDuration, TimeUnit defaultOfferTimeUnit)
    {
        super(capacity, fair, initialElements);
        this.defaultOfferTimeoutDuration = defaultOfferTimeoutDuration;
        this.defaultOfferTimeUnit = defaultOfferTimeUnit;
    }

    public boolean offer(E element)
    {
        try
        {
            return offer(element, defaultOfferTimeoutDuration, defaultOfferTimeUnit);
        }
        catch (InterruptedException e)
        {
            //todo: probably should log something here
            return false;
        }
    }
}

This is a simple blocking queue that lets you set a default timeout for
the basic 'offer' method (rather than having it fail instantly).

With this class in place you can simply do this...

import java.util.concurrent.Executor;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ThreadPoolExecutorBlockTest
{
    public static void main(String[] args)
    {
        Executor executor = new ThreadPoolExecutor(5, 10, 10, TimeUnit.SECONDS,
                new DefaultedOfferTimeoutBlockingQueue<Runnable>(5, 86400, TimeUnit.SECONDS));

        for (int i = 0; i < 50; i++)
        {
            executor.execute(new Printer(i));
            System.out.println("Task " + i + " added");
        }
    }

    private static class Printer implements Runnable
    {
        private int number;

        public Printer(int number)
        {
            this.number = number;
        }

        public void run()
        {
            System.out.println("Running task: " + number);
            try
            {
                Thread.sleep(10000);
            }
            catch (InterruptedException e)
            {
                e.printStackTrace();
            }
        }
    }
}

This seems to work very well for me.

Note, though, that items are removed from the queue (freeing up a slot)
before the task is run, not after, so there can be up to (queueSize +
threadPoolSize) tasks sitting in the executor at any one time.
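
If you want to watch that, a rough sketch of a monitor (assuming the pool
above is held in a final ThreadPoolExecutor variable called pool rather
than a plain Executor) would be:

// Poll the pool once a second and print how many tasks are waiting in its
// queue and how many are currently being run by worker threads.
new Thread(new Runnable()
{
    public void run()
    {
        while (true)
        {
            System.out.println("queued=" + pool.getQueue().size()
                    + " active=" + pool.getActiveCount());
            try
            {
                Thread.sleep(1000);
            }
            catch (InterruptedException e)
            {
                return;
            }
        }
    }
}).start();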
 

castillo.bryan

I don't think it was the model intended for implementing blocking
executors. It probably is the model intended to inform a process that a
task it provided cannot be executed due to heavy load (all threads
busy, and the pool at maximum).

But there is a class CallerRunsPolicy (it implements
RejectedExecutionHandler), which handles the rejected Runnable by
executing it in the current thread. Using this class, you pretty much
get a blocking executor. So I think the RejectedExecutionHandler was
meant to be used for a variety of things, including processing the
Runnable.


http://java.sun.com/j2se/1.5.0/docs...rent/ThreadPoolExecutor.CallerRunsPolicy.html
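
Wiring it up is just a matter of passing the handler to the constructor -
something like this (a quick sketch; the pool sizes are arbitrary):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CallerRunsExample {

    public static void main(String[] args) {
        ThreadPoolExecutor executor = new ThreadPoolExecutor(2, 4, 5,
                TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(10),
                new ThreadPoolExecutor.CallerRunsPolicy());

        // When the queue is full and the pool is at its maximum size,
        // execute() runs the task in the calling thread instead of
        // rejecting it, which throttles the producer.
        for (int i = 0; i < 50; i++) {
            final int id = i;
            executor.execute(new Runnable() {
                public void run() {
                    System.out.println("Task " + id);
                }
            });
        }
        executor.shutdown();
    }
}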

Anyway, there are a couple of solutions here. I'm just left wondering
why there isn't direct support for a blocking ThreadPoolExecutor in the
API. (The documentation for BlockingQueue has a single-producer,
multiple-consumer example.) I think ThreadPoolExecutor would be the
perfect place to have that code already in place. It's always such a
pain writing your own, with proper shutdown etc....



<snip>
 
