The Active Object Design Pattern is a concurrency pattern that decouples method invocation from method execution, allowing tasks to run asynchronously without blocking the caller.
At its core, an Active Object introduces a proxy that clients interact with. Instead of executing methods directly, the proxy places requests into a queue. A separate worker thread (or pool) processes these requests in the background. This creates a clean separation between what needs to be done and when/how it gets executed.
A typical Active Object system has four key components:
Proxy – exposes the interface to the client
Method Request – encapsulates a function call as an object or callable
Activation Queue – holds pending requests
Scheduler/Worker – executes requests asynchronously
This pattern is especially useful when:
You want to avoid blocking the main thread
You need controlled concurrency (e.g., limited worker threads)
You want to serialize access to shared resources safely
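Before walking through the pieces, here is a minimal, self-contained sketch of the whole pattern in Julia; `heavy_compute` is a stand-in for the real workload:

```julia
# A stand-in for the actual work being offloaded.
heavy_compute(n) = sum(sqrt(i) for i in 1:n)

# Activation queue: holds pending method requests (closures).
ao = Channel{Function}(32)

# Scheduler + servant: a worker task that drains the queue.
worker = Threads.@spawn for job in ao
    job()               # execute one method request
end

# Proxy side: the caller enqueues work instead of running it directly.
for i in 1:4
    put!(ao, () -> heavy_compute(10_000 + i))
end

close(ao)       # signal: no more requests
wait(worker)    # block until the remaining jobs are done
```

Each of the following sections maps one line of this sketch back to its classic Active Object role.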
Sequence Diagram:
Mapping the Code to Active Object Components
Let’s reinterpret the code piece by piece.
Proxy (Client Interface)
put!(ao, () -> heavy_compute(10^7 + i))
This is the proxy layer.
Why?
- The caller does not execute the method directly.

Instead, it:
- wraps the request as a function (a closure)
- submits it to a queue
In classic Active Object:
proxy.method_call() → enqueue request
In my code:
put!(ao, job_function)
So:
The Channel (ao) acts as the proxy interface.
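If you want a proxy that looks more like the classic object-oriented version, a thin wrapper type works too. In this sketch, `ActiveObject` and `submit!` are illustrative names, not part of the original code:

```julia
# A more explicit proxy: a wrapper type that hides the Channel behind
# a method-style API. (`ActiveObject` and `submit!` are illustrative.)
struct ActiveObject
    queue::Channel{Function}
end

# The proxy method: enqueue instead of execute.
submit!(ao::ActiveObject, f::Function) = put!(ao.queue, f)

ao = ActiveObject(Channel{Function}(32))
submit!(ao, () -> 2 + 2)    # the request is queued, not run
```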
Activation Queue
ch = Channel{Function}(32)
This is the Activation Queue.
- Holds pending method requests
- Thread-safe
- Decouples producer and consumer
Classic role:
Queue<Request>
My version:
Channel{Function}
Each Function = a method request object
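A quick demonstration that the Channel really does behave like a classic activation queue (bounded, FIFO, safe to share between tasks):

```julia
# Channel as activation queue.
ch = Channel{Function}(2)   # capacity 2: put! blocks once the buffer is full
put!(ch, () -> "first")
put!(ch, () -> "second")
take!(ch)()                 # requests come out in FIFO order → "first"
```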
Method Request
() -> heavy_compute(10^7 + i)
This is a Method Request object, just expressed as a closure.
In traditional OO:
class PrintTask : public MethodRequest {
    void execute() override { ... }
};
In Julia:
() -> heavy_compute(10^7 + i)
Key idea: the closure encapsulates everything in one object:
- what to do (the logic)
- the data it needs (i)
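That capturing behavior is the whole trick — no MethodRequest class needed:

```julia
# The closure version of a Method Request: one callable object that
# carries both the data and the logic.
i = 7
request = () -> i * i   # captures `i`
request()               # → 49
```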
Scheduler + Servant (Worker Threads)
Threads.@spawn begin
    for job in ch
        Base.invokelatest(job)
    end
end
This block plays two roles:
Scheduler
for job in ch
- Pulls requests from the queue
- Decides execution order (FIFO here)
This is the scheduler
Servant
Base.invokelatest(job)
- Actually executes the request
This is the servant
So each worker thread is:
[ Scheduler + Servant ]
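If it helps to see the two roles separately, here is the same loop with the scheduler and servant pulled apart into named functions (the names `serve` and `schedule_loop` are mine, not from the original code):

```julia
# The servant: actually executes one request.
serve(job::Function) = job()

# The scheduler: decides execution order by draining the queue (FIFO).
function schedule_loop(ch::Channel{Function})
    for job in ch
        serve(job)
    end
end

ch = Channel{Function}(8)
results = Int[]
t = Threads.@spawn schedule_loop(ch)
put!(ch, () -> push!(results, 1))
close(ch)
wait(t)
```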
Thread Pool (Multiple Active Object Workers)
for _ in 1:nworkers
Threads.@spawn ...
end
- Creates multiple workers
- All consume from the same queue
This is a multi-threaded Active Object
Classic pattern often has:
- 1 thread → 1 active object
My version:
N threads → shared activation queue
This is more like:
- Active Object + Thread Pool hybrid
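A runnable sketch of that hybrid — N workers sharing one activation queue, with an atomic counter standing in for real work:

```julia
# N workers draining one shared activation queue.
jobs = Channel{Function}(32)
done = Threads.Atomic{Int}(0)       # counts completed requests
nworkers = 4

tasks = map(1:nworkers) do _
    Threads.@spawn for job in jobs
        job()
    end
end

for _ in 1:10
    put!(jobs, () -> Threads.atomic_add!(done, 1))
end

close(jobs)          # no more requests
foreach(wait, tasks) # wait for the pool to drain the queue
done[]               # → 10: every request ran exactly once
```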
Lifecycle Control
Closing the queue
close(ao)
- Signals: no more requests
- Workers stop after finishing remaining jobs
Waiting for completion
foreach(wait, tasks)
- Ensures all scheduled work completes
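The close semantics are worth spelling out: already-queued requests are still delivered, but new submissions are rejected.

```julia
# What `close` actually does to the queue.
ch = Channel{Int}(2)
put!(ch, 1)
close(ch)
take!(ch)     # → 1: buffered items are still drained after close
isopen(ch)    # → false: a further put! would throw an InvalidStateException
```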