Concurrency, Threads, Synchronization, and Deadlocks
Class: CSCE-314
Notes:
As our programs grow more complex, we often need them to do multiple things at once: handle multiple users, respond to input while computing, or download data while processing results. Concurrency allows us to divide work across multiple threads of execution, improving responsiveness and performance—but also introducing entirely new categories of bugs.
1. Why Concurrency Exists
Think about how people multitask: while cooking, you might boil water, chop vegetables, and preheat the oven. Each task proceeds semi-independently, but they share resources—your attention, the kitchen, and the stove. In computing, concurrency is the same idea: multiple tasks share the CPU and memory.
Historically, computers ran one program at a time, but modern systems include multi-core processors, multiple users, and background services. Operating systems use concurrency to give the illusion that many programs run simultaneously. In Java, concurrency means multiple threads executing within the same process.
2. Threads and Processes
A process is an independent program with its own memory space. Threads are smaller units of execution that live inside a process. They share memory and system resources, making communication between them easy—but also dangerous if not managed carefully.
Each Java program starts with one thread—the main thread—but you can create others using the Thread class or Runnable interface.
Thread t = new Thread(() -> System.out.println("Hello from another thread!"));
t.start(); // begins executing concurrently with the main thread
When a thread’s start() method is called, the JVM schedules it to run. The thread’s run() method defines what it does. Java’s thread scheduler interleaves thread execution, so operations may not occur in the order you expect. This leads us directly to one of the biggest issues in concurrency: race conditions.
Threads share memory, so communication between them is fast, but it is unsafe without synchronization.
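To make the lifecycle concrete, here is a minimal, self-contained sketch: two worker threads built from the same Runnable, started with start(), and waited on with join() so the main thread does not exit before they finish. The class and thread names are illustrative, not from the course materials.

```java
// Minimal sketch: create two threads from one Runnable and join them.
public class HelloThreads {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> System.out.println(
                "Hello from " + Thread.currentThread().getName());
        Thread t1 = new Thread(task, "worker-1");
        Thread t2 = new Thread(task, "worker-2");
        t1.start();          // schedule both threads to run concurrently
        t2.start();
        t1.join();           // block the main thread until t1 finishes
        t2.join();           // ...and until t2 finishes
        System.out.println("Main thread done");
    }
}
```

The order of the two "Hello from ..." lines is up to the scheduler; only "Main thread done" is guaranteed to print last, because of the joins.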
3. Race Conditions and Synchronization
Race conditions
A race condition occurs when two or more threads access shared data at the same time, and at least one of them modifies it. Because thread scheduling is unpredictable, the outcome can vary each time you run the program.
static int count = 0; // must be a field: a lambda can only capture effectively final locals
Thread t1 = new Thread(() -> count++);
Thread t2 = new Thread(() -> count++);
t1.start(); t2.start();
- What will the outcome be?
- It could be 1 or 2; you cannot predict which, because the result depends on how the scheduler interleaves the two increments.
Depending on timing, both threads might read the same initial value of count, increment it, and write back the same result—losing one update. This happens because ++ is not atomic: it involves reading, adding, and writing back.
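A single pair of increments rarely collides, so the lost update is easier to observe when each thread increments many times. The following sketch (class name is illustrative) usually prints a final value below 200000, showing that updates were lost:

```java
// Sketch of a race condition: two threads each increment a shared
// counter 100,000 times with no synchronization. Because count++ is
// a read-modify-write, increments can be lost when threads interleave.
public class RaceDemo {
    static int count = 0; // shared, unsynchronized

    public static void main(String[] args) throws InterruptedException {
        Runnable inc = () -> {
            for (int i = 0; i < 100_000; i++) count++;
        };
        Thread t1 = new Thread(inc);
        Thread t2 = new Thread(inc);
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println("count = " + count); // often less than 200000
    }
}
```

Run it several times: the result typically changes from run to run, which is the hallmark of a race condition.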
Synchronization
To prevent this, we synchronize access to shared data. Java provides the synchronized keyword to ensure only one thread can execute a block of code at a time:
synchronized(this) { count++; }
Synchronization uses locks associated with objects. While one thread holds the lock, others attempting to enter a synchronized block on the same object must wait.
- Note that if you synchronize everything, you are back to sequential (linear) execution and lose the benefit of threads
- Keep synchronized regions short
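Putting the two ideas together, a minimal thread-safe counter might look like this (class name is illustrative). Only the increment itself is synchronized, so the critical section stays short:

```java
// A minimal thread-safe counter: the object's intrinsic lock ensures
// only one thread at a time executes increment() or get().
public class SafeCounter {
    private int count = 0;

    public synchronized void increment() { count++; }
    public synchronized int get() { return count; }

    public static void main(String[] args) throws InterruptedException {
        SafeCounter c = new SafeCounter();
        Runnable inc = () -> {
            for (int i = 0; i < 100_000; i++) c.increment();
        };
        Thread t1 = new Thread(inc);
        Thread t2 = new Thread(inc);
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println("count = " + c.get()); // always 200000
    }
}
```

Unlike the unsynchronized version, this program prints the same result every run, because each increment happens entirely inside the lock.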
4. Deadlocks
When two or more threads wait for each other’s locks, none can proceed. This is called a deadlock. Imagine two friends who each have one half of a shared resource, and both refuse to release their half until they get the other’s.
Thread 1: lock(A); lock(B); // holds A, waits for B
Thread 2: lock(B); lock(A); // holds B, waits for A
Neither thread can continue because each is waiting for the other. Deadlocks occur when four conditions hold: mutual exclusion, hold and wait, no preemption, and circular wait.
Avoiding deadlocks
To avoid deadlocks, always acquire locks in a consistent order, minimize the use of synchronized regions, or use higher-level concurrency constructs like semaphores, ReentrantLocks with tryLock(timeouts), or concurrent data structures from java.util.concurrent.
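One of these techniques, tryLock with a timeout, can be sketched as follows (class and method names are illustrative). If the second lock cannot be acquired in time, the thread releases the first lock instead of waiting forever, breaking the hold-and-wait condition:

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

// Sketch of deadlock avoidance with ReentrantLock.tryLock(timeout):
// if both locks cannot be acquired, back off rather than block forever.
public class TryLockDemo {
    static final ReentrantLock lockA = new ReentrantLock();
    static final ReentrantLock lockB = new ReentrantLock();

    static boolean doWork() throws InterruptedException {
        if (lockA.tryLock(100, TimeUnit.MILLISECONDS)) {
            try {
                if (lockB.tryLock(100, TimeUnit.MILLISECONDS)) {
                    try {
                        return true; // both locks held: do the work here
                    } finally {
                        lockB.unlock();
                    }
                }
            } finally {
                lockA.unlock();
            }
        }
        return false; // could not get both locks; caller may retry later
    }
}
```

The finally blocks guarantee the locks are released on every path, including exceptions, which plain synchronized blocks also provide but explicit locks do not do automatically.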
5. Practical Applications
Concurrency is everywhere. Servers use threads to handle multiple client requests simultaneously. User interfaces rely on concurrency to stay responsive while performing background work. Scientific and data-intensive programs use concurrency to distribute computation across processor cores.
However, not every task benefits from threads. For example, too many threads can cause context-switching overhead and slow the system. Good design balances the throughput and responsiveness gains of concurrency against this overhead.
Examples:
- Web Servers: handle multiple clients simultaneously
- GUIs: keep interface responsive
- Data processing: parallel computations across cores
6. Command-Line Arguments and Threads
Sometimes a concurrent program needs input files or parameters. Java provides these as command-line arguments to the main method:
public static void main(String[] args) {
    for (String file : args) {
        System.out.println("Processing file: " + file);
    }
}
Each thread can be passed part of this work, for example one file per thread. This is an easy way to distribute tasks concurrently.
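A one-file-per-thread version of the loop above might look like this sketch (class name and the shared list are illustrative; a concurrent collection is used so the workers can record results safely):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Sketch: one worker thread per command-line argument. Each worker
// "processes" its file (here, just printing and recording the name),
// and main() joins all workers before exiting.
public class PerFileThreads {
    static final List<String> processed = new CopyOnWriteArrayList<>();

    public static void main(String[] args) throws InterruptedException {
        Thread[] workers = new Thread[args.length];
        for (int i = 0; i < args.length; i++) {
            final String file = args[i]; // each lambda captures its own file
            workers[i] = new Thread(() -> {
                System.out.println("Processing file: " + file);
                processed.add(file);
            });
            workers[i].start();
        }
        for (Thread t : workers) {
            t.join(); // wait for every worker to finish
        }
        System.out.println("Processed " + processed.size() + " files");
    }
}
```

For real workloads, a thread pool (ExecutorService) scales better than one raw thread per file, since the number of arguments is unbounded.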
7. Summary
- Concurrency introduces both opportunity and complexity.
- Threads let us overlap work and improve performance, but shared data access requires discipline.
- Synchronization prevents race conditions but can introduce waiting or even deadlocks.
- Effective use of concurrency requires understanding how threads share memory, when to lock, and how to design systems that avoid circular dependencies.
By the end of this week, you should understand the theory behind concurrency and be ready to implement multi-threaded Java programs safely and effectively.
Further Reading – Java Java Java: Object-Oriented Problem Solving
- Chapter 14 – Multithreading and Concurrency
- Chapter 15 – Synchronization and Deadlocks
- Chapter 16 – Concurrent Utilities in Java (not really)
These chapters provide detailed examples and diagrams explaining how Java manages multiple threads and prevents unsafe interactions.