TL;DR:
- Stop using `Hashtable.Synchronized()`: it creates a single global lock that blocks all threads, even for reads.
- Use `ConcurrentDictionary<TKey, TValue>` for concurrent collections in .NET. It provides lock-free reads, fine-grained locking for writes, and atomic updates.
- Performance: with lock-free reads, `ConcurrentDictionary` significantly reduces contention and scales better in high-concurrency scenarios.
- Real-world impact: cleaner APIs, fewer race conditions, and better performance for caching, background jobs, and web applications.
Concurrent collections are critical in modern multi-threaded apps like ASP.NET Core or background workers. But not all thread safety is equal.
If you’re still using `Hashtable.Synchronized()` for thread-safe dictionaries in .NET, you’re missing out on significant performance gains and better concurrency patterns. `ConcurrentDictionary<TKey, TValue>` offers lock-free reads, better performance under load, and cleaner APIs that align with modern .NET development practices.
The Problem with Hashtable.Synchronized()
Here’s what many developers don’t realize: `Hashtable.Synchronized()` creates a thread-safe wrapper, but it’s essentially a single lock around the entire collection. Every operation, read or write, blocks all other threads.
```csharp
// Old approach - problematic in high-concurrency scenarios
var syncHashtable = Hashtable.Synchronized(new Hashtable());

// Every operation locks the entire collection
syncHashtable["user:123"] = userData;  // Thread 1 blocks everyone
var user = syncHashtable["user:456"];  // Thread 2 waits unnecessarily
```
This becomes a bottleneck when you have multiple threads frequently reading from the same collection—a common pattern in web applications handling concurrent requests.
ConcurrentDictionary: The Modern Thread-Safe Dictionary
`ConcurrentDictionary<TKey, TValue>` uses lock-free algorithms for reads and fine-grained locking for writes, allowing multiple threads to access different parts of the concurrent collection simultaneously.
```csharp
// Modern approach - optimized for concurrency
var concurrentCache = new ConcurrentDictionary<string, UserData>();

// Multiple threads can read simultaneously without blocking
var found1 = concurrentCache.TryGetValue("user:123", out var userData1);
var found2 = concurrentCache.TryGetValue("user:456", out var userData2);

// Atomic operations for safe updates
concurrentCache.AddOrUpdate("user:123",
    newUserData,                                            // Add if missing
    (key, existingValue) => UpdateUserData(existingValue)); // Update if exists
```
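One subtlety worth knowing about the factory overloads: under contention, `GetOrAdd` may invoke its value factory on several threads at once, but only one produced value is ever stored. A minimal sketch (the key name and value here are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class GetOrAddDemo
{
    static int _factoryCalls;

    static void Main()
    {
        var dict = new ConcurrentDictionary<string, int>();

        // Many threads race to create the same entry.
        Parallel.For(0, 100, _ =>
        {
            dict.GetOrAdd("answer", key =>
            {
                // The factory may run on more than one thread...
                Interlocked.Increment(ref _factoryCalls);
                return 42;
            });
        });

        // ...but every caller observes the single stored value.
        Console.WriteLine(dict["answer"]);     // 42
        Console.WriteLine(_factoryCalls >= 1); // True (may be > 1 under contention)
    }
}
```

If the factory has side effects (a database call, for example), this duplicate-invocation behavior matters; a common mitigation appears in the caching section below.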
Concurrency Visualization
```mermaid
flowchart TD
    subgraph "Hashtable.Synchronized()"
        A1[Thread 1 - Read] --> L1[Single Lock]
        A2[Thread 2 - Write] --> L1
        A3[Thread 3 - Read] --> L1
        L1 --> H1[Hashtable Operations]
        H1 --> R1[All threads wait in queue]
    end
    subgraph "ConcurrentDictionary"
        B1[Thread 1 - Read] --> S1[Segment 1]
        B2[Thread 2 - Write] --> S2[Segment 2]
        B3[Thread 3 - Read] --> S1
        B4[Thread 4 - Write] --> S3[Segment 3]
        S1 --> R2[Lock-free reads]
        S2 --> R3[Fine-grained write locks]
        S3 --> R3
    end
```
Think of `ConcurrentDictionary` as multiple checkout counters in a supermarket: customers can be served simultaneously. `Hashtable.Synchronized()` is like having one locked door that everyone must wait to pass through, regardless of what they’re doing inside.
Performance Comparison
| Feature | Hashtable.Synchronized | ConcurrentDictionary |
|---|---|---|
| Read performance | Blocks all threads | Lock-free, highly concurrent |
| Write performance | Single bottleneck | Fine-grained locking |
| Memory overhead | Wrapper + original collection | Optimized internal structure |
| API safety | Manual synchronization needed | Built-in atomic operations |
| Enumeration | Requires external locking | Safe concurrently (may reflect in-flight updates) |
| Thread safety | Global locking mechanism | True thread-safe dictionary design |
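To see the read-side difference for yourself, a rough sketch like the following can be run locally. This is not a rigorous benchmark (no warm-up, no statistical runs; for real measurements use BenchmarkDotNet), and the absolute numbers will vary by machine, but the contrast between a global lock and lock-free reads under parallel load is usually visible:

```csharp
using System;
using System.Collections;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Threading.Tasks;

class ContentionSketch
{
    static void Main()
    {
        const int Reads = 200_000;

        var sync = Hashtable.Synchronized(new Hashtable());
        sync["key"] = 1;

        var conc = new ConcurrentDictionary<string, int>();
        conc["key"] = 1;

        // Parallel reads against the globally locked wrapper
        var sw = Stopwatch.StartNew();
        Parallel.For(0, Reads, _ => { var v = sync["key"]; });
        Console.WriteLine($"Synchronized Hashtable: {sw.ElapsedMilliseconds} ms");

        // Parallel lock-free reads against ConcurrentDictionary
        sw.Restart();
        Parallel.For(0, Reads, _ => { conc.TryGetValue("key", out _); });
        Console.WriteLine($"ConcurrentDictionary:   {sw.ElapsedMilliseconds} ms");
    }
}
```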
Real-World Caching with Concurrent Collections
Here’s a production-ready user cache implementation that demonstrates the practical benefits of using a thread-safe dictionary:
```csharp
public class UserCacheService
{
    private readonly ConcurrentDictionary<string, UserData> _cache = new();
    private readonly IUserRepository _repository;

    public UserCacheService(IUserRepository repository)
    {
        _repository = repository;
    }

    public async Task<UserData> GetUserAsync(string userId)
    {
        // Lock-free read attempt
        if (_cache.TryGetValue(userId, out var cachedUser))
        {
            return cachedUser;
        }

        // Concurrent misses may each hit the database, but GetOrAdd
        // guarantees only one result is ever stored and returned
        var userData = await _repository.GetUserAsync(userId);
        return _cache.GetOrAdd(userId, userData);
    }

    public void InvalidateUser(string userId)
    {
        // Thread-safe removal
        _cache.TryRemove(userId, out _);
    }
}
```
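One caveat with the pattern above: if several threads miss the cache for the same key at once, each may call the repository before `GetOrAdd` runs. A common refinement is to cache `Lazy<Task<T>>` values so that only one fetch runs per key. The sketch below substitutes a `Func` for the hypothetical `IUserRepository` so it is self-contained; the class name `SingleFlightUserCache` is mine, not an API:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Wrap the async fetch in Lazy<Task<T>>: GetOrAdd may create several
// Lazy wrappers under contention, but only the stored one has its
// Value accessed, so the underlying fetch runs exactly once per key.
public class SingleFlightUserCache
{
    private readonly ConcurrentDictionary<string, Lazy<Task<string>>> _cache = new();
    private readonly Func<string, Task<string>> _fetch; // stands in for IUserRepository
    private int _fetchCount;

    public SingleFlightUserCache(Func<string, Task<string>> fetch) => _fetch = fetch;

    public int FetchCount => _fetchCount;

    public Task<string> GetUserAsync(string userId) =>
        _cache.GetOrAdd(userId, id => new Lazy<Task<string>>(() =>
        {
            Interlocked.Increment(ref _fetchCount);
            return _fetch(id);
        })).Value;
}

class Demo
{
    static async Task Main()
    {
        var cache = new SingleFlightUserCache(id => Task.FromResult("user-" + id));

        // Ten concurrent misses for the same key...
        var tasks = new Task<string>[10];
        for (int i = 0; i < tasks.Length; i++)
            tasks[i] = cache.GetUserAsync("123");
        await Task.WhenAll(tasks);

        // ...result in exactly one underlying fetch.
        Console.WriteLine(cache.FetchCount); // 1
        Console.WriteLine(tasks[0].Result);  // user-123
    }
}
```

The exactly-once guarantee comes from `Lazy<T>`'s default `ExecutionAndPublication` thread-safety mode, not from `GetOrAdd` itself.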
Common Pitfalls to Avoid
Enumeration Race Conditions
```csharp
// Dangerous with Hashtable.Synchronized - can throw exceptions
var syncHashtable = Hashtable.Synchronized(new Hashtable());
foreach (DictionaryEntry entry in syncHashtable) // Can fail if modified concurrently
{
    // Process entry
}

// Safe with ConcurrentDictionary - the enumerator never throws,
// though it may observe concurrent updates
var concurrentDict = new ConcurrentDictionary<string, object>();
foreach (var kvp in concurrentDict) // Always safe
{
    // Process kvp
}
```
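This safety is easy to demonstrate: the sketch below mutates the dictionary from a second thread while the first enumerates. Note that the enumerator is safe but is not a moment-in-time snapshot; it may or may not observe the writer's additions:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class EnumerationDemo
{
    static void Main()
    {
        var dict = new ConcurrentDictionary<int, int>();
        for (int i = 0; i < 1000; i++) dict[i] = i;

        // Add entries aggressively while another thread enumerates.
        var writer = Task.Run(() =>
        {
            for (int i = 1000; i < 2000; i++) dict[i] = i;
        });

        int seen = 0;
        foreach (var kvp in dict) seen++; // no InvalidOperationException
        writer.Wait();

        // A plain Dictionary<TKey, TValue> would typically throw here.
        Console.WriteLine("enumeration completed safely");
    }
}
```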
Assuming Complete Thread Safety
Even with `Hashtable.Synchronized()`, compound operations aren’t atomic:
```csharp
// This is NOT thread-safe, even with Synchronized()
if (syncHashtable.Contains(key))
{
    syncHashtable[key] = newValue; // Another thread might remove the key here
}

// ConcurrentDictionary provides atomic operations
concurrentDict.AddOrUpdate(key, newValue, (k, v) => newValue);
```
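The classic victim of non-atomic check-then-act is a shared counter: read, increment, write back, and lose updates when threads interleave. `AddOrUpdate` folds the read-modify-write into one atomic operation, so no increment is lost (the key name `"hits"` is illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class AtomicUpdateDemo
{
    static void Main()
    {
        var counters = new ConcurrentDictionary<string, int>();

        // 1000 concurrent increments: AddOrUpdate retries internally
        // until its compare-and-swap succeeds, so no update is lost.
        Parallel.For(0, 1000, _ =>
            counters.AddOrUpdate("hits", 1, (key, current) => current + 1));

        Console.WriteLine(counters["hits"]); // 1000
    }
}
```

The naive equivalent, `counters["hits"] = counters["hits"] + 1`, would typically print less than 1000 under the same load.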
Key Takeaways
When you need a thread-safe dictionary in modern .NET applications, `ConcurrentDictionary<TKey, TValue>` is the right choice. It eliminates the single-lock bottleneck of `Hashtable.Synchronized()`, offering:
- Lock-free reads
- Fine-grained write locks
- Atomic operations like `AddOrUpdate` and `GetOrAdd`
- Safe enumeration without extra locking
In short, `ConcurrentDictionary` is faster, safer, and simpler to use. If you’re still wrapping `Hashtable` in `Synchronized()`, now’s the time to modernize.