Coroutines and the Event Loop
Why asyncio Matters
The Problem: Threads cost memory (roughly a megabyte of stack each by default), and the GIL makes them a poor fit for many small concurrent tasks.
The Solution: asyncio runs thousands of coroutines on one thread, switching cooperatively at every await — perfect for I/O-bound work like web servers, scrapers, and proxies.
Real Impact: Modern Python web frameworks (FastAPI, Starlette, aiohttp) are async-native — knowing asyncio can be the difference between handling hundreds and tens of thousands of requests per second on the same hardware.
Real-World Analogy
Think of asyncio as a single waiter juggling many tables:
- Coroutine = a table's order — paused while the kitchen cooks
- Event loop = the waiter who circles, checking which tables are ready
- await = leaving a table and serving someone else while their food cooks
- Task = an order placed but not yet awaited — work-in-flight
- gather / TaskGroup = submitting several orders at once and waiting for them all
asyncio runs many concurrent coroutines on a single thread. async def defines a coroutine; await yields control while waiting on I/O so other coroutines can run.
import asyncio

async def greet(name: str, delay: float):
    await asyncio.sleep(delay)
    print(f"hi {name}")

async def main():
    await greet("Alice", 1)
    await greet("Bob", 1)

asyncio.run(main())
# Takes ~2 seconds — sequential
To run them concurrently, schedule them as Tasks or pass them together to asyncio.gather.
Tasks and gather
async def main():
    # gather runs them concurrently and waits for all
    await asyncio.gather(
        greet("Alice", 1),
        greet("Bob", 1),
        greet("Carol", 1),
    )

asyncio.run(main())
# Takes ~1 second — all three sleep concurrently
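gather also returns each coroutine's result, in argument order regardless of which finished first. A minimal sketch (work is a stand-in coroutine, not part of the example above):

```python
import asyncio

async def work(tag: str, delay: float) -> str:
    # Sleep stands in for I/O; the slower coroutine finishes last
    await asyncio.sleep(delay)
    return tag

async def main() -> list[str]:
    # Results come back in argument order, not completion order
    return await asyncio.gather(work("slow", 0.2), work("fast", 0.1))

results = asyncio.run(main())
print(results)  # ['slow', 'fast']
```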
create_task for fire-and-track
async def main():
    task = asyncio.create_task(fetch(url))  # starts running immediately
    await do_other_stuff()
    result = await task  # now wait for the result
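fetch, url, and do_other_stuff above are placeholders; a self-contained sketch of the same pattern, with a sleep standing in for the network call:

```python
import asyncio

async def fetch(url: str) -> str:
    # Stand-in for a real network request
    await asyncio.sleep(0.1)
    return f"body of {url}"

async def do_other_stuff() -> None:
    await asyncio.sleep(0.05)

async def main() -> str:
    task = asyncio.create_task(fetch("https://example.com"))  # starts now
    await do_other_stuff()  # fetch makes progress in the background
    return await task       # wait for the result

result = asyncio.run(main())
print(result)  # body of https://example.com
```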
asyncio.TaskGroup (3.11+) — preferred
async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(fetch("url1"))
        tg.create_task(fetch("url2"))
    # All tasks awaited here; one failure cancels siblings and raises ExceptionGroup
Real Example: Concurrent HTTP Fetches
import asyncio, aiohttp

async def fetch(session, url: str) -> int:
    async with session.get(url) as resp:
        return resp.status

async def main(urls):
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*[fetch(session, u) for u in urls])
        return results

urls = [f"https://httpbin.org/delay/1?n={i}" for i in range(20)]
print(asyncio.run(main(urls)))
# 20 requests each take ~1s. Total: ~1s, not 20s.
⚠️ async libraries only
Inside async code you must use async-aware libraries: aiohttp not requests, asyncpg not psycopg, aiofiles not open. Calling sync I/O in async code blocks the event loop and kills concurrency.
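When you are stuck with a blocking call, asyncio.to_thread (3.9+) runs it in a worker thread so the event loop keeps scheduling other coroutines. A sketch with time.sleep standing in for sync I/O:

```python
import asyncio
import time

def blocking_read() -> str:
    # Stand-in for sync I/O (e.g. a requests.get call or a file read)
    time.sleep(0.2)
    return "data"

async def heartbeat(ticks: list[int]) -> None:
    # Keeps running while the blocking call sits in a worker thread
    for i in range(3):
        ticks.append(i)
        await asyncio.sleep(0.05)

async def main() -> tuple[str, list[int]]:
    ticks: list[int] = []
    # to_thread offloads blocking_read; the loop stays responsive
    data, _ = await asyncio.gather(asyncio.to_thread(blocking_read), heartbeat(ticks))
    return data, ticks

data, ticks = asyncio.run(main())
print(data, ticks)  # data [0, 1, 2]
```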
Timeouts and Cancellation
async def main():
    try:
        result = await asyncio.wait_for(slow_op(), timeout=5.0)
    except asyncio.TimeoutError:
        log("timed out")

    # 3.11+ alternative: asyncio.timeout as a context manager
    async with asyncio.timeout(5.0):
        result = await slow_op()
Cancellation
task = asyncio.create_task(long_op())
await asyncio.sleep(2)
task.cancel()  # requests cancellation
try:
    await task
except asyncio.CancelledError:
    log("cancelled")
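A self-contained sketch of the full cancellation flow, including cleanup inside the cancelled coroutine (a common pattern: catch CancelledError, clean up, re-raise). long_op here is a stand-in that just sleeps:

```python
import asyncio

trace: list[str] = []

async def long_op() -> None:
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        trace.append("cleaning up")
        raise  # re-raise so the task actually ends up cancelled

async def main() -> None:
    task = asyncio.create_task(long_op())
    await asyncio.sleep(0.05)  # let it start
    task.cancel()              # request cancellation
    try:
        await task
    except asyncio.CancelledError:
        trace.append("cancelled")

asyncio.run(main())
print(trace)  # ['cleaning up', 'cancelled']
```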
Async Iteration and Context Managers
# Async iteration with `async for`
async def stream_lines(reader):
    async for line in reader:
        process(line)

# Async generators: `yield` inside `async def`
async def tail(path):
    while True:
        line = await read_one(path)
        if line:
            yield line
        await asyncio.sleep(0.1)

async def main():
    async for line in tail("app.log"):
        print(line)
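tail depends on a hypothetical read_one; here is a fully self-contained async generator showing the same await-between-yields shape:

```python
import asyncio

async def countdown(n: int):
    # Async generator: awaits between yields
    while n > 0:
        await asyncio.sleep(0.01)
        yield n
        n -= 1

async def main() -> list[int]:
    # Async comprehensions work with async generators too
    return [v async for v in countdown(3)]

values = asyncio.run(main())
print(values)  # [3, 2, 1]
```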
# Async context managers — `async with`
class Session:
    async def __aenter__(self):
        await self.connect()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        await self.close()
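A runnable sketch of the Session pattern above, with stand-in connect and close methods (the originals are assumed, not shown in the text):

```python
import asyncio

class Session:
    def __init__(self) -> None:
        self.state = "new"

    async def connect(self) -> None:
        await asyncio.sleep(0.01)  # stand-in for a real handshake
        self.state = "open"

    async def close(self) -> None:
        await asyncio.sleep(0.01)
        self.state = "closed"

    async def __aenter__(self) -> "Session":
        await self.connect()
        return self

    async def __aexit__(self, exc_type, exc, tb) -> None:
        await self.close()

async def main() -> list[str]:
    states = []
    async with Session() as s:  # __aenter__ awaited on entry
        states.append(s.state)  # open inside the block
    states.append(s.state)      # closed after exit
    return states

states = asyncio.run(main())
print(states)  # ['open', 'closed']
```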
🎯 Practice Exercises
Exercise 1: Concurrent fetch
Fetch 100 URLs concurrently with aiohttp and asyncio.gather. Measure total time and compare to sequential.
Exercise 2: Timeout wrapper
Wrap a slow coroutine with asyncio.timeout(2.0). Verify it raises TimeoutError.
Exercise 3: Worker pool with Semaphore
Limit concurrent requests to 10 using asyncio.Semaphore.
Exercise 4: TaskGroup
Build a fetch pipeline using async with TaskGroup(). Make one task fail and observe sibling cancellation.