Lesson 21 • Advanced
AsyncIO: Event Loop, Tasks & Futures
AsyncIO is the backbone of asynchronous programming in Python. To build high-performance systems — APIs, websocket servers, scrapers, automation pipelines, or distributed workers — you must fully understand how the event loop works, how Tasks provide concurrency, and how Futures act as low-level building blocks.
What You'll Learn in This Lesson
- How the asyncio event loop schedules and switches between coroutines
- The difference between Tasks and Futures, and when to use each
- How to run multiple async operations concurrently with asyncio.gather
- How to create and cancel Tasks, and handle timeouts safely
- How to use asyncio.Queue for producer-consumer pipelines
- Common async patterns used in real APIs, scrapers, and websocket servers
🔥 1. What Exactly Is the Event Loop?
The event loop is a scheduler that repeatedly:
- Picks an awaitable that is ready to run
- Executes a small portion of it
- Pauses it when it awaits I/O
- Switches to the next ready task
- Handles callbacks, timers, and I/O events
It's the "orchestra conductor" of asynchronous execution.
Running the Event Loop

Basic asyncio event loop example

```python
import asyncio

async def main():
    print("Event loop running!")

asyncio.run(main())
```

asyncio.run() does:
- Create an event loop
- Run the coroutine to completion
- Clean up the loop
- Close the loop
⚙️ 2. Creating Coroutines (The Basics)
A coroutine is a function that can be paused:
Creating Coroutines

Basic coroutine definition

```python
async def fetch_user():
    await asyncio.sleep(1)
    return {"name": "Alice"}
```

Coroutines don't run until awaited or turned into a Task.
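To see this in action: calling a coroutine function does not execute it; it returns a coroutine object that only runs once awaited. A minimal sketch:

```python
import asyncio

async def fetch_user():
    await asyncio.sleep(0.1)
    return {"name": "Alice"}

async def main():
    coro = fetch_user()          # nothing has run yet; this is a coroutine object
    print(type(coro).__name__)   # coroutine
    user = await coro            # execution actually happens here
    print(user)

asyncio.run(main())
```

This is also why forgetting `await` produces the familiar "coroutine was never awaited" warning.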
🧠 3. Tasks — The Core of Concurrency
A Task wraps a coroutine and schedules it on the event loop so it runs concurrently.
Creating Tasks

Running tasks concurrently

```python
async def work():
    await asyncio.sleep(1)
    return "done"

async def main():
    task = asyncio.create_task(work())
    print("Task started...")
    result = await task
    print(result)

asyncio.run(main())
```

Key behavior:
- create_task() schedules the coroutine on the event loop immediately
- Its execution overlaps with the rest of the program
- Awaiting the task retrieves its result
This is how we achieve concurrency in a single thread.
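The overlap is easy to measure. A small sketch (the `work` coroutine and its label argument are illustrative) that times two one-second tasks:

```python
import asyncio
import time

async def work(label):
    await asyncio.sleep(1)
    return label

async def main():
    start = time.monotonic()
    t1 = asyncio.create_task(work("first"))   # both tasks start now
    t2 = asyncio.create_task(work("second"))
    results = [await t1, await t2]            # both sleeps overlap
    elapsed = time.monotonic() - start
    print(results, f"~{elapsed:.1f}s")        # roughly 1s total, not 2s
    return elapsed

elapsed = asyncio.run(main())
```

If the two calls were awaited sequentially without create_task(), the total would be about 2 seconds instead.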
⚡ 4. Running Multiple Tasks Concurrently
asyncio.gather() runs many tasks at once:
asyncio.gather

Running multiple tasks concurrently

```python
async def a():
    await asyncio.sleep(1)
    return "A"

async def b():
    await asyncio.sleep(1)
    return "B"

async def main():
    results = await asyncio.gather(a(), b())
    print(results)

asyncio.run(main())
```

Total runtime: 1 second, not 2.
Use it when:
- Fetching from many APIs
- Processing many files
- Running many workers
- Web scraping
- Database batch loading
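When some of those jobs may fail, `gather(..., return_exceptions=True)` lets you collect errors alongside results instead of having the first exception propagate. A minimal sketch (the `ok`/`boom` coroutines are illustrative):

```python
import asyncio

async def ok():
    await asyncio.sleep(0.1)
    return "ok"

async def boom():
    await asyncio.sleep(0.1)
    raise ValueError("failed")

async def main():
    # With return_exceptions=True, failures are returned as exception
    # objects in the results list rather than raised.
    results = await asyncio.gather(ok(), boom(), return_exceptions=True)
    print(results)
    return results

results = asyncio.run(main())
```

This is the usual pattern for batch fetches where one bad URL should not abort the whole batch.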
🌀 5. Futures — Low-Level Awaitables
A Future represents a placeholder for a value that isn't available yet.
You rarely create Futures manually, but Tasks and event-loop internals rely on them.
Create a Future:
Working with Futures

Low-level awaitable objects

```python
import asyncio

async def main():
    loop = asyncio.get_running_loop()
    future = loop.create_future()
    loop.call_later(1, future.set_result, "Future complete")
    print(await future)

asyncio.run(main())
```

This teaches two critical things:
- Futures hold results that arrive later
- Callbacks can resolve Futures
Tasks are built on Futures — every Task is a subclass of Future.
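You can verify that relationship directly, and use the Future API (like add_done_callback) on any Task. A small sketch (the `job` coroutine is illustrative):

```python
import asyncio

async def job():
    await asyncio.sleep(0.1)
    return 42

def on_done(fut):
    # Runs when the underlying Future resolves.
    print("callback saw:", fut.result())

async def main():
    task = asyncio.create_task(job())
    print(isinstance(task, asyncio.Future))  # True: Task subclasses Future
    task.add_done_callback(on_done)
    return await task

result = asyncio.run(main())
```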
⏳ 6. Understanding How Tasks Progress
A task runs until it hits an await that yields control:
Task Progression

How tasks interleave execution

```python
async def step1():
    print("Step 1")
    await asyncio.sleep(1)
    print("Step 1 done")

async def step2():
    print("Step 2")
    await asyncio.sleep(1)
    print("Step 2 done")

async def main():
    await asyncio.gather(step1(), step2())

asyncio.run(main())
```

Execution flow:
- Step 1 runs → hits sleep → yields
- Step 2 runs → hits sleep → yields
- Event loop resumes Step 1 and Step 2
- Both finish
This overlapping execution is concurrency.
🧩 7. Task Cancellation
Every real system must handle cancellations:
Task Cancellation

Gracefully cancelling tasks

```python
async def worker():
    try:
        while True:
            await asyncio.sleep(1)
            print("Working...")
    except asyncio.CancelledError:
        print("Task cancelled!")

async def main():
    task = asyncio.create_task(worker())
    await asyncio.sleep(3)
    task.cancel()
    await task

asyncio.run(main())
```

Used in:
- Timeouts
- Shutting down servers
- Stopping background loops gracefully
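One refinement worth knowing: if a coroutine catches CancelledError and swallows it, the task finishes "normally" rather than as cancelled. The usual pattern is to clean up and then re-raise. A sketch:

```python
import asyncio

async def worker():
    try:
        while True:
            await asyncio.sleep(0.1)
    except asyncio.CancelledError:
        print("cleaning up...")   # release resources here
        raise                     # re-raise so the task is marked cancelled

async def main():
    task = asyncio.create_task(worker())
    await asyncio.sleep(0.3)
    task.cancel()
    try:
        await task                # re-raised CancelledError surfaces here
    except asyncio.CancelledError:
        pass
    return task.cancelled()

cancelled = asyncio.run(main())
print(cancelled)  # True
```

Without the `raise`, `task.cancelled()` would report False, which can confuse shutdown logic.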
🧱 8. Task Groups (Python 3.11+)
One of the newest and cleanest APIs:
Task Groups

Structured concurrency in Python 3.11+

```python
async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(fetch_data())
        tg.create_task(fetch_user())
```

Benefits:
- Automatic error propagation
- Structured concurrency
- Cleaner code than gather()
⚡ 9. Wait vs Gather — When To Use Which?
asyncio.gather
- Returns results in order
- On failure, propagates the first exception immediately (the other tasks keep running unless you cancel them)
- Best for symmetric jobs
asyncio.wait
- More control
- Choose FIRST_COMPLETED, FIRST_EXCEPTION
- Best for:
- Racing multiple tasks (first result wins)
- Redundant API fetches
- Timeout logic
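Here is a runnable racing sketch (the fast/slow sources and their delays are hypothetical): whichever source finishes first wins, and the losers are cancelled:

```python
import asyncio

async def source(name, delay):
    await asyncio.sleep(delay)
    return name

async def main():
    tasks = [
        asyncio.create_task(source("fast", 0.1)),
        asyncio.create_task(source("slow", 1.0)),
    ]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for t in pending:
        t.cancel()               # cancel the losers
    winner = done.pop().result()
    print(winner)                # fast
    return winner

winner = asyncio.run(main())
```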
Example:

asyncio.wait

Advanced task control

```python
done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
```

⏱️ 10. Using Timeouts Correctly
Async Timeouts

Handling timeouts properly

```python
try:
    await asyncio.wait_for(task, timeout=3)
except asyncio.TimeoutError:
    print("Timed out!")
```

Timeouts are essential for robust production systems.
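A self-contained version (the slow operation is a stand-in for a hung network call). Note that wait_for cancels the inner coroutine before raising, so nothing keeps running in the background:

```python
import asyncio

async def slow_op():
    await asyncio.sleep(10)      # pretends to be a hung call
    return "never reached"

async def main():
    try:
        return await asyncio.wait_for(slow_op(), timeout=0.2)
    except asyncio.TimeoutError:
        # slow_op was already cancelled by wait_for, so nothing leaks.
        return "timed out"

outcome = asyncio.run(main())
print(outcome)
```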
🔄 11. Callbacks & Event Loop Scheduling
You can schedule code without async:
Event Loop Callbacks

Scheduling with callbacks

```python
loop.call_later(2, lambda: print("Hello 2s later"))
loop.call_soon(lambda: print("Hello ASAP"))
```

This gives event-loop-level control that frameworks use internally.
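In context, the snippet above needs a running loop. A minimal runnable sketch showing the ordering (call_soon fires on the next loop pass, call_later after its delay):

```python
import asyncio

async def main():
    loop = asyncio.get_running_loop()
    order = []
    loop.call_later(0.1, order.append, "later")  # runs after ~0.1s
    loop.call_soon(order.append, "soon")         # runs on the next loop pass
    await asyncio.sleep(0.2)                     # give both callbacks time to fire
    print(order)                                 # ['soon', 'later']
    return order

order = asyncio.run(main())
```

Note that callbacks are plain functions, not coroutines, so they must not block.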
🛰️ 12. Real-World Example — Concurrent API Fetching
Concurrent API Fetching

Fetching multiple APIs at once

```python
import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as r:
        return await r.json()

async def main():
    urls = [
        "https://api1.com",
        "https://api2.com",
        "https://api3.com",
    ]
    # Share one ClientSession across all requests instead of
    # opening a new one per fetch.
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(fetch(session, u) for u in urls))
    print(results)

asyncio.run(main())
```

This is how modern backend services fetch data from multiple microservices at once.
📡 13. Real-World Example — WebScraping With Concurrency
Async Web Scraping

Scraping multiple pages concurrently

```python
import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as r:
        return await r.text()

async def scrape_all(urls):
    async with aiohttp.ClientSession() as session:
        tasks = [asyncio.create_task(fetch(session, u)) for u in urls]
        return await asyncio.gather(*tasks)
```

This pattern lets you fetch many pages concurrently on a single thread, limited mainly by network bandwidth and the target server.
🔥 14. Production Architecture Using Tasks
A real backend service might have:
Background tasks
- Session cleanup
- Cache warmers
- Message queue consumers
Foreground request tasks
- API request-response cycles
Signal handlers
- Graceful shutdown
- Task cancellation
Queues
- Producer/consumer pipelines
All run on the same event loop.
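The queue-based pipelines listed above can be sketched with asyncio.Queue (names and the sentinel convention here are illustrative, not a fixed API):

```python
import asyncio

async def producer(queue, n):
    for i in range(n):
        await queue.put(i)        # suspends if the queue is full
    await queue.put(None)         # sentinel: tell the consumer to stop

async def consumer(queue, results):
    while True:
        item = await queue.get()  # suspends until an item is available
        if item is None:
            break
        results.append(item * 2)  # pretend processing

async def main():
    queue = asyncio.Queue(maxsize=10)  # bounded queue gives backpressure
    results = []
    await asyncio.gather(producer(queue, 5), consumer(queue, results))
    print(results)                # [0, 2, 4, 6, 8]
    return results

results = asyncio.run(main())
```

The bounded maxsize is the important design choice: a full queue suspends the producer, so a slow consumer naturally throttles a fast producer.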
🎉 Conclusion
You've mastered three critical components of AsyncIO:
✔ Event Loop
How async tasks are scheduled and run
✔ Tasks
Concurrent execution wrappers built on Futures
✔ Futures
Low-level placeholders controlling async flow
Together, these form the foundation of every major async Python framework (FastAPI, Starlette, aiohttp).
📋 Quick Reference — AsyncIO
| Syntax | What it does |
|---|---|
| asyncio.get_running_loop() | Get the running event loop (preferred over the deprecated get_event_loop()) |
| asyncio.create_task(coro) | Schedule a coroutine as a concurrent Task |
| asyncio.wait_for(coro, timeout) | Add a timeout to a coroutine |
| asyncio.Queue() | Coroutine-safe queue for producer/consumer pipelines (not thread-safe) |
| async for / async with | Async iteration and context managers |
🎉 Great work! You've completed this lesson.
You now know how the asyncio event loop works internally, how to manage Tasks, and how to build async pipelines.
Up next: Concurrency — compare threads vs processes and choose the right model.