---
title: "Async Python Basics"
date: 2026-01-05
categories: [Python, Engineering]
description: "Quick guide to async/await in Python - coroutines, tasks, and common patterns"
jupyter: python3
---

## Why Async?

Async programming allows your code to handle I/O-bound operations (network requests, file operations, database queries) without blocking. Instead of waiting, Python can do other work.

Use async when:

- Making multiple API calls
- Handling many concurrent connections
- Working with I/O-bound tasks

Don't use async for:

- CPU-bound tasks (use multiprocessing instead)
- Simple scripts with sequential logic

## Basic Syntax

```python
import asyncio

# Define an async function (coroutine)
async def fetch_data(name: str, delay: float) -> str:
    print(f"Starting {name}...")
    await asyncio.sleep(delay)  # Simulates an I/O operation
    print(f"Finished {name}")
    return f"Result from {name}"

# Run a single coroutine (top-level await works in Jupyter/IPython)
result = await fetch_data("task1", 1)
print(result)
```
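One gotcha worth flagging at this point (a small illustrative sketch, not from the original post): calling an `async def` function does not run it. It only creates a coroutine object, which does nothing until it is awaited or handed to `asyncio.run()`.

```python
import asyncio

async def greet() -> str:
    return "hello"

coro = greet()               # nothing runs yet; this is a coroutine object
print(type(coro).__name__)   # prints: coroutine

result = asyncio.run(coro)   # actually executes the coroutine
print(result)                # prints: hello
```

Forgetting the `await` is the classic source of "coroutine 'greet' was never awaited" warnings.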
## Running Multiple Tasks Concurrently
The real power of async is running multiple operations at the same time.
```python
import time

async def main():
    start = time.time()

    # Run tasks concurrently with gather
    results = await asyncio.gather(
        fetch_data("API call 1", 2),
        fetch_data("API call 2", 2),
        fetch_data("API call 3", 2),
    )

    elapsed = time.time() - start
    print(f"\nAll done in {elapsed:.2f}s (not 6s!)")
    print(f"Results: {results}")

await main()
```

## Creating Tasks for More Control
Use `asyncio.create_task()` when you need to start a coroutine and continue doing other work.
```python
async def main_with_tasks():
    # Create tasks - they start running immediately
    task1 = asyncio.create_task(fetch_data("background job", 3))
    task2 = asyncio.create_task(fetch_data("another job", 2))

    # Do other work while tasks run
    print("Doing other work...")
    await asyncio.sleep(0.5)
    print("Still working...")

    # Wait for tasks when you need results
    result1 = await task1
    result2 = await task2
    print(f"Got: {result1}, {result2}")

await main_with_tasks()
```

## Real-World Example: Async HTTP Requests
Using `aiohttp` for concurrent API calls (much faster than sequential requests).
```python
# pip install aiohttp
import aiohttp

async def fetch_url(session: aiohttp.ClientSession, url: str) -> dict:
    async with session.get(url) as response:
        return await response.json()

async def fetch_multiple_urls(urls: list[str]) -> list[dict]:
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_url(session, url) for url in urls]
        return await asyncio.gather(*tasks)

# Example usage
urls = [
    "https://httpbin.org/json",
    "https://httpbin.org/uuid",
    "https://httpbin.org/headers",
]
results = await fetch_multiple_urls(urls)
for i, result in enumerate(results):
    print(f"URL {i+1}: {list(result.keys())}")
```

## Handling Errors in Async Code
```python
async def might_fail(name: str, should_fail: bool):
    await asyncio.sleep(0.5)
    if should_fail:
        raise ValueError(f"{name} failed!")
    return f"{name} succeeded"

async def handle_errors():
    # gather with return_exceptions=True catches errors
    results = await asyncio.gather(
        might_fail("task1", False),
        might_fail("task2", True),
        might_fail("task3", False),
        return_exceptions=True,
    )

    for result in results:
        if isinstance(result, Exception):
            print(f"Error: {result}")
        else:
            print(f"Success: {result}")

await handle_errors()
```

## Running Async from Sync Code
When you’re not in a Jupyter notebook or async context:
```python
# In a regular Python script (no event loop running yet):
# asyncio.run(main())  # Python 3.7+, the recommended entry point

# Or, for more control (note: asyncio.get_event_loop() is deprecated
# outside a running loop in modern Python; create the loop explicitly):
# loop = asyncio.new_event_loop()
# try:
#     loop.run_until_complete(main())
# finally:
#     loop.close()
```

## Quick Reference
| Pattern | Use Case |
|---|---|
| `await coroutine()` | Wait for a single async operation |
| `asyncio.gather(*coros)` | Run multiple coroutines concurrently |
| `asyncio.create_task(coro)` | Start a coroutine without waiting |
| `asyncio.sleep(n)` | Non-blocking sleep |
| `async with` | Async context manager |
| `async for` | Async iteration |
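To tie the pieces together, here is a minimal, self-contained script version of the patterns above, using `asyncio.run()` as the single entry point. The task names and delays are illustrative assumptions, and `asyncio.sleep()` stands in for real I/O.

```python
# save as e.g. fetch_demo.py and run with: python fetch_demo.py
import asyncio

async def fetch_data(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stand-in for a real network call
    return f"Result from {name}"

async def main() -> list[str]:
    # Run three "requests" concurrently; total time is ~max(delays),
    # not the sum of the delays
    return await asyncio.gather(
        fetch_data("a", 0.1),
        fetch_data("b", 0.1),
        fetch_data("c", 0.1),
    )

if __name__ == "__main__":
    results = asyncio.run(main())  # creates and closes the event loop for you
    print(results)
```

Keeping all `await`s inside `main()` and calling `asyncio.run()` exactly once at the bottom is the simplest structure for scripts; the top-level `await` style used earlier only works in environments like Jupyter that already run an event loop.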