Iterators, Generators, and Decorators
All about ators
Introduction
Python has three powerful features that all end in “-ator”: decorators, iterators, and generators. These aren’t just linguistic cousins—they’re fundamental patterns that make Python code more elegant and efficient.
Decorators let you wrap functions to add or modify behavior without changing their code. Iterators are objects you can loop over, following a specific protocol. Generators are a simpler way to create iterators using functions that can pause and resume.
We’ll start with decorators (the easiest to grasp), then move to iterators (the protocol), and finally generators (the elegant solution).
Decorators
The problem: code duplication
Imagine you’re building a simple game where you move around a grid. Here’s a first attempt:

```python
position = [0, 0]

def up():
    global position
    x, y = position
    y += 1
    position = x, y
    print(f"You are at coordinates {position}")

def right():
    global position
    x, y = position
    x += 1
    position = x, y
    print(f"You are at coordinates {position}")
```
These functions are nearly identical! The only difference is how they calculate the new position.
We can extract the calculation into simple helper functions:

```python
def simple_up(x, y):
    return x, y + 1

def simple_right(x, y):
    return x + 1, y
```

Now we can rewrite our movement functions using these helpers:
```python
def up():
    global position
    x, y = position
    position = simple_up(x, y)
    print(f"You are at coordinates {position}")

def right():
    global position
    x, y = position
    position = simple_right(x, y)
    print(f"You are at coordinates {position}")

position = [0, 0]
up()
up()
right()
up()
```

```
You are at coordinates (0, 1)
You are at coordinates (0, 2)
You are at coordinates (1, 2)
You are at coordinates (1, 3)
```
Better, but we still have duplication. What if we think of up() as a transformation of simple_up()?
transformation :: simple_up |----> up
The same transformation could turn simple_right() into right(), and so on.
Functions that transform functions
In Python, a transformation is just a function. But this is a special kind of function: one that takes a function as input and returns a different function as output.
```python
def transformator(func):
    def other_func():
        global position
        x, y = position
        position = func(x, y)
        print(f"You are at coordinates {position}")
    return other_func
```

Now we can create our movement functions by transforming the simple ones:
```python
def simple_up(x, y):
    return x, y + 1

up = transformator(simple_up)

def simple_right(x, y):
    return x + 1, y

right = transformator(simple_right)

position = [0, 0]
up()
up()
right()
up()
```

```
You are at coordinates (0, 1)
You are at coordinates (0, 2)
You are at coordinates (1, 2)
You are at coordinates (1, 3)
```
This works! But Python has special syntax to make this even cleaner.
The @ syntax
The decorator syntax uses @ to apply a transformation directly when defining a function:
```python
@transformator
def func(...):
    ...
```

This is 100% equivalent to:

```python
def func(...):
    ...
func = transformator(func)
```

The decorator approach just saves us from repeating the function name. Let’s apply it to our example:
```python
def define_move(func):
    def other_func():
        global position
        x, y = position
        position = func(x, y)
        print(f"You are at coordinates {position}")
    return other_func

@define_move
def up(x, y):
    return x, y + 1

@define_move
def right(x, y):
    return x + 1, y

position = [0, 0]
up()
up()
right()
up()
```

```
You are at coordinates (0, 1)
You are at coordinates (0, 2)
You are at coordinates (1, 2)
You are at coordinates (1, 3)
```
Much cleaner! The @define_move decorator handles all the boilerplate, and we only write the unique calculation logic.
Handling different signatures
Our decorator only works for functions with two parameters. What if we want to decorate functions with different signatures?
The solution is *args and **kwargs:
```python
def print_result(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        print(func.__name__, result)
        return result
    return wrapper
```

Now it works with any function:
```python
@print_result
def plus(x, y):
    return x + y

@print_result
def minus(x, y):
    return x - y

@print_result
def plus3(x, y, z):
    return x + y + z

plus3(1, 2, 3)
plus(plus(3, minus(4, 1)), 10)
```

```
plus3 6
minus 3
plus 6
plus 16
16
```
Preserving function metadata
There’s a subtle problem with decorators. They can lose information about the original function:
```python
def deco(func):
    def wrapper(*args, **kwargs):
        print(f'Calling {func.__name__}!')
        return func(*args, **kwargs)
    return wrapper

@deco
def test(x):
    '''This function will take a number and double it'''
    return 2 * x

test(2)
```

```
Calling test!
4
```
The function works, but check its metadata:
```python
test.__doc__   # → None
test.__name__  # → 'wrapper'
```
The docstring and name now refer to wrapper, not test! This breaks documentation tools and can confuse other developers.
The fix is functools.wraps:
```python
import functools

def deco(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f'Calling {func.__name__}!')
        return func(*args, **kwargs)
    return wrapper

@deco
def test(x):
    '''This function will take a number and double it'''
    return 2 * x

test(2)
```

```
Calling test!
4
```
Now the metadata is preserved:
```python
test.__doc__   # → 'This function will take a number and double it'
test.__name__  # → 'test'
```
Always use @functools.wraps(func) when writing decorators. It preserves the original function’s name, docstring, and other metadata.
Beyond the basics
Decorators can do much more:
- Class decorators: Decorators aren’t just for functions. You can decorate classes too, like @dataclasses.dataclass.
- Decorators with arguments: You can create decorators that take parameters. This adds another layer of functions (you write a function that creates decorators).
- Stacking decorators: You can apply multiple decorators to the same function.
- Method decorators: Decorators work on class methods too. For example, @property turns a method into a property.
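To sketch the “decorators with arguments” pattern, here is a hypothetical repeat decorator (the name and its times parameter are made up for illustration): the outermost function receives the argument and returns the actual decorator, giving three nested levels.

```python
import functools

def repeat(times):
    """Decorator factory: repeat(times) returns a decorator."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(times):      # call the wrapped function `times` times
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator

@repeat(times=3)
def greet(name):
    print(f"Hello, {name}!")

greet("world")  # prints "Hello, world!" three times
```

Note that @repeat(times=3) is an expression: Python first calls repeat(times=3) to obtain the decorator, then applies that decorator to greet.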
Here’s a quick example with @property:
```python
class Temperature:
    def __init__(self, kelvin):
        self.kelvin = kelvin

    @property
    def celsius(self):
        return self.kelvin - 273.15

t0 = Temperature(290)
t0.celsius  # Note: no parentheses needed
```

```
16.850000000000023
```
The @property decorator transforms the celsius method so it can be accessed like an attribute.
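Properties can also intercept assignment. As a sketch extending the Temperature class above (the setter is an addition, not part of the original example), @celsius.setter lets t.celsius = ... update the underlying kelvin value:

```python
class Temperature:
    def __init__(self, kelvin):
        self.kelvin = kelvin

    @property
    def celsius(self):
        return self.kelvin - 273.15

    @celsius.setter
    def celsius(self, value):
        # Assigning to t.celsius runs this method instead of setting an attribute
        self.kelvin = value + 273.15

t = Temperature(290)
t.celsius = 0          # the setter runs
print(t.kelvin)        # → 273.15
```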
Iterators
What’s an iterator?
You’ve seen that range(10) doesn’t create a list—it creates a special range object:
```python
range(10)  # → range(0, 10)
```
Similarly, a generator expression (using parentheses instead of brackets) creates a generator object:
```python
(x for x in range(10) if x % 2 == 0)
# → <generator object <genexpr> at 0x731db42e65a0>
```
You can work with these objects by calling next() repeatedly:
```python
genexpr = (x for x in range(10) if x % 2 == 0)
type(genexpr)                                # → generator
next(genexpr), next(genexpr), next(genexpr)  # → (0, 2, 4)
next(genexpr), next(genexpr)                 # → (6, 8)
```
When there are no more items, you get a StopIteration exception:
```python
next(genexpr)
```

```
---------------------------------------------------------------------------
StopIteration                             Traceback (most recent call last)
----> 1 next(genexpr)

StopIteration:
```

Once exhausted, the generator is done—you need to create a new one:

```python
genexpr = (x for x in range(10) if x % 2 == 0)
```

Python’s for loop knows how to work with these objects. It calls next() automatically until it sees StopIteration:
```python
for x in genexpr:
    print(x)
```

```
0
2
4
6
8
```
You can also mix manual next() calls with for loops:
```python
genexpr = (x for x in range(10) if x % 2 == 0)
next(genexpr)  # Skip first item
next(genexpr)  # Skip second item
for x in genexpr:
    print(x)
```

```
4
6
8
```
Be careful mixing next() inside a for loop—you can easily skip items or create infinite loops:
```python
genexpr = (x for x in range(10) if x % 2 == 0)
for x in genexpr:
    print(x)
    next(genexpr)  # Skips the next item each iteration!
```

```
0
4
8
---------------------------------------------------------------------------
StopIteration                             Traceback (most recent call last)
      3 for x in genexpr:
      4     print(x)
----> 5     next(genexpr)  # Skips the next item each iteration!

StopIteration:
```
Infinite iterators
Some iterators never run out. For example, itertools.cycle loops through a sequence forever:
```python
import itertools

c = itertools.cycle([1, 2, 3])
for _ in range(5):
    print(next(c))
```

```
1
2
3
1
2
```
Never use a for loop on an infinite iterator without a break statement—it will run forever!
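To make the warning concrete, here is a minimal sketch of consuming an infinite iterator safely with an explicit break condition:

```python
import itertools

c = itertools.cycle(["red", "green", "blue"])  # never exhausts
colors = []
for color in c:
    colors.append(color)
    if len(colors) >= 4:   # explicit stopping condition
        break

print(colors)  # → ['red', 'green', 'blue', 'red']
```

Without the break, the loop would never terminate, because cycle never raises StopIteration.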
The iterator protocol
You can create your own iterators by implementing two special methods:
- __next__(): Returns the next item, or raises StopIteration when done
- __iter__(): Returns the iterator object itself (usually self)
Here’s a custom iterator that yields even numbers:
```python
class LoopOverEven:
    def __init__(self, n):
        self.n = n
        self.current = 0

    def __next__(self):
        if self.current >= self.n:
            raise StopIteration
        while True:
            self.current += 1
            if self.current % 2 == 0:
                return self.current

gen = LoopOverEven(10)
next(gen), next(gen), next(gen)  # → (2, 4, 6)
```
This works with next(), but not yet with for loops:
```python
for item in LoopOverEven(10):
    print(item)
```

```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
----> 1 for item in LoopOverEven(10):
      2     print(item)

TypeError: 'LoopOverEven' object is not iterable
```
Iterable vs iterator
The for loop looks for an __iter__ method, which LoopOverEven doesn’t have. We can fix this two ways.
Option 1: Create a separate container class:
```python
class LoopContainer:
    def __init__(self, n):
        self.n = n

    def __iter__(self):
        return LoopOverEven(self.n)

for x in LoopContainer(10):
    print(x)
```

```
2
4
6
8
10
```
Now the terminology becomes clear:
- LoopContainer is an iterable (has __iter__)
- LoopOverEven is an iterator (has __next__)
Option 2: Make the iterator return itself:
```python
class LoopOverEven:
    def __init__(self, n):
        self.n = n
        self.current = 0

    def __next__(self):
        if self.current >= self.n:
            raise StopIteration
        while True:
            self.current += 1
            if self.current % 2 == 0:
                return self.current

    def __iter__(self):
        return self

for x in LoopOverEven(10):
    print(x)
```

```
2
4
6
8
10
```
This works because when a for loop calls __iter__(), it gets back an iterator (which is self), and then it can call __next__() on it.
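Conceptually, a for loop desugars into roughly the following (a sketch of what Python does under the hood, shown with a plain list):

```python
items = [10, 20, 30]

it = iter(items)              # the for loop first calls iter() on the iterable
collected = []
while True:
    try:
        x = next(it)          # then repeatedly calls next() on the iterator
    except StopIteration:     # until StopIteration signals the end
        break
    collected.append(x)

print(collected)  # → [10, 20, 30]
```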
- Iterable: An object with an __iter__ method that returns an iterator
- Iterator: An object with a __next__ method (and usually an __iter__ that returns self)
- Every iterator is iterable (if it has __iter__)
- Not every iterable is an iterator (e.g., lists are iterable but not iterators)
Creating iterators with classes can be tedious. Let’s see a better way…
Generators
Generator functions
Instead of writing a whole class, you can use the yield keyword in a function:
```python
def maker(n):
    for x in range(n):
        if x % 2 == 0:
            yield x

m10 = maker(10)
```

Calling maker(10) doesn’t run the function—it creates a generator-iterator:
```python
m10.__iter__, m10.__next__
```

```
(<method-wrapper '__iter__' of generator object at 0x731d884b29b0>,
 <method-wrapper '__next__' of generator object at 0x731d884b29b0>)
```
This generator-iterator follows the iterator protocol automatically:
```python
m5 = maker(5)
next(m5), next(m5), next(m5)  # → (0, 2, 4)
```
And it works perfectly with for loops:
```python
for u in maker(10):
    print(u)
```

```
0
2
4
6
8
```
Much simpler than writing a class! The yield keyword handles all the plumbing.
How yield works
When you call a generator function, Python creates a generator object. Each time you call next():
- The function runs until it hits yield
- It returns the yielded value
- It pauses, remembering where it was
- On the next next() call, it resumes from where it paused
This “pausable function” behavior is what makes generators so powerful.
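You can watch the pausing happen with a small sketch that prints as it runs:

```python
def demo():
    print("start")     # runs on the first next() call
    yield 1
    print("resumed")   # runs only when next() is called again
    yield 2

g = demo()
print(next(g))  # prints "start", then 1
print(next(g))  # prints "resumed", then 2
```

A third next(g) would resume after the second yield, fall off the end of the function, and raise StopIteration.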
Generator expressions
We’ve already seen generator expressions—they’re just a compact syntax for simple generators:
```python
# Generator expression
gen1 = (x * 2 for x in range(5))

# Equivalent generator function
def gen2_func():
    for x in range(5):
        yield x * 2

gen2 = gen2_func()

list(gen1), list(gen2)  # → ([0, 2, 4, 6, 8], [0, 2, 4, 6, 8])
```
Generator expressions are great for simple cases. For complex logic, use generator functions.
Terminology clarification
The terminology around generators can be confusing:
- Generator function: A function that contains yield
- Generator-iterator: The object created by calling a generator function
- Generator expression: The (x for x in ...) syntax
- Iterator: Any object with __next__ and __iter__
All generator-iterators are iterators, but not all iterators are generator-iterators. The word “generator” usually refers to the function, but sometimes to the iterator it creates.
Think of a generator function as a recipe for creating iterators. Each time you call it, you get a fresh iterator that can be consumed independently.
```python
def countdown(n):
    while n > 0:
        yield n
        n -= 1

# Two independent countdowns
c1 = countdown(3)
c2 = countdown(5)
print(next(c1), next(c2))  # 3, 5
print(next(c1), next(c2))  # 2, 4
print(next(c1), next(c2))  # 1, 3
```

```
3 5
2 4
1 3
```
yield from
The yield from syntax is shorthand for yielding all values from another iterable:
```python
import string

def abc(word):
    """Generate letters from a word in alphabetical order"""
    for x in string.ascii_lowercase:
        if x in word:
            yield x

for x in abc("banana"):
    print(x)
```

```
a
b
n
```
You can delegate to other generators with yield from:
```python
def multiabc(sentence):
    for word in sentence.split():
        yield from abc(word)
        yield '----'

for x in multiabc("banana racecar"):
    print(x)
```

```
a
b
n
----
a
c
e
r
----
```
The line yield from abc(word) is equivalent to:
```python
for x in abc(word):
    yield x
```

But yield from is more efficient and handles some edge cases better (for example, it forwards values passed with send() directly to the inner generator).
Generator coroutines
Generators can do more than just yield values—they can also receive values. This creates a simple form of coroutine:
```python
def keep_adding():
    sum = 0
    while sum < 1000:
        x = yield sum
        sum += x

k = keep_adding()
next(k)  # Prime the generator → 0
```
After priming with next(), you can send values in:
```python
k.send(10), k.send(1), k.send(100)  # → (10, 11, 111)
```
This is an older pattern for coroutines. Modern Python (3.5+) uses async/await syntax for native coroutines, which are more powerful and cleaner for concurrent programming. We’ll cover that in a future lecture on async programming. Generator coroutines still work but are rarely used in new code.