Iterators, Generators, and Decorators

All about -ators

Author

Karsten Naert

Published

November 15, 2025

Introduction

Python has three powerful features that all end in “-ator”: decorators, iterators, and generators. These aren’t just linguistic cousins—they’re fundamental patterns that make Python code more elegant and efficient.

Decorators let you wrap functions to add or modify behavior without changing their code. Iterators are objects you can loop over, following a specific protocol. Generators are a simpler way to create iterators using functions that can pause and resume.

We’ll start with decorators (the easiest to grasp), then move to iterators (the protocol), and finally generators (the elegant solution).

Decorators

The problem: code duplication

Imagine you’re building a simple game where you move around a grid. Here’s a first attempt:

position = [0, 0]

def up():
    global position
    x, y = position
    y += 1
    position = x, y
    print(f"You are at coordinates {position}")

def right():
    global position
    x, y = position
    x += 1
    position = x, y
    print(f"You are at coordinates {position}")

These functions are nearly identical! The only difference is how they calculate the new position.

We can extract the calculation into simple helper functions:

def simple_up(x, y):
    return x, y + 1

def simple_right(x, y):
    return x + 1, y

Now we can rewrite our movement functions using these helpers:

def up():
    global position
    x, y = position
    position = simple_up(x, y)
    print(f"You are at coordinates {position}")

def right():
    global position
    x, y = position
    position = simple_right(x, y)
    print(f"You are at coordinates {position}")

position = [0, 0]
up()
up()
right()
up()
You are at coordinates (0, 1)
You are at coordinates (0, 2)
You are at coordinates (1, 2)
You are at coordinates (1, 3)

Better, but we still have duplication. What if we think of up() as a transformation of simple_up()?

transformation :: simple_up |----> up

The same transformation could turn simple_right() into right(), and so on.

Functions that transform functions

In Python, a transformation is just a function. But this is a special kind of function: one that takes a function as input and returns a different function as output.

def transformator(func):
    def other_func():
        global position
        x, y = position
        position = func(x, y)
        print(f"You are at coordinates {position}")
    return other_func

Now we can create our movement functions by transforming the simple ones:

def simple_up(x, y):
    return x, y + 1

up = transformator(simple_up)

def simple_right(x, y):
    return x + 1, y

right = transformator(simple_right)

position = [0, 0]
up()
up()
right()
up()
You are at coordinates (0, 1)
You are at coordinates (0, 2)
You are at coordinates (1, 2)
You are at coordinates (1, 3)

This works! But Python has special syntax to make this even cleaner.

The @ syntax

The decorator syntax uses @ to apply a transformation directly when defining a function:

@transformator
def func(...):
    ...

This is equivalent to:

def func(...):
    ...
func = transformator(func)

The decorator approach just saves us from repeating the function name. Let’s apply it to our example:

def define_move(func):
    def other_func():
        global position
        x, y = position
        position = func(x, y)
        print(f"You are at coordinates {position}")
    return other_func

@define_move
def up(x, y):
    return x, y + 1

@define_move
def right(x, y):
    return x + 1, y

position = [0, 0]
up()
up()
right()
up()
You are at coordinates (0, 1)
You are at coordinates (0, 2)
You are at coordinates (1, 2)
You are at coordinates (1, 3)

Much cleaner! The @define_move decorator handles all the boilerplate, and we only write the unique calculation logic.

Handling different signatures

Our decorator only works for functions with two parameters. What if we want to decorate functions with different signatures?

The solution is *args and **kwargs:

def print_result(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        print(func.__name__, result)
        return result
    return wrapper

Now it works with any function:

@print_result
def plus(x, y):
    return x + y

@print_result
def minus(x, y):
    return x - y

@print_result
def plus3(x, y, z):
    return x + y + z

plus3(1, 2, 3)
plus(plus(3, minus(4, 1)), 10)
plus3 6
minus 3
plus 6
plus 16
16

Preserving function metadata

There’s a subtle problem with decorators. They can lose information about the original function:

def deco(func):
    def wrapper(*args, **kwargs):
        print(f'Calling {func.__name__}!')
        return func(*args, **kwargs)
    return wrapper

@deco
def test(x):
    '''This function will take a number and double it'''
    return 2 * x

test(2)
Calling test!
4

The function works, but check its metadata:

print(test.__doc__)  # None -- the wrapper has no docstring
test.__name__
None
'wrapper'

The docstring and name now refer to wrapper, not test! This breaks documentation tools and can confuse other developers.

The fix is functools.wraps:

import functools

def deco(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f'Calling {func.__name__}!')
        return func(*args, **kwargs)
    return wrapper

@deco
def test(x):
    '''This function will take a number and double it'''
    return 2 * x

test(2)
Calling test!
4

Now the metadata is preserved:

test.__doc__
'This function will take a number and double it'
test.__name__
'test'
Best Practice

Always use @functools.wraps(func) when writing decorators. It preserves the original function’s name, docstring, and other metadata.
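
Besides the name and docstring, functools.wraps also records the original function on a __wrapped__ attribute, so tools (and you) can still reach the undecorated version. A quick sketch (deco and double are illustrative names):

```python
import functools

def deco(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@deco
def double(x):
    '''Double a number.'''
    return 2 * x

print(double.__name__)        # double
print(double.__doc__)         # Double a number.
print(double.__wrapped__(3))  # 6 -- calls the original, undecorated function
```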

Beyond the basics

Decorators can do much more:

  • Class decorators: Decorators aren’t just for functions. You can decorate classes too, like @dataclasses.dataclass.
  • Decorators with arguments: You can create decorators that take parameters. This adds another layer of functions (you create a function that creates decorators).
  • Stacking decorators: You can apply multiple decorators to the same function.
  • Method decorators: Decorators work on class methods too. For example, @property turns a method into a property.

Here’s a quick example with @property:

class Temperature:
    def __init__(self, kelvin):
        self.kelvin = kelvin
    
    @property
    def celsius(self):
        return self.kelvin - 273.15

t0 = Temperature(290)
t0.celsius  # Note: no parentheses needed
16.850000000000023

The @property decorator transforms the celsius method so it can be accessed like an attribute.
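
And a sketch of the "decorators with arguments" pattern: the outermost function accepts the argument and returns the actual decorator. The names repeat and times below are illustrative, not a standard API:

```python
import functools

def repeat(times):
    """A decorator factory: takes an argument and returns a decorator."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(times):
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator

@repeat(times=3)
def greet(name):
    print(f"Hello, {name}!")

greet("world")  # prints the greeting three times
```

Note the three layers: `repeat(times=3)` runs first and returns `decorator`, which is then applied to `greet` exactly like an ordinary decorator.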

Exercise

Practice creating decorators:

  1. Extend the movement system: Add @define_move decorated functions for down and left movements.

  2. Debug decorator: Create a @debugme decorator that prints both the arguments passed to a function and its return value. Test it on a few different functions.

  3. Call counter: Create a @counter decorator that tracks how many times a function has been called. Each time the function is called, it should print something like “Function ‘add’ has been called 5 times.”

  4. Apply twice: Create a decorator @apply_twice that makes a function apply itself twice. For example:

@apply_twice
def add_five(x):
    return x + 5

add_five(3)  # Should return 13 (3 + 5 + 5)

  5. Timing decorator: Create a @timer decorator that measures and prints how long a function takes to execute. Use the time module.

Hint: For the counter decorator, you’ll need to store state. You can use a mutable default argument (carefully!) or attach an attribute to the wrapper function.

Iterators

What’s an iterator?

You’ve seen that range(10) doesn’t create a list—it creates a special range object:

range(10)
range(0, 10)

Similarly, a generator expression (using parentheses instead of brackets) creates a generator object:

(x for x in range(10) if x % 2 == 0)
<generator object <genexpr> at 0x731db42e65a0>

You can work with these objects by calling next() repeatedly:

genexpr = (x for x in range(10) if x % 2 == 0)
type(genexpr)
generator
next(genexpr), next(genexpr), next(genexpr)
(0, 2, 4)
next(genexpr), next(genexpr)
(6, 8)

When there are no more items, you get a StopIteration exception:

next(genexpr)
---------------------------------------------------------------------------
StopIteration                             Traceback (most recent call last)
Cell In[21], line 1
----> 1 next(genexpr)

StopIteration: 

Once exhausted, the generator is done—you need to create a new one:

genexpr = (x for x in range(10) if x % 2 == 0)

Python’s for loop knows how to work with these objects. It calls next() automatically until it sees StopIteration:

for x in genexpr:
    print(x)
0
2
4
6
8
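
Under the hood, that for loop behaves roughly like this manual loop (a sketch of the protocol, not CPython's literal implementation):

```python
genexpr = (x for x in range(10) if x % 2 == 0)

iterator = iter(genexpr)    # the for loop first calls iter() on the iterable
while True:
    try:
        x = next(iterator)  # then calls next() repeatedly...
    except StopIteration:   # ...until StopIteration signals the end
        break
    print(x)
```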

You can also mix manual next() calls with for loops:

genexpr = (x for x in range(10) if x % 2 == 0)

next(genexpr)  # Skip first item
next(genexpr)  # Skip second item

for x in genexpr:
    print(x)
4
6
8
Warning

Be careful mixing next() inside a for loop—you can easily skip items or create infinite loops:

genexpr = (x for x in range(10) if x % 2 == 0)

for x in genexpr:
    print(x)
    next(genexpr)  # Skips the next item each iteration!
0
4
8
---------------------------------------------------------------------------
StopIteration                             Traceback (most recent call last)
Cell In[25], line 5
      3 for x in genexpr:
      4     print(x)
----> 5     next(genexpr)  # Skips the next item each iteration!

StopIteration: 

Infinite iterators

Some iterators never run out. For example, itertools.cycle loops through a sequence forever:

import itertools

c = itertools.cycle([1, 2, 3])

for _ in range(5):
    print(next(c))
1
2
3
1
2
Warning

Never use a for loop on an infinite iterator without a break statement—it will run forever!
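
One safe pattern is to cap consumption explicitly, for example with itertools.islice, which stops after a fixed number of items:

```python
import itertools

c = itertools.cycle([1, 2, 3])

# islice caps consumption at a fixed number of items, so this terminates
first_seven = list(itertools.islice(c, 7))
print(first_seven)  # [1, 2, 3, 1, 2, 3, 1]
```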

The iterator protocol

You can create your own iterators by implementing two special methods:

  • __next__(): Returns the next item or raises StopIteration when done
  • __iter__(): Returns the iterator object itself (usually self)

Here’s a custom iterator that yields even numbers:

class LoopOverEven:
    def __init__(self, n):
        self.n = n
        self.current = 0
    
    def __next__(self):
        if self.current >= self.n:
            raise StopIteration
        while True:
            self.current += 1
            if self.current % 2 == 0:
                return self.current

gen = LoopOverEven(10)
next(gen), next(gen), next(gen)
(2, 4, 6)

This works with next(), but not yet with for loops:

for item in LoopOverEven(10):
    print(item)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[28], line 1
----> 1 for item in LoopOverEven(10):
      2     print(item)

TypeError: 'LoopOverEven' object is not iterable

Iterable vs iterator

The for loop looks for an __iter__ method, which LoopOverEven doesn’t have. We can fix this in two ways.

Option 1: Create a separate container class:

class LoopContainer:
    def __init__(self, n):
        self.n = n
    
    def __iter__(self):
        return LoopOverEven(self.n)

for x in LoopContainer(10):
    print(x)
2
4
6
8
10

Now the terminology becomes clear:

  • LoopContainer is an iterable (has __iter__)
  • LoopOverEven is an iterator (has __next__)

Option 2: Make the iterator return itself:

class LoopOverEven:
    def __init__(self, n):
        self.n = n
        self.current = 0
    
    def __next__(self):
        if self.current >= self.n:
            raise StopIteration
        while True:
            self.current += 1
            if self.current % 2 == 0:
                return self.current
    
    def __iter__(self):
        return self

for x in LoopOverEven(10):
    print(x)
2
4
6
8
10

This works because when a for loop calls __iter__(), it gets back an iterator (which is self), and then it can call __next__() on it.

Terminology Summary
  • Iterable: An object with an __iter__ method that returns an iterator
  • Iterator: An object with a __next__ method (and usually __iter__ that returns self)
  • Every well-behaved iterator is also iterable (its __iter__ returns self)
  • Not every iterable is an iterator (e.g., lists are iterable but not iterators)
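
You can see the difference with the built-in iter(): calling it on a list hands back a separate iterator object, while the list itself has no __next__:

```python
numbers = [10, 20, 30]

it = iter(numbers)        # lists have __iter__ but no __next__
print(type(it).__name__)  # list_iterator
print(next(it))           # 10
print(next(it))           # 20

# Calling next(numbers) on the list itself would raise:
# TypeError: 'list' object is not an iterator
```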

Creating iterators with classes can be tedious. Let’s see a better way…

Generators

Generator functions

Instead of writing a whole class, you can use the yield keyword in a function:

def maker(n):
    for x in range(n):
        if x % 2 == 0:
            yield x

m10 = maker(10)

Calling maker(10) doesn’t run the function—it creates a generator-iterator:

m10.__iter__, m10.__next__
(<method-wrapper '__iter__' of generator object at 0x731d884b29b0>,
 <method-wrapper '__next__' of generator object at 0x731d884b29b0>)

This generator-iterator follows the iterator protocol automatically:

m5 = maker(5)
next(m5), next(m5), next(m5)
(0, 2, 4)

And it works perfectly with for loops:

for u in maker(10):
    print(u)
0
2
4
6
8

Much simpler than writing a class! The yield keyword handles all the plumbing.

How yield works

When you call a generator function, Python creates a generator object. Each time you call next():

  1. The function runs until it hits yield
  2. It returns the yielded value
  3. It pauses, remembering where it was
  4. On the next next() call, it resumes from where it paused

This “pausable function” behavior is what makes generators so powerful.
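
You can watch the pause-and-resume behavior directly by putting print statements around each yield (demo is an illustrative name):

```python
def demo():
    print("running until the first yield")
    yield 1
    print("resumed, running until the second yield")
    yield 2
    print("resumed, falling off the end")  # then StopIteration is raised

g = demo()
print(next(g))  # prints the first message, then 1
print(next(g))  # prints the second message, then 2
```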

Generator expressions

We’ve already seen generator expressions—they’re just a compact syntax for simple generators:

# Generator expression
gen1 = (x * 2 for x in range(5))

# Equivalent generator function
def gen2_func():
    for x in range(5):
        yield x * 2
gen2 = gen2_func()

list(gen1), list(gen2)
([0, 2, 4, 6, 8], [0, 2, 4, 6, 8])

Generator expressions are great for simple cases. For complex logic, use generator functions.

Terminology clarification

The terminology around generators can be confusing:

  • Generator function: A function that contains yield
  • Generator-iterator: The object created by calling a generator function
  • Generator expression: The (x for x in ...) syntax
  • Iterator: Any object with __next__ and __iter__

All generator-iterators are iterators, but not all iterators are generator-iterators. The word “generator” usually refers to the function, but sometimes to the iterator it creates.
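
If you ever need to tell these apart in code, the inspect module can check each kind:

```python
import inspect

def gen_func():
    yield 1

gen_iter = gen_func()

print(inspect.isgeneratorfunction(gen_func))  # True
print(inspect.isgenerator(gen_iter))          # True
print(inspect.isgenerator(gen_func))          # False: the function itself is not an iterator
```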

Mental Model

Think of a generator function as a recipe for creating iterators. Each time you call it, you get a fresh iterator that can be consumed independently.

def countdown(n):
    while n > 0:
        yield n
        n -= 1

# Two independent countdowns
c1 = countdown(3)
c2 = countdown(5)

print(next(c1), next(c2))  # 3, 5
print(next(c1), next(c2))  # 2, 4
print(next(c1), next(c2))  # 1, 3
3 5
2 4
1 3

yield from

The yield from syntax is shorthand for yielding all values from another iterable:

import string

def abc(word):
    """Generate letters from a word in alphabetical order"""
    for x in string.ascii_lowercase:
        if x in word:
            yield x

for x in abc("banana"):
    print(x)
a
b
n

You can delegate to other generators with yield from:

def multiabc(sentence):
    for word in sentence.split():
        yield from abc(word)
        yield '----'

for x in multiabc("banana racecar"):
    print(x)
a
b
n
----
a
c
e
r
----

The line yield from abc(word) is equivalent to:

for x in abc(word):
    yield x

But yield from is more efficient and handles some edge cases better.
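
A typical use of yield from is chaining several iterables into one stream. Here is a minimal sketch (the standard library's itertools.chain does the same job):

```python
def chain(*iterables):
    """Yield every item from each iterable, in order."""
    for iterable in iterables:
        yield from iterable

print(list(chain("ab", [1, 2], range(3))))  # ['a', 'b', 1, 2, 0, 1, 2]
```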

Generator coroutines

Generators can do more than just yield values—they can also receive values. This creates a simple form of coroutine:

def keep_adding():
    total = 0  # "total", not "sum", to avoid shadowing the built-in sum()
    while total < 1000:
        x = yield total
        total += x

k = keep_adding()
next(k)  # Prime the generator
0

After priming with next(), you can send values in:

k.send(10), k.send(1), k.send(100)
(10, 11, 111)
Modern Coroutines

This is an older pattern for coroutines. Modern Python (3.5+) uses async/await syntax for native coroutines, which are more powerful and cleaner for concurrent programming. We’ll cover that in a future lecture on async programming. Generator coroutines still work but are rarely used in new code.

Exercise

Practice with iterators and generators:

  1. Countdown iterator: Create a generator function countdown(n) that yields numbers from n down to 1, then prints “Blastoff!”.

  2. Flatten nested lists: Write a generator function flatten(nested_list) that flattens arbitrarily nested lists:

mylist = [[1, 2], [3], [4, [5, [6]]]]
list(flatten(mylist))  # Should return [1, 2, 3, 4, 5, 6]

Hint: Use isinstance(item, list) to check if something is a list, and use yield from for recursion.

  3. Fibonacci generator: Create an infinite generator fibonacci() that yields Fibonacci numbers. Test it by printing the first 10 numbers.

  4. Range with step: Create a generator my_range(start, stop, step=1) that mimics Python’s built-in range() but works with floats too:

list(my_range(0, 1, 0.1))  # Should work with floats

  5. Sliding window: Create a generator sliding_window(iterable, n) that yields tuples of n consecutive elements:

list(sliding_window([1, 2, 3, 4, 5], 3))
# Should return [(1, 2, 3), (2, 3, 4), (3, 4, 5)]

  6. Compare approaches: Take your LoopOverEven class from earlier and rewrite it as a generator function. Which is simpler?

Additional resources