Day 26 · ~18m

Python Decorator Patterns: Build retry, timer, and cache decorators

Decorator factories that take arguments. Build @retry(max_attempts=3), @timer, and @cache from scratch. See the three-level nesting in action.

student (curious)

I was reading the payment processor code this morning. There's this decorator: @retry(max_attempts=3). Wait — the decorator takes an argument? How does that work?

teacher (neutral)

Because retry isn't the decorator itself — it's a function that returns the decorator, which in turn returns the wrapper. Three levels of nesting.

student (confused)

That's... a lot of nesting. Walk me through it.

teacher (focused)

Okay. Yesterday you learned that a decorator is a function that takes a function, wraps it, and returns the wrapper. That's level one: decorator(function) → wrapper.

Now if you want the decorator itself to take arguments, you need a layer above the decorator. A function that returns a decorator.

student (thinking)

So @retry(max_attempts=3) calls retry(max_attempts=3) first, and that returns the decorator?

teacher (encouraging)

Exactly. It's confusing because the syntax hides the nesting. Let me show you step by step.

# Without arguments, a decorator is simple:
def timer(func):
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}")
        result = func(*args, **kwargs)
        print("Done")
        return result
    return wrapper

@timer
def fetch_user(user_id):
    return f"User {user_id}"

# When you use @timer, Python does this:
# fetch_user = timer(fetch_user)

That's level one.

student (focused)

Okay so the decorator is applied at definition time. The function gets passed to the decorator, the decorator returns a wrapper, and that wrapper is what fetch_user becomes.

teacher (serious)

Right. Now what if you want to customize the decorator? Say you want each use of timer configured differently — one with a label, one without?

student (thinking)

You'd add a parameter to the decorator?

teacher (focused)

You'd add a parameter to a function that creates the decorator. Like this:

def timer(label=None):
    def decorator(func):
        def wrapper(*args, **kwargs):
            if label:
                print(f"[{label}] Calling {func.__name__}")
            result = func(*args, **kwargs)
            if label:
                print(f"[{label}] Done")
            return result
        return wrapper
    return decorator

@timer(label="api_call")
def fetch_user(user_id):
    return f"User {user_id}"

Now when Python sees @timer(label="api_call"), it does three things:

  1. Call timer(label="api_call") → returns decorator function
  2. Call decorator(fetch_user) → returns wrapper function
  3. Set fetch_user = wrapper
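
Step 3 is worth seeing spelled out. Desugared, the @ line is just two explicit calls — here's a sketch reusing the timer factory above:

```python
def timer(label=None):
    def decorator(func):
        def wrapper(*args, **kwargs):
            if label:
                print(f"[{label}] Calling {func.__name__}")
            result = func(*args, **kwargs)
            if label:
                print(f"[{label}] Done")
            return result
        return wrapper
    return decorator

def fetch_user(user_id):
    return f"User {user_id}"

# @timer(label="api_call") above fetch_user is equivalent to:
decorator = timer(label="api_call")   # step 1: factory call returns the decorator
fetch_user = decorator(fetch_user)    # step 2: decorator call returns the wrapper
# step 3: the name fetch_user is now bound to the wrapper

print(fetch_user(42))  # prints the two log lines, then "User 42"
```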

student (surprised)

Three levels. The outer function takes arguments. It returns the decorator. The decorator takes the function. The decorator returns the wrapper.

teacher (proud)

That's the pattern. The outer function is a decorator factory. It creates customized decorators.

student (thinking)

But why not just pass all the arguments to the decorator directly? Like def timer(func, label=None):?

teacher (serious)

Because Python's @ syntax assumes the decorator is a callable that takes one argument — the function. If you wrote @timer(fetch_user, label="api_call") that would be weird. The three-level pattern lets you write @timer(label="api_call") and it feels natural.

student (focused)

Okay so the outer function (timer) is a factory. It takes the configuration. It returns a decorator. The decorator takes the function and wraps it.

teacher (neutral)

That's it.

student (curious)

Show me retry.

teacher (focused)

Retry is a decorator that calls the function up to N times. If it fails, retry. If it succeeds, return the result. If all retries fail, raise the last exception.

def retry(max_attempts=3):
    def decorator(func):
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(max_attempts):
                try:
                    result = func(*args, **kwargs)
                    print(f"Success on attempt {attempt + 1}")
                    return result
                except Exception as e:
                    last_exception = e
                    print(f"Attempt {attempt + 1} failed: {e}")
            raise last_exception
        return wrapper
    return decorator

@retry(max_attempts=3)
def flaky_api_call():
    import random
    if random.random() < 0.7:
        raise ConnectionError("Network error")
    return "success"

Now flaky_api_call is wrapped. When you call it, the wrapper tries up to 3 times. If all fail, the last exception gets raised.

student (excited)

So the wrapper is a state machine. Try once. If it fails, try again. If it fails again, try again. If all three fail, give up.

teacher (encouraging)

Yes. And the original function is never called directly — it's always called from inside the wrapper.

student (thinking)

And the last_exception variable in the outer scope... the wrapper uses it?

teacher (focused)

Careful — look again at where it's defined. last_exception is local to the wrapper: it's created fresh at the top of every call. The actual closure variables here are func and max_attempts, which the wrapper captures from the enclosing decorator and retry scopes. When you raise last_exception at the end, you're raising the exception from the last failed attempt of that particular call.

student (curious)

What if all attempts succeed on the first try? Does it just return?

teacher (neutral)

Yes. If the first call succeeds, it prints "Success on attempt 1" and the return statement exits the wrapper immediately — the loop never gets a second iteration.
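
You can watch this deterministically. A sketch with a hypothetical helper that fails exactly twice, then succeeds:

```python
def retry(max_attempts=3):
    def decorator(func):
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(max_attempts):
                try:
                    result = func(*args, **kwargs)
                    print(f"Success on attempt {attempt + 1}")
                    return result
                except Exception as e:
                    last_exception = e
                    print(f"Attempt {attempt + 1} failed: {e}")
            raise last_exception
        return wrapper
    return decorator

calls = {"count": 0}

@retry(max_attempts=3)
def fails_twice():
    # Hypothetical helper: raises on the first two calls, succeeds on the third
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError(f"failure #{calls['count']}")
    return "success"

print(fails_twice())  # two "failed" lines, then "Success on attempt 3", then "success"
```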

student (focused)

Okay so I understand retry now. What's timer?

teacher (focused)

Timer measures how long a function takes:

import time

def timer(label=None):
    def decorator(func):
        def wrapper(*args, **kwargs):
            start = time.time()
            result = func(*args, **kwargs)
            elapsed = time.time() - start
            if label:
                print(f"[{label}] {elapsed:.2f}s")
            else:
                print(f"{func.__name__}: {elapsed:.2f}s")
            return result
        return wrapper
    return decorator

@timer(label="database_query")
def fetch_orders():
    time.sleep(1)  # Simulate slow query
    return []

When you call fetch_orders(), the wrapper times the execution and prints how long it took.

student (amused)

So it's like a stopwatch decorator.

teacher (amused)

Exactly. You're measuring function performance without changing the function's code.

student (thinking)

And the label is optional. If you don't provide one, it uses the function name.

teacher (encouraging)

Right. That's why the outer function takes label=None. If you don't care about a label, you can write @timer() and it works.
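
Here's that default in action — a sketch with a hypothetical quick_task function, decorated with empty parentheses:

```python
import time

def timer(label=None):
    def decorator(func):
        def wrapper(*args, **kwargs):
            start = time.time()
            result = func(*args, **kwargs)
            elapsed = time.time() - start
            if label:
                print(f"[{label}] {elapsed:.2f}s")
            else:
                print(f"{func.__name__}: {elapsed:.2f}s")
            return result
        return wrapper
    return decorator

@timer()  # parentheses with no arguments: the factory runs with label=None
def quick_task():
    return "done"

print(quick_task())  # prints "quick_task: ..." (the function name), then "done"
```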

student (focused)

What about cache?

teacher (focused)

Cache stores the result of a function so you don't have to call it again with the same arguments.

def cache(func):
    store = {}  # Dictionary to hold cached results
    
    def wrapper(*args, **kwargs):
        # Create a key from the arguments
        key = (args, tuple(sorted(kwargs.items())))
        
        if key in store:
            print(f"Cache hit: {func.__name__}")
            return store[key]
        
        print(f"Cache miss: computing {func.__name__}")
        result = func(*args, **kwargs)
        store[key] = result
        return result
    
    return wrapper

@cache
def expensive_calculation(n):
    print(f"Really computing for {n}")
    return n * n

print(expensive_calculation(5))  # Cache miss, computes
print(expensive_calculation(5))  # Cache hit, returns stored result
print(expensive_calculation(6))  # Cache miss, new argument

student (thinking)

So the wrapper checks if the arguments have been seen before. If yes, return the stored result. If no, call the function and store the result.

teacher (neutral)

Exactly. The store dict persists across calls because it's in the wrapper's closure. Every time wrapper is called, it checks the same store.
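
A quick way to see that persistence — a sketch with a hypothetical counter tracking how often the real function body runs:

```python
def cache(func):
    store = {}  # lives in cache's scope; the wrapper closes over it

    def wrapper(*args, **kwargs):
        key = (args, tuple(sorted(kwargs.items())))
        if key in store:
            return store[key]
        result = func(*args, **kwargs)
        store[key] = result
        return result

    return wrapper

calls = {"count": 0}

@cache
def square(n):
    calls["count"] += 1  # counts real computations, not cache hits
    return n * n

square(5)
square(5)
square(5)
print(calls["count"])  # 1 — the body ran once; the other two calls hit the cache
```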

student (confused)

Wait, cache doesn't take arguments like retry and timer do. Why?

teacher (focused)

Because cache doesn't need customization. It's simple: store results, check before computing. Retry needs max_attempts. Timer needs an optional label. Cache doesn't need any configuration.

student (thinking)

But I could write a cache factory if I wanted to, like, clear the cache after a certain time?

teacher (excited)

Yes! That's a good idea. You could write:

import time

def cache(ttl=None):
    def decorator(func):
        store = {}        # per-function cache, captured by the wrapper's closure
        timestamps = {}   # when each entry was stored

        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            now = time.time()

            if key in store and (ttl is None or now - timestamps[key] < ttl):
                return store[key]

            result = func(*args, **kwargs)
            store[key] = result
            timestamps[key] = now
            return result
        return wrapper
    return decorator

@cache(ttl=60)  # Cache for 60 seconds
def expensive_calculation(n):
    return n * n

Now cache is a factory too. It takes ttl (time to live) and returns a decorator.

student (focused)

So the pattern is: if the decorator needs configuration, make it a factory. If it doesn't, just make it a function that takes the function.

teacher (encouraging)

That's the rule.

student (curious)

What's the simplest example of this? Like, the bare minimum three-level thing?

teacher (neutral)

Here:

def decorator_factory(config):
    def decorator(func):
        def wrapper(*args, **kwargs):
            print(f"Config: {config}")
            return func(*args, **kwargs)
        return wrapper
    return decorator

@decorator_factory(config="hello")
def greet(name):
    return f"Hello {name}"

Three levels: factory takes config, returns decorator. Decorator takes func, returns wrapper. Wrapper calls func.

student (thinking)

And the syntax @decorator_factory(config="hello") calls the factory first, gets the decorator, applies the decorator to greet.

teacher (focused)

Right. By the time Python assigns greet, it's already been through both the factory and the decorator.

student (excited)

So when I see @retry(max_attempts=3) in the codebase, I know: retry is a factory. It takes max_attempts and returns a decorator. The decorator takes a function and returns a wrapper that retries up to max_attempts times.

teacher (proud)

And the wrapper is called every time you invoke the decorated function.

student (focused)

What happens if I use retry without arguments? Like @retry?

teacher (serious)

It breaks, just in a sneaky way. @retry passes the function itself as max_attempts, so retry returns its inner decorator, and the name flaky_api_call gets bound to that decorator instead of a wrapper. Calling it then hands you another function rather than running your code. With a factory, you have to write the parentheses: @retry(max_attempts=3).

student (thinking)

Unless you make max_attempts optional and provide a default?

teacher (excited)

Yes! That's why I showed def retry(max_attempts=3). Now @retry() works with the default, and @retry(max_attempts=5) customizes it.
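
Both spellings side by side — a sketch with hypothetical function names:

```python
def retry(max_attempts=3):
    def decorator(func):
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    last_exception = e
            raise last_exception
        return wrapper
    return decorator

@retry()                 # parentheses required: factory runs with the default of 3
def stable_call():
    return "ok"

@retry(max_attempts=5)   # or customize the number of attempts
def other_call():
    return "ok"

print(stable_call())  # "ok"
```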

student (focused)

Okay I think I get it. The three levels are confusing at first but once you see the pattern — factory, decorator, wrapper — it clicks.

teacher (encouraging)

And tomorrow you'll see functools.wraps and functools.lru_cache, which are built-in versions of these patterns. You're building them by hand today so you understand how they work.

student (excited)

So functools.lru_cache is basically the cache decorator I just learned?

teacher (serious)

Functionally, yes. But it's more robust — it handles corner cases, it's optimized, it's production-ready. Today you learned the core idea. Tomorrow we see how the standard library does it.

student (proud)

I'm going to build the retry decorator for our order processor. We have flaky API calls and we're catching them in a try-except every time. A @retry decorator would be cleaner.

teacher (amused)

Amir's going to ask you to explain it. This time you'll actually be able to.