

Python Programming
This article is an illustration of techniques in advanced Python coding and an exploration of decorators. A decorator in Python is a wrapper for a function or method that adds extra functionality without modifying the function or method's original code. Decorators can be plain, chained, or parameterised to make a decorator factory (a generator of decorators). They are applied with "@name", where "name" is the name of the decorator. Think of it like this:
You have a plain coffee (a function), and you want to add whipped cream or caramel drizzle without changing the core drink. You wrap the coffee with an object that contains the addons. That’s a decorator.
Here's a basic example. Here's a simple decorator that logs whenever a function is called:
def logger(func):
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@logger
def greet(name):
    print(f"Hello, {name}!")

greet("Fred")
Running this program without the "@logger" line will generate the following output:
Hello, Fred!
Running the program with the "@logger" line included will generate the following output:
Calling greet
Hello, Fred!
The "@logger" expression is a decorator invocation. It is syntactic sugar for:
logger(greet)("Fred!")
The function "logger()" take the original function "greet" and returns a new function "wrapper" that adds behaviour before and after the original call.
Now, in this case we have simply sent the output to the console, so it is a bit silly, but if we had instead written it to a file or to stderr it becomes clear that this is a potentially useful trick for debugging:
import sys

def logger(func):
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}", file=sys.stderr)
        return func(*args, **kwargs)
    return wrapper

@logger
def greet(name):
    print(f"Hello, {name}!")

greet("Fred")
Now the greeting goes to the console and the log notifications go to stderr.
As a slight aside, if you were actually doing this for real, you might want to define a reusable eprint() function like this (so you don't have to keep setting the file argument):
import sys

def eprint(*args, **kwargs):
    print(*args, file=sys.stderr, **kwargs)

def logger(func):
    def wrapper(*args, **kwargs):
        eprint(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper
Although we call the wrapper function "wrapper" in these examples, this is just a convention, not a keyword. The wrapper function can be called anything you like:
def shout_factory(times):            # This is your decorator factory
    def decorate_me(func):           # This is the actual decorator
        def boom(*args, **kwargs):   # This is the wrapper function
            for _ in range(times):
                print("HEH!", end=" ")
            return func(*args, **kwargs)
        return boom
    return decorate_me

@shout_factory(3)
def greet(name):
    print(f"Hello, {name}!")

greet("Jonathan")  # Outputs: HEH! HEH! HEH! Hello, Jonathan!
We will go into decorator factories later, but for now I just want you to note that the decorator factory, decorator and wrapper can have any name you like. That being said, using clear names like decorator and wrapper can make your code easier to read and understand, especially if you revisit it later or share it with someone else.
The standard use-cases for decoration include logging, timing, caching (memoization), access control, and modifying or validating arguments, and we will see examples of most of these below.
Python comes with a number of built-in decorators which you may have used: @staticmethod, @classmethod and @property, together with @lru_cache from the functools module.
We will consider each of these in turn.
@staticmethod defines a method that doesn't take self or cls as the first argument. It is typically used when the method's logic is related to the class but doesn't need instance or class access, i.e. it is available regardless of whether a class instance exists.
For example:
class Math:
    @staticmethod
    def add(x, y):
        return x + y

Math.add(3, 4)  # ➜ 7
Note that in this case it looks like a module-level access. We don't actually need to create an instance of the Math class to be able to use "add". It essentially creates a utility function held inside a class definition for organisational purposes.
@classmethod defines a method that takes "cls" (but not "self") as its first argument, giving access to the class itself. It is used for class factory methods or anything that needs to read or modify shared class-level state.
For example:
class Book:
    default_title = "Untitled"

    def __init__(self, title):
        self.title = title

    @classmethod
    def create_default(cls):
        return cls(cls.default_title)

myBook = Book.create_default()
Here we have defined a classmethod that is an alternative constructor, so we definitely want it to be available before any instance of the class is instantiated. The standard constructor (defined as "__init__()") takes a title as an argument, while this method invokes the standard constructor using default_title as the argument, so the title defaults to the class default. Obviously, there are also other ways to achieve this exact behaviour.
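One such alternative, sketched here, is simply to give the title parameter a default value in __init__ (the classmethod version above arguably reads more explicitly):

class Book:
    default_title = "Untitled"

    def __init__(self, title=None):
        # Fall back to the class-level default when no title is supplied
        self.title = title if title is not None else Book.default_title

myBook = Book()  # title defaults to "Untitled"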
@property defines getter behaviour so an instance method can be read as though it were an attribute. Essentially cosmetic, it provides a cleaner interface for read-only behaviours.
from math import pi

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @property
    def area(self):
        return pi * self.radius ** 2

c = Circle(3)
print(c.area)  # ➜ 28.27..., accessed like an attribute!
It makes a method "feel" like a read-only attribute while retaining the benefits of the functional model.
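Because we have defined only a getter, assigning to the property raises an AttributeError, which is what makes it effectively read-only. A quick sketch reusing the Circle class above:

c = Circle(3)
print(c.area)        # 28.27...
try:
    c.area = 100     # no setter was defined, so this fails
except AttributeError as e:
    print(f"Cannot set area: {e}")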
Part of the functools library, the lru_cache decorator adds memoization to a function. We will look at memoization and lru_cache in a little more detail later in this article, but the idea behind memoization is to capture or cache the result of executing a function so that if it is called again with the same arguments the cached value can be returned instead of being recalculated, thus improving execution speed.
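As a quick taste of what that looks like (the full recursive example comes later), here is a minimal sketch using lru_cache and its cache_info() report:

from functools import lru_cache

@lru_cache(maxsize=None)
def square(n):
    print(f"computing {n} squared...")
    return n * n

square(4)                    # computed, prints the "computing" message
square(4)                    # served from the cache, no message this time
print(square.cache_info())   # CacheInfo(hits=1, misses=1, maxsize=None, currsize=1)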
Decorators can be stacked to compose behaviours, just like function chaining or pipe()-style transformations in Pandas. Stacked decorators are when you apply multiple decorators to a single function; they are applied from the bottom up but executed top-down, like layers of an onion.
Here’s a simple example with two decorators: one that logs the function call, and another that times how long it takes.
import time
import functools

def logger(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"[LOG] Calling {func.__name__}()")
        return func(*args, **kwargs)
    return wrapper

def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        end = time.time()
        print(f"[TIMER] {func.__name__} took {end - start:.4f}s")
        return result
    return wrapper

# Stacking: applied bottom-up, run top-down
@logger
@timer
def compute():
    time.sleep(1)
    print("Finished computation")

compute()
When run this produces an output similar to:
[LOG] Calling compute()
Finished computation
[TIMER] compute took 1.0002s
Notice that the logger (the outer decorator) prints its message first, then the wrapped function runs, and the timer reports last. If you reverse the order of the decorators, the output changes too.
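Stacking is, again, just nested function application. A minimal sketch of the manual equivalent of the two @ lines above (assuming the logger and timer decorators defined above are in scope):

def compute():
    time.sleep(1)
    print("Finished computation")

compute = logger(timer(compute))  # @timer applied first (innermost), @logger last (outermost)
compute()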
This example shows a few more hidden "gotchas" with decorators that we should explore:
In the example we have included another decorator, "@functools.wraps()", within the decorator definitions for both logger and timer. We will go into what this decorator actually does in a minute, but first let's note that including a decorator inside a decorator effectively nests the decorators.
Stacking decorators is functionally equivalent to nesting them in code. The behaviour of a decorator used in the declaration of another decorator is the same as when it is used outside of a decorator. There is nothing really special about a decorator definition; it is just another function definition that uses closures to hold state. Just as with recursion, there is no theoretical limit to the number of nestings, only a practical one (with recursion you eventually run out of stack space).
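To make that concrete, here is a minimal sketch showing that the @functools.wraps(func) line inside logger is just ordinary decorator application and could equally be written by hand:

import functools

def logger(func):
    def wrapper(*args, **kwargs):
        print(f"[LOG] Calling {func.__name__}()")
        return func(*args, **kwargs)
    wrapper = functools.wraps(func)(wrapper)  # same effect as decorating wrapper with @functools.wraps(func)
    return wrapper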
This topic is kind-of REALLY important and ideally would have been the first topic in the article, but had we done that there wouldn't be the context to understand it.
When you create a decorator that wraps another function, you're actually replacing that original function with your own wrapper. The problem is, without @functools.wraps, the wrapper function steals the identity of the original.
If all you are doing is running the code, you won't notice the difference, but in the editor or debugger you very well might.
The @functools.wraps decorator copies the metadata (like __name__, __doc__, and __module__) from the original func to your wrapper function. This is the pattern:
import functools

def my_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print("Wrapping!")
        return func(*args, **kwargs)
    return wrapper
That one line makes your decorators much friendlier to debugging, documentation, and composability.
Next we will compare what happens with and without the wraps decorator:
Decorator With @wraps:
import functools

def log_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        """Wrapper function that logs calls"""
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_decorator
def say_hello():
    """Greet the user"""
    print("Hello!")

print(say_hello.__name__)  # 'say_hello'
print(say_hello.__doc__)   # 'Greet the user'
With @wraps, all the metadata from say_hello is preserved, even though it's wrapped.
Decorator without @wraps:
def log_decorator(func):
    def wrapper(*args, **kwargs):
        """Wrapper function that logs calls"""
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_decorator
def say_hello():
    """Greet the user"""
    print("Hello!")

print(say_hello.__name__)  # 'wrapper'
print(say_hello.__doc__)   # 'Wrapper function that logs calls'
Without @wraps, Python thinks your original say_hello() is now called wrapper, and its docstring is lost. This messes with debugging, introspection (help() and the inspect module), documentation tools, and composing decorators.
If you're wrapping functions, always use @functools.wraps(func). It's one line of protection that preserves your original function’s identity.
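One useful side effect worth knowing about: @functools.wraps also stores a reference to the original function on the wrapper as __wrapped__. A small sketch, reusing the @wraps version of log_decorator from above:

@log_decorator
def say_hello():
    """Greet the user"""
    print("Hello!")

say_hello()              # goes through the wrapper: "Calling say_hello" then "Hello!"
say_hello.__wrapped__()  # calls the original function directly: just "Hello!"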
Every decorator has an implied argument: the function it is wrapping. When we talk about decorators with or without arguments we are referring to the presence of arguments other than this "invisible" function argument.
When a decorator has arguments, like @lru_cache(maxsize=128), you're actually dealing with a decorator factory. It's a function that returns a decorator, which in turn wraps your function. For example:
Here, my_decorator is a function that takes func and returns a modified version of it:
@my_decorator
def func():
    ...
Now my_decorator(arg1, arg2) runs first, returning a new decorator customized with arg1 and arg2. That returned decorator is then used to wrap func.
@my_decorator(arg1, arg2)
def func():
    ...
For example, this is a decorator that takes one argument and repeats the function call n times when called once:
def repeat(n_times):
    def decorator(func):
        def wrapper(*args, **kwargs):
            for _ in range(n_times):
                func(*args, **kwargs)
        return wrapper
    return decorator

@repeat(3)
def say_hello():
    print("Hello!")

say_hello()
The for-loop inside the wrapper causes the wrapped function to be called n times. Output:
Hello!
Hello!
Hello!
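Note that the wrapper above discards the wrapped function's return value, which is fine for a function that only prints. If the result matters, here is a sketch of a variant that returns the result of the final call:

import functools

def repeat(n_times):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(n_times):
                result = func(*args, **kwargs)
            return result  # result of the last call
        return wrapper
    return decorator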
We have already touched on the lru_cache decorator. It is one of the more useful ones: it memoises the function, caching the output and checking whether the result for a given set of arguments is already in the cache before calling the function again. If it is, the cached output is reused instead. The maxsize argument sets the size of the cache.
from functools import lru_cache

@lru_cache(maxsize=128)  # <-- this returns a caching decorator
def my_fib(n):
    if n <= 1:
        return n
    return my_fib(n - 1) + my_fib(n - 2)

print(f"{my_fib(6)}")
print(f"{my_fib(4)}")
In the example, the first call causes my_fib to be evaluated and cached for every value of n up to 6, so the second call, my_fib(4), simply reuses the cached result.
A couple of notes about lru_cache: the decorated function's arguments must be hashable (they form the cache key), and passing maxsize=None makes the cache unbounded.
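A minimal sketch of the hashability point, along with the cache_info() and cache_clear() helpers that lru_cache attaches to the decorated function:

from functools import lru_cache

@lru_cache(maxsize=128)
def total(numbers):
    return sum(numbers)

total((1, 2, 3))         # fine: a tuple is hashable and becomes the cache key
try:
    total([1, 2, 3])     # a list is not hashable, so it cannot be cached
except TypeError as e:
    print(f"Unhashable arguments: {e}")

print(total.cache_info())  # reports hits, misses and current cache size
total.cache_clear()        # empties the cache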
It's the "()" that indicates the decorator factory rather than the arguments themselves. @mydecorator() is a decorator factory, even if you’re not passing any arguments. It’s the difference between using a decorator and generating one dynamically. A factory returns a decorator that is then used to wrap a function, while a decorator just wraps the function.
For example - a simple decorator:
def mydecorator(func):
    def wrapper(*args, **kwargs):
        print("Running!")
        return func(*args, **kwargs)
    return wrapper

@mydecorator
def greet():
    print("Hello")
Here, mydecorator directly takes the function and returns a modified version. No parentheses are needed because you're not calling it; you're passing the function into it.
Contrast with:
def mydecorator():
    def actual_decorator(func):
        def wrapper(*args, **kwargs):
            print("Running from factory!")
            return func(*args, **kwargs)
        return wrapper
    return actual_decorator

@mydecorator()  # <-- called here, returns the actual decorator
def greet():
    print("Hello")
Here we are calling mydecorator first, expecting it to return an actual decorator which is then used to wrap the function. Even if mydecorator() doesn't take arguments, it still returns a decorator when called, so it's still a factory, just one without configurable behaviour (yet).
We might use a decorator factory without arguments because we plan to add configurable arguments later, or because we want every use of the decorator to look the same (always written with parentheses).
A decorator factory can be designed to be called with or without the (), but the behaviour is different in each case and a little more setup is required in the code. These are hybrid decorators. They function both as decorators and decorator factories.
For example:
import functools

def log_call(_func=None, *, prefix=">>"):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            print(f"{prefix} Calling {func.__name__}")
            return func(*args, **kwargs)
        return wrapper
    # If used without parentheses: @log_call
    if callable(_func):
        return decorator(_func)
    # If used with parentheses: @log_call(...)
    return decorator

@log_call
def greet():
    print("Hello!")

@log_call(prefix=">>>")
def farewell():
    print("Goodbye!")
Note that the first invocation has no "()" and applies the decorator immediately, while the second invocation calls the factory. Earlier I noted that all decorator invocations actually have an argument (even if they look like they don't) because the first argument is the function itself when there are no other arguments and no explicit "()". So the key to distinguishing between a decorator invocation and a decorator factory invocation is checking whether the first argument (_func) is callable: if it is, log_call was used bare and _func is the function being decorated; if it is None, the factory was called and must return the decorator.
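A quick usage sketch of the two functions defined above:

greet()       # >> Calling greet
              # Hello!
farewell()    # >>> Calling farewell
              # Goodbye!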
This pattern is used in real-world libraries like pytest.fixture and click.command.
In another article on our website (www.bishopphillips.com) I go through closures. A decorator is essentially a kind of closure (a function that captures and retains state from its enclosing scope between calls). This characteristic can be utilised to create some useful behaviours.
Here is a decorator that tracks how many times a function has been called:
import functools

def count_calls(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        wrapper.call_count += 1
        print(f"[Call #{wrapper.call_count}] {func.__name__} was called")
        return func(*args, **kwargs)
    wrapper.call_count = 0
    return wrapper

@count_calls
def greet(name):
    print(f"Hello, {name}!")

# Usage:
greet("Jonathan")
greet("Ada")
greet("Grace")
print(f"Greet call count: {greet.call_count}")
Which creates the output:
[Call #1] greet was called
Hello, Jonathan!
[Call #2] greet was called
Hello, Ada!
[Call #3] greet was called
Hello, Grace!
Greet call count: 3
The decorator attaches a counter attribute, "wrapper.call_count", directly to the wrapper function object. With each call that counter is incremented, and it can be inspected by accessing the attribute on the function object without invoking it, which is what we do in the final print statement.
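An alternative sketch of the same idea keeps the counter in a closure variable rather than a function attribute; the trade-off is that the count can no longer be inspected from outside the wrapper:

import functools

def count_calls(func):
    call_count = 0  # state held in the enclosing scope (the closure)
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        nonlocal call_count
        call_count += 1
        print(f"[Call #{call_count}] {func.__name__} was called")
        return func(*args, **kwargs)
    return wrapper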
We can create a decorator to hide the original function's identity. For example:
def anonymize(func):
    def wrapper(*args, **kwargs):
        print("A mysterious function is running...")
        return func(*args, **kwargs)
    # Mangle the metadata
    wrapper.__name__ = "anonymous_function"
    wrapper.__doc__ = "This function prefers to remain nameless."
    wrapper.__module__ = "unknown.origin"
    return wrapper

@anonymize
def launch_protocol():
    """Launches the orbital transport sequence."""
    print("Liftoff initiated.")

# Usage
launch_protocol()
print(launch_protocol.__name__)    # anonymous_function
print(launch_protocol.__doc__)     # This function prefers to remain nameless.
print(launch_protocol.__module__)  # unknown.origin
Just like modules and packages in Python, functions carry special attributes like "__name__" and "__doc__". A decorator can be designed to deliberately modify these, as we have done above. This kind of manipulation might occasionally be desirable, for example to deliberately hide implementation details from introspection or logging.
So far, we have demonstrated decorators for logging calls, timing execution, repeating calls, caching results, counting calls, and rewriting a function's metadata.
As the final part of this article I want to float some additional use cases for applying decorators.
Assuming we design our code to rigid patterns that include certain standard keyword parameters like "debug", which can be turned on or off to create additional behaviours, we can use decorators to set this behaviour project-wide. You would probably want to create a package or module with such decorators and import them into your Python module. An argument modifier would intercept the keyword arguments passed to the function, inject or override a value (such as debug=True), and then call the original function with the modified arguments.
And would look something like this:
def force_debug(func):
    def wrapper(*args, **kwargs):
        kwargs["debug"] = True
        return func(*args, **kwargs)
    return wrapper
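A usage sketch, assuming a hypothetical function that accepts a debug keyword argument:

@force_debug
def connect(host, debug=False):
    if debug:
        print(f"[DEBUG] connecting to {host}")
    print(f"Connected to {host}")

connect("example.com")  # debug output appears even though debug=True was never passed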
In the same spirit, a retry decorator factory could re-run a flaky operation a few times before giving up:

@retry(retries=3, delay=2)
def upload_file(...):
    ...

An access-control decorator can guard a function behind a permission check:

def require_admin(func):
    def wrapper(*args, **kwargs):
        if not current_user.is_admin:
            raise PermissionError("Not authorized")
        return func(*args, **kwargs)
    return wrapper

And decorator factories can package up policies such as time-limited caching, request tracing and rate limiting:

@time_cache(ttl=60)  # custom TTL-based caching
def get_config_from_disk(): ...

@trace_request("MCP_SUBSYSTEM")
def handle_message(msg): ...

@rate_limit(calls=10, per=60)
def check_status(): ...
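None of these last few decorators are built in; they are the kind of factory you would write yourself (or import from a library). As an illustration, a deliberately simplistic sketch of a TTL-based time_cache for zero-argument functions might look like this:

import time
import functools

def time_cache(ttl):
    def decorator(func):
        cached = {"value": None, "expires": 0.0}
        @functools.wraps(func)
        def wrapper():
            now = time.time()
            if now >= cached["expires"]:
                cached["value"] = func()       # refresh the cached value
                cached["expires"] = now + ttl  # remember when it goes stale
            return cached["value"]
        return wrapper
    return decorator

@time_cache(ttl=60)
def get_config_from_disk():
    print("Reading config from disk...")
    return {"retries": 3}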