Python Decorators Are Very Useful, But When Should We Use Them?

Author: Murphy  |  Views: 26069  |  Time: 2025-03-22 21:43:10

As one of the most distinctive features of Python, the decorator might not be easy to understand at first. However, it provides great convenience in development, whether you are a Data Scientist, Data Engineer or Web Developer.

A few years ago, I wrote a tutorial article about the Python decorator. It is highly recommended if you want to learn the basics first.

The Simplest Tutorial for Python Decorator

In this article, I will summarise five different usage scenarios of the Python decorator. Hopefully, it will give you ideas about when and why we should use decorators.

1. Quick Start – Function Decorator

Let's start with the most basic usage pattern of the Python decorator: the function decorator.

Suppose we are developing an application for data analytics, and we want to know how long certain data processing functions take to run. Please see the following simulated function for data processing.

import time

def process_data(data):
    time.sleep(2)  # Simulate the elapsed time for data processing
    print(f"Done, data is processed.")

data = [1, 2, 3]  # Simple dummy data

For demonstration purposes, we can create a simple list as the dummy data. Also, we can skip the real "data process" implementation and use the sleeping function to simulate something happening in the function. This also simulates the "elapsed time" that we want to measure.

Now, let's not worry about the decorator yet. We can simply write the code as follows to measure the performance of the function process_data().

import time

start_time = time.time()
process_data(data)
end_time = time.time()
print(f"The function executed in {end_time - start_time:.3f} seconds")

We first need to import the time module. Then, we take a starting timestamp, run the function, and take an ending timestamp afterwards. The difference between the two timestamps is the elapsed time. I also formatted it to 3 decimal places for readability.

Why is a decorator necessary here?

The above example works great. However, what if we have multiple functions like process_data() to be measured, in multiple places? Should we write the start-time and end-time code everywhere in our application?

With a Python decorator, we don't have to repeat ourselves. To write a decorator that achieves exactly the same result as above, we simply put the code into a nested function as follows.

def performance_metrics(func):

    def wrapper(*args):
        start_time = time.time()
        result = func(*args)  # Call the original function
        end_time = time.time()
        print(f"{func.__name__} executed in {end_time - start_time:.3f} seconds")
        return result  # Preserve the original return value

    return wrapper

In this nested function, the parameter func refers to the function that the decorator will decorate. In our case, that means the process_data() function.

Then, in the wrapper() function, we define what the decorator is supposed to do. Basically, the code is exactly the same as the scripts we have done in the previous example, except that we have to use the func to call the original function.

In the wrapper(*args) function, we collect all the arguments passed to process_data(data) and pass them on to func, so the original function is executed with its original arguments. If the *args magic is unfamiliar to you, please check out the article below.

Do You Really Know *args In Python?
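As a side note, a wrapper written with both *args and **kwargs also forwards keyword arguments and returns the function's result. Here is a standalone sketch of that idea; the decorator name timed and the function summarise are made up for illustration.

```python
import time

def timed(func):
    # Like performance_metrics, but forwards keyword arguments too
    # and returns whatever the original function returns.
    def wrapper(*args, **kwargs):
        start_time = time.time()
        result = func(*args, **kwargs)
        end_time = time.time()
        print(f"{func.__name__} executed in {end_time - start_time:.3f} seconds")
        return result
    return wrapper

@timed
def summarise(data, scale=1):
    return sum(data) * scale

total = summarise([1, 2, 3], scale=10)
print(total)  # 60
```

Forwarding **kwargs costs nothing when a function takes only positional arguments, so it is a safe default for general-purpose decorators.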

Now the decorator called performance_metrics is ready to use. The code below shows how to use it.

@performance_metrics
def process_data(data):
    time.sleep(2)
    print(f"Done, data is processed.")

process_data(data)

Great! So, whether it is the process_data() function or anything else, we can put this decorator on it, and it will measure the elapsed time of the execution and print it out.

2. Method Decorator

Now, let's have a look at the second usage type of the Python decorator. Nothing magic here: after showing that we can use a decorator on a Python function, it is natural to ask whether we can use it on a class method. The answer is yes.

Suppose we write a method inside a class and still want to measure its performance. To achieve that, we can add the decorator directly to the class method. And don't worry about the namespace; this is the correct way to do it.

class DataProcessor:
    @performance_metrics
    def process_data(self, data):
        time.sleep(2)
        print(f"Done, data is processed.")

# Test code
processor = DataProcessor()
processor.process_data(data)
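This works because the bound instance (self) travels through *args just like any other positional argument. A quick standalone sketch illustrates this; the trace decorator and Greeter class are made up for the demonstration.

```python
def trace(func):
    def wrapper(*args):
        # For a method call, args[0] is the instance (self)
        print(f"{func.__name__} called with {len(args)} positional args")
        return func(*args)
    return wrapper

class Greeter:
    @trace
    def greet(self, name):
        return f"hello {name}"

g = Greeter()
print(g.greet("ada"))  # reports 2 positional args: the instance and "ada"
```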

3. Class Decorator

Now, what if we want to use the performance_metrics decorator on all of the methods in a class? That is surely doable. Of course, I'm not going to suggest adding the decorator to the methods one by one. That is an option with some flexibility, but what I want to show you here is the class decorator.

We can define a class-level decorator as follows.

def performance_for_all_methods(cls):
    # Wrap every callable defined in the class with performance_metrics
    for name, method in cls.__dict__.items():
        if callable(method):
            setattr(cls, name, performance_metrics(method))
    return cls

Please note that we are reusing the performance_metrics function decorator inside this class decorator, which is called performance_for_all_methods. The parameter cls stands for the class it decorates.

In Python, we can get all the definitions of a class, such as its attributes and methods, from cls.__dict__. This concept is similar to "reflection" in other programming languages such as Java. From this dictionary, we get the name of each member and the member itself. Then, we need to check whether the member is a method, since it could also be an attribute or something else. All we need to do is check whether it is callable by running callable(method).

Then, we can use the setattr() function to programmatically replace every method in the class with its performance_metrics-decorated version.
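To see what cls.__dict__ exposes and how the callable() check filters it, here is a quick standalone check; the Sample class is made up for illustration.

```python
class Sample:
    x = 42  # a plain attribute, not callable

    def greet(self):
        return "hi"

# The class dictionary maps member names to the members themselves.
# Plain attributes like x fail the callable() check; methods pass it.
for name, member in Sample.__dict__.items():
    if callable(member):
        print(f"{name} is callable")
```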

Now, let's define a sample class with dummy methods and the decorator added to it.

@performance_for_all_methods
class DataAnalysis:
    def clean_data(self, data):
        time.sleep(1)
        print(f"Done, data is cleaned.")

    def analyze_data(self, data):
        time.sleep(3)
        print(f"Done, data is analyzed.")

# Testing the decorated class
analysis = DataAnalysis()
analysis.clean_data(data)
analysis.analyze_data(data)

Works great!

Please note that this demo is meant to show that we can do lots of customisable things in class-level decorators, so don't be limited to this example's use case. Try to explore more use cases yourself!

4. Stacked Decorators

Did you know that we can add multiple decorators to a single function? I like to call this "stacked decorators" and hope that makes sense.

Suppose that we also want to add a decorator that logs errors if they happen. For demonstration purposes, I'll log to stdout only rather than writing to a file.

import logging

# Setup basic configuration for logging
logging.basicConfig(level=logging.DEBUG, format='%(asctime)s - %(levelname)s - %(message)s')

# The new decorator for logging
def log_exceptions(func):
    def wrapper(*args):
        try:
            return func(*args)  # Preserve the original return value
        except Exception as e:
            logging.exception(f"An error occurred in {func.__name__}: {str(e)}")
            raise  # Re-raise the exception after logging
    return wrapper

In the above code, we defined a new decorator called log_exceptions. In its wrapper() function, we put the execution of the function in a try-except block to catch any possible errors. If there is an error, we use the logging module to log the exception with a certain format. In practice, we would log the error to a file, but let's keep it simple for this demo.

An important highlight is the raise statement that re-raises the error. Without it, the error would be swallowed by the wrapper and the whole process would be considered successful.
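To see the difference the raise makes, consider this standalone sketch; always_fails is a dummy function invented for the demonstration. With the raise in place, the caller can still catch the original error after it has been logged.

```python
import logging

logging.basicConfig(level=logging.DEBUG)

def log_exceptions(func):
    def wrapper(*args):
        try:
            return func(*args)
        except Exception as e:
            logging.exception(f"An error occurred in {func.__name__}: {e}")
            raise  # without this line, the caller would silently get None
    return wrapper

@log_exceptions
def always_fails():
    raise ValueError("boom")

try:
    always_fails()
except ValueError:
    print("the caller still sees the ValueError")
```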

Now, let's add one more simulation to the process_data() function: a ValueError will be raised if the data is empty.

@performance_metrics
@log_exceptions
def process_data(data):
    # Simulate a condition that could raise an error
    if not data:
        raise ValueError("No data provided")
    # Simulate data processing delay
    time.sleep(2)
    print(f"Data processed with {len(data)} records")

Look! We've added both the performance_metrics decorator and the log_exceptions decorator. Inside the function, we added the additional simulation for the no-data scenario.

Let's first test it with valid data by running process_data(data).

It works! The results remain unchanged from the previous example. However, if we intentionally pass empty data to the function, the second decorator, log_exceptions, will show its impact.

process_data([])

Please note that the traceback in the output is exactly the logging output from log_exceptions, printed before the re-raised ValueError propagates.
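One more thing worth knowing about stacked decorators is that order matters: the decorator closest to the def is applied first, and the outermost one wraps everything. A minimal sketch with two decorators invented for this illustration:

```python
def shout(func):
    def wrapper(*args):
        return func(*args).upper()
    return wrapper

def exclaim(func):
    def wrapper(*args):
        return func(*args) + "!"
    return wrapper

@shout
@exclaim
def greet(name):
    return f"hello {name}"

# Equivalent to greet = shout(exclaim(greet)):
# exclaim appends "!" first, then shout uppercases the result.
print(greet("ada"))  # HELLO ADA!
```

This is why performance_metrics, placed on top, measures the total time including the work done by log_exceptions underneath it.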

5. Parameterised Decorator

Now, suppose we do not want to throw an error when the data is empty, but just give a warning. Furthermore, we want to control this warning behaviour using a parameter of the decorator. That is, if the feature is enabled, the decorator will detect empty data and give a warning. If it is disabled, no warning is shown.

Let's see how we can define such a decorator.

import time
import logging

logging.basicConfig(level=logging.DEBUG, format='%(asctime)s - %(levelname)s - %(message)s')

def performance_metrics(warn_on_empty=True):
    def decorator(func):
        def wrapper(*args):
            # Warn (but still proceed) when the first argument is empty
            if warn_on_empty and args and not args[0]:
                logging.warning(f"No data provided to {func.__name__}.")
            start_time = time.time()
            result = func(*args)
            end_time = time.time()
            elapsed_time = end_time - start_time
            print(f"{func.__name__} executed in {elapsed_time:.3f} seconds")
            return result
        return wrapper
    return decorator

Please don't be scared by the 3-level nested function. If you look at the two inner functions, decorator(func) and wrapper(*args), they are essentially the original performance_metrics decorator that we defined above. Of course, to meet the new requirement, we have to add the following feature to the wrapper(*args) function.

That is, an if condition that checks whether the boolean parameter warn_on_empty is true (meaning the warning feature is enabled) and whether the first argument (in our case, the data variable) is empty.

We can go back to using our original process_data(data) function. Also, don't forget to add our new decorator with the parameter, and enable it.

@performance_metrics(warn_on_empty=True)
def process_data(data):
    time.sleep(2)
    print(f"Done, data is processed.")

Let's run the function with valid data first.

Nothing different from the previous behaviour. The decorator will only output the performance metrics.

Now, let's try to pass an empty list to the function.

The warning message is displayed because the data is empty. That's exactly what we want.

If you want, you can also test disabling the feature by setting warn_on_empty to False.

@performance_metrics(warn_on_empty=False)
def process_data(data):
    time.sleep(2)
    print(f"Done, data is processed.")

# Test code
process_data([])

Great, the parameter works as expected. If you understand this, feel free to do some homework and try a parameterised decorator for other purposes. For example, add a timeout parameter that stops waiting if the data processing runs over a certain number of seconds.
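As a starting point for that homework, here is one possible sketch using concurrent.futures; the decorator name with_timeout, its parameter, and slow_process are all my own inventions. Note that Python cannot forcibly kill a running thread, so this approach only stops waiting for the result rather than terminating the work.

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

def with_timeout(timeout_seconds=5):
    # Hypothetical parameterised decorator: run the function in a worker
    # thread and give up waiting after timeout_seconds.
    def decorator(func):
        def wrapper(*args, **kwargs):
            with ThreadPoolExecutor(max_workers=1) as executor:
                future = executor.submit(func, *args, **kwargs)
                try:
                    return future.result(timeout=timeout_seconds)
                except FutureTimeout:
                    print(f"{func.__name__} exceeded {timeout_seconds} seconds")
                    raise
        return wrapper
    return decorator

@with_timeout(timeout_seconds=1)
def slow_process(data):
    time.sleep(0.1)  # well within the limit
    return len(data)

print(slow_process([1, 2, 3]))  # 3
```

One caveat: on timeout, leaving the with block still waits for the worker thread to finish, because ThreadPoolExecutor's shutdown joins its threads.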

Summary

In this article, I have introduced five common and typical usage patterns of the Python decorator, from the basic function decorator to stacked and parameterised decorators. I hope this provides ideas about when we can, or should, use decorators.

Of course, I'm not persuading you to use decorators everywhere. It is your own judgement when to use them to reduce complexity and keep the code clean and tidy, and when they may add overhead or reduce readability. Happy coding!
