Using functools for Function Caching

Function caching is a technique that stores the results of expensive function calls and reuses the cached result when the same inputs occur again. This can significantly improve performance, especially for functions with costly computations or repeated calls.

Python’s functools module provides the lru_cache decorator, which enables Least Recently Used (LRU) caching for function calls. By using this decorator, the result of a function is cached based on its arguments, and if the same arguments are encountered again, the cached result is returned instead of recalculating it.

Code Example

import functools

# Cache up to four of the most recently used results, keyed by the arguments.
@functools.lru_cache(maxsize=4)
def fibonacci(n: int) -> int:
    if n < 2:  # base cases: fibonacci(0) == 0, fibonacci(1) == 1
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

# Example usage:
print(fibonacci(10))  # Outputs: 55
print(fibonacci.cache_info())  # Displays hit/miss statistics and cache size

Code Explanation

In this example, the fibonacci function calculates the Fibonacci number at position n. The @functools.lru_cache(maxsize=4) decorator caches the function's results for up to four distinct argument values; the cache size is controlled by the maxsize parameter. When the cache is full, the least recently used entry is discarded to make room for a new one.
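To make the eviction behavior visible, here is a minimal sketch, assuming a deliberately tiny maxsize of 2 and a hypothetical square function (neither comes from the example above):

import functools

@functools.lru_cache(maxsize=2)
def square(n: int) -> int:
    return n * n

square(1)  # miss; the cache now holds the result for 1
square(2)  # miss; the cache now holds 1 and 2
square(3)  # miss; the cache is full, so 1 (least recently used) is evicted
square(1)  # miss again, because 1 was just evicted
print(square.cache_info())  # CacheInfo(hits=0, misses=4, maxsize=2, currsize=2)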

The fibonacci function is recursive and solves the same sub-problems over and over. For example, calculating fibonacci(10) involves repeated calls to fibonacci(9), fibonacci(8), and so on. Without caching, these repeated calls lead to redundant computations and exponential growth in the number of calls.
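Counting invocations makes this concrete. In the sketch below, fib_uncached and the calls counter are hypothetical helpers, not part of functools:

calls = 0

def fib_uncached(n: int) -> int:
    global calls
    calls += 1  # count every invocation of the uncached version
    if n < 2:
        return n
    return fib_uncached(n - 1) + fib_uncached(n - 2)

fib_uncached(10)
print(calls)  # 177: the same sub-problems are recomputed again and again

By contrast, the cached fibonacci above executes its body only once per distinct argument, which is 11 times for fibonacci(10).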

With the LRU cache enabled, a Fibonacci number that has already been computed is returned directly from the cache. For instance, once fibonacci(10) has been calculated, any further call to fibonacci(10) returns the cached value without recursing at all.
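A quick check, reusing the fibonacci function from the example (and assuming fibonacci(10) has already been called once), confirms that a repeated call costs a single cache hit:

before = fibonacci.cache_info().hits
fibonacci(10)          # already cached, so no recursion takes place
after = fibonacci.cache_info().hits
print(after - before)  # 1: the repeated call was served entirely from the cache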

Finally, fibonacci.cache_info() can be called to display cache statistics: the number of cache hits, the number of misses, the maximum cache size, and the current cache size. This shows how often cached results are being reused, providing insight into the performance benefits of caching.
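cache_info() returns these statistics as a named tuple, and the decorator also provides cache_clear() to empty the cache and reset the counters. On a fresh cache, a single fibonacci(10) call with maxsize=4 should produce output along these lines:

fibonacci.cache_clear()  # empty the cache and reset the statistics
fibonacci(10)
print(fibonacci.cache_info())
# CacheInfo(hits=8, misses=11, maxsize=4, currsize=4)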