Atul Kushwaha


Cache in Python!

Exploring Python's functools: Cache, Cached Property, and LRU Cache

Python's functools module is a treasure trove for functional programming enthusiasts, offering tools to make your code more efficient and elegant. In this post, we'll dive into three powerful functions—cache, cached_property, and lru_cache—that help optimize performance by storing results of expensive computations. Whether you're speeding up recursive algorithms or simplifying class-based calculations, these tools have you covered. Let’s explore each one with clear explanations and practical examples.

1. cache: Simple, Unbounded Memoization

The cache decorator is a lightweight way to memoize function results, storing them for reuse when the same inputs occur again. It’s like a sticky note for your function’s outputs—no recomputation needed!

How It Works

  • What it does: Stores function results in an unbounded dictionary, using arguments as keys.
  • When to use it: Ideal for pure functions (same input, same output) that are computationally expensive.
  • Key feature: It’s equivalent to lru_cache(maxsize=None), and because it never needs to evict entries it can be slightly faster than a bounded lru_cache.

Example

from functools import cache

@cache
def factorial(n):
    return n * factorial(n-1) if n else 1

print(factorial(10))  # Computes: 3628800
print(factorial(10))  # Returns cached result, no recomputation

Why It’s Awesome

  • Speed: Avoids redundant calculations, making recursive functions like factorial lightning-fast.
  • Simplicity: No configuration needed—just slap on the decorator.
  • Caveat: The cache grows indefinitely, so monitor memory usage for functions with many unique inputs.
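Because cache is built on the same machinery as lru_cache, the wrapped function also exposes cache_info() and cache_clear(), which you can use to keep an eye on that unbounded growth. A quick sketch (square is just a stand-in for an expensive pure function):

```python
from functools import cache

@cache
def square(n):
    # Stand-in for an expensive pure computation.
    return n * n

for i in range(100):
    square(i)

print(square.cache_info())  # misses=100, currsize=100: one entry per unique input
square.cache_clear()        # Drop all entries to reclaim memory
print(square.cache_info())  # currsize=0
```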

2. cached_property: One-Time Property Computation

The cached_property decorator transforms a class method into a property that computes its value once and caches it for the instance. Think of it as a lazy-loaded attribute that sticks around.

How It Works

  • What it does: Runs the method the first time it’s accessed, caches the result as an instance attribute, and returns the cached value on subsequent calls.
  • When to use it: Perfect for expensive calculations in classes that don’t change after the first computation.
  • Key feature: Works only with instance methods (requires self).

Example

from functools import cached_property

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @cached_property
    def area(self):
        print("Computing area")
        return 3.14159 * self.radius ** 2

c = Circle(5)
print(c.area)  # Prints: Computing area, then 78.53975
print(c.area)  # Prints: 78.53975 (cached, no recomputation)

Why It’s Awesome

  • Efficiency: Computes only once per instance, saving CPU cycles.
  • Clean code: Reads like a property (c.area vs. c.area()), blending seamlessly into class design.
  • Caveat: The cached value can be overridden (e.g., c.area = 0), so use it for immutable data.
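If the underlying data does change, you can invalidate the cached value with del, which removes the instance attribute so the next access recomputes. A small sketch, reusing the Circle class from above (reproduced here so the snippet runs on its own):

```python
from functools import cached_property

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @cached_property
    def area(self):
        return 3.14159 * self.radius ** 2

c = Circle(5)
print(c.area)   # 78.53975
c.radius = 10   # Changing the input does NOT refresh the cache...
print(c.area)   # still 78.53975 (stale cached value)
del c.area      # ...but deleting the attribute invalidates it
print(c.area)   # 314.159 (recomputed with the new radius)
```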

3. lru_cache: Flexible, Bounded Memoization

The lru_cache decorator is the heavy hitter of memoization, offering a Least Recently Used (LRU) cache with configurable size. It’s thread-safe and packed with introspection features, making it a go-to for optimizing complex functions.

How It Works

  • What it does: Caches up to maxsize results, evicting the least recently used entry when full. Supports a typed option to treat different types (e.g., 3 vs. 3.0) as distinct keys.
  • When to use it: Great for recursive algorithms, dynamic programming, or any function with repetitive calls.
  • Key feature: Provides cache_info() to inspect hits, misses, and cache size.

Example

from functools import lru_cache

@lru_cache(maxsize=32)
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

print(fib(10))  # Computes: 55
print(fib.cache_info())  # Shows: CacheInfo(hits=8, misses=11, maxsize=32, currsize=11)

Why It’s Awesome

  • Control: Set maxsize to balance memory and performance (use None for unbounded, like cache).
  • Thread-safe: The cache’s internal bookkeeping stays consistent across threads (though under contention a value may occasionally be computed more than once).
  • Debugging: cache_info() helps you tune performance by revealing cache effectiveness.
  • Caveat: Avoid using with functions that have side effects, as the cache assumes deterministic outputs.
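The typed option mentioned above is easy to see in action: with typed=True, arguments that compare equal but have different types (like 3 and 3.0) are cached under separate keys. A minimal sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=128, typed=True)
def describe(x):
    return f"{type(x).__name__}: {x}"

describe(3)    # miss: cached under the int key
describe(3.0)  # miss: cached separately, because typed=True
describe(3)    # hit: same int key as before
print(describe.cache_info())  # CacheInfo(hits=1, misses=2, maxsize=128, currsize=2)
```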
