Python Profile Memory Usage [In-Depth Guide]

Profiling memory usage in Python is important for identifying memory leaks and optimizing your code’s memory efficiency.

It can be done with a variety of tools and techniques. Here are some common methods to profile memory usage in Python:

1. Memory Profilers:

To profile memory usage in a Python program, you can use the memory_profiler library. This library allows you to monitor memory usage line-by-line in your code. Here’s how you can use it:

  1. Install memory_profiler:

You can install memory_profiler using pip:

pip install memory-profiler
  2. Create a Python script you want to profile, e.g., my_script.py.
  3. Add memory profiling to your script:
# my_script.py

from memory_profiler import profile

@profile
def my_function():
    a = [1] * (10 ** 6)  # Create a large list
    b = [2] * (2 * 10 ** 7)  # Create an even larger list
    del b  # Delete the large list to free memory
    return a

if __name__ == "__main__":
    my_function()

In the code above, we’ve imported the profile decorator from memory_profiler and applied it to the my_function function. This decorator will profile the memory usage of the function when it’s called.

  4. Run the memory profiler:

Open a terminal and run the script using the mprof command-line tool that comes with memory_profiler. mprof records memory usage over time while your script runs and lets you plot or inspect the results afterwards.

To start profiling, run:

mprof run my_script.py


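If you just want the line-by-line table printed straight to the terminal, you can also run the script through the memory_profiler module itself:

python -m memory_profiler my_script.py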

  5. Analyze the results:

After running the script, you can analyze the memory usage profile by running:

mprof plot

This command will generate a plot showing memory usage over time. You can also use other mprof commands to inspect the memory usage data.
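Depending on your memory_profiler version, mprof also includes subcommands for managing the recorded profile data files, for example:

mprof list   # list the recorded profile data files
mprof clean  # delete all recorded profile data files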

Keep in mind that memory_profiler adds some overhead to your code, so it’s best used for profiling memory-intensive parts of your application rather than for overall performance monitoring.

Remember to remove the @profile decorator and any profiling code before deploying your application to production because it’s meant for debugging and development purposes only.
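If you prefer not to strip the decorators by hand, one common pattern is to fall back to a no-op profile decorator when memory_profiler isn’t installed; a minimal sketch:

# Use the real decorator when memory_profiler is available, a pass-through otherwise
try:
    from memory_profiler import profile
except ImportError:
    def profile(func):
        return func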

Example of what the output might look like:

The output of the memory_profiler tool will show you a line-by-line breakdown of memory usage as your Python script runs. Here’s an example of what the output might look like when you run a script with memory profiling:

Line #    Mem usage    Increment   Line Contents
================================================
     4   32.547 MiB    0.000 MiB   @profile
     5                             def my_function():
     6   32.547 MiB    0.000 MiB       a = [1] * (10 ** 6)  # Create a large list
     7  162.051 MiB  129.504 MiB       b = [2] * (2 * 10 ** 7)  # Create an even larger list
     8   32.547 MiB -129.504 MiB       del b  # Delete the large list to free memory
     9   32.547 MiB    0.000 MiB       return a

In this example, the output provides information for each line of code in the my_function function:

  • Line #: The line number in the script where the memory usage is being measured.
  • Mem usage: The total memory usage at that line.
  • Increment: The change in memory usage compared to the previous line.
  • Line Contents: The actual code at that line.

Here’s an explanation of the output:

  • Line 6: The a list is created; in this sample run the increment is negligible compared with the later allocation.
  • Line 7: Memory usage increases significantly as the b list is created.
  • Line 8: Memory usage decreases as the b list is deleted.
  • Line 9: Memory usage remains the same as it returns the a list.

The Increment column shows how much memory is allocated or deallocated at each line. In this example, you can see that the largest memory increase happens on Line 7 when the large b list is created.

This information helps you identify which parts of your code are responsible for increased memory usage, making it easier to optimize memory-intensive sections of your Python script.

2. Using Resource Module:

You can use Python’s built-in resource module to profile memory usage. It’s relatively basic compared to specialized memory-profiling tools, but it reports resource usage for the current process, including memory. Here’s a detailed example of how to use it:

import resource
import time

def profile_memory_usage():
    # Start tracking resource usage
    resource_usage_start = resource.getrusage(resource.RUSAGE_SELF)

    # Code you want to profile goes here
    numbers = []
    for i in range(1, 10001):
        numbers.append(i)

    # Stop tracking resource usage
    resource_usage_end = resource.getrusage(resource.RUSAGE_SELF)

    # Calculate memory usage
    memory_usage = resource_usage_end.ru_maxrss - resource_usage_start.ru_maxrss
    # On Linux, ru_maxrss is reported in kilobytes; convert it to megabytes
    memory_usage_mb = memory_usage / 1024

    print(f"Memory Usage: {memory_usage_mb:.2f} MB")

if __name__ == "__main__":
    profile_memory_usage()

In this example:

  1. We import the resource module, which allows us to track resource usage.
  2. The profile_memory_usage function is defined.
  3. We start tracking resource usage using resource.getrusage(resource.RUSAGE_SELF) to get resource usage statistics before running the code you want to profile.
  4. The code section to profile (here, building a list of numbers) is a placeholder; replace it with the part of your application you want to measure.
  5. After the code has executed, we call resource.getrusage(resource.RUSAGE_SELF) again to get the resource usage statistics after running it.
  6. We calculate memory usage by subtracting the maximum resident set size (ru_maxrss) measured before the code from the value measured after it.
  7. We convert the memory usage from kilobytes to megabytes for a more readable output.
  8. Finally, we print the memory usage in megabytes.

When you run this script, it will output the memory usage of the code section you profiled. Keep in mind that the resource module provides basic information about resource usage, and more specialized memory profiling tools may offer more detailed insights into your program’s memory usage.
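One portability note: ru_maxrss is reported in kilobytes on Linux but in bytes on macOS, so a small helper can normalize the value. Here’s a sketch (peak_memory_mb is just an illustrative name, not part of the standard library):

import resource
import sys

def peak_memory_mb():
    """Return this process's peak resident set size in megabytes."""
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":
        return peak / (1024 * 1024)  # macOS reports bytes
    return peak / 1024               # Linux reports kilobytes

print(f"Peak memory: {peak_memory_mb():.2f} MB")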

3. Using Tracemalloc

tracemalloc is a Python module that allows you to trace memory allocations in your code. It’s available in Python 3.4 and later versions. Here’s a detailed example of how to use tracemalloc to profile memory usage:

import tracemalloc

# Function to profile memory usage
def profile_memory_usage():
    tracemalloc.start()  # Start tracing memory allocations

    # Code you want to profile goes here
    numbers = []
    for i in range(1, 10001):
        numbers.append(i)

    # Take a snapshot of memory usage
    snapshot = tracemalloc.take_snapshot()

    print("Top 10 Memory Consuming Lines:")
    top_stats = snapshot.statistics('lineno')

    for stat in top_stats[:10]:
        print(stat)

    # Stop tracing memory allocations
    tracemalloc.stop()

if __name__ == "__main__":
    profile_memory_usage()

In this example:

  1. We import the tracemalloc module.
  2. The profile_memory_usage function is defined, and we start tracing memory allocations using tracemalloc.start().
  3. In the code section you want to profile, we create a list of numbers. This is just an example; you should replace this code with the part of your application that you want to profile.
  4. After executing the code you want to profile, we take a snapshot of memory usage using tracemalloc.take_snapshot().
  5. We then print the top 10 memory-consuming lines using snapshot.statistics('lineno'). This provides information about which lines in your code are consuming the most memory.
  6. Finally, we stop tracing memory allocations using tracemalloc.stop().

When you run this script, it will output information about the top memory-consuming lines in the code section you profiled. This can help you identify areas in your code that may need optimization or where memory is being used inefficiently.
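tracemalloc can also compare two snapshots, which is often more useful than a single snapshot because it isolates the allocations made by the code in between. A minimal sketch:

import tracemalloc

tracemalloc.start()
snapshot_before = tracemalloc.take_snapshot()

# Code you want to profile goes here
numbers = [i for i in range(1, 10001)]

snapshot_after = tracemalloc.take_snapshot()

# Show which lines allocated the most memory between the two snapshots
for stat in snapshot_after.compare_to(snapshot_before, 'lineno')[:10]:
    print(stat)

tracemalloc.stop()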

4. Using Psutil

You can also use the psutil library to profile memory usage in Python. psutil is a cross-platform library for accessing system details and managing processes. Here’s how you can use psutil to profile memory usage:

  1. First, make sure you have psutil installed. You can install it using pip:
pip install psutil
  2. Next, you can use psutil to profile memory usage in your Python script. Here’s an example:
import psutil

def profile_memory_usage():
    process = psutil.Process()
    memory_info = process.memory_info()

    print(f"Memory Usage (RSS): {memory_info.rss / (1024 * 1024)} MB")
    print(f"Memory Usage (Virtual): {memory_info.vms / (1024 * 1024)} MB")

if __name__ == "__main__":
    profile_memory_usage()
  • psutil.Process() is used to get information about the current process.
  • memory_info.rss provides the resident set size (RSS) memory usage in bytes.
  • memory_info.vms provides the virtual memory size (VMS) memory usage in bytes.

This example prints the memory usage in megabytes (MB). You can adjust the output format and customize it according to your needs.

Remember that psutil provides a wide range of system-related information and can be a powerful tool for profiling and monitoring your Python applications.
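For example, you can read memory_info() before and after a block of code to see how much the process’s resident set grew while it ran. A sketch (measure_rss_growth is just an illustrative name):

import psutil

def measure_rss_growth():
    process = psutil.Process()
    rss_before = process.memory_info().rss

    # Code you want to profile goes here
    numbers = [i for i in range(1, 10001)]

    rss_after = process.memory_info().rss
    print(f"RSS grew by {(rss_after - rss_before) / (1024 * 1024):.2f} MB")

if __name__ == "__main__":
    measure_rss_growth()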

5. Using Guppy

You can use the guppy library (specifically, guppy3) to profile memory usage in Python. guppy is a third-party library that provides memory profiling capabilities. It’s particularly useful for analyzing memory usage and finding memory leaks in your Python code.

Here’s an example of how to use guppy3 to profile memory usage:

  1. First, you need to install the guppy3 library using pip:
pip install guppy3
  2. Next, you can use guppy3 in your Python script. Here’s an example:
from guppy import hpy

def profile_memory_usage():
    hp = hpy()

    # Code you want to profile goes here
    numbers = []
    for i in range(1, 10001):
        numbers.append(i)

    # Take a heap snapshot after the code has run
    h = hp.heap()

    # Print a summary of memory usage grouped by object type
    print(h)

    # Print the same snapshot grouped by referrer
    print(h.byrcs)

if __name__ == "__main__":
    profile_memory_usage()

In this example:

  • We import hpy from the guppy module to create a heap profiler.
  • Inside the profile_memory_usage function, we create an instance of the heap profiler using hpy().
  • The code section you want to profile is represented by the creation of a list of numbers. You should replace this code with the part of your application that you want to profile.
  • After executing the code, we take a heap snapshot with hp.heap() and print it. The summary shows the types of objects in memory, the count of instances, and their sizes.
  • Finally, we print h.byrcs, which groups the same snapshot by referrer for a different view of where memory is held.

guppy3 can provide detailed insights into your program’s memory usage, making it useful for debugging memory-related issues and optimizing memory usage in your Python code.
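If you only care about objects allocated after a certain point, guppy’s hp.setrelheap() sets a reference point so that later hp.heap() calls report only allocations made since then. A brief sketch:

from guppy import hpy

hp = hpy()
hp.setrelheap()  # measure only objects allocated after this point

# Code you want to profile goes here
numbers = [i for i in range(1, 10001)]

print(hp.heap())  # shows just the allocations made since setrelheap()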

6. Using objgraph Module

You can use the objgraph module in Python to visualize and analyze the object reference graph, which can help you identify memory usage and potential memory leaks in your Python code. Here’s how you can use the objgraph module:

  1. Install objgraph using pip:
pip install objgraph
  2. Import the objgraph module in your Python script:
import objgraph
  3. Create a function or a section of your code that you want to profile for memory usage:
def profile_memory_usage():
    # Code you want to profile goes here
    numbers = []
    for i in range(1, 10001):
        numbers.append(i)
  4. Within your profiling function, you can use objgraph to visualize object references and identify memory usage:
def profile_memory_usage():
    # Code you want to profile goes here
    numbers = []
    for i in range(1, 10001):
        numbers.append(i)
        
    # Visualize object references
    objgraph.show_most_common_types(limit=10)

The show_most_common_types function will print the most common types of objects in memory, which can help you identify potential memory usage issues.

  5. Call your profiling function to analyze memory usage:
if __name__ == "__main__":
    profile_memory_usage()

When you run this script, it will display information about the most common types of objects in memory. This information can be valuable for identifying objects that are consuming a significant amount of memory and for debugging memory-related issues.

Keep in mind that objgraph is primarily used for visualization and analysis of object references, so it complements other memory profiling tools like tracemalloc or guppy that provide more detailed memory usage statistics.
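objgraph’s show_growth() is also handy for leak hunting: each call prints which object types have increased in count since the previous call. A minimal sketch:

import objgraph

objgraph.show_growth(limit=5)  # establish a baseline of object counts

# Code you suspect of leaking goes here
leaky = [object() for _ in range(1000)]

objgraph.show_growth(limit=5)  # show which types grew since the baseline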

7. Using sys.getsizeof():

This built-in Python function returns the size of an individual object in bytes. It’s useful for checking the memory consumption of specific variables or objects.

import sys

my_variable = [1, 2, 3, 4, 5]
print(sys.getsizeof(my_variable))
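Keep in mind that sys.getsizeof() counts only the object itself, not the objects it references, so a container of large objects can report a deceptively small size. A rough recursive helper gives a fuller picture (a sketch; total_size is an illustrative name and only handles common container types):

import sys

def total_size(obj, seen=None):
    """Roughly estimate an object's size including the objects it references."""
    if seen is None:
        seen = set()
    if id(obj) in seen:  # avoid double-counting shared objects
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen) for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

nested = {"numbers": list(range(1000)), "label": "example"}
print(sys.getsizeof(nested))  # size of the dict object only
print(total_size(nested))     # rough total including referenced objects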

8. Using Heapy

Heapy is the heap-analysis component of the guppy3 library covered above. It provides a way to profile and inspect memory usage and is particularly helpful for diagnosing memory leaks.

Install Heapy:

pip install guppy3

Example usage:

from guppy import hpy

hp = hpy()
heap = hp.heap()
print(heap)

9. External Tools:

You can also use external tools like Valgrind’s Massif for more in-depth, native-level heap analysis (tools such as Pyflame, by contrast, profile CPU time rather than memory). These tools are not Python-specific.

Remember that profiling memory usage can have an impact on the performance of your code, so use these tools judiciously, especially in production environments. Additionally, interpreting memory profiling results requires a good understanding of Python’s memory management and the specifics of your code.
