Captum: Model Interpretability for PyTorch

Type: Open Source Projects
Last Updated: 2025/08/25
Description: Captum is an open-source model interpretability library for PyTorch. It supports various modalities, offers extensibility, and integrates seamlessly with PyTorch models.
model interpretability
attribution methods
PyTorch

Overview of Captum

What is Captum?

Captum is an open-source, extensible library for model interpretability research in PyTorch. It provides tools to understand and attribute the predictions of PyTorch models across various modalities like vision and text.

Key Features

  • Multi-Modal: Supports interpretability of models across modalities including vision, text, and more.
  • Built on PyTorch: Supports most types of PyTorch models and can be used with minimal modification to the original neural network.
  • Extensible: Open source, generic library for interpretability research. New algorithms are easy to implement and benchmark (see the sketch after this list).
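
As a brief illustration of this extensibility, here is a minimal sketch that implements an "input × gradient" attribution by hand and checks it against Captum's built-in InputXGradient. The small demo model, input, and comparison are illustrative assumptions, not part of Captum's documentation:

import torch
import torch.nn as nn
from captum.attr import InputXGradient

# Hypothetical stand-in model, used only to compare the two attributions.
demo_model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 2))
demo_model.eval()

def my_input_x_gradient(model, inputs, target):
    # Custom attribution: gradient of the target logit w.r.t. the input,
    # multiplied elementwise by the input itself.
    inputs = inputs.detach().requires_grad_(True)
    output = model(inputs)[:, target].sum()
    grads = torch.autograd.grad(output, inputs)[0]
    return (inputs * grads).detach()

x = torch.rand(2, 4)
custom_attr = my_input_x_gradient(demo_model, x, target=0)
captum_attr = InputXGradient(demo_model).attribute(x, target=0)
# Expected to agree, since InputXGradient also computes input * gradient.
print(torch.allclose(custom_attr, captum_attr))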

How to Get Started with Captum?

  1. Install Captum:

    • Via conda (recommended):
    conda install captum -c pytorch
    
    • Via pip:
    pip install captum
    
  2. Create and Prepare a Model:

The following example demonstrates how to use Captum with a simple ToyModel:

import numpy as np
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.lin1 = nn.Linear(3, 3)
        self.relu = nn.ReLU()
        self.lin2 = nn.Linear(3, 2)

        # initialize weights and biases
        self.lin1.weight = nn.Parameter(torch.arange(-4.0, 5.0).view(3, 3))
        self.lin1.bias = nn.Parameter(torch.zeros(1,3))
        self.lin2.weight = nn.Parameter(torch.arange(-3.0, 3.0).view(2, 3))
        self.lin2.bias = nn.Parameter(torch.ones(1,2))

    def forward(self, input):
        return self.lin2(self.relu(self.lin1(input)))


model = ToyModel()
model.eval()

To make computations deterministic, let's fix random seeds:

torch.manual_seed(123)
np.random.seed(123)

Define input and baseline tensors:

input = torch.rand(2, 3)
baseline = torch.zeros(2, 3)

  3. Select, Instantiate, and Apply an Attribution Algorithm:

This example uses Integrated Gradients:

ig = IntegratedGradients(model)
attributions, delta = ig.attribute(input, baseline, target=0, return_convergence_delta=True)
print('IG Attributions:', attributions)
print('Convergence Delta:', delta)

Output:

IG Attributions: tensor([[-0.5922, -1.5497, -1.0067],
                         [ 0.0000, -0.2219, -5.1991]])
Convergence Delta: tensor([2.3842e-07, -4.7684e-07])
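
Other attribution algorithms follow the same instantiate-then-attribute pattern. As a minimal sketch, Saliency can be applied to the same model and input defined above (the attribution values will differ from Integrated Gradients):

from captum.attr import Saliency

# Saliency returns the (absolute) gradient of the target output
# with respect to each input feature.
saliency = Saliency(model)
grads = saliency.attribute(input, target=0)
print('Saliency Attributions:', grads)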

Why is Captum important?

Model interpretability is crucial for understanding how AI models arrive at their decisions. Captum helps researchers and practitioners gain insights into model behavior, which is essential for debugging, improving, and building trust in AI systems.

Where can I use Captum?

Captum can be used in various applications, including:

  • Image Classification: Understand which pixels contribute most to a model's prediction (see the sketch after this list).
  • Text Classification: Identify the key words or phrases driving a model's sentiment analysis.
  • Other PyTorch Models: Interpret any PyTorch model with minimal modifications.
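
For instance, here is a minimal sketch of pixel-level attribution for image classification using Captum's Occlusion method. The small CNN and random input below are hypothetical stand-ins for a real model and a preprocessed image batch:

import torch
import torch.nn as nn
from captum.attr import Occlusion

# Hypothetical image classifier with 10 output classes.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
)
model.eval()

image = torch.rand(1, 3, 32, 32)  # stand-in for a normalized input image

occlusion = Occlusion(model)
# Slide a 3x8x8 patch over the image and measure how masking each region
# changes the score of the predicted class.
attributions = occlusion.attribute(
    image,
    target=model(image).argmax(dim=1).item(),
    sliding_window_shapes=(3, 8, 8),
    strides=(3, 4, 4),
)
print(attributions.shape)  # same shape as the input image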

Best Alternative Tools to "Captum"

CPUmade

CPUmade is an AI-powered platform that lets users create custom t-shirt designs through simple text descriptions. Generate unique apparel designs, customize colors, and order directly with global shipping.

custom apparel
AI design

Claude

Anthropic's Claude AI is designed for reliability, interpretability, and steerability. Explore Claude Opus and Sonnet for advanced AI applications, coding, and AI agents.

AI safety
large language model

Defog.ai

Defog.ai provides instant data insights using a fine-tuned LLM for enterprise data analysis. Powered by SQLCoder, it offers accurate text-to-SQL capabilities and integrates with various data sources. Trusted by industry leaders.

AI data analysis
text-to-SQL

Bethge Lab

Bethge Lab at the University of Tübingen focuses on AI research, machine learning, and understanding brain representations using neural networks.

AI research
machine learning

Tags Related to Captum