featurevis

Visualize features that activate neurons via gradient ascent.

Installation

After installing PyTorch, run:

```
pip3 install git+https://github.com/cajal/featurevis.git
```

Usage

feature_vis.gradient_ascent receives a function $f(x)$ to optimize, an initial estimate $x$, and some optimization parameters such as the step size and number of iterations.
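
For example, a minimal call might look like the sketch below. The keyword names step_size and num_iterations and the single return value are illustrative assumptions; check the gradient_ascent docstring for the exact signature.

```python
import torch
from torchvision.models import vgg16

import feature_vis

# Truncated VGG as the model to visualize; in practice, load pretrained weights.
model = vgg16().features[:16].eval()

def f(x):
    # Objective to maximize: mean activation of one channel in an intermediate layer.
    return model(x)[0, 42].mean()

x = torch.randn(1, 3, 224, 224)  # initial estimate
result = feature_vis.gradient_ascent(f, x, step_size=0.1, num_iterations=200)  # kwargs assumed
```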

Optionally, it can receive any of the following: a differentiable transform $t(x)$ to apply to $x$ at each iteration before evaluating $f$; a differentiable regularization $r(x)$ to be minimized, so that the optimization becomes

$$\arg\max_{x} f(t(x)) - r(t(x))\text{ ,}$$

a gradient_f function $g(x)$ to apply to the gradient before the update; and a post_update function $p(x)$ to apply to the updated $x$ after each iteration:

$$ x_{t+1} = p\left(x_t + \alpha\, g\left(\frac{\partial f}{\partial x_t}\right)\right)\text{.} $$
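
In code, one iteration amounts to roughly the following. This is a simplified sketch of the update rule for intuition, not the library's actual implementation.

```python
import torch

def gradient_ascent_sketch(f, x, t, r, g, p, step_size=0.1, num_iterations=100):
    """Simplified sketch of the update rule above; not the library's actual code."""
    x = x.clone().detach().requires_grad_()
    for _ in range(num_iterations):
        tx = t(x)
        objective = f(tx) - r(tx)                  # maximize f(t(x)) - r(t(x))
        grad, = torch.autograd.grad(objective, x)  # df / dx_t
        with torch.no_grad():
            x = p(x + step_size * g(grad))         # x_{t+1} = p(x_t + alpha * g(grad))
        x.requires_grad_()
    return x.detach()
```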

These functions ($t$, $r$, $g$, and $p$) should cover the most common scenarios when creating feature visualizations for neural network models. We provide implementations of many commonly used functions in feature_vis.ops.
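
For illustration, the sketch below passes plain callables in place of the feature_vis.ops implementations; the keyword names transform and regularization, like step_size and num_iterations above, are assumptions rather than the documented signature.

```python
import torch
from torchvision.models import vgg16

import feature_vis

model = vgg16().features[:16].eval()  # in practice, load pretrained weights

def f(x):
    return model(x)[0, 42].mean()     # objective: one channel's mean activation

def jitter(x):                        # t(x): random shift before evaluating f
    dx, dy = torch.randint(-2, 3, (2,)).tolist()
    return torch.roll(x, shifts=(dx, dy), dims=(-2, -1))

def l2_penalty(x):                    # r(x): discourage extreme pixel values
    return 1e-3 * x.pow(2).mean()

def normalize_grad(grad):             # g(grad): rescale the gradient to unit norm
    return grad / (grad.norm() + 1e-8)

def clip_pixels(x):                   # p(x): keep the estimate in a valid range
    return x.clamp(-1, 1)

x = torch.randn(1, 3, 224, 224)
result = feature_vis.gradient_ascent(
    f, x,
    transform=jitter,                 # keyword names are assumptions; check the
    regularization=l2_penalty,        # gradient_ascent signature for the real ones
    gradient_f=normalize_grad,
    post_update=clip_pixels,
    step_size=0.1,
    num_iterations=200,
)
```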

See the Examples.ipynb notebook for how to visualize features from a VGG network or from real neurons[1] under different configurations.

[1]: Models for real neurons come from a private repo, but the examples should still be a useful starting point.
