Accurate Instagram Filters Reconstruction

There are plenty of apps out there (Instagram, for example) that let you apply color filters to images. You might be interested in cloning their behavior, that is, reconstructing those filters. This repository holds tools for reconstructing color filters semi-automatically and with very high accuracy.

The truth is that folks love Instagram filters and keep trying to reproduce them, again and again and again. The problem with most of these attempts is that they rely on manually correcting colors. For me, it was more interesting to find a solution based on a more robust method and some math.

This appears to be the only attempt to provide an accurate color filter reconstruction. For instance, one of the following images was obtained by applying the Clarendon Instagram filter to the source image, while the other was produced by an accurate reconstruction. Try guessing which one was reconstructed.

[images: reconstructed result and genuine Instagram result]

For comparison, here is the result of the same filter from a commercial set of Instagram-like filters.

[image: result of the commercial filter]

How it works

This method is based on three-dimensional lookup tables and their two-dimensional representation: hald images. The core idea is simple: an identity hald image with a uniform color distribution is processed by the target color filter, whose transformation algorithm is unknown. The processed hald image can then be used as a filter that approximates the target color transformation very accurately.
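
The sketch below illustrates that idea outside of any particular tool: build an identity 3D lookup table, run it through an unknown per-color filter (standing in for Instagram), then use the captured table to approximate that filter on any image. This is a conceptual example in numpy, not the repository's code; nearest-neighbour lookup is used for brevity where real tools interpolate.

import numpy as np

SIZE = 25  # a 25x25x25 lookup table, as with the level-5 hald used below

# Identity LUT: entry [b, g, r] simply stores the color (r, g, b).
grid = np.linspace(0.0, 1.0, SIZE)
b, g, r = np.meshgrid(grid, grid, grid, indexing="ij")
identity = np.stack([r, g, b], axis=-1)           # shape (SIZE, SIZE, SIZE, 3)

def unknown_filter(rgb):
    """Stand-in for the target filter with an unknown algorithm."""
    return np.clip(rgb ** np.array([0.9, 1.0, 1.2]), 0.0, 1.0)

# "Processing the hald image" amounts to running the identity table
# through the unknown filter; the result is the captured LUT.
captured = unknown_filter(identity)

def apply_lut(image, lut):
    """Nearest-neighbour lookup into the captured table."""
    idx = np.clip(np.rint(image * (SIZE - 1)), 0, SIZE - 1).astype(int)
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]

image = np.random.rand(4, 4, 3)                   # any RGB image in [0, 1]
filtered = apply_lut(image, captured)             # ≈ unknown_filter(image)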

The resulting hald image can then be used in various software such as GraphicsMagick or Adobe Photoshop. You can use such hald images in your iOS or macOS app with CocoaLUT. Hald images can also be converted to the 3D LUT .cube file format, which is supported by a great number of video editing applications.
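
For instance, writing such a table out as a .cube file takes only a few lines. The sketch below assumes the LUT is a floating-point numpy array of shape (N, N, N, 3) indexed as [b, g, r], like the one captured above, and follows the common .cube conventions (a LUT_3D_SIZE header and one "R G B" line per entry, with the red index changing fastest); it is an illustration, not the converter shipped with this repository.

def write_cube(path, lut, title="reconstructed filter"):
    """Write an (N, N, N, 3) float LUT, indexed [b, g, r], as a .cube file."""
    size = lut.shape[0]
    with open(path, "w") as fp:
        fp.write('TITLE "%s"\n' % title)
        fp.write("LUT_3D_SIZE %d\n" % size)
        # .cube files list entries with the red index changing fastest.
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    fp.write("%.6f %.6f %.6f\n" % tuple(lut[b, g, r]))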

Limitation

This method can only capture color transformations that depend on nothing but the input color. For example, vignetting, scratches, gradients, and watermarks cannot be captured. It may also give wrong results when different filters are applied to different parts of a single image.

Requirements

To generate and convert hald images, you will need git and a pip-enabled Python interpreter.

$ git clone https://github.com/homm/color-filters-reconstruction.git
$ cd color-filters-reconstruction
$ pip install -r requirements.txt 

The resulting hald images can be applied to any image in your application using GraphicsMagick bindings for Python, Ruby, PHP, JavaScript, and other programming languages, or using the CLI. No software from this repository is required for this step.
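
If you just need to batch-apply a hald image from Python, shelling out to the gm command line is enough. The snippet below is a small sketch that wraps the same -hald-clut invocation shown later in this guide; the input and output paths are examples.

import subprocess
from pathlib import Path

HALD = "halds/1.Clarendon.png"            # hald image produced by convert.py

Path("out").mkdir(exist_ok=True)
for src in Path("photos").glob("*.jpg"):  # example input directory
    dst = Path("out") / src.name
    # Same call as: gm convert sample.jpg -hald-clut hald.png out.jpg
    subprocess.run(
        ["gm", "convert", str(src), "-hald-clut", HALD, str(dst)],
        check=True,
    )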

Guide

  1. First, you'll need to create the identity image. Just run:

    $ ./bin/generate.py

    This will create a file named hald.5.png. The number in the filename stands for the square root of the 3D LUT size: for example, 5 means we're dealing with a 25×25×25 lookup table.

    [image: identity image hald.5.png]

    This file doesn't look like other hald images: it is specifically designed to resist distortions which may occur during processing, such as vignetting, scratches, gradients, and JPEG artifacts.

  2. Process the identity image with the target software of your choice. For Instagram, for example, you'd transfer the identity image to your device and post it with one of the filters applied. After that, you'll find the filtered identity image in your camera roll; just transfer it back.

    [image: identity image after the Clarendon filter]

    Before continuing, make sure that the resolution of your filtered identity image exactly matches that of the source one.

  3. Convert the filtered image into a true hald image:

    $ ./bin/convert.py raw/1.Clarendon.jpg halds/

    Where halds/ is your output directory.

    [image: resulting Clarendon hald image]
  4. That's it! You can now apply the resulting hald image to any input (a Python alternative using Pillow is sketched right after this list).

    $ gm convert sample.jpg -hald-clut halds/1.Clarendon.png out.jpeg

    [images: sample image before and after the Clarendon hald]
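
If you prefer to stay in Python, the resulting hald image can also be loaded with Pillow and turned into an ImageFilter.Color3DLUT. This sketch rests on two assumptions: that the hald produced by convert.py follows the usual layout, with the red channel changing fastest in scanline order, and that it encodes a 25×25×25 table (as with a level-5 hald); if the layout differs, the table would have to be reordered accordingly.

from PIL import Image, ImageFilter

def hald_to_lut(path, size=25):
    """Read a standard hald image into a Pillow Color3DLUT (sketch)."""
    hald = Image.open(path).convert("RGB")
    # Scanline order of a standard hald matches Color3DLUT's expected
    # ordering: red changes fastest, then green, then blue.
    table = [value / 255.0 for pixel in hald.getdata() for value in pixel]
    return ImageFilter.Color3DLUT(size, table)

lut = hald_to_lut("halds/1.Clarendon.png")
Image.open("sample.jpg").filter(lut).save("out.jpg")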

Advanced tips

While the default parameters provide you with high-quality hald filters, in some cases this is not enough.

If the target filter introduces heavy local distortions or significant gradients in the center of the image, some undesired effects may occur. The most noticeable one is color banding. Below are the original image and the one processed with a Hudson-like filter, one of the most problematic filters in this respect.

# Create a hald image from the identity image processed by Instagram
$ ./bin/convert.py raw/15.Hudson.jpg halds/
# Apply the hald image to the sample image
$ gm convert girl.jpg -hald-clut halds/15.Hudson.png girl.15.jpg

[images: original image and the Hudson-filtered result]

Notice that in the processed image many objects look flat and posterized: the face, the hair, the chairs in the background. While posterization is a common image effect in its own right, it is not part of the Hudson filter.

If you look closely at the image with the Hudson-like filter applied, you'll see that it is noisy, and that's where the problem comes from.

[image: noise in the Hudson-filtered image]

Fortunately, you can ask convert.py to apply a Gaussian blur to the three-dimensional lookup table to reduce that noise. You'll need to install SciPy to continue.

# The next line is only needed once
$ pip install scipy
$ ./bin/convert.py raw/15.Hudson.jpg halds/ --smooth 1.5
$ gm convert girl.jpg -hald-clut halds/15.Hudson.png girl.15.fixed.jpg

[images: smoothed result and the previous, posterized result]
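
For reference, the smoothing itself boils down to blurring the lookup table. The sketch below shows the general idea with scipy.ndimage: a per-channel Gaussian filter over the three spatial axes of the LUT. The exact parameters and details inside convert.py may differ.

import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_lut(lut, sigma=1.5):
    """Blur an (N, N, N, 3) LUT along its three color axes only."""
    smoothed = np.empty_like(lut)
    for channel in range(lut.shape[-1]):
        # sigma is measured in LUT cells, not in pixels of the hald image.
        smoothed[..., channel] = gaussian_filter(lut[..., channel], sigma)
    return smoothed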

You can discover additional convert.py options by running ./bin/convert.py --help.

Have fun with reverse engineering!
