Cooperative Communication Networks

WIP code demonstrating Cooperative Communication Networks: generative systems that learn representations by transmitting and reconstructing small messages. This project attempts to encode trees and graphs as images, in order to explore generative notation systems.

Usage

You can run this locally by cloning the repo and installing the project as a Python package with `pip install -e .` in the project directory.

Running `python ccn/main.py` starts training based on the config set in `config.json`. By default, a snapshot of 4 random samples is saved to the `gallery` folder at the end of every epoch.
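
Putting those steps together:

```bash
git clone https://github.com/noahtren/Cooperative-Communication-Networks.git
cd Cooperative-Communication-Networks
pip install -e .
python ccn/main.py
```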

An example snapshot contains two rows of images: the top row shows the generated samples, and the bottom row shows the same samples after being passed through a noisy channel.

Config

The configuration can be modified to run experiments; three different experiment types are available.

`JUST_VISION` trains a CNN autoencoder to produce unique symbols, with a vocab size equal to `NUM_SYMBOLS`.

`JUST_GRAPH` trains a graph autoencoder on its own; this is used to pretrain the graph autoencoder portion of the full pipeline. It encodes arithmetic syntax trees as defined in `graph_data.py`.
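
For intuition, an expression such as (2 + 3) * 4 becomes a small tree of operator and value nodes. A minimal sketch of that kind of structure in Python (purely illustrative; the actual encoding is defined in `graph_data.py`):

```python
# Purely illustrative: the syntax tree of (2 + 3) * 4 as node labels
# plus directed parent -> child edges (indices into `nodes`). The actual
# encoding used for training lives in graph_data.py and may differ.
nodes = ["*", "+", "2", "3", "4"]
edges = [(0, 1), (0, 4), (1, 2), (1, 3)]  # "*"->"+", "*"->"4", "+"->"2", "+"->"3"
```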

`FULL` trains a CNN autoencoder nested inside a graph autoencoder, which is the main demonstration of this project. Syntax trees can be expressed as graphs and passed through the autoencoder, and the latent space is visual.
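
A `config.json` for one of these experiments might look roughly like the sketch below. Only the three experiment-type values and `NUM_SYMBOLS` come from this README; every other key (including the `EXPERIMENT_TYPE` key name itself) is a hypothetical placeholder.

```json
{
  "EXPERIMENT_TYPE": "JUST_VISION",
  "NUM_SYMBOLS": 16,
  "EPOCHS": 100,
  "BATCH_SIZE": 32
}
```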

Samples

JUST_VISION experiments

FULL experiments

Contributing

If you have any questions about the code or notice any problems, please open an issue.

About This Project


Rather than learning from high-dimensional data such as images, CCNs learn to communicate messages more or less from scratch. They are guided by a set of constraints that modulate or hinder communication, including (but not limited to) the communication medium and the channel. Adversarial agents can also be part of the system. Essentially, CCNs are autoencoders whose latent spaces can be shaped according to arbitrary constraints.
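
As a concrete sketch of this setup, the block below wires a generator, a noisy channel, and a decoder together in TensorFlow/Keras: each symbol ID from a fixed vocabulary is rendered as an image, the channel corrupts the image, and the decoder is trained jointly with the generator to recover the ID. The architecture, sizes, and noise level are illustrative, not the repo's actual model.

```python
import tensorflow as tf

NUM_SYMBOLS = 16  # vocab size, as in config.json

# Illustrative generator: symbol ID -> 32x32 grayscale "symbol".
generator = tf.keras.Sequential([
    tf.keras.layers.Embedding(NUM_SYMBOLS, 64),
    tf.keras.layers.Dense(8 * 8 * 8, activation="relu"),
    tf.keras.layers.Reshape((8, 8, 8)),
    tf.keras.layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2DTranspose(1, 3, strides=2, padding="same", activation="sigmoid"),
])

# Illustrative decoder: image -> logits over the symbol vocabulary.
decoder = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(32, 3, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(NUM_SYMBOLS),
])

symbol_ids = tf.keras.Input(shape=(), dtype=tf.int32)
images = generator(symbol_ids)
noisy = tf.keras.layers.GaussianNoise(0.1)(images)  # the "channel" constraint
logits = decoder(noisy)

ccn = tf.keras.Model(symbol_ids, logits)
ccn.compile(optimizer="adam",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))

# Both ends are trained jointly to communicate through the channel.
ids = tf.random.uniform((256,), maxval=NUM_SYMBOLS, dtype=tf.int32)
ccn.fit(ids, ids, epochs=1, batch_size=32)
```

Because both ends share a single reconstruction loss, they cooperate to find images that survive the channel; swapping the GaussianNoise layer for other constraints is what shapes the resulting visual language.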

Extensions

  • Adversarial games. One could add a "Spy" generator that attempts to learn the same representations as the original generator and is trained to trick the decoder. This would be similar to a GAN, but with the dataset replaced by a cooperative generator.

  • Perceptual loss. A perceptual loss could encourage symbols to be perceptually distinct from one another according to some external metric, such as features from a pre-trained CNN; see the sketch after this list.
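
A minimal sketch of the perceptual-loss idea, assuming TensorFlow and an ImageNet-pretrained VGG16 (the layer choice, input size, and the function itself are hypothetical, not part of this repo):

```python
import tensorflow as tf

# Frozen feature extractor from a pre-trained CNN (here: VGG16).
vgg = tf.keras.applications.VGG16(include_top=False, weights="imagenet")
vgg.trainable = False
feature_extractor = tf.keras.Model(vgg.input, vgg.get_layer("block3_conv3").output)

def perceptual_separation_loss(images):
    """Penalize perceptual similarity between a batch of distinct symbols.

    `images`: shape (batch, h, w, 1), values in [0, 1], one symbol per row.
    """
    rgb = tf.image.grayscale_to_rgb(tf.image.resize(images, (64, 64)))
    feats = feature_extractor(tf.keras.applications.vgg16.preprocess_input(rgb * 255.0))
    flat = tf.math.l2_normalize(tf.reshape(feats, (tf.shape(feats)[0], -1)), axis=-1)
    sims = tf.matmul(flat, flat, transpose_b=True)      # pairwise cosine similarity
    sims -= tf.linalg.diag(tf.linalg.diag_part(sims))   # ignore self-similarity
    return tf.reduce_mean(sims)  # lower = more perceptually distinct symbols
```

Added to the reconstruction loss, a term like this rewards symbols that are far apart in an external observer's feature space, not merely decodable.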

Credit

Joel Simon directly inspired my interest in this with Dimensions of Dialogue. Joel also first had the idea of adversarial games. Ryan Murdock originally suggested calling these Cooperative Communication Networks, and also had the idea of perceptual loss.

The new development here is encoding structured data (trees and graphs) and transforming it into visual representations.

License

The code is MIT-licensed.
