SAM-E: Leveraging Visual Foundation Model with Sequence Imitation for Embodied Manipulation
ICML 2024

@inproceedings{2024sam,
         author    = {Junjie Zhang and Chenjia Bai and Haoran He and Zhigang Wang and Bin Zhao and Xiu Li and Xuelong Li},
         title     = {SAM-E: Leveraging Visual Foundation Model with Sequence Imitation for Embodied Manipulation},
         booktitle = {International Conference on Machine Learning},
         year      = {2024},
}

This is the official repository for SAM-E. Currently, we provide the code for training; more updates are coming soon.

Get Started

Install

  • Step 1: Create a conda environment
conda create --name samE python=3.8
conda activate samE
  • Step 2: Install PyTorch compatible with your CUDA version.
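For example, with CUDA 11.8, a compatible build can be installed with pip (an illustrative command; check the official PyTorch installation selector for the exact command matching your setup):
# adjust the cu118 tag to match your CUDA version
pip install torch torchvision --index-url https://download.pytorch.org/whl/cu118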

  • Step 3: Install PyTorch3D.

PyTorch3D is essential for SAM-E.

One recommended version that is compatible with the rest of the library can be installed as follows; note that this might take some time. For more instructions, visit here.

curl -LO https://github.com/NVIDIA/cub/archive/1.10.0.tar.gz
tar xzf 1.10.0.tar.gz
export CUB_HOME=$(pwd)/cub-1.10.0
pip install 'git+https://github.com/facebookresearch/pytorch3d.git@stable'
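A quick import check to confirm the PyTorch3D installation succeeded:
python -c "import pytorch3d; print(pytorch3d.__version__)"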

  • Step 4: Install CoppeliaSim (see the PyRep README for the required version and download links).

Once you have downloaded CoppeliaSim, add the following to your ~/.bashrc file. (NOTE: edit the 'EDIT ME' placeholder in the first line.)

export COPPELIASIM_ROOT=<EDIT ME>/PATH/TO/COPPELIASIM/INSTALL/DIR
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$COPPELIASIM_ROOT
export QT_QPA_PLATFORM_PLUGIN_PATH=$COPPELIASIM_ROOT
export DISPLAY=:1.0

Remember to source your .bashrc (source ~/.bashrc) or .zshrc (source ~/.zshrc) after this.

  • Step 5: Install samE and other submodules using the following commands.
cd SAM-E/
pip install -e . 
pip install -e samE/libs/PyRep 
pip install -e samE/libs/RLBench 
pip install -e samE/libs/YARR 
pip install -e samE/libs/peract_colab
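As a sanity check, verify that the submodules are importable (assuming the standard import names of PyRep, RLBench, and YARR):
python -c "import pyrep, rlbench, yarr; print('submodules OK')"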

Training

  • Step 1: Download dataset

    • We use the same dataset as RVT for the RLBench experiments. Download the dataset provided by PerAct to SAM-E/samE/data/. For training, you only need to download the /train/ split and place it at SAM-E/samE/data/train/, as sketched below.
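The commands below sketch the expected layout; the /path/to/... placeholder is wherever you downloaded the PerAct data, and the task-folder structure shown in the comment is an assumption based on the PerAct data format:
cd SAM-E/samE/
mkdir -p data
mv /path/to/peract_dataset/train data/train
# expected result (sketch): data/train/<task>/all_variations/episodes/...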
  • Step 2: Start training

python train.py --exp_cfg_path configs/samE.yaml --mvt_cfg_path configs/mvt_samE.yaml --device 0,1,2,3

Change the --device flag depending on the available GPUs.
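For example, to train on a single GPU:
python train.py --exp_cfg_path configs/samE.yaml --mvt_cfg_path configs/mvt_samE.yaml --device 0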

Evaluation

  • We evaluate the model on RLBench. First, download the eval dataset and move the /eval/ split to SAM-E/samE/data/eval/.

  • After training, the checkpoint and configs will be saved at SAM-E/samE/runs/. Run the evaluation with:

    python eval.py  --model-folder runs/sam_e --eval-datafolder ./data/test --tasks all --eval-episodes 25 --log-name test/final --device 0 --model-name model_14.pth
    
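To evaluate a subset of tasks, the --tasks flag can also be given task names instead of all (an assumption based on the flag shown above; check the argument parsing in eval.py for the accepted values). For example, with the RLBench task close_jar:

    python eval.py --model-folder runs/sam_e --eval-datafolder ./data/test --tasks close_jar --eval-episodes 25 --log-name test/final --device 0 --model-name model_14.pth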

Checkpoint

  • We provide a checkpoint from the multi-task training, which is available here.

Acknowledgement

We sincerely thank the authors of the following repositories for sharing their code.
