Dense Events Grounding in Video (AAAI 2021 oral)

Introduction

This is a PyTorch implementation of the Dense Events Propagation Network (DepNet) on ActivityNet Captions for the AAAI 2021 oral paper "Dense Events Grounding in Video".

Dataset

Please download the visual features from the official ActivityNet website: Official C3D Feature. The preprocessed annotation files can be downloaded here.
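
As a quick sanity check of the downloaded features, the sketch below opens the HDF5 file with h5py. The file name sub_activitynet_v1-3.c3d.hdf5 and the c3d_features key follow the official ActivityNet C3D release and are assumptions here, not something taken from this repository's code.

import h5py

# Assumed file name from the official ActivityNet C3D release;
# adjust the path to wherever you store the downloaded features.
feature_file = 'sub_activitynet_v1-3.c3d.hdf5'

with h5py.File(feature_file, 'r') as f:
    video_id = next(iter(f.keys()))          # e.g. 'v_QOlSCBRmfWY'
    feats = f[video_id]['c3d_features'][:]   # assumed key; roughly (num_clips, 500)
    print(video_id, feats.shape)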

Prerequisites

  • python 3.5
  • pytorch 1.4.0
  • torchtext
  • easydict
  • terminaltables
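
These dependencies can typically be installed with pip; a minimal sketch, assuming an environment where torch 1.4.0 wheels are available (the unpinned versions of the other packages are an assumption, not something this repository specifies):

pip install torch==1.4.0 torchtext easydict terminaltables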

Training

Use the following commands for training:

cd moment_localization && export CUDA_VISIBLE_DEVICES=0
python dense_train.py --verbose --cfg ../experiments/dense_activitynet/acnet.yaml

You may get better results than those reported in our paper thanks to code updates.
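
The --cfg argument points to the YAML experiment configuration. Below is a minimal sketch of how such a config can be loaded into an attribute-style namespace with easydict; this illustrates the general pattern only and is not a description of this repository's actual loading code.

import yaml
from easydict import EasyDict

# Load the experiment configuration into an attribute-accessible dict.
with open('../experiments/dense_activitynet/acnet.yaml') as f:
    cfg = EasyDict(yaml.safe_load(f))

print(list(cfg.keys()))  # top-level sections of the experiment config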

Citation

If you use our code or models in your research, please cite:

@inproceedings{bao2021dense,
  title     = {Dense Events Grounding in Video},
  author    = {Bao, Peijun and Zheng, Qian and Mu, Yadong},
  booktitle = {AAAI},
  year      = {2021}
}
