Neural Architecture Search as Multiobjective Optimization Benchmarks: Problem Formulation and Performance Assessment
EvoXBench is a platform offering instant benchmarking of evolutionary multi-objective optimization (EMO) algorithms in neural architecture search (NAS), with ready-to-use test suites. It enables efficient performance assessment with NO requirement for GPUs or PyTorch/TensorFlow, enhancing accessibility for a broader range of research applications. It encompasses extensive test suites covering a variety of datasets (CIFAR10, ImageNet, Cityscapes, etc.), search spaces (NASBench101, NASBench201, NATS, DARTS, ResNet50, Transformer, MNV3, MoSegNAS, etc.), and hardware devices (Eyeriss, GPUs, Samsung Note10, etc.). It also provides a versatile interface compatible with multiple programming languages (Java, Matlab, Python, etc.).
EvoXBench has been updated to version 1.0.5! This release fixes bugs in CitySeg/MOP10 and in the HV calculation for the CitySeg/MOP test suite.
If you're already on board with EvoXBench, give this command a spin:
pip install evoxbench==1.0.5
- Formulating NAS tasks as general multi-objective optimization problems.
- Exploring NAS's nuanced traits through the prism of evolutionary optimization.
- Presenting an end-to-end workflow for instant benchmark assessments of EMO algorithms.
- Providing instant fitness evaluations, treating NAS as numerical optimization (see the sketch after this list).
- Encompassing a wide spectrum of datasets, search spaces, and hardware devices.
- Ready-to-use multi-objective optimization test suites with up to eight objectives.
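As a quick illustration of instant fitness evaluation, the following minimal sketch queries one test-suite problem as a plain numerical multi-objective problem. The c10mop entry point and the search_space.sample / evaluate calls are assumptions inferred from the test suites listed above, so check the EvoXBench API documentation for the exact names:
import numpy as np
from evoxbench.test_suites import c10mop  # assumed CIFAR-10 test-suite entry point

problem = c10mop(1)                       # assumed: pick the first C-10/MOP problem
X = problem.search_space.sample(10)       # assumed: sample 10 candidate architectures
F = problem.evaluate(X)                   # objective values, no GPU or DL framework needed
print(np.asarray(F).shape)                # e.g., (10, number_of_objectives)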
Click the image to watch the introductory video.
- Download the requisite files:
  - database.zip via Google Drive or Baidu Cloud Drive (access code: mhgs)
  - data.zip via Google Drive or Baidu Cloud Drive (access code: 9z45)
- Run pip install evoxbench to get the benchmark.
- Configure the benchmark:
from evoxbench.database.init import config
config("Path to database", "Path to data")
# For example, with this directory structure:
# /home/Downloads/
# ├─ database/
# │  ├─ __init__.py
# │  ├─ db.sqlite3
# │  └─ ...
# └─ data/
#    ├─ darts/
#    ├─ mnv3/
#    └─ ...
# then execute:
# config("/home/Downloads/database", "/home/Downloads/data")
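Once the paths are configured, an EMO algorithm can be benchmarked end to end. The sketch below wraps an EvoXBench problem for NSGA-II from pymoo; the pymoo side is standard usage, while the EvoXBench-specific names (c10mop, search_space.n_var, search_space.lb/ub, sample, evaluate) are assumptions to be verified against the API documentation:
import numpy as np
from pymoo.core.problem import Problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from evoxbench.test_suites import c10mop  # assumed entry point, as in the sketch above

class EvoXBenchProblem(Problem):
    # Adapter exposing an EvoXBench benchmark to pymoo (attribute names are assumptions).
    def __init__(self, benchmark):
        self.benchmark = benchmark
        # Infer the number of objectives by evaluating one sampled architecture.
        n_obj = np.atleast_2d(benchmark.evaluate(benchmark.search_space.sample(1))).shape[1]
        super().__init__(n_var=benchmark.search_space.n_var,  # assumed attribute
                         n_obj=n_obj,
                         xl=benchmark.search_space.lb,        # assumed attribute
                         xu=benchmark.search_space.ub,        # assumed attribute
                         vtype=int)

    def _evaluate(self, x, out, *args, **kwargs):
        out["F"] = np.asarray(self.benchmark.evaluate(x))     # instant, CPU-only evaluation

problem = EvoXBenchProblem(c10mop(1))
result = minimize(problem, NSGA2(pop_size=50), ("n_gen", 20), seed=1, verbose=True)
print(result.F)                                               # non-dominated objective vectors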
Explore our comprehensive database and understand its structure and content. Check it out here.
- Use the issue tracker for bugs or questions.
- Submit your enhancements through a pull request (PR).
- We have an active QQ group (ID: 297969717).
- Official Website: https://evox.group/
- EvoX: A computing framework for distributed GPU-acceleration of evolutionary computation, supporting a wide spectrum of evolutionary algorithms and test problems. Check it out here.
If you use EvoXBench in your research and want to cite it in your work, please use:
@article{EvoXBench,
title={Neural Architecture Search as Multiobjective Optimization Benchmarks: Problem Formulation and Performance Assessment},
author={Lu, Zhichao and Cheng, Ran and Jin, Yaochu and Tan, Kay Chen and Deb, Kalyanmoy},
journal={IEEE Transactions on Evolutionary Computation},
year={2023},
publisher={IEEE}
}
A big shoutout to the following projects that have made EvoXBench possible:
NAS-Bench-101, NAS-Bench-201, NAS-Bench-301, NATS-Bench, Once for All, AutoFormer, Django, pymoo.