
0.18.4

Released by @peterschmidt85 on 27 Jun, 12:14

Google Cloud TPU

This update introduces initial support for Google Cloud TPU.

To request a TPU, specify the TPU architecture prefixed with tpu- in gpu under resources:

type: task

python: "3.11"

commands:
  - pip install torch~=2.3.0 torch_xla[tpu]~=2.3.0 torchvision -f https://storage.googleapis.com/libtpu-releases/index.html
  - git clone --recursive https://github.com/pytorch/xla.git
  - python3 xla/test/test_train_mp_imagenet.py --fake_data --model=resnet50 --num_epochs=1

resources:
  gpu: tpu-v2-8

Important

Currently, only 8 TPU cores can be requested per run, which means only single TPU device workloads are supported. Support for multiple TPU devices is coming soon.
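
Once defined, such a configuration can be submitted via the dstack CLI. Below is a minimal sketch; the file name train-tpu.dstack.yml is hypothetical, so adjust it to your own configuration file:

# Submit the TPU task defined above (the file name is hypothetical)
dstack run . -f train-tpu.dstack.yml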

Private subnets with GCP

Additionally, the update allows configuring the gcp backend to use only private subnets. To achieve this, set public_ips to false.

projects:
  - name: main
    backends:
      - type: gcp
        creds:
          type: default
        public_ips: false
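
This goes into the dstack server configuration; a common location is ~/.dstack/server/config.yml (the path is an assumption here, adjust it to your setup). A minimal sketch of applying the change:

# Assumed config location; adjust if your server config lives elsewhere
mkdir -p ~/.dstack/server
cp config.yml ~/.dstack/server/config.yml
# Start (or restart) the server so the gcp backend picks up public_ips: false
dstack server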

Major bug fixes

Besides TPU support, this update also fixes several important bugs.

Other

New contributors

Full changelog: 0.18.3...0.18.4