
Fix inconsistent devices for tensors #417

Merged
merged 8 commits into from
Jan 23, 2021

Conversation

hangtingchen
Contributor

Fix inconsistent devices for tensors

@jonashaag
Collaborator

Looks good, can you add a test?

@mpariente
Collaborator

IMO, we can merge without a test.

@jonashaag
Collaborator

Ah, since we only have a single PyTorch device in CI, it's kind of pointless anyway. I was wondering why this isn't tested already, but we only ever test with the default device in the first place. At some point it would be great to add a second device to CI and run all of the tests on both CPU and CUDA.

@hangtingchen
Contributor Author

I found the bug when training the network on a GPU: cnt was placed on the CPU by default. It is a good idea to add a second device to CI. Maybe this can be done in another PR.
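The pattern described above is a common PyTorch pitfall: a tensor created inside a module without an explicit `device` argument lands on the CPU, which raises a device-mismatch error once the inputs live on a GPU. A minimal sketch of the bug class and its fix (the function and its body are hypothetical illustrations, not the actual patched code from this PR; only the name `cnt` comes from the comment above):

```python
import torch

def count_active_frames(x: torch.Tensor, threshold: float = 0.0) -> torch.Tensor:
    """Count entries above `threshold` per example (hypothetical helper)."""
    # Buggy pattern: torch.zeros(x.shape[0]) allocates on the CPU by
    # default, so `cnt += ...` fails when x is a CUDA tensor.
    # Fix: allocate the accumulator on the same device as the input.
    cnt = torch.zeros(x.shape[0], device=x.device)
    cnt += (x > threshold).sum(dim=-1)
    return cnt
```

With the `device=x.device` argument the helper works unchanged on CPU and GPU, which is exactly the kind of inconsistency the PR title refers to.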

@jonashaag
Collaborator

> It is a good idea to add a second device to CI. Maybe this can be done in another PR.

Unfortunately we don't have GPU support in CI :(

Will merge this fix, thanks for the patch.

@jonashaag jonashaag merged commit 2e1eeac into asteroid-team:master Jan 23, 2021
@mpariente
Collaborator

Yes, no GPU on CI. Maybe we could emulate another device, but it might be too much work.
I don't know whether CircleCI or Drone provides free GPU CI; we might want to have a look.
