
Bump iree to 20231130.724 #212

Merged: 1 commit merged into nod-ai:main from bump-iree on Dec 6, 2023
Conversation

@AmosLewis (Contributor)

To solve issue #110.

@AmosLewis AmosLewis self-assigned this Nov 30, 2023
@stellaraccident stellaraccident changed the title Bump iree Bump iree to 20231130.724 Nov 30, 2023
@stellaraccident (Contributor)

(in the future, more descriptive PR title/description please)

@stellaraccident (Contributor)

Also, looks like some test failures with the bump.

@AmosLewis (Contributor, Author) commented Dec 5, 2023

Bumping torch-mlir will break the llama test; a few more ops need to be supported in the torch-to-linalg lowering (see the reproduction sketch after the list):
llama_test.mlir
llama_torch2linalg.mlir

  • torch.aten.empty_strided
  • torch.aten.mean.dim
  • torch.aten.expand
  • torch.aten._softmax
  • torch.aten.silu
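
For reference, here is a minimal way to poke at one of these lowerings in isolation. This is only a sketch, assuming torch-mlir's `torch_mlir.compile` Python API from around that time; the toy module and shapes are made up:

```python
# Sketch only: try to lower one of the missing ops (aten.silu) through
# torch-mlir's linalg-on-tensors path. If the lowering is missing, this
# raises a compilation error instead of printing a linalg module.
import torch
import torch_mlir


class SiluOnly(torch.nn.Module):
    def forward(self, x):
        # torch.nn.functional.silu maps to torch.aten.silu in torch-mlir
        return torch.nn.functional.silu(x)


module = torch_mlir.compile(
    SiluOnly(), torch.randn(4, 8), output_type="linalg-on-tensors"
)
print(module)
```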

@AmosLewis AmosLewis mentioned this pull request Dec 5, 2023
@stellaraccident (Contributor)

I don't really know where we got this particular test from, and it is not exercising our production path. If we must, we can xfail it and fix forward.

@AmosLewis (Contributor, Author)

I don't really know where we got this particular test from, and it is not exercising our production path. If we must, we can xfail it and fix forward.

It's from here. I think we should xfail it #36
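
If we go that route, here is a sketch of what the xfail could look like, assuming the llama test runs under pytest (the test name below is hypothetical, not the actual one in the repo):

```python
# Sketch only: mark the llama test as an expected failure while the missing
# torch-to-linalg lowerings are worked on in the follow-up issue.
import pytest


@pytest.mark.xfail(
    reason="missing torch-to-linalg lowerings after the iree/torch-mlir bump",
    strict=False,  # keep running it so we notice when it starts passing again
)
def test_llama():
    ...
```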

@stellaraccident (Contributor)

Can we add an issue to fix it when we do that, and make sure to work it? It is important to have some generality in coverage, but this stuff is new enough that I don't think we need to stop the train for it.

@AmosLewis (Contributor, Author)

Can we add an issue to fix it when we do that, and make sure to work it? It is important to have some generality in coverage, but this stuff is new enough that I don't think we need to stop the train for it.

Here we go #221

@AmosLewis (Contributor, Author)

One more test error with uninitialized=True.

def testUninitializedParameters(self):

This was added in #201 and worked with 20231121.715, but with 20231130.724 there is a regression in iree. Could the uninitialized=True support patch have been reverted in IREE?

iree.compiler._mlir_libs._site_initialize.<locals>.MLIRError: Unable to parse attribute:
error: "#util.uninitialized : tensor<30x20xf32>":1:7: unknown attribute `uninitialized` in dialect `util`

@stellaraccident (Contributor)

Something is wrong -- this could not be happening with any build from the last two weeks. We need to look at iree and see why it seems to be going back in time.

@stellaraccident (Contributor)

One more test error with uninitialized=True.

def testUninitializedParameters(self):

This was added in #201 and worked with 20231121.715, but with 20231130.724 there is a regression in iree. Could the uninitialized=True support patch have been reverted in IREE?

iree.compiler._mlir_libs._site_initialize.<locals>.MLIRError: Unable to parse attribute:
error: "#util.uninitialized : tensor<30x20xf32>":1:7: unknown attribute `uninitialized` in dialect `util`

Where are you seeing this? I think it must be testing an old wheel somehow.

@AmosLewis AmosLewis force-pushed the bump-iree branch 2 times, most recently from b546b4a to fb21611 on December 5, 2023 07:06
@AmosLewis AmosLewis marked this pull request as ready for review December 5, 2023 07:19
To solve the batchnorm2d issue nod-ai#110
Xfail llama_test because of missing ops from torch to linalg
@AmosLewis (Contributor, Author)

Where are you seeing this? I think it must be testing an old wheel somehow.

It's on my local machine. You are right. I just double-checked it and it tried to use my old iree_build.
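
For anyone hitting the same thing, a quick sanity check to confirm which iree-compiler is actually being picked up (a sketch only; it assumes the nightly wheel is installed under the package name iree-compiler):

```python
# Sketch only: print where iree.compiler is imported from and which wheel
# version is installed, to rule out a stale local build shadowing the wheel.
from importlib.metadata import PackageNotFoundError, version

import iree.compiler

print("imported from:", iree.compiler.__file__)
try:
    print("wheel version:", version("iree-compiler"))
except PackageNotFoundError:
    print("iree-compiler wheel not installed; likely using a local build")
```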

@stellaraccident stellaraccident merged commit 35e2238 into nod-ai:main Dec 6, 2023
3 checks passed
@AmosLewis AmosLewis deleted the bump-iree branch December 6, 2023 17:51