Move op methods into dedicated objects and export them to torch namespace #29

Closed
sbrunk opened this issue Jun 17, 2023 · 5 comments · Fixed by #30
@sbrunk
Owner

sbrunk commented Jun 17, 2023

torch.scala is getting a bit large, and we still need to implement a bunch of ops. So I think it makes sense to move the op methods into smaller units and then define aliases via export clauses in the torch namespace/package like we already do with torch.special.

The difference is that we should also do this for methods that are not aliased explicitly in PyTorch (methods defined directly in the torch namespace there). A natural way to split them is to follow the ops structure in the PyTorch docs. We already group the methods into these op groups in torch.scala, so it shouldn't be hard to move things out.

The advantages should be more clarity, better maintainability, and smaller compilation units.

Here's an idea for a structure:

torch/ops/PointwiseOps.scala

```scala
package torch.ops

object PointwiseOps:
  def abs[D <: NumericNN](input: Tensor[D]): Tensor[D] = ...
```

torch/ops/ReductionOps.scala

```scala
package torch.ops

object ReductionOps:
  def argmax[D <: DType]( ...
```

torch/torch.scala

```scala
export ops.PointwiseOps.*
export ops.ReductionOps.*
```

I just did a quick feasibility test, and there seems to be an issue with methods with default parameters: the defaults are lost when an export is used from a different compilation unit. See scala/scala3#17930. I could not find a way around it. Since we're using default parameters extensively, this is a blocker, and I guess we'll have to wait until it is fixed.
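A minimal sketch of the problem (hypothetical names; the actual report is in scala/scala3#17930):

```scala
// File A, compilation unit 1
package mylib.ops

object PointwiseOps:
  // `rounded` has a default argument here...
  def abs(x: Double, rounded: Boolean = false): Double =
    if rounded then math.abs(x).round.toDouble else math.abs(x)

// File B, same compilation unit: re-export into the top-level package
package mylib

export ops.PointwiseOps.*

// File C, a *different* compilation unit depending on mylib:
// mylib.abs(-1.5) // error: the exported forwarder lost the default for `rounded`
```

Calls that spell out every argument still compile; only calls relying on the default break, which is why this bites a library that uses default parameters extensively.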

@davoclavo
Contributor

Totally agree! That file is indeed getting pretty large.

Until that bug is fixed, I am thinking we could try these workarounds (unsure if there are any negative implications of either):

  1. Split them into separate files but still define them as part of the torch package, so no export is needed

  2. Implement the ops in traits and have a package object that extends all those traits (I've never done this myself but have seen something like this in other projects before)
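The two workarounds could be sketched roughly like this (hypothetical signatures, not the actual storch code):

```scala
// Workaround 1: one file per op group, but everything stays in
// `package torch`, so the methods land in the right namespace
// directly and no export is needed.
// torch/ops/pointwise.scala
package torch
def abs[D <: DType](input: Tensor[D]): Tensor[D] = ...

// Workaround 2: define the ops in traits and mix them all into a
// single package object:
trait PointwiseOps:
  def abs[D <: DType](input: Tensor[D]): Tensor[D] = ...

trait ReductionOps:
  def argmax[D <: DType](input: Tensor[D]): Tensor[Int64] = ...

package object torch extends PointwiseOps with ReductionOps
```

Workaround 1 leans on Scala 3 top-level definitions; workaround 2 uses the Scala 2-style package-object inheritance pattern, which still compiles in Scala 3.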

Yesterday I spent a bit of time splitting them into files using option 1, as well as the test suite. Everything seems to be working fine. I can upload a draft of my changes later today when I get home.

Btw, thanks a lot for openly sharing your thoughts and progress with the Reduction Ops in order to avoid any duplicate work. Will try to do the same.

@sbrunk
Owner Author

sbrunk commented Jun 17, 2023

Ah great, I didn't think of 1. as a workaround. If you've already tried that successfully, let's go for it!

I tried 2. but ran into some weird circular-dependency issues when compiling the package object, because the traits were also defined somewhere under torch. Also, package objects, especially with inheritance, are (kind of) deprecated in Scala 3 as far as I know.

> Btw, thanks a lot for openly sharing your thoughts and progress with the Reduction Ops in order to avoid any duplicate work. Will try to do the same.

Yeah I'm still trying to get into the habit of writing these things down more often. I think it's good exercise for me 😄

davoclavo added a commit to davoclavo/storch that referenced this issue Jun 18, 2023
Fixes sbrunk#29

Bonus additions:

- Add `torch.multinomial` [RandomSamplingOps]
- Add `torch.randint` [RandomSamplingOps]
- Add `torch.gradient` [PointwiseOps]
- Add `torch.nn.functional.oneHot`
@sbrunk
Owner Author

sbrunk commented Jun 18, 2023

Let's keep this open even though, in #30, we've split the ops into different files while still defining them in the torch package.

The export variant has a few advantages, like being able to put ops into actual objects, which in turn enables Scaladoc macros to work, so I'd still like to try this option when it becomes possible.

@sbrunk sbrunk reopened this Jun 18, 2023
@davoclavo
Contributor

davoclavo commented Jun 27, 2023

If I am not mistaken, it seems like this issue was resolved by #32 🥳

@sbrunk
Owner Author

sbrunk commented Jun 28, 2023

You're right, for some reason 2. (traits, with a package object inheriting from them) did work after all. Not sure what caused the issues the last time I tried.

@sbrunk sbrunk closed this as completed Jun 28, 2023