Merge branch 'main' into issue-ivy-llc#26476
Mac16661 authored Oct 3, 2023
2 parents c6860d9 + 89cf14e commit 83f4ec1
Showing 95 changed files with 1,301 additions and 472 deletions.
10 changes: 5 additions & 5 deletions README.md
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
> 🚀 We are granting pilot access to **Ivy's Compiler and Transpiler**
> 🚀 We are granting pilot access to **Ivy's Tracer and Transpiler**
> to some users, [join the waitlist](https://console.unify.ai/) if you
> want to test them out!
@@ -131,8 +131,8 @@ deploy systems. Feel free to head over to the docs for the full API
reference, but the functions you'd most likely want to use are:

``` python
# Compiles a function into an efficient fully-functional graph, removing all wrapping and redundant code
ivy.compile()
# Traces an efficient fully-functional graph from a function, removing all wrapping and redundant code
ivy.trace_graph()

# Converts framework-specific code to a different framework
ivy.transpile()
@@ -142,8 +142,8 @@ ivy.unify()
```

These functions can be used eagerly or lazily. If you pass the necessary
arguments for function tracing, the compilation/transpilation step will
happen instantly (eagerly). Otherwise, the compilation/transpilation
arguments for function tracing, the tracing/transpilation step will
happen instantly (eagerly). Otherwise, the tracing/transpilation
will happen only when the returned function is first invoked.
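The eager-versus-lazy distinction can be sketched in plain Python (this is only an illustration of the pattern described above, not Ivy's implementation; `make_traceable` and its arguments are hypothetical names):

``` python
# Sketch of the eager-vs-lazy pattern: if example arguments are
# supplied, "trace" immediately; otherwise defer the work until the
# returned function is first invoked.
def make_traceable(fn, args=None):
    state = {"traced": None}

    def trace_now():
        # Stand-in for real graph tracing: just record that it happened.
        state["traced"] = f"graph of {fn.__name__}"
        return fn

    if args is not None:
        # Eager: trace instantly using the provided example arguments.
        return trace_now()

    def lazy(*call_args):
        # Lazy: trace on the first real invocation, then run as normal.
        if state["traced"] is None:
            trace_now()
        return fn(*call_args)

    return lazy

def add_one(x):
    return x + 1

eager = make_traceable(add_one, args=(3,))   # traced instantly
lazy = make_traceable(add_one)               # traced on first call
print(eager(3), lazy(3))  # 4 4
```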

2 changes: 1 addition & 1 deletion docs/demos
Submodule demos updated from 73a66d to d0c9f0
6 changes: 3 additions & 3 deletions docs/overview/contributing/error_handling.rst
@@ -26,7 +26,7 @@ This section, "Error Handling" aims to assist you in navigating through some com
E with_out=False,
E instance_method=False,
E test_gradients=False,
E test_compile=None,
E test_trace=None,
E as_variable=[False],
E native_arrays=[False],
E container=[False],
@@ -65,7 +65,7 @@ This section, "Error Handling" aims to assist you in navigating through some com
E with_out=False,
E instance_method=False,
E test_gradients=True,
E test_compile=None,
E test_trace=None,
E as_variable=[False],
E native_arrays=[False],
E container=[False],
@@ -129,7 +129,7 @@ This section, "Error Handling" aims to assist you in navigating through some com
E with_out=False,
E instance_method=False,
E test_gradients=False,
E test_compile=None,
E test_trace=None,
E as_variable=[False],
E native_arrays=[False],
E container=[False],
4 changes: 2 additions & 2 deletions docs/overview/deep_dive/containers.rst
@@ -252,8 +252,8 @@ There may be some compositional functions which are not implicitly nestable for
One such example is the :func:`ivy.linear` function which is not implicitly nestable despite being compositional. This is because of the use of special functions like :func:`__len__` and :func:`__list__` which, among other functions, are not nestable and can't be made nestable.
But we should try to avoid this, in order to make the flow of computation as intuitive to the user as possible.

When compiling the code, the computation graph is **identical** in either case, and there will be no implications on performance whatsoever.
The implicit nestable solution may be slightly less efficient in eager mode, as the leaves of the container are traversed multiple times rather than once, but if performance is of concern then the code should always be compiled in any case.
When tracing the code, the computation graph is **identical** in either case, and there will be no implications on performance whatsoever.
The implicit nestable solution may be slightly less efficient in eager mode, as the leaves of the container are traversed multiple times rather than once, but if performance is of concern then the code should always be traced in any case.
The distinction is only really relevant when stepping through and debugging with eager mode execution, and for the reasons outlined above, the preference is to keep compositional functions implicitly nestable where possible.
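The traversal trade-off can be sketched with a plain nested dict standing in for an `ivy.Container` (illustrative only; `map_leaves` is a hypothetical helper):

``` python
# Sketch of the eager-mode trade-off above: composing nestable
# primitives traverses the leaves once per primitive, while a fused
# function traverses them once. A tracer produces the same graph
# either way.
def map_leaves(fn, tree):
    if isinstance(tree, dict):
        return {k: map_leaves(fn, v) for k, v in tree.items()}
    return fn(tree)

x = {"a": 1, "b": {"c": 2}}

# Implicitly nestable composition: two traversals of the tree.
composed = map_leaves(lambda v: v + 1, map_leaves(lambda v: v * 2, x))

# Explicit handling: a single traversal applying the fused computation.
fused = map_leaves(lambda v: v * 2 + 1, x)

print(composed == fused)  # True
```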

**Shared Nested Structure**
6 changes: 3 additions & 3 deletions docs/overview/deep_dive/ivy_frontends.rst
@@ -92,12 +92,12 @@ The former set of functions map very closely to the API for the Accelerated Line
The latter set of functions map very closely to NumPy's well known API.
In general, all functions in the :mod:`jax.numpy` namespace are themselves implemented as a composition of the lower-level functions in the :mod:`jax.lax` namespace.

When transpiling between frameworks, the first step is to compile the computation graph into low level python functions for the source framework using Ivy's graph compiler, before then replacing these nodes with the associated functions in Ivy's frontend API.
When transpiling between frameworks, the first step is to trace a computation graph of low level python functions for the source framework using Ivy's tracer, before then replacing these nodes with the associated functions in Ivy's frontend API.
Given that all jax code can be decomposed into :mod:`jax.lax` function calls, when transpiling JAX code it should always be possible to express the computation graph as a composition of only :mod:`jax.lax` functions.
Therefore, arguably these are the *only* functions we should need to implement in the JAX frontend.
However, in general we wish to be able to compile a graph in the backend framework with varying levels of dynamicism.
However, in general we wish to be able to trace a graph in the backend framework with varying levels of dynamicism.
A graph of only :mod:`jax.lax` functions chained together in general is more *static* and less *dynamic* than a graph which chains :mod:`jax.numpy` functions together.
We wish to enable varying extents of dynamicism when compiling a graph with our graph compiler, and therefore we also implement the functions in the :mod:`jax.numpy` namespace in our frontend API for JAX.
We wish to enable varying extents of dynamicism when creating a graph with our tracer, and therefore we also implement the functions in the :mod:`jax.numpy` namespace in our frontend API for JAX.

Thus, both :mod:`lax` and :mod:`numpy` modules are created in the JAX frontend API.
We start with the function :func:`lax.add` as an example.
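The node-replacement step described above can be sketched as a toy substitution over a traced graph (here just an ordered list of primitive calls; all names are hypothetical, and real tracing operates on actual function objects rather than strings):

``` python
# Toy illustration of transpilation: each source-framework primitive in
# a traced graph is swapped for its frontend counterpart.
def frontend_add(x, y):
    return x + y

def frontend_mul(x, y):
    return x * y

# Mapping from source-framework ops to frontend implementations.
FRONTEND_MAP = {"lax.mul": frontend_mul, "lax.add": frontend_add}

# A traced graph for the computation x * 2 + 3, as (op, constant) pairs.
traced_graph = [("lax.mul", 2), ("lax.add", 3)]

def run_transpiled(x):
    for op_name, const in traced_graph:
        x = FRONTEND_MAP[op_name](x, const)
    return x

print(run_transpiled(5))  # 13
```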
4 changes: 2 additions & 2 deletions docs/overview/deep_dive/ivy_frontends_tests.rst
@@ -630,8 +630,8 @@ for example, :code:`ndarray.__add__` would expect an array as input, despite the
:func:`helpers.test_frontend_method` is used to test frontend instance methods. It is used in the same way as :func:`helpers.test_frontend_function`. A few important arguments for this function are the following:
- :code:`init_input_dtypes` Input dtypes of the arguments with which we are initializing the array.
- :code:`init_all_as_kwargs_np` The data to be passed when intializing, this will be a dictionary in which the numpy array which will contain the data will be passed in the :code:`data` key.
- :code:`method_input_dtypes` The input dtypes of the argument which are to be passed to the instance method after the intialization of the array.
- :code:`init_all_as_kwargs_np` The data to be passed when initializing, this will be a dictionary in which the numpy array which will contain the data will be passed in the :code:`data` key.
- :code:`method_input_dtypes` The input dtypes of the argument which are to be passed to the instance method after the initialization of the array.
- :code:`method_all_as_kwargs_np` All the arguments which are to be passed to the instance method.
8 changes: 4 additions & 4 deletions docs/overview/deep_dive/superset_behaviour.rst
@@ -47,7 +47,7 @@ We've already explained that we should not duplicate arguments in the Ivy functi
Does this mean, provided that the proposed argument is not a duplicate, that we should always add this backend-specific argument to the Ivy function?
The answer is **no**.
When determining the superset, we are only concerned with the pure **mathematics** of the function, and nothing else.
For example, the :code:`name` argument is common to many TensorFlow functions, such as `tf.concat <https://www.tensorflow.org/api_docs/python/tf/concat>`_, and is used for uniquely identifying parts of the compiled computation graph during logging and debugging.
For example, the :code:`name` argument is common to many TensorFlow functions, such as `tf.concat <https://www.tensorflow.org/api_docs/python/tf/concat>`_, and is used for uniquely identifying parts of the traced computation graph during logging and debugging.
This has nothing to do with the mathematics of the function, and so is *not* included in the superset considerations when implementing Ivy functions.
Similarly, in NumPy the argument :code:`subok` controls whether subclasses of the :class:`numpy.ndarray` class should be permitted, which is included in many functions, such as `numpy.ndarray.astype <https://numpy.org/doc/stable/reference/generated/numpy.ndarray.astype.html>`_.
Finally, in JAX the argument :code:`precision` is quite common, which controls the precision of the return values, as used in `jax.lax.conv <https://jax.readthedocs.io/en/latest/_autosummary/jax.lax.conv.html>`_ for example.
@@ -129,8 +129,8 @@ The following would be a much better solution:
return res
You will notice that this implementation involves more lines of code, but this should not be confused with added complexity.
All Ivy code should be graph compiled for efficiency, and in this case all the :code:`if` and :code:`else` statements are removed, and all that remains is the backend functions which were executed.
This new implementation will be compiled to a graph of either one, three, four, or six functions depending on the values of :code:`beta` and :code:`threshold`, while the previous implementation would *always* compile to six functions.
All Ivy code should be traced for efficiency, and in this case all the :code:`if` and :code:`else` statements are removed, and all that remains is the backend functions which were executed.
This new implementation will be traced to a graph of either one, three, four, or six functions depending on the values of :code:`beta` and :code:`threshold`, while the previous implementation would *always* trace to six functions.

This does mean we do not adopt the default values used by PyTorch, but that's okay.
Implementing the superset does not mean adopting the same default values for arguments, it simply means equipping the Ivy function with the capabilities to execute the superset of behaviours.
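The branch-pruning point above can be sketched with a standalone softplus-style function carrying PyTorch-style `beta` and `threshold` arguments (a minimal sketch, not Ivy's actual implementation):

``` python
import math

# Sketch of the branching style discussed above: the default path
# executes a single numerical step, while non-default beta/threshold
# values add extra steps. A tracer records only the branch that
# actually ran, so the default call produces the smallest graph.
def softplus_like(x, beta=None, threshold=None):
    if beta is not None and beta != 1:
        x_beta = x * beta
        res = math.log1p(math.exp(x_beta)) / beta
    else:
        x_beta = x
        res = math.log1p(math.exp(x))
    if threshold is not None and x_beta > threshold:
        # Linear region: return the input directly, skipping the
        # expensive log1p/exp result.
        res = x
    return res

print(round(softplus_like(0.0), 4))  # 0.6931  (= ln 2)
```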
@@ -167,7 +167,7 @@ With regards to both of these points, Ivy provides the generalized superset impl

However, as discussed above, :func:`np.logical_and` also supports the :code:`where` argument, which we opt to **not** support in Ivy.
This is because the behaviour can easily be created as a composition like so :code:`ivy.where(mask, ivy.logical_and(x, y), ivy.zeros_like(mask))`, and we prioritize the simplicity, clarity, and function uniqueness in Ivy's API in this case, which comes at the cost of reduced runtime efficiency for some functions when using a NumPy backend.
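The composition just mentioned can be written with NumPy in place of Ivy, since :code:`ivy.where`, :code:`ivy.logical_and`, and :code:`ivy.zeros_like` have direct NumPy analogues (a sketch of the equivalence, not Ivy's own code):

``` python
import numpy as np

x = np.array([True, True, False])
y = np.array([True, False, False])
mask = np.array([True, False, True])

# Composition standing in for logical_and with a `where` mask:
# masked-out positions fall back to False.
res = np.where(mask, np.logical_and(x, y), np.zeros_like(mask))
print(res)  # [ True False False]
```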
However, in future releases our automatic graph compilation and graph simplification processes will alleviate these minor inefficiencies entirely from the final computation graph, by fusing multiple operations into one at the API level where possible.
However, in future releases our automatic graph tracing and graph simplification processes will alleviate these minor inefficiencies entirely from the final computation graph, by fusing multiple operations into one at the API level where possible.

Maximizing Usage of Native Functionality
----------------------------------------
2 changes: 1 addition & 1 deletion docs/overview/design.rst
@@ -29,7 +29,7 @@ If that sounds like you, feel free to check out the `Deep Dive`_ section after y
| back-end functional APIs ✅
| Ivy functional API ✅
| Framework Handler ✅
| Ivy Compiler 🚧
| Ivy Tracer 🚧
|
| (b) `Ivy as a Transpiler <design/ivy_as_a_transpiler.rst>`_
| front-end functional APIs 🚧
