added paddle.tensor.math.exp #15465

Closed
wants to merge 59 commits into from
Changes from all commits
59 commits
1b27230
close #15458
baberabb May 15, 2023
3c06651
close #15458
baberabb May 17, 2023
9c01939
Merge branch 'master' into exp
baberabb May 17, 2023
e3d4cda
Update test_set.py
Fayad-Alman May 17, 2023
e37545e
Update test_set.py
Fayad-Alman May 17, 2023
9ef31a8
updated optional requirements for m1 (#15521)
fnhirwa May 17, 2023
fd859aa
fixes to dtype wrappers (#15525)
RickSanchezStoic May 17, 2023
107df7a
🤖 Lint code
ivy-branch May 17, 2023
7820742
Reformat intelligent-tests.yml [skip ci]
RashulChutani May 17, 2023
de10a9f
Add Discrete Fourier Transform functions to Numpy Frontend #1532 (#14…
rubada May 17, 2023
2dab3b0
Added diag to the Tensorflow frontend(#14057)
mobley-trent May 17, 2023
44549a9
Adding test files for tf1 (#15491)
abdulasiraj May 17, 2023
41368c2
Partially fixed diagflat in numpy frontend Paddle backend fixed (#15531)
fnhirwa May 17, 2023
2aad1e5
Replaced `func` by `fn` in `func_wrapper.py` for consistency
vedpatwardhan May 17, 2023
fee1b47
Fix lint (#15535)
KareemMAX May 17, 2023
fbb20ba
moved ivy.ndindex and ivy.ndenumerate
AnnaTz May 17, 2023
993f30c
implemented ivy.indices and updated numpy_frontend.indices
AnnaTz May 17, 2023
ace9108
add vmap to normal jax frontend namespace
mattbarrett98 May 17, 2023
b9135ec
small fix to hann window of tensorflow backend
Daniel4078 May 17, 2023
a2bae53
Update test_jax_src_tree_util.py (#15541)
Fayad-Alman May 17, 2023
825a043
remove unnecessary dtype wrappers in paddle backend
MahmoudAshraf97 May 17, 2023
3b0fe21
pass unsupported dtype to execute_with_gradient function correctly as…
Ishticode May 17, 2023
1296553
fix test_jax_numpy_eye with paddle backend which did not handle k dia…
Ishticode May 17, 2023
b82b6eb
Added data_format parameter to batch_norm and updated the tests
hello-fri-end May 18, 2023
65231fe
Added data_format parameter to instance_norm, updated the doctsring, …
hello-fri-end May 18, 2023
a178442
ivy.bacth_norm:use ivy.prod(x.shape) for calculating the number of el…
hello-fri-end May 18, 2023
74d7059
Change base image from debian to ubuntu and upgrade python version (#…
bipinKrishnan May 18, 2023
c05e098
Dtype casting mode (#15637)
RickSanchezStoic May 18, 2023
509bc97
🤖 Lint code
ivy-branch May 18, 2023
6594b24
`jnp.compress` compositional func in the frontend (#15390)
mobley-trent May 18, 2023
3ee5541
add safety factor to test_jax_lax_mul to avoid overflows which lead t…
Ishticode May 18, 2023
b33b587
fix test_jax_numpy_array for paddle backend by updating in expand_dim…
Ishticode May 18, 2023
c17ae07
added torch.Tensor.norm
AnnaTz May 18, 2023
d15291b
removed unused import
AnnaTz May 18, 2023
c06ae50
fix for tf.asarray when values are dtype min and max
MahmoudAshraf97 May 18, 2023
cc1fe2e
updated torch norm function tests to cover all supported cases
AnnaTz May 18, 2023
5e32396
removed print statement
AnnaTz May 18, 2023
a60718a
Unsupported dtype decorator (#15483)
RickSanchezStoic May 18, 2023
393a59a
fixed int does not have ndim attribute
AnnaTz May 18, 2023
9bf91db
fixed ResourceVariable does not have ndim attribute
AnnaTz May 18, 2023
04a9f61
fixed dtype mismatch of pad return value
AnnaTz May 18, 2023
15c254c
cast in promote_types_of_inputs instead of creating a new array
MahmoudAshraf97 May 18, 2023
4d99733
fix test_jax_numpy_vander for all backends with issues relating to sh…
Ishticode May 18, 2023
6b3c58d
Merge pull request #15635 from hello-fri-end/dataFormatNorms
hello-fri-end May 18, 2023
526141d
Fixed solve, pinv and lstsq in the numpy frontend (#15649)
fnhirwa May 18, 2023
3b9b2b7
Revert "Change base image from debian to ubuntu and upgrade python ve…
bipinKrishnan May 18, 2023
1b9bffe
update jax backend dtype version specification as no additional suppo…
Ishticode May 18, 2023
1e554d3
update dtype version specifications for numpy backend as no dtype is …
Ishticode May 18, 2023
231219b
update the dtype support version specification to the latest stable v…
Ishticode May 18, 2023
e41c16d
update version dtype spec in torch backend to the latest stable versi…
Ishticode May 18, 2023
ce5b4a5
specify the latest dtype versions as no update has been made to vande…
Ishticode May 18, 2023
2e91737
Fixing dtype promotions for tf's scatter_nd
saeedashrraf May 18, 2023
d98bb90
Fixing ivy.matrix_norm (#15379)
AnnaTz May 18, 2023
4db0e65
remove incorrectly added where param to test_abs
Ishticode May 18, 2023
c842524
added numpy.random.negative_binomial to frontend (#15519)
baberabb May 18, 2023
a639ea0
Added Ravel instance method to Jax Frontend (#15388)
RakshitKumar04 May 18, 2023
e9021d4
rebase
baberabb May 15, 2023
5fb094f
rebase
baberabb May 17, 2023
bc5364c
Merge branch 'exp' of https://github.com/baberabb/ivy into exp
baberabb May 20, 2023
2,252 changes: 19 additions & 2,233 deletions .github/workflows/intelligent-tests.yml

Large diffs are not rendered by default.

66 changes: 30 additions & 36 deletions ivy/data_classes/array/experimental/norms.py
@@ -51,6 +51,7 @@ def batch_norm(
         training: bool = False,
         eps: float = 1e-5,
         momentum: float = 1e-1,
+        data_format: str = "NSC",
         out: Optional[Tuple[ivy.Array, ivy.Array, ivy.Array]] = None,
     ) -> Tuple[ivy.Array, ivy.Array, ivy.Array]:
         """
@@ -61,39 +62,35 @@ def batch_norm(
         Parameters
         ----------
         self
-            Input array of shape (N, *S, C), where N is the batch dimension,
+            Input array of default shape (N, *S, C), where N is the batch dimension,
             *S corresponds to any number of spatial dimensions and
             C corresponds to the channel dimension.
+        training
+            If true, calculate and use the mean and variance of `x`. Otherwise, use the
+            provided `mean` and `variance`.
         mean
-            Array used for input's normalization. If ``training=True``
-            then it must be one dimensional with size equal to the size of
-            channel dimension C. If ``training=False`` then it can be of any
-            shape broadcastble to the input shape.
+            Mean array used for input's normalization. It can be of any shape
+            broadcastable to (N, *S, C).
         variance
-            Array for the input's normalization. If ``training=True``
-            then it must be one dimensional with size equal to the size of
-            channel dimension C. If ``training=False`` then it can be of any shape
-            broadcastble to the input shape.
+            Variance array used for input's normalization. It can be of any shape
+            broadcastable to (N, *S, C).
         offset
             An offset array. If present, will be added to the normalized input.
-            If ``training=True`` then it must be one dimensional with size equal
-            to the size of channel dimension C. If ``training=False`` then it can
-            be of any shape broadcastble to the input shape.
+            It can be of any shape broadcastable to (N, *S, C).
         scale
             A scale array. If present, the scale is applied to the normalized input.
-            If ``training=True`` then it must be one dimensional with size equal to
-            the size of channel dimension C. If ``training=False`` then it can be of
-            any shape broadcastble to the input shape.
-        training
-            If true, calculate and use the mean and variance of `x`. Otherwise, use the
-            provided `mean` and `variance`.
+            It can be of any shape broadcastable to (N, *S, C).
         eps
             A small float number to avoid dividing by 0.
         momentum
             the value used for the running_mean and running_var computation.
             Default value is 0.1.
+        data_format
+            The ordering of the dimensions in the input, one of "NSC" or "NCS",
+            where N is the batch dimension, S represents any number of spatial
+            dimensions and C is the channel dimension. Default is "NSC".
         out
-            optional output array, for writing the result to.
+            optional output arrays, for writing the result to.
 
         Returns
         -------
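As an aside, the `momentum` parameter in this docstring follows the usual running-statistics convention: the running mean and variance are exponential moving averages of the batch statistics. A minimal sketch of that update rule (the helper name is hypothetical, not taken from Ivy's source):

```python
def update_running_stat(running, batch_stat, momentum=0.1):
    # With momentum=0.1, the new running value keeps 90% of the old
    # estimate and blends in 10% of the current batch statistic.
    return (1 - momentum) * running + momentum * batch_stat

new_mean = update_running_stat(0.0, 2.0)  # blends toward the batch mean
```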
@@ -110,6 +107,7 @@ def batch_norm(
             training=training,
             eps=eps,
             momentum=momentum,
+            data_format=data_format,
             out=out,
         )

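For orientation, the normalization this docstring describes, including the new `data_format` handling, can be sketched in plain NumPy. This is an illustration under assumed conventions, not Ivy's implementation; `batch_norm_sketch` is a hypothetical name:

```python
import numpy as np

def batch_norm_sketch(x, mean, variance, *, scale=None, offset=None,
                      eps=1e-5, data_format="NSC"):
    # Channels-first input ("NCS") is moved to channels-last ("NSC")
    # so mean/variance broadcast against the trailing channel axis.
    if data_format == "NCS":
        x = np.moveaxis(x, 1, -1)
    x_hat = (x - mean) / np.sqrt(variance + eps)
    if scale is not None:
        x_hat = x_hat * scale
    if offset is not None:
        x_hat = x_hat + offset
    # Restore the caller's layout before returning.
    if data_format == "NCS":
        x_hat = np.moveaxis(x_hat, -1, 1)
    return x_hat

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4, 3))  # (N, S, C)
out = batch_norm_sketch(x, x.mean((0, 1)), x.var((0, 1)))
```

Passing the same data transposed to (N, C, S) with `data_format="NCS"` yields the same result in channels-first layout, which is the equivalence the new parameter is meant to provide.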
@@ -124,6 +122,7 @@ def instance_norm(
         training: bool = False,
         eps: float = 1e-5,
         momentum: float = 1e-1,
+        data_format: str = "NSC",
         out: Optional[Tuple[ivy.Array, ivy.Array, ivy.Array]] = None,
     ) -> Tuple[ivy.Array, ivy.Array, ivy.Array]:
         """
@@ -134,29 +133,19 @@ def instance_norm(
         Parameters
         ----------
         self
-            Input array of shape (N, *S, C), where N is the batch dimension,
+            Input array of default shape (N, *S, C), where N is the batch dimension,
             *S corresponds to any number of spatial dimensions and
             C corresponds to the channel dimension.
         mean
-            Mean array used for input's normalization. If ``training=True``
-            then it must be one dimensional with size equal to the size of
-            channel dimension C. If ``training=False`` then it can be of any
-            shape broadcastble to the input shape.
+            Mean array of size C used for input's normalization.
         variance
-            Variance array for the input's normalization. If ``training=True``
-            then it must be one dimensional with size equal to the size of
-            channel dimension C. If ``training=False`` then it can be of any shape
-            broadcastble to the input shape.
+            Variance array of size C used for input's normalization.
         offset
-            An offset array. If present, will be added to the normalized input.
-            If ``training=True`` then it must be one dimensional with size equal
-            to the size of channel dimension C. If ``training=False`` then it can
-            be of any shape broadcastble to the input shape.
+            An offset array of size C. If present, will be added
+            to the normalized input.
         scale
-            A scale array. If present, the scale is applied to the normalized input.
-            If ``training=True`` then it must be one dimensional with size equal to
-            the size of channel dimension C. If ``training=False`` then it can be of
-            any shape broadcastble to the input shape.
+            A scale array of size C. If present, the scale is
+            applied to the normalized input.
         training
             If true, calculate and use the mean and variance of `x`. Otherwise, use the
             provided `mean` and `variance`.
@@ -165,6 +154,10 @@ def instance_norm(
         momentum
             the value used for the running_mean and running_var computation.
             Default value is 0.1.
+        data_format
+            The ordering of the dimensions in the input, one of "NSC" or "NCS",
+            where N is the batch dimension, S represents any number of spatial
+            dimensions and C is the channel dimension. Default is "NSC".
         out
             optional output array, for writing the result to.

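To contrast with batch norm, the behaviour this docstring describes can be sketched in plain NumPy: instance norm computes statistics per sample and per channel, over the spatial axes only. This is an illustration under assumed conventions, not Ivy's implementation; `instance_norm_sketch` is a hypothetical name:

```python
import numpy as np

def instance_norm_sketch(x, *, eps=1e-5, data_format="NSC"):
    # Channels-first input ("NCS") is moved to channels-last first.
    if data_format == "NCS":
        x = np.moveaxis(x, 1, -1)
    # Unlike batch norm, statistics are computed per sample and per
    # channel, over the spatial axes only (axes 1 .. ndim-2).
    spatial_axes = tuple(range(1, x.ndim - 1))
    mean = x.mean(axis=spatial_axes, keepdims=True)
    var = x.var(axis=spatial_axes, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    if data_format == "NCS":
        x_hat = np.moveaxis(x_hat, -1, 1)
    return x_hat

x = np.arange(48, dtype=float).reshape(2, 8, 3)  # (N, S, C)
out = instance_norm_sketch(x)
```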
@@ -184,6 +177,7 @@ def instance_norm(
             eps=eps,
             momentum=momentum,
             out=out,
+            data_format=data_format,
         )
 
     def lp_normalize(
1 change: 0 additions & 1 deletion ivy/data_classes/array/experimental/statistical.py
@@ -357,7 +357,6 @@ def nanmedian(
 
         Examples
         --------
-
         With :class:`ivy.array` input and default backend set as `numpy`:
 
         >>> a = ivy.array([[10.0, ivy.nan, 4], [3, 2, 1]])