Merge genaidev into dev (Project-MONAI#7886)
Fixes Project-MONAI#6676.

### Description

This merges the Generative Models code into `dev`. Everything has been
reviewed by the generative team, all tests pass, and the recent changes
have been integrated. This is ready to merge.

### Types of changes
<!--- Put an `x` in all the boxes that apply, and remove the not
applicable items -->
- [x] Non-breaking change (fix or new feature that would not break
existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing
functionality to change).
- [x] New tests added to cover the changes.
- [x] Integration tests passed locally by running `./runtests.sh -f -u
--net --coverage`.
- [x] Quick tests passed locally by running `./runtests.sh --quick
--unittests --disttests`.
- [x] In-line docstrings updated.
- [x] Documentation updated, tested `make html` command in the `docs/`
folder.

---------

Signed-off-by: Mark Graham <[email protected]>
Signed-off-by: KumoLiu <[email protected]>
Signed-off-by: YunLiu <[email protected]>
Signed-off-by: Mark Graham <[email protected]>
Signed-off-by: vgrau98 <[email protected]>
Signed-off-by: vgrau98 <[email protected]>
Signed-off-by: Wenqi Li <[email protected]>
Signed-off-by: dongy <[email protected]>
Signed-off-by: myron <[email protected]>
Signed-off-by: kaibo <[email protected]>
Signed-off-by: monai-bot <[email protected]>
Signed-off-by: elitap <[email protected]>
Signed-off-by: Felix Schnabel <[email protected]>
Signed-off-by: YanxuanLiu <[email protected]>
Signed-off-by: ytl0623 <[email protected]>
Signed-off-by: Dženan Zukić <[email protected]>
Signed-off-by: Ishan Dutta <[email protected]>
Signed-off-by: dependabot[bot] <[email protected]>
Signed-off-by: heyufan1995 <[email protected]>
Signed-off-by: binliu <[email protected]>
Signed-off-by: axel.vlaminck <[email protected]>
Signed-off-by: Ibrahim Hadzic <[email protected]>
Signed-off-by: Behrooz <[email protected]>
Signed-off-by: Timothy Baker <[email protected]>
Signed-off-by: Mathijs de Boer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Lucas Robinet <[email protected]>
Signed-off-by: Lucas Robinet <[email protected]>
Signed-off-by: chaoliu <[email protected]>
Signed-off-by: cxlcl <[email protected]>
Signed-off-by: chaoliu <[email protected]>
Signed-off-by: Suraj Pai <[email protected]>
Signed-off-by: Juan Pablo de la Cruz Gutiérrez <[email protected]>
Signed-off-by: John Zielke <[email protected]>
Signed-off-by: Mingxin Zheng <[email protected]>
Signed-off-by: Vladimir Chernyi <[email protected]>
Signed-off-by: Yiheng Wang <[email protected]>
Signed-off-by: Szabolcs Botond Lorincz Molnar <[email protected]>
Signed-off-by: Lucas Robinet <[email protected]>
Signed-off-by: Mingxin <[email protected]>
Signed-off-by: Han Wang <[email protected]>
Signed-off-by: Konstantin Sukharev <[email protected]>
Signed-off-by: Ben Murray <[email protected]>
Signed-off-by: Matthew Vine <[email protected]>
Signed-off-by: Peter Kaplinsky <[email protected]>
Signed-off-by: Simon Jensen <[email protected]>
Signed-off-by: NabJa <[email protected]>
Signed-off-by: virginiafdez <[email protected]>
Signed-off-by: Eric Kerfoot <[email protected]>
Signed-off-by: Eric Kerfoot <[email protected]>
Co-authored-by: Mark Graham <[email protected]>
Co-authored-by: YunLiu <[email protected]>
Co-authored-by: KumoLiu <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: vgrau98 <[email protected]>
Co-authored-by: Wenqi Li <[email protected]>
Co-authored-by: Dong Yang <[email protected]>
Co-authored-by: myron <[email protected]>
Co-authored-by: Kaibo Tang <[email protected]>
Co-authored-by: monai-bot <[email protected]>
Co-authored-by: elitap <[email protected]>
Co-authored-by: Felix Schnabel <[email protected]>
Co-authored-by: YanxuanLiu <[email protected]>
Co-authored-by: ytl0623 <[email protected]>
Co-authored-by: Dženan Zukić <[email protected]>
Co-authored-by: Ishan Dutta <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Kaibo Tang <[email protected]>
Co-authored-by: Yufan He <[email protected]>
Co-authored-by: binliunls <[email protected]>
Co-authored-by: Ben Murray <[email protected]>
Co-authored-by: axel.vlaminck <[email protected]>
Co-authored-by: Mingxin Zheng <[email protected]>
Co-authored-by: Ibrahim Hadzic <[email protected]>
Co-authored-by: Dr. Behrooz Hashemian <[email protected]>
Co-authored-by: Timothy J. Baker <[email protected]>
Co-authored-by: Mathijs de Boer <[email protected]>
Co-authored-by: Mathijs de Boer <[email protected]>
Co-authored-by: Fabian Klopfer <[email protected]>
Co-authored-by: Yiheng Wang <[email protected]>
Co-authored-by: Lucas Robinet <[email protected]>
Co-authored-by: Lucas Robinet <[email protected]>
Co-authored-by: cxlcl <[email protected]>
Co-authored-by: Suraj Pai <[email protected]>
Co-authored-by: Juampa <[email protected]>
Co-authored-by: johnzielke <[email protected]>
Co-authored-by: Vladimir Chernyi <[email protected]>
Co-authored-by: Lőrincz-Molnár Szabolcs-Botond <[email protected]>
Co-authored-by: Nic Ma <[email protected]>
Co-authored-by: Lucas Robinet <[email protected]>
Co-authored-by: Han Wang <[email protected]>
Co-authored-by: Konstantin Sukharev <[email protected]>
Co-authored-by: Matthew Vine <[email protected]>
Co-authored-by: Pkaps25 <[email protected]>
Co-authored-by: Peter Kaplinsky <[email protected]>
Co-authored-by: Simon Jensen <[email protected]>
Co-authored-by: NabJa <[email protected]>
Co-authored-by: Virginia Fernandez <[email protected]>
Co-authored-by: virginiafdez <[email protected]>
Co-authored-by: Yu <[email protected]>
Showing 86 changed files with 16,359 additions and 178 deletions.
5 changes: 5 additions & 0 deletions docs/source/engines.rst
@@ -30,6 +30,11 @@ Workflows
.. autoclass:: GanTrainer
:members:

`AdversarialTrainer`
~~~~~~~~~~~~~~~~~~~~
.. autoclass:: AdversarialTrainer
:members:

`Evaluator`
~~~~~~~~~~~
.. autoclass:: Evaluator
23 changes: 23 additions & 0 deletions docs/source/inferers.rst
@@ -49,6 +49,29 @@ Inferers
:members:
:special-members: __call__

`DiffusionInferer`
~~~~~~~~~~~~~~~~~~
.. autoclass:: DiffusionInferer
:members:
:special-members: __call__

`LatentDiffusionInferer`
~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: LatentDiffusionInferer
:members:
:special-members: __call__

`ControlNetDiffusionInferer`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: ControlNetDiffusionInferer
:members:
:special-members: __call__

`ControlNetLatentDiffusionInferer`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: ControlNetLatentDiffusionInferer
:members:
:special-members: __call__

Splitters
---------
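As context for these new entries, here is a minimal sketch of how `DiffusionInferer` is typically driven, based on the MONAI GenerativeModels API that this merge brings into core; the module paths (`monai.networks.nets.DiffusionModelUNet`, `monai.networks.schedulers.DDPMScheduler`) and argument names are assumptions taken from that project and may differ slightly in the merged code.

```python
# Hedged sketch only: class locations and argument names are assumed from the
# GenerativeModels project this PR merges in; they are not confirmed by the diff.
import torch
import torch.nn.functional as F

from monai.inferers import DiffusionInferer
from monai.networks.nets import DiffusionModelUNet
from monai.networks.schedulers import DDPMScheduler

model = DiffusionModelUNet(spatial_dims=2, in_channels=1, out_channels=1)
scheduler = DDPMScheduler(num_train_timesteps=1000)
inferer = DiffusionInferer(scheduler)

images = torch.rand(2, 1, 64, 64)  # toy batch of 2D images
noise = torch.randn_like(images)
timesteps = torch.randint(0, scheduler.num_train_timesteps, (images.shape[0],))

# Training-style call: the inferer noises the images at the sampled timesteps
# and returns the model's noise prediction.
prediction = inferer(inputs=images, diffusion_model=model, noise=noise, timesteps=timesteps)
loss = F.mse_loss(prediction, noise)

# Sampling: denoise pure noise step by step with the scheduler.
samples = inferer.sample(input_noise=noise, diffusion_model=model, scheduler=scheduler)
```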
5 changes: 5 additions & 0 deletions docs/source/utils.rst
@@ -81,3 +81,8 @@ Component store
---------------
.. autoclass:: monai.utils.component_store.ComponentStore
:members:

Ordering
--------
.. automodule:: monai.utils.ordering
:members:
4 changes: 2 additions & 2 deletions monai/apps/detection/utils/anchor_utils.py
@@ -189,7 +189,7 @@ def generate_anchors(
w_ratios = 1 / area_scale
h_ratios = area_scale
# if 3d, w:h:d = 1:aspect_ratios[:,0]:aspect_ratios[:,1]
- elif self.spatial_dims == 3:
+ else:
area_scale = torch.pow(aspect_ratios_t[:, 0] * aspect_ratios_t[:, 1], 1 / 3.0)
w_ratios = 1 / area_scale
h_ratios = aspect_ratios_t[:, 0] / area_scale
@@ -199,7 +199,7 @@ def generate_anchors(
hs = (h_ratios[:, None] * scales_t[None, :]).view(-1)
if self.spatial_dims == 2:
base_anchors = torch.stack([-ws, -hs, ws, hs], dim=1) / 2.0
- elif self.spatial_dims == 3:
+ else:  # elif self.spatial_dims == 3:
ds = (d_ratios[:, None] * scales_t[None, :]).view(-1)
base_anchors = torch.stack([-ws, -hs, -ds, ws, hs, ds], dim=1) / 2.0

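A small sketch of the 3-D branch above, to make the anchor arithmetic concrete; the `d_ratios` line is not visible in this hunk and is assumed to mirror `h_ratios`, per the `w:h:d = 1:aspect_ratios[:,0]:aspect_ratios[:,1]` comment.

```python
import torch

# One 3-D aspect ratio (a0, a1) and one scale, mirroring the hunk above.
aspect_ratios_t = torch.tensor([[1.0, 2.0]])
scales_t = torch.tensor([4.0])

area_scale = torch.pow(aspect_ratios_t[:, 0] * aspect_ratios_t[:, 1], 1 / 3.0)
w_ratios = 1 / area_scale
h_ratios = aspect_ratios_t[:, 0] / area_scale
d_ratios = aspect_ratios_t[:, 1] / area_scale  # assumed, by symmetry with h_ratios

ws = (w_ratios[:, None] * scales_t[None, :]).view(-1)
hs = (h_ratios[:, None] * scales_t[None, :]).view(-1)
ds = (d_ratios[:, None] * scales_t[None, :]).view(-1)

# Zero-centred boxes: w * h * d == scale**3 regardless of the aspect ratio.
base_anchors = torch.stack([-ws, -hs, -ds, ws, hs, ds], dim=1) / 2.0
print(base_anchors)  # tensor([[-1.5874, -1.5874, -3.1748,  1.5874,  1.5874,  3.1748]])
```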
1 change: 1 addition & 0 deletions monai/apps/pathology/transforms/post/array.py
@@ -379,6 +379,7 @@ def _generate_contour_coord(self, current: np.ndarray, previous: np.ndarray) ->
"""

p_delta = (current[0] - previous[0], current[1] - previous[1])
row, col = -1, -1

if p_delta in ((0.0, 1.0), (0.5, 0.5), (1.0, 0.0)):
row = int(current[0] + 0.5)
1 change: 1 addition & 0 deletions monai/bundle/utils.py
@@ -221,6 +221,7 @@ def load_bundle_config(bundle_path: str, *config_names: str, **load_kw_args: Any
raise ValueError(f"Cannot find config file '{full_cname}'")

ardata = archive.read(full_cname)
cdata = {}

if full_cname.lower().endswith("json"):
cdata = json.loads(ardata, **load_kw_args)
1 change: 1 addition & 0 deletions monai/data/dataset_summary.py
@@ -84,6 +84,7 @@ def collect_meta_data(self):
"""

for data in self.data_loader:
meta_dict = {}
if isinstance(data[self.image_key], MetaTensor):
meta_dict = data[self.image_key].meta
elif self.meta_key in data:
15 changes: 9 additions & 6 deletions monai/data/utils.py
@@ -53,10 +53,6 @@
pytorch_after,
)

- if pytorch_after(1, 13):
-     # import private code for reuse purposes, comment in case things break in the future
-     from torch.utils.data._utils.collate import collate_tensor_fn, default_collate_fn_map

pd, _ = optional_import("pandas")
DataFrame, _ = optional_import("pandas", name="DataFrame")
nib, _ = optional_import("nibabel")
@@ -454,8 +450,13 @@ def collate_meta_tensor_fn(batch, *, collate_fn_map=None):
Collate a sequence of meta tensors into a single batched MetaTensor. This is called by `collate_meta_tensor`
and so should not be used as a collate function directly in dataloaders.
"""
- collate_fn = collate_tensor_fn if pytorch_after(1, 13) else default_collate
- collated = collate_fn(batch)  # type: ignore
+ if pytorch_after(1, 13):
+     from torch.utils.data._utils.collate import collate_tensor_fn  # imported here for pylint/mypy issues
+
+     collated = collate_tensor_fn(batch)
+ else:
+     collated = default_collate(batch)
+
meta_dicts = [i.meta or TraceKeys.NONE for i in batch]
common_ = set.intersection(*[set(d.keys()) for d in meta_dicts if isinstance(d, dict)])
if common_:
@@ -496,6 +497,8 @@ def list_data_collate(batch: Sequence):

if pytorch_after(1, 13):
# needs to go here to avoid circular import
from torch.utils.data._utils.collate import default_collate_fn_map

from monai.data.meta_tensor import MetaTensor

default_collate_fn_map.update({MetaTensor: collate_meta_tensor_fn})
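These hunks lean on PyTorch ≥ 1.13's `default_collate_fn_map`, a (private) registry that lets `default_collate` dispatch per element type. Below is a self-contained sketch of that registration pattern, using a hypothetical `TaggedTensor` subclass in place of `MetaTensor`.

```python
# Sketch of the collate_fn_map registration pattern used above.
# TaggedTensor is a hypothetical stand-in for MetaTensor; the registry itself
# is private PyTorch API (torch >= 1.13) and may change between releases.
import torch
from torch.utils.data import default_collate
from torch.utils.data._utils.collate import default_collate_fn_map


class TaggedTensor(torch.Tensor):
    """Hypothetical tensor subclass standing in for MetaTensor."""


def collate_tagged_tensor_fn(batch, *, collate_fn_map=None):
    # Stack the elements as plain tensors; a real handler (like
    # collate_meta_tensor_fn above) would also merge per-item metadata.
    return torch.stack([t.as_subclass(torch.Tensor) for t in batch], dim=0)


# Register the handler so default_collate dispatches on TaggedTensor elements.
default_collate_fn_map.update({TaggedTensor: collate_tagged_tensor_fn})

batch = [torch.rand(3).as_subclass(TaggedTensor) for _ in range(4)]
print(default_collate(batch).shape)  # torch.Size([4, 3])
```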
4 changes: 3 additions & 1 deletion monai/engines/__init__.py
@@ -12,12 +12,14 @@
from __future__ import annotations

from .evaluator import EnsembleEvaluator, Evaluator, SupervisedEvaluator
- from .trainer import GanTrainer, SupervisedTrainer, Trainer
+ from .trainer import AdversarialTrainer, GanTrainer, SupervisedTrainer, Trainer
from .utils import (
DiffusionPrepareBatch,
IterationEvents,
PrepareBatch,
PrepareBatchDefault,
PrepareBatchExtraInput,
VPredictionPrepareBatch,
default_make_latent,
default_metric_cmp_fn,
default_prepare_batch,
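With the updated `__init__`, the new engine utilities become importable directly from `monai.engines`; a quick check, using only names visible in the diff above:

```python
# Names taken directly from the updated monai/engines/__init__.py hunk.
from monai.engines import AdversarialTrainer, DiffusionPrepareBatch, VPredictionPrepareBatch

print(AdversarialTrainer, DiffusionPrepareBatch, VPredictionPrepareBatch)
```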
(Diffs for the remaining changed files are not shown.)
