
[14.0] [ADD] queue_job_batch_size #566

Open. Wants to merge 1 commit into base: 14.0
98 changes: 98 additions & 0 deletions queue_job_batch_size/README.rst
@@ -0,0 +1,98 @@
====================
Queue Job Batch Size
====================

..
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   !! This file is generated by oca-gen-addon-readme !!
   !! changes will be overwritten.                   !!
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   !! source digest: sha256:3b2b67b2f9e534bfe1b21f54456b2c5f02088b0dbf652e26b57d9704678fef2c
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
    :target: https://odoo-community.org/page/development-status
    :alt: Beta
.. |badge2| image:: https://img.shields.io/badge/licence-AGPL--3-blue.png
    :target: http://www.gnu.org/licenses/agpl-3.0-standalone.html
    :alt: License: AGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Fqueue-lightgray.png?logo=github
    :target: https://github.com/OCA/queue/tree/14.0/queue_job_batch_size
    :alt: OCA/queue
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
    :target: https://translation.odoo-community.org/projects/queue-14-0/queue-14-0-queue_job_batch_size
    :alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
    :target: https://runboat.odoo-community.org/builds?repo=OCA/queue&target_branch=14.0
    :alt: Try me on Runboat

|badge1| |badge2| |badge3| |badge4| |badge5|

This module allows you to seamlessly split a big job into smaller jobs.

It uses ``queue_job_batch`` to group the created jobs into a batch.

Example:

.. code-block:: python

    class ResPartner(models.Model):
        # ...
        def copy_all_partners(self):
            # Duplicate all partners in batches of 30:
            self.with_delay(batch_size=30).copy()
        # ...

    self.env['res.partner'].search([], limit=1000).copy_all_partners()

This will create 34 jobs, each one copying 30 partners (except the last, which copies 10), and group them into a batch.

Instead of ``batch_size``, one can also use ``batch_count`` to specify the number of batches to create.
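The rounding rules above (a shorter final batch with ``batch_size``, one extra record per batch with ``batch_count``) can be sketched as a standalone helper. ``split_batches`` is a hypothetical name for illustration, not part of the module's API:

```python
def split_batches(total_records, batch_size=None, batch_count=None):
    """Return (start, end) slice bounds for each non-empty batch."""
    if batch_size:
        # One batch per full chunk, plus one for the remainder.
        batch_count = 1 + total_records // batch_size
    else:
        # Derive a batch size from the requested count, rounding up.
        batch_size = total_records // batch_count
        if total_records % batch_count:
            batch_size += 1
    slices = []
    for batch in range(batch_count):
        start = batch * batch_size
        end = min((batch + 1) * batch_size, total_records)
        if end > start:  # skip the trailing empty batch
            slices.append((start, end))
    return slices


# 1000 records in batches of 30 -> 34 jobs, the last covering 10 records
jobs = split_batches(1000, batch_size=30)  # jobs[-1] == (990, 1000)
```

Note that when ``total_records`` divides evenly by ``batch_size``, the ``1 +`` produces one empty trailing batch, which the ``end > start`` guard drops.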



**Table of contents**

.. contents::
:local:

Bug Tracker
===========

Bugs are tracked on `GitHub Issues <https://github.com/OCA/queue/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing detailed and welcome
`feedback <https://github.com/OCA/queue/issues/new?body=module:%20queue_job_batch_size%0Aversion:%2014.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.

Do not contact contributors directly about support or help with technical issues.

Credits
=======

Authors
~~~~~~~

* Akretion

Contributors
~~~~~~~~~~~~

* Florian Mounier <[email protected]>

Maintainers
~~~~~~~~~~~

This module is maintained by the OCA.

.. image:: https://odoo-community.org/logo.png
:alt: Odoo Community Association
:target: https://odoo-community.org

OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.

This module is part of the `OCA/queue <https://github.com/OCA/queue/tree/14.0/queue_job_batch_size>`_ project on GitHub.

You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
1 change: 1 addition & 0 deletions queue_job_batch_size/__init__.py
@@ -0,0 +1 @@
from . import models
19 changes: 19 additions & 0 deletions queue_job_batch_size/__manifest__.py
@@ -0,0 +1,19 @@
# Copyright 2023 Akretion (http://www.akretion.com).
# @author Florian Mounier <[email protected]>
# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl).

{
    "name": "Queue Job Batch Size",
    "summary": "Add batch size / steps property to queue jobs to"
    " automatically split them",
    "version": "14.0.1.0.0",
    "author": "Akretion,Odoo Community Association (OCA)",
    "website": "https://github.com/OCA/queue",
    "category": "Generic Modules",
    "license": "AGPL-3",
    "application": False,
    "installable": True,
    "depends": [
        "queue_job_batch",
    ],
}
1 change: 1 addition & 0 deletions queue_job_batch_size/models/__init__.py
@@ -0,0 +1 @@
from . import base
132 changes: 132 additions & 0 deletions queue_job_batch_size/models/base.py
@@ -0,0 +1,132 @@
# Copyright 2023 Akretion (http://www.akretion.com).
# @author Florian Mounier <[email protected]>
# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl).

import operator
from functools import reduce

from odoo import models

from odoo.addons.queue_job.delay import Delayable


class DelayableBatchRecordset(object):
    __slots__ = ("delayables", "batch")

    def __init__(
        self,
        recordset,
        priority=None,
        eta=None,
        max_retries=None,
        description=None,
        channel=None,
        identity_key=None,
        batch_size=None,
        batch_count=None,
    ):
        total_records = len(recordset)
        if batch_size:
            batch_count = 1 + total_records // batch_size
        else:
            batch_size = total_records // batch_count
            if total_records % batch_count:
                batch_size += 1

        description = description or "__EMPTY__"
        self.batch = recordset.env["queue.job.batch"].get_new_batch(
            "Batch of %s" % description
        )
        self.delayables = []
        for batch in range(batch_count):
            start = batch * batch_size
            end = min((batch + 1) * batch_size, total_records)
            if end > start:
                self.delayables.append(
                    Delayable(
                        recordset[start:end].with_context(job_batch=self.batch),
                        priority=priority or 12,  # Lower priority than default
                        # to let queue_job_batch check the state
                        eta=eta,
                        max_retries=max_retries,
                        description="%s (batch %d/%d)"
                        % (description, batch + 1, batch_count),
                        channel=channel,
                        identity_key=identity_key,
                    )
                )

    @property
    def recordset(self):
        # Union of every batch slice, i.e. the original full recordset
        return reduce(operator.or_, (d.recordset for d in self.delayables))

    def __getattr__(self, name):
        def _delay_delayable(*args, **kwargs):
            for delayable in self.delayables:
                func = getattr(delayable, name)

                # FIXME: Find a better way to set default description
                if "__EMPTY__" in delayable.description:
                    description = (
                        func.__doc__.splitlines()[0].strip()
                        if func.__doc__
                        else "{}.{}".format(delayable.recordset._name, name)
                    )
                    delayable.description = delayable.description.replace(
                        "__EMPTY__", description
                    )
                    if "__EMPTY__" in self.batch.name:
                        self.batch.name = self.batch.name.replace(
                            "__EMPTY__", description
                        )
                func(*args, **kwargs).delay()
            self.batch.enqueue()
            return [delayable._generated_job for delayable in self.delayables]

        return _delay_delayable

    def __str__(self):
        recordset = self.delayables[0].recordset
        return "DelayableBatchRecordset(%s%s)" % (
            recordset._name,
            getattr(recordset, "_ids", ""),
        )

    __repr__ = __str__
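The ``__getattr__`` trick used above is a generic fan-out proxy: attribute access returns a closure that invokes the named method on every wrapped object. A minimal standalone sketch of the same pattern, with hypothetical ``Job`` and ``BatchProxy`` classes and no Odoo dependency:

```python
class Job:
    """Stand-in for one delayable unit of work."""

    def __init__(self, ident):
        self.ident = ident
        self.log = []

    def work(self, arg):
        self.log.append(arg)
        return "%s:%s" % (self.ident, arg)


class BatchProxy:
    """Fan any method call out to every wrapped job, collecting results."""

    def __init__(self, jobs):
        self.jobs = jobs

    def __getattr__(self, name):
        # Called only for attributes not found normally; returns a closure
        # that dispatches the call to each wrapped job in turn.
        def _fan_out(*args, **kwargs):
            return [getattr(job, name)(*args, **kwargs) for job in self.jobs]

        return _fan_out


proxy = BatchProxy([Job("a"), Job("b")])
results = proxy.work("x")  # → ["a:x", "b:x"]
```

In the module itself the closure additionally fills in default descriptions, delays each job, and enqueues the batch; the sketch keeps only the dispatch mechanism.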


class Base(models.AbstractModel):
    _inherit = "base"

    def with_delay(
        self,
        priority=None,
        eta=None,
        max_retries=None,
        description=None,
        channel=None,
        identity_key=None,
        batch_size=None,
        batch_count=None,
    ):
        if batch_size or batch_count:
            return DelayableBatchRecordset(
                self,
                priority=priority,
                eta=eta,
                max_retries=max_retries,
                description=description,
                channel=channel,
                identity_key=identity_key,
                batch_size=batch_size,
                batch_count=batch_count,
            )

        return super().with_delay(
            priority=priority,
            eta=eta,
            max_retries=max_retries,
            description=description,
            channel=channel,
            identity_key=identity_key,
        )

Contributor review comment (on the new with_delay signature): This seems straightforward enough, but the corresponding change to def delayable is missing.
1 change: 1 addition & 0 deletions queue_job_batch_size/readme/CONTRIBUTORS.rst
@@ -0,0 +1 @@
* Florian Mounier <[email protected]>
21 changes: 21 additions & 0 deletions queue_job_batch_size/readme/DESCRIPTION.rst
@@ -0,0 +1,21 @@
This module allows you to seamlessly split a big job into smaller jobs.

It uses ``queue_job_batch`` to group the created jobs into a batch.

Example:

.. code-block:: python

    class ResPartner(models.Model):
        # ...
        def copy_all_partners(self):
            # Duplicate all partners in batches of 30:
            self.with_delay(batch_size=30).copy()
        # ...

    self.env['res.partner'].search([], limit=1000).copy_all_partners()

This will create 34 jobs, each one copying 30 partners (except the last, which copies 10), and group them into a batch.

Instead of ``batch_size``, one can also use ``batch_count`` to specify the number of batches to create.