[ADD] queue_job_batch_size
paradoxxxzero committed Oct 20, 2023
1 parent 306093d commit b46ee67
Showing 12 changed files with 914 additions and 0 deletions.
98 changes: 98 additions & 0 deletions queue_job_batch_size/README.rst
@@ -0,0 +1,98 @@
====================
Queue Job Batch Size
====================

..
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! This file is generated by oca-gen-addon-readme !!
!! changes will be overwritten. !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! source digest: sha256:3b2b67b2f9e534bfe1b21f54456b2c5f02088b0dbf652e26b57d9704678fef2c
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
:target: https://odoo-community.org/page/development-status
:alt: Beta
.. |badge2| image:: https://img.shields.io/badge/licence-AGPL--3-blue.png
:target: http://www.gnu.org/licenses/agpl-3.0-standalone.html
:alt: License: AGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Fqueue-lightgray.png?logo=github
:target: https://github.com/OCA/queue/tree/14.0/queue_job_batch_size
:alt: OCA/queue
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
:target: https://translation.odoo-community.org/projects/queue-14-0/queue-14-0-queue_job_batch_size
:alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
:target: https://runboat.odoo-community.org/builds?repo=OCA/queue&target_branch=14.0
:alt: Try me on Runboat

|badge1| |badge2| |badge3| |badge4| |badge5|

This module allows you to seamlessly split a big job into smaller jobs.

It uses ``queue_job_batch`` to group the created jobs into a batch.

Example:

.. code-block:: python

    class ResPartner(models.Model):
        # ...
        def copy_all_partners(self):
            # Duplicate all partners in batches of 30:
            self.with_delay(batch_size=30).copy()

    # ...
    self.env['res.partner'].search([], limit=1000).copy_all_partners()

This will create 34 jobs, each one copying 30 partners (except the last one, which will copy 10), and group them into a batch.

Instead of ``batch_size``, one can also use ``batch_count`` to specify the number of batches to create.
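As a plain-Python illustration (not part of the module's API), ``batch_count`` is effectively turned into a per-batch size of ``ceil(total / batch_count)``; the helper name ``size_for`` below is hypothetical:

```python
# Hedged sketch of how batch_count maps to a per-batch size,
# mirroring the integer arithmetic used by this module.
def size_for(total_records, batch_count):
    batch_size = total_records // batch_count
    if total_records % batch_count:
        batch_size += 1  # round up so all records are covered
    return batch_size

print(size_for(1000, 3))  # 334 -> batches of 334, 334 and 332 records
```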



**Table of contents**

.. contents::
:local:

Bug Tracker
===========

Bugs are tracked on `GitHub Issues <https://github.com/OCA/queue/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
`feedback <https://github.com/OCA/queue/issues/new?body=module:%20queue_job_batch_size%0Aversion:%2014.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.

Do not contact contributors directly about support or help with technical issues.

Credits
=======

Authors
~~~~~~~

* Akretion

Contributors
~~~~~~~~~~~~

* Florian Mounier <[email protected]>

Maintainers
~~~~~~~~~~~

This module is maintained by the OCA.

.. image:: https://odoo-community.org/logo.png
:alt: Odoo Community Association
:target: https://odoo-community.org

OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.

This module is part of the `OCA/queue <https://github.com/OCA/queue/tree/14.0/queue_job_batch_size>`_ project on GitHub.

You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
1 change: 1 addition & 0 deletions queue_job_batch_size/__init__.py
@@ -0,0 +1 @@
from . import models
19 changes: 19 additions & 0 deletions queue_job_batch_size/__manifest__.py
@@ -0,0 +1,19 @@
# Copyright 2023 Akretion (http://www.akretion.com).
# @author Florian Mounier <[email protected]>
# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl).

{
"name": "Queue Job Batch Size",
"summary": "Add batch size / steps property to queue jobs to"
" automatically split them",
"version": "14.0.1.0.0",
"author": "Akretion,Odoo Community Association (OCA)",
"website": "https://github.com/OCA/queue",
"category": "Generic Modules",
"license": "AGPL-3",
"application": False,
"installable": True,
"depends": [
"queue_job_batch",
],
}
1 change: 1 addition & 0 deletions queue_job_batch_size/models/__init__.py
@@ -0,0 +1 @@
from . import base
130 changes: 130 additions & 0 deletions queue_job_batch_size/models/base.py
@@ -0,0 +1,130 @@
# Copyright 2023 Akretion (http://www.akretion.com).
# @author Florian Mounier <[email protected]>
# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl).

import operator
from functools import reduce

from odoo import models
from odoo.addons.queue_job.delay import Delayable


class DelayableBatchRecordset(object):
    __slots__ = ("delayables", "batch")

    def __init__(
        self,
        recordset,
        priority=None,
        eta=None,
        max_retries=None,
        description=None,
        channel=None,
        identity_key=None,
        batch_size=None,
        batch_count=None,
    ):
        total_records = len(recordset)
        if batch_size:
            batch_count = 1 + total_records // batch_size
        else:
            batch_size = total_records // batch_count
            if total_records % batch_count:
                batch_size += 1

        description = description or "__EMPTY__"
        self.batch = recordset.env["queue.job.batch"].get_new_batch(
            "Batch of %s" % description
        )
        self.delayables = []
        for batch in range(batch_count):
            start = batch * batch_size
            end = min((batch + 1) * batch_size, total_records)
            if end > start:
                self.delayables.append(
                    Delayable(
                        recordset[start:end].with_context(job_batch=self.batch),
                        priority=priority,
                        eta=eta,
                        max_retries=max_retries,
                        description="%s (batch %d/%d)"
                        % (description, batch + 1, batch_count),
                        channel=channel,
                        identity_key=identity_key,
                    )
                )

    @property
    def recordset(self):
        return reduce(operator.or_, self.delayables, set()).recordset

    def __getattr__(self, name):
        def _delay_delayable(*args, **kwargs):
            for delayable in self.delayables:
                func = getattr(delayable, name)

                # FIXME: Find a better way to set default description
                if "__EMPTY__" in delayable.description:
                    description = (
                        func.__doc__.splitlines()[0].strip()
                        if func.__doc__
                        else "{}.{}".format(delayable.recordset._name, name)
                    )
                    delayable.description = delayable.description.replace(
                        "__EMPTY__", description
                    )
                    if "__EMPTY__" in self.batch.name:
                        self.batch.name = self.batch.name.replace(
                            "__EMPTY__", description
                        )
                func(*args, **kwargs).delay()
            self.batch.enqueue()
            return [delayable._generated_job for delayable in self.delayables]

        return _delay_delayable

    def __str__(self):
        recordset = self.delayables[0].recordset
        return "DelayableBatchRecordset(%s%s)" % (
            recordset._name,
            getattr(recordset, "_ids", ""),
        )

    __repr__ = __str__


class Base(models.AbstractModel):
    _inherit = "base"

    def with_delay(
        self,
        priority=None,
        eta=None,
        max_retries=None,
        description=None,
        channel=None,
        identity_key=None,
        batch_size=None,
        batch_count=None,
    ):
        if batch_size or batch_count:
            return DelayableBatchRecordset(
                self,
                priority=priority,
                eta=eta,
                max_retries=max_retries,
                description=description,
                channel=channel,
                identity_key=identity_key,
                batch_size=batch_size,
                batch_count=batch_count,
            )

        return super().with_delay(
            priority=priority,
            eta=eta,
            max_retries=max_retries,
            description=description,
            channel=channel,
            identity_key=identity_key,
        )
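The slicing logic in ``DelayableBatchRecordset.__init__`` can be sketched as standalone Python (not Odoo code; the function name ``split_batches`` is hypothetical). Note that when ``batch_size`` divides the total evenly, the computed ``batch_count`` is one too high, but the ``end > start`` guard drops the empty trailing batch:

```python
# Standalone sketch of the batch-slicing arithmetic used above.
def split_batches(total_records, batch_size=None, batch_count=None):
    """Return (start, end) index pairs mirroring the module's loop."""
    if batch_size:
        batch_count = 1 + total_records // batch_size
    else:
        batch_size = total_records // batch_count
        if total_records % batch_count:
            batch_size += 1
    slices = []
    for batch in range(batch_count):
        start = batch * batch_size
        end = min((batch + 1) * batch_size, total_records)
        if end > start:  # skip the empty trailing batch, if any
            slices.append((start, end))
    return slices

print(len(split_batches(1000, batch_size=30)))  # 34 jobs
print(split_batches(1000, batch_size=30)[-1])   # (990, 1000): last job gets 10
```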
1 change: 1 addition & 0 deletions queue_job_batch_size/readme/CONTRIBUTORS.rst
@@ -0,0 +1 @@
* Florian Mounier <[email protected]>
23 changes: 23 additions & 0 deletions queue_job_batch_size/readme/DESCRIPTION.rst
@@ -0,0 +1,23 @@
This module allows you to seamlessly split a big job into smaller jobs.

It uses ``queue_job_batch`` to group the created jobs into a batch.

Example:

.. code-block:: python

    class ResPartner(models.Model):
        # ...
        def copy_all_partners(self):
            # Duplicate all partners in batches of 30:
            self.with_delay(batch_size=30).copy()

    # ...
    self.env['res.partner'].search([], limit=1000).copy_all_partners()

This will create 34 jobs, each one copying 30 partners (except the last one, which will copy 10), and group them into a batch.

Instead of ``batch_size``, one can also use ``batch_count`` to specify the number of batches to create.

