Merge pull request pulp#3731 from bmbouter/small-http-downloader-fixes
Close HttpDownloader connections properly
bmbouter authored Oct 29, 2018
2 parents 648859f + a324766 commit 21a1d7d
Showing 2 changed files with 8 additions and 8 deletions.
8 changes: 4 additions & 4 deletions docs/plugins/plugin-api/download.rst
@@ -20,7 +20,7 @@ Basic Downloading

The most basic downloading from a url can be done like this:

->>> downloader = HttpDownload('http://example.com/')
+>>> downloader = HttpDownloader('http://example.com/')
>>> result = downloader.fetch()

The example above downloads the data synchronously. The
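The synchronous `fetch()` convenience shown above can be sketched with the standard library alone. This is a hypothetical stand-in, not pulpcore's implementation: `run()` here just sleeps instead of performing an aiohttp request, and `fetch()` blocks until the coroutine completes.

```python
import asyncio


class SketchDownloader:
    """Hypothetical sketch of a downloader with a blocking fetch()."""

    async def run(self):
        # Placeholder for the actual HTTP download work.
        await asyncio.sleep(0)
        return "download-result"

    def fetch(self):
        # Drive the async run() to completion synchronously, mirroring
        # the docs' usage: result = downloader.fetch()
        return asyncio.run(self.run())


downloader = SketchDownloader()
result = downloader.fetch()
```

The real `HttpDownloader.fetch()` similarly drives `run()` on an event loop; only the download body differs.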
@@ -37,8 +37,8 @@ Any downloader in the ``pulpcore.plugin.download`` package can be run in parallel
that ``asyncio`` can schedule in parallel. Consider this example:

>>> download_coroutines = [
->>> HttpDownload('http://example.com/').run(),
->>> HttpDownload('http://pulpproject.org/').run(),
+>>> HttpDownloader('http://example.com/').run(),
+>>> HttpDownloader('http://pulpproject.org/').run(),
>>> ]
>>>
>>> loop = asyncio.get_event_loop()
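The parallel pattern above can be demonstrated with a stdlib-only sketch. `fake_download` is a hypothetical stand-in for `HttpDownloader(url).run()`, and the coroutines are wrapped in tasks because newer Python versions require `asyncio.wait` to receive tasks rather than bare coroutines. Note the call is `asyncio.wait(tasks)`, not `asyncio.wait([tasks])`; wrapping the list in another list is the docstring bug this commit fixes.

```python
import asyncio


async def fake_download(url):
    # Stand-in for HttpDownloader(url).run(); yields control, then "finishes".
    await asyncio.sleep(0)
    return url


async def main():
    tasks = [
        asyncio.ensure_future(fake_download('http://example.com/')),
        asyncio.ensure_future(fake_download('http://pulpproject.org/')),
    ]
    # Pass the iterable of tasks directly -- no extra list wrapping.
    done, not_done = await asyncio.wait(tasks)
    return {task.result() for task in done}


results = asyncio.run(main())
```

`asyncio.wait` returns the completed and pending sets, which is why the docs iterate `done` and call `task.result()` on each entry.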
@@ -86,7 +86,7 @@ supported urls.
information.

.. note::
-All :class:`~pulpcore.plugin.download.HttpDownload` downloaders produced by the same
+All :class:`~pulpcore.plugin.download.HttpDownloader` downloaders produced by the same
remote instance share an `aiohttp` session, which provides a connection pool, connection
reusage and keep-alives shared across all downloaders produced by a single remote.
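The shared-session behavior the note describes can be sketched with a hypothetical remote that lazily creates one session and hands it to every downloader it produces. `FakeSession` stands in for `aiohttp.ClientSession`; the class and method names are illustrative, not pulpcore's API.

```python
class FakeSession:
    """Stand-in for aiohttp.ClientSession; only tracks identity here."""


class Remote:
    """Hypothetical remote that shares one session across downloaders."""

    def __init__(self):
        self._session = None

    @property
    def session(self):
        # Create the session once, on first use.
        if self._session is None:
            self._session = FakeSession()
        return self._session

    def get_downloader(self, url):
        # Every downloader receives the same session object, so the
        # connection pool and keep-alives are shared, as the note says.
        return {"url": url, "session": self.session}


remote = Remote()
d1 = remote.get_downloader('http://example.com/a')
d2 = remote.get_downloader('http://example.com/b')
```

Because both downloaders hold the same session object, closing it is the remote's responsibility, which is exactly why `HttpDownloader.run()` only closes the session when it owns it (the `_close_session_on_finalize` flag in the second file of this diff).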

8 changes: 4 additions & 4 deletions plugin/pulpcore/plugin/download/http.py
@@ -73,12 +73,12 @@ class HttpDownloader(BaseDownloader):
Parallel Download:
>>> download_coroutines = [
->>> HttpDownload('http://example.com/').run(),
->>> HttpDownload('http://pulpproject.org/').run(),
+>>> HttpDownloader('http://example.com/').run(),
+>>> HttpDownloader('http://pulpproject.org/').run(),
>>> ]
>>>
>>> loop = asyncio.get_event_loop()
->>> done, not_done = loop.run_until_complete(asyncio.wait([download_coroutines]))
+>>> done, not_done = loop.run_until_complete(asyncio.wait(download_coroutines))
>>>
>>> for task in done:
>>> try:
@@ -178,5 +178,5 @@ async def run(self, extra_data=None):
to_return = await self._handle_response(response)
await response.release()
if self._close_session_on_finalize:
-self.session.close()
+await self.session.close()
return to_return
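The one-word fix above matters because in aiohttp 3.x `ClientSession.close()` is a coroutine: calling it without `await` only creates a coroutine object that is never run, so the underlying connections are never released. A stdlib-only sketch of the corrected pattern, with `FakeSession` standing in for `aiohttp.ClientSession`:

```python
import asyncio


class FakeSession:
    """Stand-in for aiohttp.ClientSession, whose close() is a coroutine."""

    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True


async def run(session, close_session_on_finalize=True):
    # ... perform the download, handle and release the response ...
    if close_session_on_finalize:
        # Without `await`, close() would merely create a coroutine object
        # and never execute -- the leak this commit fixes.
        await session.close()
    return "to_return"


session = FakeSession()
result = asyncio.run(run(session))
```

After `asyncio.run` completes, `session.closed` is `True`; with the unawaited `session.close()` it would still be `False` (and Python would warn about a never-awaited coroutine).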
