This repository has been archived by the owner on Apr 11, 2022. It is now read-only.

Move node watcher task to cloud function #324

Open · wants to merge 1 commit into base: master

Conversation

@niklabh (Contributor) commented Apr 27, 2020

Possible solution for memory leak: #267

We can move the read/write part of the running task to a cloud function (https://cloud.google.com/functions). Cloud Functions are serverless compute instances that start up and tear down for each invocation, so the memory leak issue will not arise.

The long-running script will just invoke the cloud function with a blockIndex.
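A rough sketch of what the per-block handler could look like, assuming an HTTP-triggered Node.js function (Express-style request/response, as Google Cloud Functions uses); `nodeWatcher` and `processBlock` are hypothetical names standing in for the existing task logic:

```ts
import { ApiPromise, WsProvider } from '@polkadot/api';
import type { Hash } from '@polkadot/types/interfaces';
import type { Request, Response } from 'express';

// Placeholder for the existing read/write task logic that persists
// block data to Prisma (hypothetical name, not part of this PR).
async function processBlock(api: ApiPromise, blockHash: Hash): Promise<void> {
  // ...read events at blockHash and write them to the Prisma endpoint...
}

// HTTP-triggered function: handles exactly one blockIndex per invocation,
// then disconnects so nothing outlives the call.
export async function nodeWatcher(req: Request, res: Response): Promise<void> {
  const blockIndex = Number(req.body.blockIndex);
  if (!Number.isInteger(blockIndex)) {
    res.status(400).send('blockIndex must be an integer');
    return;
  }

  const provider = new WsProvider(process.env.ARCHIVE_NODE_ENDPOINT);
  const api = await ApiPromise.create({ provider });

  try {
    const blockHash = await api.rpc.chain.getBlockHash(blockIndex);
    await processBlock(api, blockHash);
    res.status(200).json({ blockIndex, blockHash: blockHash.toHex() });
  } finally {
    await api.disconnect();
  }
}
```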

To test locally:

Start the task runner (cloud function):

ARCHIVE_NODE_ENDPOINT=ws://127.0.0.1:9944 PRISMA_ENDPOINT=http://0.0.0.0:4466 PORT=4489 yarn start:runner

Start the long-running loop:

MAX_LAG=5 BLOCK_IDENTIFIER=aasdsdasdadssa ARCHIVE_NODE_ENDPOINT=ws://127.0.0.1:9944 TASK_RUNNER_SERVER=http://localhost:4489 PRISMA_ENDPOINT=http://0.0.0.0:4466 yarn start

The long-running loop will invoke the cloud function with an incrementing blockIndex. The cloud function will execute and tear down, releasing memory. The invocation rate will match the speed of block production.
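A minimal sketch of the long-running side, assuming the runner is reachable at TASK_RUNNER_SERVER and accepts a JSON body with blockIndex (the endpoint path and payload shape are assumptions); the header subscription is what keeps the invocation rate in step with block production:

```ts
import { ApiPromise, WsProvider } from '@polkadot/api';
import fetch from 'node-fetch';

async function main(): Promise<void> {
  const provider = new WsProvider(process.env.ARCHIVE_NODE_ENDPOINT);
  const api = await ApiPromise.create({ provider });

  // One invocation per new block: the subscription fires at block-production
  // speed, so the cloud function is called once per block.
  await api.rpc.chain.subscribeNewHeads(async (header) => {
    const blockIndex = header.number.toNumber();
    // Assumed request shape for illustration only.
    await fetch(`${process.env.TASK_RUNNER_SERVER}`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ blockIndex }),
    });
  });
}

main().catch((error) => {
  console.error(error);
  process.exit(1);
});
```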

@niklabh (Contributor, Author) commented Apr 27, 2020

I need help with setting up the cloud functions; I need permissions.

@pmespresso (Contributor) left a comment

ah cool! thank you!

my only question is whether this would slow the task runner down significantly, since the cloud function needs to start up (opening and then closing a new WebSocket connection, as well as instantiating and decorating the PolkadotJS API) on every block index.

It's worth a go, anyway.

@niklabh (Contributor, Author) commented Apr 28, 2020

It will not slow down the task runner. Cloud Functions are designed so that only the first few invocations pay the startup cost; after that the function is considered warmed up and subsequent invocations take only a few milliseconds.
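One common way to keep warm invocations cheap (a variant on tearing everything down each call) is to hold the API connection in module scope so a warm instance reuses it; a hedged sketch, assuming the @polkadot/api setup above:

```ts
import { ApiPromise, WsProvider } from '@polkadot/api';

// Cached at module scope: a cold start pays the connect/decorate cost once,
// and warm invocations on the same instance reuse the open connection.
let apiPromise: Promise<ApiPromise> | undefined;

export function getApi(): Promise<ApiPromise> {
  if (!apiPromise) {
    apiPromise = ApiPromise.create({
      provider: new WsProvider(process.env.ARCHIVE_NODE_ENDPOINT),
    });
  }
  return apiPromise;
}
```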
