
[Feat]: add Logger API #284

Merged
merged 17 commits into from
Sep 23, 2024

Conversation

aniketmaurya
Collaborator

@aniketmaurya aniketmaurya commented Sep 20, 2024

Before submitting
  • Was this discussed/agreed via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

⚠️ How does this PR impact the user? ⚠️

Provides a clean, easy-to-use interface for logging metrics, such as connecting with Prometheus.


What does this PR do?

Provides a Logger API and LitAPI.log to record key-value metric pairs, which are then processed by the configured loggers.

  1. Logger API: subclasses implement a process method that handles each key-value metric pair.
  2. LitAPI.log: passes a key-value pair to the loggers.


The following example shows the Logger API used together with LitAPI.log from a callback.

import time
import litserve as ls

class FileLogger(ls.Logger):
    def process(self, key, value):
        # Called for every key-value pair sent via LitAPI.log.
        with open("test_logger_temp.txt", "a+") as f:
            f.write(f"{key}: {value:.1f}\n")

class PredictionTimeLogger(ls.Callback):
    def on_before_predict(self, lit_api):
        # Record the start time just before predict runs.
        t0 = time.perf_counter()
        self._start_time = t0

    def on_after_predict(self, lit_api):
        t1 = time.perf_counter()
        elapsed = t1 - self._start_time
        print(f"Prediction took {elapsed:.2f} seconds", flush=True)
        # Forward the metric to all configured loggers.
        lit_api.log("prediction_time", elapsed)

if __name__ == '__main__':
    lit_api = ls.test_examples.SimpleLitAPI()
    server = ls.LitServer(lit_api, callbacks=[PredictionTimeLogger()], loggers=FileLogger())
    server.run()

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@aniketmaurya aniketmaurya changed the title add logger [Feat]: add Logger API Sep 20, 2024
@aniketmaurya aniketmaurya added the enhancement New feature or request label Sep 20, 2024
@aniketmaurya aniketmaurya self-assigned this Sep 20, 2024

codecov bot commented Sep 20, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 95%. Comparing base (44e0fe9) to head (652a690).
Report is 1 commit behind head on main.

Additional details and impacted files
@@         Coverage Diff         @@
##           main   #284   +/-   ##
===================================
  Coverage    95%    95%           
===================================
  Files        18     19    +1     
  Lines      1173   1241   +68     
===================================
+ Hits       1112   1181   +69     
+ Misses       61     60    -1     

@williamFalcon
Contributor

williamFalcon commented Sep 21, 2024

first of all, this looks great overall. nice job. a few nits:

can a user call self.log inside the litapi like you can in PTL without any callbacks for example?

also, cases to test:

  • self.log in litapi without any callbacks or loggers.
  • self.log in litapi with a logger
  • no self.log, with a logger
  • no self.log, with a logger and a callback

also, what is the speed impact of adding a logger? did we do it in such a way where we won’t slow down the server processing?

@aniketmaurya
Collaborator Author

aniketmaurya commented Sep 21, 2024

Thank you @williamFalcon 🙏

can a user call self.log inside the litapi like you can in PTL without any callbacks for example?

Yes, it's possible to call self.log inside LitAPI without any callback. We assert this in tests.

also, cases to test:

  • self.log in litapi without any callbacks or loggers: included in tests
  • self.log in litapi with a logger: included in tests
  • no self.log, with a logger: included in tests
  • no self.log, with a logger and a callback: will add this in tests

also, what is the speed impact of adding a logger? did we do it in such a way where we won’t slow down the server processing?

We put the logs in a queue, which is then processed in a separate process without blocking the inference worker. We landed the streaming test too before this PR to make sure we don't regress.

@williamFalcon
Contributor

nice. i’m good with it. just need @lantiga or @tchaton to sanity check

Collaborator

@lantiga lantiga left a comment


Looks good to me.

Just calling out a possible failure mode we'll need to take into account (either here or in a future PR).

@aniketmaurya aniketmaurya merged commit 92b0dd5 into main Sep 23, 2024
19 of 20 checks passed
@aniketmaurya aniketmaurya deleted the aniket/feat/logger branch September 23, 2024 03:42
@vrdn-23
Contributor

vrdn-23 commented Oct 2, 2024

@aniketmaurya
Does the example provided for PrometheusLogger still work with the merged code? I've been trying to set it up with the code from the main branch, but I'm getting this error during start-up.

This is the example that I'm trying to get working

from prometheus_client import Counter
import litserve as ls

class PrometheusLogger(ls.Logger):
    def __init__(self):
        super().__init__()
        self._metric_counter = Counter('log_entries', 'Count of log entries')

    def process(self, key, value):
        # Increment the Prometheus counter for each log entry
        self._metric_counter.inc()
        print(f"Logged {key}: {value}")

This is the error that gets thrown when I try to start the server

INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/vidamoda/dev/serving/small-model-custom-server/small_model_custom_server/run_custom_server.py", line 44, in <module>
    server.run(port=8000, generate_client_file=False, log_level=settings.log_level.lower())
  File "/Users/vidamoda/dev/LitServe/src/litserve/server.py", line 445, in run
    manager, litserve_workers = self.launch_inference_worker(num_api_servers)
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vidamoda/dev/LitServe/src/litserve/server.py", line 216, in launch_inference_worker
    self._logger_connector.run(self)
  File "/Users/vidamoda/dev/LitServe/src/litserve/loggers.py", line 143, in run
    process.start()
  File "/Users/vidamoda/.pyenv/versions/3.12.1/lib/python3.12/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
                  ^^^^^^^^^^^^^^^^^
  File "/Users/vidamoda/.pyenv/versions/3.12.1/lib/python3.12/multiprocessing/context.py", line 289, in _Popen
    return Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/vidamoda/.pyenv/versions/3.12.1/lib/python3.12/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/Users/vidamoda/.pyenv/versions/3.12.1/lib/python3.12/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/Users/vidamoda/.pyenv/versions/3.12.1/lib/python3.12/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/Users/vidamoda/.pyenv/versions/3.12.1/lib/python3.12/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'MultiProcessValue.<locals>.MmapedValue'

The issue seems to be related to pickling the Prometheus metric, but I wanted to know if you have a minimal working example that could help me get started, just so I can rule out user error.
Thanks again for all the great work regarding the project!

@aniketmaurya
Collaborator Author

hey @vrdn-23, created a fix for this here.
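A common workaround for this class of spawn-pickling errors is to defer creating the unpicklable object until first use, so it is constructed inside the logger process rather than pickled across the spawn boundary. A minimal sketch of that pattern, with a plain dict as a hypothetical stand-in for prometheus_client.Counter (this is an illustration, not necessarily the actual fix linked above):

```python
import pickle


class LazyCounterLogger:
    # Hypothetical stand-in: nothing unpicklable is stored at
    # construction time, so the instance survives spawn pickling.
    def __init__(self):
        self._counter = None

    def _ensure_counter(self):
        if self._counter is None:
            # In real code this would be prometheus_client.Counter(...),
            # created lazily inside the consumer process.
            self._counter = {"log_entries": 0}
        return self._counter

    def process(self, key, value):
        self._ensure_counter()["log_entries"] += 1
```

Because _counter is still None when the server pickles the logger for the worker process, the counter object is only ever created in the process that uses it.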

Labels
enhancement New feature or request
4 participants