Commit bca4daf: keep ruff happy
ajtejankar committed Oct 28, 2024 (1 parent: 5dfb5e5)
Showing 1 changed file with 1 addition and 1 deletion.
server/lorax_server/models/flash_causal_lm.py (2 changes: 1 addition & 1 deletion)
@@ -40,7 +40,7 @@
     warmup_mode,
 )
 from lorax_server.utils.tokenizer import TokenizerManager
-from lorax_server.utils.torch_utils import is_fp8_kv, is_fp8_supported, is_fp8
+from lorax_server.utils.torch_utils import is_fp8, is_fp8_kv, is_fp8_supported
 from lorax_server.utils.weights import Weights
 
 ADAPTER_MEMORY_FRACTION = float(os.getenv("ADAPTER_MEMORY_FRACTION", "0.1"))
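The one-line change reorders the names inside the `from ... import ...` statement so that ruff's isort-derived import-sorting checks (its `I` rule family) pass: names are sorted lexicographically, and a name that is a strict prefix of another (`is_fp8` vs. `is_fp8_kv`) sorts first. A minimal sketch of that ordering, using Python's built-in string comparison rather than ruff itself:

```python
# Ruff sorts the names in a from-import lexicographically, which is the
# same order Python's sorted() produces for strings. The commit moves
# is_fp8 to the front because a prefix sorts before its extensions.
names = ["is_fp8_kv", "is_fp8_supported", "is_fp8"]  # order before the commit
print(sorted(names))  # ['is_fp8', 'is_fp8_kv', 'is_fp8_supported']
```

The sorted result matches the import line on the `+` side of the diff.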
