Commit
fix uncompressed path
Sara Adkins committed Jun 19, 2024
1 parent 9a28cd7 commit 6ec50d5
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion examples/llama7b_sparse_quantized/README.md
@@ -73,10 +73,12 @@ run the following in the same Python instance as the previous steps.

```diff
 import torch
+import os
 from sparseml.transformers import SparseAutoModelForCausalLM

 compressed_output_dir = "output_llama7b_2:4_w4a16_channel_compressed"
-model = SparseAutoModelForCausalLM.from_pretrained(output_dir, torch_dtype=torch.bfloat16)
+uncompressed_path = os.path.join(output_dir, "stage_quantization")
+model = SparseAutoModelForCausalLM.from_pretrained(uncompressed_path, torch_dtype=torch.bfloat16)
 model.save_pretrained(compressed_output_dir, save_compressed=True)
```
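The fix itself is just path construction: instead of loading from the run's top-level output directory, the model is loaded from the `stage_quantization` subdirectory, where the uncompressed quantized weights live. A minimal sketch of that path logic, using a hypothetical `output_dir` value standing in for the directory produced by the earlier steps:

```python
import os

# hypothetical placeholder for the output directory from the earlier steps
output_dir = "output_llama7b_2:4_w4a16_channel"

# the corrected path points at the quantization stage's uncompressed weights
uncompressed_path = os.path.join(output_dir, "stage_quantization")
print(uncompressed_path)
```

On a POSIX filesystem this yields `output_llama7b_2:4_w4a16_channel/stage_quantization`, which is then passed to `from_pretrained` in place of `output_dir`.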

