
Commit 86da823

Fix default component and add missing table divider (#10)
* Fix default component name for Transformers
* Run `uv version 0.2.2`
* Fix `print_report` divider on multiple dtypes
* Empty commit to add co-author

  Contributed with #9, which is a fix for the `__metadata__` validation for Transformers models similar to the one in this PR, hence adding them as co-author for credit, thanks!

  Co-Authored-By: Bae-ChangHyun <[email protected]>

* Run `pre-commit run --all-files`

---------

Co-authored-by: Bae-ChangHyun <[email protected]>
1 parent 6cc92e9 commit 86da823

4 files changed (+12, −11 lines)

4 files changed

+12
-11
lines changed

pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 [project]
 name = "hf-mem"
-version = "0.2.1"
+version = "0.2.2"
 description = "A CLI to estimate inference memory requirements for Hugging Face models, written in Python"
 readme = "README.md"
 authors = [

src/hf_mem/cli.py

Lines changed: 6 additions & 8 deletions
@@ -99,10 +99,9 @@ async def run(
     # set the default component name to `0_Transformer` as defined in modules.json
     if "config_sentence_transformers.json" in file_paths:
         raw_metadata = {"0_Transformer": raw_metadata}
-
-    # NOTE: If the model is a transformers model, then we simply set the component name to `Transformer`, to
-    # make sure that we provide the expected input to the `parse_safetensors_metadata`
-    if "__metadata__" in raw_metadata:
+    else:
+        # NOTE: If the model is a transformers model, then we simply set the component name to `Transformer`, to
+        # make sure that we provide the expected input to the `parse_safetensors_metadata`
         raw_metadata = {"Transformer": raw_metadata}
 
     metadata = parse_safetensors_metadata(raw_metadata=raw_metadata)
@@ -132,10 +131,9 @@ async def fetch_with_semaphore(url: str) -> Dict[str, Any]:
     # set the default component name to `0_Transformer` as defined in modules.json
     if "config_sentence_transformers.json" in file_paths:
         raw_metadata = {"0_Transformer": raw_metadata}
-
-    # NOTE: If the model is a transformers model, then we simply set the component name to `Transformer`, to
-    # make sure that we provide the expected input to the `parse_safetensors_metadata`
-    if "__metadata__" in raw_metadata:
+    else:
+        # NOTE: If the model is a transformers model, then we simply set the component name to `Transformer`, to
+        # make sure that we provide the expected input to the `parse_safetensors_metadata`
         raw_metadata = {"Transformer": raw_metadata}
 
     metadata = parse_safetensors_metadata(raw_metadata=raw_metadata)
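
The same fallback appears in both fetch paths above: any repo that is not a Sentence Transformers one now takes the `else` branch, so its metadata is wrapped under a `Transformer` component even when the safetensors header carries no `__metadata__` entry (the case #9 addressed). A minimal standalone sketch of that branching, using a hypothetical `wrap_component` helper that is not part of `hf_mem` itself:

```python
from typing import Any, Dict, Set


def wrap_component(raw_metadata: Dict[str, Any], file_paths: Set[str]) -> Dict[str, Any]:
    """Illustrative only: mirrors the component-naming logic from `run` in cli.py."""
    # Sentence Transformers repos ship a `config_sentence_transformers.json`, and their
    # first module is conventionally named `0_Transformer` in modules.json.
    if "config_sentence_transformers.json" in file_paths:
        return {"0_Transformer": raw_metadata}
    # Every other (plain Transformers) model is wrapped under a single `Transformer`
    # component, regardless of whether the header contains a `__metadata__` key.
    return {"Transformer": raw_metadata}


# Example: a header without `__metadata__` is still wrapped correctly.
print(wrap_component({"model.embed_tokens.weight": {"dtype": "BF16"}}, {"config.json"}))
# {'Transformer': {'model.embed_tokens.weight': {'dtype': 'BF16'}}}
```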

src/hf_mem/print.py

Lines changed: 4 additions & 1 deletion
@@ -169,7 +169,7 @@ def print_report(
                 for _, dtype_metadata in value.dtypes.items()
             ]
         )
-        for dtype, dtype_metadata in value.dtypes.items():
+        for idx, (dtype, dtype_metadata) in enumerate(value.dtypes.items()):
             gb_text = (
                 f"{_bytes_to_gb(dtype_metadata.bytes_count):.2f} / {_bytes_to_gb(metadata.bytes_count):.2f} GB"
             )
@@ -190,4 +190,7 @@ def print_report(
                 current_len,
             )
 
+            if idx < len(value.dtypes) - 1:
+                _print_divider(current_len + 1)
+
         _print_divider(current_len + 1, "bottom")
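
The `enumerate` change in the first hunk is what makes the new divider possible: the loop now knows which dtype row it is printing, so it can draw a separator after every row except the last one and leave the existing `"bottom"` border to close the table. A rough sketch of the pattern, with a simplified `_print_divider` stand-in (the real helper in `print.py` may have a different signature and styling):

```python
def _print_divider(width: int, kind: str = "middle") -> None:
    # Simplified stand-in for the table helper: draw a horizontal rule.
    left, right = ("└", "┘") if kind == "bottom" else ("├", "┤")
    print(left + "─" * (width - 2) + right)


def print_dtype_rows(dtypes: dict[str, int], width: int = 40) -> None:
    for idx, (dtype, bytes_count) in enumerate(dtypes.items()):
        print(f"│ {dtype}: {bytes_count / 1e9:.2f} GB".ljust(width - 1) + "│")
        # Divider between dtypes only; the last row is followed by the bottom border.
        if idx < len(dtypes) - 1:
            _print_divider(width)

    _print_divider(width, "bottom")


# Example with two dtypes: one divider between them, then the bottom border.
print_dtype_rows({"BF16": 16_060_000_000, "F32": 1_050_000_000})
```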

uv.lock

Lines changed: 1 addition & 1 deletion
Some generated files are not rendered by default.
