Commit 6fb4cf0

Update README

1 parent a4fdbf8 commit 6fb4cf0

File tree

4 files changed: +17 −4 lines

Fine_tuning/PipeStore/main.py (−1 line)

```diff
@@ -54,7 +54,6 @@
 ===================================================
 ASPLOS 2024
 ---------------------------------------------------'''
-print(start_message)
 
 client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
 
```
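The client socket created in the main.py context above can be wrapped in a small connection helper. A minimal sketch, assuming a reachable Tuner endpoint; the `connect_to_tuner` name and its defaults are illustrative (25258 is the default port the README mentions), not the repository's actual API:

```python
import socket

def connect_to_tuner(host="127.0.0.1", port=25258, timeout=5.0):
    """Open a TCP connection to the Tuner server.

    Hypothetical helper: host/port defaults are illustrative; 25258 is
    the port the README cites as the default when none is supplied.
    """
    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.settimeout(timeout)
    client.connect((host, port))
    return client
```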

Fine_tuning/Tuner/server.py (+1 −2 lines)

```diff
@@ -28,12 +28,11 @@
 ===================================================
 ASPLOS 2024
 ---------------------------------------------------'''
-print(start_message)
 
 overall_start = time.perf_counter()
 
 ### Set communication to clients(SOFA SSDs)
-comm = CommUnit(args.split_number, args.num_of_client, args.port, client=CLIENT)
+comm = CommUnit(args.num_of_run, args.num_of_client, args.port, client=CLIENT)
 comm.get_SSD_path()
 comm.send_message(f'dir:{os.path.abspath(os.path.join(os.path.realpath(__file__), os.pardir))}', True)
 comm.get_message("Feature Extraction Started")
```
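The commit's README changes report an elapsed feature-extraction time and a derived throughput; the `overall_start = time.perf_counter()` line in the server.py context above shows the timing pattern. A minimal sketch of that pattern, with illustrative names (`work` and `num_items` are stand-ins, not identifiers from the repository):

```python
import time

def timed_throughput(work, num_items):
    """Run `work` once and return (elapsed_sec, items_per_sec).

    Sketch of the perf_counter-based wall-clock timing used in
    server.py; `work` and `num_items` are illustrative stand-ins.
    """
    start = time.perf_counter()
    work()
    elapsed = time.perf_counter() - start
    return elapsed, num_items / elapsed
```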

Offline_inference/resnet50.onnx (97.4 MB)

Binary file not shown.

README.md (+16 −1 lines)

````diff
@@ -153,6 +153,15 @@ trtexec --onnx=resnet50.onnx --workspace=8192 --saveEngine=resnet50.engine --bui
 ```
 - The script uses a command-line argument for the port if provided; otherwise, it defaults to 25258.
 
+3. Review the inference results. In this test, we provide two pieces of information: the elapsed time and throughput of feature extraction (executed in PipeStore) and the overall fine-tuning time.
+
+```
+# In AWS setting
+Feature extraction time (sec): 31.360074814000654
+Feature extraction throughput (image/sec): 1913.2607417509444
+Overall fine-tuning time (sec): 75.19124622900017
+```
+
 ## Installation & Execution (Offline Inference)
 
 For offline inference evaluation, we provide a simple test code. This code can be executed on the PipeStore side.
@@ -161,7 +170,7 @@ For offline inference evaluation, we provide a simple test code. This code can b
 
 The prerequisites for offline inference are almost identical to those required for fine-tuning.
 
-1. Follow steps 1-7 of the fine-tuning guide.
+1. Follow steps 1-7 of the fine-tuning guide (only PipeStore setup is needed).
 
 2. Install the deflate module by following the instructions below:
 ```
@@ -177,6 +186,12 @@ The prerequisites for offline inference are almost identical to those required f
 ../Offline_inference# unzip inference_dataset.zip
 ```
 
+4. Compile the model specifically for your GPU:
+```
+trtexec --onnx=resnet50.onnx --workspace=8192 --saveEngine=resnet50.engine --buildOnly --inputIOFormats=fp16:chw --outputIOFormats=fp16:chw --fp16
+```
+
+
 ### Running the System (Takes < 1 min to run)
 
 1. Execute the following command: the first argument specifies the path to the model's engine, and the second argument specifies the dataset path.
````
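The README notes that the scripts take the port as an optional command-line argument, defaulting to 25258. A minimal sketch of that behavior; the `get_port` helper is hypothetical, and the repository's actual argument handling may differ:

```python
import sys

def get_port(argv=None, default=25258):
    """Return the port from the first CLI argument, else the default.

    Hypothetical helper mirroring the README's note that the port
    defaults to 25258 when not supplied on the command line.
    """
    argv = sys.argv if argv is None else argv
    return int(argv[1]) if len(argv) > 1 else default
```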
