Commit 648adde

remove redundant files
1 parent 6efa70c commit 648adde

5 files changed (+6 -52 lines)


.gitignore (+3 -1)

@@ -213,4 +213,6 @@ log_0824
 log_0828
 deprecated
 data_test
-run.sh
+run.sh
+run_c.sh
+run_parsimon_eval.sh

README.md (-1)

@@ -80,7 +80,6 @@ python gen_ckpt.py --dir_output=/data1/lichenni/m3/ckpts
 Note the checkpoints will be saved in the `ckpts` directory, one is for the Llama-2 model and the other is for the 2-layer MLP model.
 
 7. To replicate paper results in Section 5.2, run the following in the `parsimon-eval/expts/fig_8` directory:
-Note all commands can be found in `parsimon-eval/run.sh`
 
 ```bash
 cargo run --release -- --root=./data --mixes spec/all_dctcp.mix.json ns3-config

clibs/run.sh (-45)

This file was deleted.

gen_ckpt.py (+2 -4)

@@ -34,9 +34,7 @@
 # set parameters
 model_trained_dir=f"{args.dir_output}/m3_shard2000_nflows1_nhosts3_nsamples20_lr10Gbps/version_0"
 output_dir=f"./ckpts"
-model_id="_config_e421"
-# model_id="_e466"
-# model_id="_hpcc_e447"
+model_id=""
 class m3_inference:
     def __init__(self):
         self.bucket_thold = 1
@@ -53,7 +51,7 @@ def __init__(self):
         training_config = config["training"]
         n_params=dataset_config["n_params"]
         model = FlowSimTransformer_Path.load_from_checkpoint(
-            f"{self.dir_train}/checkpoints/best{model_id}.ckpt",
+            f"{self.dir_train}/checkpoints/last{model_id}.ckpt",
             map_location=DEVICE,
             n_layer=model_config["n_layer"],
             n_head=model_config["n_head"],
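
Not part of the commit: a minimal Python sketch of how the checkpoint path resolves after this change, assuming `self.dir_train` points at the trained-model directory set above (the directory value below is a hypothetical placeholder).

```python
# Hypothetical sketch, not repository code: shows how the f-string in
# gen_ckpt.py resolves after this commit (model_id cleared, "last" prefix).
dir_train = "/path/to/m3_shard2000_nflows1_nhosts3_nsamples20_lr10Gbps/version_0"  # placeholder
model_id = ""  # was "_config_e421" before this commit

ckpt_path = f"{dir_train}/checkpoints/last{model_id}.ckpt"
print(ckpt_path)
# -> .../version_0/checkpoints/last.ckpt
# (before this commit it resolved to .../checkpoints/best_config_e421.ckpt)
```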

parsimon-eval

Submodule parsimon-eval updated 1 file
