# Meta information about training

The trained models saved here follow this naming convention:

* Optuna intermediate model: `[model]-[dataset]-[context].pt`
* Fully trained model: `[model]-[dataset]-full-[context].pt`
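For example, a hypothetical CNN model trained on enwik9 with a context of 128 would produce the filenames shown below (the concrete values are illustrative, not taken from this repository):

```bash
# Illustrative only: substitute the actual model, dataset, and context.
model=cnn; dataset=enwik9; context=128
echo "${model}-${dataset}-${context}.pt"       # Optuna intermediate model: cnn-enwik9-128.pt
echo "${model}-${dataset}-full-${context}.pt"  # fully trained model: cnn-enwik9-full-128.pt
```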
The following parameters were used:

* training size: 2048 for Optuna, 209715 for full training
* context sizes: {128, 256}
The models were trained with the following command:

```bash
uv run python ./results/[cnn,autoencoder] train --method [full,optuna] \
  --data-root ./data --dataset [genome,enwik9] --context [128,256] --size [2048,209715] \
  --model-save-path ./models/<name> --model-load-path <path if full training, output from optuna>
```
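As a concrete sketch, assuming a CNN trained on enwik9 with a context of 128, an Optuna search followed by full training might be invoked as follows; the checkpoint names are hypothetical and simply follow the naming convention above:

```bash
# Hypothetical Optuna search run (2048 samples); saves the intermediate model.
uv run python ./results/cnn train --method optuna \
  --data-root ./data --dataset enwik9 --context 128 --size 2048 \
  --model-save-path ./models/cnn-enwik9-128.pt

# Hypothetical full training run (209715 samples); loads the Optuna output.
uv run python ./results/cnn train --method full \
  --data-root ./data --dataset enwik9 --context 128 --size 209715 \
  --model-save-path ./models/cnn-enwik9-full-128.pt \
  --model-load-path ./models/cnn-enwik9-128.pt
```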