# Meta information about training

The trained models saved here follow this naming convention:

* Optuna intermediate model: `[model]-[dataset]-[context].pt`
* Fully trained model: `[model]-[dataset]-full-[context].pt`

The following parameters were used:

* training size: 2048 for Optuna, 209715 for full training
* context sizes: {128, 256}

The models were trained with the following command:

```bash
uv run python ./results/[cnn,autoencoder] train --method [full,optuna] \
  --data-root ./data --dataset [genome,enwik9] --context [128,256] --size [2048,209715] \
  --model-save-path ./models/ --model-load-path
```
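As a sketch, the full-training parameter grid above could be driven by a loop like the following. This is a dry run that only prints one command per (model, dataset, context) combination; the paths and flags are taken from the command above, except the trailing `--model-load-path` flag, whose value is not specified here and is therefore omitted.

```bash
# Print one full-training command per (model, dataset, context) combination.
# Dry run: pipe the output to "bash" (or drop the echo) to actually train.
print_training_commands() {
  for model in cnn autoencoder; do
    for dataset in genome enwik9; do
      for context in 128 256; do
        echo "uv run python ./results/${model} train --method full" \
          "--data-root ./data --dataset ${dataset} --context ${context}" \
          "--size 209715 --model-save-path ./models/"
      done
    done
  done
}

print_training_commands
```

The same loop works for the Optuna runs by swapping `--method full` for `--method optuna` and `--size 209715` for `--size 2048`.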