# Meta information about training

The trained models saved here follow this naming convention (see the sketch after the list):

- Optuna intermediate model: `[model]-[dataset]-[context].pt`
- Fully trained model: `[model]-[dataset]-full-[context].pt`
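
For example, here is a minimal Python sketch of how such a filename is assembled; the helper name is hypothetical and does not exist in the repository:

```python
from pathlib import Path

def checkpoint_path(model: str, dataset: str, context: int, full: bool = False) -> Path:
    """Build a checkpoint path following the naming convention above.

    Hypothetical helper for illustration only; not part of the codebase.
    """
    stage = "-full" if full else ""
    return Path("models") / f"{model}-{dataset}{stage}-{context}.pt"

# checkpoint_path("cnn", "enwik9", 128)            -> models/cnn-enwik9-128.pt
# checkpoint_path("cnn", "enwik9", 128, full=True) -> models/cnn-enwik9-full-128.pt
```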

The following parameters were used (summarised in the sketch after this list):

- training size: 2048 for Optuna, 209715 for full training
- context sizes: {128, 256}
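
For reference, the same settings written as a small Python mapping; the constant names are illustrative and do not appear in the codebase:

```python
# Illustrative constants only; the names are not taken from the codebase.
TRAIN_SIZES = {"optuna": 2048, "full": 209715}  # samples used per training mode
CONTEXT_SIZES = (128, 256)                      # context lengths trained per model
```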

The models were trained with the following command:

```sh
uv run python ./results/[cnn,autoencoder] train --method [full,optuna] \
    --data-root ./data --dataset [genome,enwik9] --context [128,256] --size [2048,209715] \
    --model-save-path ./models/<name> --model-load-path <if full training: path of the Optuna output>
```
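
As a hedged, end-to-end illustration of how the two stages fit together (Optuna search first, then full training initialised from its output), the Python sketch below drives the same command via subprocess. The concrete model, dataset, and context choices and the checkpoint paths are illustrative only:

```python
import subprocess

# Illustrative choices; any of [cnn,autoencoder] x [genome,enwik9] x {128,256} applies.
model, dataset, context = "cnn", "enwik9", 128

optuna_ckpt = f"./models/{model}-{dataset}-{context}.pt"      # intermediate model
full_ckpt = f"./models/{model}-{dataset}-full-{context}.pt"   # fully trained model

base = ["uv", "run", "python", f"./results/{model}", "train"]
data_args = ["--data-root", "./data", "--dataset", dataset, "--context", str(context)]

# Stage 1: Optuna hyper-parameter search on the 2048-sample subset.
subprocess.run(
    base + ["--method", "optuna"] + data_args
    + ["--size", "2048", "--model-save-path", optuna_ckpt],
    check=True,
)

# Stage 2: full training on 209715 samples, loading the Optuna output.
subprocess.run(
    base + ["--method", "full"] + data_args
    + ["--size", "209715", "--model-save-path", full_ckpt, "--model-load-path", optuna_ckpt],
    check=True,
)
```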