
# Meta information about training

The trained models saved here follow this naming convention:

- Optuna intermediate model: `[model]-[dataset]-[context].pt`
- Fully trained model: `[model]-[dataset]-full-[context].pt`
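
The convention above can be checked programmatically. Below is a minimal, hypothetical helper (not part of the repository) that parses a saved-model filename into its components:

```python
import re

# Matches both naming patterns documented above:
#   Optuna intermediate: [model]-[dataset]-[context].pt
#   Fully trained:       [model]-[dataset]-full-[context].pt
PATTERN = re.compile(
    r"^(?P<model>[^-]+)-(?P<dataset>[^-]+)(?P<full>-full)?-(?P<context>\d+)\.pt$"
)

def parse_model_name(name: str) -> dict:
    """Split a saved-model filename into model, dataset, full flag, context."""
    m = PATTERN.match(name)
    if m is None:
        raise ValueError(f"unexpected model filename: {name}")
    return {
        "model": m.group("model"),
        "dataset": m.group("dataset"),
        "full": m.group("full") is not None,
        "context": int(m.group("context")),
    }
```

For example, `parse_model_name("cnn-enwik9-full-256.pt")` identifies a fully trained CNN on enwik9 with context 256.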

The following parameters were used:

- Training size: 2048 for Optuna, 209715 for full training
- Context sizes: {128, 256}

The models were trained with the following command:

```shell
uv run python ./results/[cnn,autoencoder] train --method [full,optuna] \
    --data-root ./data --dataset [genome,enwik9] --context [128,256] --size [2048,209715] \
    --model-save-path ./models/<name> --model-load-path <path if full training, output from optuna>
```
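
As a sketch of how the bracketed options expand, the following hypothetical Python helper (not part of the repository) assembles one concrete invocation; the flag names are taken from the command above, while the file paths are illustrative:

```python
import shlex

def build_train_command(model, method, dataset, context, size,
                        save_path, load_path=None):
    """Assemble the training command line documented in this README."""
    cmd = [
        "uv", "run", "python", f"./results/{model}", "train",
        "--method", method,
        "--data-root", "./data",
        "--dataset", dataset,
        "--context", str(context),
        "--size", str(size),
        "--model-save-path", save_path,
    ]
    if load_path is not None:  # only used for full training, after an Optuna run
        cmd += ["--model-load-path", load_path]
    return cmd

# Example: full training of the CNN on enwik9 with context 256,
# loading the weights produced by an earlier Optuna run.
print(shlex.join(build_train_command(
    "cnn", "full", "enwik9", 256, 209715,
    "./models/cnn-enwik9-full-256.pt",
    load_path="./models/cnn-enwik9-256.pt",
)))
```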