About ASTROMER

ASTROMER Architecture

ASTROMER's architecture is based on the Transformer proposed by Vaswani et al. (2017), following an encoder-decoder design.
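To make the Transformer reference concrete, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the architecture cited above. This is an illustrative implementation of the general mechanism from Vaswani et al. (2017), not code from the ASTROMER repository:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, as in Vaswani et al. (2017).

    Q, K, V: 2-D arrays of shape (sequence length, feature dim).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # pairwise similarity scores
    # Numerically stable row-wise softmax over the scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                        # convex combination of values
```

Each output row is a weighted average of the value rows, with weights determined by query-key similarity; ASTROMER applies this mechanism to sequences of light-curve observations.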



Pre-trained Weights

You can easily download the pre-trained weights of ASTROMER's model:

| Version tag | Pre-training data | Description | Test RMSE / R-Square | Link |
|---|---|---|---|---|
| v0 | MACHO | Paper's model | 0.147 / 0.80 | Download |
| v1* | MACHO | Mask token and residual connections added | 0.113 / 0.72 | Download |

(*) Best performance to date
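If you need to pick a weight version programmatically, the table above can be mirrored as metadata and queried by test RMSE. A minimal sketch (the dictionary below simply restates the table; the actual download URLs from the Link column would be added alongside it):

```python
# Metadata mirroring the pre-trained weights table above
# (version tags without the "best performance" asterisk).
WEIGHTS = {
    "v0": {"data": "MACHO", "rmse": 0.147, "r_square": 0.80},
    "v1": {"data": "MACHO", "rmse": 0.113, "r_square": 0.72},
}

def best_version(weights):
    """Return the version tag with the lowest test RMSE."""
    return min(weights, key=lambda v: weights[v]["rmse"])
```

Selecting by lowest RMSE matches the footnote in the table: `best_version(WEIGHTS)` yields `"v1"`.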


Contributing to ASTROMER

If you train ASTROMER from scratch, you can share your pre-trained weights by submitting a pull request to the weights repository.

More information

You can learn more about the project in the following links: