ASTROMER is based on the Transformer proposed by Vaswani et al. (2017), following an encoder-decoder design.
You can easily download ASTROMER's pre-trained weights:
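As a minimal sketch, loading the pre-trained weights might look like the following. This assumes the `ASTROMER` pip package (`pip install ASTROMER`); the class and method names below follow the package's published examples and may differ between versions.

```python
# Hedged sketch: assumes the `ASTROMER` pip package is installed
# (pip install ASTROMER); names may vary across package versions.

def load_pretrained(dataset: str = "macho"):
    """Download and return an ASTROMER encoder pre-trained on `dataset`."""
    # Import is deferred so this module can be read without the package installed.
    from ASTROMER.models import SingleBandEncoder
    model = SingleBandEncoder()
    model.from_pretraining(dataset)  # fetches and loads the published weights
    return model

# Usage (requires the package and an internet connection):
# model = load_pretrained("macho")
```

The deferred import keeps the helper importable even where the package is absent; the actual download happens only when `load_pretrained` is called.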
If you train a model from scratch, you can share your pre-trained weights by submitting a pull request to the weights repository.
You can learn more about the project in the following publications:
Transformer-Based Astronomical Time Series Model with Uncertainty Estimation for Detecting Misclassified Instances (2024). Martina Cádiz-Leyton, Guillermo Cabrera-Vives, Pavlos Protopapas, Daniel Moreno-Cartagena, Cristobal Donoso-Oliva.

Positional Encodings for Light Curve Transformers: Playing with Positions and Attention (2023). Daniel Moreno-Cartagena, Guillermo Cabrera-Vives, Pavlos Protopapas, Cristobal Donoso-Oliva, Manuel Pérez-Carrasco, Martina Cádiz-Leyton.
ASTROMER: A transformer-based embedding for the representation of light curves (2022). C. Donoso-Oliva, I. Becker, P. Protopapas, G. Cabrera-Vives, Vishnu M., Harsh Vardhan.