Hi, I'm trying out RetroMAE pretraining of your model on my domain data. Do you release the encoder MLM head and the decoder you used during the pretraining stage, so that I can perform continued pretraining?