Hi dbmdz team,

it's me again ^^

I just saw that there is a PyTorch model for distilbert-base-german-cased in Hugging Face's repo. After my last test with the bigger model, we, the IKON team at FU Berlin's HCC lab, would be super excited to use these models in our application. Did you also run this distillation experiment by any chance, and do you have the TF checkpoints lying around?
Thanks for your interest, and great to hear that our model is useful for you :)

We used the Hugging Face implementation for the distillation; it takes the "original" PyTorch model as input and also outputs the distilled model as a PyTorch model, so there's no TensorFlow checkpoint that I could provide 😅 But I think it is possible to convert the model into a TensorFlow-compatible one (e.g. distilbert-base-multilingual-cased has a TensorFlow version), so I will try to convert it and report back here!
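For reference, here's a minimal sketch of the conversion I have in mind, using the transformers library (assuming a version that ships the TF model classes, with both PyTorch and TensorFlow installed; the output directory name is just an example):

```python
from transformers import TFDistilBertModel

# Load the PyTorch weights directly into the TensorFlow model class.
# `from_pt=True` converts the PyTorch checkpoint on the fly.
tf_model = TFDistilBertModel.from_pretrained(
    "distilbert-base-german-cased", from_pt=True
)

# Save a TensorFlow checkpoint (tf_model.h5 + config.json) that can
# later be loaded without needing PyTorch or `from_pt=True`.
tf_model.save_pretrained("distilbert-base-german-cased-tf")
```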
Any updates on the German DistilBERT TF checkpoint?

We are running a few experiments on BERT and would also like to play around with DistilBERT. We are using Hugging Face, and unfortunately they don't seem to provide a TF checkpoint for DistilBERT.