HILBERT
HILBERT is a BERT-Large model for Hungarian, trained on the 4 BN NYTI-BERT corpus. BERT, one of the pioneering transformer models, has been a sweeping success in neural NLP.
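BERT-style models are pretrained with a masked language modeling objective: a fraction of the input tokens is hidden and the model learns to predict them from context. The toy sketch below illustrates only the masking step, on a whitespace-split Hungarian sentence; the `mask_tokens` helper, the 15% rate, and the example sentence are illustrative assumptions, not part of HILBERT's actual pipeline.

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=1):
    """Randomly hide a fraction of tokens, as in BERT's masked-LM pretraining.

    Returns the corrupted sequence and the positions the model must predict.
    Illustrative only: real BERT sometimes keeps or randomly swaps a masked
    token instead of always inserting [MASK].
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(mask_token)
            targets[i] = tok  # the model is trained to recover this token
        else:
            masked.append(tok)
    return masked, targets

# "a kutya a kertben játszik" = "the dog plays in the garden"
sentence = "a kutya a kertben játszik".split()
masked, targets = mask_tokens(sentence)
print(masked)
print(targets)
```

Filling the recorded target tokens back into the masked positions reproduces the original sentence, which is exactly what the pretraining loss rewards the model for doing.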
HIL-ELECTRA
ELECTRA borrows the generator/discriminator setup of GANs (generative adversarial networks): a small generator replaces some input tokens, and the main model learns to detect which tokens were replaced. Two ELECTRA models were trained:
- ELECTRA wiki: Trained on Hungarian Wikipedia. Training time: ~5 days.
- ELECTRA NYTI-BERT: Trained on NYTI-BERT v1 corpus (contains the Hungarian Wikipedia). Training time: ~7 days.
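The replaced-token-detection idea behind both ELECTRA models can be sketched in a few lines. In this toy version a random substitution stands in for ELECTRA's small masked-LM generator, and the tiny vocabulary and sentence are made-up examples; only the overall corrupt-then-label scheme reflects the actual method.

```python
import random

def corrupt(tokens, vocab, replace_prob=0.15, seed=1):
    """Toy 'generator': randomly substitute some tokens with vocabulary words.

    Real ELECTRA uses a small masked-LM generator to propose plausible
    replacements; random substitution keeps this sketch self-contained.
    Returns the corrupted sequence and a 0/1 label per token.
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_prob:
            new = rng.choice([w for w in vocab if w != tok])
            corrupted.append(new)
            labels.append(1)  # replaced: the discriminator should flag it
        else:
            corrupted.append(tok)
            labels.append(0)  # original: the discriminator should pass it
    return corrupted, labels

vocab = ["a", "kutya", "macska", "kertben", "házban", "játszik", "alszik"]
tokens = "a kutya a kertben játszik".split()
corrupted, labels = corrupt(tokens, vocab)
print(corrupted)
print(labels)
```

The discriminator is trained to predict `labels` from `corrupted`, so it receives a learning signal at every position rather than only at the ~15% of masked positions, which is why ELECTRA pretraining is comparatively sample-efficient.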
For further details see this page »
To DOWNLOAD the models, please fill out the registration form: » REGISTRATION FORM «