
ELECTRA (Efficiently Learning an Encoder that Classifies Token Replacements Accurately) resource

  • ELECTRA (Efficiently Learning an Encoder that Classifies Token Replacements Accurately) is a pre-training approach that aims to match or exceed the downstream performance of a pretrained masked language model (MLM) while using significantly less compute at the pretraining stage. The ELECTRA framework consists of a generator and a discriminator, similar in structure to a generative adversarial network (GAN), and is trained on a pretraining task called “replaced token detection”. The generator is a small masked language model, such as BERT, that tries to predict the original identity of randomly masked input tokens. The generator’s output is then fed into the discriminator, which predicts whether each input token is the original or was replaced by the generator.
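The replaced-token-detection setup described above can be sketched in plain Python. This is a minimal, hypothetical illustration (not the actual ELECTRA implementation): `toy_generator` stands in for the small MLM generator by corrupting random positions, and the labeling function builds the binary targets the discriminator is trained on. Note that when the generator happens to produce the original token, the position is labeled “original”, as in the paper.

```python
import random

def replaced_token_detection_labels(original, corrupted):
    """Binary targets for the discriminator: 1 = replaced, 0 = original.
    A position where the generator reproduced the original token counts
    as original (label 0), matching the ELECTRA objective."""
    return [int(o != c) for o, c in zip(original, corrupted)]

def toy_generator(tokens, mask_prob=0.15, vocab=None, seed=0):
    """Hypothetical stand-in for the small MLM generator: at randomly
    'masked' positions it samples a replacement token from the vocab
    (which may coincide with the original token)."""
    rng = random.Random(seed)
    vocab = vocab or ["the", "cat", "sat", "on", "mat", "dog", "ran"]
    out = []
    for t in tokens:
        if rng.random() < mask_prob:
            out.append(rng.choice(vocab))  # generator's predicted token
        else:
            out.append(t)                  # position left unmasked
    return out

original = ["the", "cat", "sat", "on", "the", "mat"]
corrupted = toy_generator(original, mask_prob=0.5)
labels = replaced_token_detection_labels(original, corrupted)
```

In real ELECTRA the discriminator is a full transformer encoder scoring every token, which is why pretraining is sample-efficient: the loss is defined over all input positions rather than only the ~15% that were masked.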

For more details visit:

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (paper explained)

ELECTRA

OpenReview


by Devansh Shukla

"AI Tamil Nadu formely known as AI Coimbatore is a close-Knit community initiative by Navaneeth with a goal to offer world-class AI education to anyone in Tamilnadu for free."