# Introduction
This recipe includes several ASR models trained on the Alimeeting (far) dataset.

[./RESULTS.md](./RESULTS.md) contains the latest results.
# Transducers
There are various folders whose names contain `transducer` in this folder.
The following table lists the differences among them; a sketch of the pruned RNN-T loss computation follows the table.
|                                | Encoder              | Decoder            | Comment                    |
|--------------------------------|----------------------|--------------------|----------------------------|
| `pruned_transducer_stateless2` | Conformer (modified) | Embedding + Conv1d | Using k2 pruned RNN-T loss |
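
The "k2 pruned RNN-T loss" in the table is computed in two stages: a cheap "simple" loss first produces gradients that indicate which cells of the (frames × symbols) lattice matter, and the exact loss is then evaluated only inside those pruned ranges. The sketch below illustrates that flow with k2's loss functions; the tensor shapes, the `prune_range` value, and the trivial "joiner" (a plain sum of the pruned activations) are assumptions for illustration, not the recipe's actual model code, and the keyword arguments assume a reasonably recent k2 release.

```python
import k2
import torch

# Assumed toy dimensions: batch B, frames T, symbols U, vocab size V.
B, T, U, V = 2, 100, 20, 500
blank_id = 0
prune_range = 5  # number of symbol positions kept per frame (assumed value)

am = torch.randn(B, T, V)      # per-frame scores (encoder side)
lm = torch.randn(B, U + 1, V)  # per-token scores (decoder / prediction-network side)
symbols = torch.randint(1, V, (B, U), dtype=torch.int64)
# boundary rows are [begin_symbol, begin_frame, end_symbol, end_frame]
boundary = torch.tensor([[0, 0, U, T]] * B, dtype=torch.int64)

# Stage 1: simple loss; its gradients tell us where to prune the lattice.
simple_loss, (px_grad, py_grad) = k2.rnnt_loss_simple(
    lm=lm, am=am, symbols=symbols,
    termination_symbol=blank_id, boundary=boundary,
    return_grad=True, reduction="sum",
)

# Select a narrow band of symbol positions for each frame.
ranges = k2.get_rnnt_prune_ranges(
    px_grad=px_grad, py_grad=py_grad, boundary=boundary, s_range=prune_range,
)

# Gather the pruned encoder/decoder activations; a real joiner network would go here.
am_pruned, lm_pruned = k2.do_rnnt_pruning(am=am, lm=lm, ranges=ranges)
logits = am_pruned + lm_pruned  # shape (B, T, prune_range, V)

# Stage 2: exact RNN-T loss restricted to the pruned ranges.
pruned_loss = k2.rnnt_loss_pruned(
    logits=logits, symbols=symbols, ranges=ranges,
    termination_symbol=blank_id, boundary=boundary, reduction="sum",
)
```

During training the two losses are typically combined, with the simple loss down-weighted relative to the pruned loss.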
The decoder in `transducer_stateless` is modified from the paper
*Rnn-Transducer with Stateless Prediction Network*.
We place an additional Conv1d layer right after the input embedding layer.
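
As a rough illustration of that decoder structure (not the recipe's exact module), the stateless prediction network is essentially an embedding layer followed by a causal 1-D convolution over the most recent predicted symbols; the layer sizes and context size below are assumed values.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StatelessDecoder(nn.Module):
    """Sketch of a stateless prediction network: Embedding + Conv1d.

    The Conv1d gives the decoder a small, fixed left context (the previous
    ``context_size - 1`` symbols) instead of the unbounded history of an RNN.
    All sizes here are assumptions for illustration only.
    """

    def __init__(self, vocab_size: int, embed_dim: int = 512,
                 context_size: int = 2, blank_id: int = 0):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=blank_id)
        self.context_size = context_size
        # Depthwise 1-D convolution over the symbol axis.
        self.conv = nn.Conv1d(
            embed_dim, embed_dim,
            kernel_size=context_size,
            groups=embed_dim,
            bias=False,
        )

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        # y: (batch, num_symbols) of token IDs.
        embed = self.embedding(y)            # (B, U, E)
        embed = embed.permute(0, 2, 1)       # (B, E, U) for Conv1d
        # Left-pad so each position only sees the current and previous symbols.
        embed = F.pad(embed, pad=(self.context_size - 1, 0))
        out = self.conv(embed)               # (B, E, U)
        return F.relu(out).permute(0, 2, 1)  # back to (B, U, E)


# Example usage with dummy token IDs.
decoder = StatelessDecoder(vocab_size=500)
tokens = torch.randint(1, 500, (4, 10))
print(decoder(tokens).shape)  # torch.Size([4, 10, 512])
```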