mirror of https://github.com/k2-fsa/icefall.git
synced 2025-09-07 08:04:18 +00:00
Minor fixes.
parent a896d982ec
commit 478bc42910
@@ -20,7 +20,7 @@ The following table lists the differences among them.
 | `pruned_transducer_stateless2` | Conformer(modified) | Embedding + Conv1d | Using k2 pruned RNN-T loss |
 | `pruned_transducer_stateless3` | Conformer(modified) | Embedding + Conv1d | Using k2 pruned RNN-T loss + using GigaSpeech as extra training data |
 | `pruned_transducer_stateless4` | Conformer(modified) | Embedding + Conv1d | same as pruned_transducer_stateless2 + save averaged models periodically during training |
-| `pruned_transducer_stateless5` | Conformer(modified) | Embedding + Conv1d | Using k2 pruned RNN-T loss + more layers + Random combiner|
+| `pruned_transducer_stateless5` | Conformer(modified) | Embedding + Conv1d | same as pruned_transducer_stateless4 + more layers + random combiner|

 The decoder in `transducer_stateless` is modified from the paper
@@ -135,7 +135,7 @@ results at:

 #### Baseline-2

-It has 88.98 M parameters. Compared to the model in pruned_transducer_stateless2, its more
+It has 88.98 M parameters. Compared to the model in pruned_transducer_stateless2, it has more
 layers (24 v.s 12) but a narrower model (1536 feedforward dim and 384 encoder dim vs 2048 feed forward dim and 512 encoder dim).

 | | test-clean | test-other | comment |