Mirror of <https://github.com/k2-fsa/icefall.git>, synced 2025-09-08 16:44:20 +00:00
Update results.

parent a227bd76b4
commit 8d2797d7cd

README.md | 17
@@ -35,6 +35,9 @@ We do provide a Colab notebook for this recipe.

### LibriSpeech

Please see <https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/README.md>
for the **latest** results.

We provide 4 models for this recipe:

- [conformer CTC model][LibriSpeech_conformer_ctc]
@@ -92,6 +95,20 @@ in the decoding.

We provide a Colab notebook to run a pre-trained transducer conformer + stateless decoder model: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1CO1bXJ-2khDckZIW8zjOPHGSKLHpTDlp?usp=sharing)
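Here, "stateless decoder" refers to a label-side network that keeps no recurrent state: it conditions only on the last couple of emitted tokens. Below is a minimal, illustrative sketch of that idea; the class name, dimensions, and two-token context are assumptions for illustration, not the exact code behind this checkpoint.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StatelessDecoder(nn.Module):
    """Sketch of a stateless RNN-T decoder: instead of an LSTM, it embeds
    only the last `context_size` output tokens, so it carries no hidden state."""

    def __init__(self, vocab_size: int, embed_dim: int, context_size: int = 2):
        super().__init__()
        self.context_size = context_size
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # A depthwise 1-D conv mixes the embeddings of the short left context.
        self.conv = nn.Conv1d(
            embed_dim, embed_dim, kernel_size=context_size, groups=embed_dim
        )

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        # y: (batch, num_tokens) of previously emitted token IDs.
        emb = self.embedding(y).permute(0, 2, 1)  # (batch, embed_dim, num_tokens)
        # Left-pad so each position sees only itself and the
        # context_size - 1 previous tokens.
        emb = F.pad(emb, (self.context_size - 1, 0))
        return F.relu(self.conv(emb)).permute(0, 2, 1)


# Example: decoder output for a batch with 3 previously emitted tokens.
dec = StatelessDecoder(vocab_size=500, embed_dim=512)
out = dec(torch.tensor([[17, 42, 99]]))  # -> shape (1, 3, 512)
```

The appeal of this design is that the decoder behaves like an n-gram-conditioned predictor, so beam search needs no hidden-state bookkeeping.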
#### k2 pruned RNN-T

|     | test-clean | test-other |
|-----|------------|------------|
| WER | 2.57       | 5.95       |

#### k2 pruned RNN-T + GigaSpeech

|     | test-clean | test-other |
|-----|------------|------------|
| WER | 2.19       | 4.97       |

### Aishell

We provide two models for this recipe: [conformer CTC model][Aishell_conformer_ctc]
@@ -127,6 +127,29 @@ The Nbest oracle WER is computed using the following steps:

- 5. The path with the lowest edit distance is the final output and is used to
  compute the WER (a sketch of this selection step follows the command below)
The command to compute the Nbest oracle WER is:

```bash
for epoch in 27; do
  for avg in 10; do
    for num_paths in 50 100 200 400; do
      for nbest_scale in 0.5 0.8 1.0; do
        ./pruned_transducer_stateless3/decode.py \
          --epoch $epoch \
          --avg $avg \
          --exp-dir ./pruned_transducer_stateless3/exp \
          --max-duration 600 \
          --decoding-method fast_beam_search_nbest_oracle \
          --num-paths $num_paths \
          --max-states 32 \
          --beam 8 \
          --nbest-scale $nbest_scale
      done
    done
  done
done
```
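For illustration, here is a minimal Python sketch of the oracle selection described in step 5 above; `references` and `nbest_hypotheses` are assumed per-utterance token lists, not icefall's actual data structures.

```python
def edit_distance(ref: list, hyp: list) -> int:
    """Levenshtein distance (substitutions + deletions + insertions)."""
    dp = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, len(hyp) + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,  # delete ref[i-1]
                dp[j - 1] + 1,  # insert hyp[j-1]
                prev + (ref[i - 1] != hyp[j - 1]),  # substitute or match
            )
            prev = cur
    return dp[-1]


def oracle_wer(references, nbest_hypotheses) -> float:
    """For each utterance, keep the hypothesis with the lowest edit
    distance to the reference; WER is total errors over total ref words."""
    errors = sum(
        min(edit_distance(ref, hyp) for hyp in hyps)
        for ref, hyps in zip(references, nbest_hypotheses)
    )
    total = sum(len(ref) for ref in references)
    return errors / total
```

Since the best of `num_paths` hypotheses is scored per utterance, the resulting number is a lower bound on the WER achievable by rescoring those paths, which is why the command sweeps `--num-paths` and `--nbest-scale`.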
### LibriSpeech BPE training results (Pruned Transducer 2)

[pruned_transducer_stateless2](./pruned_transducer_stateless2)