Update README.

commit 86604b197d (parent 1e4920410f)
@@ -40,8 +40,9 @@ and [TDNN LSTM CTC model][LibriSpeech_tdnn_lstm_ctc].
 
 The best WER we currently have is:
 
 | | test-clean | test-other |
-|--|--|--|
-|WER| 2.57% | 5.94% |
+|-----|------------|------------|
+| WER | 2.42 | 5.73 |
 
 We provide a Colab notebook to run a pre-trained conformer CTC model: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1huyupXAcHsUrKaWfI83iMEJ6J0Nh0213?usp=sharing)
@@ -50,8 +51,8 @@ We provide a Colab notebook to run a pre-trained conformer CTC model: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1kNmDXNMwREi0rZGAOIAOJo93REBuOTcd?usp=sharing)
@@ -33,7 +33,8 @@ export CUDA_VISIBLE_DEVICES="0,1,2,3"
   --world-size 4 \
   --bucketing-sampler 1 \
   --start-epoch 0 \
-  --num-epochs 80
+  --num-epochs 90
+  # Note: It trains for 90 epochs, but the best WER is at epoch-77.pt
 ```
 
 and the following command for decoding
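The decoding command itself did not survive extraction of this diff. As a rough sketch only: the flags above presumably belong to the recipe's `conformer_ctc/train.py`, and decoding is done with the matching `conformer_ctc/decode.py`. The flag names and values below are assumptions based on the icefall LibriSpeech recipe layout, not taken from this commit:

```bash
cd egs/librispeech/ASR
# Hedged sketch of the decoding step; flag names/values are assumptions.
# --epoch 77 follows the note above that the best WER comes from epoch-77.pt;
# --avg N averages the last N checkpoints ending at that epoch before decoding.
./conformer_ctc/decode.py \
  --epoch 77 \
  --avg 55 \
  --method attention-decoder \
  --max-duration 30
```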
@@ -55,6 +56,9 @@ and the following command for decoding
 
 You can find the pre-trained model by visiting
 <https://huggingface.co/csukuangfj/icefall-asr-librispeech-conformer-ctc-jit-bpe-500-2021-11-09>
 
+The tensorboard log for training is available at
+<https://tensorboard.dev/experiment/hZDWrZfaSqOMqtW0NEfXKg/#scalars>
+
 #### 2021-08-19
 (Wei Kang): Result of https://github.com/k2-fsa/icefall/pull/13
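To use that pre-trained model locally rather than only viewing it on the site, the standard Hugging Face checkout applies (a sketch of generic git-lfs usage, not a command from this commit):

```bash
# Hugging Face model repos store large checkpoint files via git-lfs,
# so install and activate it before cloning.
git lfs install
git clone https://huggingface.co/csukuangfj/icefall-asr-librispeech-conformer-ctc-jit-bpe-500-2021-11-09
```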