From 9639f6dc0ab12b2fbcec848a22cd3b32b60989f8 Mon Sep 17 00:00:00 2001
From: Fangjun Kuang
Date: Fri, 17 Dec 2021 20:19:36 +0800
Subject: [PATCH] Minor fixes.

---
 README.md                      | 2 +-
 egs/librispeech/ASR/RESULTS.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 217c8e744..23389d483 100644
--- a/README.md
+++ b/README.md
@@ -66,7 +66,7 @@ We provide a Colab notebook to run a pre-trained TDNN LSTM CTC model: [![Open I
 
 Using Conformer as encoder.
 
-The best WER we currently have is:
+The best WER with greedy search is:
 
 |     | test-clean | test-other |
 |-----|------------|------------|
diff --git a/egs/librispeech/ASR/RESULTS.md b/egs/librispeech/ASR/RESULTS.md
index 4e62af2c3..1f5edb571 100644
--- a/egs/librispeech/ASR/RESULTS.md
+++ b/egs/librispeech/ASR/RESULTS.md
@@ -12,7 +12,7 @@ The best WER is
 |-----|------------|------------|
 | WER | 3.16       | 7.71       |
 
-using `--epoch 26 --avg 12` during decoding.
+using `--epoch 26 --avg 12` during decoding with greedy search.
 
 The training command to reproduce the above WER is: