From 79d7c943971064cb1a31d47681d05510a59893c6 Mon Sep 17 00:00:00 2001
From: Quandwang
Date: Fri, 22 Jul 2022 14:52:12 +0800
Subject: [PATCH] upload the decoding results of rnn-lm to the RESULTS

---
 egs/librispeech/ASR/RESULTS.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/egs/librispeech/ASR/RESULTS.md b/egs/librispeech/ASR/RESULTS.md
index 416d3d28e..be5f7cd8c 100644
--- a/egs/librispeech/ASR/RESULTS.md
+++ b/egs/librispeech/ASR/RESULTS.md
@@ -2037,6 +2037,7 @@ The decoding configuration for the reworked model is --epoch 30, --avg 8, --use-
 | whole-lattice-rescoring| 2.66% | 5.76%| 4.21%| 2.56%| 6.04%| 4.30%|
 | attention-decoder | 2.59% | 5.54%| 4.07%| 2.41%| 5.77%| 4.09%|
 | nbest-oracle | 1.53% | 3.47%| 2.50%| 1.69%| 4.02%| 2.86%|
+|rnn-lm | 2.37% | 4.98%| 3.68%| 2.31%| 5.35%| 3.83%|
 
@@ -2045,9 +2046,9 @@ For other decoding method, the average WER of the two test sets with the two mod
 
 Except for the 1best and nbest method, the overall performance of reworked model is better than the baseline model.
 
-To reproduce the above result, use the following commands:
+To reproduce the above result, use the following commands.
 
-The training commands are
+The training commands are:
 
 ```bash
 WORLD_SIZE=8
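
For context (an observation about the patch, not part of it): the third and sixth WER columns of the table appear to be the unweighted mean of the test-clean and test-other WERs, rounded to two decimals — e.g. the added rnn-lm row gives (2.37 + 4.98) / 2 = 3.675, reported as 3.68. A minimal sketch, with the `avg_wer` helper name being my own invention:

```python
def avg_wer(test_clean: float, test_other: float) -> float:
    """Mean WER (in %) over the two LibriSpeech test sets.

    Assumption: the "avg" columns in RESULTS.md are the plain
    (unweighted) mean of the test-clean and test-other WERs.
    """
    return (test_clean + test_other) / 2

# rnn-lm row from the patch: 2.37/4.98 (one model), 2.31/5.35 (the other)
baseline_avg = avg_wer(2.37, 4.98)  # about 3.675, reported as 3.68
reworked_avg = avg_wer(2.31, 5.35)  # about 3.83
```

The same check holds for the other rows in the hunk (e.g. attention-decoder: (2.59 + 5.54) / 2 = 4.065 ≈ 4.07), which supports the assumption.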