Mirror of https://github.com/k2-fsa/icefall.git (synced 2025-09-19 05:54:20 +00:00)
Update docs/source/decoding-with-langugage-models/LODR.rst
Co-authored-by: Fangjun Kuang <csukuangfj@gmail.com>
parent e429a152e9
commit 2f1af8f303
@@ -148,7 +148,7 @@ Then, we perform LODR decoding by setting ``--decoding-method`` to ``modified_be
     --tokens-ngram 2 \
     --ngram-lm-scale $LODR_scale

-There are two extra arguments need to be given when doing LODR. ``--tokens-ngram`` specifies the order of n-gram. As we
+There are two extra arguments that need to be given when doing LODR. ``--tokens-ngram`` specifies the order of n-gram. As we
 are using a bi-gram, we set it to 2. ``--ngram-lm-scale`` is the scale of the bi-gram, it should be a negative number
 as we are subtracting the bi-gram's score during decoding.
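The paragraph touched by this hunk explains that ``--ngram-lm-scale`` is negative because the bi-gram score is subtracted during decoding. A minimal Python sketch of that score combination, assuming a log-linear interpolation of ASR, neural-LM, and bi-gram log-probabilities (function name and the example scale values are illustrative, not icefall's actual API):

```python
def lodr_score(asr_logp, lm_logp, bigram_logp,
               lm_scale=0.42, ngram_lm_scale=-0.24):
    """Combine per-hypothesis log-probabilities, LODR-style.

    ngram_lm_scale is negative: the low-order bi-gram score is
    subtracted from the combined score rather than added.
    All arguments and scales here are illustrative placeholders.
    """
    return asr_logp + lm_scale * lm_logp + ngram_lm_scale * bigram_logp

# -1.0 + 0.42 * (-2.0) + (-0.24) * (-3.0) = -1.12
score = lodr_score(-1.0, -2.0, -3.0)
```

Note that because ``ngram_lm_scale`` multiplies a log-probability (itself negative), the subtraction actually raises the combined score for tokens the bi-gram predicts poorly, which is the density-ratio intuition behind LODR.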