Mirror of https://github.com/k2-fsa/icefall.git (synced 2025-09-19 05:54:20 +00:00)

Commit a6462490fb ("minor fixes"), parent 2f1af8f303.
@@ -70,9 +70,9 @@ As the initial step, let's download the pre-trained model.
 
 .. code-block:: bash
 
-    $ git lfs install
-    $ git clone https://huggingface.co/Zengwei/icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29
+    $ GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/Zengwei/icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29
     $ pushd icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29/exp
+    $ git lfs pull --include "pretrained.pt"
     $ ln -s pretrained.pt epoch-99.pt # create a symbolic link so that the checkpoint can be loaded
 
 To test the model, let's have a look at the decoding results **without** using LM. This can be done via the following command:
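The pattern this hunk introduces — clone with ``GIT_LFS_SKIP_SMUDGE=1`` so only LFS pointer files are fetched, pull just the checkpoint you need, then expose it under the ``epoch-<N>.pt`` name the decoding script expects — can be sketched offline as follows. The ``demo-exp`` directory and the locally created ``pretrained.pt`` are stand-ins for the cloned Hugging Face repo's ``exp/`` folder:

```shell
# Offline sketch of the workflow in the hunk above; no network needed.
set -e
dir=$(mktemp -d)/demo-exp
mkdir -p "$dir" && cd "$dir"
# In the real workflow this file comes from:
#   GIT_LFS_SKIP_SMUDGE=1 git clone <repo> && git lfs pull --include "pretrained.pt"
touch pretrained.pt
# The decoding script loads checkpoints by epoch number, so alias the file:
ln -sf pretrained.pt epoch-99.pt
readlink epoch-99.pt   # -> pretrained.pt
```

Skipping the smudge step avoids downloading every large file in the repo during ``git clone``; ``git lfs pull --include`` then fetches only the paths matching the given pattern.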
@@ -81,8 +81,9 @@ To test the model, let's have a look at the decoding results **without** using L
 
     $ exp_dir=./icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29/exp/
     $ ./pruned_transducer_stateless7_streaming/decode.py \
-        --epoch 30 \
-        --avg 9 \
+        --epoch 99 \
+        --avg 1 \
+        --use-averaged-model False \
         --exp-dir $exp_dir \
         --max-duration 600 \
         --decode-chunk-len 32 \
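The flag changes in this hunk belong together: after the symlink from the previous step there is only a single checkpoint, ``epoch-99.pt``, so checkpoint averaging must be disabled (``--avg 1``, ``--use-averaged-model False``). Assuming decode.py selects ``epoch-(E-N+1).pt`` through ``epoch-E.pt`` for ``--epoch E --avg N`` (an assumption about its internals; the helper name below is hypothetical), the selection can be sketched as:

```shell
# Hypothetical helper mirroring the assumed checkpoint-selection rule:
# --epoch E --avg N averages epoch-(E-N+1).pt .. epoch-E.pt.
checkpoints_to_average () {
  local epoch=$1 avg=$2 e
  for e in $(seq $((epoch - avg + 1)) "$epoch"); do
    echo "epoch-$e.pt"
  done
}

checkpoints_to_average 99 1              # -> epoch-99.pt (new flags: one file)
checkpoints_to_average 30 9 | head -n 1  # -> epoch-22.pt (old flags spanned 9 files)
```

With the old ``--epoch 30 --avg 9`` flags the script would look for nine training checkpoints that a pretrained-model download does not contain, which is why the diff pins decoding to the single symlinked file.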
@@ -102,11 +103,11 @@ Note that the bi-gram is estimated on the LibriSpeech 960 hours' text.
 
 .. code-block:: bash
 
-    $ git lfs install
     $ # download the external LM
-    $ git clone https://huggingface.co/ezerhouni/icefall-librispeech-rnn-lm
+    $ GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/ezerhouni/icefall-librispeech-rnn-lm
     $ # create a symbolic link so that the checkpoint can be loaded
     $ pushd icefall-librispeech-rnn-lm/exp
+    $ git lfs pull --include "pretrained.pt"
     $ ln -s pretrained.pt epoch-99.pt
     $ popd
     $
@@ -31,9 +31,9 @@ As the initial step, let's download the pre-trained model.
 
 .. code-block:: bash
 
-    $ git lfs install
-    $ git clone https://huggingface.co/Zengwei/icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29
+    $ GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/Zengwei/icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29
     $ pushd icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29/exp
+    $ git lfs pull --include "pretrained.pt"
     $ ln -s pretrained.pt epoch-99.pt # create a symbolic link so that the checkpoint can be loaded
 
 To test the model, let's have a look at the decoding results without using LM. This can be done via the following command:
@@ -42,8 +42,9 @@ To test the model, let's have a look at the decoding results without using LM. T
 
     $ exp_dir=./icefall-asr-librispeech-pruned-transducer-stateless7-streaming-2022-12-29/exp/
     $ ./pruned_transducer_stateless7_streaming/decode.py \
-        --epoch 30 \
-        --avg 9 \
+        --epoch 99 \
+        --avg 1 \
+        --use-averaged-model False \
         --exp-dir $exp_dir \
         --max-duration 600 \
         --decode-chunk-len 32 \
@@ -63,10 +64,12 @@ Training a language model usually takes a long time, we can download a pre-train
 
 .. code-block:: bash
 
-    $ git lfs install
-    $ git clone https://huggingface.co/ezerhouni/icefall-librispeech-rnn-lm
+    $ # download the external LM
+    $ GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/ezerhouni/icefall-librispeech-rnn-lm
+    $ # create a symbolic link so that the checkpoint can be loaded
     $ pushd icefall-librispeech-rnn-lm/exp
-    $ ln -s pretrained.pt epoch-99.pt # create a symbolic link so that the checkpoint can be loaded
+    $ git lfs pull --include "pretrained.pt"
+    $ ln -s pretrained.pt epoch-99.pt
     $ popd
 
 .. note::