7 Commits

Author SHA1 Message Date
Fangjun Kuang
34e40a86b3
Fix exporting decoder model to onnx (#1264)
* Use torch.jit.script() to export the decoder model

See also https://github.com/k2-fsa/sherpa-onnx/issues/327
2023-09-22 09:57:15 +08:00
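
For context on the fix above: the usual cure for this class of export failure is to run the decoder through torch.jit.script() before torch.onnx.export(), so that Python-level control flow in forward() stays in the graph instead of being frozen to one branch the way torch.jit.trace() would freeze it. The sketch below is illustrative only; the Decoder class, its dimensions, and the file name are assumptions, not the icefall code.

```python
# Minimal sketch of the script-then-export pattern for a stateless
# transducer decoder. Everything here (class, sizes, file name) is a
# stand-in, not the actual icefall implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Decoder(nn.Module):
    """Toy stateless decoder: embedding + depthwise 1-D conv over a short
    left context, loosely modeled on "stateless" transducer predictors."""

    def __init__(self, vocab_size: int = 500, decoder_dim: int = 512,
                 context_size: int = 2) -> None:
        super().__init__()
        self.context_size = context_size
        self.embedding = nn.Embedding(vocab_size, decoder_dim)
        self.conv = nn.Conv1d(decoder_dim, decoder_dim,
                              kernel_size=context_size, groups=decoder_dim)

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        # y: (N, U) token IDs
        out = self.embedding(y).permute(0, 2, 1)  # (N, C, U)
        if self.context_size > 1:
            # Python branch: scripting keeps it in the exported graph,
            # tracing would bake in whichever path happened to run.
            out = F.pad(out, [self.context_size - 1, 0])
        out = self.conv(out)
        return out.permute(0, 2, 1)  # (N, U, C)


decoder = torch.jit.script(Decoder())  # script instead of trace
y = torch.zeros(1, 2, dtype=torch.int64)
torch.onnx.export(
    decoder,
    (y,),
    "decoder.onnx",
    opset_version=13,
    input_names=["y"],
    output_names=["decoder_out"],
    dynamic_axes={"y": {0: "N"}, "decoder_out": {0: "N"}},
)
```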
zr_jin
a81396b482
Use tokens.txt to replace bpe.model (#1162) 2023-08-12 16:53:59 +08:00
Wei Kang
db71b03026
Support int8 quantization in decoder (#1152) 2023-06-29 16:48:59 +08:00
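
On the int8 quantization mentioned above: a common way to obtain an int8 decoder from an already-exported ONNX file is onnxruntime's dynamic quantization. A hedged sketch (file names are placeholders, and this may not match the exact options used in the commit):

```python
# Dynamically quantize an exported decoder model to int8 weights.
# File names below are placeholders.
from onnxruntime.quantization import QuantType, quantize_dynamic

quantize_dynamic(
    model_input="decoder.onnx",        # fp32 model from torch.onnx.export
    model_output="decoder.int8.onnx",  # same graph with int8 weights
    weight_type=QuantType.QInt8,
)
```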
Fangjun Kuang
968ebd236b
Fix ONNX export of the latest streaming zipformer model. (#1148) 2023-06-27 14:35:59 +08:00
Wei Kang
219bba1310
zipformer wenetspeech (#1130)
* copy files

* update train.py

* small fixes

* Add decode.py

* Fix dataloader in decode.py

* add blank penalty

* Add blank-penalty to other decoding methods (see the sketch after this entry)

* Minor fixes

* add zipformer2 recipe

* Minor fixes

* Remove pruned7

* export and test models

* Replace bpe with tokens in export.py and pretrain.py

* Minor fixes

* Minor fixes

* Minor fixes

* Fix export

* Update results

* Fix zipformer-ctc

* Fix ci

* Fix ci

* Fix CI

* Fix CI

---------

Co-authored-by: Fangjun Kuang <csukuangfj@gmail.com>
2023-06-26 09:33:18 +08:00
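
The blank-penalty bullets in the commit above refer to a decoding-time tweak: subtract a constant from the blank symbol's logit before choosing the next token, which discourages the model from over-emitting blanks. A minimal sketch of the idea (the blank ID, shapes, and surrounding greedy loop are assumptions):

```python
import torch


def apply_blank_penalty(logits: torch.Tensor, blank_id: int = 0,
                        blank_penalty: float = 1.5) -> torch.Tensor:
    """Subtract a constant penalty from the blank entry of per-step logits.

    logits: (N, vocab_size) joiner output for the current decoding step.
    blank_penalty = 0 leaves the scores unchanged.
    """
    if blank_penalty > 0:
        logits = logits.clone()
        logits[:, blank_id] -= blank_penalty
    return logits


# Illustrative use inside a greedy-search step:
logits = torch.randn(4, 500)  # stand-in for joiner(encoder_out_t, decoder_out)
next_tokens = apply_blank_penalty(logits).argmax(dim=-1)
```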
Zengwei Yao
0ad037d076
Add CTC loss option in zipformer recipe (#1111)
* add CTC loss option in zipformer recipe

* add ctc_decode.py

* support CTC model export, add jit_pretrained_ctc.py, pretrained_ctc.py

* update README.md and RESULTS.md

* add CI test
2023-06-14 14:27:29 +08:00
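
The CTC option above boils down to an auxiliary CTC head on the encoder output whose loss is added to the transducer loss with a scale. A rough sketch, not the recipe's exact code (layer names, scale, and shapes are assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CtcHead(nn.Module):
    """Project encoder frames to vocab-sized log-probabilities for CTC."""

    def __init__(self, encoder_dim: int, vocab_size: int) -> None:
        super().__init__()
        self.output = nn.Linear(encoder_dim, vocab_size)

    def forward(self, encoder_out: torch.Tensor) -> torch.Tensor:
        # encoder_out: (N, T, encoder_dim) -> (T, N, vocab) log-probs,
        # the layout expected by torch.nn.functional.ctc_loss.
        return self.output(encoder_out).log_softmax(dim=-1).permute(1, 0, 2)


def total_loss(transducer_loss: torch.Tensor, log_probs: torch.Tensor,
               targets: torch.Tensor, input_lens: torch.Tensor,
               target_lens: torch.Tensor, ctc_scale: float = 0.2,
               blank_id: int = 0) -> torch.Tensor:
    # Combine the existing transducer objective with the auxiliary CTC loss.
    ctc = F.ctc_loss(log_probs, targets, input_lens, target_lens,
                     blank=blank_id, reduction="sum", zero_infinity=True)
    return transducer_loss + ctc_scale * ctc
```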
danfu
0cb71ad3bc
add updated zipformer onnx export (#1108)
Co-authored-by: Fangjun Kuang <csukuangfj@gmail.com>
2023-06-12 14:02:23 +08:00