6 Commits

Each entry lists Author | SHA1 | Date, followed by the commit message.

Fangjun Kuang | 34e40a86b3 | 2023-09-22 09:57:15 +08:00
Fix exporting decoder model to onnx (#1264)
* Use torch.jit.script() to export the decoder model
  See also https://github.com/k2-fsa/sherpa-onnx/issues/327
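
The point of this fix is that the decoder's forward() contains control flow, so it has to be scripted rather than traced before calling torch.onnx.export(). A minimal sketch under assumed names (the toy Decoder below is illustrative, not the actual icefall module):

    import torch

    class Decoder(torch.nn.Module):
        """Toy stateless transducer decoder, for illustration only."""
        def __init__(self, vocab_size: int = 500, decoder_dim: int = 512, context_size: int = 2):
            super().__init__()
            self.embedding = torch.nn.Embedding(vocab_size, decoder_dim)
            self.context_size = context_size
            self.conv = torch.nn.Conv1d(decoder_dim, decoder_dim, kernel_size=context_size, groups=decoder_dim)

        def forward(self, y: torch.Tensor) -> torch.Tensor:
            # torch.jit.trace() would record only the branch taken for the
            # example input; torch.jit.script() compiles the if-statement
            # itself, which is why the export switches to scripting.
            embedding_out = self.embedding(y)  # (N, U, decoder_dim)
            if self.context_size > 1:
                embedding_out = self.conv(embedding_out.permute(0, 2, 1)).permute(0, 2, 1)
            return embedding_out

    decoder = torch.jit.script(Decoder())  # script, do not trace
    y = torch.zeros(1, 2, dtype=torch.int64)
    torch.onnx.export(
        decoder,
        (y,),
        "decoder.onnx",
        input_names=["y"],
        output_names=["decoder_out"],
        dynamic_axes={"y": {0: "N"}, "decoder_out": {0: "N"}},
        opset_version=13,
    )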

zr_jin | a81396b482 | 2023-08-12 16:53:59 +08:00
Use tokens.txt to replace bpe.model (#1162)
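
With this change, decoding-time scripts only need the vocabulary mapping rather than the SentencePiece model itself. A rough sketch of consuming such a tokens.txt, assuming the usual one "symbol id" pair per line layout:

    from pathlib import Path
    from typing import Dict, List

    def read_tokens(filename: str) -> Dict[int, str]:
        # Assumed layout: one "<symbol> <id>" pair per line.
        id2token: Dict[int, str] = {}
        for line in Path(filename).read_text(encoding="utf-8").splitlines():
            if line.strip():
                sym, idx = line.rsplit(maxsplit=1)
                id2token[int(idx)] = sym
        return id2token

    def ids_to_text(ids: List[int], id2token: Dict[int, str]) -> str:
        # BPE pieces mark word boundaries with U+2581; join and restore spaces.
        return "".join(id2token[i] for i in ids).replace("\u2581", " ").strip()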

Yifan Yang | ca60ced213 | 2023-06-02 14:12:42 +08:00
Fix typo (#1114)
* Fix typo for zipformer
* Fix typo for pruned_transducer_stateless7
* Fix typo for pruned_transducer_stateless7_ctc
* Fix typo for pruned_transducer_stateless7_ctc_bs
* Fix typo for pruned_transducer_stateless7_streaming
* Fix typo for pruned_transducer_stateless7_streaming_multi
* Fix file permissions for pruned_transducer_stateless7_streaming_multi
* Fix typo for pruned_transducer_stateless8
* Fix typo for pruned_transducer_stateless6
* Fix typo for pruned_transducer_stateless5
* Fix typo for pruned_transducer_stateless4
* Fix typo for pruned_transducer_stateless3

Fangjun Kuang | a632b24c35 | 2023-03-31 22:46:19 +08:00
Export int8 quantized models for non-streaming Zipformer. (#977)
* Export int8 quantized models for non-streaming Zipformer.
* Delete export-onnx.py
* Export int8 models for other folders
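
One way to obtain such int8 models after the float32 export is onnxruntime's dynamic quantization; the file names below are placeholders, not the exact ones written by the scripts in this PR:

    from onnxruntime.quantization import QuantType, quantize_dynamic

    quantize_dynamic(
        model_input="encoder.onnx",        # float32 model from the export script
        model_output="encoder.int8.onnx",  # model with int8 weights
        op_types_to_quantize=["MatMul"],
        weight_type=QuantType.QInt8,
    )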

Fangjun Kuang | 8c3ea93fc8 | 2023-03-27 11:39:29 +08:00
Save meta data to exported ONNX models (#968)
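
The onnx package lets key/value pairs be stored in a model's metadata_props, which is presumably what this commit relies on; the keys below are examples only, not the exact ones added here:

    import onnx

    model = onnx.load("decoder.onnx")
    for key, value in {
        "model_type": "zipformer",  # example entries, not the real keys
        "context_size": "2",
        "vocab_size": "500",
    }.items():
        entry = model.metadata_props.add()
        entry.key = key
        entry.value = value
    onnx.save(model, "decoder.onnx")

At run time the entries can be read back with onnxruntime via InferenceSession(...).get_modelmeta().custom_metadata_map.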

Fangjun Kuang | 2b995639b7 | 2023-02-09 00:02:38 +08:00
Add ONNX support for Zipformer and ConvEmformer (#884)
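
A quick way to sanity-check such an export is to run the model with onnxruntime on dummy features; the input names and shapes below are assumptions and should be read from the model in practice:

    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("encoder.onnx", providers=["CPUExecutionProvider"])
    print([i.name for i in session.get_inputs()])  # check the actual input names

    x = np.random.rand(1, 100, 80).astype(np.float32)  # (N, T, feature_dim); 80-dim fbank assumed
    x_lens = np.array([100], dtype=np.int64)
    outputs = session.run(None, {"x": x, "x_lens": x_lens})
    print([o.shape for o in outputs])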