4 Commits

Fangjun Kuang
fba5e67d5e
Fix CI tests. (#1974)
- Introduce unified AMP helpers (create_grad_scaler, torch_autocast) to handle 
  deprecations in PyTorch ≥2.3.0

- Replace direct uses of torch.cuda.amp.GradScaler and torch.cuda.amp.autocast 
  with the new utilities across all training and inference scripts

- Update all torch.load calls to include weights_only=False for compatibility with 
  newer PyTorch versions
2025-07-01 13:47:55 +08:00
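For context, a minimal sketch of what such version-dispatching AMP helpers can look like, assuming the same intent as icefall's create_grad_scaler / torch_autocast; the real implementations may differ in signature and behaviour.

```python
# Sketch only: dispatch between the old torch.cuda.amp API and the
# non-deprecated torch.amp API depending on the installed PyTorch version.
import torch


def _torch_at_least(major: int, minor: int) -> bool:
    """Compare torch.__version__ without extra dependencies."""
    parts = torch.__version__.split("+")[0].split(".")
    return (int(parts[0]), int(parts[1])) >= (major, minor)


def create_grad_scaler(device: str = "cuda", **kwargs):
    """Return a GradScaler via the non-deprecated API on PyTorch >= 2.3."""
    if _torch_at_least(2, 3):
        return torch.amp.GradScaler(device, **kwargs)
    return torch.cuda.amp.GradScaler(**kwargs)


def torch_autocast(device_type: str = "cuda", **kwargs):
    """Return an autocast context without the deprecated torch.cuda.amp alias."""
    return torch.amp.autocast(device_type=device_type, **kwargs)


# torch.load defaults to weights_only=True in newer releases; checkpoints that
# pickle full objects need an explicit opt-out (path below is illustrative):
# ckpt = torch.load("exp/epoch-30.pt", map_location="cpu", weights_only=False)
```

With helpers like these, call sites can use `with torch_autocast(enabled=...):` and `scaler = create_grad_scaler()` unchanged across PyTorch versions.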
Fangjun Kuang
8136ad775b
Use high_freq -400 in computing fbank features. (#1447)
See also https://github.com/k2-fsa/sherpa-onnx/issues/514
2024-01-04 13:59:32 +08:00
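In the Kaldi convention a non-positive high_freq is an offset from the Nyquist frequency, so high_freq=-400 caps the mel filterbank at 7600 Hz for 16 kHz audio. A minimal illustration of the parameter using torchaudio's Kaldi-compatible fbank (the recipe's actual feature extraction may use a different library):

```python
import torchaudio
import torchaudio.compliance.kaldi as kaldi

# "test.wav" is a placeholder; any 16 kHz mono recording works for this sketch.
waveform, sample_rate = torchaudio.load("test.wav")

# high_freq=-400.0 places the top mel-filter edge 400 Hz below Nyquist
# (i.e. at 7600 Hz for 16 kHz audio), matching the Kaldi convention.
features = kaldi.fbank(
    waveform,
    num_mel_bins=80,
    sample_frequency=float(sample_rate),
    high_freq=-400.0,
)
print(features.shape)  # (num_frames, 80)
```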
zr_jin
a81396b482
Use tokens.txt to replace bpe.model (#1162)
2023-08-12 16:53:59 +08:00
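tokens.txt follows the Kaldi symbol-table layout, one `<token> <id>` pair per line, so decoding results can be mapped back to text without loading a sentencepiece model. A minimal sketch (the file path and token ids below are illustrative):

```python
from typing import Dict


def read_tokens(filename: str) -> Dict[int, str]:
    """Read a Kaldi-style symbol table: one "<token> <id>" pair per line."""
    id2token = {}
    with open(filename, encoding="utf-8") as f:
        for line in f:
            token, idx = line.split()
            id2token[int(idx)] = token
    return id2token


id2token = read_tokens("data/lang_bpe_500/tokens.txt")  # illustrative path
hyp_ids = [23, 7, 156]  # made-up token ids for demonstration
# BPE pieces mark word starts with "▁"; join the pieces, then restore spaces.
text = "".join(id2token[i] for i in hyp_ids).replace("▁", " ").strip()
print(text)
```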
Zengwei Yao
d167aad4ab
Add streaming zipformer (#787)
* add streaming zipformer codes

* add test_model.py

* add export.py, pretrained.py, jit_pretrained.py

* add cached_len for pooling module

* add jit_trace_export.py and jit_trace_pretrained.py

* fix bug in jit.trace

* update RESULTS.md

* add CI test

* minor fix in pruned_transducer_stateless7/zipformer.py

* update README.md
2022-12-30 10:52:18 +08:00
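The jit_trace_export.py / jit_trace_pretrained.py scripts export and load the model via torch.jit.trace rather than torch.jit.script. A generic sketch of that export path, with a placeholder module standing in for the streaming encoder (shapes and filenames are illustrative, not the actual Zipformer interface):

```python
import torch


class TinyEncoder(torch.nn.Module):
    """Placeholder module; the real streaming encoder also takes cached states."""

    def __init__(self) -> None:
        super().__init__()
        self.proj = torch.nn.Linear(80, 256)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.proj(x))


model = TinyEncoder().eval()
example = torch.rand(1, 100, 80)  # (batch, time, feature) placeholder input

# torch.jit.trace records only the ops run on the example inputs, so
# data-dependent control flow is not captured; traced exports need validation,
# which is what the accompanying CI test and pretrained scripts provide.
traced = torch.jit.trace(model, example)
traced.save("encoder_jit_trace.pt")  # illustrative filename
```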