k2-fsa | a1277c9ae9 | fix grad scaler | 2025-07-01 10:39:05 +08:00

k2-fsa | f186e1d427 | Fix weights_only=False | 2025-06-30 23:30:11 +08:00
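
The `weights_only` flag in f186e1d427 presumably refers to `torch.load`: newer PyTorch releases default `weights_only` to True, which restricts unpickling to tensors and a small set of safe types, so checkpoints carrying arbitrary Python objects need `weights_only=False` passed explicitly. A minimal sketch of the distinction; the file name and checkpoint layout are illustrative only, not taken from the commit:

```python
import torch

# A checkpoint that holds more than bare tensors (a state_dict plus a
# plain Python value), as training scripts commonly save.
ckpt = {"model": torch.nn.Linear(4, 4).state_dict(), "epoch": 3}
torch.save(ckpt, "checkpoint.pt")

# With weights_only=True (the newer default), torch.load limits
# unpickling to tensors and a small allowlist of safe types.
safe = torch.load("checkpoint.pt", weights_only=True, map_location="cpu")

# weights_only=False restores the old behaviour and trusts the file;
# it must be passed explicitly when the checkpoint needs full unpickling.
full = torch.load("checkpoint.pt", weights_only=False, map_location="cpu")
```
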
k2-fsa | a53c323750 | Fix CI warnings | 2025-06-30 21:46:18 +08:00

Fangjun Kuang | d4d4f281ec | Revert "Replace deprecated pytorch methods (#1814)" (#1841) | 2024-12-18 16:49:57 +08:00
    This reverts commit 3e4da5f78160d3dba3bdf97968bd7ceb8c11631f.

Li Peng | 3e4da5f781 | Replace deprecated pytorch methods (#1814) | 2024-12-16 10:24:16 +08:00
    * Replace deprecated pytorch methods
      - torch.cuda.amp.GradScaler(...) => torch.amp.GradScaler("cuda", ...)
      - torch.cuda.amp.autocast(...) => torch.amp.autocast("cuda", ...)
    * Replace `with autocast(...)` with `with autocast("cuda", ...)`
    Co-authored-by: Li Peng <lipeng@unisound.ai>
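
The migration described in 3e4da5f781 (and reverted in d4d4f281ec) swaps the deprecated torch.cuda.amp entry points for the device-agnostic torch.amp ones. A minimal sketch of the change in a generic mixed-precision step; the model, optimizer, and loss below are placeholders, not icefall code, and a CUDA device is assumed:

```python
import torch

model = torch.nn.Linear(10, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Old, deprecated API:
#   scaler = torch.cuda.amp.GradScaler(enabled=True)
#   with torch.cuda.amp.autocast(enabled=True, dtype=torch.float16):
#       ...
# New API, as in the commit:
scaler = torch.amp.GradScaler("cuda", enabled=True)

x = torch.randn(8, 10, device="cuda")
with torch.amp.autocast("cuda", enabled=True, dtype=torch.float16):
    loss = model(x).pow(2).mean()

scaler.scale(loss).backward()   # scale the loss to avoid fp16 underflow
scaler.step(optimizer)          # unscale grads, then run the optimizer step
scaler.update()                 # adjust the scale factor for the next step
optimizer.zero_grad()
```
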
Zengwei Yao | 334beed2af | fix usages of returned losses after adding attention-decoder in zipformer (#1689) | 2024-07-12 16:50:58 +08:00

Teo Wen Shen | 19048e155b | Cast grad_scale in whiten to float (#1663) | 2024-07-11 15:12:30 +08:00
    * cast grad_scale in whiten to float
    * fix cast in zipformer_lora
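
The cast in 19048e155b is a small type fix; the sketch below only illustrates the general pattern of coercing a scale that may arrive as a zero-dim (possibly half-precision) tensor into a Python float before it multiplies a gradient. The function name and surrounding logic are assumptions, not the actual Whiten code:

```python
import torch

def scale_grad(grad: torch.Tensor, grad_scale) -> torch.Tensor:
    # grad_scale may be a Python number or a 0-dim tensor (e.g. fp16 under
    # autocast); float() normalizes it so the multiply keeps grad's dtype.
    return grad * float(grad_scale)

g = torch.randn(3, 4)
print(scale_grad(g, torch.tensor(0.5, dtype=torch.float16)).dtype)  # torch.float32
```
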
Xiaoyu Yang | 2dfd5dbf8b | Add LoRA for Zipformer (#1540) | 2024-03-15 17:19:23 +08:00