k2-fsa | a1277c9ae9 | fix grad scaler | 2025-07-01 10:39:05 +08:00
k2-fsa | f186e1d427 | Fix weights_only=False | 2025-06-30 23:30:11 +08:00
k2-fsa | a53c323750 | Fix CI warnings | 2025-06-30 21:46:18 +08:00
Fangjun Kuang | d4d4f281ec | Revert "Replace deprecated pytorch methods (#1814)" (#1841) | 2024-12-18 16:49:57 +08:00
    This reverts commit 3e4da5f78160d3dba3bdf97968bd7ceb8c11631f.
Li Peng | 3e4da5f781 | Replace deprecated pytorch methods (#1814) | 2024-12-16 10:24:16 +08:00
    * Replace deprecated pytorch methods
      - torch.cuda.amp.GradScaler(...) => torch.amp.GradScaler("cuda", ...)
      - torch.cuda.amp.autocast(...) => torch.amp.autocast("cuda", ...)
    * Replace `with autocast(...)` with `with autocast("cuda", ...)`
    Co-authored-by: Li Peng <lipeng@unisound.ai>
Zengwei Yao | 334beed2af | fix usages of returned losses after adding attention-decoder in zipformer (#1689) | 2024-07-12 16:50:58 +08:00
zr_jin | eb132da00d | additional instruction for the grad_scale is too small error (#1550) | 2024-03-14 11:33:49 +08:00
zr_jin | 242002e0bd | Strengthened style constraints (#1527) | 2024-03-04 23:28:04 +08:00
Xiaoyu Yang | 7e2b561bbf | Add recipe for fine-tuning Zipformer with adapter (#1512) | 2024-02-29 10:57:38 +08:00