20 Commits

SHA1        Author        Date                        Message
34500afc43  Daniel Povey  2022-04-02 20:06:43 +08:00  Various bug fixes
8be10d3d6c  Daniel Povey  2022-04-02 20:03:21 +08:00  First draft of model rework
eec597fdd5  Daniel Povey  2022-04-02 18:45:20 +08:00  Merge changes from master
709c387ce6  Daniel Povey  2022-03-30 21:40:22 +08:00  Initial refactoring to remove unnecessary vocab_size
4e453a4bf9  Daniel Povey  2022-03-29 23:41:13 +08:00  Rework conformer, remove some code.
11124b03ea  Daniel Povey  2022-03-29 20:32:14 +08:00  Refactoring and simplifying conformer and frontend
262388134d  Daniel Povey  2022-03-27 11:18:16 +08:00  Increase model_warm_step to 4k
d2ed3dfc90  Daniel Povey  2022-03-25 20:35:11 +08:00  Fix bug
4b650e9f01  Daniel Povey  2022-03-25 20:34:33 +08:00  Make warmup work by scaling layer contributions; leave residual layer-drop
aab72bc2a5  Daniel Povey  2022-03-24 13:10:54 +08:00  Add changes from master to decode.py, train.py
9a8aa1f54a  Daniel Povey  2022-03-22 15:36:20 +08:00  Change how warmup works.
4004ca81d8  Daniel Povey  2022-03-22 13:32:24 +08:00  Increase warm_step (and valid_interval)
b82a505dfc  Daniel Povey  2022-03-22 12:30:48 +08:00  Reduce initial pruned_loss scale from 0.01 to 0.0
ccbf8ba086  Daniel Povey  2022-03-21 21:12:43 +08:00  Incorporate changes from master into pruned_transducer_stateless2.
0ee2404ff0  Daniel Povey  2022-03-19 14:01:45 +08:00  Remove logging code that broke with newer Lhotse; fix bug with pruned_loss
2dfcd8f117  Daniel Povey  2022-03-18 16:38:36 +08:00  Double warm_step
cbe6b175d1  Daniel Povey  2022-03-17 16:46:59 +08:00  Reduce warmup scale on pruned loss form 0.1 to 0.01.
acc0eda5b0  Daniel Povey  2022-03-17 16:09:35 +08:00  Scale down pruned loss in warmup mode
13db33ffa2  Daniel Povey  2022-03-17 15:53:53 +08:00  Fix diagnostics-getting code
11bea4513e  Daniel Povey  2022-03-17 11:17:52 +08:00  Add remaining files in pruned_transducer_stateless2