1830 Commits

Author SHA1 Message Date
Daniel Povey
0d7161ebec Use get_parameter_groups_with_lr in train.py; bug fixes 2023-01-05 14:11:33 +08:00
Daniel Povey
1db509ea31 Attempt to implement slower learning for downsampled modules 2023-01-05 13:39:22 +08:00
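These two commits go together: the idea is to give downsampled modules a reduced learning rate by putting their parameters in a separate optimizer parameter group. A minimal sketch of that pattern, using the helper name get_parameter_groups_with_lr from the commit message; the grouping-by-name test and the 0.5 scale below are illustrative assumptions, not icefall's actual implementation:

    import torch

    def get_parameter_groups_with_lr(model: torch.nn.Module,
                                     base_lr: float,
                                     downsampled_lr_scale: float = 0.5):
        # Assumption: downsampled modules are identifiable from the parameter
        # name; the real criterion in icefall may differ.
        slow, fast = [], []
        for name, param in model.named_parameters():
            (slow if "downsample" in name else fast).append(param)
        return [
            {"params": fast, "lr": base_lr},
            {"params": slow, "lr": base_lr * downsampled_lr_scale},
        ]

    model = torch.nn.ModuleDict({
        "encoder": torch.nn.Linear(8, 8),
        "downsample": torch.nn.Conv1d(8, 8, kernel_size=2, stride=2),
    })
    optimizer = torch.optim.AdamW(get_parameter_groups_with_lr(model, base_lr=1e-3))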
Daniel Povey
b7be18c2f8 Keep only needed changes from Liyong's branch 2023-01-05 12:23:32 +08:00
Daniel Povey
096ebeaf23 take a couple files from liyong's branch 2023-01-05 12:01:42 +08:00
Daniel Povey
22b4a417dd Implement extra_layerdrop 2023-01-04 20:59:58 +08:00
Daniel Povey
b973929d7c Bug fixes to ScheduledFloat 2023-01-04 20:54:05 +08:00
Daniel Povey
ae73469b7e Refactor ScheduledFloat to include PiecewiseLinear 2023-01-04 20:46:42 +08:00
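For context, ScheduledFloat is icefall's mechanism for a float hyperparameter (e.g. a dropout rate) that varies with the training batch count; this refactor bases it on a PiecewiseLinear object. A minimal sketch of piecewise-linear interpolation over (batch_count, value) points, assuming nothing beyond the class names in the commit message (the real classes also support arithmetic on schedules):

    class PiecewiseLinear:
        def __init__(self, *points):
            # points are (x, y) pairs, e.g. (batch_count, value)
            self.points = sorted(points)

        def __call__(self, x: float) -> float:
            pts = self.points
            if x <= pts[0][0]:
                return pts[0][1]
            if x >= pts[-1][0]:
                return pts[-1][1]
            for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
                if x0 <= x <= x1:
                    # linear interpolation between the surrounding points
                    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

    dropout_schedule = PiecewiseLinear((0.0, 0.3), (20000.0, 0.1))
    assert abs(dropout_schedule(10000.0) - 0.2) < 1e-9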
Daniel Povey
f688066517 Merge branch 'scaled_adam_exp823' into scaled_adam_exp843 2023-01-04 17:02:37 +08:00
Daniel Povey
f7d67f5456 Higher dropout schedule for SmallConvolutionModule 2023-01-02 14:58:23 +08:00
Daniel Povey
5223286424 Add SmallConvolutionModule 2023-01-02 14:47:28 +08:00
Daniel Povey
a61bd01e5b Change convnext1 kernel size from (5, 5) to (5, 7) 2023-01-02 14:17:51 +08:00
Daniel Povey
e4d0ac0946 Let the feedforward dims be respectively 3*feedforward_dim // 4 and 5*feedforward_dim // 4. 2023-01-02 00:24:12 +08:00
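(With feedforward_dim = 1536, for example, this gives 1152 and 1920, so the two feedforward modules together still account for 2 * feedforward_dim = 3072 dimensions.)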
Daniel Povey
3a5b3f640d Remove eps from BasicNorm and reintroduce bias 2023-01-02 00:02:31 +08:00
Daniel Povey
a2227a07fc Revert some changes to Balancer. 2023-01-01 23:02:52 +08:00
Daniel Povey
e52bfb7219 Revert final conv_skip_rate from 0.01 to 0.0 2023-01-01 22:13:13 +08:00
Daniel Povey
460fb945ec Merge branch 'scaled_adam_exp813' into scaled_adam_exp820 2023-01-01 22:12:10 +08:00
Daniel Povey
977d412690 Cosmetic fix 2023-01-01 21:43:14 +08:00
Daniel Povey
dadeb3feec Fixes for jit scripting and cosmetic improvements 2023-01-01 14:35:51 +08:00
Daniel Povey
60d491eee6 Bug fix 2023-01-01 14:31:28 +08:00
Daniel Povey
287bd120be Reduce min_abs of zipformer balancer1; constraints on eps of Conv2dSubsampling.out_norm 2023-01-01 14:28:18 +08:00
Daniel Povey
1797d0ec6d Fix bugs in how the max_rms/min_rms constraints were applied, which had the effect of making min_rms dominate over mean. 2023-01-01 13:05:41 +08:00
Daniel Povey
8db0636f1d Fix to Balancer to treat max-rms and min-rms losses separately, only max-rms loss scaled up 2023-01-01 00:38:07 +08:00
Daniel Povey
907d28ca2a Make RMS loss dominate mean loss in Balancer if both are active; remove the 4x scale introduced in 814. 2023-01-01 00:09:14 +08:00
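The three Balancer commits above separate two kinds of soft constraint on a module's output: one on the per-channel mean and one on the per-channel RMS (min_rms/max_rms). As a rough illustration of the quantities involved; this is a stand-in, since icefall's actual Balancer enforces the constraints through a backward-pass gradient hook rather than an explicit loss:

    import torch

    def balancer_penalties(x: torch.Tensor,
                           min_rms: float, max_rms: float,
                           min_mean: float, max_mean: float):
        # x: (num_frames, num_channels)
        rms = (x ** 2).mean(dim=0).sqrt()   # per-channel RMS
        mean = x.mean(dim=0)                # per-channel mean
        # min-rms and max-rms violations kept separate, so that only the
        # max-rms side can be scaled up (commit 8db0636f1d).
        min_rms_loss = (min_rms - rms).clamp(min=0.0).sum()
        max_rms_loss = (rms - max_rms).clamp(min=0.0).sum()
        mean_loss = ((min_mean - mean).clamp(min=0.0)
                     + (mean - max_mean).clamp(min=0.0)).sum()
        return min_rms_loss, max_rms_loss, mean_loss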
Daniel Povey
a2815ea0df Increase max_abs of ZipformerEncoderLayer.balancer2 from 1.0 to 4.0. 2023-01-01 00:00:26 +08:00
Daniel Povey
63472a19b1 Whitespace fix 2022-12-31 23:50:09 +08:00
Daniel Povey
008dbaf745 Use 4 times the normal grad_scale for BasicNorm if max_rms violated. 2022-12-31 23:42:38 +08:00
Daniel Povey
4a4d12c994 Revert kernel size of convnext2 from 5x5 to 7x7 2022-12-31 21:52:11 +08:00
Daniel Povey
d0ae60400e Decrease convnext1 kernel size from 7x7 to 5x5 2022-12-31 17:19:02 +08:00
Daniel Povey
d48b2ccb45 Reduce kernel size of convnext2 from 7 to 5. 2022-12-31 17:10:31 +08:00
Daniel Povey
c533c30442 Increase final conv_skip_rate from 0.0 to 0.01 2022-12-31 15:10:52 +08:00
Daniel Povey
577c3ad390 Adjust balancers of modules; the most significant change is moving min_abs of the ff2 balancer from 0.5 to 0.1 2022-12-31 14:38:00 +08:00
Daniel Povey
a0c35adca0 Merge branch 'scaled_adam_exp800' into scaled_adam_exp807 2022-12-31 01:23:57 +08:00
Daniel Povey
c15578d0bb Add balancer_ff2 to avoid too small ff2 module 2022-12-31 01:09:17 +08:00
Daniel Povey
9ee4472f36 Decrease min_abs at end of feedforward modules from 0.5 to 0.1. 2022-12-30 23:29:03 +08:00
Daniel Povey
8952b69d42 Reduce BasicNorm.eps default from 0.25 to 0.1 2022-12-30 16:28:44 +08:00
Daniel Povey
d604284f16 Change initial log_scale back to 1.0 and initial eps to 0.1 2022-12-30 15:48:12 +08:00
Daniel Povey
c4101c7873 Change initial log_scale from 2 to 0 (was 1.0 in the previous experiment) 2022-12-30 15:31:38 +08:00
Daniel Povey
851912c581 Remove bias from BasicNorm, add an eps instead. 2022-12-30 15:13:00 +08:00
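The run of BasicNorm commits above (and 71d7843654 below) all iterate on the same small module: a simplified LayerNorm that only divides by a root-mean-square statistic, with no per-channel scale. A sketch of the eps-based variant that 851912c581 describes, assuming eps is learned in log space so it stays positive; whether there is additionally a bias or a log_scale is precisely what these experiments vary:

    import torch

    class BasicNorm(torch.nn.Module):
        def __init__(self, eps: float = 0.25):
            super().__init__()
            # learn eps in log space so it remains positive
            self.eps = torch.nn.Parameter(torch.tensor(float(eps)).log())

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # divide each frame by sqrt(mean(x^2) + eps) over the channel dim
            scales = (x.pow(2).mean(dim=-1, keepdim=True) + self.eps.exp()) ** -0.5
            return x * scales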
Daniel Povey
da0623aa7f Add another balancer to ZipformerEncoderLayer, prior to output. 2022-12-30 14:35:49 +08:00
Daniel Povey
0c3530a6fd Merge branch 'scaled_adam_exp795' into scaled_adam_exp798 2022-12-30 14:29:48 +08:00
Daniel Povey
8056e0f9af Make sure param_rms limit is effectively applied; fix tests in optim.py 2022-12-29 23:55:16 +08:00
Daniel Povey
e164393e91 Increase default grad_scale of Balancer from 0.02 to 0.04. 2022-12-29 21:38:19 +08:00
Daniel Povey
59be36181c Replace ActivationBalancer with Balancer 2022-12-29 20:34:46 +08:00
Daniel Povey
c6bad1ee4f Start ff modules with larger initial_scale 2022-12-29 18:50:12 +08:00
Daniel Povey
fbdb12cf77 Remove ZipformerEncoder.norm 2022-12-29 16:00:34 +08:00
Daniel Povey
0de1184c6d Fix min_abs for AttentionSqueeze 2022-12-29 15:24:13 +08:00
Daniel Povey
a8282bb6d7 Adjust joiner and simple_lm/simple_am projections to account for larger activation dims 2022-12-29 12:52:11 +08:00
Daniel Povey
03e1f7dc01 Multiply min_abs values in the line of encoder residuals by 4. 2022-12-29 12:49:04 +08:00
Daniel Povey
71d7843654 Re-introduce bias into BasicNorm and replace eps with log_scale. 2022-12-26 21:22:00 +08:00
Daniel Povey
920ed685ac Change how bypass_scale works: src = src * bypass_scale + src_orig * (1.0 - bypass_scale) 2022-12-26 14:27:16 +08:00
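The formula in this last commit is a per-layer convex combination of the layer's output (src) with its input (src_orig). A one-line sketch; in the real code bypass_scale is a learned parameter rather than the plain float used here:

    import torch

    def bypass(src: torch.Tensor, src_orig: torch.Tensor,
               bypass_scale: float) -> torch.Tensor:
        # bypass_scale = 1.0 keeps the layer output; 0.0 skips the layer.
        return src * bypass_scale + src_orig * (1.0 - bypass_scale)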