228 Commits

Author SHA1 Message Date
Daniel Povey
30c6e5b929 Make attention_squeeze use full dim. 2022-12-10 00:08:38 +08:00
Daniel Povey
0fc646f281 Merge branch 'scaled_adam_exp663' into scaled_adam_exp665 2022-12-10 00:07:37 +08:00
Daniel Povey
d35eb7a3a6 Add cosmetic/diagnostics changes from scaled_adam_exp656. 2022-12-09 22:02:42 +08:00
Daniel Povey
958d9b929d Double limit of penalize_abs_values_gt in AttentionDownsample from 10 to 20. 2022-12-09 21:00:24 +08:00
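(`penalize_abs_values_gt` lives in icefall's scaling.py; it is an identity in the forward pass that adds an extra gradient term for elements whose absolute value exceeds a limit. The sketch below is a simplified reconstruction of that idea, not the exact implementation; the limit of 20 matches this commit.)

```python
import torch

class PenalizeAbsValuesGt(torch.autograd.Function):
    """Identity in the forward pass; in the backward pass, adds a small
    extra gradient pushing elements whose |value| exceeds `limit` back
    toward the allowed range. A simplified sketch of the idea behind
    icefall's penalize_abs_values_gt; the real code differs in detail."""

    @staticmethod
    def forward(ctx, x: torch.Tensor, limit: float, penalty: float):
        ctx.save_for_backward(x)
        ctx.limit = limit
        ctx.penalty = penalty
        return x

    @staticmethod
    def backward(ctx, grad_output: torch.Tensor):
        (x,) = ctx.saved_tensors
        # Penalty gradient: sign(x) * penalty wherever |x| > limit, else 0.
        mask = (x.abs() > ctx.limit).to(x.dtype)
        extra_grad = mask * x.sign() * ctx.penalty
        return grad_output + extra_grad, None, None


def penalize_abs_values_gt(x, limit=20.0, penalty=1.0e-04):
    return PenalizeAbsValuesGt.apply(x, limit, penalty)
```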
Daniel Povey
75a1e05e49 Introduce nonlin_skip_rate 2022-12-08 20:35:38 +08:00
Daniel Povey
1718b2de44 Merge branch 'scaled_adam_exp647' into scaled_adam_exp652
# Conflicts:
#	egs/librispeech/ASR/pruned_transducer_stateless7/zipformer.py
2022-12-08 20:35:02 +08:00
Daniel Povey
dea177fa63 Adjust min_positive and max_positive of NonlinAttentionModule 2022-12-08 20:31:37 +08:00
Daniel Povey
e77a12602c Merge branch 'scaled_adam_exp639' into scaled_adam_exp652 2022-12-08 20:11:28 +08:00
Daniel Povey
d29e3d89e5 Use half the dimension in AttentionSqueeze. 2022-12-08 20:07:35 +08:00
Daniel Povey
f4f3d057e7 Cosmetic improvements to convolution module; enable more stats. 2022-12-08 18:27:01 +08:00
Daniel Povey
3f82ee0783 Merge dropout schedule, 0.3 ... 0.1 over 20k batches 2022-12-08 18:18:46 +08:00
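(The dropout schedule referenced here, 0.3 decaying to 0.1 over the first 20k batches, is presumably piecewise linear in the batch index. A minimal sketch, with hypothetical names:)

```python
def dropout_schedule(batch_idx: int,
                     start: float = 0.3,
                     end: float = 0.1,
                     num_batches: int = 20_000) -> float:
    """Linearly interpolate the dropout rate from `start` to `end`
    over the first `num_batches` batches, then hold it at `end`."""
    if batch_idx >= num_batches:
        return end
    frac = batch_idx / num_batches
    return start + frac * (end - start)

# e.g. dropout_schedule(0) == 0.3, dropout_schedule(10_000) == 0.2,
#      dropout_schedule(20_000) == 0.1
```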
Daniel Povey
5f5d02ed0c Add another whitening module, move balancer to output. 2022-12-07 18:07:56 +08:00
Daniel Povey
8859177bfa Move balancer of NonlinAttentionModule to output; add an extra whitener at output. 2022-12-07 17:49:53 +08:00
Daniel Povey
51d8e64d66 Reduce min_abs on NonlinAttentionModule balancer2 from 0.5 to the default. 2022-12-06 15:32:56 +08:00
Daniel Povey
8f841e5b2b Add another balancer for NonlinAttentionModule. 2022-12-06 11:11:28 +08:00
Daniel Povey
22617da725 Make dropout a schedule starting at 0.3. 2022-12-05 23:39:24 +08:00
Daniel Povey
178eca1c0e Revert scaling, scale only grad. 2022-12-05 17:53:23 +08:00
Daniel Povey
b93cf0676a Initialize Conv2dSubsampling with scale. 2022-12-05 17:31:56 +08:00
Daniel Povey
7999dd0dbe Introduce scalar multiplication and change rules for updating gradient scale. 2022-12-05 16:15:20 +08:00
Daniel Povey
7b1f093077 Use Swoosh-R in the Conv and Swoosh-L in the feedforward. 2022-12-04 19:18:16 +08:00
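(For reference, the two Swoosh variants as eventually documented for Zipformer are shown below; the constants were still being tuned around these commits, so the exact values here may postdate this point in the history.)

```python
import torch
import torch.nn.functional as F

def swoosh_l(x: torch.Tensor) -> torch.Tensor:
    """Swoosh-L: log(1 + exp(x - 4)) - 0.08*x - 0.035.
    Its left zero-crossing is near x = 0; used in the feedforward
    modules."""
    return F.softplus(x - 4.0) - 0.08 * x - 0.035

def swoosh_r(x: torch.Tensor) -> torch.Tensor:
    """Swoosh-R: log(1 + exp(x - 1)) - 0.08*x - 0.313261687.
    The offset is chosen so the function passes through zero at
    x = 0; used in the convolution module."""
    return F.softplus(x - 1.0) - 0.08 * x - 0.313261687
```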
Daniel Povey
d214e1c352 Change min_abs,max_abs from 2,10 to 1,5 for FF module 2022-12-04 16:15:38 +08:00
Daniel Povey
67812276ed Change Swoosh formula so left crossing is near zero; change min_positive, max_positive of ActivationBalancer. 2022-12-03 15:10:03 +08:00
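(ActivationBalancer, also in icefall's scaling.py, is an identity in the forward pass; in the backward pass it nudges each channel's fraction of positive values into [min_positive, max_positive] and its mean absolute value into [min_abs, max_abs]. A heavily simplified sketch of that mechanism; the real module additionally randomizes and schedules the correction:)

```python
import torch

class BalancerFunction(torch.autograd.Function):
    """Identity in forward; in backward, adds a correction pushing each
    channel's fraction of positive values into [min_positive, max_positive]
    and its mean absolute value into [min_abs, max_abs]. A heavily
    simplified sketch of the idea behind icefall's ActivationBalancer."""

    @staticmethod
    def forward(ctx, x, channel_dim=-1, min_positive=0.05, max_positive=0.95,
                min_abs=0.2, max_abs=100.0, grad_scale=0.04):
        ctx.save_for_backward(x)
        ctx.cfg = (channel_dim, min_positive, max_positive,
                   min_abs, max_abs, grad_scale)
        return x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        dim, min_pos, max_pos, min_abs, max_abs, gscale = ctx.cfg
        reduce_dims = [d for d in range(x.ndim) if d != dim % x.ndim]
        frac_pos = (x > 0).to(x.dtype).mean(dim=reduce_dims, keepdim=True)
        mean_abs = x.abs().mean(dim=reduce_dims, keepdim=True)
        # +1 where a statistic is below its floor, -1 where above its ceiling.
        sign_fix = (frac_pos < min_pos).to(x.dtype) - (frac_pos > max_pos).to(x.dtype)
        abs_fix = (mean_abs < min_abs).to(x.dtype) - (mean_abs > max_abs).to(x.dtype)
        # Negative extra gradient moves x up (SGD steps against the gradient);
        # scale the correction relative to the existing gradient magnitude.
        extra = -gscale * grad_output.abs().mean() * (sign_fix + abs_fix * x.sign())
        return grad_output + extra, None, None, None, None, None, None

# Usage: y = BalancerFunction.apply(x)  # no-op except for the grad correction
```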
Daniel Povey
d5bfca4f49 Change activation of ConvolutionModule from Swoosh to Tanh, and min_abs from 1.0 to 0.75. 2022-12-03 14:50:09 +08:00
Daniel Povey
074f94f256 Increase ConvolutionModule min_abs from 0.4 to 1.0 2022-12-03 14:47:36 +08:00
Daniel Povey
306fd85bab Reduce initial whitening-schedule limit from 7.5 to 5.0 for NonlinAttentionModule. 2022-12-03 14:40:39 +08:00
Daniel Povey
3faa4ffa3f Revert "Change whitening limit of NonlinAttentionModule from _whitening_schedule(7.5), to 5.0"
This reverts commit dcf6fced40a831892e7bb4e52eddf26e6e070eef.
2022-12-03 00:19:48 +08:00
Daniel Povey
bd1b1dd7e3 Simplify formula for Swoosh and make it pass through 0; make max_abs of ConvolutionModule a constant. 2022-12-03 00:13:09 +08:00
Daniel Povey
862e5828c5 Set min_abs in balancer2 of ConvolutionModule to 0.4, and max_abs to 10.0 (this is a don't-care, really). 2022-12-02 21:02:06 +08:00
Daniel Povey
509467988f Reduce min_abs in balancer2 of conv_module from 1.0 to 0.2. 2022-12-02 20:29:23 +08:00
Daniel Povey
14267a5194 Use Swoosh not DoubleSwish in zipformer; fix constants in Swoosh 2022-12-02 16:58:31 +08:00
Daniel Povey
85bd9859e9 Use different heads for nonlin/squeeze on alternate layers 2022-12-02 13:17:45 +08:00
Daniel Povey
d8185201e9 Merge branch 'scaled_adam_exp569' into scaled_adam_exp585 2022-12-01 19:14:26 +08:00
Daniel Povey
4f9bb332fb Use fewer hidden channels in NonlinAttentionModule 2022-12-01 19:14:12 +08:00
Daniel Povey
d294449221 Fix typo 0.21->0.2 2022-12-01 15:29:46 +08:00
Daniel Povey
f0c46ce564 Double nonlin_skip_rate and have it last twice as long. 2022-12-01 15:28:44 +08:00
Daniel Povey
cac1a8b860 Merge branch 'scaled_adam_exp569' into scaled_adam_exp576 2022-12-01 15:20:20 +08:00
Daniel Povey
4621e924ba Introduce dropout schedule for NonlinAttentionModule 2022-12-01 15:19:51 +08:00
Daniel Povey
dcf6fced40 Change whitening limit of NonlinAttentionModule from _whitening_schedule(7.5), to 5.0 2022-12-01 14:28:56 +08:00
Daniel Povey
025bcc155d Change scale_min of Conv2dSubsampling from .01 to .1; some cosmetic changes/unimportant bugfixes. 2022-12-01 14:20:15 +08:00
Daniel Povey
ba31272c92 Change sigmoid to tanh in NonlinAttentionModule, and adjust abs limits of balancer to compensate. 2022-11-30 21:44:45 +08:00
Daniel Povey
c75c2dc91d Reduce min_abs of zipformer-encoder-layer balancer from 0.4 to 0.25. 2022-11-30 13:40:53 +08:00
Daniel Povey
d100aed58b Revert "Reduce min of scale in Conv2dSubsampling from 0.01 to 0.2"
This reverts commit 7589e3768975df10c3d022beb4c88f14c2f25d3d.
2022-11-30 13:17:20 +08:00
Daniel Povey
640d48262f Double scale on aux_loss 2022-11-29 16:20:21 +08:00
Daniel Povey
7589e37689 Reduce min of scale in Conv2dSubsampling from 0.01 to 0.2 2022-11-29 16:18:41 +08:00
Daniel Povey
441fcf063d Reduce final value of bypass_min from 0.25 to 0.2 2022-11-29 16:15:34 +08:00
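(bypass_min is the floor on the learned per-channel bypass scale that interpolates between a layer's input and its output. A hedged sketch of the mechanism, with hypothetical class and parameter names; in the actual code the floor itself follows a schedule, hence a "final value":)

```python
import torch
import torch.nn as nn

class Bypass(nn.Module):
    """Learned per-channel interpolation between a layer's input and
    output: y = x_orig + s * (x_new - x_orig), with s clamped to
    [bypass_min, bypass_max]. Hypothetical sketch; the real code
    limits the parameter value differently and schedules bypass_min
    during training, ending here at 0.2."""

    def __init__(self, num_channels: int,
                 bypass_min: float = 0.2, bypass_max: float = 1.0):
        super().__init__()
        self.scale = nn.Parameter(torch.full((num_channels,), 0.5))
        self.bypass_min = bypass_min
        self.bypass_max = bypass_max

    def forward(self, x_orig: torch.Tensor, x_new: torch.Tensor) -> torch.Tensor:
        s = self.scale.clamp(min=self.bypass_min, max=self.bypass_max)
        return x_orig + s * (x_new - x_orig)
```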
Daniel Povey
73e420865c Revert min_abs in NonlinAttentionModule from 2.0 to 1.5 2022-11-29 15:53:29 +08:00
Daniel Povey
f48786534a Merge branch 'scaled_adam_exp535' into scaled_adam_exp548 2022-11-29 15:43:44 +08:00
Daniel Povey
28c5923986 Remove max_factor=0.02 option in bottleneck balancer of class AttentionSqueeze, change its min,max positive to 0.2,0.8 2022-11-29 15:43:25 +08:00
Daniel Povey
5632782ee1 Merge branch 'scaled_adam_exp539' into scaled_adam_exp548 2022-11-29 15:40:23 +08:00
Daniel Povey
b90d8aabde Revert the alternate-layers-only thing for nonlin_attention and attention_squeeze 2022-11-29 15:38:55 +08:00