Daniel Povey | 678be7a2eb | Revert ConvNorm1d to BasicNorm in Conv2dSubsampling and ZipformerLayer to BasicNorm | 2022-12-21 23:53:13 +08:00
Daniel Povey | 0995970f29 | Decrease hidden_ratio of ConvNeXt from 4 to 3. | 2022-12-21 18:43:11 +08:00
Daniel Povey | 39e7c613c7 | Add balancer to ConvNeXt | 2022-12-21 18:41:05 +08:00
Daniel Povey | 4d61d39d36 | Merge branch 'scaled_adam_exp747' into scaled_adam_exp748 | 2022-12-20 23:23:49 +08:00
Daniel Povey | 3ef2a1d81e | Make some of the layer-skipping logic be done per sequence. | 2022-12-20 22:26:30 +08:00
Daniel Povey | 244633660d | Implement ConvNorm2d and use it in frontend after convnext | 2022-12-20 20:28:03 +08:00
Daniel Povey | 71880409cc | Bug fix; also make the final norm of Conv2dSubsampling a ConvNorm1d | 2022-12-20 19:44:04 +08:00
Daniel Povey | 494139d27a | Replace BasicNorm of encoder layers with ConvNorm1d | 2022-12-20 19:15:14 +08:00
Daniel Povey | f59697555f | Add BasicNorm on output of Conv2dSubsampling module | 2022-12-20 15:00:01 +08:00
Daniel Povey | 5fa8de5c05 | Implement layerdrop per-sequence for convnext; lower, slower-decreasing layerdrop rate. | 2022-12-20 13:51:08 +08:00
Daniel Povey | 5c11e92d4a | Adjust warmup duration of layerdrop_prob | 2022-12-20 00:12:10 +08:00
Daniel Povey | 473bb338d6 | Merge branch 'scaled_adam_exp734' into scaled_adam_exp738 | 2022-12-20 00:10:19 +08:00
Daniel Povey | 2cc5bc18be | Merge branch 'scaled_adam_exp731' into scaled_adam_exp737 | 2022-12-20 00:04:49 +08:00
    Conflicts: egs/librispeech/ASR/pruned_transducer_stateless7/zipformer.py
Daniel Povey | 6277a5ab4b | Merge branch 'scaled_adam_exp725' into scaled_adam_exp736 | 2022-12-20 00:01:38 +08:00
    Conflicts: egs/librispeech/ASR/pruned_transducer_stateless7/scaling.py
               egs/librispeech/ASR/pruned_transducer_stateless7/zipformer.py
Daniel Povey | c7d15dacc6 | Revert "Increase layerdrop_prob of ConvNeXt, and make it warm up faster." | 2022-12-18 14:31:59 +08:00
    This reverts commit 111a0aa3c73618eee7986291780383f166aac85d.
Daniel Povey | 5e1bf8b8ec | Add BasicNorm to ConvNeXt; increase prob given to CutoffEstimator; adjust default probs of ActivationBalancer. | 2022-12-18 14:14:15 +08:00
Daniel Povey | 0341ff1ec5 | One more convnext layer, two fewer conformer layers. | 2022-12-17 22:00:58 +08:00
Daniel Povey | a424a73881 | Increase ratio of convnext from 3 to 4. | 2022-12-17 21:58:59 +08:00
Daniel Povey | 9a72567b7f | Restore two nonlinearities. | 2022-12-17 21:57:21 +08:00
Daniel Povey | 598f52cbac | Remove 2 ConvNeXt layers. | 2022-12-17 21:53:46 +08:00
Daniel Povey | 111a0aa3c7 | Increase layerdrop_prob of ConvNeXt, and make it warm up faster. | 2022-12-17 16:44:57 +08:00
Daniel Povey | 96daf7a00f | Bug fix; remove BasicNorm; add one more ConvNeXt layer. | 2022-12-17 16:11:54 +08:00
Daniel Povey | 744dca1c9b | Merge branch 'scaled_adam_exp724' into scaled_adam_exp726 | 2022-12-17 15:46:57 +08:00
Daniel Povey | 86bb0623e9 | Remove query from AttentionDownsample, rename to SimpleDownsample | 2022-12-17 13:45:30 +08:00
Daniel Povey | ed7e01448c | Remove query in AttentionDownsample, rename to SimpleDownsample. | 2022-12-17 13:44:08 +08:00
Daniel Povey | 35b63c1387 | Revert "Reduce const_attention_rate" | 2022-12-17 13:27:17 +08:00
    This reverts commit bc002a9eda8ac912cff235460dcdef2fd51b2f19.
Daniel Povey | 4eb3e97848 | Remove bias from SimpleUpsample, add one to AttentionDownsample | 2022-12-16 17:59:15 +08:00
Daniel Povey | bc002a9eda | Reduce const_attention_rate | 2022-12-16 16:30:49 +08:00
Daniel Povey | 66465c8be4 | Give attention_skip_rate a longer tail | 2022-12-16 15:12:04 +08:00
Daniel Povey | 56ac7354df | Remove LinearWithAuxLoss; simplify schedule of prob in ActivationBalancer. | 2022-12-16 15:07:42 +08:00
Daniel Povey | 3213c18a22 | Changes to schedules: _whitening_schedule longer, min_abs schedule on attention_squeeze+nonlin_attention shorter; dip in conv_skip_rate. | 2022-12-16 14:58:15 +08:00
Daniel Povey | e84f525840 | Fix test condition | 2022-12-16 12:24:54 +08:00
Daniel Povey | 53ab18a862 | Ditch caching_eval; reduce params more. | 2022-12-16 00:22:44 +08:00
Daniel Povey | 083e5474c4 | Reduce ConvNeXt parameters. | 2022-12-16 00:21:04 +08:00
Daniel Povey | d26ee2bf81 | Try to implement caching evaluation for memory efficient training | 2022-12-15 23:06:40 +08:00
Daniel Povey | 076b18db60 | Implement Nextformer-style frontend | 2022-12-15 21:48:32 +08:00
Daniel Povey | 864ff96322 | Remove nonlin_skip_rate, introduce conv_skip_rate. | 2022-12-15 19:27:29 +08:00
Daniel Povey | 1506b83c7b | Change nonlin_skip_rate to be conv_skip_rate. | 2022-12-15 19:25:21 +08:00
Daniel Povey | 37a8c30136 | Merge branch 'scaled_adam_exp699' into scaled_adam_exp711 | 2022-12-15 00:24:56 +08:00
Daniel Povey | 25834453db | Merge branch 'scaled_adam_exp698' into scaled_adam_exp710 | 2022-12-15 00:21:31 +08:00
    Conflicts: egs/librispeech/ASR/pruned_transducer_stateless7/zipformer.py
Daniel Povey | 9e79b296f2 | Merge branch 'scaled_adam_exp708' into scaled_adam_exp709 | 2022-12-14 22:56:09 +08:00
Daniel Povey | aac9bebc62 | Bug fix | 2022-12-14 22:54:59 +08:00
Daniel Povey | 9bc326a9b6 | Merge branch 'scaled_adam_exp705' into scaled_adam_exp709 | 2022-12-14 21:41:50 +08:00
Daniel Povey | 159f37ddeb | Merge branch 'scaled_adam_exp700' into scaled_adam_exp709 | 2022-12-14 21:41:43 +08:00
Daniel Povey | cec2162a17 | Merge branch 'scaled_adam_exp703' into scaled_adam_exp709 | 2022-12-14 21:41:32 +08:00
Daniel Povey | 87df9f3215 | Simplify schedules of output balancers for nonlin_attention_module and attention_squeeze. | 2022-12-14 21:37:32 +08:00
Daniel Povey | 930f1b8948 | Reduce conv_module balancer2 min_abs from 0.75 to 0.5. | 2022-12-13 23:01:49 +08:00
Daniel Povey | 48445f22e4 | Increase ratio from 2.0 to 3.0 on 2 whitening schedules | 2022-12-13 22:50:21 +08:00
Daniel Povey | 157f4074a2 | Halve min_positive schedule of ConvolutionModule. | 2022-12-13 21:41:15 +08:00
Daniel Povey | 57040e382a | Set all aux-loss probs to zero. | 2022-12-13 19:25:08 +08:00