Daniel Povey | 20e6d2a157 | 2022-11-09 12:56:07 +08:00 | Rework zipformer code for clarity and extensibility
Daniel Povey | 797a0e6ce7 | 2022-11-08 20:00:25 +08:00 | Change order of convolution and nonlin-attention modules
Daniel Povey | 36bff9b369 | 2022-11-07 12:33:12 +08:00 | Fix to comment
Daniel Povey | 47f42ef5db | 2022-11-05 14:19:43 +08:00 | Replace the 1st of the ConvolutionModules with NonlinAttentionModule
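The NonlinAttentionModule named in the commits above is not shown in this log. The sketch below illustrates only the general idea it suggests: a gated pointwise nonlinearity whose output is mixed across time by attention weights already computed by a preceding self-attention module. Every class name, dimension, and design choice here is an assumption, not the icefall implementation.

```python
import torch
import torch.nn as nn

class NonlinAttentionSketch(nn.Module):
    """Hypothetical sketch: gate a linear projection with tanh, then
    mix the gated values across time using attention weights that a
    preceding self-attention module already computed."""
    def __init__(self, embed_dim: int, hidden_dim: int):
        super().__init__()
        self.in_proj = nn.Linear(embed_dim, 2 * hidden_dim)
        self.out_proj = nn.Linear(hidden_dim, embed_dim)

    def forward(self, x: torch.Tensor, attn_weights: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, embed_dim); attn_weights: (batch, seq, seq)
        s, v = self.in_proj(x).chunk(2, dim=-1)
        v = torch.tanh(s) * v            # elementwise gating nonlinearity
        v = torch.bmm(attn_weights, v)   # reuse attention weights to mix over time
        return self.out_proj(v)

x = torch.randn(2, 5, 16)
w = torch.softmax(torch.randn(2, 5, 5), dim=-1)  # stand-in attention weights
out = NonlinAttentionSketch(16, 32)(x, w)
```

The output has the same shape as the input, so the module can replace a ConvolutionModule inside a residual layer, as the commit describes.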
Daniel Povey | eb6e2b5a1d | 2022-11-04 17:40:51 +08:00 | Have 2 squeeze-excite modules per layer, using different attention heads
Daniel Povey | efbe20694f | 2022-11-04 16:01:07 +08:00 | Use the attention weights as input for the ModifiedSEModule
Daniel Povey | 0d94783e76 | 2022-11-04 15:16:59 +08:00 | Instead of a pooling operation, use the first bottleneck_dim dimensions of the preceding self_attn.forward2 as the input to the squeeze-excite module
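The squeeze-excite commits above experiment with what feeds the "squeeze" step (plain pooling vs. features taken from the preceding self-attention). As a baseline for comparison, here is a minimal squeeze-excite module for (batch, time, channels) input using plain mean-pooling over time; the `bottleneck_dim` default echoes the 16/8 values in the log, but everything else is an assumption rather than the actual ModifiedSEModule.

```python
import torch
import torch.nn as nn

class SEModuleSketch(nn.Module):
    """Hypothetical squeeze-excite for (batch, time, channels) input:
    pool over time, squeeze to a small bottleneck, then produce
    per-channel scales that rescale the input."""
    def __init__(self, channels: int, bottleneck_dim: int = 16):
        super().__init__()
        self.squeeze = nn.Linear(channels, bottleneck_dim)
        self.excite = nn.Linear(bottleneck_dim, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        s = x.mean(dim=1)                  # squeeze: mean-pool over time
        s = torch.sigmoid(self.excite(torch.relu(self.squeeze(s))))
        return x * s.unsqueeze(1)          # excite: per-channel rescale

x = torch.randn(2, 10, 32)
y = SEModuleSketch(32)(x)
```

The commits replace the `x.mean(dim=1)` step with attention-derived features; the excite/rescale side stays conceptually the same.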
Daniel Povey | c27ee8cfcf | 2022-11-04 15:06:23 +08:00 | Merge branch 'scaled_adam_exp277' into scaled_adam_exp281
Daniel Povey | 67d470766f | 2022-11-04 15:02:56 +08:00 | Revert bottleneck_dim from 8 to 16
Daniel Povey | cefcd061bd | 2022-11-04 14:50:00 +08:00 | Merge branch 'scaled_adam_exp271' into scaled_adam_exp274
Daniel Povey | 31d9bbfb3c | 2022-11-04 14:42:00 +08:00 | Merge branch 'scaled_adam_exp268b' into scaled_adam_exp279
Daniel Povey | 70300e34d3 | 2022-11-04 12:41:02 +08:00 | Merge branch 'scaled_adam_exp273' into scaled_adam_exp277 (conflicts: egs/librispeech/ASR/pruned_transducer_stateless7/zipformer.py)
Daniel Povey | f625810de1 | 2022-11-03 19:21:37 +08:00 | Use the balancer; remove the unused sigmoid module
Daniel Povey | a9c384e69e | 2022-11-03 19:04:34 +08:00 | Add Whiten module after squeeze_proj
Daniel Povey | 11cb30bf49 | 2022-11-03 17:30:36 +08:00 | Reduce bottleneck dim for SEModule from 16 to 8
Daniel Povey | 824c3afd7d | 2022-11-03 14:35:00 +08:00 | Merge branch 'scaled_adam_exp265b' into scaled_adam_exp268b
Daniel Povey | eb915f170c | 2022-11-03 14:32:40 +08:00 | Merge branch 'scaled_adam_exp265b' into scaled_adam_exp266b
Daniel Povey | 97a1dd40cf | 2022-11-03 13:46:14 +08:00 | Change initialization value of weight in SimpleCombine from 0.0 to 0.1; ignore infinities in MetricsTracker
Daniel Povey | 44bdda1218 | 2022-11-03 13:13:41 +08:00 | Move ModifiedSEModule to end of ZipformerEncoderLayer
Daniel Povey | a2dbce2a9a | 2022-11-03 13:02:54 +08:00 | Add Whiten module, with whitening_limit=10.0, at output of ModifiedSEModule
Daniel Povey | a27670d097 | 2022-11-03 12:41:19 +08:00 | Restore feedforward3 module
Daniel Povey | 0379ab57a2 | 2022-11-03 12:20:29 +08:00 | Make weight in SimpleCombine a vector
Daniel Povey | e08f5c1bce | 2022-11-01 14:38:06 +08:00 | Replace Pooling module with ModifiedSEModule
Daniel Povey | 4da4a3a5df | 2022-10-31 19:37:53 +08:00 | Merge branch 'scaled_adam_exp236' into scaled_adam_exp242
Daniel Povey | b7876baed6 | 2022-10-31 19:22:01 +08:00 | Remove dynamic weights in SimpleCombine
Daniel Povey | b091ae5482 | 2022-10-31 17:11:21 +08:00 | Add bias in weight module
Daniel Povey | 5e51534fbc | 2022-10-31 17:02:21 +08:00 | Introduce minimum probs in the SimpleCombiner
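Several commits in the log adjust a SimpleCombine/SimpleCombiner module: its weight becomes a per-channel vector, gains a bias, is initialized to 0.1, and has its mixing probabilities floored. The actual icefall code is not reproduced here; this is a hypothetical sketch of such a combiner under those constraints, with the sigmoid parameterization and all defaults assumed.

```python
import torch
import torch.nn as nn

class SimpleCombinerSketch(nn.Module):
    """Hypothetical combiner of two branches: a learned per-channel
    weight vector (per the log) interpolates between them, and the
    mixing probability is clamped away from 0 and 1 by min_prob."""
    def __init__(self, dim: int, min_prob: float = 0.1,
                 initial_weight: float = 0.1):
        super().__init__()
        self.weight = nn.Parameter(torch.full((dim,), initial_weight))
        self.min_prob = min_prob

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        p = torch.sigmoid(self.weight)
        # Keep p inside [min_prob, 1 - min_prob] so neither branch dies.
        p = self.min_prob + (1.0 - 2.0 * self.min_prob) * p
        return a * (1.0 - p) + b * p

combiner = SimpleCombinerSketch(8)
y = combiner(torch.zeros(2, 3, 8), torch.ones(2, 3, 8))
```

With `a = 0` and `b = 1` the output equals the mixing probability itself, which by construction stays within the floored range.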
Daniel Povey | 12f17f550e | 2022-10-31 16:18:52 +08:00 | Introduce dropout rate to dynamic submodules of conformer
Daniel Povey | 3de8a5aef2 | 2022-10-31 15:50:46 +08:00 | Bug fix
Daniel Povey | 5fda800b6d | 2022-10-31 15:49:18 +08:00 | Implement pooling module, add it after initial feedforward
Daniel Povey | 730e6c8914 | 2022-10-31 13:47:26 +08:00 | Change schedule after initial loss not promising
Daniel Povey | b8db0f53f1 | 2022-10-31 13:11:59 +08:00 | Change schedule of bypass_scale min: make it larger and decrease it more slowly
Daniel Povey | efbb1d25c7 | 2022-10-30 14:59:49 +08:00 | Restore the changes from scaled_adam_219 and scaled_adam_exp220 (accidentally lost) re: layer skipping
Daniel Povey | e4a22bbe96 | 2022-10-30 14:43:02 +08:00 | Reduce initial clamp_min for bypass_scale from 1.0 to 0.5
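The bypass_scale commits describe a clamp floor that starts at 1.0 and is later relaxed (1.0 -> 0.5 here; 0.5 -> 0.3 further down the log). The log does not give the schedule's shape or length, so the linear decay and warmup constant below are assumptions; only the 1.0 and 0.5 endpoints come from the commits.

```python
def clamp_min_schedule(batch_count: float,
                       initial: float = 1.0,
                       final: float = 0.5,
                       warmup_batches: float = 4000.0) -> float:
    """Linearly decay the minimum allowed bypass_scale from `initial`
    to `final` over `warmup_batches` batches, then hold it at `final`.
    The linear shape and warmup length are assumptions."""
    frac = min(batch_count / warmup_batches, 1.0)
    return initial + frac * (final - initial)

# The floor would be applied to a learned scale roughly like:
#   effective_scale = max(learned_bypass_scale, clamp_min_schedule(count))
# so early in training the layer behaves close to a fixed combination,
# and later the learned scale is given more freedom.
```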
Daniel Povey | e9c69d8477 | 2022-10-30 14:41:18 +08:00 | Add warmup schedule for zipformer encoder layer, from 1.0 -> 0.2
Daniel Povey | 8b0722e626 | 2022-10-30 14:17:41 +08:00 | Rework how warmup count is produced; should not affect results
Daniel Povey | 6b6143f28c | 2022-10-30 13:17:29 +08:00 | Merge branch 'scaled_adam_exp218' into scaled_adam_exp221
Daniel Povey | a3561c8dcd | 2022-10-29 21:01:00 +08:00 | Have warmup schedule for layer-skipping
Daniel Povey | 072776b2a1 | 2022-10-29 20:11:39 +08:00 | Apply layer-skip dropout prob
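The layer-skipping commits above combine a skip (stochastic-depth style) dropout probability with a warmup schedule, and a later commit notes that skipping is not applied in test mode. A hypothetical version is sketched below; the direction of the schedule and every constant are assumptions, not values from the log.

```python
import random

def layer_skip_prob(batch_count: float,
                    initial_prob: float = 0.5,
                    final_prob: float = 0.05,
                    warmup_batches: float = 2000.0) -> float:
    """Hypothetical warmup for layer-skip dropout: skip layers often
    early in training, decaying linearly to a small final rate."""
    frac = min(batch_count / warmup_batches, 1.0)
    return initial_prob + frac * (final_prob - initial_prob)

def run_layer(layer, x, batch_count: float, training: bool = True):
    """In eval mode every layer runs; in training a layer is sometimes
    skipped entirely, passing its input through unchanged."""
    if training and random.random() < layer_skip_prob(batch_count):
        return x
    return layer(x)
```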
Daniel Povey | 9a7979d7b8 | 2022-10-29 20:03:41 +08:00 | Avoid falling off the loop for weird inputs
Daniel Povey | 05689f6354 | 2022-10-29 19:57:12 +08:00 | Add skip connections as in normal U-net
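"Skip connections as in normal U-net" presumably means saving the outputs of the early (high-resolution) encoder stacks and adding them back at the inputs of the mirrored later stacks. A toy sketch of that wiring, with `nn.Linear` standing in for whole encoder stacks (the real stacks, and how many there are, vary across the commits below):

```python
import torch
import torch.nn as nn

class UNetStyleEncoder(nn.Module):
    """Toy U-net style wiring across a sequence of encoder stacks:
    the first half's outputs are saved and added back to the inputs
    of the mirrored stacks in the second half."""
    def __init__(self, dim: int, num_stacks: int = 4):
        super().__init__()
        self.stacks = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_stacks))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        saved = []
        n = len(self.stacks)
        for i, stack in enumerate(self.stacks):
            if i >= n // 2:
                x = x + saved[n - 1 - i]   # skip from the mirrored early stack
            x = stack(x)
            if i < n // 2:
                saved.append(x)
        return x

y = UNetStyleEncoder(16)(torch.randn(2, 7, 16))
```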
Daniel Povey | bba454a349 | 2022-10-29 17:08:19 +08:00 | Make decoder group size equal to 4
Daniel Povey | ff03ec88a5 | 2022-10-29 15:56:02 +08:00 | Tuning change to num encoder layers, inspired by relative param importance
Daniel Povey | f995426324 | 2022-10-29 15:40:47 +08:00 | Reduce min of bypass_scale from 0.5 to 0.3, and make it not applied in test mode
Daniel Povey | 435d0dec71 | 2022-10-29 15:31:34 +08:00 | Reduce dim of linear positional encoding in attention layers
Daniel Povey | 96ea4cf1be | 2022-10-28 20:36:45 +08:00 | Have 6 different encoder stacks, U-shaped network
Daniel Povey | 7b57a34227 | 2022-10-28 19:32:14 +08:00 | Have 4 encoder stacks
Daniel Povey | de9a6ebd6c | 2022-10-28 19:26:06 +08:00 | Fix bug re: seq lengths
Daniel Povey | d7d5188bd9 | 2022-10-28 19:20:21 +08:00 | Refactor how the downsampling is done so that it happens later, but the 1st encoder stack still operates after a subsampling of 2
Daniel Povey | 0a89f51dc9 | 2022-10-28 17:42:00 +08:00 | Have a 3rd encoder, at downsampling factor of 8
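The earliest commits in this log build a U-shaped stack of encoders operating at increasing downsampling factors (2, 4, 8). A toy illustration of dropping to a reduced frame rate and returning to the original length is shown below; the real code downsamples and upsamples differently (the log itself records a refactor of where downsampling happens), so treat this purely as a picture of the frame-rate arithmetic.

```python
import torch

def downsample(x: torch.Tensor, factor: int) -> torch.Tensor:
    """Toy downsampling: keep every `factor`-th frame of a
    (batch, time, channels) tensor."""
    return x[:, ::factor, :]

def upsample(x: torch.Tensor, factor: int, target_len: int) -> torch.Tensor:
    """Toy upsampling: repeat each frame `factor` times, then trim
    back to the original sequence length."""
    return x.repeat_interleave(factor, dim=1)[:, :target_len, :]

# Encoders at downsampling factors 2, 4, and 8 relative to the input
# each see a shorter sequence, then return to the input frame rate:
x = torch.randn(2, 100, 16)
for factor in (2, 4, 8):
    y = downsample(x, factor)
    z = upsample(y, factor, x.shape[1])
    assert z.shape == x.shape
```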