From 006d793fce7958e6bcb7d30d8c3a8c56079fc754 Mon Sep 17 00:00:00 2001
From: dohe0342
Date: Fri, 23 Dec 2022 14:03:42 +0900
Subject: [PATCH] from local

---
 .../ASR/.distillation_with_hubert.sh.swp | Bin 24576 -> 24576 bytes
 .../ASR/distillation_with_hubert.sh      |   2 +-
 2 files changed, 1 insertion(+), 1 deletion(-)

diff --git a/egs/librispeech/ASR/.distillation_with_hubert.sh.swp b/egs/librispeech/ASR/.distillation_with_hubert.sh.swp
index f18e25dd2d339533715b01dcd55ee2e0adb2cf75..71895ea438d39dc23cd5d4bd0441e3b0ca991a0e 100644
GIT binary patch
delta 260
zcmWN_F-t;W0EOZ6f%@syqQy~AK`jVz3KSu5bBF}NK@>%YrhY(!cK?6`U81q6DNy9D
z4bHkmAP7PJ1C>Ch2Hyh@9A0;3+?k2@3XSSbvY0Q3ok_1(@BQX9zP%0Yf)wjHwaR1q
zR7yd3JCWMRB7+3}W6}yPh8Q4$LsVMh7oT{+1Ik!B(ir#Xp^X+w*hi!d7MMqLtvV-%
tpRn|W1?Kp`6Z*J-gNc=A*N?miNrOBo$^

diff --git a/egs/librispeech/ASR/distillation_with_hubert.sh b/egs/librispeech/ASR/distillation_with_hubert.sh
 # "True" -> stage 0 and stage 1 would be skipped,
 # and directly download the extracted codebook indexes for distillation
 # "False" -> start from scratch
-use_extracted_codebook=False
+use_extracted_codebook=True

 # teacher_model_id can be one of
 # "hubert_xtralarge_ll60k_finetune_ls960" -> fine-tuned model, it is the one we currently use.
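For context (this note is not part of the patch itself): the one-line change above flips `use_extracted_codebook` to `True`, so the recipe skips stages 0 and 1 and downloads pre-extracted codebook indexes instead of computing them from scratch. A minimal sketch of how such a guard typically behaves; the branch messages here are illustrative, not the exact icefall script logic:

```shell
#!/usr/bin/env bash
# Sketch of the guard toggled by use_extracted_codebook (illustrative,
# not the exact contents of distillation_with_hubert.sh).
use_extracted_codebook=True  # value set by this patch

if [ "$use_extracted_codebook" = "True" ]; then
  # Skip stage 0 and stage 1; fetch the pre-extracted codebook indexes.
  echo "skip stages 0-1: download extracted codebook indexes"
else
  # Run the full pipeline from scratch.
  echo "run stages 0-1 from scratch"
fi
```

With `use_extracted_codebook=True` the first branch is taken, which matches the comment in the patched hunk ("stage 0 and stage 1 would be skipped").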