a short intro to distillation framework
parent f921a7be0e · commit 80dfdd1cfa
@@ -1,3 +1,20 @@
# A short introduction to the distillation framework.
#
# A typical traditional distillation method is
#   Loss(teacher embedding, student embedding).
#
# Compared to this, the proposed distillation framework contains two main steps:
#   codebook indexes = quantizer.encode(teacher embedding)
#   Loss(codebook indexes, student embedding)
#
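The two steps can be made concrete with a short PyTorch sketch. Everything below is illustrative: the nearest-center product quantizer stands in for the recipe's trained quantizer, and NUM_CODEBOOKS, CODEBOOK_SIZE, and STUDENT_DIM are assumed values, not icefall's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CODEBOOKS = 8    # codebooks per frame (assumption)
CODEBOOK_SIZE = 256  # 256 entries, so each index fits in an 8-bit integer
TEACHER_DIM = 1280   # HuBERT extra-large hidden size
STUDENT_DIM = 512    # hypothetical student hidden size


def encode(teacher_emb: torch.Tensor, centers: torch.Tensor) -> torch.Tensor:
    """Step 1: codebook indexes = quantizer.encode(teacher embedding).

    teacher_emb: (batch, time, TEACHER_DIM)
    centers:     (NUM_CODEBOOKS, CODEBOOK_SIZE, TEACHER_DIM // NUM_CODEBOOKS)
    Returns indexes of shape (batch, time, NUM_CODEBOOKS), values in [0, 255].
    """
    b, t, d = teacher_emb.shape
    # Product quantization: split each frame into one sub-vector per codebook.
    sub = teacher_emb.view(b, t, NUM_CODEBOOKS, -1).permute(2, 0, 1, 3)
    sub = sub.reshape(NUM_CODEBOOKS, b * t, -1)
    # Nearest center per codebook.
    indexes = torch.cdist(sub, centers).argmin(dim=-1)  # (C, b*t)
    return indexes.view(NUM_CODEBOOKS, b, t).permute(1, 2, 0)


class CodebookLoss(nn.Module):
    """Step 2: Loss(codebook indexes, student embedding).

    The student predicts the teacher's discrete indexes, so cross-entropy
    replaces the usual regression loss on float embeddings.
    """

    def __init__(self):
        super().__init__()
        self.head = nn.Linear(STUDENT_DIM, NUM_CODEBOOKS * CODEBOOK_SIZE)

    def forward(self, student_emb: torch.Tensor, indexes: torch.Tensor):
        logits = self.head(student_emb)                       # (b, t, C*K)
        logits = logits.view(*indexes.shape, CODEBOOK_SIZE)   # (b, t, C, K)
        return F.cross_entropy(
            logits.reshape(-1, CODEBOOK_SIZE), indexes.reshape(-1)
        )


# Toy usage with random tensors standing in for real model outputs.
teacher_emb = torch.randn(2, 100, TEACHER_DIM)
centers = torch.randn(NUM_CODEBOOKS, CODEBOOK_SIZE, TEACHER_DIM // NUM_CODEBOOKS)
student_emb = torch.randn(2, 100, STUDENT_DIM)
loss = CodebookLoss()(student_emb, encode(teacher_emb, centers))
```

Because the targets are small integers rather than float vectors, they can be precomputed once from the teacher and stored compactly, which is what the points below rely on.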
# Things worth mentioning:
# 1. The float-type teacher embedding is quantized into a sequence of
#    8-bit integer codebook indexes.
# 2. A middle layer, 36 (1-based) out of 48 total layers, is used to extract
#    teacher embeddings.
# 3. Layer 6 (1-based) out of 6 total layers is used to extract
#    student embeddings.
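Points 2 and 3 only pin down which layer supplies the embeddings. As an illustration of point 2, torchaudio's pretrained HuBERT extra-large bundle (48 layers, 1280-dim) can expose per-layer outputs; the recipe uses its own extraction scripts, so treat this purely as a sketch.

```python
import torch
import torchaudio

# torchaudio's HuBERT extra-large bundle: 48 transformer layers, 1280-dim.
# Note the pretrained checkpoint download is large.
bundle = torchaudio.pipelines.HUBERT_XLARGE
model = bundle.get_model().eval()

wav = torch.randn(1, 16000)  # 1 second of dummy 16 kHz audio
with torch.no_grad():
    # extract_features returns one tensor per transformer layer, truncated
    # at num_layers, so feats[-1] is the output of layer 36 (1-based).
    feats, _ = model.extract_features(wav, num_layers=36)
teacher_emb = feats[-1]  # (1, frames, 1280)

# Per point 1, these float frames are then encoded into 8-bit codebook
# indexes (see the encode() sketch above), a much more compact
# distillation target than the float embeddings themselves.
```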

# This is an example of doing distillation with the LibriSpeech clean-100 subset.
# Run it with:
#   bash distillation_with_hubert.sh [0|1|2|3|4]