
Iam #2658

Merged: 38 commits, Sep 12, 2018

a3a18e2
adding changes for language modelling
aarora8 Aug 30, 2018
91508b5
adding modifications for augmentation, topology, shearing, run.sh
aarora8 Aug 31, 2018
5f273d6
fixing bugs
aarora8 Aug 31, 2018
2645f14
fixing bug
aarora8 Aug 31, 2018
6ebfdb2
adding parameter tuning
aarora8 Sep 1, 2018
b532978
cosmetic fixes and updating results
aarora8 Sep 1, 2018
f383334
cosmetic fixes
aarora8 Sep 1, 2018
44c9e58
adding results
aarora8 Sep 1, 2018
2d11672
removing local/prepare_lang and adding gen_topo in run.sh
aarora8 Sep 1, 2018
4fc6705
fixing bugs
aarora8 Sep 1, 2018
8877530
updating result
aarora8 Sep 2, 2018
59e2c8b
updating documentation, results and parameter tuning
aarora8 Sep 2, 2018
5fc0d17
fixing chain scripts
aarora8 Sep 2, 2018
1138ee3
updating parameters
aarora8 Sep 2, 2018
b3532ce
updating parameters and results
aarora8 Sep 3, 2018
9b67d9d
adding overwrite option and punctuation topology
aarora8 Sep 3, 2018
89c9ec7
adding overwrite option
aarora8 Sep 4, 2018
c05cd4d
adding aachen splits
aarora8 Sep 4, 2018
5dfe8fc
fixing bugs
aarora8 Sep 4, 2018
d7448df
modification from review
aarora8 Sep 5, 2018
d7d5c22
updating parameter and result
aarora8 Sep 6, 2018
43e9af9
updating parameter and result
aarora8 Sep 6, 2018
17c506b
adding data preprocessing in test and val
aarora8 Sep 7, 2018
d640742
updating results
aarora8 Sep 7, 2018
7dfd0b5
Merge branch 'master' of https://github.com/kaldi-asr/kaldi into iam_4
aarora8 Sep 7, 2018
94a80ad
replacing prepend words with common prepend words
aarora8 Sep 7, 2018
711c3c9
updating remove_test_utterances_from_lob for aachen split
aarora8 Sep 7, 2018
5f2d960
removing data/val/text from train_lm
aarora8 Sep 7, 2018
7f2ad0b
cosmetic fixes in unk arc decoding
aarora8 Sep 7, 2018
8f2ac25
adding val data for decoding
aarora8 Sep 7, 2018
b8e71b2
modification from the review
aarora8 Sep 10, 2018
e9a75f6
modification from review
aarora8 Sep 10, 2018
ae674ed
modification from review
aarora8 Sep 10, 2018
7651f37
modification for downloading aachen splits
aarora8 Sep 10, 2018
417d97c
fixing bug in rescoring
aarora8 Sep 11, 2018
6a86531
hardcoding for removing only remaining long utterance
aarora8 Sep 12, 2018
ba07ff0
fix in hardcoding
aarora8 Sep 12, 2018
5398412
modification from review
aarora8 Sep 12, 2018
30 changes: 30 additions & 0 deletions egs/iam/v2/local/chain/compare_wer.sh
@@ -50,6 +50,36 @@ for x in $*; do
done
echo

echo -n "# WER val "
for x in $*; do
wer=$(cat $x/decode_val/scoring_kaldi/best_wer | awk '{print $2}')
printf "% 10s" $wer
done
echo

echo -n "# WER (rescored) val "
for x in $*; do
wer="--"
[ -d $x/decode_val_rescored ] && wer=$(cat $x/decode_val_rescored/scoring_kaldi/best_wer | awk '{print $2}')
printf "% 10s" $wer
done
echo

echo -n "# CER val "
for x in $*; do
cer=$(cat $x/decode_val/scoring_kaldi/best_cer | awk '{print $2}')
printf "% 10s" $cer
done
echo

echo -n "# CER (rescored) val "
for x in $*; do
cer="--"
[ -d $x/decode_val_rescored ] && cer=$(cat $x/decode_val_rescored/scoring_kaldi/best_cer | awk '{print $2}')
printf "% 10s" $cer
done
echo

if $used_epochs; then
exit 0; # the diagnostics aren't comparable between regular and discriminatively trained systems.
fi
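The loops above pull the error rate out of Kaldi's `best_wer`/`best_cer` files, whose first line looks like `%WER 13.64 [ ... ]`; the `awk '{print $2}'` grabs the second field. A standalone sketch of that extraction (the sample line and its numbers are illustrative, not from this run):

```shell
# A best_wer file's first line looks like (illustrative values):
#   %WER 13.64 [ 1234 / 9049, 120 ins, 340 del, 774 sub ]
# The WER percentage is the second whitespace-separated field, which is
# exactly what the awk '{print $2}' in compare_wer.sh extracts.
line='%WER 13.64 [ 1234 / 9049, 120 ins, 340 del, 774 sub ]'
wer=$(printf '%s\n' "$line" | awk '{print $2}')
echo "$wer"   # → 13.64
```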
19 changes: 19 additions & 0 deletions egs/iam/v2/local/chain/tuning/run_cnn_e2eali_1a.sh
@@ -22,6 +22,7 @@ stage=0

nj=30
train_set=train
decode_val=true
nnet3_affix= # affix for exp dirs, e.g. it was _cleaned in tedlium.
affix=_1a #affix for TDNN+LSTM directory e.g. "1a" or "1b", in case we change the configuration.
e2echain_model_dir=exp/chain/e2e_cnn_1a
@@ -243,3 +244,21 @@ if [ $stage -le 7 ]; then
--nj $nj --cmd "$cmd" \
$dir/graph data/test $dir/decode_test || exit 1;
fi

if [ $stage -le 8 ] && $decode_val; then
Review comment (Contributor):

If it's not a hassle, please decode both val and test in the same stage using a for loop, e.g. for testset in test $maybe_val; do ... done where maybe_val can be set to "val" or empty. See swbd/s5c/local/chain/run_tdnn.sh.

frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--extra-left-context $chunk_left_context \
--extra-right-context $chunk_right_context \
--extra-left-context-initial 0 \
--extra-right-context-final 0 \
--frames-per-chunk $frames_per_chunk \
--nj $nj --cmd "$cmd" \
$dir/graph data/val $dir/decode_val || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/val $dir/decode_val{,_rescored} || exit 1
fi

echo "Done. Date: $(date). Results:"
local/chain/compare_wer.sh $dir
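The looped layout the reviewer suggests above could look roughly like the sketch below (a hypothetical illustration, not the PR's code: `echo` stands in for the decode and rescore calls, and `maybe_val` is the name from the comment):

```shell
# Decode test and, optionally, val in one stage. maybe_val expands to "val"
# or to nothing, so the for loop runs over one or two test sets.
stage=8
decode_val=true
maybe_val=
$decode_val && maybe_val=val

if [ $stage -le 8 ]; then
  for testset in test $maybe_val; do
    # Here the real script would run steps/nnet3/decode.sh and
    # steps/lmrescore_const_arpa.sh on data/$testset.
    echo "would decode data/$testset into decode_$testset"
  done
fi
```

This collapses the near-identical test and val stages into one block, which is the duplication the comment is pointing at.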
19 changes: 19 additions & 0 deletions egs/iam/v2/local/chain/tuning/run_cnn_e2eali_1b.sh
@@ -23,6 +23,7 @@ stage=0

nj=30
train_set=train
decode_val=true
nnet3_affix= # affix for exp dirs, e.g. it was _cleaned in tedlium.
affix=_1b #affix for TDNN+LSTM directory e.g. "1a" or "1b", in case we change the configuration.
e2echain_model_dir=exp/chain/e2e_cnn_1a
@@ -249,3 +250,21 @@ if [ $stage -le 7 ]; then
steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/test $dir/decode_test{,_rescored} || exit 1
fi

if [ $stage -le 8 ] && $decode_val; then
frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--extra-left-context $chunk_left_context \
--extra-right-context $chunk_right_context \
--extra-left-context-initial 0 \
--extra-right-context-final 0 \
--frames-per-chunk $frames_per_chunk \
--nj $nj --cmd "$cmd" \
$dir/graph data/val $dir/decode_val || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/val $dir/decode_val{,_rescored} || exit 1
fi

echo "Done. Date: $(date). Results:"
local/chain/compare_wer.sh $dir
19 changes: 19 additions & 0 deletions egs/iam/v2/local/chain/tuning/run_cnn_e2eali_1c.sh
@@ -25,6 +25,7 @@ stage=0

nj=30
train_set=train
decode_val=true
nnet3_affix= # affix for exp dirs, e.g. it was _cleaned in tedlium.
affix=_1c #affix for TDNN+LSTM directory e.g. "1a" or "1b", in case we change the configuration.
e2echain_model_dir=exp/chain/e2e_cnn_1a
@@ -251,3 +252,21 @@ if [ $stage -le 7 ]; then
steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/test $dir/decode_test{,_rescored} || exit 1
fi

if [ $stage -le 8 ] && $decode_val; then
frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--extra-left-context $chunk_left_context \
--extra-right-context $chunk_right_context \
--extra-left-context-initial 0 \
--extra-right-context-final 0 \
--frames-per-chunk $frames_per_chunk \
--nj $nj --cmd "$cmd" \
$dir/graph data/val $dir/decode_val || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/val $dir/decode_val{,_rescored} || exit 1
fi

echo "Done. Date: $(date). Results:"
local/chain/compare_wer.sh $dir
21 changes: 18 additions & 3 deletions egs/iam/v2/local/chain/tuning/run_cnn_e2eali_1d.sh
@@ -21,7 +21,7 @@ stage=0

nj=30
train_set=train
test_dir=data/test
decode_val=true
nnet3_affix= # affix for exp dirs, e.g. it was _cleaned in tedlium.
affix=_1d #affix for TDNN+LSTM directory e.g. "1a" or "1b", in case we change the configuration.
e2echain_model_dir=exp/chain/e2e_cnn_1b
@@ -244,10 +244,25 @@
--extra-right-context-final 0 \
--frames-per-chunk $frames_per_chunk \
--nj $nj --cmd "$cmd" \
$dir/graph $test_dir $dir/decode_test || exit 1;
$dir/graph data/test $dir/decode_test || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
$test_dir $dir/decode_test{,_rescored} || exit 1
data/test $dir/decode_test{,_rescored} || exit 1
fi

if [ $stage -le 8 ] && $decode_val; then
frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--extra-left-context $chunk_left_context \
--extra-right-context $chunk_right_context \
--extra-left-context-initial 0 \
--extra-right-context-final 0 \
--frames-per-chunk $frames_per_chunk \
--nj $nj --cmd "$cmd" \
$dir/graph data/val $dir/decode_val || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/val $dir/decode_val{,_rescored} || exit 1
fi

echo "Done. Date: $(date). Results:"
14 changes: 13 additions & 1 deletion egs/iam/v2/local/chain/tuning/run_e2e_cnn_1a.sh
@@ -25,6 +25,7 @@ stage=0
train_stage=-10
get_egs_stage=-10
affix=1a
nj=30

# training options
tdnn_dim=450
@@ -37,6 +38,7 @@ l2_regularize=0.00005
frames_per_iter=1000000
cmvn_opts="--norm-means=true --norm-vars=true"
train_set=train
decode_val=true
lang_decode=data/lang
lang_rescore=data/lang_rescore_6g

@@ -163,12 +165,22 @@ fi
if [ $stage -le 5 ]; then
frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--nj 30 --cmd "$cmd" \
--nj $nj --cmd "$cmd" \
$dir/graph data/test $dir/decode_test || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/test $dir/decode_test{,_rescored} || exit 1
fi

if [ $stage -le 6 ] && $decode_val; then
frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--nj $nj --cmd "$cmd" \
$dir/graph data/val $dir/decode_val || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/val $dir/decode_val{,_rescored} || exit 1
fi

echo "Done. Date: $(date). Results:"
local/chain/compare_wer.sh $dir
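The `[ $stage -le 6 ] && $decode_val` guards in these scripts rely on `$decode_val` expanding, unquoted, to the shell builtin `true` or `false`. A small standalone illustration of the idiom (echo stands in for the decode):

```shell
# decode_val holds the string "true" or "false"; expanded unquoted after &&,
# it runs the corresponding builtin, so the stage is entered or skipped.
stage=0
decode_val=false
if [ $stage -le 6 ] && $decode_val; then
  echo "decoding val"
else
  echo "skipping val decode"
fi
# → skipping val decode
```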
14 changes: 13 additions & 1 deletion egs/iam/v2/local/chain/tuning/run_e2e_cnn_1b.sh
@@ -23,12 +23,14 @@ stage=0
train_stage=-10
get_egs_stage=-10
affix=1b
nj=30

# training options
tdnn_dim=450
minibatch_size=150=100,64/300=50,32/600=25,16/1200=16,8
common_egs_dir=
train_set=train
decode_val=true
lang_decode=data/lang
lang_rescore=data/lang_rescore_6g

@@ -149,12 +151,22 @@ fi
if [ $stage -le 5 ]; then
frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--nj 30 --cmd "$cmd" \
--nj $nj --cmd "$cmd" \
$dir/graph data/test $dir/decode_test || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/test $dir/decode_test{,_rescored} || exit 1
fi

if [ $stage -le 6 ] && $decode_val; then
frames_per_chunk=$(echo $chunk_width | cut -d, -f1)
steps/nnet3/decode.sh --acwt 1.0 --post-decode-acwt 10.0 \
--nj $nj --cmd "$cmd" \
$dir/graph data/val $dir/decode_val || exit 1;

steps/lmrescore_const_arpa.sh --cmd "$cmd" $lang_decode $lang_rescore \
data/val $dir/decode_val{,_rescored} || exit 1
fi

echo "Done. Date: $(date). Results:"
local/chain/compare_wer.sh $dir