[WIP] Draft of new neural net adaptation framework (will also involve script rewrite) #2913
danpovey wants to merge 110 commits into kaldi-asr:master
Conversation
[WIP] differentiable SVD (… derivs not right)
Implement get_train_schedule.py
# EOF
#
# note: $langs is "default"
steps/chaina/get_model_context.sh \
dir=$4
tree=$chaindir/${lang}.tree
trans_mdl=$chaindir/0/${lang}.mdl # contains the transition model and a nnet, but

trans_mdl=$chaindir/0/${lang}.mdl -> trans_mdl=$chaindir/init/${lang}.mdl
if args.dropout_schedule == "":
    args.dropout_schedule = None
dropout_edit_option = common_train_lib.get_dropout_edit_option(

common?
should be libs.nnet3.train.dropout_schedule

That's already imported in common_train_lib. See steps/chain/train.py.
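For context, dropout schedules in this style are piecewise-linear in the fraction of training data processed. The sketch below is a simplified, hypothetical version of how a schedule string such as `0,0.5,0` (or with explicit breakpoints, `0,0.5@0.25,0`) might be interpreted; the function names are not Kaldi's, and the real parsing in steps/libs/nnet3/train/dropout_schedule.py handles more cases.

```python
def parse_dropout_schedule(schedule):
    """Parse a string like '0,0.5@0.25,0' into [(data_fraction, proportion), ...].
    Simplified sketch: values without '@' are spread evenly over [0, 1]."""
    parts = schedule.split(",")
    n = len(parts)
    points = []
    for i, part in enumerate(parts):
        if "@" in part:
            value, frac = part.split("@")
            points.append((float(frac), float(value)))
        else:
            points.append((i / (n - 1), float(part)))
    return sorted(points)

def dropout_at(points, data_fraction):
    """Linearly interpolate the dropout proportion at a given data fraction."""
    for (f0, v0), (f1, v1) in zip(points, points[1:]):
        if f0 <= data_fraction <= f1:
            if f1 == f0:
                return v1
            t = (data_fraction - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
    return points[-1][1]

points = parse_dropout_schedule("0,0.5,0")
print(dropout_at(points, 0.5))   # → 0.5 (peak of the ramp)
print(dropout_at(points, 0.75))  # → 0.25 (halfway back down)
```

The edit-string produced from this proportion is what `get_dropout_edit_option` would pass to the nnet3 binaries at each iteration.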
egs/wsj/s5/steps/chaina/train.sh
# source the 1st line of schedule.txt in the shell; this sets
# lrate and dropout_opt, among other variables.
. <(head -n 1 $dir/schedule.txt)
langs=$(awk '/^langs/ { $1=""; print; }' <$dir/0/info.txt)

$dir/0/info.txt -> $dir/init/info.txt
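The `. <(head -n 1 $dir/schedule.txt)` idiom only works if each line of schedule.txt is a valid shell fragment of `var=value` assignments. A hypothetical sketch of how get_train_schedule.py might emit such lines (only `lrate` and `dropout_opt` are named in the source; everything else here is an assumption):

```python
def write_schedule(path, num_iters, initial_lrate, final_lrate):
    """Write one shell-sourceable line per iteration; sourcing line i in bash
    (e.g. '. <(head -n 1 schedule.txt)' for i=0) sets these variables."""
    with open(path, "w") as f:
        for it in range(num_iters):
            frac = it / max(1, num_iters - 1)
            # exponential interpolation between initial and final learning rates
            lrate = initial_lrate * (final_lrate / initial_lrate) ** frac
            f.write('iter={0} lrate={1:.6f} dropout_opt=""\n'.format(it, lrate))

write_schedule("schedule.txt", 4, 0.001, 0.0001)
with open("schedule.txt") as f:
    print(f.readline().strip())  # → iter=0 lrate=0.001000 dropout_opt=""
```

Because each line is plain assignments, `wc -l <$dir/schedule.txt` doubles as the iteration count, as the train.sh snippet above relies on.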
The include thing, I'll wait for Hossein to look at it; I'll fix the other
ones.
…On Thu, Jan 17, 2019 at 8:50 PM Gaofeng Cheng ***@***.***> wrote:
***@***.**** commented on this pull request.
------------------------------
In egs/wsj/s5/steps/chaina/internal/get_train_schedule.py
<#2913 (comment)>:
> + # regardless of the --num-jobs-initial and --num-jobs-final. This
+ # is because the model averaging does not work reliably for a
+ # freshly initialized model.
+ if iter == 0:
+ current_num_jobs = 1
+
+ lrate = common_train_lib.get_learning_rate(iter, current_num_jobs,
+ num_iters,
+ num_scp_files_processed,
+ num_scp_files_to_process,
+ args.initial_effective_lrate,
+ args.final_effective_lrate)
+
+ if args.dropout_schedule == "":
+ args.dropout_schedule = None
+ dropout_edit_option = common_train_lib.get_dropout_edit_option(
common?
should be libs.nnet3.train.dropout_schedule
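For reference, the effective-learning-rate scheme used in nnet3-style training interpolates exponentially in the amount of data processed and then scales by the current number of jobs, so that model averaging across jobs keeps the per-model effective rate roughly constant. A minimal sketch of that computation (matching the call in the quoted diff only loosely; not Kaldi's exact implementation):

```python
def get_learning_rate(iter, current_num_jobs, num_iters,
                      num_processed, num_to_process,
                      initial_effective_lrate, final_effective_lrate):
    """Exponentially interpolate the effective learning rate in the fraction
    of data processed, then scale by the number of parallel jobs."""
    if iter + 1 >= num_iters:
        effective = final_effective_lrate
    else:
        frac = float(num_processed) / num_to_process
        effective = (initial_effective_lrate *
                     (final_effective_lrate / initial_effective_lrate) ** frac)
    return current_num_jobs * effective

# At iter 0 the diff above forces current_num_jobs=1, so the actual rate
# equals the initial effective rate:
print(get_learning_rate(0, 1, 100, 0, 1000, 1e-3, 1e-5))  # → 0.001
```

This also shows why `current_num_jobs = 1` at iteration 0 matters: averaging a freshly initialized model across jobs is unreliable, so the scaling is deliberately suppressed there.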
------------------------------
In egs/wsj/s5/steps/chaina/train.sh
<#2913 (comment)>:
> + --dropout-schedule="$dropout_schedule" \
+ --num-scp-files=$num_scp_files \
+ --frame-subsampling-factor=$frame_subsampling_factor \
+ --initial-effective-lrate=$initial_effective_lrate \
+ --final-effective-lrate=$final_effective_lrate \
+ --schedule-out=$dir/schedule.txt
+
+
+
+if [ "$use_gpu" != "no" ]; then gpu_cmd_opt="--gpu 1"; else gpu_cmd_opt=""; fi
+
+num_iters=$(wc -l <$dir/schedule.txt)
+# source the 1st line of schedule.txt in the shell; this sets
+# lrate and dropout_opt, among other variables.
+. <(head -n 1 $dir/schedule.txt)
+langs=$(awk '/^langs/ { $1=""; print; }' <$dir/0/info.txt)
$dir/0/info.txt -> $dir/init/info.txt
Set random seed in choose_egs_to_merge.py
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

This issue has been automatically closed by a bot strictly because of inactivity. This does not mean that we think that this issue is not important! If you believe it has been closed hastily, add a comment to the issue and mention @kkm000, and I'll gladly reopen it.

This issue has been automatically marked as stale by a bot solely because it has not had recent activity. Please add any comment (simply 'ping' is enough) to prevent the issue from being closed for 60 more days if you believe it should be kept open.