>>> david.levinthal1
[February 22, 2018, 10:18pm]
There have been questions about evaluating performance and monitoring progress during training. Here is a small patch to the training script that prints periodic progress.
At ~line 1600 (depending on version), add 2 lines:

```python
# Get the first job
job = COORD.get_job()
batch_time_start = datetime.datetime.now()  # <---- added line
summed_batch_loss = 0.0                     # <---- added line
```
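These lines call `datetime.datetime.now()`, which needs the module-level import; if your version of DeepSpeech.py does not already import it near the top of the file, add:

```python
import datetime  # required for the datetime.datetime.now() calls in this patch
```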
At ~line 1640, add the lines shown below the call to session.run:

```python
# Compute the batch
_, current_step, batch_loss, batch_report = session.run(
    [train_op, global_step, loss, report_params], **extra_params)
summed_batch_loss += batch_loss
mod_current_step_100 = current_step % 100
if mod_current_step_100 == 0:
    batch_time_stop = datetime.datetime.now() - batch_time_start
    delta_seconds = batch_time_stop.total_seconds()
    average_batch_loss = summed_batch_loss / 100.
    print('current_step = %d time_delta = %f avg batch_loss = %f' %
          (current_step, delta_seconds, average_batch_loss))
    summed_batch_loss = 0.0
    batch_time_start = datetime.datetime.now()
# DL loss debug
# print(' current_step = %d, batch_loss = %f ' % (current_step, batch_loss))
```
The commented-out line was part of some debugging I had to do. As far as I can tell, using a comma-separated list of input training files will occasionally result in the batch loss returning an inf value.
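For anyone who wants to try the same reporting pattern outside of DeepSpeech.py, here is a minimal, self-contained sketch. It adds a `math.isfinite()` guard so an occasional inf batch loss does not poison the running average; `train_step()`, the reporting interval, and the simulated inf are all hypothetical stand-ins for illustration, not part of the original patch.

```python
import datetime
import math
import random

REPORT_EVERY = 100  # hypothetical reporting interval, matching the patch above

def train_step(step):
    """Stand-in for session.run(...): returns a fake batch loss."""
    loss = random.uniform(0.5, 2.0)
    # Simulate the occasional inf loss observed with comma-separated input files.
    if step % 237 == 0:
        loss = float('inf')
    return loss

batch_time_start = datetime.datetime.now()
summed_batch_loss = 0.0
counted_batches = 0

for current_step in range(1, 1001):
    batch_loss = train_step(current_step)

    # Skip non-finite losses so one bad batch does not poison the average.
    if math.isfinite(batch_loss):
        summed_batch_loss += batch_loss
        counted_batches += 1
    else:
        print('current_step = %d skipping non-finite batch_loss' % current_step)

    if current_step % REPORT_EVERY == 0:
        delta_seconds = (datetime.datetime.now() - batch_time_start).total_seconds()
        average_batch_loss = summed_batch_loss / max(counted_batches, 1)
        print('current_step = %d time_delta = %f avg batch_loss = %f' %
              (current_step, delta_seconds, average_batch_loss))
        summed_batch_loss = 0.0
        counted_batches = 0
        batch_time_start = datetime.datetime.now()
```

Dividing by the number of batches that actually contributed (`counted_batches`) instead of a fixed 100 keeps the average correct when non-finite losses are skipped.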
[This is an archived DeepSpeech discussion thread from discourse.mozilla.org/t/printing-periodic-progress-during-training]