Hi, may I ask how long training on the Reddit and Holmes datasets is expected to take?
I ran the code on a single GPU, and here is the logging output:
```
Epoch 1/1
2560/2560 [==============================] - 1058s 413ms/step - loss: 10.8904 - decoder_softmax_loss: 3.5243 - concat_1_loss: 0.0277
n_batch: 20, prev 0
spent: 1067 sec
train: 10.8904
```
At this rate, it seems that training a single epoch (~10M instances) will take over a month, which is quite slow.
Is this a normal speed, or am I missing something? Thank you.
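For reference, here is the rough arithmetic behind the "over a month" estimate (a minimal sketch, assuming the 2560/2560 shown in the progress bar counts samples and that throughput stays constant):

```python
# Back-of-the-envelope epoch-time estimate.
# Assumptions: the 2560/2560 shown by the Keras progress bar counts samples,
# one such pass takes ~1067 s (as logged), and throughput stays constant.
samples_per_pass = 2560
seconds_per_pass = 1067
total_samples = 10_000_000  # ~10M training instances

epoch_seconds = total_samples / samples_per_pass * seconds_per_pass
print(f"Estimated time per epoch: {epoch_seconds / 86400:.1f} days")
# -> roughly 48 days, which is where the "over a month" figure comes from
```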