Description
Hello, while running the code I found a place where an incorrect parameter causes a traceback.
Stack trace
```
File "C:\Users\xbj0916\Desktop\CREAD\modeling\main.py", line 136, in train
  _ = run_one_epoch('dev', dev_dataloader, trainer, -1, 'teacher_force')
File "C:\Users\xbj0916\Desktop\CREAD\modeling\main.py", line 59, in run_one_epoch
  loss, _, _, _, _, _, _ = model(input_ids=batch['input_ids'], attention_mask=batch['attention_mask'],
File "C:\Program Files\Anaconda\envs\ML\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
  return forward_call(*input, **kwargs)
File "C:\Users\xbj0916\Desktop\CREAD\modeling\model.py", line 326, in forward
  transformer_outputs = self.transformer(input_ids, past=past, attention_mask=attention_mask,
File "C:\Program Files\Anaconda\envs\ML\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
  return forward_call(*input, **kwargs)
TypeError: forward() got an unexpected keyword argument 'past'
Epoch -1 teacher_force:   0%| | 0/1901 [00:00<?, ?it/s]
```
Error point: https://github.com/apple/ml-cread/blob/main/modeling/model.py#L325
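For context, Python raises exactly this error whenever a function is called with a keyword parameter it does not declare. A minimal stand-alone reproduction (hypothetical names, not the CREAD code):

```python
def forward(input_ids=None, attention_mask=None):
    # A toy forward() that, like newer transformers, no longer
    # declares a `past` parameter.
    return input_ids

try:
    forward(input_ids=[1, 2], past=None)  # `past` is not a declared parameter
except TypeError as e:
    print(e)  # forward() got an unexpected keyword argument 'past'
```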
Fix
- Change the call at model.py line 325 to drop the `past` argument:
```python
transformer_outputs = self.transformer(input_ids, attention_mask=attention_mask,
                                       token_type_ids=token_type_ids, position_ids=position_ids,
                                       head_mask=head_mask, inputs_embeds=inputs_embeds, use_cache=use_cache,
                                       output_attentions=True, output_hidden_states=True)
```
- Remove the `past` argument from the `forward()` signature (currently `past=None`):
```python
def forward(self, input_ids=None, past=None, attention_mask=None, token_type_ids=None, position_ids=None,
            head_mask=None, inputs_embeds=None, labels=None, use_cache=None, output_attentions=None,
            output_hidden_states=None, step=None, mention_labels=None, predict_mention=True, predict_lm=True,
            coref_attn=None, batch=None, coref_links=None):
```
I hope this is useful to you, and I wish Apple continued success!