
MSRA dataset #24

@Mosquito0352

Description


File "D:\pythonProject\MECT4CNER\model.py", line 122, in forward
char_encoded = self.char_encoder(components_embed, embedding, embedding, seq_len, lex_num=lex_num, pos_s=pos_s,
File "D:\anaconda\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
return forward_call(*input, **kwargs)
File "D:\pythonProject\MECT4CNER\Modules\TransformerEncoder.py", line 47, in forward
output = self.transformer_layer(query, key, value, seq_len, lex_num=lex_num,
File "D:\anaconda\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
return forward_call(*input, **kwargs)
File "D:\pythonProject\MECT4CNER\Modules\TransformerEncoderLayer.py", line 47, in forward
output = self.attn(query, key, value, seq_len, lex_num=lex_num,
File "D:\anaconda\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
return forward_call(*input, **kwargs)
File "D:\pythonProject\MECT4CNER\Modules\AdaptSelfAttention.py", line 87, in forward
attn_score_raw = A_C + B_D + self.randomAttention[:, :, :max_seq_len, :max_seq_len]
RuntimeError: The size of tensor a (364) must match the size of tensor b (310) at non-singleton dimension 3
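For readers hitting the same error: the mismatch at dimension 3 suggests that `self.randomAttention` was preallocated for a maximum sequence length of 310, while this MSRA batch produces a combined character-plus-lexicon length of 364, so the `[:, :, :max_seq_len, :max_seq_len]` slice cannot grow the buffer to match `A_C` and `B_D`. Below is a minimal sketch of the failing broadcast; the shapes and the names `batch`, `heads`, and `prealloc_len` are illustrative assumptions, not the repository's actual configuration:

```python
import torch

# Sketch of the failing add in AdaptSelfAttention.forward (assumed shapes):
# A_C and B_D are [batch, heads, max_seq_len, max_seq_len], while
# randomAttention was preallocated for a shorter maximum length, so slicing
# it with [:, :, :max_seq_len, :max_seq_len] cannot reach 364 columns.
batch, heads = 1, 8
prealloc_len = 310   # length randomAttention was built with (from the error)
max_seq_len = 364    # longest char + lexicon sequence in the failing batch

randomAttention = torch.randn(batch, heads, prealloc_len, prealloc_len)
A_C = torch.randn(batch, heads, max_seq_len, max_seq_len)
B_D = torch.randn(batch, heads, max_seq_len, max_seq_len)

# Slicing cannot enlarge a tensor: the result is still 310 wide at dims 2 and 3.
sliced = randomAttention[:, :, :max_seq_len, :max_seq_len]
print(sliced.shape)  # torch.Size([1, 8, 310, 310])

try:
    attn_score_raw = A_C + B_D + sliced
except RuntimeError as e:
    print(e)  # size of tensor a (364) must match ... (310) at dimension 3
```

If that reading is right, the usual remedies are to raise whatever hyperparameter sizes the preallocated attention buffer so it covers the longest MSRA sentence plus its matched lexicon words, or to truncate overly long sentences before batching; which option the repository's config actually exposes is an assumption worth verifying.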
