[HW3] Using nn.MultiheadAttention and nn.LayerNorm #140

Answered by seominjoon
yuneg11 asked this question in Q&A
You can either use PyTorch's or copy from Annotated Transformer, whichever is easier for you.
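If you go with PyTorch's built-ins, a minimal sketch of a pre-LN self-attention sublayer using `nn.MultiheadAttention` and `nn.LayerNorm` might look like the following (the class name, dimensions, and pre-LN ordering here are illustrative assumptions, not part of the assignment spec):

```python
import torch
import torch.nn as nn

class SelfAttentionBlock(nn.Module):
    """One pre-LN self-attention sublayer: LayerNorm -> MHA -> residual.
    (Illustrative sketch; names and hyperparameters are assumptions.)"""

    def __init__(self, d_model: int, num_heads: int, dropout: float = 0.1):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        # batch_first=True makes inputs (batch, seq_len, d_model)
        self.attn = nn.MultiheadAttention(
            d_model, num_heads, dropout=dropout, batch_first=True
        )

    def forward(self, x: torch.Tensor, key_padding_mask=None) -> torch.Tensor:
        h = self.norm(x)
        # Self-attention: query, key, and value are all the normalized input
        attn_out, _ = self.attn(h, h, h, key_padding_mask=key_padding_mask)
        return x + attn_out  # residual connection

x = torch.randn(2, 5, 64)  # (batch, seq_len, d_model)
block = SelfAttentionBlock(d_model=64, num_heads=8)
out = block(x)
print(out.shape)  # torch.Size([2, 5, 64])
```

The Annotated Transformer instead implements multi-head attention and layer normalization from scratch; either route gives equivalent building blocks.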

Answer selected by yuneg11