
Commit

clean up conditional order
haileyschoelkopf committed Dec 27, 2022
1 parent 03458c5 commit 27e56e3
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions megatron/model/transformer.py
@@ -666,11 +666,11 @@ def forward(self, x, attention_mask, layer_past=None):

         residual = x
         # applies the correct normalization depending on if the norms are tied
-        if not self.gpt_j_tied:
-            x1, x2 = self.input_layernorm(x), self.post_attention_layernorm(x)
-        else:
+        if self.gpt_j_tied:
             x = self.input_layernorm(x)
             x1, x2 = x, x
+        else:
+            x1, x2 = self.input_layernorm(x), self.post_attention_layernorm(x)

         # attention operator
         attention_output, attention_bias = self.attention(
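
For context, the reordered branch selects the inputs to the parallel attention and MLP paths of a GPT-J-style block: when the norms are tied, a single layernorm feeds both paths; when untied, each path gets its own. Below is a minimal, hypothetical PyTorch sketch of just that selection. It borrows the attribute names from the diff (gpt_j_tied, input_layernorm, post_attention_layernorm), but it is not the actual GPT-NeoX layer; the attention, MLP, and residual wiring are elided.

import torch
import torch.nn as nn

class ParallelBlockNorms(nn.Module):
    """Illustrative sketch of the tied/untied norm selection in the diff."""

    def __init__(self, hidden_size: int, gpt_j_tied: bool):
        super().__init__()
        self.gpt_j_tied = gpt_j_tied
        self.input_layernorm = nn.LayerNorm(hidden_size)
        # When the norms are tied, this second norm goes unused.
        self.post_attention_layernorm = nn.LayerNorm(hidden_size)

    def forward(self, x: torch.Tensor):
        if self.gpt_j_tied:
            # Tied: one normalization feeds both the attention and MLP paths.
            x = self.input_layernorm(x)
            x1, x2 = x, x
        else:
            # Untied: each path gets its own normalization of the same input.
            x1, x2 = self.input_layernorm(x), self.post_attention_layernorm(x)
        return x1, x2

x = torch.randn(2, 4, 8)
tied = ParallelBlockNorms(8, gpt_j_tied=True)
x1, x2 = tied(x)
assert x1 is x2  # tied: both paths receive the same normalized tensor

Note that the commit only swaps which branch comes first (testing the positive condition before the else); the two versions are behaviorally identical, which matches the commit message and the symmetric 3 additions / 3 deletions.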
