
Commit

Bug fix
gabhijith authored Jul 20, 2023
1 parent d877ada commit 933f453
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions part4/4.JetTaggingPointCloud.ipynb
@@ -508,14 +508,14 @@
      "        self.attention = tf.keras.layers.MultiHeadAttention(num_heads=self.num_heads, \n",
      "                                                            key_dim=self.hidden_units//self.num_heads)\n",
      "        self.feedforward = tf.keras.Sequential([\n",
-     "            Dense(units=self.mlp_hidden_units, activation=\"relu\"),\n",
+     "            layers.Dense(units=self.mlp_hidden_units, activation=\"relu\"),\n",
      "            # Dropout(rate=self.dropout_rate),\n",
-     "            Dense(units=input_shape[-1])\n",
+     "            layers.Dense(units=input_shape[-1])\n",
      "        ])\n",
-     "        self.layer_norm1 = LayerNormalization(epsilon=1e-6)\n",
-     "        self.layer_norm2 = LayerNormalization(epsilon=1e-6)\n",
-     "        self.dropout1 = Dropout(rate=self.dropout_rate)\n",
-     "        self.dropout2 = Dropout(rate=self.dropout_rate)\n",
+     "        self.layer_norm1 = layers.LayerNormalization(epsilon=1e-6)\n",
+     "        self.layer_norm2 = layers.LayerNormalization(epsilon=1e-6)\n",
+     "        self.dropout1 = layers.Dropout(rate=self.dropout_rate)\n",
+     "        self.dropout2 = layers.Dropout(rate=self.dropout_rate)\n",
      "        super(SABTransformerBlock, self).build(input_shape)\n",
      "        \n",
      "    def call(self, inputs, mask=None):\n",
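The change replaces bare `Dense`, `Dropout`, and `LayerNormalization` references with their fully qualified `layers.*` names, so the notebook cell no longer depends on those classes being imported individually. A minimal sketch of how the fixed `SABTransformerBlock.build` fits into a complete custom Keras layer is shown below; the `__init__` defaults, the `call` structure, and the shape-check usage are assumptions for illustration, since the commit diff only shows the `build` method.

```python
import tensorflow as tf
from tensorflow.keras import layers


class SABTransformerBlock(layers.Layer):
    """Self-attention transformer block (sketch; only build() is from the diff)."""

    def __init__(self, hidden_units=64, num_heads=4,
                 mlp_hidden_units=128, dropout_rate=0.1, **kwargs):
        super().__init__(**kwargs)
        self.hidden_units = hidden_units
        self.num_heads = num_heads
        self.mlp_hidden_units = mlp_hidden_units
        self.dropout_rate = dropout_rate

    def build(self, input_shape):
        self.attention = tf.keras.layers.MultiHeadAttention(
            num_heads=self.num_heads,
            key_dim=self.hidden_units // self.num_heads)
        self.feedforward = tf.keras.Sequential([
            # Fully qualified layer names, as introduced by the fix.
            layers.Dense(units=self.mlp_hidden_units, activation="relu"),
            layers.Dense(units=input_shape[-1]),
        ])
        self.layer_norm1 = layers.LayerNormalization(epsilon=1e-6)
        self.layer_norm2 = layers.LayerNormalization(epsilon=1e-6)
        self.dropout1 = layers.Dropout(rate=self.dropout_rate)
        self.dropout2 = layers.Dropout(rate=self.dropout_rate)
        super(SABTransformerBlock, self).build(input_shape)

    def call(self, inputs, mask=None):
        # Assumed post-norm residual structure: attention, then feed-forward.
        attn = self.attention(inputs, inputs)
        x = self.layer_norm1(inputs + self.dropout1(attn))
        y = self.feedforward(x)
        return self.layer_norm2(x + self.dropout2(y))


# Shape check on a dummy point cloud: (batch, particles, features).
block = SABTransformerBlock()
out = block(tf.random.normal((2, 16, 8)))
print(out.shape)  # (2, 16, 8)
```

Because the final `Dense` projects back to `input_shape[-1]`, the block preserves the input shape, which lets the residual additions in `call` work without extra projections.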
