Replies: 1 comment
-
Currently, the activation function can only be configured separately for the branch and trunk subnets, e.g. by passing a dict such as `{"branch": "relu", "trunk": "tanh"}` as the activation argument.
If you want to set a specific activation function for each layer, you'll need to modify `activation.get` and `SingleOutputStrategy`.
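Setting a different activation per layer is not supported out of the box, but the idea behind the suggested modification is simple: resolve one activation name per layer instead of one per subnet. Here is a framework-free sketch of that per-layer lookup (toy scalar "layers"; the names `ACTIVATIONS` and `apply_layers` are illustrative, not DeepXDE API):

```python
import math

# Toy activation registry, standing in for DeepXDE's activation lookup.
ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "tanh": math.tanh,
    "identity": lambda x: x,
}

def apply_layers(x, weights, activation_names):
    """Apply toy scalar 'layers', each with its own activation.

    weights: one scalar weight per layer (stand-in for a weight matrix).
    activation_names: one activation name per layer, looked up per layer
    rather than once for the whole subnet.
    """
    for w, name in zip(weights, activation_names):
        x = ACTIVATIONS[name](w * x)
    return x

# relu on the first layer, tanh on the second: tanh(0.5 * relu(2.0 * 1.0))
y = apply_layers(1.0, [2.0, 0.5], ["relu", "tanh"])  # tanh(1.0) ≈ 0.7616
```

A per-layer version of the subnets would carry such a list of names and call the activation lookup inside the layer-construction loop, rather than once before it.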
-
Hi all,
Does anyone know how to change the activation function in each layer of the branch and trunk nets?
This piece of code selects the same activation (relu) for all layers of both the branch and trunk nets, but I'd like to set it individually for each layer.
net = dde.nn.DeepONet(
    [m, 40, p],  # dimensions of the fully connected branch net
    [n, 40, p],  # dimensions of the fully connected trunk net
    "relu",  # activation, shared by all layers of both subnets
    "Glorot normal",  # initialization of parameters
)
Thanks a lot in advance