Hi,

I would find it useful if the activation functions were generalised to handle complex inputs. Many of them are still well defined in this case; for example, there is no reason for sigmoid to be limited to real values, and it could easily be generalised.
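A minimal sketch of what I have in mind, assuming the usual definition sigmoid(z) = 1 / (1 + exp(-z)) carries over unchanged; the Complex method below is only an illustration, not existing NNlib API:

# Hypothetical complex method, mirroring the real definition (illustration only):
sigmoid(z::Complex) = one(z) / (one(z) + exp(-z))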
softplus is also mathematically well defined for complex inputs, but its current implementation is written for numerical stability on the reals, and someone with more experience than me could comment on how best to rewrite it:
softplus(x::Real) = ifelse(x > 0, x + log1p(exp(-x)), log1p(exp(x)))
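For what it's worth, here is a sketch of how the same rewrite might carry over to complex arguments; this is an assumption on my part rather than a vetted implementation, since the identity z + log1p(exp(-z)) = log1p(exp(z)) only holds up to the branch of the complex logarithm:

# Sketch only: assumes softplus(z) = log(1 + exp(z)). Branching on real(z) keeps
# exp from overflowing, but the imaginary part may differ from log1p(exp(z)) by
# multiples of 2πi when imag(z) is large (branch-cut caveat).
softplus(z::Complex) = ifelse(real(z) > 0, z + log1p(exp(-z)), log1p(exp(z)))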
The main advantage of this is that people like me, working with complex-valued neural networks (often encountered in physics), could depend on NNlib and get the GPU versions of those functions with no effort.
Would you accept a PR (and a complementary PR to CuArrays.jl) for that?
I noticed that in #118 (@devmotion) you were considering using the implementations of some activation functions from StatsFuns.jl.
As that PR is somewhat stale, is it still under consideration, or can I safely open a PR to NNlib?