
Activation functions

Implementing activation functions.

Activation functions are an essential part of any neural network, but they are much simpler than they are often made out to be. Here are implementations of some common ones. Let's define a test matrix to run them on; it deliberately includes large positive and negative values to show saturation behaviour.

test = [100.0 1.0 0.0 -300.0; 100.0 1.0 0.0 300.0]   # 2×4 test matrix with extreme values

Relu

relu(mat) = max.(0, mat)   # elementwise max(0, x)
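A quick sanity check on the test matrix (expected values worked out by hand): only the -300.0 entry gets clipped to zero.

relu(test)   # ⇒ [100.0 1.0 0.0 0.0; 100.0 1.0 0.0 300.0]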

Leaky relu

lrelu(x) = max.(0.01 .* x, x)   # leaky ReLU: small slope 0.01 for negative inputs
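On the test matrix (hand-computed), the -300.0 entry becomes -3.0 instead of 0.0 because of the 0.01 slope.

lrelu(test)   # ⇒ [100.0 1.0 0.0 -3.0; 100.0 1.0 0.0 300.0]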

PRelu

prelu(x, a) = max.(x, x .* a)   # parametric ReLU: learnable slope a for negative inputs
prelu(test, 0.10)

Maxout

maxout(x, a) = max.(x, x .* a)   # two fixed pieces (x and a*x), a degenerate maxout
maxout(test, 0.10)
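Written this way it is really just PRelu again. A full maxout unit takes the elementwise maximum over several learned affine transforms of the input. Below is a minimal sketch of that idea; maxout_layer, W and b are hypothetical names introduced only for illustration, with random weights standing in for learned ones.

# Maxout over k affine pieces: output = max_i(W[i] * x .+ b[i]), elementwise.
function maxout_layer(x, W, b)
    reduce((p, q) -> max.(p, q), [W[i] * x .+ b[i] for i in eachindex(W)])
end

# Two random pieces mapping 4-dimensional columns to 3 outputs.
W = [randn(3, 4) for _ in 1:2]
b = [randn(3) for _ in 1:2]
maxout_layer(test', W, b)   # test' is 4×2, so each column is one sample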

Sigmoid

σ(x) = 1 ./ (1 .+ exp.(-x))   # logistic sigmoid, squashes inputs into (0, 1)
σ(test)
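Saturation shows up clearly on the test matrix: σ(100.0) and σ(300.0) round to exactly 1.0 in Float64, while σ(-300.0) is about 5e-131. A quick check:

extrema(σ(test))   # ⇒ approximately (5.1e-131, 1.0)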

Noisy Relu

using Distributions
# Add independent standard normal noise to every element, then clip at zero.
noisyrelu(x) = max.(0, x .+ rand(Distributions.Normal(), size(x)...))
noisyrelu(test)

Softplus

softplus(x) = log.(exp.(x) .+ 1)   # smooth approximation of ReLU
softplus(test)
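For very large inputs (roughly x > 709 in Float64) exp.(x) overflows to Inf, and this naive form returns Inf. A numerically stable rewrite uses the identity log(1 + exp(x)) = max(x, 0) + log(1 + exp(-|x|)); stable_softplus is just a name introduced here for the sketch.

# Numerically stable softplus: no overflow for large positive inputs.
stable_softplus(x) = max.(x, 0) .+ log1p.(exp.(-abs.(x)))
stable_softplus(test)    # agrees with softplus(test) up to floating point
stable_softplus(710.0)   # ⇒ 710.0, whereas softplus(710.0) overflows to Inf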

Elu

elu(x, a) = ifelse.(x .> 0, x, a .* (exp.(x) .- 1))   # identity above zero, exponential below
elu(test,0.1)
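Expected behaviour on the test matrix, worked out by hand: positive entries pass through unchanged and the -300.0 entry saturates at roughly -a.

elu(test, 0.1)[1, 4] ≈ -0.1   # ⇒ true: large negative inputs saturate near -a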

Swish

swish(x, β) = x ./ (1 .+ exp.(-β .* x))   # x times sigmoid(βx)
swish(test,0.1)
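With β = 1 this is x·σ(x), often called SiLU. A quick consistency check against the σ defined above:

swish(test, 1.0) ≈ test .* σ(test)   # ⇒ true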