
Support for SELU activation function #181

@BorisVSchmid

Description


Ran into an interesting discussion on Hacker News about an activation function that seems to work well, even with deeper fully connected networks. See the comparison and paper at the first link below, and the discussion threads on Reddit and Hacker News.

https://github.com/shaohua0116/Activation-Visualization-Histogram

https://www.reddit.com/r/MachineLearning/comments/6g5tg1/r_selfnormalizing_neural_networks_improved_elu/
https://news.ycombinator.com/item?id=14527686
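For reference, the SELU activation from the linked paper (Klambauer et al., 2017) can be sketched as follows. This is a minimal NumPy illustration of the formula, not this library's API; the function and constant names are my own.

```python
import numpy as np

# Fixed constants from the self-normalizing neural networks paper
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """SELU(x) = scale * x          for x > 0
                 scale * alpha * (exp(x) - 1)  otherwise."""
    x = np.asarray(x, dtype=float)
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))
```

With these fixed constants, activations of a deep fully connected network (with appropriately initialized weights) are pushed toward zero mean and unit variance, which is the self-normalizing property the paper claims.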
