Dual Rectified Linear Units (DReLUs): A replacement for tanh activation functions in Quasi-Recurrent Neural Networks
2018 · 47 citations · Journal Article · Green Open Access · Field-Weighted Citation Impact: 4.22
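
The title names the core idea: inside a Quasi-Recurrent Neural Network cell, replace the bounded tanh activation with a "dual" rectified linear unit that can still produce negative values. As a minimal sketch, assuming the formulation DReLU(a, b) = max(0, a) − max(0, b) that the "dual" in the title suggests (the function name and the NumPy framing below are illustrative, not taken from this page):

```python
import numpy as np

def drelu(a, b):
    """Dual Rectified Linear Unit: the difference of two ReLUs.

    Given two pre-activations a and b, returns max(0, a) - max(0, b).
    Like tanh, the output can be negative; unlike tanh, it does not
    saturate for large inputs.
    """
    return np.maximum(a, 0.0) - np.maximum(b, 0.0)

# Example: element-wise application to two pre-activation vectors.
a = np.array([1.5, -0.3, 0.2])
b = np.array([0.5, 0.1, 0.9])
print(drelu(a, b))  # [ 1.  -0.1 -0.7]
```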