This repository was archived by the owner on Apr 11, 2021. It is now read-only.

The activation function needs to be changed to Htanh() instead of tanh(). #3

@sudo-sh

Description
p = tf.tanh(wx)

needs to be replaced by

p = tf.clip_by_value(wx, -1, 1)

so that it becomes the hard tanh function, as described in the paper, and is consistent with the straight-through estimator.

I. Hubara et al., Binarized Neural Networks: Training Neural Networks with Weights and Activations Constrained to +1 or -1.
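To illustrate the difference, here is a minimal NumPy sketch of hard tanh and its straight-through gradient; the function names (`hard_tanh`, `hard_tanh_grad`) are mine for illustration, not from the repository. `np.clip(x, -1, 1)` plays the same role as `tf.clip_by_value(wx, -1, 1)` above.

```python
import numpy as np

def hard_tanh(x):
    # Htanh(x) = clip(x, -1, 1): piecewise-linear saturation,
    # the activation used in the BNN paper instead of smooth tanh.
    return np.clip(x, -1.0, 1.0)

def hard_tanh_grad(x):
    # Straight-through estimator: the gradient passes through
    # unchanged inside [-1, 1] and is zeroed where the input saturates.
    return (np.abs(x) <= 1.0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(hard_tanh(x))       # saturates to -1 and +1 at the ends
print(hard_tanh_grad(x))  # 0 in the saturated regions, 1 elsewhere
```

Unlike `tf.tanh`, whose gradient is nonzero everywhere, this hard version gives exactly the clipped identity gradient that the straight-through estimator in the paper assumes.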

Thanks
-Sudarshan

PS:- Great work done!

Metadata

Assignees

No one assigned

    Labels

    No labels

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests