
Weight initialization #204


Draft · wants to merge 2 commits into base: main

Conversation

OneAdder
Collaborator

Weight Initialization

Added functions for Xavier and Kaiming initialization. The rule of thumb here:

  • S-shaped activation (tanh, sigmoid, etc.) => Xavier
  • ReLU-shaped activation (relu, gelu, silu, etc.) => Kaiming

For networks without Layer or Batch Normalization, this simple tweak significantly speeds up convergence.
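For reference, a minimal sketch of the two schemes in the form the diff below uses (subroutines random_he and random_xavier applied elementally to the weights). The He scaling via the module's existing random_normal is assumed here, since the hunk only shows the declaration; the Xavier body mirrors the lines quoted in the review below:

impure elemental subroutine random_he(x, n_prev)
  !! He (Kaiming) initialization: zero-mean normal with variance 2 / n_prev.
  !! The scaling below is an assumption; the diff only shows the declaration.
  real, intent(out) :: x
  integer, intent(in) :: n_prev  ! assumed; declaration not shown in the hunks
  call random_normal(x)
  x = x * sqrt(2. / n_prev)
end subroutine random_he

impure elemental subroutine random_xavier(x, n_prev)
  !! Xavier (Glorot) initialization: uniform on (-1/sqrt(n_prev), 1/sqrt(n_prev)),
  !! matching the hunk reviewed below.
  real, intent(out) :: x
  integer, intent(in) :: n_prev  ! assumed; declaration not shown in the hunks
  real :: upper
  upper = 1. / sqrt(real(n_prev))
  call random_number(x)          ! x in [0, 1)
  x = -upper + x * (2. * upper)  ! map to [-upper, upper)
end subroutine random_xavier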

@milancurcic
Member

Thanks, Michael, this is definitely needed.

About 1.5 years ago I started an Initializers PR (#151) but forgot about it. Basically it follows a similar pattern to how activations and optimizers are done in NF, which allows complete customization if specified, and sane defaults (like the ones you have here) if unspecified.

Do you think it would work well?

@OneAdder
Collaborator Author

OneAdder commented Feb 17, 2025

@milancurcic Yes, I think #151 will work!

Comment on lines +128 to +135
if (&
self % activation_name == 'relu' &
.or. self % activation_name == 'leaky_relu' &
.or. self % activation_name == 'celu' &
) then
call random_he(self % weights, self % input_size)
elseif (self % activation_name == 'sigmoid' .or. self % activation_name == 'tanhf') then
call random_xavier(self % weights, self % input_size)
Collaborator

Should these be the defaults? Or should the user be able to choose a different pseudo-random generator?

Collaborator Author

I like how it's done here: #151
In the DL framework of my dreams, I would have the option to pass the weight-initialization algorithm into a layer's constructor (see the sketch after this list). So:

  • What I want: Initializers stub #151 with Kaiming weights by default, but it requires a lot of refactoring
  • Why I made this draft PR: it is correct from the mathematical standpoint, Xavier for S-shaped activations and He for .*elu ones. It will probably resolve CNN training on MNIST does not converge #145 if added to the Conv layer
  • How Torch does it: Kaiming for everything. Not ideal, but it covers the vast majority of cases
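As a purely illustrative sketch of that pattern: an abstract initializer type with concrete Kaiming/Xavier implementations that a layer constructor could accept, the same way activations and optimizers are passed around. Every name below is a placeholder, not the actual #151 or neural-fortran API:

! Hypothetical sketch only: all names here are placeholders.
module initializer_sketch
  implicit none

  type, abstract :: initializer_type
  contains
    ! Fill a weight array given the layer's fan-in.
    procedure(init_interface), deferred :: init
  end type initializer_type

  abstract interface
    subroutine init_interface(self, w, fan_in)
      import :: initializer_type
      class(initializer_type), intent(in) :: self
      real, intent(out) :: w(:)
      integer, intent(in) :: fan_in
    end subroutine init_interface
  end interface

end module initializer_sketch

A concrete Kaiming or Glorot type could then wrap random_he / random_xavier, and the layer constructors could accept an optional class(initializer_type) argument, falling back to a sane default when it is not given.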

@@ -23,4 +23,22 @@ impure elemental subroutine random_normal(x)
x = sqrt(- 2 * log(u(1))) * cos(2 * pi * u(2))
end subroutine random_normal

impure elemental subroutine random_he(x, n_prev)
!! Kaiming weight initialization
real, intent(in out) :: x
Collaborator

Suggested change
real, intent(in out) :: x
real, intent(out) :: x


impure elemental subroutine random_xavier(x, n_prev)
!! Xavier weight initialization
real, intent(in out) :: x
Collaborator

Suggested change
real, intent(in out) :: x
real, intent(out) :: x

Comment on lines +39 to +40
lower = -(1. / sqrt(real(n_prev)))
upper = 1. / sqrt(real(n_prev))
Collaborator

Suggested change
lower = -(1. / sqrt(real(n_prev)))
upper = 1. / sqrt(real(n_prev))
upper = 1. / sqrt(real(n_prev))
lower = -upper

lower = -(1. / sqrt(real(n_prev)))
upper = 1. / sqrt(real(n_prev))
call random_number(x)
x = lower + x * (upper - lower)
Collaborator

Is this correct if lower == -upper?
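For reference, with lower = -upper the expression x = lower + x * (upper - lower) reduces to x = upper * (2 * x - 1), which maps a uniform sample x in [0, 1) onto [-upper, upper); so the sampling is the same whichever way the bounds are written.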
