API - Initializers

To keep TensorLayerX simple, it wraps only some basic initializers. For more complex initializers, the backend API (TensorFlow, MindSpore, PaddlePaddle, or PyTorch) is required.

Initializer

Initializer base class: all initializers inherit from this class.

Zeros

Initializer that generates tensors initialized to 0.

Ones

Initializer that generates tensors initialized to 1.

Constant([value])

Initializer that generates tensors initialized to a constant value.

RandomUniform([minval, maxval, seed])

Initializer that generates tensors with a uniform distribution.

RandomNormal([mean, stddev, seed])

Initializer that generates tensors with a normal distribution.

TruncatedNormal([mean, stddev, seed])

Initializer that generates a truncated normal distribution.

HeNormal([seed])

He normal initializer.

deconv2d_bilinear_upsampling_initializer(shape)

Returns the initializer that can be passed to DeConv2dLayer for initializing the weights in correspondence to channel-wise bilinear up-sampling.

XavierNormal([seed])

This class implements the Xavier weight initializer, from the paper by Xavier Glorot and Yoshua Bengio, using a normal distribution.

XavierUniform([seed])

This class implements the Xavier weight initializer, from the paper by Xavier Glorot and Yoshua Bengio, using a uniform distribution.

Initializer

class tensorlayerx.nn.initializers.Initializer[source]

Initializer base class: all initializers inherit from this class.
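
Examples

A minimal sketch of a custom initializer (assuming subclasses implement the __call__(shape, dtype) interface used by the built-in initializers below; TwoInit is a hypothetical name chosen for illustration):

>>> import tensorlayerx as tlx
>>> class TwoInit(tlx.nn.initializers.Initializer):
...     def __call__(self, shape, dtype=tlx.float32):
...         # Delegate to the documented Constant initializer for illustration.
...         return tlx.nn.initializers.Constant(value=2.0)(shape, dtype)
>>> init = TwoInit()
>>> print(init(shape=(5, 10), dtype=tlx.float32))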

Zeros

class tensorlayerx.nn.initializers.Zeros[source]

Initializer that generates tensors initialized to 0.

Examples

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.zeros()
>>> print(init(shape=(5, 10), dtype=tlx.float32))

Ones

class tensorlayerx.nn.initializers.Ones[source]

Initializer that generates tensors initialized to 1.

Examples

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.ones()
>>> print(init(shape=(5, 10), dtype=tlx.float32))

Constant

class tensorlayerx.nn.initializers.Constant(value=0)[source]

Initializer that generates tensors initialized to a constant value.

Parameters

value (A python scalar or a numpy array.) – The assigned value.

Examples

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.constant(value=10)
>>> print(init(shape=(5, 10), dtype=tlx.float32))

RandomUniform

class tensorlayerx.nn.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)[source]

Initializer that generates tensors with a uniform distribution.

Parameters
  • minval (A python scalar or a scalar tensor.) – Lower bound of the range of random values to generate.

  • maxval (A python scalar or a scalar tensor.) – Upper bound of the range of random values to generate.

  • seed (A Python integer.) – Used to seed the random generator.

Examples

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.random_uniform(minval=-0.05, maxval=0.05)
>>> print(init(shape=(5, 10), dtype=tlx.float32))

RandomNormal

class tensorlayerx.nn.initializers.RandomNormal(mean=0.0, stddev=0.05, seed=None)[source]

Initializer that generates tensors with a normal distribution.

Parameters
  • mean (A python scalar or a scalar tensor.) – Mean of the random values to generate.

  • stddev (A python scalar or a scalar tensor.) – Standard deviation of the random values to generate.

  • seed (A Python integer.) – Used to seed the random generator.

Examples

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.random_normal(mean=0.0, stddev=0.05)
>>> print(init(shape=(5, 10), dtype=tlx.float32))

TruncatedNormal

class tensorlayerx.nn.initializers.TruncatedNormal(mean=0.0, stddev=0.05, seed=None)[source]

Initializer that generates a truncated normal distribution.

These values are similar to values from a RandomNormal except that values more than two standard deviations from the mean are discarded and re-drawn. This is the recommended initializer for neural network weights and filters.

Parameters
  • mean (A python scalar or a scalar tensor.) – Mean of the random values to generate.

  • stddev (A python scalar or a scalar tensor.) – Standard deviation of the random values to generate.

  • seed (A Python integer.) – Used to seed the random generator.

Examples

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.truncated_normal(mean=0.0, stddev=0.05)
>>> print(init(shape=(5, 10), dtype=tlx.float32))

HeNormal

class tensorlayerx.nn.initializers.HeNormal(seed=None)[source]

He normal initializer. Samples are drawn from a normal distribution centred on 0 with stddev = sqrt(2 / fan_in), where fan_in is the number of input units in the weight tensor.

Parameters

seed (A Python integer.) – Used to seed the random generator.

Examples

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.he_normal()
>>> print(init(shape=(5, 10), dtype=tlx.float32))

deconv2d_bilinear_upsampling_initializer

tensorlayerx.nn.initializers.deconv2d_bilinear_upsampling_initializer(shape)[source]

Returns the initializer that can be passed to DeConv2dLayer for initializing the weights in correspondence to channel-wise bilinear up-sampling. Used in segmentation approaches such as FCN (https://arxiv.org/abs/1605.06211).

Parameters

shape (tuple of int) – The shape of the filters, [height, width, output_channels, in_channels]. It must match the shape passed to DeConv2dLayer.

Returns

A constant initializer with weights set to correspond to per-channel bilinear upsampling when passed as W_init in DeConv2dLayer.

Return type

tf.constant_initializer
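
Examples

A usage sketch (the rescale factor and channel count here are illustrative assumptions; the filter shape must match the shape passed to the DeConv2dLayer being initialized):

>>> import tensorlayerx as tlx
>>> rescale_factor = 2
>>> filter_size = 2 * rescale_factor - rescale_factor % 2
>>> num_channels = 3
>>> filter_shape = (filter_size, filter_size, num_channels, num_channels)
>>> init = tlx.nn.initializers.deconv2d_bilinear_upsampling_initializer(shape=filter_shape)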

XavierNormal

class tensorlayerx.nn.initializers.XavierNormal(seed=None)[source]

This class implements the Xavier weight initializer, from the paper by Xavier Glorot and Yoshua Bengio, using a normal distribution. Samples are drawn from a normal distribution centred on 0 with stddev = sqrt(2 / (fan_in + fan_out)).

Parameters

seed (A Python integer.) – Used to seed the random generator.
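
Examples

A usage sketch mirroring the initializers above (assuming a lowercase alias xavier_normal exists alongside XavierNormal, as he_normal does for HeNormal):

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.xavier_normal()
>>> print(init(shape=(5, 10), dtype=tlx.float32))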

XavierUniform

class tensorlayerx.nn.initializers.XavierUniform(seed=None)[source]

This class implements the Xavier weight initializer, from the paper by Xavier Glorot and Yoshua Bengio, using a uniform distribution. Samples are drawn from a uniform distribution within [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out)).

Parameters

seed (A Python integer.) – Used to seed the random generator.
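
Examples

A usage sketch mirroring the initializers above (assuming a lowercase alias xavier_uniform exists alongside XavierUniform, as he_normal does for HeNormal):

>>> import tensorlayerx as tlx
>>> init = tlx.initializers.xavier_uniform()
>>> print(init(shape=(5, 10), dtype=tlx.float32))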