softplusLayer

Softplus layer for actor or critic network

Description

A SoftplusLayer is a deep neural network layer that implements the softplus activation Y = log(1 + e^X), which ensures that the output is always positive. This activation function is a smooth continuous version of reluLayer. You can incorporate this layer into the deep neural networks you define for actors in reinforcement learning agents. This layer is useful for creating continuous Gaussian policy deep neural networks, for which the standard deviation output must be positive.
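For example, you can check in MATLAB that the softplus formula maps any real input to a positive value. This snippet applies the formula directly and is only an illustration of the math, not a call to the layer itself.

X = [-5 -1 0 1 5];
Y = log(1 + exp(X))   % every element of Y is positive, even for negative inputs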

Creation

Description


sLayer = softplusLayer creates a softplus layer with default property values.

sLayer = softplusLayer(Name,Value) sets properties using name-value pairs. For example, softplusLayer('Name','softlayer') creates a softplus layer and assigns the name 'softlayer'.

Properties


Name — Name of layer

Name of layer, specified as a character vector. To include a layer in a layer graph, you must specify a nonempty unique layer name. If you train a series network with this layer and Name is set to '', then the software automatically assigns a name to the layer at training time.

Description — Description of layer

This property is read-only.

Description of layer, specified as a character vector. The description helps you identify the purpose of the layer.
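For example, after you create a layer you can inspect these properties at the command line. The name 'stddev' here is only an illustrative choice.

sLayer = softplusLayer('Name','stddev');
sLayer.Name          % returns 'stddev'
sLayer.Description   % displays the description text for the layer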

Examples


Create a softplus layer.

sLayer = softplusLayer;

You can specify the name of the softplus layer. For example, if the softplus layer represents the standard deviation of a Gaussian policy deep neural network, you can specify an appropriate name.

sLayer = softplusLayer('Name','stddev')
sLayer = 
  SoftplusLayer with properties:

    Name: 'stddev'


You can incorporate sLayer into an actor network for reinforcement learning.
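For example, the following sketch shows one way such a layer could terminate the standard-deviation branch of a Gaussian policy network. The observation size (4), the single standard-deviation output, the layer widths, and the layer names are illustrative assumptions, not values prescribed by softplusLayer.

% Standard-deviation path of a Gaussian policy network (illustrative sizes)
stdPath = [
    imageInputLayer([4 1 1],'Normalization','none','Name','obs')
    fullyConnectedLayer(16,'Name','fc_std')
    reluLayer('Name','relu_std')
    fullyConnectedLayer(1,'Name','fc_out_std')
    softplusLayer('Name','stddev')    % keeps the standard deviation positive
    ];
lgraph = layerGraph(stdPath);

In a complete actor, you would typically connect a branch like this alongside a mean-value branch and pass the resulting layer graph to the actor representation for your agent.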

Introduced in R2020a