Press "Enter" to skip to content

A Summary and Test of MindSpore Activation Functions


Technical Background

 

Activation functions play a very important role in the forward pass of a machine-learning network; we can think of one as a decision function. For example, suppose we want to decide whether some output corresponds to a cat or a dog, the value we obtain is 0.01, and in our encoding 0 stands for cat and 1 for dog. Although 0.01 is neither 0 nor 1, we can expect with very high probability that the picture shows a cat. We can therefore posit an activation function: when the value is below 0.5 the input is classified as a cat, and when it is above 0.5 it is classified as a dog. This is a hand-crafted decision function. This article mainly introduces several activation functions already implemented in MindSpore and how to use them.
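The thresholding idea above can be sketched in a few lines of plain Python (the 0.5 cutoff and the cat/dog encoding are just the hypothetical example from the text, not anything MindSpore-specific):

```python
def decide(score, threshold=0.5):
    """Map a raw network output in [0, 1] to a class label.

    Scores below the threshold are labelled 'cat' (encoded as 0),
    scores at or above it are labelled 'dog' (encoded as 1).
    """
    return 'dog' if score >= threshold else 'cat'

# A raw output of 0.01 is far below 0.5, so we classify it as a cat.
print(decide(0.01))  # cat
print(decide(0.97))  # dog
```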

 

Tanh Activation Function

 

Tanh (the hyperbolic tangent) saturates to the limits \([-1,1]\) at both ends, and its slope is largest in the ambiguous region near the middle. This shape suits scenarios where the two classes to be separated differ strongly in their main features.

The Python code used to plot the function is shown below; plt.gca() is used to adjust how the axes are rendered:

 

import matplotlib.pyplot as plt
import numpy as np
def _tanh(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = (np.exp(2*x)-1)/(np.exp(2*x)+1)
        new_l.append(th)
    return new_l
plt.figure()
plt.title('Tanh')
plt.xlabel('x')
plt.ylabel('y')
plt.ylim(-3,3)
ax = plt.gca()
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.yaxis.set_ticks_position('left')
ax.spines['bottom'].set_position(('data', 0))
ax.spines['left'].set_position(('data', 0))
x = np.arange(-6,6,0.05)
y = _tanh(x)
plt.plot(x,y)
plt.savefig('function.png')

 

The functional form of the Tanh activation function is:

 

\[\tanh(x)=\frac{e^x-e^{-x}}{e^x+e^{-x}}=\frac{e^{2x}-1}{e^{2x}+1} \]
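Before handing the computation to MindSpore, we can sanity-check with plain NumPy that the two algebraic forms in the formula above really agree, and that the output stays inside the \([-1,1]\) limits mentioned earlier:

```python
import numpy as np

x = np.linspace(-6, 6, 101)

# Form 1: (e^x - e^{-x}) / (e^x + e^{-x})
t1 = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
# Form 2: (e^{2x} - 1) / (e^{2x} + 1)
t2 = (np.exp(2 * x) - 1) / (np.exp(2 * x) + 1)

# Both forms match NumPy's built-in tanh to floating-point accuracy.
assert np.allclose(t1, np.tanh(x))
assert np.allclose(t2, np.tanh(x))
# The values never leave the open interval (-1, 1).
assert np.all(np.abs(t1) < 1)
```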

 

From the official documentation we can see that this activation function is supported on all three platforms: Ascend, GPU, and CPU.

Let's test the function with an example executed on the CPU:

 

# activation.py
from mindspore import context
context.set_context(mode=context.GRAPH_MODE, device_target="CPU")
import mindspore as ms
from mindspore import Tensor, ops
import numpy as np
from tabulate import tabulate
def _tanh(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = (np.exp(2*x)-1)/(np.exp(2*x)+1)
        new_l.append(th)
    return new_l
x = np.array([1, 2, 3, 4, 5]).astype(np.float32)
input_x = Tensor(x, ms.float32)
tanh = ops.Tanh()
output = tanh(input_x)
_output = _tanh(x)
# Format output information
header = ['Mindspore Output', 'Equation Output']
table = []
for i in range(len(x)):
    table.append((output[i], _output[i]))
print (tabulate(table, headers=header, tablefmt='fancy_grid'))

 

In this test case, the function prefixed with an underscore is our own element-by-element implementation of the activation function; MindSpore's activation functions all live under the ops namespace. Finally, we use tabulate to prettify the output a little. The result of running it is shown below:

 

[email protected]:~/projects/gitlab/dechin/src/mindspore$ sudo docker run --rm -v /dev/shm:/dev/shm -v /home/dechin/projects/gitlab/dechin/src/mindspore/:/home/ --runtime=nvidia --privileged=true swr.cn-south-1.myhuaweicloud.com/mindspore/mindspore-gpu:1.2.0 /bin/bash -c "cd /home && python -m pip install tabulate && python activation.py"
Looking in indexes: http://mirrors.aliyun.com/pypi/simple/
Collecting tabulate
  Downloading http://mirrors.aliyun.com/pypi/packages/ca/80/7c0cad11bd99985cfe7c09427ee0b4f9bd6b048bd13d4ffb32c6db237dfb/tabulate-0.8.9-py3-none-any.whl
Installing collected packages: tabulate
  WARNING: The script tabulate is installed in '/usr/local/python-3.7.5/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed tabulate-0.8.9
WARNING: You are using pip version 19.2.3, however version 21.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
╒════════════════════╤═══════════════════╕
│ Mindspore Output   │   Equation Output │
╞════════════════════╪═══════════════════╡
│ 0.7615942          │          0.761594 │
├────────────────────┼───────────────────┤
│ 0.9640276          │          0.964028 │
├────────────────────┼───────────────────┤
│ 0.9950548          │          0.995055 │
├────────────────────┼───────────────────┤
│ 0.9993293          │          0.999329 │
├────────────────────┼───────────────────┤
│ 0.9999092          │          0.999909 │
╘════════════════════╧═══════════════════╛

 

The official container image lacks the tabulate library, and since I don't want to keep a pile of container history locally, I always run containers with the --rm option. The laziest workaround is therefore to prepend a pip install command to the container's run command; the library is small, so installation takes very little time. If you need something more maintainable, consider docker's restart command, or install the required Python libraries on top of the base image and commit the result to a new image; see for example this blog post.

 

Softmax Activation Function

 

Softmax is an exponential normalization function, commonly used in classifiers to decide which class a given value belongs to: the larger the corresponding value, the higher the probability assigned to that class. It plays an important role in multi-class classifiers.

The functional form corresponding to Softmax is:

 

\[softmax(x_i)=\frac{e^{x_i}}{\sum_j e^{x_j}} \]
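One practical detail worth knowing (not part of the MindSpore example, but a standard implementation trick): computing the formula naively overflows for large inputs, because np.exp(1000) is infinite. Subtracting max(x) first leaves the result unchanged, since the common factor \(e^{-\max(x)}\) cancels in the ratio:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax: subtract max(x) before exponentiating
    so that np.exp never overflows; the result is mathematically identical."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

p = softmax(np.array([1.0, 2.0, 3.0, 4.0, 5.0]))
print(p.sum())     # probabilities sum to 1
print(p.argmax())  # the largest input gets the highest probability

# The naive formula would produce inf/inf = nan here; this version is fine.
q = softmax(np.array([1000.0, 1001.0]))
assert np.isfinite(q).all()
```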

 

The code that generates the function's plot is shown below:

 

import matplotlib.pyplot as plt
import numpy as np
def _softmax(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = np.exp(x)
        new_l.append(th)
    sum_th = sum(new_l)
    for i in range(len(l)):
        new_l[i] /= sum_th
    return new_l
plt.figure()
plt.title('Softmax')
plt.xlabel('x')
plt.ylabel('y')
ax = plt.gca()
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.yaxis.set_ticks_position('left')
ax.spines['bottom'].set_position(('data', 0))
ax.spines['left'].set_position(('data', 0))
x = np.arange(-6,6,0.05)
y = _softmax(x)
plt.plot(x,y)
plt.savefig('function.png')

 

In this code, apart from swapping in the softmax computation, the y-axis range limit has also been removed. The corresponding Softmax function in MindSpore is invoked as follows, again importing Softmax from ops:

 

# activation.py
from mindspore import context
context.set_context(mode=context.GRAPH_MODE, device_target="CPU")
import mindspore as ms
from mindspore import Tensor, ops
import numpy as np
from tabulate import tabulate
def _softmax(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = np.exp(x)
        new_l.append(th)
    sum_th = sum(new_l)
    for i in range(len(l)):
        new_l[i] /= sum_th
    return new_l
x = np.array([1, 2, 3, 4, 5]).astype(np.float32)
input_x = Tensor(x, ms.float32)
softmax = ops.Softmax()
output = softmax(input_x)
_output = _softmax(x)
# Format output information
header = ['Mindspore Output', 'Equation Output']
table = []
for i in range(len(x)):
    table.append((output[i], _output[i]))
print (tabulate(table, headers=header, tablefmt='fancy_grid'))

 

We again use a docker container to run this MindSpore example and obtain results for comparison:

 

[email protected]:~/projects/gitlab/dechin/src/mindspore$ sudo docker run --rm -v /dev/shm:/dev/shm -v /home/dechin/projects/gitlab/dechin/src/mindspore/:/home/ --runtime=nvidia --privileged=true swr.cn-south-1.myhuaweicloud.com/mindspore/mindspore-gpu:1.2.0 /bin/bash -c "cd /home && python -m pip install tabulate && python activation.py"
╒════════════════════╤═══════════════════╕
│ Mindspore Output   │   Equation Output │
╞════════════════════╪═══════════════════╡
│ 0.011656228        │         0.0116562 │
├────────────────────┼───────────────────┤
│ 0.031684916        │         0.0316849 │
├────────────────────┼───────────────────┤
│ 0.08612853         │         0.0861285 │
├────────────────────┼───────────────────┤
│ 0.23412165         │         0.234122  │
├────────────────────┼───────────────────┤
│ 0.63640857         │         0.636409  │
╘════════════════════╧═══════════════════╛

 

Sigmoid Activation Function

 

Different activation functions suit different scenarios. For instance, Softmax is better suited to multi-class classifiers where exactly one answer is correct, whereas Sigmoid fits scenarios with multiple correct answers, yielding an independent probability for each one. In shape, the Sigmoid activation function somewhat resembles the Tanh activation function from the earlier section.

Its corresponding functional form is:

 

\[sigmoid(x)=\frac{1}{1+e^{-x}} \]
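The resemblance to Tanh noted above is in fact exact: Sigmoid is just a shifted and rescaled Tanh, \(sigmoid(x)=(\tanh(x/2)+1)/2\), which a quick NumPy check confirms:

```python
import numpy as np

def sigmoid(x):
    # Direct implementation of 1 / (1 + e^{-x})
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-6, 6, 101)
# Sigmoid as a shifted, rescaled tanh: identical to float precision.
assert np.allclose(sigmoid(x), (np.tanh(x / 2) + 1) / 2)
# At x = 0 the sigmoid sits exactly at the decision midpoint 0.5.
print(sigmoid(0.0))  # 0.5
```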

 

The Python code that generates the function's plot is shown below:

 

import matplotlib.pyplot as plt
import numpy as np
def _sigmoid(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = 1/(np.exp(-x)+1)
        new_l.append(th)
    return new_l
plt.figure()
plt.title('Sigmoid')
plt.xlabel('x')
plt.ylabel('y')
ax = plt.gca()
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.yaxis.set_ticks_position('left')
ax.spines['bottom'].set_position(('data', 0))
ax.spines['left'].set_position(('data', 0))
x = np.arange(-6,6,0.05)
y = _sigmoid(x)
plt.plot(x,y)
plt.savefig('function.png')

 

The following shows how to run the Sigmoid activation function with MindSpore, compared against our own Sigmoid implementation:

 

# activation.py
from mindspore import context
context.set_context(mode=context.GRAPH_MODE, device_target="CPU")
import mindspore as ms
from mindspore import Tensor, ops
import numpy as np
from tabulate import tabulate
def _sigmoid(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = 1/(np.exp(-x)+1)
        new_l.append(th)
    return new_l
x = np.array([1, 2, 3, 4, 5]).astype(np.float32)
input_x = Tensor(x, ms.float32)
sigmoid = ops.Sigmoid()
output = sigmoid(input_x)
_output = _sigmoid(x)
# Format output information
header = ['Mindspore Output', 'Equation Output']
table = []
for i in range(len(x)):
    table.append((output[i], _output[i]))
print (tabulate(table, headers=header, tablefmt='fancy_grid'))

 

Running it in a Docker container produces the output below:

 

[email protected]:~/projects/gitlab/dechin/src/mindspore$ sudo docker run --rm -v /dev/shm:/dev/shm -v /home/dechin/projects/gitlab/dechin/src/mindspore/:/home/ --runtime=nvidia --privileged=true swr.cn-south-1.myhuaweicloud.com/mindspore/mindspore-gpu:1.2.0 /bin/bash -c "cd /home && python -m pip install tabulate && python activation.py"
╒════════════════════╤═══════════════════╕
│ Mindspore Output   │   Equation Output │
╞════════════════════╪═══════════════════╡
│ 0.7310586          │          0.731059 │
├────────────────────┼───────────────────┤
│ 0.8807971          │          0.880797 │
├────────────────────┼───────────────────┤
│ 0.95257413         │          0.952574 │
├────────────────────┼───────────────────┤
│ 0.9820138          │          0.982014 │
├────────────────────┼───────────────────┤
│ 0.9933072          │          0.993307 │
╘════════════════════╧═══════════════════╛

 

Softplus Activation Function

 

Softplus is an activation function whose name echoes Softmax; in behavior it is a smooth approximation of the ReLU function.

Its corresponding functional form is:

 

\[softplus(x)=log(1+e^x) \]

 

It is easy to see that for large x the growth of Softplus is linear rather than exponential ( \(\lim\limits_{x\to+\infty}(softplus(x)-x)=0\) ), which makes it gentler than an exponential function such as the one in Softmax. The Python code that generates the plot is shown below:
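A practical note not covered in the original text: evaluating log(1 + e^x) literally overflows for large x, since np.exp(1000) is infinite. NumPy's logaddexp computes log(e^a + e^b) stably, and log(1 + e^x) is exactly logaddexp(0, x):

```python
import numpy as np

def softplus_naive(x):
    # Literal formula; overflows once np.exp(x) exceeds float range.
    return np.log(1 + np.exp(x))

def softplus_stable(x):
    # log(1 + e^x) = log(e^0 + e^x) = logaddexp(0, x); safe for any x.
    return np.logaddexp(0, x)

# In the well-behaved range both versions agree.
assert np.allclose(softplus_stable(2.0), softplus_naive(2.0))
# For large x the stable version follows the linear asymptote cleanly.
print(softplus_stable(1000.0))   # 1000.0
print(softplus_stable(-1000.0))  # 0.0
```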

 

import matplotlib.pyplot as plt
import numpy as np
def _softplus(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = np.log(1+np.exp(x))
        new_l.append(th)
    return new_l
plt.figure()
plt.title('Softplus')
plt.xlabel('x')
plt.ylabel('y')
ax = plt.gca()
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.yaxis.set_ticks_position('left')
ax.spines['bottom'].set_position(('data', 0))
ax.spines['left'].set_position(('data', 0))
x = np.arange(-6,6,0.05)
y = _softplus(x)
plt.plot(x,y)
plt.savefig('function.png')

 

Likewise, let's look at how to use the Softplus activation function in MindSpore. One thing to note here, which the official documentation also points out:

this activation function is currently implemented only for the GPU and Ascend back ends, so here we change the device_target field in the context to GPU, unlike the previous activation functions:

 

# activation.py
from mindspore import context
context.set_context(mode=context.GRAPH_MODE, device_target="GPU")
import mindspore as ms
from mindspore import Tensor, ops
import numpy as np
from tabulate import tabulate
def _softplus(l) -> list:
    '''Self defined equation for evaluating.
    '''
    new_l = []
    for x in l:
        th = np.log(1+np.exp(x))
        new_l.append(th)
    return new_l
x = np.array([1, 2, 3, 4, 5]).astype(np.float32)
input_x = Tensor(x, ms.float32)
softplus = ops.Softplus()
output = softplus(input_x)
_output = _softplus(x)
# Format output information
header = ['Mindspore Output', 'Equation Output']
table = []
for i in range(len(x)):
    table.append((output[i], _output[i]))
print (tabulate(table, headers=header, tablefmt='fancy_grid'))

 

The result of running it in a docker container is as follows:

 

[email protected]:~/projects/gitlab/dechin/src/mindspore$ sudo docker run --rm -v /dev/shm:/dev/shm -v /home/dechin/projects/gitlab/dechin/src/mindspore/:/home/ --runtime=nvidia --privileged=true swr.cn-south-1.myhuaweicloud.com/mindspore/mindspore-gpu:1.2.0 /bin/bash -c "cd /home && python -m pip install tabulate && python activation.py"
╒════════════════════╤═══════════════════╕
│ Mindspore Output   │   Equation Output │
╞════════════════════╪═══════════════════╡
│ 1.3132616          │           1.31326 │
├────────────────────┼───────────────────┤
│ 2.126928           │           2.12693 │
├────────────────────┼───────────────────┤
│ 3.0485873          │           3.04859 │
├────────────────────┼───────────────────┤
│ 4.01815            │           4.01815 │
├────────────────────┼───────────────────┤
│ 5.0067153          │           5.00672 │
╘════════════════════╧═══════════════════╛

 

A special note: the GPU build of MindSpore needs to compile for a while before running, so it is not as fast as the CPU unless the data volume is large enough to saturate the GPU's FLOPS.

 

Summary

 

This article introduced four activation functions: Softplus, Sigmoid, Softmax, and Tanh. In machine learning, activation functions chiefly play a decision-making role and are of great value in classifiers. Besides these four, MindSpore also implements the Softsign activation function, which is currently available only on the Ascend platform. Apart from Softplus, which runs only on GPU and Ascend, the others are supported on all platforms (CPU, GPU, Ascend).

 

Copyright Notice

 

This article was first published at: https://www.cnblogs.com/dechinphy/p/activate.html

 

Author ID: DechinPhy

 

More original articles: https://www.cnblogs.com/dechinphy/

 

Donation link: https://www.cnblogs.com/dechinphy/gallery/image/379634.html

 

Tencent Cloud column mirror: https://cloud.tencent.com/developer/column/91958
