Press "Enter" to skip to content

Machine Learning Notes – On Contrastive Loss


1. Contrastive Loss

Although binary cross-entropy (the formula below) is certainly a valid choice of loss function, it is not the only choice (and arguably not even the best one).

$$\mathcal{L}_{\mathrm{BCE}} = -\frac{1}{N}\sum_{i=1}^{N}\left[\,y_i\log(\hat{y}_i) + (1 - y_i)\log(1 - \hat{y}_i)\,\right]$$
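As a quick illustration (an assumption, not part of the original post), binary cross-entropy on a toy batch of pair labels and sigmoid similarity scores can be computed with the built-in Keras loss:

import tensorflow as tf

# Toy data: 1 = same-class pair, 0 = different-class pair; the predictions are
# sigmoid outputs of a hypothetical classification-style siamese head.
y_true = tf.constant([[1.0], [0.0]])
y_pred = tf.constant([[0.9], [0.2]])

bce = tf.keras.losses.BinaryCrossentropy()
print(bce(y_true, y_pred).numpy())  # roughly 0.164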

In practice, however, there is a loss function better suited to siamese networks, called contrastive loss.

$$\mathcal{L}(Y, D_W) = Y \cdot D_W^2 + (1 - Y)\cdot \max(m - D_W,\ 0)^2$$

Here Y is our label: it is 1 if the image pair belongs to the same class and 0 if the pair belongs to different classes.

 

The variable D_W is the Euclidean distance between the outputs of the two branches of the siamese network.
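As a minimal sketch (an assumption, not code from the original post; the helper euclidean_distance is hypothetical), this distance is typically computed from the two embedding vectors like so:

import tensorflow as tf

def euclidean_distance(emb_a, emb_b, eps=1e-7):
    # Sum of squared differences along the feature axis, then the square root;
    # eps keeps the gradient finite when the two embeddings are identical.
    sum_square = tf.reduce_sum(tf.square(emb_a - emb_b), axis=1, keepdims=True)
    return tf.sqrt(tf.maximum(sum_square, eps))

# Toy embeddings for a batch of two pairs (embedding size 3).
emb_a = tf.constant([[0.0, 0.0, 0.0], [1.0, 2.0, 2.0]])
emb_b = tf.constant([[3.0, 4.0, 0.0], [1.0, 2.0, 2.0]])
print(euclidean_distance(emb_a, emb_b).numpy())  # first pair: 5.0, second (identical) pair: ~0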

 

The max function takes the larger of 0 and the margin m minus the distance, so a different-class pair only contributes to the loss while its distance is still smaller than the margin.

 

The goal of a siamese network is not to classify a pair of images but to differentiate between them. In essence, contrastive loss measures how well the siamese network can tell image pairs apart.
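To make that concrete, the small sketch below (an assumption, not code from the original post; the helper contrastive_term is hypothetical) plugs a few distances into the formula above, using the article's convention of Y = 1 for a same-class pair and Y = 0 for a different-class pair:

import tensorflow as tf

def contrastive_term(y, distance, margin=1.0):
    # Same-class pairs (y = 1) are penalised by the squared distance;
    # different-class pairs (y = 0) are penalised only while distance < margin.
    return y * tf.square(distance) + (1.0 - y) * tf.square(tf.maximum(margin - distance, 0.0))

for y, d in [(1.0, 0.1), (1.0, 0.9), (0.0, 0.1), (0.0, 1.5)]:
    print(f"Y={y:.0f}, D_W={d:.1f} -> {contrastive_term(y, tf.constant(d)).numpy():.3f}")
# Y=1, D_W=0.1 -> 0.010  (similar pair, already close: tiny penalty)
# Y=1, D_W=0.9 -> 0.810  (similar pair, far apart: large penalty)
# Y=0, D_W=0.1 -> 0.810  (dissimilar pair, too close: large penalty)
# Y=0, D_W=1.5 -> 0.000  (dissimilar pair, beyond the margin: no penalty)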

 

If you can work through the paper, it is worth a deeper read: http://yann.lecun.com/exdb/publis/pdf/hadsell-chopra-lecun-06.pdf. The formula of the contrastive loss function implemented in the Keras/TensorFlow code below (which is also the exact loss function described in the paper):

$$\text{contrastive loss} = \operatorname{mean}\!\left(\, Y \cdot D_W^2 + (1 - Y)\cdot \max(m - D_W,\ 0)^2 \,\right)$$

The next section shows how the contrastive loss function is implemented with Keras and TensorFlow.

2. Reference Code

Image similarity estimation using a Siamese Network with a contrastive loss – Keras documentation: https://keras.io/examples/vision/siamese_contrastive/

Excerpted function definition 1:

 

import tensorflow as tf

def loss(margin=1):
    """Provides 'contrastive_loss' an enclosing scope with variable 'margin'.
    Arguments:
        margin: Integer, defines the baseline for distance for which pairs
                should be classified as dissimilar (default is 1).
    Returns:
        'contrastive_loss' function with data ('margin') attached.
    """
    # Contrastive loss = mean( (1 - true_value) * square(prediction) +
    #                          true_value * square( max(margin - prediction, 0) ))
    def contrastive_loss(y_true, y_pred):
        """Calculates the contrastive loss.
        Arguments:
            y_true: List of labels, each label is of type float32.
            y_pred: List of predictions of same length as of y_true,
                    each label is of type float32.
        Returns:
            A tensor containing contrastive loss as floating point value.
        """
        square_pred = tf.math.square(y_pred)
        margin_square = tf.math.square(tf.math.maximum(margin - (y_pred), 0))
        return tf.math.reduce_mean(
            (1 - y_true) * square_pred + (y_true) * margin_square
        )

    return contrastive_loss
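A quick usage sketch (an assumption, not part of the Keras example): calling loss(margin=1) fixes the margin and returns the inner contrastive_loss function, which Keras then invokes with (y_true, y_pred) batches; it can also be exercised directly on toy tensors. The siamese model in the commented line is hypothetical.

contrastive = loss(margin=1)                 # build the loss with margin fixed at 1
y_true = tf.constant([[0.0], [1.0]])         # toy pair labels
y_pred = tf.constant([[0.2], [1.3]])         # toy predicted distances
print(contrastive(y_true, y_pred).numpy())   # scalar loss for this toy batch

# Hypothetical compile call (the `siamese` model is not defined in this post):
# siamese.compile(loss=loss(margin=1), optimizer="RMSprop", metrics=["accuracy"])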

 

Excerpted function definition 2:

 

# import the necessary packages
import tensorflow.keras.backend as K
import tensorflow as tf

def contrastive_loss(y, preds, margin=1):
    # explicitly cast the true class label data type to the predicted
    # class label data type (otherwise we run the risk of having two
    # separate data types, causing TensorFlow to error out)
    y = tf.cast(y, preds.dtype)
    # calculate the contrastive loss between the true labels and
    # the predicted labels
    squaredPreds = K.square(preds)
    squaredMargin = K.square(K.maximum(margin - preds, 0))
    loss = K.mean(y * squaredPreds + (1 - y) * squaredMargin)
    # return the computed contrastive loss to the calling function
    return loss
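A similar usage sketch (an assumption, not from the original post): because margin has a default value, this version already matches the (y_true, y_pred) signature Keras expects, so it can be passed to compile as-is or evaluated on toy tensors. The model object in the commented line is hypothetical.

y = tf.constant([[1.0], [0.0]])       # 1 = same class, 0 = different class
preds = tf.constant([[0.2], [0.4]])   # toy Euclidean distances from the siamese network
print(contrastive_loss(y, preds).numpy())  # mean of 0.2**2 and max(1 - 0.4, 0)**2 = 0.2

# Hypothetical compile call (the `model` object is not defined in this post):
# model.compile(loss=contrastive_loss, optimizer="adam")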
