## 1. Contrastive Loss

The max function takes the larger of 0 and the margin m minus the distance, i.e. max(m − D, 0), so dissimilar pairs that are already farther apart than the margin contribute no loss. The full contrastive loss averages both terms: mean((1 − y) · D² + y · max(m − D, 0)²), where D is the predicted distance and y = 1 marks a dissimilar pair.
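A minimal sketch (toy values, not from the original article) makes the hinge behavior concrete: the term max(m − d, 0) only penalizes dissimilar pairs whose distance d falls inside the margin m.

```python
# Toy demonstration of the hinge term max(m - d, 0):
# distances beyond the margin contribute zero loss.
margin = 1.0
for d in (0.2, 0.8, 1.5):
    hinge = max(margin - d, 0.0)
    print(f"d={d}: squared hinge contribution = {hinge ** 2}")
```

For d = 1.5 the pair is already outside the margin, so its contribution is exactly zero.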

## 2. Reference Code

```python
import tensorflow as tf


def loss(margin=1):
    """Provides 'contrastive_loss' an enclosing scope with variable 'margin'.

    Arguments:
        margin: Integer, defines the baseline distance for which pairs
            should be classified as dissimilar (default is 1).

    Returns:
        'contrastive_loss' function with data ('margin') attached.
    """

    # Contrastive loss = mean( (1 - true_value) * square(prediction) +
    #                          true_value * square(max(margin - prediction, 0)) )
    def contrastive_loss(y_true, y_pred):
        """Calculates the contrastive loss.

        Arguments:
            y_true: List of labels, each label is of type float32.
            y_pred: List of predictions of same length as of y_true,
                each label is of type float32.

        Returns:
            A tensor containing contrastive loss as floating point value.
        """
        square_pred = tf.math.square(y_pred)
        margin_square = tf.math.square(tf.math.maximum(margin - y_pred, 0))
        return tf.math.reduce_mean(
            (1 - y_true) * square_pred + y_true * margin_square
        )

    return contrastive_loss
```
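As a quick sanity check (not part of the original article), the same arithmetic can be mirrored in plain NumPy; `contrastive_loss_np` and the toy labels/distances below are illustrative:

```python
import numpy as np


def contrastive_loss_np(y_true, y_pred, margin=1.0):
    # Mirrors the TF code above: y_true = 1 marks a dissimilar pair.
    square_pred = np.square(y_pred)
    margin_square = np.square(np.maximum(margin - y_pred, 0.0))
    return np.mean((1 - y_true) * square_pred + y_true * margin_square)


y_true = np.array([0.0, 1.0])   # 0 = similar pair, 1 = dissimilar pair
y_pred = np.array([0.3, 0.4])   # predicted distances
print(contrastive_loss_np(y_true, y_pred))
```

The similar pair contributes its squared distance (0.09); the dissimilar pair contributes max(1 − 0.4, 0)² = 0.36; their mean is 0.225.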

```python
# import the necessary packages
import tensorflow.keras.backend as K
import tensorflow as tf


def contrastive_loss(y, preds, margin=1):
    # explicitly cast the true class label data type to the predicted
    # class label data type (otherwise we run the risk of having two
    # separate data types, causing TensorFlow to error out)
    y = tf.cast(y, preds.dtype)

    # calculate the contrastive loss between the true labels and
    # the predicted labels; note that this version flips the label
    # convention of the snippet above (here y = 1 marks a similar pair)
    squaredPreds = K.square(preds)
    squaredMargin = K.square(K.maximum(margin - preds, 0))
    loss = K.mean(y * squaredPreds + (1 - y) * squaredMargin)

    # return the computed contrastive loss to the calling function
    return loss
```
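Because this second snippet swaps the meaning of the label (y = 1 for similar pairs), flipping the labels reproduces the same value as the first version. A NumPy mirror (illustrative names and values, not from the article) makes that visible:

```python
import numpy as np


def loss_v2(y, preds, margin=1.0):
    # Mirrors the Keras-backend code above: y = 1 marks a *similar* pair.
    return np.mean(y * np.square(preds)
                   + (1 - y) * np.square(np.maximum(margin - preds, 0.0)))


y = np.array([1.0, 0.0])        # 1 = similar pair, 0 = dissimilar pair
preds = np.array([0.3, 0.4])    # predicted distances
print(loss_v2(y, preds))
```

With the labels inverted relative to the first snippet's convention, the same distances (0.3 similar, 0.4 dissimilar) again yield a mean loss of 0.225.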