SimCSE loss implementation in TensorFlow 2

2022/3/28 23:31:31

This article walks through a TensorFlow 2 implementation of the SimCSE loss, as a reference for anyone working on contrastive sentence embeddings.

The core of contrastive learning is the loss function, so here is a record of its TensorFlow implementation.

import tensorflow as tf

def unsupervise_loss(y_pred, alpha=0.05):
    # y_pred: (batch_size, dim); rows 2i and 2i+1 are two dropout views of the same sentence
    idxs = tf.range(tf.shape(y_pred)[0])
    # each row's positive is its partner in the pair: labels look like [1, 0, 3, 2, ...]
    y_true = idxs + 1 - idxs % 2 * 2
    y_pred = tf.math.l2_normalize(y_pred, axis=1)
    # cosine similarity matrix of the whole batch
    similarities = tf.matmul(y_pred, y_pred, transpose_b=True)
    # mask the diagonal so a row is never its own positive
    similarities = similarities - tf.eye(tf.shape(y_pred)[0]) * 1e12
    # temperature scaling
    similarities = similarities / alpha
    loss = tf.keras.losses.sparse_categorical_crossentropy(y_true, similarities, from_logits=True)
    return tf.reduce_mean(loss)
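
For example, with a batch of 6 rows where (0, 1), (2, 3), (4, 5) are dropout views of the same sentences, the label construction above points each row at its partner:

idxs = tf.range(6)
print((idxs + 1 - idxs % 2 * 2).numpy())  # [1 0 3 2 5 4]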

def supervise_loss(y_pred, alpha=0.05):
    # y_pred: (batch_size, dim); rows are grouped as (anchor, positive, negative) triples
    # anchor rows sit at positions 0, 3, 6, ...
    row = tf.range(0, tf.shape(y_pred)[0], 3)
    # candidate columns are the positives and negatives: 1, 2, 4, 5, ...
    col = tf.range(tf.shape(y_pred)[0])
    col = tf.squeeze(tf.where(col % 3 != 0), axis=1)
    # within col, each anchor's positive sits at an even position: 0, 2, 4, ...
    y_true = tf.range(0, tf.shape(col)[0], 2)
    y_pred = tf.math.l2_normalize(y_pred, axis=1)
    # cosine similarity matrix of the whole batch
    similarities = tf.matmul(y_pred, y_pred, transpose_b=True)

    # keep only anchor rows and positive/negative columns
    similarities = tf.gather(similarities, row, axis=0)
    similarities = tf.gather(similarities, col, axis=1)

    # temperature scaling
    similarities = similarities / alpha
    loss = tf.keras.losses.sparse_categorical_crossentropy(y_true, similarities, from_logits=True)
    return tf.reduce_mean(loss)
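
Similarly, with 6 rows grouped as two (anchor, positive, negative) triples, the index selection above resolves to:

n = 6
row = tf.range(0, n, 3)                           # anchor rows: [0 3]
col = tf.range(n)
col = tf.squeeze(tf.where(col % 3 != 0), axis=1)  # positive/negative columns: [1 2 4 5]
y_true = tf.range(0, tf.shape(col)[0], 2)         # position of each anchor's positive within col: [0 2]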

Suppose the embedding dimension is 3 and the batch contains 6 vectors:

y_pred = tf.random.uniform((6,3))
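
A minimal sketch of calling both losses on this random batch (the values are meaningless since the embeddings are random; unsupervise_loss reads the 6 rows as 3 dropout pairs, supervise_loss as 2 triples):

print(unsupervise_loss(y_pred).numpy())
print(supervise_loss(y_pred).numpy())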


