Implementing the SimCSE loss in TensorFlow 2
2022/3/28 23:31:31
This article walks through a TensorFlow 2 implementation of the SimCSE loss. It should be a useful reference for anyone implementing contrastive sentence embeddings; follow along with the code below.
The core of contrastive learning is the loss function, so here is a record of its TensorFlow implementation: an unsupervised variant (pairs of dropout-augmented views) and a supervised variant (triples of anchor, positive, and hard negative).
import tensorflow as tf

def unsupervise_loss(y_pred, alpha=0.05):
    # Batch layout: consecutive rows are the two dropout views of one sentence.
    idxs = tf.range(y_pred.shape[0])
    # Each even row's positive is the next row; each odd row's is the previous one.
    y_true = idxs + 1 - idxs % 2 * 2
    # TF2 uses `axis`, not the old `dim` argument.
    y_pred = tf.math.l2_normalize(y_pred, axis=1)
    # Pairwise cosine similarities of the normalized embeddings.
    similarities = tf.matmul(y_pred, y_pred, adjoint_b=True)
    # Mask the diagonal so a row cannot select itself as its positive.
    similarities = similarities - tf.eye(tf.shape(y_pred)[0]) * 1e12
    # Temperature scaling.
    similarities = similarities / alpha
    loss = tf.keras.losses.sparse_categorical_crossentropy(
        y_true, similarities, from_logits=True)
    return tf.reduce_mean(loss)

def supervise_loss(y_pred, alpha=0.05):
    # Batch layout: triples of (anchor, positive, hard negative).
    row = tf.range(0, y_pred.shape[0], 3)             # anchor rows
    col = tf.range(y_pred.shape[0])
    col = tf.squeeze(tf.where(col % 3 != 0), axis=1)  # drop the anchor columns
    y_true = tf.range(0, len(col), 2)                 # positives sit at the even slots
    y_pred = tf.math.l2_normalize(y_pred, axis=1)
    similarities = tf.matmul(y_pred, y_pred, adjoint_b=True)
    # Keep only anchor rows and non-anchor columns of the similarity matrix.
    similarities = tf.gather(similarities, row, axis=0)
    similarities = tf.gather(similarities, col, axis=1)
    similarities = similarities / alpha
    loss = tf.keras.losses.sparse_categorical_crossentropy(
        y_true, similarities, from_logits=True)
    return tf.reduce_mean(loss)
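The label construction `idxs + 1 - idxs % 2 * 2` in the unsupervised loss is the only non-obvious step: it swaps each pair of neighbouring indices so that row 0's target is row 1, row 1's target is row 0, and so on. A minimal sketch in plain Python (the helper name `pair_labels` is my own, not from the original code) makes the pattern explicit:

```python
def pair_labels(batch_size):
    # Mirrors y_true = idxs + 1 - idxs % 2 * 2:
    # even index i maps to i + 1, odd index i maps to i - 1.
    return [i + 1 - (i % 2) * 2 for i in range(batch_size)]

print(pair_labels(6))  # [1, 0, 3, 2, 5, 4]
```

So with a batch of 6 rows (3 sentences, each encoded twice with different dropout masks), every view is trained to pick out its sibling view among all other rows.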
Assume the embedding dimension is 3; a batch of 6 vectors can be simulated with:
y_pred = tf.random.uniform((6, 3))
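The supervised loss relies on a similar index trick: with the batch laid out in triples, it gathers the anchor rows and the non-anchor columns, and the positives then land at the even column positions. A small sketch in plain Python (the helper name `supervised_indices` is illustrative, not part of the original code) reproduces that gather logic for the batch of 6 above:

```python
def supervised_indices(batch_size):
    # Batch laid out in triples: (anchor, positive, hard negative).
    rows = list(range(0, batch_size, 3))                 # anchor rows: 0, 3, ...
    cols = [i for i in range(batch_size) if i % 3 != 0]  # positives and negatives
    y_true = list(range(0, len(cols), 2))                # positive at every even slot
    return rows, cols, y_true

print(supervised_indices(6))  # ([0, 3], [1, 2, 4, 5], [0, 2])
```

For batch size 6, anchor 0's label points at `cols[0] = 1` (its positive) and anchor 3's label points at `cols[2] = 4`, so each anchor is scored against every positive and hard negative in the batch.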
That concludes this introduction to implementing the SimCSE loss in TensorFlow 2; I hope it proves helpful.