Custom Layers in Paddle Dynamic Graph Mode
2022/4/26 23:13:43
This article shows how to define a custom layer with PaddlePaddle's dynamic graph API. It should be a useful reference for anyone tackling this kind of problem, so let's work through it together.
    import math

    import paddle


    class DNN(paddle.nn.Layer):
        """DNN component: extracts high-order feature interactions."""

        def __init__(self, sparse_feature_number, sparse_feature_dim,
                     dense_feature_dim, num_field, layer_sizes):
            super(DNN, self).__init__()
            self.sparse_feature_number = sparse_feature_number
            self.sparse_feature_dim = sparse_feature_dim
            self.dense_feature_dim = dense_feature_dim
            self.num_field = num_field
            self.layer_sizes = layer_sizes

            # The FM model's latent feature vectors serve as the input; the
            # flattened embeddings feed an MLP whose hidden widths are given
            # by layer_sizes, ending in a single output unit.
            sizes = [sparse_feature_dim * num_field] + self.layer_sizes + [1]
            acts = ["relu" for _ in range(len(self.layer_sizes))] + [None]
            self._mlp_layers = []
            for i in range(len(layer_sizes) + 1):
                linear = paddle.nn.Linear(
                    in_features=sizes[i],
                    out_features=sizes[i + 1],
                    weight_attr=paddle.ParamAttr(
                        initializer=paddle.nn.initializer.Normal(
                            std=1.0 / math.sqrt(sizes[i]))))
                self.add_sublayer('linear_%d' % i, linear)
                self._mlp_layers.append(linear)
                if acts[i] == 'relu':
                    act = paddle.nn.ReLU()
                    self.add_sublayer('act_%d' % i, act)
                    # Also append the activation to the layer list; otherwise
                    # forward() would apply only the Linear layers.
                    self._mlp_layers.append(act)

        # Forward pass
        def forward(self, feat_embeddings):
            # Flatten [batch, num_field, embedding_dim] into one row per sample.
            y_dnn = paddle.reshape(
                feat_embeddings,
                [-1, self.num_field * self.sparse_feature_dim])
            for n_layer in self._mlp_layers:
                y_dnn = n_layer(y_dnn)
            return y_dnn
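To see how the layer dimensions line up, here is a minimal pure-Python sketch of the `sizes`/`acts` bookkeeping from `__init__`. The concrete values (`sparse_feature_dim=9`, `num_field=39`, `layer_sizes=[512, 256, 128]`) are hypothetical example inputs, not values from the article:

    def build_mlp_shapes(sparse_feature_dim, num_field, layer_sizes):
        # Input width is the flattened embedding: one vector per field.
        sizes = [sparse_feature_dim * num_field] + layer_sizes + [1]
        # ReLU after every hidden layer; the final 1-unit output stays linear.
        acts = ["relu"] * len(layer_sizes) + [None]
        # Pair consecutive widths into (in_features, out_features) per Linear.
        return list(zip(sizes[:-1], sizes[1:])), acts

    shapes, acts = build_mlp_shapes(sparse_feature_dim=9, num_field=39,
                                    layer_sizes=[512, 256, 128])
    print(shapes)  # [(351, 512), (512, 256), (256, 128), (128, 1)]
    print(acts)    # ['relu', 'relu', 'relu', None]

Note the input width, 9 * 39 = 351, matches the reshape in `forward`, which flattens `[batch, num_field, embedding_dim]` into `[batch, num_field * embedding_dim]` before the MLP.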
That concludes this introduction to custom layers in Paddle dynamic graph mode. We hope the article is helpful, and thank you for your continued support.