Custom layers in paddle dynamic graph mode
2022/4/26 23:13:43
This article introduces how to define a custom layer in paddle dynamic graph mode. It should be a useful reference for programmers facing the same problem, so follow along and learn with us!
```python
import math

import paddle


class DNN(paddle.nn.Layer):
    """DNN component: extracts high-order feature interactions."""

    def __init__(self, sparse_feature_number, sparse_feature_dim,
                 dense_feature_dim, num_field, layer_sizes):
        super(DNN, self).__init__()
        self.sparse_feature_number = sparse_feature_number
        self.sparse_feature_dim = sparse_feature_dim
        self.dense_feature_dim = dense_feature_dim
        self.num_field = num_field
        self.layer_sizes = layer_sizes

        # Feed the FM model's latent feature vectors (the flattened
        # embeddings) into the sub-network; initialize each Linear layer's
        # weights from a Normal distribution scaled by its fan-in.
        sizes = [sparse_feature_dim * num_field] + self.layer_sizes + [1]
        acts = ["relu" for _ in range(len(self.layer_sizes))] + [None]
        self._mlp_layers = []
        for i in range(len(layer_sizes) + 1):
            linear = paddle.nn.Linear(
                in_features=sizes[i],
                out_features=sizes[i + 1],
                weight_attr=paddle.ParamAttr(
                    initializer=paddle.nn.initializer.Normal(
                        std=1.0 / math.sqrt(sizes[i]))))
            self.add_sublayer('linear_%d' % i, linear)
            self._mlp_layers.append(linear)
            if acts[i] == 'relu':
                act = paddle.nn.ReLU()
                self.add_sublayer('act_%d' % i, act)
                # The activation must also be appended, or forward() below
                # would skip the ReLUs entirely.
                self._mlp_layers.append(act)

    # Forward pass
    def forward(self, feat_embeddings):
        # Flatten [batch, num_field, sparse_feature_dim] into
        # [batch, num_field * sparse_feature_dim] before the MLP.
        y_dnn = paddle.reshape(
            feat_embeddings,
            [-1, self.num_field * self.sparse_feature_dim])
        for n_layer in self._mlp_layers:
            y_dnn = n_layer(y_dnn)
        return y_dnn
```
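The sizing logic above can be checked without paddle installed. This is a minimal sketch in plain Python; the hyperparameter values (`sparse_feature_dim=9`, `num_field=39`, `layer_sizes=[512, 256, 128]`) are hypothetical examples, not values from the article.

```python
import math

# Hypothetical hyperparameters for illustration only.
sparse_feature_dim = 9          # embedding size of each sparse feature
num_field = 39                  # number of feature fields
layer_sizes = [512, 256, 128]   # hidden layer widths

# Input width = flattened embeddings; output width = 1 (the DNN logit).
sizes = [sparse_feature_dim * num_field] + layer_sizes + [1]
# ReLU after every hidden layer, no activation after the output layer.
acts = ["relu" for _ in range(len(layer_sizes))] + [None]

print(sizes)  # [351, 512, 256, 128, 1]
print(acts)   # ['relu', 'relu', 'relu', None]

# Linear layer i maps sizes[i] -> sizes[i+1]; its weight std is
# 1/sqrt(fan_in), which keeps activation magnitudes roughly constant.
stds = [1.0 / math.sqrt(sizes[i]) for i in range(len(sizes) - 1)]
```

Note that `len(layer_sizes) + 1` Linear layers are created, so the loop in `__init__` covers every hidden layer plus the final 1-unit output layer.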
That concludes this article on custom layers in paddle dynamic graph mode. We hope the articles we recommend are helpful, and we hope you will continue to support 为之网!