A PyTorch Implementation of the VGG-19 Architecture
2021/10/23 23:16:42
This article walks through a PyTorch implementation of the VGG-19 architecture. It should be a useful reference for anyone tackling the same problem, so let's work through it together!
import torch
import torch.nn as nn
from torchinfo import summary


class VGG19(nn.Module):
    def __init__(self):
        super().__init__()
        # Block 1: two 3x3 convs, 64 channels
        self.conv1 = nn.Conv2d(3, 64, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(64, 64, kernel_size=3, padding=1)
        self.pool1 = nn.MaxPool2d(kernel_size=2, stride=2)
        # Block 2: two 3x3 convs, 128 channels
        self.conv3 = nn.Conv2d(64, 128, kernel_size=3, padding=1)
        self.conv4 = nn.Conv2d(128, 128, kernel_size=3, padding=1)
        self.pool2 = nn.MaxPool2d(kernel_size=2, stride=2)
        # Block 3: four 3x3 convs, 256 channels
        self.conv5 = nn.Conv2d(128, 256, kernel_size=3, padding=1)
        self.conv6 = nn.Conv2d(256, 256, kernel_size=3, padding=1)
        self.conv7 = nn.Conv2d(256, 256, kernel_size=3, padding=1)
        self.conv8 = nn.Conv2d(256, 256, kernel_size=3, padding=1)
        self.pool3 = nn.MaxPool2d(kernel_size=2, stride=2)
        # Block 4: four 3x3 convs, 512 channels
        self.conv9 = nn.Conv2d(256, 512, kernel_size=3, padding=1)
        self.conv10 = nn.Conv2d(512, 512, kernel_size=3, padding=1)
        self.conv11 = nn.Conv2d(512, 512, kernel_size=3, padding=1)
        self.conv12 = nn.Conv2d(512, 512, kernel_size=3, padding=1)
        self.pool4 = nn.MaxPool2d(kernel_size=2, stride=2)
        # Block 5: four 3x3 convs, 512 channels
        self.conv13 = nn.Conv2d(512, 512, kernel_size=3, padding=1)
        self.conv14 = nn.Conv2d(512, 512, kernel_size=3, padding=1)
        self.conv15 = nn.Conv2d(512, 512, kernel_size=3, padding=1)
        self.conv16 = nn.Conv2d(512, 512, kernel_size=3, padding=1)
        self.pool5 = nn.MaxPool2d(kernel_size=2, stride=2)
        # Classifier: a 224x224 input is downsampled to 7x7x512 after pool5
        self.fc1 = nn.Linear(7 * 7 * 512, 4096)
        self.fc2 = nn.Linear(4096, 4096)
        self.fc3 = nn.Linear(4096, 10)
        self.relu = nn.ReLU()
        self.softmax = nn.Softmax(dim=1)

    def forward(self, x):
        x = self.relu(self.conv1(x))
        x = self.relu(self.conv2(x))
        x = self.pool1(x)
        x = self.relu(self.conv3(x))
        x = self.relu(self.conv4(x))
        x = self.pool2(x)
        x = self.relu(self.conv5(x))
        x = self.relu(self.conv6(x))
        x = self.relu(self.conv7(x))
        x = self.relu(self.conv8(x))
        x = self.pool3(x)
        x = self.relu(self.conv9(x))
        x = self.relu(self.conv10(x))
        x = self.relu(self.conv11(x))
        x = self.relu(self.conv12(x))
        x = self.pool4(x)
        x = self.relu(self.conv13(x))
        x = self.relu(self.conv14(x))
        x = self.relu(self.conv15(x))
        x = self.relu(self.conv16(x))
        x = self.pool5(x)
        x = x.view(x.size(0), -1)  # flatten to (N, 7*7*512)
        x = self.relu(self.fc1(x))
        x = self.relu(self.fc2(x))
        # Softmax yields class probabilities; when training with
        # nn.CrossEntropyLoss, return the raw fc3 logits instead.
        output = self.softmax(self.fc3(x))
        return output


net = VGG19()
data = torch.ones(size=(10, 3, 224, 224))
out = net(data)
print(out.shape)  # torch.Size([10, 10])
print(out)
summary(net)
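Because the forward pass above ends in `nn.Softmax`, one caveat is worth spelling out: PyTorch's `nn.CrossEntropyLoss` already combines `LogSoftmax` and `NLLLoss`, so during training the network would return the raw `fc3` logits. A minimal sketch of a single training step under that convention (the tiny stand-in classifier, batch size, and SGD settings here are illustrative assumptions, not taken from the article; `VGG19()` from above can be substituted for the model):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in classifier so the sketch runs quickly;
# in practice this would be VGG19() returning raw logits.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
criterion = nn.CrossEntropyLoss()  # expects raw logits, no Softmax
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(4, 3, 32, 32)   # dummy batch
labels = torch.randint(0, 10, (4,))  # dummy targets

optimizer.zero_grad()
logits = model(images)               # raw scores, shape (4, 10)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
print(loss.item())
```

The same step would loop over a `DataLoader` in real training; only the forward/backward/step pattern matters here.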
Result:
=================================================================
Layer (type:depth-idx)                   Param #
=================================================================
VGG19                                    --
├─Conv2d: 1-1                            1,792
├─Conv2d: 1-2                            36,928
├─MaxPool2d: 1-3                         --
├─Conv2d: 1-4                            73,856
├─Conv2d: 1-5                            147,584
├─MaxPool2d: 1-6                         --
├─Conv2d: 1-7                            295,168
├─Conv2d: 1-8                            590,080
├─Conv2d: 1-9                            590,080
├─Conv2d: 1-10                           590,080
├─MaxPool2d: 1-11                        --
├─Conv2d: 1-12                           1,180,160
├─Conv2d: 1-13                           2,359,808
├─Conv2d: 1-14                           2,359,808
├─Conv2d: 1-15                           2,359,808
├─MaxPool2d: 1-16                        --
├─Conv2d: 1-17                           2,359,808
├─Conv2d: 1-18                           2,359,808
├─Conv2d: 1-19                           2,359,808
├─Conv2d: 1-20                           2,359,808
├─MaxPool2d: 1-21                        --
├─Linear: 1-22                           102,764,544
├─Linear: 1-23                           16,781,312
├─Linear: 1-24                           40,970
├─ReLU: 1-25                             --
├─Softmax: 1-26                          --
=================================================================
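The per-layer counts in the summary can be checked by hand: a `Conv2d` with a k×k kernel has (k·k·in_channels + 1)·out_channels parameters (the +1 is the per-channel bias), and a `Linear` layer has (in_features + 1)·out_features. A quick sanity check of a few rows above (the helper function names are ours, just for illustration):

```python
def conv_params(in_ch, out_ch, k=3):
    # k*k*in_ch weights per output channel, plus one bias per output channel
    return (k * k * in_ch + 1) * out_ch

def linear_params(in_f, out_f):
    return (in_f + 1) * out_f

print(conv_params(3, 64))                # 1792        (conv1)
print(conv_params(64, 64))               # 36928       (conv2)
print(linear_params(7 * 7 * 512, 4096))  # 102764544   (fc1)
print(linear_params(4096, 10))           # 40970       (fc3)
```

These match the `torchinfo` table row for row, which is a handy way to catch a mis-sized layer before training.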
That wraps up this article on implementing the VGG-19 architecture in PyTorch. I hope it proves helpful, and thanks for your continued support!