Convolutional Neural Networks
2021/4/15 18:58:19
Contents
- 2D Convolution
- Kernel size
- Padding & Stride
- Channels
- For instance
- LeNet-5
- Pyramid Architecture
- layers.Conv2D
- weight & bias
- nn.conv2d
- Gradient?
- For instance
2D Convolution
Kernel size
- Matrix convolution
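The matrix convolution above can be sketched in plain NumPy. A minimal single-channel, 'valid' version (function name is ours; note that what deep-learning libraries call "convolution" is actually cross-correlation, which is what this computes):

```python
import numpy as np

def conv2d_single(x, k):
    """'Valid' cross-correlation of a 2D input x with a 2D kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # slide the kernel over the input, summing elementwise products
            out[i, j] = (x[i:i + kh, j:j + kw] * k).sum()
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
k = np.ones((3, 3))
print(conv2d_single(x, k).shape)  # (2, 2)
```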
Padding & Stride
- Stride 2
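With input length n, kernel size k, padding p, and stride s, the output length along one spatial dimension is floor((n + 2p - k) / s) + 1. A quick sanity check in plain Python (helper name is ours):

```python
def conv_out_size(n, k, p, s):
    """Output length along one spatial dimension of a convolution."""
    return (n + 2 * p - k) // s + 1

# 32x32 input, 5x5 kernel -- matches the Conv2D examples below
print(conv_out_size(32, 5, 0, 1))  # 28 (padding='valid')
print(conv_out_size(32, 5, 2, 1))  # 32 (padding='same')
print(conv_out_size(32, 5, 2, 2))  # 16 (padding='same', stride 2)
```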
Channels
For instance
- x: [b,28,28,3]
- one k: [3,3,3]
- multi-k: [16,3,3,3]
- stride: 1
- padding: [1,1,1,1]
- bias: [16]
- out: [b,28,28,16]
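The shape bookkeeping in the list above can be checked with a naive NumPy sketch (loops are for illustration only, and the function name is ours; real frameworks use optimized kernels):

```python
import numpy as np

def conv2d_nhwc(x, k, bias, pad=1, stride=1):
    """x: [B,H,W,Cin], k: [Cout,kh,kw,Cin], bias: [Cout]."""
    B, H, W, Cin = x.shape
    Cout, kh, kw, _ = k.shape
    # zero-pad the spatial dimensions only
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad), (0, 0)))
    Ho = (H + 2 * pad - kh) // stride + 1
    Wo = (W + 2 * pad - kw) // stride + 1
    out = np.zeros((B, Ho, Wo, Cout))
    for i in range(Ho):
        for j in range(Wo):
            patch = xp[:, i*stride:i*stride+kh, j*stride:j*stride+kw, :]
            # contract kh, kw, Cin; one output value per kernel
            out[:, i, j, :] = np.tensordot(patch, k, axes=([1, 2, 3], [1, 2, 3])) + bias
    return out

x = np.random.randn(2, 28, 28, 3)     # b=2
k = np.random.randn(16, 3, 3, 3)      # 16 kernels of [3,3,3]
b = np.zeros(16)
print(conv2d_nhwc(x, k, b).shape)     # (2, 28, 28, 16)
```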
LeNet-5
Pyramid Architecture
- From low-level edges and colors to high-level abstract concepts (wheels, car windows)
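A LeNet-5-style pyramid can be sketched in Keras. This is an approximation, not the exact 1998 architecture (the original also has a Gaussian-connection output layer); the point is the pyramid shape: spatial size shrinks while the channel count grows.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 1)),
    layers.Conv2D(6, kernel_size=5, activation='tanh'),   # -> (28, 28, 6)
    layers.AveragePooling2D(2),                           # -> (14, 14, 6)
    layers.Conv2D(16, kernel_size=5, activation='tanh'),  # -> (10, 10, 16)
    layers.AveragePooling2D(2),                           # -> (5, 5, 16)
    layers.Flatten(),
    layers.Dense(120, activation='tanh'),
    layers.Dense(84, activation='tanh'),
    layers.Dense(10),                                     # class logits
])
print(model.output_shape)  # (None, 10)
```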
layers.Conv2D
import tensorflow as tf
from tensorflow.keras import layers
x = tf.random.normal([1, 32, 32, 3])
# padding='valid': output spatial size differs from the input
layer = layers.Conv2D(4, kernel_size=5, strides=1, padding='valid')
out = layer(x)
out.shape
TensorShape([1, 28, 28, 4])
# padding='same': output spatial size matches the input
layer = layers.Conv2D(4, kernel_size=5, strides=1, padding='same')
out = layer(x)
out.shape
TensorShape([1, 32, 32, 4])
layer = layers.Conv2D(4, kernel_size=5, strides=2, padding='same')
out = layer(x)
out.shape
TensorShape([1, 16, 16, 4])
layer.call(x).shape
TensorShape([1, 16, 16, 4])
weight & bias
layer = layers.Conv2D(4, kernel_size=5, strides=2, padding='same')
out = layer(x)
out.shape
TensorShape([1, 16, 16, 4])
# 5,5 -> kernel size, 3 -> input channels, 4 -> number of kernels
layer.kernel.shape
TensorShape([5, 5, 3, 4])
layer.bias
<tf.Variable 'conv2d_11/bias:0' shape=(4,) dtype=float32, numpy=array([0., 0., 0., 0.], dtype=float32)>
nn.conv2d
w = tf.random.normal([5, 5, 3, 4])
b = tf.zeros([4])
x.shape
TensorShape([1, 32, 32, 3])
out = tf.nn.conv2d(x, w, strides=1, padding='VALID')
out.shape
TensorShape([1, 28, 28, 4])
out = out + b
out.shape
TensorShape([1, 28, 28, 4])
out = tf.nn.conv2d(x, w, strides=2, padding='VALID')
out.shape
TensorShape([1, 14, 14, 4])
Gradient?
\[\frac{\partial{Loss}}{\partial{w}} \]
For instance
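The gradient of the loss with respect to the kernel can be obtained with tf.GradientTape. A sketch reusing the shapes from the nn.conv2d example above (the mean-squared loss here is a stand-in for a real training loss):

```python
import tensorflow as tf

x = tf.random.normal([1, 32, 32, 3])
w = tf.Variable(tf.random.normal([5, 5, 3, 4]))
b = tf.Variable(tf.zeros([4]))

with tf.GradientTape() as tape:
    out = tf.nn.conv2d(x, w, strides=1, padding='VALID') + b
    loss = tf.reduce_mean(tf.square(out))  # stand-in loss

# dLoss/dw has the same shape as w; dLoss/db the same shape as b
grads = tape.gradient(loss, [w, b])
print(grads[0].shape)  # (5, 5, 3, 4)
print(grads[1].shape)  # (4,)
```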