Artificial Intelligence: Neurons
2022/3/19 23:37:55
This article introduces artificial neurons through a small worked example. It should be a useful reference for programmers who run into this topic; follow along with the code below.
import math
import numpy as np
import pandas as pd
from pandas import DataFrame
# training data: two inputs (x1, x2) and one target output (y) per sample
y  = [0.14, 0.64, 0.28, 0.33, 0.12, 0.03, 0.02, 0.11, 0.08]
x1 = [0.29, 0.50, 0.00, 0.21, 0.10, 0.06, 0.13, 0.24, 0.28]
x2 = [0.23, 0.62, 0.53, 0.53, 0.33, 0.15, 0.03, 0.23, 0.03]
theta = [-1, -1, -1, -1, -1, -1, -1, -1, -1]  # fixed bias input for every sample
x = np.array([x1, x2, theta])
# initial weights: input -> hidden layer (4 neurons), hidden layer -> output neuron
W_mid = DataFrame(0.5, index=['input1', 'input2', 'theta'], columns=['mid1', 'mid2', 'mid3', 'mid4'])
W_out = DataFrame(0.5, index=['input1', 'input2', 'input3', 'input4', 'theta'], columns=['a'])

def sigmoid(x):  # sigmoid activation: maps any real number into (0, 1)
    return 1 / (1 + math.exp(-x))
# train the network on one sample (data) with target value real
def train(W_out, W_mid, data, real):
    # inputs to the hidden-layer neurons and to the output-layer neuron
    Net_in = DataFrame(data, index=['input1', 'input2', 'theta'], columns=['a'])
    Out_in = DataFrame(0.0, index=['input1', 'input2', 'input3', 'input4', 'theta'], columns=['a'])
    Out_in.loc['theta'] = -1
    # weight-change (delta) tables for the hidden and output layers
    W_mid_delta = DataFrame(0.0, index=['input1', 'input2', 'theta'], columns=['mid1', 'mid2', 'mid3', 'mid4'])
    W_out_delta = DataFrame(0.0, index=['input1', 'input2', 'input3', 'input4', 'theta'], columns=['a'])
    # forward pass: hidden-layer outputs, then the network output
    for i in range(0, 4):
        Out_in.iloc[i] = sigmoid(sum(W_mid.iloc[:, i] * Net_in.iloc[:, 0]))
    res = sigmoid(sum(Out_in.iloc[:, 0] * W_out.iloc[:, 0]))
    error = abs(res - real)
    # output-layer weight changes
    yita = 0.25  # learning rate
    W_out_delta.iloc[:, 0] = yita * res * (1 - res) * (real - res) * Out_in.iloc[:, 0]
    W_out_delta.iloc[4, 0] = -(yita * res * (1 - res) * (real - res))  # bias row: its input is fixed at -1
    W_out = W_out + W_out_delta
    # hidden-layer weight changes
    for i in range(0, 4):
        W_mid_delta.iloc[:, i] = yita * Out_in.iloc[i, 0] * (1 - Out_in.iloc[i, 0]) * W_out.iloc[i, 0] * res * (1 - res) * (real - res) * Net_in.iloc[:, 0]
        W_mid_delta.iloc[2, i] = -(yita * Out_in.iloc[i, 0] * (1 - Out_in.iloc[i, 0]) * W_out.iloc[i, 0] * res * (1 - res) * (real - res))  # bias row
    W_mid = W_mid + W_mid_delta
    return W_out, W_mid, res, error
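For reference, the updates computed in train are the usual delta rule for a network of sigmoid units. Writing o for the output res, t for the target real, h_j for the hidden output stored in Out_in, w_j for the hidden-to-output weight, and x_i for the network input (these symbols are ours, not part of the original code), the weight changes above are

Δw_j = η · o(1−o)(t−o) · h_j            (output-layer weights)
Δv_ij = η · h_j(1−h_j) · w_j · o(1−o)(t−o) · x_i   (hidden-layer weights)

with learning rate η = 0.25. The bias rows are overwritten separately because the bias "input" is fixed at −1, which contributes the minus sign.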
# forward pass only: compute the network output for one input sample
def result(data, W_out, W_mid):
    Net_in = DataFrame(data, index=['input1', 'input2', 'theta'], columns=['a'])
    Out_in = DataFrame(0.0, index=['input1', 'input2', 'input3', 'input4', 'theta'], columns=['a'])
    Out_in.loc['theta'] = -1
    # hidden-layer outputs
    for i in range(0, 4):
        Out_in.iloc[i] = sigmoid(sum(W_mid.iloc[:, i] * Net_in.iloc[:, 0]))
    res = sigmoid(sum(Out_in.iloc[:, 0] * W_out.iloc[:, 0]))
    return res
# one pass over the nine training samples, then two predictions
for i in range(0, 9):
    W_out, W_mid, res, error = train(W_out, W_mid, x[0:, i], y[i])
res1 = result([0.38, 0.49, -1], W_out, W_mid)
res2 = result([0.29, 0.47, -2], W_out, W_mid)
print(res1, res2)
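The code above makes only a single pass over the nine samples. If you want to train longer, the same train function can simply be called repeatedly; here is a minimal sketch (the epoch count and the error printout are our additions, not part of the original code, and it assumes the definitions above):

for epoch in range(100):  # hypothetical number of epochs
    total_error = 0
    for i in range(0, 9):
        W_out, W_mid, res, error = train(W_out, W_mid, x[0:, i], y[i])
        total_error += error
    # watch the mean absolute error fall as training progresses
    print(epoch, total_error / 9)

After the extra epochs, res1 and res2 can be recomputed with result() to see how the predictions change.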
That concludes this article on artificial neurons. We hope it has been helpful, and we hope you will continue to support 为之网!