ID: 59191256 · Size: 1.12 MB · Pages: 32 · Date: 2020-09-26
《秦晓飞系列-深度学习-1.3浅层神经网络ppt课件.ppt》 was uploaded and shared by a member for free online reading; more related content is available under Education Resources at 天天文库.
浅层神经网络 Shallow Neural Networks
Instructor (主讲教师): 秦晓飞, 上海理工大学光电学院 (School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology)

3.1 Neural Network Overview
We use a single neuron to implement logistic regression.
We then use single neurons as LEGO bricks to build a single-hidden-layer neural network.

3.2 Neural Network Representation
We use an SNN with 3 inputs, 1 output, and 1 hidden layer of 4 neurons as an example:
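The single-neuron view of logistic regression can be sketched in NumPy; the function names are illustrative, not the slides' own code:

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: maps any real z into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def single_neuron(w, b, x):
    # One neuron = affine combination followed by an activation,
    # which is exactly logistic regression: y_hat = sigma(w . x + b).
    return sigmoid(np.dot(w, x) + b)
```

For example, `single_neuron(np.array([0.5, -0.5, 1.0]), 0.0, np.array([1.0, 2.0, 3.0]))` computes sigmoid(2.5) ≈ 0.924.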
Input layer, hidden layer, output layer. "Hidden" means that the training set contains no known values for this layer's outputs. We call this a 2-layer NN: the layer count includes only layers with parameters that need to be learned, so the input layer is not counted.

3.3 Computing a Neural Network's Output
We use the same SNN with 3 inputs, 1 output, and 1 hidden layer of 4 neurons as an example.
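The four forward-propagation equations for this 3–4–1 example can be sketched in NumPy. Weight shapes follow the usual n^[l] × n^[l-1] convention; the tanh hidden activation, the random seed, and the variable names are my assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
# Layer sizes for the example SNN: 3 inputs -> 4 hidden neurons -> 1 output.
W1, b1 = rng.standard_normal((4, 3)) * 0.01, np.zeros((4, 1))
W2, b2 = rng.standard_normal((1, 4)) * 0.01, np.zeros((1, 1))

x = rng.standard_normal((3, 1))  # one training example, shape (3, 1)

# The four forward-propagation equations for a single example:
z1 = W1 @ x + b1                 # (4, 1) hidden pre-activations
a1 = np.tanh(z1)                 # (4, 1) hidden activations
z2 = W2 @ a1 + b2                # (1, 1) output pre-activation
a2 = sigmoid(z2)                 # (1, 1) output; acts as logistic regression
```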
How do we vectorize these multi-neuron calculations? The output layer is a single neuron that acts as a logistic regression. These four equations are the forward propagation of an SNN, and they handle only one training example. What about the whole training set?

3.4 Vectorizing across Multiple Examples
for i = 1 to m — how do we remove this for-loop? Stack the training examples as columns, so that each per-example product becomes one matrix product. The resulting matrices have one column per training example (# training examples) and one row per neuron (# neuron nodes).
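A minimal sketch of removing the loop (the sizes are illustrative): with the m examples stacked as columns of X, the single product Z^[1] = W^[1] X + b^[1] computes every column at once, with b^[1] broadcast across the examples axis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_x, n_h, m = 3, 4, 5                  # inputs, hidden neurons, training examples
W1, b1 = rng.standard_normal((n_h, n_x)), rng.standard_normal((n_h, 1))
X = rng.standard_normal((n_x, m))      # columns are examples x^(1) ... x^(m)

# Loop version: one column (one training example) at a time.
Z_loop = np.zeros((n_h, m))
for i in range(m):
    Z_loop[:, i:i+1] = W1 @ X[:, i:i+1] + b1

# Vectorized version: Z1 = W1 X + b1, with b1 broadcast across columns.
Z_vec = W1 @ X + b1
```

Both versions produce the same (n_h, m) matrix, which is the justification sketched in 3.5.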
3.5 Justification for Vectorized Implementation
Writing out the i = 1 to m loop column by column shows that the stacked matrix product computes exactly the same per-example results.

3.6 Activation Functions
When |z| is large, the derivatives of σ and tanh vanish. ReLU is not differentiable at 0, but in practice z has little chance of landing exactly there. ReLU's derivative vanishes when z < 0, but when a layer has enough neurons, enough z values will be greater than 0 and have derivative 1, which lets gradient descent go on.

3.7 Why Do We Need a Nonlinear Activation Function?
With a linear (identity) activation in the hidden layer, a one-hidden-layer SNN is equivalent to plain logistic regression: the hidden layer collapses away. Never use a linear activation, except in the output layer of a linear-regression problem.

3.8 Derivatives of Activation Functions

3.9 Gradient Descent for Neural Networks
Parameters …
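The derivative formulas behind 3.6 and 3.8 can be sketched as follows; the numeric check at z = 10 illustrates the vanishing-derivative point. The helper names are mine:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dsigmoid(z):
    s = sigmoid(z)
    return s * (1.0 - s)                 # sigma'(z) = sigma(z) (1 - sigma(z))

def dtanh(z):
    return 1.0 - np.tanh(z) ** 2         # tanh'(z) = 1 - tanh(z)^2

def drelu(z):
    # ReLU'(z) = 1 for z > 0 and 0 for z < 0; at z = 0 it is not
    # differentiable, and this sketch arbitrarily returns 0 there.
    return (np.asarray(z) > 0).astype(float)

# At a large z the sigma and tanh gradients have all but vanished,
# while ReLU still passes gradient 1 -- so gradient descent goes on.
```

At z = 10, σ'(z) ≈ 4.5e-5 and tanh'(z) ≈ 8.2e-9, while ReLU'(z) = 1.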
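The equivalence claimed in 3.7 can be checked numerically: with an identity activation, the hidden layer folds into the output layer's affine map, W = W2 W1 and b = W2 b1 + b2, leaving an ordinary logistic regression. A sketch, with random parameters matching the 3–4–1 example:

```python
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal((4, 1))
W2, b2 = rng.standard_normal((1, 4)), rng.standard_normal((1, 1))
x = rng.standard_normal((3, 1))

# Hidden layer with identity (linear) activation: a1 = z1 = W1 x + b1.
z2 = W2 @ (W1 @ x + b1) + b2

# Equivalent single affine map: the hidden layer adds nothing.
W, b = W2 @ W1, W2 @ b1 + b2
z2_collapsed = W @ x + b
```

Since `z2` equals `z2_collapsed` for every input, applying σ to either gives the same output as a single logistic-regression neuron.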