Professor Forcing: A New Algorithm for Training Recurrent Networks

Alex Lamb*, Anirudh Goyal* (MILA, Université de Montréal)
lambalex@iro.umontreal.ca, anirudhgoyal9119@gmail.com
Ying Zhang, Saizheng Zhang, Aaron Courville (MILA, Université de Montréal)
ying.zhlisa@gmail.com, saizhenglisa@gmail.com, aaron.courville@gmail.com
Yoshua Bengio (MILA, Université de Montréal, CIFAR Senior Fellow)
yoshua.umontreal@gmail.com

* Indicates first authors. Ordering determined by coin flip.

arXiv:1610.09038v1 [stat.ML] 27 Oct 2016
29th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain.

Abstract

The Teacher Forcing algorithm trains recurrent networks by supplying observed sequence values as inputs during training and using the network's own one-step-ahead predictions to do multi-step sampling. We introduce the Professor Forcing algorithm, which uses adversarial domain adaptation to encourage the dynamics of the recurrent network to be the same when training the network and when sampling from the network over multiple time steps. We apply Professor Forcing to language modeling, vocal synthesis on raw waveforms, handwriting generation, and image generation. Empirically we find that Professor Forcing acts as a regularizer, improving test likelihood on character-level Penn Treebank and sequential MNIST. We also find that the model qualitatively improves samples, especially when sampling for a large number of time steps. This is supported by human evaluation of sample quality. Trade-offs between Professor Forcing and Scheduled Sampling are discussed. We produce T-SNEs showing that Professor Forcing successfully makes the dynamics of the network during training and sampling more similar.

1 Introduction

Recurrent neural networks (RNNs) have become the generative models of choice for sequential data (Graves, 2012), with impressive results in language modeling (Mikolov, 2010; Mikolov and Zweig, 2012), speech recognition (Bahdanau et al., 2015; Chorowski et al., 2015), machine translation (Cho et al., 2014a; Sutskever et al., 2014; Bahdanau et al., 2014), handwriting generation (Graves, 2013), image caption generation (Xu et al., 2015; Chen and Lawrence Zitnick, 2015), etc. The RNN models the data v
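The train/sample mismatch that motivates the paper can be made concrete with a minimal sketch contrasting the two regimes the abstract describes: teacher-forced training, where the observed token is fed at each step, and free-running sampling, where the network's own one-step-ahead prediction is fed back. This is an illustrative toy RNN, not the paper's actual model or code; all names and the greedy-decoding choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 5, 8                      # toy vocabulary size, hidden size
Wx = rng.normal(0, 0.1, (H, V))  # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))  # hidden-to-hidden weights
Wo = rng.normal(0, 0.1, (V, H))  # hidden-to-output weights

def step(h, x_id):
    """One RNN step: consume token x_id, return new state and output logits."""
    x = np.zeros(V)
    x[x_id] = 1.0
    h = np.tanh(Wx @ x + Wh @ h)
    return h, Wo @ h

def teacher_forced(seq):
    """Training regime: the *observed* previous token is the input at every step."""
    h, logits = np.zeros(H), []
    for x_id in seq[:-1]:
        h, out = step(h, x_id)
        logits.append(out)
    return logits  # each entry is scored against the next observed token, seq[1:]

def free_running(start_id, n_steps):
    """Sampling regime: the model's own prediction is fed back as the next input."""
    h, x_id, out_ids = np.zeros(H), start_id, []
    for _ in range(n_steps):
        h, out = step(h, x_id)
        x_id = int(np.argmax(out))  # greedy one-step-ahead prediction
        out_ids.append(x_id)
    return out_ids
```

Because `teacher_forced` always conditions on ground-truth history while `free_running` conditions on its own (possibly erroneous) outputs, the hidden-state dynamics of the two regimes can drift apart; Professor Forcing's adversarial objective, as the abstract states, pushes these two distributions of dynamics to match.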