EXPLOITING SPARSENESS IN DEEP NEURAL NETWORKS FOR LARGE VOCABULARY SPEECH RECOGNITION

Dong Yu1, Frank Seide2, Gang Li2, Li Deng1
1 Microsoft Research, Redmond, USA
2 Microsoft Research Asia, Beijing, P.R.C.
{dongyu,fseide,ganl,deng}@microsoft.com

ABSTRACT

Recently, we developed context-dependent deep neural network (DNN) hidden Markov models for large vocabulary speech recognition. While reducing errors by 33% compared to its discriminatively trained Gaussian-mixture counterpart on the Switchboard benchmark task, the DNN requires many more parameters. In this paper, we report our recent work on DNNs for improved generalization, model size, and computation speed by exploiting parameter sparseness. We formulate the goal of enforcing sparseness as soft regularization and convex constraint optimization problems, and propose solutions under the stochastic gradient ascent setting. We also propose novel data structures that exploit the random sparseness patterns to reduce model size and computation time. The proposed solutions have been evaluated on the voice-search and Switchboard datasets. They have decreased the number of nonzero connections to one third while reducing the error rate by 0.2-0.3% over the fully connected model on both datasets. The nonzero connections have been further reduced to only 12% and 19% on the two respective datasets without sacrificing speech recognition performance. Under these conditions we can reduce the model size to 18% and 29%, and computation to 14% and 23%, respectively, on these two datasets.

Index Terms: speech recognition, deep belief networks, deep neural networks, sparseness

1. INTRODUCTION

Recently, we have witnessed the resurrection of artificial neural network (NN) hidden Markov model (HMM) hybrid systems for speech recognition. This is mainly attributed to the discovery of the strong modeling ability of deep neural networks (DNNs) and the availability of high-speed general-purpose graphical processing units (GPGPUs) for training DNNs. A notable advance is the context-dependent DNN-HMMs (CD-DNN-HMMs), in which DNNs directly model the senones (i.e., tied CD phone states) and approximate their emission probabilities in HMM speech recognizers [1]. CD-DNN-HMMs have been shown to be highly promising. They have achieved 16% [1] and 33% [2,3
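The abstract's central idea, enforcing sparseness on the DNN's weight parameters so that most connections become exactly zero, can be illustrated with a minimal sketch. The paper formulates this via soft regularization under stochastic gradient training; the snippet below shows only the generic soft-thresholding (shrinkage) step commonly used to drive weights to zero under L1-style regularization, applied to a toy weight list. All function names, the example weights, and the threshold value are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of weight sparsification by soft-thresholding,
# the generic shrinkage step behind L1-style regularization. The paper's
# actual formulation (soft regularization / convex constraints under
# stochastic gradient ascent) is more involved; this only shows how
# shrinkage yields the kind of sparse weight patterns the paper exploits.

def soft_threshold(w, lam):
    """Shrink a weight toward zero by lam; weights with |w| <= lam become 0."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0

def sparsify(weights, lam):
    """Apply soft-thresholding elementwise to a flat list of weights."""
    return [soft_threshold(w, lam) for w in weights]

def nonzero_fraction(weights):
    """Fraction of connections that remain nonzero after sparsification."""
    return sum(1 for w in weights if w != 0.0) / len(weights)

if __name__ == "__main__":
    # Toy weight vector: a few strong connections among many weak ones.
    weights = [0.9, -0.05, 0.2, 0.0, -0.8, 0.03, 0.5, -0.02]
    pruned = sparsify(weights, lam=0.1)
    print(nonzero_fraction(pruned))  # → 0.5 (half the connections survive)
```

Only the surviving nonzero weights (here 4 of 8) need to be stored and multiplied at decode time, which is the source of the model-size and computation reductions the abstract reports; the paper's proposed data structures store just those nonzero connections and their indices.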