Entropy and Mutual Information [excerpt from Cover & Thomas, Elements of Information Theory]

CHAPTER 2
ENTROPY, RELATIVE ENTROPY, AND MUTUAL INFORMATION

In this chapter we introduce most of the basic definitions required for subsequent development of the theory. It is irresistible to play with their relationships and interpretations, taking faith in their later utility. After defining entropy and mutual information, we establish chain rules, the nonnegativity of mutual information, the data-processing inequality, and illustrate these definitions by examining sufficient statistics and Fano's inequality.

The concept of information is too broad to be captured completely by a single definition. However, for any probability distribution, we define a quantity called the entropy, which has many properties that agree with the intuitive notion of what a measure of information should be. This notion is extended to define mutual information, which is a measure of the amount of information one random variable contains about another. Entropy then becomes the self-information of a random variable. Mutual information is a special case of a more general quantity called relative entropy, which is a measure of the distance between two probability distributions. All these quantities are closely related and share a number of simple properties, some of which we derive in this chapter.
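The precise definitions of relative entropy and mutual information are given later in the chapter and are not part of this excerpt; as a brief, non-authoritative preview of why mutual information is a special case of relative entropy, the standard forms are

    D(p \| q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)}

    I(X; Y) = D\big( p(x, y) \,\|\, p(x)\, p(y) \big) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}

so mutual information is the relative entropy between the joint distribution and the product of its marginals.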

In later chapters we show how these quantities arise as natural answers to a number of questions in communication, statistics, complexity, and gambling. That will be the ultimate test of the value of these definitions.

2.1 ENTROPY

We first introduce the concept of entropy, which is a measure of the uncertainty of a random variable. Let X be a discrete random variable with alphabet 𝒳 and probability mass function p(x) = Pr{X = x}, x ∈ 𝒳.

[Elements of Information Theory, Second Edition, by Thomas M. Cover and Joy A. Thomas. Copyright 2006 John Wiley & Sons, Inc.]

We denote the probability mass function by p(x) rather than p_X(x), for convenience. Thus, p(x) and p(y) refer to two different random variables and are in fact different probability mass functions, p_X(x) and p_Y(y), respectively.

Definition: The entropy H(X) of a discrete random variable X is defined by

    H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x).    (2.1)

We also write H(p) for the above quantity. The log is to the base 2 and entropy is expressed in bits. For example, the entropy of a fair coin toss is 1 bit.
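As a quick numerical check of definition (2.1), here is a minimal Python sketch; it is not from the book, and the helper name entropy_bits is hypothetical. It evaluates H(p) in bits for a finite probability mass function, using the usual convention that terms with p(x) = 0 contribute 0.

    import math

    def entropy_bits(pmf):
        # H(p) = -sum over x of p(x) * log2 p(x), skipping zero-probability
        # outcomes so that 0 log 0 is treated as 0.
        return -sum(p * math.log2(p) for p in pmf if p > 0)

    print(entropy_bits([0.5, 0.5]))    # fair coin: 1.0 bit
    print(entropy_bits([0.9, 0.1]))    # biased coin: about 0.469 bits
    print(entropy_bits([0.25] * 4))    # uniform over 4 outcomes: 2.0 bits

The uniform case illustrates the general fact that a uniform distribution over n outcomes has entropy log2 n bits.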
