《信息论实验作业.doc》 (Information Theory Lab Assignment), shared on 天天文库, 2020-03-18.
Lab 1: Computing Information Measures

1. Basic requirements: write MATLAB programs that compute the entropy, joint entropy, conditional entropy, and mutual information of discrete random variables.
2. Input: a discrete probability distribution.
3. Output: the information measure, in bits.
4. Function descriptions:
   Entropy:             e = Entropy(x)          x is a vector representing a discrete probability distribution
   Joint entropy:       e = JEntropy(xy)        xy is the joint probability matrix
   Conditional entropy: e = CEntropy(xy, sign)  xy is the joint probability matrix; sign = 'x' computes H(Y|X), sign = 'y' computes H(X|Y)
   Mutual information:  e = IInfo(xy)           xy is the joint probability matrix

Program:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function e = Entropy(x)
% Entropy; x is a vector representing a discrete probability distribution
x(x == 0) = 1;                 % map zero probabilities to 1 so 0*log2(0) contributes 0
e = -sum(x .* log2(x))         % no trailing semicolon, so the result is echoed
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function e = JEntropy(xy)
% Joint entropy; xy is the joint probability matrix
xy(xy == 0) = 1;
e = -sum(sum(xy .* log2(xy)))
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function e = CEntropy(xy, sign)
% Conditional entropy; xy is the joint probability matrix.
% sign = 'x' returns H(Y|X); sign = 'y' returns H(X|Y).
x = sum(xy, 2);                % marginal of X (row sums)
y = sum(xy, 1);                % marginal of Y (column sums)
if sign == 'x'
    e = JEntropy(xy) - Entropy(x);    % H(Y|X) = H(X,Y) - H(X)
elseif sign == 'y'
    e = JEntropy(xy) - Entropy(y);    % H(X|Y) = H(X,Y) - H(Y)
else
    error('sign must be ''x'' or ''y''')  % the original assigned the undefined name FALSE here
end
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function e = IInfo(xy)
% Mutual information; xy is the joint probability matrix.
% Returns the matrix of pointwise values log2( p(x,y) / (p(x)p(y)) );
% entries with p(x,y) = 0 come out as -Inf.
x = sum(xy, 2);
y = sum(xy, 1);
e = log2(xy) - log2(x * y)     % x*y is the outer product of the marginals
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
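As a cross-check of the entropy routines above, here is a minimal sketch of the same formulas in pure Python (the function names mirror the MATLAB ones but are otherwise my own, not part of the assignment):

```python
import math

def entropy(p):
    # H(X) = -sum p*log2(p), treating 0*log2(0) as 0
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def j_entropy(pxy):
    # joint entropy: the entropy of the flattened joint distribution
    return entropy([p for row in pxy for p in row])

def c_entropy(pxy, sign):
    # sign='x': H(Y|X) = H(X,Y) - H(X); sign='y': H(X|Y) = H(X,Y) - H(Y)
    px = [sum(row) for row in pxy]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*pxy)]    # marginal of Y (column sums)
    if sign == 'x':
        return j_entropy(pxy) - entropy(px)
    if sign == 'y':
        return j_entropy(pxy) - entropy(py)
    raise ValueError("sign must be 'x' or 'y'")

# the joint distribution used in the test data of this lab
pxy = [[1/4, 1/18, 0], [1/18, 1/3, 1/18], [0, 1/18, 7/36]]
print(round(entropy([sum(r) for r in pxy]), 4))  # H(X)   -> 1.5426
print(round(j_entropy(pxy), 4))                  # H(X,Y) -> 2.4144
print(round(c_entropy(pxy, 'y'), 4))             # H(X|Y) -> 0.8717
```

The printed values agree with the MATLAB test transcript, which gives some confidence that the formulas are implemented consistently.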
The functions were tested on the joint distribution in the table below (the last row and column are the marginals):

    p(ai,aj)   aj=0     aj=1     aj=2     p(ai)
    ai=0       1/4      1/18     0        11/36
    ai=1       1/18     1/3      1/18     4/9
    ai=2       0        1/18     7/36     1/4
    p(aj)      11/36    4/9      1/4

Test results:

>> x = [11/36 4/9 1/4];
>> Entropy(x)
e = 1.5426
>> xy = [1/4 1/18 0; 1/18 1/3 1/18; 0 1/18 7/36];
>> JEntropy(xy)
e = 2.4144
>> CEntropy(xy, 'y')
e = 0.8717
>> e = IInfo(xy)
e =
    1.4210   -1.2895      -Inf
   -1.2895    0.7549   -1.0000
      -Inf   -1.0000    1.6374

The -Inf entries correspond to symbol pairs with zero joint probability.
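Note that IInfo returns the matrix of pointwise mutual information values i(x;y) = log2(p(x,y)/(p(x)p(y))), not the single average mutual information I(X;Y). A small Python sketch (function names are mine) reproduces the matrix and then averages it, weighted by p(x,y), to obtain I(X;Y), which equals H(X) + H(Y) - H(X,Y):

```python
import math

def pmi_matrix(pxy):
    # pointwise mutual information i(x;y) = log2(p(x,y) / (p(x) * p(y)));
    # pairs with p(x,y) = 0 give -inf, matching MATLAB's log2(0)
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    return [[math.log2(pxy[i][j] / (px[i] * py[j])) if pxy[i][j] > 0
             else float('-inf')
             for j in range(len(py))] for i in range(len(px))]

pxy = [[1/4, 1/18, 0], [1/18, 1/3, 1/18], [0, 1/18, 7/36]]
m = pmi_matrix(pxy)

# average mutual information: I(X;Y) = sum of p(x,y) * i(x;y)
# over the pairs with p(x,y) > 0
i_xy = sum(pxy[i][j] * m[i][j]
           for i in range(3) for j in range(3) if pxy[i][j] > 0)
print(round(m[0][0], 3))   # corner entry of the matrix -> 1.421
print(round(i_xy, 4))      # I(X;Y) in bits -> 0.6709
```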