HBCV2 Biased Cross-Validation smoothing parameter for 2D data.

 CALL: hs = hbcv2(data,kernel);

   hs     = smoothing parameter
   data   = two column data matrix
   kernel = 'gaussian'      - Gaussian kernel
            'epanechnikov'  - Epanechnikov kernel.
            'biweight'      - Bi-weight kernel.
            'triweight'     - Tri-weight kernel.
            'triangular'    - Triangular kernel.
            'rectangular'   - Rectangular kernel.
            'laplace'       - Laplace kernel.
            'logistic'      - Logistic kernel.

 Note that only the first 4 letters of the kernel name are needed.

 HBCV is a hybrid of cross-validation and direct plug-in estimates.
 The main attraction of HBCV is that it is more stable than HLSCV in
 the sense that its asymptotic variance is considerably lower. However,
 this reduction in variance comes at the expense of an increase in
 bias, with HBCV tending to be larger than the HNS estimate.
 Asymptotically, HBCV has a relatively slow convergence rate.

 Example: data = wnormrnd(0,1,20,2);
          hs   = hbcv2(data,'gaus');

 See also  hste, hboot, hns, hos, hldpi, hlscv, hscv, hstt, kde, kdefun
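The claims above about HLSCV and HNS can be sanity-checked numerically. The
sketch below is illustrative only: it assumes the WAFO toolbox is on the path
and that hns and hlscv accept a two-column matrix and operate column-wise,
which may differ between WAFO versions.

   data    = wnormrnd(0,1,100,2);  % 100 samples from a standard 2D normal
   hs_bcv  = hbcv2(data,'gaus')    % biased CV: tends to be the largest
   hs_ns   = hns(data,'gaus')      % normal-scale (HNS) reference bandwidth
   hs_lscv = hlscv(data,'gaus')    % least-squares CV: higher variance

On repeated draws, hs_lscv should scatter more from sample to sample than
hs_bcv, illustrating the variance reduction bought at the price of bias.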
This function calls:
  hste        - 2-Stage Solve the Equation estimate of smoothing parameter.
  kernelstats - Return 2nd order moment of kernel pdf.
  mkernel     - Multivariate Kernel Function.
  clear       - Clear variables and functions from memory.
  error       - Display message and abort function.
  mean        - Average or mean value.
  norm        - Matrix or vector norm.
  std         - Standard deviation.
function h=hbcv2(A,kernel)
%HBCV2 Biased Cross-Validation smoothing parameter for 2D data.
%
%  CALL: hs = hbcv2(data,kernel);
%
%    hs     = smoothing parameter
%    data   = two column data matrix
%    kernel = 'gaussian'      - Gaussian kernel
%             'epanechnikov'  - Epanechnikov kernel.
%             'biweight'      - Bi-weight kernel.
%             'triweight'     - Tri-weight kernel.
%             'triangular'    - Triangular kernel.
%             'rectangular'   - Rectangular kernel.
%             'laplace'       - Laplace kernel.
%             'logistic'      - Logistic kernel.
%
%  Note that only the first 4 letters of the kernel name are needed.
%
%  HBCV is a hybrid of cross-validation and direct plug-in estimates.
%  The main attraction of HBCV is that it is more stable than HLSCV in
%  the sense that its asymptotic variance is considerably lower. However,
%  this reduction in variance comes at the expense of an increase in
%  bias, with HBCV tending to be larger than the HNS estimate.
%  Asymptotically, HBCV has a relatively slow convergence rate.
%
%  Example: data = wnormrnd(0,1,20,2);
%           hs   = hbcv2(data,'gaus');
%
%  See also  hste, hboot, hns, hos, hldpi, hlscv, hscv, hstt, kde, kdefun

% tested on : matlab 5.2
% history:
% revised pab aug2005
%  -added ad hoc support for other kernels than Gaussian
% revised pab dec 2003
%  -fixed some bugs
%  -added todo comments
% revised pab 20.10.1999
%  updated to matlab 5.2
%  changed input arguments
% taken from kdetools              Christian C. Beardah 1993-94

% TODO % Add support for other kernels than Gaussian

[n, d] = size(A);
if d~=2,
  error('Wrong shape of data')
end

% Transform A so the columns have mean 0 and st. dev. 1:
oldA = A;
A = (oldA - ones(n,1)*mean(oldA))*diag(1./std(oldA));

maxit = 20;      % maximum number of iterations
tol   = 1e-2;    % convergence tolerance on the iterate
gradf = zeros(2,1);
delta = 1e-6;    % step for the central-difference gradient

% Initial iterate: univariate solve-the-equation bandwidths per column.
vx(1) = hste(A(:,1),kernel);
vy(1) = hste(A(:,2),kernel);

x0 = vx(1);
y0 = vy(1);

fn(1) = bcv2(A,vx(1),vy(1));

% R   = int(mkernel(x)^2)
% mu2 = int(x^2*mkernel(x))
kernel2 = 'gauss';
% values for the Gaussian kernel:
[mu2ns,Rns] = kernelstats(kernel2);
Rns = Rns^d;
STEconstant2 = Rns/(mu2ns^2*n);

% values for the requested kernel:
[mu2,R] = kernelstats(kernel);
R = R^d;
STEconstant = R/(mu2^2*n);

i = 1; % i is the iteration counter.

while i <= maxit,

  % Central-difference estimate of the gradient of the BCV2 criterion:
  gradf(1) = (bcv2(A,x0+delta,y0)-bcv2(A,x0-delta,y0))/(2*delta);
  gradf(2) = (bcv2(A,x0,y0+delta)-bcv2(A,x0,y0-delta))/(2*delta);

  % Normalize; the search-line length shrinks as the iterations proceed:
  gradf = gradf/(i*norm(gradf));

  r0 = [x0 y0]';

  % Evaluate the criterion at 31 points along the gradient line:
  for j=1:31,
    M(j,:) = (r0+(j-16)*gradf/15)';
  end

  L = (M(:,1)>0) & (M(:,2)>0); % flag points with positive bandwidths

  L = remzero([M,L]); % drops every row of [M,L] containing a zero,
                      % i.e. all points flagged by L==0
  M = L(:,1:2);

  for j=1:size(M,1), % size(M,1), not length(M), in case few rows remain
    t(j,1) = bcv2(A,M(j,1),M(j,2));
  end

  [N,I] = min(t);

  r = M(I,:)';

  vx(i+1) = r(1); % Store iterate.
  vy(i+1) = r(2);

  fn(i+1) = N;

  if norm(r-r0) < tol,
    h = r';

    % Transform values of h back to the original scale, rescaling from
    % the Gaussian to the requested kernel (d=2, so the asymptotic
    % exponent 1/(d+4) equals 1/6):
    h = diag(std(oldA))*h'*(STEconstant/STEconstant2)^(1/6);

    return
  end

  i = i+1; % Update the iteration counter.

  x0 = r(1);
  y0 = r(2);

  clear t % t may change length between iterations

end % of while statement.

% Not converged: return the last iterate, transformed back as above,
% so that the output is always defined.
h = diag(std(oldA))*r*(STEconstant/STEconstant2)^(1/6);
disp('Maximum number of iterations reached - terminating execution.')
return

function z=bcv2(A,h1,h2)
% BCV2 Biased cross-validation criterion function.
%
%  BCV2(A,H1,H2)
%
% Christian C. Beardah 1993-94

n = size(A,1);

A1 = A(:,1);
A2 = A(:,2);

% Matrices of pairwise scaled differences (A(i)-A(j))/h:
M1 = A1*ones(size(A1'));
T1 = (M1-M1')/h1;

M2 = A2*ones(size(A2'));
T2 = (M2-M2')/h2;

z = 1/(4*pi*n*h1*h2);

T = T1.^2+T2.^2;
T = (T.^2-8*T+8).*mkernel(T1,'gauss').*mkernel(T2,'gauss');
T = T-diag(diag(T)); % remove the diagonal (i==j) terms

z = z+sum(sum(T))/(4*n*(n-1)*h1*h2);

return

function X=remzero(X)
%REMZERO Removes rows containing zeros from a matrix or vector.
%
%  CALL: Y = remzero(X)
%
%  Example:
%   x = [1 2 0 4 0 6].';
%   x = remzero(x);

% History
% by Christian C. Beardah 1993

[r, c] = size(X);
if r>1 & c>1,
  X(any(X==0,2),:) = []; % delete every row that contains a zero
else,
  X = X(find(X));        % vector: keep the nonzero elements
end

return
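To see the objective that the line search above minimizes, the following
sketch evaluates the BCV2 criterion over a grid of (h1,h2) values. Since
bcv2 is a subfunction of hbcv2.m and cannot be called from outside the file,
the criterion is re-implemented inline; wnormrnd and mkernel are assumed to
be on the path, and the grid limits are arbitrary choices for this example.

   data  = wnormrnd(0,1,50,2);
   n     = size(data,1);
   A     = (data - ones(n,1)*mean(data))*diag(1./std(data)); % standardize
   hv    = linspace(0.2,1.2,25);
   score = zeros(25,25);
   for ix=1:25,
     for iy=1:25,
       T1 = (A(:,1)*ones(1,n) - ones(n,1)*A(:,1)')/hv(ix);
       T2 = (A(:,2)*ones(1,n) - ones(n,1)*A(:,2)')/hv(iy);
       T  = T1.^2 + T2.^2;
       T  = (T.^2 - 8*T + 8).*mkernel(T1,'gauss').*mkernel(T2,'gauss');
       T  = T - diag(diag(T)); % remove the i==j terms, as bcv2 does
       score(ix,iy) = 1/(4*pi*n*hv(ix)*hv(iy)) ...
                      + sum(sum(T))/(4*n*(n-1)*hv(ix)*hv(iy));
     end
   end
   contour(hv,hv,score') % x-axis h1, y-axis h2; the minimizer approximates
                         % hbcv2's iterate before the final back-transform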
Comments or corrections to the WAFO group