function [h,hvec,score]=hlscv(A,kernel,hvec)
% HLSCV Least Squares Cross-Validation estimate of smoothing parameter
%
% CALL: [hs,hvec,score] = hlscv(data,kernel,hvec);
%
%   hs     = smoothing parameter
%   hvec   = vector defining possible values of hs
%            (default linspace(0.25*h0,h0,100), h0=hos(data,kernel))
%   score  = score vector
%   data   = data vector
%   kernel = 'epanechnikov'  - Epanechnikov kernel. (default)
%            'biweight'      - Bi-weight kernel.
%            'triweight'     - Tri-weight kernel.
%            'triangular'    - Triangular kernel.
%            'gaussian'      - Gaussian kernel.
%            'rectangular'   - Rectangular kernel.
%            'laplace'       - Laplace kernel.
%            'logistic'      - Logistic kernel.
%
%  Note that only the first 4 letters of the kernel name are needed.
%  Studies have shown that the theoretical and practical performance of
%  HLSCV is somewhat disappointing. In particular, HLSCV is highly
%  variable and has a relatively slow convergence rate.
%
%  Example:
%    data = normrnd(0,1,20,1);
%    [hs,hvec,score] = hlscv(data,'epan');
%    plot(hvec,score)
%
%  See also  hste, hbcv, hboot, hos, hldpi, hscv, hstt, kde, kdefun

% Reference:
%  B.W. Silverman (1986)
%  'Density estimation for statistics and data analysis'
%  Chapman and Hall, pp 48--52
%
%  Wand, M.P. and Jones, M.C. (1995)
%  'Kernel smoothing'
%  Chapman and Hall, pp 63--65

% tested on: matlab 5.2
% history:
%  revised pab 20.10.1999
%   updated to matlab 5.2
%   changed input arguments
%  taken from kdetools by Christian C. Beardah 1995

A = A(:);
n = length(A);

if nargin<2 || isempty(kernel)
  kernel = 'epan';
end

inc = 512;

if nargin<3 || isempty(hvec)
  H    = hos(A,kernel);
  hvec = linspace(0.25*H,H,100);
else
  hvec = abs(hvec);
end

steps = length(hvec);
score = zeros(1,steps);   % preallocate the score vector

M  = A*ones(size(A'));
Y1 = M-M';                % matrix of pairwise differences A(i)-A(j)

kopt = kdeoptset('kernel',kernel,'inc',inc);
for i=1:steps
  kopt.hs = hvec(i);
  f = kdebin(A,kopt);

  % first term: integral of the squared density estimate
  %delta = f.x{1}(2)-f.x{1}(1);
  %L1    = delta*sum(f.f.^2);
  L1 = trapz(f.x{1},f.f.^2);

  % second term: sum of the leave-one-out density estimates at the
  % sample points (diagonal terms n*K(0) are removed from the double sum)
  Y  = Y1/hvec(i);
  L2 = (sum(sum(mkernel(Y,kernel))) - n*mkernel(0,kernel))/(hvec(i)*(n-1));

  score(i) = L1 - 2*L2/n;
end

[L,I] = min(score);
h = hvec(I);
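The loop above evaluates the least squares cross-validation criterion LSCV(h) = integral of fhat_h^2 minus (2/n) times the sum of the leave-one-out estimates fhat_{h,-i}(X_i), and returns the h minimizing it. As a language-neutral illustration (not part of WAFO), the same score can be sketched in Python with NumPy; a Gaussian kernel and a simple rectangle-rule integral stand in for `mkernel` and `kdebin`, and the function names `lscv_score`/`hlscv` here are hypothetical:

```python
import numpy as np

def gauss_kernel(u):
    """Standard Gaussian kernel K(u)."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def lscv_score(data, h, inc=512):
    """LSCV score for bandwidth h: int fhat^2 - (2/n) sum_i fhat_{-i}(X_i)."""
    data = np.asarray(data, dtype=float).ravel()
    n = data.size
    # first term: integral of the squared density estimate on a grid
    x = np.linspace(data.min() - 4*h, data.max() + 4*h, inc)
    f = gauss_kernel((x[:, None] - data[None, :]) / h).sum(axis=1) / (n * h)
    L1 = np.sum(f**2) * (x[1] - x[0])          # rectangle rule
    # second term: pairwise kernel sums with the diagonal n*K(0) removed,
    # mirroring the MATLAB double sum over Y1/h
    K = gauss_kernel((data[:, None] - data[None, :]) / h)
    L2 = (K.sum() - n * gauss_kernel(0.0)) / (h * (n - 1))
    return L1 - 2.0 * L2 / n

def hlscv(data, hvec):
    """Return the bandwidth in hvec minimizing the LSCV score, plus all scores."""
    scores = np.array([lscv_score(data, h) for h in hvec])
    return hvec[np.argmin(scores)], scores

rng = np.random.default_rng(0)
data = rng.standard_normal(50)
hvec = np.linspace(0.1, 1.0, 20)
h, scores = hlscv(data, hvec)
```

Because the score is evaluated on a fixed grid of candidate bandwidths, the returned h is only as fine as `hvec`; the MATLAB code makes the same trade-off with its 100-point default grid.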