Software Listing of Author: "Dimitrios Ververidis"
- Feature Selection using Matlab
- License: Freeware
- Price: 0.00


The DEMO includes 5 feature selection algorithms:
* Sequential Forward Selection (SFS)
* Sequential Floating Forward Selection (SFFS)
* Sequential Backward Selection (SBS)
* Sequential Floating Backward Selection (SFBS)
* ReliefF

and two CCR estimation methods:
* Cross-validation
* Resubstitution

After selecting the best feature subset, the resulting classifier can be used to classify any pattern.

Figure: the upper panel shows the pattern x feature matrix; the lower-left panel shows the selected features; the lower-right panel shows the CCR curve across the feature selection steps; the right panel shows the classification results for some patterns.

This software was developed using Matlab 7.5 and Windows XP.

Copyright: D. Ververidis and C. Kotropoulos, AIIA Lab, Thessaloniki, Greece, jimver@aiia.csd.auth.gr, costas@aiia.csd.auth.gr

In order to run the DEMO: In...
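To make the greedy wrapper idea concrete, here is a minimal sketch (in Python, not the DEMO's Matlab code) of Sequential Forward Selection scored by a cross-validated CCR. The nearest-class-mean classifier and the synthetic data are illustrative assumptions, not part of the package:

```python
import numpy as np

def ccr_cv(X, y, feats, k=5, seed=0):
    """Cross-validated correct classification rate (CCR) of a simple
    nearest-class-mean classifier restricted to the features `feats`."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    correct = 0
    for f in range(k):
        test = folds[f]
        train = np.concatenate([folds[g] for g in range(k) if g != f])
        classes = np.unique(y[train])
        Xtr, ytr = X[train][:, feats], y[train]
        means = np.array([Xtr[ytr == c].mean(axis=0) for c in classes])
        # Assign each test pattern to the nearest class mean.
        dist = ((X[test][:, feats][:, None, :] - means[None]) ** 2).sum(-1)
        correct += (classes[dist.argmin(axis=1)] == y[test]).sum()
    return correct / len(y)

def sfs(X, y, n_select):
    """SFS: greedily add the feature that maximizes the CV-estimated CCR."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(n_select):
        scores = {f: ccr_cv(X, y, selected + [f]) for f in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

# Two classes separated along features 0 and 1; features 2 and 3 are noise.
rng = np.random.default_rng(1)
n = 100
X = rng.normal(size=(2 * n, 4))
X[n:, :2] += 3.0
y = np.repeat([0, 1], n)
subset = sfs(X, y, n_select=2)
print(subset)
```

Swapping the cross-validation inside `ccr_cv` for training-set accuracy gives the resubstitution estimate; the floating variants (SFFS/SFBS) additionally try removing previously selected features after each addition.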
- Publisher: Dimitrios Ververidis
- Date Released: 14-04-2013
- Download Size: 3308 KB
- Platform: Matlab, Scripts
- Gaussian Mixture Modeling GUI (GMM DEMO)
- License: Freeware
- Price: 0.00


The Expectation-Maximization (EM) algorithm is widely used to find the parameters of a mixture of Gaussian probability density functions (pdfs), or briefly Gaussian components, that fits the sample measurement vectors in the maximum likelihood sense [1]. In our work, the EM algorithm for Gaussian mixture modeling is improved via three statistical tests: (a) a multivariate normality test, (b) a central tendency (kurtosis) criterion, and (c) a test based on the marginal cdf that finds a discriminant to split a non-Gaussian component.

Input buttons:
* Button 1: Open a data file (.mat) or (.tif)
* Button 2: Draw Gaussian data with the mouse (left button = draw, right button = jump a point, Return key = finish)

Operational buttons:
* Button 3: Start GMM modeling
* Button 4: Stop GMM modeling

Output button:
* Button 5: Save GMM parameters as...
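For reference, the plain EM iteration that the DEMO improves upon can be sketched as follows. This is a minimal one-dimensional two-component example in Python, not the author's Matlab implementation, and it does not reproduce the normality/kurtosis/marginal-cdf split tests:

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    """Plain EM for a two-component 1-D Gaussian mixture.
    Returns component weights, means, and variances."""
    # Initialize the two components from the data quantiles.
    mu = np.quantile(x, [0.25, 0.75])
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample.
        p = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
              / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
w, mu, var = em_gmm_1d(x)
print(np.sort(mu))  # close to the true means -2 and 3
```

The DEMO's contribution is deciding *how many* components to use: after convergence, each component is tested for multivariate normality, and a failing component is split along a discriminant found from its marginal cdf before EM is rerun.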
- Publisher: Dimitrios Ververidis
- Date Released: 22-06-2013
- Download Size: 573 KB
- Platform: Matlab, Scripts
- Information loss of the Mahalanobis distance in high dimensions: Matlab implementation
- License: Shareware
- Price:


The Mahalanobis distance between a pattern measurement vector of dimensionality D and the center of the class it belongs to is distributed as chi^2 with D degrees of freedom when an infinite training set is used. For finite training sets, however, the distribution of the Mahalanobis distance becomes either Fisher or Beta, depending on whether cross-validation or resubstitution is used for parameter estimation. The total variation between chi^2 and Fisher, as well as between chi^2 and Beta, allows us to measure the information loss in high dimensions. This information loss is then exploited to set a lower limit on the correct classification rate achieved by the Bayes classifier used in subset feature selection.

Installation:
The 5 functions should be in the current path of Matlab.

Usage:
LowCCRLimit =...
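The chi^2 baseline is easy to verify numerically. The sketch below (Python, illustrative only; not one of the package's Matlab functions) estimates the class center and covariance from a large sample, so the empirical distribution of the squared Mahalanobis distance is close to the infinite-training-set chi^2 with D degrees of freedom, whose mean is D and variance 2D:

```python
import numpy as np

rng = np.random.default_rng(0)
D, n = 5, 100_000  # dimensionality and (large) training-set size

# D-dimensional standard Gaussian class; estimate its center and covariance.
X = rng.normal(size=(n, D))
mu = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))

# Squared Mahalanobis distance of each pattern to the class center.
diff = X - mu
d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

print(d2.mean(), d2.var())  # approx. D and 2*D, matching chi^2_D
```

With a small n the same experiment drifts away from chi^2 toward the Beta law (resubstitution, as here) or the Fisher law (cross-validation), and the gap between the two distributions is what the package quantifies as information loss.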
- Publisher: Dimitrios Ververidis
- Date Released: 24-04-2013
- Download Size: 10 KB
- Platform: Matlab, Scripts