Adaptive regression and model selection in data mining problems

dc.contributor.author: Bakin, Sergey
dc.date.accessioned: 2012-10-14T23:52:29Z
dc.date.issued: 1999
dc.description.abstract: Data Mining is a new and rapidly evolving area which deals with problems related to extracting structure from massive commercial and scientific data sets. Regression analysis is one of the major Data Mining techniques. The data sets encountered in the Data Mining area are often characterized by a large number of attributes (variables) as well as data records. This imposes two major requirements on the regression analysis tools used in Data Mining: first, in order to produce accurate and parsimonious models exhibiting the most important features of the problem at hand, they should be able to perform model selection adaptively and, second, the cost of running such tools has to be reasonably low. Most modern regression tools fail to meet these requirements. This thesis is intended to contribute to the improvement of existing methodologies as well as to propose new approaches. We focus on two regression estimation techniques. The first, called Probing Least Absolute Squares Modelling (PLASM), is a generalization of the Least Absolute Shrinkage and Selection Operator (LASSO) of R. Tibshirani, which minimizes the residual sum of squares subject to the l1-norm of the regression coefficients being less than a constant. LASSO has been shown to enjoy the stability of ridge regression coupled with the ability to carry out model selection. In our approach, PLASM, we replace the constraint employed in LASSO with a different one. PLASM allows for an arbitrary grouping of basis functions in a model and includes LASSO as a special case. The implication of using the new constraint is that PLASM is able to perform model selection in terms of groups of basis functions.
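The thesis itself is not reproduced in this record, but the selection behaviour the abstract attributes to LASSO and to its grouped generalization can be illustrated with the corresponding shrinkage operators. This is a minimal sketch, not PLASM's actual algorithm: the exact form of PLASM's constraint is not given in the abstract, so the grouped operator below assumes the common grouped generalization of the l1 penalty (a bound on groupwise Euclidean norms), which zeroes out whole groups of coefficients at once.

```python
import numpy as np

def soft_threshold(z, t):
    # LASSO-style shrinkage: pulls every coefficient toward zero and
    # sets small ones exactly to zero, which is what performs
    # model selection at the level of individual basis functions.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def group_soft_threshold(z, t):
    # Grouped analogue (illustrative assumption, see lead-in):
    # shrinks the group's Euclidean norm as a whole, so either the
    # entire group survives or the entire group is set to zero.
    norm = np.linalg.norm(z)
    if norm <= t:
        return np.zeros_like(z)
    return (1.0 - t / norm) * z

coef = np.array([3.0, -0.5, 0.2])
print(soft_threshold(coef, 1.0))        # small entries become exactly 0
print(group_soft_threshold(coef, 1.0))  # the whole group is shrunk together
```

The contrast is the point of the abstract's claim: the elementwise operator selects individual basis functions, while the grouped operator selects or discards groups of basis functions as units.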
This turns out to be very useful in a number of data analytic problems. For example, as far as additive modelling is concerned, the dimensionality of the PLASM minimization problem is much less than that of LASSO and is independent (at least explicitly) of the number of data points, which makes it suitable for use in the Data Mining context. The second tool we consider in this thesis is Multivariate Adaptive Regression Splines (MARS), developed by J. Friedman. In our version of MARS, called BMARS, we use B-splines instead of truncated power basis functions. Because B-splines have compact support, we are able to introduce a new strategy whereby at any moment the algorithm builds a model using B-splines of a certain scale only, switching over to splines of a smaller scale once the fitting ability of the current splines has been exhausted. We also discuss a parallel version of BMARS as well as an application of the algorithm to the processing of a large commercial data set. The results of the numerical experiments demonstrate that, while being considerably more efficient, BMARS is able to produce models competitive with those of the original MARS.
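The compact-support property that BMARS exploits can be shown with the simplest B-spline, the linear "hat" function. This is only an illustrative sketch, not the thesis's BMARS algorithm: the `hat` helper and the specific centers and scales below are assumptions chosen for the demonstration, showing that each basis function is exactly zero outside a bounded interval and that halving the scale halves that interval, which is what makes a coarse-to-fine fitting strategy local and cheap.

```python
import numpy as np

def hat(x, center, scale):
    # Linear B-spline ("hat") basis function: equals 1 at its center
    # and is exactly zero outside (center - scale, center + scale).
    # This compact support is the property BMARS relies on.
    return np.maximum(1.0 - np.abs(x - center) / scale, 0.0)

x = np.linspace(0.0, 1.0, 101)
coarse = hat(x, 0.5, 0.5)    # one coarse-scale spline spans most of [0, 1]
fine = hat(x, 0.5, 0.25)     # halving the scale halves the support

# Outside its support the fine-scale spline contributes nothing,
# so adding it to a model only affects a local region of the fit.
print(fine[x <= 0.2])        # all zeros: no effect far from the center
```

Under this scheme, exhausting the coarse splines and then moving to a smaller scale refines the fit locally without revisiting regions the coarse model already explains.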
dc.identifier.other: b20466316
dc.identifier.uri: http://hdl.handle.net/1885/9449
dc.language.iso: en_AU
dc.title: Adaptive regression and model selection in data mining problems
dc.type: Thesis (PhD)
dcterms.valid: 1999
local.contributor.affiliation: School of Mathematical Sciences, Australian National University
local.description.refereed: Yes
local.identifier.doi: 10.25911/5d78db4c25dbb
local.mintdoi: mint
local.type.degree: Doctor of Philosophy (PhD)

Downloads

Original bundle
Name: Bakin_S_1999.pdf
Size: 4.33 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 70 B
Format: Item-specific license agreed upon to submission