Statistical and Machine Learning Perspectives

Statistical and machine learning perspectives on MMDS were the subject of a pair of tutorials by Jerome Friedman of Stanford University and Michael Jordan of the University of California at Berkeley. Given a set of measured values of attributes of an object, x = (x1, x2, …, xn), the basic predictive or machine learning problem is to predict or estimate the unknown value of another attribute y.

In his tutorial, “Fast Sparse Regression and Classification,” Friedman began with the common assumption of a linear model, in which the prediction takes the form ŷ = a1x1 + a2x2 + … + anxn. Unless the number of observations is much larger than n, however, empirical estimates of the loss function exhibit high variance. To make the estimates more regular, one typically considers a constrained or penalized optimization problem. The choice of an appropriate value for the regularization parameter λ is a classic model selection problem.
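To see why regularization helps, consider the following small simulation, offered here as an illustration rather than as anything presented in the tutorial; the problem sizes, the true model, and the ridge penalty weight are all arbitrary choices.

```python
# Illustration (not from the tutorial): when the number of observations
# is close to the number of predictors n, least-squares coefficient
# estimates are highly variable; a ridge penalty stabilizes them.
import numpy as np

rng = np.random.default_rng(0)
n, n_obs, reps, lam = 40, 50, 200, 5.0
a_true = np.zeros(n)
a_true[:5] = 1.0  # a sparse "true" coefficient vector (arbitrary)

ols_fits, ridge_fits = [], []
for _ in range(reps):
    X = rng.normal(size=(n_obs, n))
    y = X @ a_true + rng.normal(size=n_obs)
    # ordinary least squares
    ols_fits.append(np.linalg.lstsq(X, y, rcond=None)[0])
    # ridge: minimize ||y - Xa||^2 + lam * ||a||^2
    ridge_fits.append(np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y))

print("avg per-coefficient variance, OLS:  ", np.var(ols_fits, axis=0).mean())
print("avg per-coefficient variance, ridge:", np.var(ridge_fits, axis=0).mean())
```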
A common choice for the penalty is the ℓγ-norm of the coefficient vector a = (a1, a2, …, an). This interpolates between the subset selection problem (γ = 0) and ridge regression (γ = 2) and includes the well-studied lasso (γ = 1). For γ ≤ 1, sparse solutions are obtained, and for γ ≥ 1, the penalty is convex.
Although one could choose an optimal (λ, γ) by cross validation, this can be prohibitively expensive. In this case, so-called path-seeking methods, which generate the full path of optimal solutions {â(λ) : 0 ≤ λ ≤ ∞} in not much more time than is needed to fit a single model, have been studied. Friedman described a generalized path-seeking algorithm that efficiently solves this problem for a much wider range of loss and penalty functions.
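Friedman's generalized algorithm is not reproduced here, but the flavor of path seeking can be conveyed by incremental forward-stagewise regression, a well-known procedure that traces an approximation to the lasso path through many tiny coordinate updates. The following is a minimal sketch; the step size and number of steps are arbitrary choices.

```python
# A minimal sketch of the path-seeking idea (incremental forward-stagewise
# regression, an approximation to the lasso path -- not Friedman's
# generalized algorithm).
import numpy as np

def stagewise_path(X, y, eps=0.01, n_steps=500):
    """Trace coefficient vectors from the fully penalized model (a = 0)
    toward the unpenalized fit, one tiny coordinate update at a time."""
    n_obs, n = X.shape
    a = np.zeros(n)
    path = [a.copy()]
    r = y.copy()                      # current residual
    for _ in range(n_steps):
        corr = X.T @ r                # correlation of each predictor with r
        j = np.argmax(np.abs(corr))   # most correlated predictor
        delta = eps * np.sign(corr[j])
        a[j] += delta                 # nudge that one coefficient
        r -= delta * X[:, j]          # keep the residual in sync
        path.append(a.copy())
    return np.array(path)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
y = X[:, 0] * 2.0 - X[:, 1] * 1.0 + rng.normal(size=100)
path = stagewise_path(X, y)
print(path[-1].round(2))  # coefficients at the end of the (truncated) path
```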
Jordan, in his tutorial, “Kernel-Based Contrast Functions for Sufficient Dimension Reduction,” considered the dimensionality reduction problem in a supervised learning setting. Methods such as principal components analysis, Johnson-Lindenstrauss techniques, and Laplacian-based nonlinear methods are often used, but their applicability is limited, since the axes of maximal discrimination between two classes may not align well with the axes of maximum variance. One might hope there exists a low-dimensional subspace of the input space X that can be found efficiently and retains the statistical relationship between X and the response space Y. Jordan showed that this problem of sufficient dimension reduction (SDR) could be formulated in terms of conditional independence and evaluated in terms of operators on reproducing kernel Hilbert spaces.
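The misalignment between variance and discrimination noted above is easy to demonstrate. In the sketch below (an illustration constructed for this summary, not Jordan's method), the two classes are separated only along a low-variance axis, so the top principal component carries essentially no class information.

```python
# Illustration: the direction of maximum variance need not be the
# direction that discriminates between two classes.
import numpy as np

rng = np.random.default_rng(0)
# Both classes share large variance along axis 0, small along axis 1,
# but the class means are separated only along axis 1.
cov = np.diag([10.0, 0.1])
class0 = rng.multivariate_normal([0.0, -1.0], cov, size=500)
class1 = rng.multivariate_normal([0.0, +1.0], cov, size=500)
X = np.vstack([class0, class1])

# Top principal component = leading eigenvector of the covariance matrix.
evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
pc1 = evecs[:, -1]
print("top principal component:", pc1.round(2))  # ~ [1, 0]: axis 0

# Projecting onto pc1 mixes the classes; axis 1 separates them.
proj = X @ pc1
print("class means on pc1:   ", proj[:500].mean().round(2), proj[500:].mean().round(2))
print("class means on axis 1:", class0[:, 1].mean().round(2), class1[:, 1].mean().round(2))
```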
The Geometric Perspective: Qualitative Analysis of Data

A very different perspective was provided by Gunnar Carlsson of Stanford University, who gave an overview of geometric and topological approaches to data analysis in his tutorial, “Topology and Data.” The motivation underlying these approaches is to provide insight into the data by imposing a geometry on it. Part of the problem is thus to define useful metrics, a choice to which applications such as clustering, classification, and regression are often quite sensitive, and two design goals have recently emerged. First, don't trust large distances: as distances are often constructed from a similarity measure, small distances reliably represent similarity, but large distances make little sense. Second, trust small distances only a bit; after all, similarity measurements are still very noisy. These ideas suggest the design of analysis tools that are robust to stretching and shrinking of the underlying metric. Much of Carlsson's tutorial was occupied by describing these analysis tools and their application to natural image statistics and data visualization.
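As a rough illustration of what robustness to stretching and shrinking of the metric means (constructed here, not drawn from the tutorial), the sketch below applies a monotone rescaling to a metric and checks what survives: anything built from distance rankings, such as nearest neighbors, is unchanged, while large distances are badly distorted.

```python
# Illustration: a monotone "stretching" of a metric preserves everything
# built from distance rankings, while large distances change drastically.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.normal(size=(30, 3))
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

stretched = d / (1.0 + d)   # a concave monotone rescaling of the metric

# Nearest-neighbor structure is identical under any monotone rescaling.
# (The large diagonal term excludes each point from its own neighbor list.)
nn_before = np.argsort(d + np.eye(30) * 1e9, axis=1)[:, 0]
nn_after = np.argsort(stretched + np.eye(30) * 1e9, axis=1)[:, 0]
print("nearest neighbors unchanged:", np.array_equal(nn_before, nn_after))

# Large distances are badly distorted: the spread of the metric collapses.
off = ~np.eye(30, dtype=bool)
print("max/min distance ratio before:", (d[off].max() / d[off].min()).round(1))
print("max/min distance ratio after: ", (stretched[off].max() / stretched[off].min()).round(1))
```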