by Gustau Camps-Valls, Jordi Muñoz-Marí, Manel Martínez-Ramón, José Luis Rojo-Álvarez
Digital Signal Processing with Kernel Methods
Cover
Title Page
About the Authors
Preface
Why Did We Write This Book?
Structure and Contents
Acknowledgements
List of Abbreviations
Part I: Fundamentals and Basic Elements
1 From Signal Processing to Machine Learning
1.1 A New Science is Born: Signal Processing
1.2 From Analog to Digital Signal Processing
1.3 Digital Signal Processing Meets Machine Learning
1.4 Recent Machine Learning in Digital Signal Processing
2 Introduction to Digital Signal Processing
2.1 Outline of the Signal Processing Field
2.2 From Time–Frequency to Compressed Sensing
2.3 Multidimensional Signals and Systems
2.4 Spectral Analysis on Manifolds
2.5 Tutorials and Application Examples
2.6 Questions and Problems
3 Signal Processing Models
3.1 Introduction
3.2 Vector Spaces, Basis, and Signal Models
3.3 Digital Signal Processing Models
3.4 Tutorials and Application Examples
3.5 Questions and Problems
3.A MATLAB simpleInterp Toolbox Structure
4 Kernel Functions and Reproducing Kernel Hilbert Spaces
4.1 Introduction
4.2 Kernel Functions and Mappings
4.3 Kernel Properties
4.4 Constructing Kernel Functions
4.5 Complex Reproducing Kernel in Hilbert Spaces
4.6 Support Vector Machine Elements for Regression and Estimation
4.7 Tutorials and Application Examples
4.8 Concluding Remarks
4.9 Questions and Problems
Part II: Function Approximation and Adaptive Filtering
5 A Support Vector Machine Signal Estimation Framework
5.1 Introduction
5.2 A Framework for Support Vector Machine Signal Estimation
5.3 Primal Signal Models for Support Vector Machine Signal Processing
5.4 Tutorials and Application Examples
5.5 Questions and Problems
6 Reproducing Kernel Hilbert Space Models for Signal Processing
6.1 Introduction
6.2 Reproducing Kernel Hilbert Space Signal Models
6.3 Tutorials and Application Examples
6.4 Questions and Problems
7 Dual Signal Models for Signal Processing
7.1 Introduction
7.2 Dual Signal Model Elements
7.3 Dual Signal Model Instantiations
7.4 Tutorials and Application Examples
7.5 Questions and Problems
8 Advances in Kernel Regression and Function Approximation
8.1 Introduction
8.2 Kernel‐Based Regression Methods
8.3 Bayesian Nonparametric Kernel Regression Models
8.4 Tutorials and Application Examples
8.5 Concluding Remarks
8.6 Questions and Problems
9 Adaptive Kernel Learning for Signal Processing
9.1 Introduction
9.2 Linear Adaptive Filtering
9.3 Kernel Adaptive Filtering
9.4 Kernel Least Mean Squares
9.5 Kernel Recursive Least Squares
9.6 Explicit Recursivity for Adaptive Kernel Models
9.7 Online Sparsification with Kernels
9.8 Probabilistic Approaches to Kernel Adaptive Filtering
9.9 Further Reading
9.10 Tutorials and Application Examples
9.11 Questions and Problems
Part III: Classification, Detection, and Feature Extraction
10 Support Vector Machine and Kernel Classification Algorithms
10.1 Introduction
10.2 Support Vector Machine and Kernel Classifiers
10.3 Advances in Kernel‐Based Classification
10.4 Large‐Scale Support Vector Machines
10.5 Tutorials and Application Examples
10.6 Concluding Remarks
10.7 Questions and Problems
11 Clustering and Anomaly Detection with Kernels
11.1 Introduction
11.2 Kernel Clustering
11.3 Domain Description Via Support Vectors
11.4 Kernel Matched Subspace Detectors
11.5 Kernel Anomaly Change Detection
11.6 Hypothesis Testing with Kernels
11.7 Tutorials and Application Examples
11.8 Concluding Remarks
11.9 Questions and Problems
12 Kernel Feature Extraction in Signal Processing
12.1 Introduction
12.2 Multivariate Analysis in Reproducing Kernel Hilbert Spaces
12.3 Feature Extraction with Kernel Dependence Estimates
12.4 Extensions for Large‐Scale and Semi‐supervised Problems
12.5 Domain Adaptation with Kernels
12.6 Concluding Remarks
12.7 Questions and Problems
References
Index
End User License Agreement
Index
a
abundance
access point
active learning
adaptive filtering
adaptive kernel learning
additive noise
additive noise model
alternative hypothesis
analysis equation
anomaly change detection
anomaly detection
antenna array
anti‐causal systems
array processing
audio
audio compression
autocorrelation
autocorrelation‐induced kernel, autocorrelation kernel
autocorrelation kernel
autocorrelation matrix
autoregressive (AR)
autoregressive and moving average (ARMA)
autoregressive and exogenous (ARX)
b
bag of words features
bag of words kernel
band‐pass
bandwidth
base‐band representation
basis
basis pursuit
Bayesian nonparametric
Bernoulli–Gauss distribution
Bernoulli process
bias–variance dilemma
bi‐exponential distribution, see Laplacian noise
big data
biomedical signals
biophysical
bit error rate (BER)
Blackman–Tukey correlogram
blind source separation (BSS)
Bootstrap resampling
B‐scan
Burg’s method
butterfly algorithm
c
canonical basis
cardiac mesh
cardiac navigation systems
cardiac signal
cardiac image
Cauchy–Schwarz inequality
Cauchy sequence
causal systems
centering
change detection
change vector analysis
channel estimation
chaotic
Choi–Williams distribution
Cholesky decomposition
Cholesky factorization
chronochrome
classification
clustering
codebook
collinearity
color image
communication
complex algebra
complex envelope
complex exponential
complexification trick
complex signal
composite kernel
compressed sensing
confidence interval (CI)
constrained covariance (COCO)
continuous‐time equivalent system for nonuniform interpolation
continuous‐time signals
convex
convolution
convolution (multidimensional)
correlation
correlogram
cost function
covariance function
covariance operator
covariate shift
cross‐correlation
cross‐covariance
cross‐information
cross‐validation
CUSUM
d
deal effect curve
decision tree
deconvolution
denoising
determination coefficient
dictionary learning
digital filtering
dimensionality reduction
Dirac delta
direction of arrival (DOA)
discrete cosine transform (DCT)
discrete‐time signals
domain adaptation
domain description
dot product
double side band
dual parameters
dual representation
dual signal model (DSM)
e
eigenfunctions
electric networks
electroanatomic map (EAM)
electrocardiogram (ECG)
electroencephalogram (EEG)
elliptical
empirical kernel map
empirical risk
endmember
energy
energy spectral density
equalization
Euclidean distance
Euclidean divergence
Euler–Poincaré formula
evidence
expectation–maximization
eye diagram
f
feature map
feature mapping
feature space
feedback
filter
filter bank analysis
filtering
finite impulse response (FIR)
Fisher discriminant
Fourier coefficients
Fourier transform
fractal
free parameters
frequency
functional analysis
function approximation
fuzzy
fuzzy clustering
g
Gabor transform
gamma distribution
gamma‐filter
gamma function
Gaussian distribution
Gaussian mixture model (GMM)
Gaussian mixtures
Gaussian noise
Gaussian processes
generative kernel
genetic
Gram matrix
graph
graph Laplacian
graph Laplacian matrix
Grassmann–Stiefel manifold
grayscale image
greedy algorithms
h
Hammerstein system
Hammerstein–Wiener model
Hanning pulse
heart rate variability (HRV)
Heisenberg’s principle
Hermitian signal
Hermitian transpose operator
heteroscedastic
Hilbert–Schmidt component analysis (HSCA)
Hilbert–Schmidt independence criterion (HSIC)
Hilbert space
hinge loss
histogram kernel
Holter
homoscedastic
Hotelling’s test
Huber
ɛ‐Huber
Huber loss
hyperparameters
hyperresolution method
hyperspectral
hypothesis testing
i
ill‐posed problem
incomplete Cholesky decomposition (ICD)
independent component analysis (ICA)
indoor location
information divergence
information potential
information‐theoretic learning
inner product, see also dot product; scalar product
inner product space
innovation process
in‐phase and quadrature‐phase
input features
input space
ɛ‐insensitive loss
interception
interpolation
invariance learning
isomap
j
Jacobian weighting
Jensen–Shannon
jitter
joint input‐output mapping
k
Kalman
Kalman filter
Karhunen–Loève
Karush–Kuhn–Tucker conditions
kernel
kernel adaptive filtering
kernel alignment
kernel autoregressive and moving average (KARMA)
kernel blind source separation (KBSS)
kernel canonical correlation analysis (KCCA)
kernel density estimation
kernel dependence estimation
kernel dimensionality reduction (KDR)
kernel entropy component analysis (KECA)
Kernel Fisher’s discriminant analysis (KFDA)
kernel generalized variance (KGV)
kernel independent component analysis (KICA)
kernelization
kernel least mean squares (KLMS)
kernel manifold alignment (KEMA)
kernel matrix
kernel mean matching (KMM)
kernel methods
kernel multivariate analysis (KMVA)
kernel mutual information (KMI)
kernel orthonormalized partial least squares (KOPLS)
kernel partial least squares (KPLS)
kernel principal component analysis (KPCA)
kernel recursive least squares (KRLS)
kernel ridge regression (KRR)
kernel signal to noise ratio (KSNR)
kernel trick
Kirchhoff operator
Kirkwood distribution
k‐means
k nearest neighbors (k‐NN)
Kronecker delta
l
Label Propagation
Lagrange functional
Laplace–Beltrami operator
Laplace–de Rham operator
Laplacian eigenmaps (LE)
Laplacian noise
Laplacian operator
Large Margin Filtering (LMF)
large‐scale
latent space
least absolute deviation (LAD)
least absolute shrinkage and selection operator (LASSO)
least mean squares (LMS)
least squares (LS)
Least‐Squares Support Vector Machine (LS‐SVM)
linear and time‐invariant (LTI) systems
linear discriminant analysis (LDA)
linear independence
l1‐norm
l2‐norm
locally linear embedding (LLE)
logistic regression
Lomb periodogram
Lorenz
m
Mackey–Glass
magnetic resonance imaging (MRI)
manifold alignment
manifolds
Margenau–Hill distribution
marketing
Markov chain
matched filter
maximum a posteriori (MAP)
maximum likelihood (ML)
maximum mean discrepancy (MMD)
minimum power distortionless response (MPDR)
maximum variance unfolding (MVU)
mean map kernel
medical imaging
memory depth
Mercer, James
Mercer’s kernel
Mercer’s theorem
M‐estimate
metric
Mexican hat wavelet
MIMO, see multi‐input multi‐output (MIMO)
minimax
minimum mean square error (MMSE)
minimum noise fraction (MNF)
minimum phase
minimum variance distortionless response (MVDR)
model diagnosis
modulated kernel
modulation
moving average (MA)
multiclass
multidimensional sampling
multidimensional scaling (MDS)
multidimensional signal
multi‐input multi‐output (MIMO)
multilabel
multi‐output
multiple kernel learning (MKL)
multiple signal classification (MUSIC)
multiresolution analysis
multispectral remote‐sensing
multiuser detection
mutual information
n
Nadaraya–Watson (NW)
natural signals
neural networks
neuron
noise
nonlinear algorithms
nonlinear channel identification
nonlinearity/nonlinearities
nonlinear signal model
nonlinear SVM
nonlinear system identification
nonparametric
nonparametric spectral analysis
nonuniform interpolation
nonuniform sampling
normal equation
normalization
null hypothesis
Nyquist pulse
Nyquist theorem
o
one‐against‐one (OAO)
one‐class classification
one‐class support measure machines (OC‐SMM)
one‐class support vector machine
online learning
online regression
online sparsification
optimization functional
optimized kernel entropy component analysis (OKECA)
orthogonal base
orthogonal frequency division multiplexing (OFDM)
orthogonality
orthogonal subspace projection (OSP)
orthonormal base
outlier
overfitting
p
Page distribution
parallelization
parameter estimation
parametric spectral analysis
Parseval identity
Parseval’s theorem
parsimonious
Parzen windows
pervasive change
phase
posterior probability
power
power spectral density
pre‐image
primal‐dual functional
primal representation
principal component analysis (PCA)
prior probability
probabilistic cluster kernel
probability density function
probability product kernel
projections
promotion
pseudoinverse
pyramid match kernel
Pythagorean theorem
q
Q‐mode
QRS complex
quadrature amplitude modulation (QAM)
quadrature‐phase, see in‐phase and quadrature‐phase
quadrature‐phase shift keying (QPSK)
r
radar
radial basis function (RBF)
random Fourier features (RFF)
rank
Rayleigh distribution
received signal strength
recursive filters
recursive least squares
recursivity
reflectivity
regression
regularization
relevance vector machine (RVM)
reliability
remote sensing
Rényi entropy
replication (bootstrap)
representer theorem
reproducing kernel Hilbert space (RKHS)
reproducing property
resample (bootstrap)
residual
Riesz representation theorem
Rihaczek distribution
RKHS signal model (RSM)
R‐mode
running spectrum
s
sample selection
sampling
sampling period
satellite image
scalar product, see also inner product
seismology
self‐organizing map (SOM)
semiparametric regression (SR)
semisupervised
Shannon
Shannon’s sampling theorem
shift‐invariant
signal
signal detection
signal interpolation
signal model
signal space
signal‐to‐noise ratio (SNR)
similarity
sinc function
sinc interpolation
sinc interpolator
single side‐band
slack variable
snapshot
social networks
sparse deconvolution
sparse kernel feature extraction
sparse learning
sparsity
spatial reference
spectral
spectral angle mapper (SAM)
spectrogram
spectrum
speech recognition
stacked kernel
state‐space representation
steering vector
stiffness matrix
structural risk
structured output learning
structure‐preserving algorithms
subband coding algorithm
subspace
subspace detector
subspace methods
support vector
support vector domain description (SVDD)
support vector machine for digital signal processing (SVM for DSP)
support vector machines (SVMs)
support vector regression (SVR)
surrogate
synthesis equation
system identification
systems with memory
t
tachogram
temporal reference
tensor‐product kernel
tessellation
texture classification
thermal noise
thin plate spline
Tikhonov regularization
time series prediction
thin plate spline
transductive support vector machine (TSVM)
transfer component analysis (TCA)
transfer learning
transform coding
translation‐invariant kernel
triangle inequality
Tutte Laplacian
u
ultrasound
unmixing
unscented Kalman filter (UKF)
unsupervised
v
Vapnik–Chervonenkis capacity (VCC)
variance
vector quantization algorithms
vector space
Volterra
Voronoi
w
warped Gaussian Process Regression (WGP)
wavelet function
Welch periodogram
Wiener
Wiener filter
Wiener system
Wigner–Ville distribution
y
Yen’s interpolator
z
z‐transform