Spectral Algorithms I

Spectral Algorithms I
Slides based on the "Spectral Mesh Processing" SIGGRAPH 2010 course

Why Spectral?
A different way to look at functions on a domain

Why Spectral?
Better representations lead to simpler solutions

A Motivating Application: Shape Correspondence
Rigid alignment is easy, but what about a different pose? The spectral transform normalizes shape pose before rigid alignment.

Spectral Geometry Processing
Use the eigen-structure of "well-behaved" linear operators for geometry processing

Eigen-structure
– Eigenvectors and eigenvalues: Au = λu, u ≠ 0
– Diagonalization or eigen-decomposition: A = UΛU^T
– Projection into the eigen-subspace: y' = U(k)U(k)^T y
– DFT-like spectral transform: ŷ = U^T y
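
To make these operations concrete, here is a minimal numpy sketch (not from the course; a random symmetric PSD matrix stands in for the mesh operator) of the eigen-decomposition, the eigen-subspace projection, and the DFT-like transform:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((6, 6))
A = B @ B.T                      # symmetric positive semi-definite test operator
y = rng.standard_normal(6)

# Eigen-decomposition A = U Lambda U^T (eigh: eigenvalues in ascending order)
lam, U = np.linalg.eigh(A)

# Projection into the eigen-subspace of the first k eigenvectors: y' = U(k) U(k)^T y
k = 3
Uk = U[:, :k]
y_proj = Uk @ (Uk.T @ y)

# DFT-like spectral transform: y_hat = U^T y
y_hat = U.T @ y

# Sanity checks: A u = lambda u for each eigenpair, and U reconstructs y exactly
assert np.allclose(A @ U[:, 0], lam[0] * U[:, 0])
assert np.allclose(U @ y_hat, y)
```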

Eigen-decomposition
[Diagram] Mesh geometry y; positive definite matrix A = UΛU^T; subspace projection y' = U(3)U(3)^T y

Eigen-decomposition
[Diagram] Mesh geometry y; positive definite matrix A = UΛU^T; spectral transform ŷ = U^T y

Classification of Applications
Eigenstructure(s) used
– Eigenvalues: signature for shape characterization
– Eigenvectors: form spectral embedding (a transform)
– Eigenprojection: also a transform, DFT-like
Dimensionality of spectral embeddings
– 1D: mesh sequencing
– 2D or 3D: graph drawing or mesh parameterization
– Higher D: clustering, segmentation, correspondence
Mesh operator used
– Laplace-Beltrami, distance matrix, other
– Combinatorial vs. geometric
– 1st-order vs. higher order
– Normalized vs. un-normalized

Operators?
Best
– Symmetric positive definite operator: x^T A x > 0 for any x ≠ 0
Can live with
– Semi-positive definite (x^T A x ≥ 0 for any x)
– Non-symmetric, as long as the eigenvalues are real and positive, e.g. L = DW, where W is SPD and D is diagonal
Beware of
– Non-square operators
– Complex eigenvalues
– Negative eigenvalues
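
A small numeric check of the L = DW case, assuming (as the slide implies) that D has positive diagonal entries: DW is similar to D^(1/2) W D^(1/2), which is SPD, so its eigenvalues are real and positive even though DW itself is not symmetric:

```python
import numpy as np

rng = np.random.default_rng(8)
B = rng.standard_normal((5, 5))
W = B @ B.T + 5 * np.eye(5)              # symmetric positive definite
D = np.diag(rng.uniform(0.5, 2.0, 5))    # positive diagonal (assumed here)

L = D @ W                                # non-symmetric in general
# L is similar to D^(1/2) W D^(1/2), an SPD matrix, so its
# eigenvalues are real and positive despite the asymmetry.
eigvals = np.linalg.eigvals(L)
assert np.allclose(np.imag(eigvals), 0)  # real
assert np.all(np.real(eigvals) > 0)      # positive
```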

Spectral Processing - Perspectives
Signal processing
– Filtering and compression
– Relation to the discrete Fourier transform (DFT)
Geometric
– Global and intrinsic
Machine learning
– Dimensionality reduction

The smoothing problem
Smooth out the rough features of a contour (a 2D shape)

Laplacian smoothing
Move each vertex towards the centroid of its neighbours. Here: centroid = midpoint of the two neighbours; move half way.

Laplacian smoothing and the Laplacian
Local averaging: the 1D discrete Laplacian

Smoothing result
Obtained by 10 steps of Laplacian smoothing
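
A minimal sketch of this smoothing, assuming a closed contour stored as an (n, 2) array; each vertex moves half way toward the midpoint of its two neighbours, repeated 10 times:

```python
import numpy as np

def laplacian_smooth(P, steps=10):
    """Move each vertex half way toward the midpoint of its two
    neighbours on a closed (periodic) contour. P is an (n, 2) array."""
    P = P.copy()
    for _ in range(steps):
        midpoint = 0.5 * (np.roll(P, 1, axis=0) + np.roll(P, -1, axis=0))
        P += 0.5 * (midpoint - P)   # move half way toward the centroid
    return P

# Example: a noisy circle smooths out after 10 steps
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
noisy = np.c_[np.cos(t), np.sin(t)]
noisy += 0.05 * np.random.default_rng(1).standard_normal((100, 2))
smooth = laplacian_smooth(noisy, steps=10)
```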

Signal representation
Represent a contour using a discrete periodic 2D signal, e.g. the x-coordinates of the seahorse contour

Laplacian smoothing in matrix form
The smoothing step as a matrix: x' = S x, applied to the x component only; the y component is treated the same way.

1D discrete Laplacian operator
Smoothing operator and Laplacian operator: the smoothing step is S = I - (1/2)L, where L is the 1D periodic discrete Laplacian.
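
One plausible matrix realization, consistent with the half-way rule above: the periodic Laplacian L = I - A/2 with A the cyclic adjacency matrix, and the smoothing step S = I - L/2:

```python
import numpy as np

def periodic_laplacian(n):
    """1D discrete Laplacian on a closed contour: L = I - A/2, where A is
    the cyclic adjacency matrix. L is symmetric positive semi-definite
    with eigenvalues in [0, 2]."""
    A = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
    return np.eye(n) - 0.5 * A

n = 100
L = periodic_laplacian(n)
S = np.eye(n) - 0.5 * L   # one smoothing step: x' = S x

# Applying S to the x-coordinates is the same update as the per-vertex
# loop in the earlier sketch; the y-coordinates are treated identically.
```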

Spectral analysis of signal/geometry
Express the signal X as a linear sum of eigenvectors. The DFT-like spectral transform projects X along each eigenvector: X̂ = E^T X (spatial domain → spectral domain).

Plot of eigenvectors
First 8 eigenvectors of the 1D periodic Laplacian. More oscillation as the eigenvalues (frequencies) increase.

Relation to the Discrete Fourier Transform
– The smallest eigenvalue of L is zero
– Each remaining eigenvalue (except for the last one when n is even) has multiplicity 2
– The plotted real eigenvectors are not unique to L
– One particular set of eigenvectors of L is the DFT basis
– Both sets exhibit similar oscillatory behaviours w.r.t. frequencies

Reconstruction and compression
Reconstruction using the k leading coefficients: a form of spectral compression, with the information loss given by the L2 norm of the discarded coefficients.
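
A sketch of this compression on a toy 1D periodic signal (the seahorse data is not reproduced here): transform, keep the k leading coefficients, reconstruct, and confirm that the loss equals the norm of the dropped coefficients:

```python
import numpy as np

n = 100
A = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
L = np.eye(n) - 0.5 * A                  # periodic 1D Laplacian, as above
lam, E = np.linalg.eigh(L)               # eigenvalues ascending (low freq first)

# Toy x-coordinate signal: a low-frequency wave plus noise
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
x = np.cos(3 * t) + 0.1 * np.random.default_rng(2).standard_normal(n)

x_hat = E.T @ x                          # spectral transform
k = 10
x_rec = E[:, :k] @ x_hat[:k]             # reconstruction from k leading coeffs

# Information loss is exactly the L2 norm of the discarded coefficients
assert np.isclose(np.linalg.norm(x - x_rec), np.linalg.norm(x_hat[k:]))
```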

Plot of spectral transform coefficients
Fairly fast decay as the eigenvalue increases

Reconstruction examples
[Figures: n = 401, n = 75]

Laplacian smoothing as filtering
Recall the Laplacian smoothing operator S. Repeated application of S is a filter applied to the spectral coefficients: each coefficient is scaled by (1 - λ/2)^m.

Examples
[Figures: filtering results for m = 1, 5, 10, 50]

Computational issues
– No need to compute spectral coefficients for filtering; polynomial filters (e.g., the Laplacian) reduce to matrix-vector multiplication
– Spectral compression needs the explicit spectral transform
– Efficient computation: [Levy et al. 08]
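
A small check of the first point: m applications of S by matrix-vector products give the same result as explicitly filtering the spectral coefficients by (1 - λ/2)^m, without ever computing the eigenvectors on the first route:

```python
import numpy as np

n = 100
A = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
L = np.eye(n) - 0.5 * A
S = np.eye(n) - 0.5 * L
x = np.random.default_rng(3).standard_normal(n)

# Route 1: m matrix-vector multiplications, no eigen-decomposition
m = 10
y1 = x.copy()
for _ in range(m):
    y1 = S @ y1

# Route 2: explicit filter (1 - lambda/2)^m on the spectral coefficients
lam, E = np.linalg.eigh(L)
y2 = E @ ((1 - 0.5 * lam) ** m * (E.T @ x))

assert np.allclose(y1, y2)
```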

Towards a spectral mesh transform
Signal representation
– Vectors of the x, y, z vertex coordinates
Laplacian operator for meshes
– Encodes connectivity and geometry
– Combinatorial: graph Laplacians and variants
– Discretization of the continuous Laplace-Beltrami operator
The same kind of spectral transform and analysis applies.

Spectral Mesh Compression

Spectral Processing - Perspectives
Signal processing
– Filtering and compression
– Relation to the discrete Fourier transform (DFT)
Geometric
– Global and intrinsic
Machine learning
– Dimensionality reduction

A geometric perspective: classical
Classical Euclidean geometry
– Primitives not represented by coordinates
– Geometric relationships deduced in a pure and self-contained manner
– Use of axioms

A geometric perspective: analytic
Descartes' analytic geometry
– Algebraic analysis tools introduced
– Primitives referenced in a global frame: an extrinsic approach

Intrinsic approach
Riemann's intrinsic view of geometry
– Geometry viewed purely from the surface perspective
– Metric: "distance" between points on the surface
– Many spaces (shapes) can be treated simultaneously: isometry

Spectral methods: intrinsic view
The spectral approach takes the intrinsic view
– Intrinsic geometric/mesh information is captured via a linear mesh operator
– Eigenstructures of the operator present the intrinsic geometric information in an organized manner
– Rarely need all eigenstructures; the dominant ones often suffice

Capture of global information
(Courant-Fischer) Let S ∈ ℝ^(n×n) be a symmetric matrix. Then its eigenvalues λ1 ≤ λ2 ≤ … ≤ λn must satisfy

λi = min { v^T S v : ‖v‖2 = 1, v^T vk = 0 for 1 ≤ k ≤ i-1 },

where v1, v2, …, v(i-1) are eigenvectors of S corresponding to the smallest eigenvalues λ1, λ2, …, λ(i-1), respectively.

Interpretation
The Rayleigh quotient is v^T S v / v^T v.
– The smallest eigenvector minimizes the Rayleigh quotient
– The k-th smallest eigenvector minimizes the Rayleigh quotient among the vectors orthogonal to all previous eigenvectors
– Eigenvectors are solutions to global optimization problems
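
A quick numerical verification on a random symmetric matrix: every vector orthogonal to the first eigenvector has Rayleigh quotient at least λ2, and the second eigenvector attains that bound:

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((8, 8))
S = 0.5 * (B + B.T)                 # random symmetric matrix
lam, V = np.linalg.eigh(S)          # ascending eigenvalues

def rayleigh(S, v):
    return (v @ S @ v) / (v @ v)    # scale-invariant Rayleigh quotient

# Any vector orthogonal to v1 has Rayleigh quotient >= lambda_2
v1 = V[:, 0]
for _ in range(1000):
    v = rng.standard_normal(8)
    v -= (v @ v1) * v1              # project out the first eigenvector
    assert rayleigh(S, v) >= lam[1] - 1e-9

# The bound is attained by the second eigenvector
assert np.isclose(rayleigh(S, V[:, 1]), lam[1])
```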

Use of eigenstructures
Eigenvalues
– Spectral graph theory: graph eigenvalues are closely related to almost all major global graph invariants
– Have been adopted as compact global shape descriptors
Eigenvectors
– Useful extremal properties, e.g., heuristics for NP-hard problems such as normalized cuts and sequencing
– Spectral embeddings capture global information, e.g., clustering

Example: clustering problem

Spectral clustering
Encode information about pairwise point affinities: A_ij = e^(-‖p_i - p_j‖² / (2σ²)). Input data → affinity operator A → spectral embedding via the leading eigenvectors.

Spectral clustering eigenvectors
In the spectral domain, perform any clustering (e.g., k-means).
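
A self-contained sketch of this pipeline on toy data (the σ value, the two-blob data, and clustering on the raw affinity matrix rather than a normalized Laplacian are illustration choices, not the course's exact setup):

```python
import numpy as np

rng = np.random.default_rng(5)
# Toy data: two well-separated 2D blobs of 50 points each
P = np.r_[rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))]

# Pairwise affinities: A_ij = exp(-||p_i - p_j||^2 / (2 sigma^2))
sigma = 0.5
D2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)
A = np.exp(-D2 / (2 * sigma**2))

# Spectral embedding: the leading eigenvectors of the affinity operator
lam, U = np.linalg.eigh(A)          # ascending eigenvalues
Z = U[:, -2:]                       # two leading eigenvectors

# Any clustering works in the spectral domain; a few Lloyd (k-means) steps:
C = Z[rng.choice(len(Z), 2, replace=False)]   # centroids seeded from data
for _ in range(20):
    labels = np.argmin(((Z[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
    C = np.array([Z[labels == k].mean(0) if np.any(labels == k) else C[k]
                  for k in range(2)])
```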

Why does it work this way?
[Diagram] Linkage-based clustering uses local information; spectral clustering goes through the spectral domain.

Local vs. global distances
Would be nice to cluster according to c_ij. A good distance: points in the same cluster are closer in the transformed domain. Looking at the set of shortest paths is more global. Commute time distance: c_ij = expected time for a random walk to go from i to j and then back to i.

Local vs. global distances
[Figure: the same data in the spectral domain]

Commute time and spectral
– Eigen-decompose the graph Laplacian K: K = UΛU^T
– Let K' be the generalized inverse of K: K' = UΛ'U^T, with Λ'_ii = 1/Λ_ii if Λ_ii ≠ 0, otherwise Λ'_ii = 0
– Note: the Laplacian is singular

Commute time and spectral
– Let z_i be the i-th row of UΛ'^(1/2), the spectral embedding (scaling each eigenvector by the inverse square root of its eigenvalue)
– Then ‖z_i - z_j‖² = c_ij, the commute time distance [Klein & Randic 93, Fouss et al. 06]
– The full set of eigenvectors is used, but select the first k in practice
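
A sketch of this embedding on a small random weighted graph; the squared spectral distance reproduces K'_ii + K'_jj - 2K'_ij, which is the commute time up to the constant graph-volume factor:

```python
import numpy as np

# Small weighted graph: W is a symmetric affinity matrix, K = D - W
rng = np.random.default_rng(6)
W = rng.uniform(0, 1, (7, 7))
W = 0.5 * (W + W.T)
np.fill_diagonal(W, 0)
K = np.diag(W.sum(1)) - W                       # graph Laplacian (singular)

lam, U = np.linalg.eigh(K)
lam_inv = np.where(lam > 1e-10, 1.0 / lam, 0.0)  # generalized-inverse spectrum

Z = U * np.sqrt(lam_inv)                        # row i of U Lambda'^(1/2) is z_i

# Squared spectral distance equals K'_ii + K'_jj - 2 K'_ij
Kp = (U * lam_inv) @ U.T                        # K' = U Lambda' U^T
i, j = 1, 4
d2 = np.sum((Z[i] - Z[j]) ** 2)
assert np.isclose(d2, Kp[i, i] + Kp[j, j] - 2 * Kp[i, j])
```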

Example: intrinsic geometry
Spectral transform to handle shape pose, then rigid alignment. Our first example: correspondence.

Spectral Processing - Perspectives
Signal processing
– Filtering and compression
– Relation to the discrete Fourier transform (DFT)
Geometric
– Global and intrinsic
Machine learning
– Dimensionality reduction

Spectral embedding
Spectral decomposition: A = UΛU^T. The full spectral embedding W = UΛ^(1/2), given by the eigenvectors each scaled by the square root of its eigenvalue, completely captures the operator: WW^T = A.

Dimensionality reduction
The full spectral embedding is high-dimensional. Using a few dominant eigenvectors gives dimensionality reduction:
– Information-preserving
– Structure enhancement (Polarization Theorem)
– Low-D representation: simplifying solutions

Eckart & Young: info-preserving
– A ∈ ℝ^(n×n): symmetric and positive semi-definite
– U(k) ∈ ℝ^(n×k): leading eigenvectors of A, scaled by the square roots of the eigenvalues
– Then U(k)U(k)^T is the best rank-k approximation of A in the Frobenius norm
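
A numeric check of the theorem on a random PSD matrix: the Frobenius error of U(k)U(k)^T equals the root-sum-square of the discarded eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(7)
B = rng.standard_normal((10, 10))
A = B @ B.T                               # symmetric positive semi-definite

lam, U = np.linalg.eigh(A)                # ascending eigenvalues
k = 3
Uk = U[:, -k:] * np.sqrt(lam[-k:])        # leading eigenvectors, scaled
A_k = Uk @ Uk.T                           # best rank-k approximation of A

# Frobenius error equals sqrt of the sum of squared dropped eigenvalues
err = np.linalg.norm(A - A_k, 'fro')
assert np.isclose(err, np.sqrt((lam[:-k] ** 2).sum()))
```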

Brand & Huang: Polarization Theorem
Truncating the spectral embedding to fewer leading eigenvectors polarizes the angles between point pairs, enhancing cluster structure.

Low-dim: simpler problems
Mesh projected into the eigenspace formed by the first two eigenvectors of a mesh Laplacian. Reduce 3D analysis to contour analysis [Liu & Zhang 07].

Challenges - Not quite DFT
– The DFT basis is fixed given n, e.g., regular and easy to compare (Fourier descriptors)
– The spectral mesh transform is operator-dependent: which operator to use?
– Different behaviour of eigenfunctions on the same sphere

Challenges - No free lunch
No mesh Laplacian on general meshes can satisfy a list of all desirable properties. Remedy: use nice meshes.
[Figures: Delaunay or non-obtuse; Delaunay but obtuse; non-obtuse]

Additional issues
– Computational issues: FFT vs. eigen-decomposition
– Regularity of vibration patterns is lost: difficult to characterize eigenvectors; the eigenvalue is not enough
– Non-trivial to compare two sets of eigenvectors: how to pair them up?

Conclusion
– Use the eigen-structure of "well-behaved" linear operators for geometry processing
– Solve the problem in a different domain via a spectral transform
– Fourier analysis on meshes
– Captures global and intrinsic shape characteristics
– Dimensionality reduction: effective and simplifying
