Adaptive Filtering - Theory And Applications


Adaptive Filtering - Theory and Applications
José C. M. Bermudez
Department of Electrical Engineering, Federal University of Santa Catarina, Florianópolis - SC, Brazil
IRIT - INP-ENSEEIHT, Toulouse, May 2011

Outline
1. Introduction
2. Adaptive Filtering Applications
3. Adaptive Filtering Principles
4. Iterative Solutions for the Optimum Filtering Problem
5. Stochastic Gradient Algorithms
6. Deterministic Algorithms
7. Analysis

Introduction

Estimation Techniques
Several techniques exist to solve estimation problems.
- Classical estimation: Maximum Likelihood (ML), Least Squares (LS), Method of Moments, etc.
- Bayesian estimation: Minimum MSE (MMSE), Maximum A Posteriori (MAP), etc.
- Linear estimation: frequently used in practice when computational complexity is limited (real-time operation).

Linear Estimators
Simpler to determine: they depend only on the first two moments of the data.
- Statistical approach - optimal linear filters: minimum mean square error; require second-order statistics of the signals.
- Deterministic approach - least squares estimators: minimum least squares error; require handling of a data observation matrix.

Limitations of Optimal Filters and LS Estimators
- Statistics of the signals may not be available or cannot be accurately estimated.
- There may be no time available for statistical estimation (real-time operation).
- Signals and systems may be nonstationary.
- The required memory may be prohibitive.
- The computational load may be prohibitive.

Iterative Solutions
- Search for the optimal solution starting from an initial guess.
- Iterative algorithms are based on classical optimization algorithms.
- Require reduced computational effort per iteration.
- Need several iterations to converge to the optimal solution.
- These methods form the basis for the development of adaptive algorithms.
- Still require knowledge of the signal statistics.

Adaptive Filters
Usually approximate iterative algorithms and:
- Do not require prior knowledge of the signal statistics.
- Have a small computational complexity per iteration.
- Converge to a neighborhood of the optimal solution.
Adaptive filters are good for:
- Real-time applications, when there is no time for statistical estimation.
- Applications with nonstationary signals and/or systems.

Properties of Adaptive Filters
- They can operate satisfactorily in unknown and possibly time-varying environments without user intervention.
- They improve their performance during operation by learning statistical characteristics from current signal observations.
- They can track variations in the signal operating environment (SOE).

Adaptive Filtering Applications

Basic Classes of Adaptive Filtering Applications
- System Identification
- Inverse System Modeling
- Signal Prediction
- Interference Cancelation

System Identification
Block diagram: the input x(n) drives both the unknown system and the adaptive filter. The unknown system output, plus a disturbance eo(n) and possibly other signals, forms the desired signal d(n); the adaptive filter produces the estimate d̂(n), and the error e(n) = d(n) - d̂(n) drives the adaptive algorithm.

Applications – System Identification
Channel estimation (communications systems)
- Objective: model the channel to design distortion compensation.
- x(n): training sequence.
Plant identification (control systems)
- Objective: model the plant to design a compensator.
- x(n): training sequence.

Echo Cancellation
Telephone systems and VoIP.
- Echo caused by network impedance mismatches or by the acoustic environment.
- Objective: model the echo path impulse response.
- x(n): transmitted signal.
- d(n): echo + noise.
Figure: Network echo cancellation (the echo canceler EC sits between the Tx and Rx paths, in parallel with the hybrid H).

Inverse System Modeling
- The adaptive filter attempts to estimate the unknown system's inverse.
- The adaptive filter input is usually corrupted by noise.
- The desired response d(n) may not be available.
Block diagram: s(n) passes through the unknown system and is corrupted by noise z(n) to produce x(n); the adaptive filter output y(n) is compared with d(n), a delayed version of s(n), and e(n) drives the adaptive algorithm.

Applications – Inverse System Modeling
Channel Equalization
Block diagram: s(n) passes through the channel (with noise z(n)) to produce x(n); the adaptive filter output y(n) is compared with d(n), which comes from a local generator during training and from previous decisions afterwards.
- Objective: reduce intersymbol interference.
- Initially, a training sequence is used for d(n).
- After training, d(n) is generated from previous decisions.

Signal Prediction
Block diagram: x(n) is delayed by no samples before entering the adaptive filter; the filter output y(n) is compared with d(n) = x(n), and e(n) drives the adaptive algorithm.
- Most widely used case: forward prediction.
- The signal x(n) is to be predicted from the samples {x(n - no), x(n - no - 1), . . . , x(n - no - L)}.

Application – Signal Prediction
DPCM speech quantizer - linear predictive coding.
- Objective: reduce the speech transmission bandwidth.
- Signal transmitted all the time: the quantization error.
- Predictor coefficients are transmitted at a low rate.
Block diagram: the prediction error e(n) = d(n) - y(n) is quantized to produce the DPCM signal Q[e(n)]; the predictor operates on ȳ(n) = y(n) + Q[e(n)] ≈ d(n).

Interference Cancelation
- One or more sensor signals are used to remove interference and noise.
- Reference signals correlated with the interference should also be available.
- Applications: array processing for radar and communications; biomedical sensing systems; active noise control systems.

Application – Interference Cancelation
Active Noise Control
Ref: D. G. Manolakis, V. K. Ingle and S. M. Kogon, Statistical and Adaptive Signal Processing, 2000.
- Cancelation of acoustic noise using destructive interference.
- A secondary system between the adaptive filter and the cancelation point is unavoidable.
- Cancelation is performed in the acoustic environment.

Active Noise Control – Block Diagram
Figure: the reference x(n) passes through the primary path wo to produce d(n); the adaptive filter w(n) output y(n) passes through the secondary path S (with a possible nonlinearity g(·)) before reaching the cancelation point; the adaptive algorithm uses e(n) and the filtered reference xf(n) obtained through the secondary-path estimate Ŝ.

Adaptive Filtering Principles

Adaptive Filter Features
Adaptive filters are composed of three basic modules:
Filtering structure
- Determines the output of the filter given its input samples.
- Its weights are periodically adjusted by the adaptive algorithm.
- Can be linear or nonlinear, depending on the application.
- Linear filters can be FIR or IIR.
Performance criterion
- Defined according to the application and to mathematical tractability.
- Is used to derive the adaptive algorithm.
- Its value at each iteration affects the adaptive weight updates.
Adaptive algorithm
- Uses the value of the performance criterion and the current signals.
- Modifies the adaptive weights to improve performance.
- Its form and complexity are a function of the structure and of the performance criterion.

Signal Operating Environment (SOE)
Comprises all information regarding the properties of the signals and systems:
- Input signals
- Desired signal
- Unknown systems
If the SOE is nonstationary:
- Acquisition or convergence mode: from start-up until close to best performance.
- Tracking mode: readjustment following the SOE's time variations.
Adaptation can be:
- Supervised: the desired signal is available, so e(n) can be evaluated.
- Unsupervised: the desired signal is unavailable.

Performance Evaluation
- Convergence rate
- Misadjustment
- Tracking
- Robustness (disturbances and numerical)
- Computational requirements (operations and memory)
- Structure (ease of implementation)
- Performance surface
- Stability

Optimum versus Adaptive Filters in Linear Estimation
Conditions for this study:
- Stationary SOE.
- Transversal FIR filter structure.
- All signals are real valued.
- Performance criterion: mean-square error E[e²(n)].
The linear estimation problem: a linear FIR filter w produces y(n) from x(n); the error is e(n) = d(n) - y(n), and J_ms = E[e²(n)].

The Linear Estimation Problem
x(n) = [x(n), x(n-1), · · · , x(n-N+1)]^T
y(n) = x^T(n) w
e(n) = d(n) - y(n) = d(n) - x^T(n) w
J_ms = E[e²(n)] = σ_d² - 2 p^T w + w^T R_xx w
where p = E[x(n) d(n)] and R_xx = E[x(n) x^T(n)].
Normal equations:
R_xx w_o = p  ⇒  w_o = R_xx⁻¹ p,  for R_xx > 0
J_ms,min = σ_d² - p^T R_xx⁻¹ p
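As a concrete illustration, the normal equations can be solved directly once R_xx and p are known. The 2×2 statistics below are assumed toy values (not taken from the slides):

```python
# Solve the normal equations R_xx w_o = p for a 2-tap Wiener filter.
# The statistics below are assumed toy values for illustration.

def solve_2x2(R, b):
    """Solve R w = b for a 2x2 matrix R via Cramer's rule."""
    det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
    return [(b[0] * R[1][1] - b[1] * R[0][1]) / det,
            (R[0][0] * b[1] - R[1][0] * b[0]) / det]

Rxx = [[1.0, 0.5],
       [0.5, 1.0]]          # R_xx = E[x(n) x^T(n)]  (assumed)
p = [0.7, 0.4]              # p = E[x(n) d(n)]       (assumed)
sigma_d2 = 1.0              # sigma_d^2 = E[d^2(n)]  (assumed)

wo = solve_2x2(Rxx, p)      # w_o = R_xx^{-1} p

# Minimum MSE: J_ms,min = sigma_d^2 - p^T R_xx^{-1} p = sigma_d^2 - p^T w_o
Jmin = sigma_d2 - (p[0] * wo[0] + p[1] * wo[1])
print(wo, Jmin)
```

Any positive-definite R_xx works here; the 2×2 case just keeps the linear solve explicit.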

What if d(n) is nonstationary?
x(n) = [x(n), x(n-1), · · · , x(n-N+1)]^T
y(n) = x^T(n) w(n)
e(n) = d(n) - y(n) = d(n) - x^T(n) w(n)
J_ms(n) = E[e²(n)] = σ_d²(n) - 2 p^T(n) w(n) + w^T(n) R_xx w(n)
where p(n) = E[x(n) d(n)] and R_xx = E[x(n) x^T(n)].
Normal equations:
R_xx w_o(n) = p(n)  ⇒  w_o(n) = R_xx⁻¹ p(n),  for R_xx > 0
J_ms,min(n) = σ_d²(n) - p^T(n) R_xx⁻¹ p(n)

Optimum Filters versus Adaptive Filters
Optimum filters:
- Compute p(n) = E[x(n) d(n)].
- Solve R_xx w_o(n) = p(n).
- Filter with y(n) = x^T(n) w_o(n).
- Nonstationary SOE: the optimum filter must be determined for each value of n.
Adaptive filters:
- Filtering: y(n) = x^T(n) w(n).
- Evaluate the error: e(n) = d(n) - y(n).
- Adaptive algorithm: w(n+1) = w(n) + Δw[x(n), e(n)].
- Δw(n) is chosen so that w(n) is close to w_o(n) for n large.

Characteristics of Adaptive Filters
- Search for the optimum solution on the performance surface.
- Follow principles of optimization techniques.
- Implement a recursive optimization solution.
- Convergence speed may depend on initialization.
- Have stability regions.
- The steady-state solution fluctuates about the optimum.
- Can track time-varying SOEs better than optimum filters.
- Performance depends on the performance surface.

Iterative Solutions for the Optimum Filtering Problem

Performance (Cost) Functions
- Mean-square error E[e²(n)] (most popular). Adaptive algorithms: Least-Mean Square (LMS), Normalized LMS (NLMS), Affine Projection (AP), Recursive Least Squares (RLS), etc.
- Regularized MSE: J_rms = E[e²(n)] + α‖w(n)‖². Adaptive algorithm: leaky least-mean square (leaky LMS).
- ℓ1-norm criterion: J_ℓ1 = E[|e(n)|]. Adaptive algorithm: Sign-Error.
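The sign-error algorithm derived from the ℓ1 criterion uses the standard update w(n+1) = w(n) + µ sign(e(n)) x(n). A minimal sketch, with an assumed 2-tap toy system (the system, step size, and signal model are illustration choices, not from the slides):

```python
# Sign-error update derived from J_l1 = E[|e(n)|]:
#   w(n+1) = w(n) + mu * sign(e(n)) * x(n)
import random

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

random.seed(1)
w_true = [0.5, 0.25]        # unknown 2-tap system (assumed)
w = [0.0, 0.0]              # adaptive weights
mu = 0.01
x_prev = 0.0

for n in range(20000):
    x_now = random.gauss(0.0, 1.0)
    x = [x_now, x_prev]                          # regressor [x(n), x(n-1)]
    d = w_true[0] * x[0] + w_true[1] * x[1]      # desired signal
    e = d - (w[0] * x[0] + w[1] * x[1])          # a priori error
    w = [w[0] + mu * sign(e) * x[0],
         w[1] + mu * sign(e) * x[1]]             # sign-error update
    x_prev = x_now

print(w)  # hovers near w_true, with fluctuations of order mu
```

Because the update magnitude does not shrink with e(n), the weights keep jittering near the solution; a smaller µ trades convergence speed for smaller fluctuations.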

Performance (Cost) Functions – continued
- Least-mean fourth (LMF) criterion: J_LMF = E[e⁴(n)]. Adaptive algorithm: Least-Mean Fourth (LMF).
- Least-mean-mixed-norm (LMMN) criterion: J_LMMN = (1/2) E[α e²(n) + (1 - α) e⁴(n)]. Adaptive algorithm: Least-Mean-Mixed-Norm (LMMN).
- Constant-modulus criterion: J_CM = E[(γ - |x^T(n) w(n)|²)²]. Adaptive algorithm: Constant-Modulus (CM).

MSE Performance Surface – Small Input Correlation
Figure: MSE performance surface over (w1, w2) for small input correlation.

MSE Performance Surface – Large Input Correlation
Figure: MSE performance surface over (w1, w2) for large input correlation.

The Steepest Descent Algorithm – Stationary SOE
Cost function:
J_ms(n) = E[e²(n)] = σ_d² - 2 p^T w(n) + w^T(n) R_xx w(n)
Weight update equation:
w(n+1) = w(n) + µ c(n)
- µ: step size
- c(n): correction term (determines the direction of Δw(n))
Steepest descent adjustment: c(n) = -∇J_ms(n), so that J_ms(n+1) ≤ J_ms(n):
w(n+1) = w(n) + µ [p - R_xx w(n)]

Weight Update Equation About the Optimum Weights
Weight error update equation:
w(n+1) = w(n) + µ [p - R_xx w(n)]
Using p = R_xx w_o:
w(n+1) = (I - µ R_xx) w(n) + µ R_xx w_o
Weight error vector: v(n) = w(n) - w_o
v(n+1) = (I - µ R_xx) v(n)
The matrix I - µ R_xx must be stable for convergence (|λ_i| < 1).
Assuming convergence, lim_{n→∞} v(n) = 0.
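A minimal sketch of the recursion w(n+1) = w(n) + µ[p - R_xx w(n)], using assumed 2×2 toy statistics (not values from the slides):

```python
# Steepest descent on the quadratic MSE surface:
#   w(n+1) = w(n) + mu * (p - R_xx w(n)),  converging to w_o = R_xx^{-1} p.

Rxx = [[1.0, 0.5],
       [0.5, 1.0]]      # assumed; eigenvalues 1.5 and 0.5 -> stable for 0 < mu < 2/1.5
p = [0.7, 0.4]          # assumed cross-correlation vector
mu = 0.5

w = [0.0, 0.0]          # initial guess
for _ in range(200):
    Rw = [Rxx[0][0] * w[0] + Rxx[0][1] * w[1],
          Rxx[1][0] * w[0] + Rxx[1][1] * w[1]]
    w = [w[0] + mu * (p[0] - Rw[0]),
         w[1] + mu * (p[1] - Rw[1])]

print(w)  # converges to w_o = [2/3, 1/15]
```

With these eigenvalues the slowest mode is |1 - 0.5·0.5| = 0.75, so 200 iterations are far more than enough for machine-precision convergence.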

Convergence Conditions
v(n+1) = (I - µ R_xx) v(n), with R_xx positive definite.
Eigen-decomposition of R_xx:
R_xx = Q Λ Q^T
v(n+1) = (I - µ Q Λ Q^T) v(n)
Q^T v(n+1) = Q^T v(n) - µ Λ Q^T v(n)
Defining ṽ(n) = Q^T v(n):
ṽ(n+1) = (I - µ Λ) ṽ(n)

Convergence Properties
ṽ(n+1) = (I - µΛ) ṽ(n)
ṽ_k(n+1) = (1 - µλ_k) ṽ_k(n),  k = 1, . . . , N
ṽ_k(n) = (1 - µλ_k)^n ṽ_k(0)
Convergence modes:
- monotonic if 0 < 1 - µλ_k < 1
- oscillatory if -1 < 1 - µλ_k < 0
Convergence requires |1 - µλ_k| < 1  ⇔  0 < µ < 2/λ_max

Optimal Step-Size
ṽ_k(n) = (1 - µλ_k)^n ṽ_k(0); convergence modes: |1 - µλ_k|
- max_k |1 - µλ_k|: slowest mode
- min_k |1 - µλ_k|: fastest mode

Optimal Step-Size – continued
Figure: |1 - µλ_k| versus µ, with breakpoints at 1/λ_max and 1/λ_min.
µ_o = arg min_µ max_k |1 - µλ_k|
At the optimum, 1 - µ_o λ_min = -(1 - µ_o λ_max), so
µ_o = 2 / (λ_max + λ_min)
Optimal slowest modes: ±(ρ - 1)/(ρ + 1), where ρ = λ_max/λ_min is the eigenvalue spread.
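The expressions above can be checked numerically; the eigenvalues below are assumed for illustration:

```python
# Optimal steepest-descent step size: mu_o = 2/(lambda_max + lambda_min).
# At mu_o the two extreme modes |1 - mu*lambda| have equal magnitude,
# given by (rho - 1)/(rho + 1) with rho = lambda_max/lambda_min.

lam_max, lam_min = 1.5, 0.5            # assumed eigenvalues of R_xx
mu_o = 2.0 / (lam_max + lam_min)
rho = lam_max / lam_min                # eigenvalue spread

mode_max = abs(1.0 - mu_o * lam_max)   # extreme mode at lambda_max
mode_min = abs(1.0 - mu_o * lam_min)   # extreme mode at lambda_min
slowest = (rho - 1.0) / (rho + 1.0)

print(mu_o, mode_max, mode_min, slowest)  # 1.0 0.5 0.5 0.5
```

For ρ = 3 the slowest mode is 0.5 even at the optimal step size, which is why a large eigenvalue spread slows convergence.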

The Learning Curve – J_ms(n)
J_ms(n) = J_ms,min + ṽ^T(n) Λ ṽ(n) = J_ms,min + Σ_{k=1}^{N} λ_k ṽ_k²(n)
(the second term is the excess MSE)
Since ṽ_k(n) = (1 - µλ_k)^n ṽ_k(0),
J_ms(n) = J_ms,min + Σ_{k=1}^{N} λ_k (1 - µλ_k)^{2n} ṽ_k²(0)
- λ_k (1 - µλ_k)^{2n} ≥ 0 ⇒ monotonic convergence.
- The stability limit is again 0 < µ < 2/λ_max.
- J_ms(n) converges faster than w(n).
- The algorithm converges faster as ρ = λ_max/λ_min → 1.
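The excess-MSE expression above can be evaluated directly; the eigenvalues and initial rotated weight errors below are assumed for illustration:

```python
# Excess MSE of the learning curve:
#   J_ms(n) - J_ms,min = sum_k lambda_k (1 - mu*lambda_k)^(2n) vtilde_k(0)^2,
# which decays monotonically to zero for 0 < mu < 2/lambda_max.

lams = [1.5, 0.5]          # assumed eigenvalues of R_xx
v0 = [1.0, -2.0]           # assumed initial rotated weight errors vtilde(0)
mu = 0.5                   # inside the stability region 0 < mu < 2/1.5

excess = []
for n in range(50):
    J_ex = sum(lam * (1.0 - mu * lam) ** (2 * n) * v ** 2
               for lam, v in zip(lams, v0))
    excess.append(J_ex)

print(excess[0], excess[-1])  # starts at 3.5, decays toward 0
```

Each term is nonnegative and shrinks geometrically, which is the monotonic convergence claimed above.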

Simulation Results
x(n) = α x(n-1) + v(n)
Figure: steepest descent mean square error (MSE in dB versus iteration). Left: white-noise input (ρ = 1). Right: AR(1) input with α = 0.7 (ρ = 5.7).
- Linear system identification: FIR with 20 coefficients.
- Step size µ = 0.3.
- Noise power σ_v² = 10⁻⁶.

The Newton Algorithm
- Steepest descent: linear approximation of J_ms about the operating point.
- Newton's method: quadratic approximation of J_ms.
Expanding J_ms(w) in a Taylor series about w(n):
J_ms(w) ≈ J_ms[w(n)] + ∇^T J_ms [w - w(n)] + (1/2) [w - w(n)]^T H(n) [w - w(n)]
Differentiating with respect to w and equating to zero at w = w(n+1):
∇J_ms[w(n+1)] ≈ ∇J_ms[w(n)] + H[w(n)] [w(n+1) - w(n)] = 0
w(n+1) = w(n) - H⁻¹[w(n)] ∇J_ms[w(n)]

The Newton Algorithm – continued
∇J_ms[w(n)] = -2p + 2 R_xx w(n)
H(w(n)) = 2 R_xx
Thus, adding a step-size control,
w(n+1) = w(n) + µ R_xx⁻¹ [p - R_xx w(n)]
- Quadratic surface ⇒ convergence in one iteration for µ = 1.
- Requires the determination of R_xx⁻¹ (a drawback).
- Can be used to derive simpler adaptive algorithms.
- When H(n) is close to singular ⇒ regularization: H̃(n) = 2 R_xx + 2εI
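On the quadratic MSE surface a single Newton step with µ = 1 lands exactly on w_o. A sketch with assumed toy statistics:

```python
# One Newton step with mu = 1:
#   w(n+1) = w(n) + R_xx^{-1} (p - R_xx w(n)) = R_xx^{-1} p = w_o.

def solve_2x2(R, b):
    """Solve R w = b for a 2x2 matrix R via Cramer's rule."""
    det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
    return [(b[0] * R[1][1] - b[1] * R[0][1]) / det,
            (R[0][0] * b[1] - R[1][0] * b[0]) / det]

Rxx = [[1.0, 0.5],
       [0.5, 1.0]]               # assumed toy statistics
p = [0.7, 0.4]

w = [5.0, -3.0]                  # arbitrary starting point
Rw = [Rxx[0][0] * w[0] + Rxx[0][1] * w[1],
      Rxx[1][0] * w[0] + Rxx[1][1] * w[1]]
step = solve_2x2(Rxx, [p[0] - Rw[0], p[1] - Rw[1]])  # R_xx^{-1}(p - R_xx w)
w = [w[0] + step[0], w[1] + step[1]]                 # mu = 1

wo = solve_2x2(Rxx, p)           # Wiener solution, for comparison
print(w, wo)                     # identical up to rounding
```

Note the contrast with steepest descent, whose convergence rate depends on the eigenvalue spread; here the Hessian inverse removes that dependence entirely.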

Basic Adaptive Algorithms

Least Mean Squares (LMS) Algorithm
- Can be interpreted in different ways.
- Each interpretation helps in understanding the algorithm's behavior.
- Some of these interpretations are related to the steepest descent algorithm.

LMS as a Stochastic Gradient Algorithm
Suppose we use the instantaneous estimate of J_ms(n) = E[e²(n)]:
Ĵ_ms(n) = e²(n)
The estimated gradient vector becomes
∇̂J_ms(n) = ∂e²(n)/∂w(n) = 2 e(n) ∂e(n)/∂w(n)
Since e(n) = d(n) - x^T(n) w(n),
∇̂J_ms(n) = -2 e(n) x(n)  (stochastic gradient)
and, using the steepest descent weight update equation,
w(n+1) = w(n) + µ e(n) x(n)  (LMS weight update)
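A minimal LMS sketch for system identification; the 2-tap "unknown system", step size, and noise level are assumed for illustration:

```python
# LMS: w(n+1) = w(n) + mu * e(n) * x(n), identifying an assumed 2-tap FIR system.
import random

random.seed(0)
w_true = [0.8, -0.3]        # unknown system (assumed)
w = [0.0, 0.0]              # adaptive weights
mu = 0.05
x_prev = 0.0

for n in range(5000):
    x_now = random.gauss(0.0, 1.0)
    x = [x_now, x_prev]                               # regressor [x(n), x(n-1)]
    d = w_true[0] * x[0] + w_true[1] * x[1] + random.gauss(0.0, 0.01)
    e = d - (w[0] * x[0] + w[1] * x[1])               # a priori error
    w = [w[0] + mu * e * x[0],
         w[1] + mu * e * x[1]]                        # LMS update
    x_prev = x_now

print(w)  # fluctuates in a small neighborhood of w_true
```

Only e(n) and x(n) are needed at each step: no statistics are estimated, which is exactly the practical appeal of LMS noted earlier.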

LMS as a Stochastic Estimation Algorithm
∇J_ms(n) = -2p + 2 R_xx w(n)
Stochastic estimators:
p̂ = d(n) x(n),  R̂_xx = x(n) x^T(n)
Then,
∇̂J_ms(n) = -2 d(n) x(n) + 2 x(n) x^T(n) w(n) = -2 e(n) x(n)
Using ∇̂J_ms(n) in the steepest descent weight update,
w(n+1) = w(n) + µ e(n) x(n)

LMS – A Solution to a Local Optimization
Error expressions:
e(n) = d(n) - x^T(n) w(n)  (a priori error)
ǫ(n) = d(n) - x^T(n) w(n+1)  (a posteriori error)
We want |ǫ(n)| < |e(n)|, with
ǫ(n) - e(n) = -x^T(n) Δw(n),  Δw(n) = w(n+1) - w(n)
Expressing Δw(n) as Δw(n) = Δw̄(n) e(n),
ǫ(n) = e(n) [1 - x^T(n) Δw̄(n)]
For the maximum reduction |ǫ(n)| < |e(n)|, choose Δw̄(n) in the direction of x(n).

Choosing Δw̄(n) = µ x(n) gives Δw(n) = µ x(n) e(n), and
w(n+1) = w(n) + µ e(n) x(n)
As
ǫ(n) = e(n) [1 - µ x^T(n) x(n)],
|ǫ(n)| < |e(n)| requires |1 - µ x^T(n) x(n)| < 1, or
0 < µ < 2/‖x(n)‖²  (stability region)
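The a posteriori error relation ǫ(n) = [1 - µ‖x(n)‖²] e(n) can be checked directly; the regressor and a priori error below are assumed values:

```python
# |eps(n)| < |e(n)| exactly when 0 < mu < 2/||x(n)||^2.

x = [1.0, 2.0]                       # assumed regressor, ||x||^2 = 5
norm2 = x[0] ** 2 + x[1] ** 2
e = 3.0                              # assumed a priori error

results = []
for mu in (0.1, 0.39, 0.5):          # inside, inside, outside 2/5 = 0.4
    eps = (1.0 - mu * norm2) * e     # a posteriori error
    results.append(abs(eps) < abs(e))

print(results)  # [True, True, False]
```

The bound depends on the instantaneous regressor energy ‖x(n)‖², which is the observation that motivates the normalized (NLMS) step size.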

Observations - LMS Algorithm
- LMS is a noisy approximation of the steepest descent algorithm.
- The gradient estimate is unbiased.
- The errors in the gradient estimate lead to an excess MSE: J_ms,ex(∞) ≠ 0.
- The vector w(n) is now random.
- Steepest descent properties are no longer guaranteed ⇒ LMS analysis is required.
- The instantaneous estimates allow tracking without redesign.

Some Research Results
J. C. M. Bermudez and N. J. Bershad, "A nonlinear analytical model for the quantized LMS algorithm - the arbitrary step size case," IEEE Transactions on Signal Processing, vol. 44, no. 5, pp. 1175-1183, May 1996.
J. C. M. Bermudez and N. J. Bershad, "Transient and tracking performance analysis of the quantized LMS algorithm for time-varying system identification," IEEE Transactions on Signal Processing, vol. 44, no. 8, pp. 1990-1997, August 1996.
N. J. Bershad and J. C. M. Bermudez, "A nonlinear analytical model for the quantized LMS algorithm - the power-of-two step size case," IEEE Transactions on Signal Processing, vol. 44, no. 11, pp. 2895-2900, November 1996.
N. J. Bershad and J. C. M. Bermudez, "Sinusoidal interference rejection analysis of an LMS adaptive feedforward controller with a noisy periodic reference," IEEE Transactions on Signal Processing, vol. 46, no. 5, pp. 1298-1313, May 1998.
J. C. M. Bermudez and N. J. Bershad, "Non-Wiener behavior of the Filtered-X LMS algorithm," IEEE Trans. on Circuits and Systems II - Analog and Digital Signal Processing, vol. 46, no. 8, pp. 1110-1114, August 1999.

Some Research Results – continued
O. J. Tobias, J. C. M. Bermudez and N. J. Bershad, "Mean weight behavior of the Filtered-X LMS algorithm," IEEE Transactions on Signal Processing, vol. 48, no. 4, pp. 1061-1075, April 2000.
M. H. Costa, J. C. M. Bermudez and N. J. Bershad, "Stochastic analysis of the LMS algorithm with a saturation nonlinearity following the adaptive filter output," IEEE Transactions on Signal Processing, vol. 49, no. 7, pp. 1370-1387, July 2001.
M. H. Costa, J. C. M. Bermudez and N. J. Bershad, "Stochastic analysis of the Filtered-X LMS algorithm in systems with nonlinear secondary paths," IEEE Transactions on Signal Processing, vol. 50, no. 6, pp. 1327-1342, June 2002.
M. H. Costa, J. C. M. Bermudez and N. J. Bershad, "The performance surface in filtered nonlinear mean square estimation," IEEE Transactions on Circuits and Systems I, vol. 50, no. 3, pp. 445-447, March 2003.
G. Barrault, J. C. M. Bermudez and A. Lenzi, "New Analytical Model for the Filtered-x Least Mean Squares Algorithm Verified Through Active Noise Control Experiment," Mechanical Systems and Signal Processing, vol. 21, pp. 1839-1852, 2007.

