Filtered Backprojection Algorithm Can Outperform Iterative Maximum Likelihood Expectation-Maximization Algorithm


Gengsheng L. Zeng
Utah Center for Advanced Imaging Research (UCAIR), Department of Radiology, University of Utah, Salt Lake City, UT 84108

Received 1 March 2012; revised 12 March 2012; accepted 14 March 2012

Correspondence to: Gengsheng L. Zeng; e-mail: larry@ucair.med.utah.edu or zenglarry@hotmail.com. Grant sponsors: This work was supported in part by the Margolis Foundation and NIH grant R01 HL108350. © 2012 Wiley Periodicals, Inc.

ABSTRACT: The iterative maximum-likelihood expectation-maximization (ML-EM) algorithm is an excellent algorithm for image reconstruction and usually provides better images than the filtered backprojection (FBP) algorithm. However, a windowed FBP algorithm can outperform the ML-EM algorithm on certain occasions, when the least-squared difference from the true image, that is, the least-squared error (LSE), is used as the comparison criterion. Computer simulations were carried out for the two algorithms. For a given data set, the best reconstruction (compared to the true image) from each algorithm was first obtained, and the two reconstructions were then compared. The stopping iteration number of the ML-EM algorithm and the parameters of the windowed FBP algorithm were determined so that they produced an image that was closest to the true image. However, to use the LSE criterion to compare algorithms, one must know the true image; how to select the optimal parameters when the true image is unknown is a practical open problem. For noisy Poisson projections, computer simulation results indicate that the ML-EM images are better than the regular FBP images, and the windowed FBP images are better than the ML-EM images. For noiseless projections, the FBP algorithms outperform the ML-EM algorithm. The computer simulations reveal that the windowed FBP algorithm can provide a reconstruction that is closer to the true image than the ML-EM algorithm. © 2012 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 22, 114–120, 2012; Published online in Wiley Online Library (wileyonlinelibrary.com). DOI 10.1002/ima.22011

Key words: image reconstruction; ML-EM algorithm; filtered backprojection algorithm; Poisson noise

I. INTRODUCTION

The iterative maximum-likelihood expectation-maximization (ML-EM) algorithm (or its ordered-subset version) is the most widely used iterative image reconstruction algorithm in nuclear medicine (Shepp and Vardi, 1982). Many studies have been carried out to compare the ML-EM algorithm with the analytical filtered backprojection (FBP) algorithm (Wilson and Tsui, 1993; Prvulovich et al., 1997; Wang et al., 1998; Gutman et al., 2003). Except for some parametric imaging studies, the ML-EM algorithm's output is usually considered to be better than the FBP algorithm's output (Carson et al., 1994; Oda et al., 2001). However, image evaluation and comparison are always task based, and there is no universal criterion: using one criterion, algorithm A can be better than algorithm B, while using another criterion, algorithm B can be better than algorithm A. If the task is lesion detection, an algorithm that can provide a smooth background and a high-contrast lesion wins (Tourassi et al., 1991). Some tasks rely on the image variance and bias trade-off (Xu et al., 1993), or on the resolution and signal-to-noise trade-off.

The FBP algorithm's performance depends on the selection of the window function that is used to regulate the ramp filter.
A new window function is suggested in this article, and the motivation for the design of the window function is presented in the Appendix. If we adopt the criterion that the best image is the one closest to the true image in the sense of the pixel-by-pixel least-squared error (LSE), then the proposed windowed FBP algorithm can outperform the ML-EM algorithm. The windowed FBP algorithm may not work well in a lesion-detection task, but it may find application in quantitative image reconstruction, which is important in, for example, kinetic parameter estimation in dynamic studies.

As the true image must be known to use this criterion, only computer simulations are performed in this article. It is impractical to use real data for the comparison studies, because the true image is unknown.

II. METHODS

The Shepp–Logan head phantom is used in the computer simulations (Shepp and Logan, 1974). The projection data are calculated analytically as line integrals of the phantom (without pixelization of the phantom). Poisson noise is added to the projections, and 100 noise realizations are used for each simulation study. The one-dimensional (1D) detector array contains 128 bins. No scattering, attenuation, system blurring, or motion is considered in data generation. The detector stops at 120 views uniformly distributed over 180°. The image is reconstructed into a two-dimensional (2D) 128 × 128 array, using the ML-EM algorithm and the new windowed FBP algorithm, respectively.
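For concreteness, the following Python sketch mimics the data-generation step described above. It is not the author's code: scikit-image's pixelized Shepp–Logan phantom and discrete Radon transform are used as stand-ins for the paper's analytic line integrals, and the add_poisson_noise helper and its count-scaling convention are illustrative assumptions.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, resize

rng = np.random.default_rng(0)

# 128 x 128 pixelized phantom (the paper instead evaluates line integrals analytically)
phantom = resize(shepp_logan_phantom(), (128, 128), anti_aliasing=True)

# 120 views uniformly distributed over 180 degrees; circle geometry gives 128 detector bins
theta = np.linspace(0.0, 180.0, 120, endpoint=False)
sinogram = radon(phantom, theta=theta, circle=True)          # shape (128, 120)

def add_poisson_noise(sino, total_counts, rng=rng):
    """Scale the sinogram to a target total photon count, draw Poisson counts,
    and return the noisy data in the original line-integral units."""
    scale = total_counts / sino.sum()
    return rng.poisson(sino * scale).astype(float) / scale

noisy_sinogram = add_poisson_noise(sinogram, total_counts=3.8e6)   # a "typical" level in Table I
```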

Table I. ML-EM versus windowed FBP computer simulations

Total Photon Counts in the Projections | ML-EM (Image from One Realization) | Windowed FBP (Image from One Realization) | LSE(FBP)/LSE(ML)
Noiseless (no random noise is added; projections are calculated analytically) | Iterations = 53 | k = 1 × 10^8, g = 1 | 0.029/0.169 = 0.17; FBP is better
380,000,000 (a typical value) | Iterations = 51 | k = 1 × 10^8, g = 1 | 3.815 × 10^6/1.722 × 10^7 = 0.22 (over 100 noise realizations); FBP is better
38,000,000 (a typical value) | Iterations = 45 | k = 1808, g = 1 | 1.019 × 10^5/2.011 × 10^5 = 0.51 (over 100 noise realizations); FBP is better
3,800,000 (a typical value) | Iterations = 28 | k = 195, g = 1 | 3.306 × 10^3/3.799 × 10^3 = 0.87 (over 100 noise realizations); FBP is better
380,000 (a typical value) | Iterations = 14 | k = 83, g = 3 | 87.141/104.696 = 0.83 (over 100 noise realizations); FBP is better
38,000 (a typical value) | Iterations = 6 | k = 37, g = 14 | 1.755/2.686 = 0.65 (over 100 noise realizations); FBP is better
3800 (a typical value) | Iterations = 3 | k = 24, g = 61 | 0.029/0.055 = 0.53 (over 100 noise realizations); FBP is better

No modification is made to the conventional ML-EM algorithm that models the Poisson noise. The number of iterations is chosen so that the reconstructed image is closest to the true image, that is, so that the pixel-by-pixel squared error

\text{Squared error} = \sum_{k \in \text{all pixels}} \left[ \text{Reconstruction}(k) - \text{True}(k) \right]^{2} \qquad (1)

reaches a minimum.

The windowed FBP algorithm is almost the same as the standard FBP algorithm, except that the ramp filter is regulated by a newly proposed window function. Using window functions in an FBP algorithm is not new. One popular window function is the rectangular function suggested by Ramachandran and Lakshminarayanan (1971). Another popular window, smoother than the rectangular function, was proposed by Shepp and Logan (1974). Many other lowpass filters, such as the Hamming window (Hamming, 1977), the Hann window (which is the raised cosine function) (Chesler and Riederer, 1975), the Butterworth window (Butterworth, 1930), and so on, are also widely utilized. Our proposed window function is designed and inspired by the iterative Landweber algorithm (Strand, 1974). A symbolic derivation of this window function is given in the Appendix and in Zeng (2012a). The proposed window function is expressed in the Fourier domain as

\text{Window}(\omega) = 1 - \left( 1 - \frac{a\,(0.5 + 0.5\cos\omega)^{g}}{|\omega|} \right)^{k}, \qquad (2)

where ω is the frequency variable, and g and k are parameters selected by the user. In Eq. (2), a > 0 is a small constant that corresponds to the step size in the iterative Landweber algorithm, k corresponds to the number of iterations in the Landweber algorithm, and g is the number of times the raised-cosine lowpass filter 0.5 + 0.5 cos(ω) is applied. In the computer simulations, the discrete-signal frequency ω is in the range (−π, π]. If the data in the discrete Fourier domain have M samples, the smallest (i.e., closest to the direct-current (DC)) nonzero frequency is at 2π/M. Thus, a can be chosen in the interval 0 < a ≤ 2π/M to ensure that a(0.5 + 0.5 cos ω)^g/|ω| ≤ 1 when ω ≠ 0. This requirement guarantees that the window function of Eq. (2) is always non-negative. In our computer simulations, a = π/M = π/128 is adopted. When ω = 0 (DC), the window function is not applied, because at this DC frequency the ramp filter |ω| = 0.

Figure 1. Window functions used in the computer simulations. A: g = 1, k = 1 × 10^8; B: g = 1, k = 1808; C: g = 1, k = 195; D: g = 3, k = 83; E: g = 14, k = 37; and F: g = 61, k = 24.

In this article, the 1D filtering step in the windowed FBP algorithm is implemented in the Fourier domain. The ML-EM algorithm has a non-negativity constraint, whereas the FBP algorithm does not. In order to have a fair comparison, after the image is reconstructed with the windowed FBP algorithm, all negative-valued pixels are set to zero. In fact, there exist better ways to process the negative reconstruction values, as suggested by O'Sullivan et al. (1993). The resulting image is scaled by a constant, so that the total sum of the forward projections of the image is the same as the total sum of the projection photon counts.
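As a concrete illustration, the sketch below implements the window of Eq. (2) and the Fourier-domain filtering step in Python. It reflects my reading of the text rather than the author's implementation: scikit-image's iradon with filtering disabled stands in for the backprojector, the final count-matching rescaling is omitted, the helper names windowed_ramp and windowed_fbp are mine, and noisy_sinogram and theta come from the data-generation sketch above.

```python
import numpy as np
from skimage.transform import iradon

def windowed_ramp(M, k, g):
    """Ramp filter |w| multiplied by the window of Eq. (2), on an M-point DFT grid."""
    w = 2.0 * np.pi * np.fft.fftfreq(M)            # discrete frequencies in [-pi, pi)
    a = np.pi / M                                   # step-size constant used in the paper
    ramp = np.abs(w)
    window = np.zeros(M)
    nz = ramp > 0                                   # window not applied at DC (ramp = 0 there)
    window[nz] = 1.0 - (1.0 - a * (0.5 + 0.5 * np.cos(w[nz])) ** g / ramp[nz]) ** k
    return ramp * window

def windowed_fbp(sino, theta, k, g):
    nbins = sino.shape[0]
    H = windowed_ramp(nbins, k, g)[:, None]         # filter each view (column) identically
    filtered = np.real(np.fft.ifft(np.fft.fft(sino, axis=0) * H, axis=0))
    recon = iradon(filtered, theta=theta, filter_name=None, circle=True)  # plain backprojection
    recon[recon < 0] = 0.0                          # non-negativity, as described in the text
    return recon

# Example: parameters reported for the ~3.8e6-count level in Table I
recon_windowed = windowed_fbp(noisy_sinogram, theta, k=195, g=1)
```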
For a given set of projections, the parameters k and g are chosen such that the reconstructed image and the true image have the minimum squared error as defined in Eq. (1).

III. RESULTS

In Table I, the optimal ML-EM images and the optimal windowed FBP images are compared. By optimal we mean that the images reach the least squared difference from the true image, that is, the LSE. The LSE listed in Table I is the average over 100 noise realizations. In Table I, if the ratio LSE(FBP)/LSE(ML) is less than 1, then the FBP image has a smaller LSE than the ML-EM image.
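To make the selection of the "optimal" ML-EM image concrete, the following sketch runs a conventional ML-EM reconstruction and records the squared error of Eq. (1) at each iteration, keeping the iteration number that minimizes it. This is an illustrative approximation, not the author's code: scikit-image's radon and unfiltered iradon serve as an approximately matched projector/backprojector pair, the backproject helper is mine, and phantom, theta, and noisy_sinogram come from the earlier data-generation sketch.

```python
import numpy as np
from skimage.transform import radon, iradon

def backproject(sino, theta):
    # unfiltered backprojection, used here as an approximate adjoint of radon
    return iradon(sino, theta=theta, filter_name=None, circle=True)

# start from a uniform disk so that radon's circle=True geometry is respected
yy, xx = np.mgrid[:128, :128]
x = (((xx - 63.5) ** 2 + (yy - 63.5) ** 2) <= 63.5 ** 2).astype(float)

# sensitivity image: backprojection of a sinogram of ones (its constant factor cancels below)
sens = backproject(np.ones_like(noisy_sinogram), theta)

errors = []
for it in range(1, 61):                                    # try up to 60 iterations
    proj = radon(x, theta=theta, circle=True)
    ratio = noisy_sinogram / np.maximum(proj, 1e-12)
    x = x * backproject(ratio, theta) / np.maximum(sens, 1e-12)
    errors.append(np.sum((x - phantom) ** 2))              # Eq. (1) against the known phantom

best_iteration = int(np.argmin(errors)) + 1                # the LSE-minimizing stopping point
```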

Table II. ML-EM versus regular FBP computer simulations

Total Photon Counts in the Projections | ML-EM (Image from One Realization) | Regular FBP (Image from One Realization) | LSE(FBP)/LSE(ML)
Noiseless (no random noise is added; projections are calculated analytically) | Iterations = 53 | - | 0.0293/0.169 = 0.172; FBP is better
380,000,000 (a typical value) | Iterations = 51 | - | 3.828 × 10^6/1.722 × 10^7 = 0.22 (over 100 noise realizations); FBP is better
38,000,000 (a typical value) | Iterations = 45 | - | 1.234 × 10^5/2.011 × 10^5 = 0.61 (over 100 noise realizations); FBP is better
3,800,000 (a typical value) | Iterations = 28 | - | 1.010 × 10^4/3.799 × 10^3 = 2.66 (over 100 noise realizations); ML-EM is better
380,000 (a typical value) | Iterations = 14 | - | 6.430 × 10^2/1.047 × 10^2 = 6.14 (over 100 noise realizations); ML-EM is better
38,000 (a typical value) | Iterations = 6 | - | 14.725/2.686 = 5.48 (over 100 noise realizations); ML-EM is better
3800 (a typical value) | Iterations = 3 | - | 0.193/0.055 = 3.50 (over 100 noise realizations); ML-EM is better

Table I shows consistently that the FBP images are superior to the ML-EM images for the different noise levels. However, we do not have a mathematical proof that, for any projection data set, a set of parameters k and g can always be determined such that the associated windowed FBP algorithm will outperform the ML-EM algorithm. One cannot draw definite conclusions just from examples; in other words, we still cannot say that the windowed FBP algorithm is always better than the ML-EM algorithm.

Figure 1 shows some examples of the frequency-domain window functions used in the computer simulations reported in Table I. The trend is for the window function to suppress the high-frequency components more and more as the projection data noise gets larger (i.e., as the counts become lower). It seems that every noise level corresponds to an optimal shape of the window function, but we have not yet established a mathematical relation between them.

Finally, Table II presents the same comparison as Table I, except that the windowed FBP algorithm is replaced by the regular FBP algorithm, which in fact uses the rectangular window function (Ramachandran and Lakshminarayanan, 1971). When using very noisy Poisson projections, the ML-EM images outperform the regular FBP images. Tables I and II imply that optimization of the window function parameters does make a difference.

IV. DISCUSSION AND CONCLUSIONS

In this article, we suggest a new window function to be used with the traditional FBP algorithm. The purpose of the window function is to emulate a linear iterative algorithm, characterized by the parameters k and g. By carefully choosing these two parameters, the windowed FBP can outperform the ML-EM algorithm under the LSE criterion. We understand that using the LSE as the figure of merit (FOM) is problematic in practice and is not very meaningful for many tasks (Barrett and Myers, 2004). This article merely presents a mathematical case: if the LSE is chosen as the FOM, the ML-EM algorithm may not always give the optimal solution, even with Poisson data. If a different criterion is used, the windowed FBP algorithm may not perform as well as the ML-EM algorithm. A potential application of the windowed FBP algorithm is in quantitative imaging, for example, in dynamic studies with compartment modeling.

Why is a new window function needed? Why not use one of the many existing window functions (e.g., the Hann window or the Butterworth window)? Unlike the many existing window functions that are specified by cut-off frequencies, the new window function is closely related to the iterative Landweber algorithm's iteration number, which, in turn, is related to many other iterative algorithms, including the ML-EM algorithm, if an appropriate noise model is incorporated. For each noise level in the phantom study, the best iteration number of the ML-EM algorithm should be chosen, and the best parameters k and g of the new FBP window function should be determined.
One Hann window or one Butterworth window (which may be approximated by one set of k and g) cannot be the optimal function for all situations.

In the computer simulations, we observe that the FBP results are much better than the ML-EM results when there is no noise or when the projection data's count rate is extremely high. This is because the ML-EM algorithm uses nonoverlapping, piecewise-constant, square pixels to represent the image, and the curved edges in the object cannot be accurately represented. These image-misrepresentation errors are amplified as the iteration number increases.

The optimal parameters in both algorithms depend not only on the noise level but also on the object. A practical open problem associated with the windowed FBP algorithm is the difficulty of determining the optimal values of k and g for a given noisy projection data set, because in an actual patient or phantom scan the true image is never available.

The iterative ML-EM algorithm is an excellent and widely used method for image reconstruction, especially in nuclear medicine imaging. Its projection and backprojection matrices can readily model the imaging geometry, system blurring, photon attenuation and scattering, and other effects. However, when the counts are low, it may provide a biased reconstruction (Carson et al., 1994). Also, different frequency components have different convergence rates; thus, when the iterative algorithm is stopped early, the reconstructed image may have a nonuniform point spread function (Zeng, 2010).

On the other hand, the FBP algorithm is a linear method and does not introduce image bias. As pointed out by Hanson (1980), the FBP algorithm is an optimal estimator for the task of determining an object's amplitude (bias). Its point spread function is shift-invariant throughout the entire image.

No scattering, attenuation, blurring, or motion is simulated in the computer simulations; Poisson noise is the only factor that degrades the image quality. In the real world, when scattering, attenuation, and blurring effects must be compensated, the projector in the ML-EM algorithm is able to model these effects, whereas it is challenging for the FBP algorithm to do so. In this case, the ML-EM algorithm will outperform the FBP algorithm.

An iterative Bayesian algorithm uses prior information to specify a desired and useful solution by adding an extra Bayesian term to the objective function. Our recent Medical Physics paper (Zeng, 2012b) developed a Bayesian-FBP algorithm by adding a Bayesian term βh(ω) to the ramp filter in a special way; that is, the frequency-domain ramp filter |ω| becomes 1/[1/|ω| + βh(ω)], where β is a small weighting factor for the Bayesian contribution. This Bayesian-FBP algorithm can give the same result as an iterative Bayesian algorithm. Although no noise weighting is used in the new FBP algorithm of this article, noise weighting can be incorporated into the window function, and the new windowed FBP algorithm can be significantly improved (Zeng, 2012b).
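The modified ramp filter mentioned above can be written down directly; the short Python sketch below does so. The penalty response h(ω) is not specified in this article, so a quadratic highpass h(ω) = ω² is used purely as an illustrative assumption, and bayesian_ramp is a hypothetical helper name rather than anything from Zeng (2012b).

```python
import numpy as np

def bayesian_ramp(M, beta, h=lambda w: w ** 2):
    """Frequency response 1 / (1/|w| + beta*h(w)) on an M-point DFT grid;
    beta = 0 recovers the ordinary ramp filter |w|."""
    w = 2.0 * np.pi * np.fft.fftfreq(M)
    ramp = np.abs(w)
    response = np.zeros(M)
    nz = ramp > 0                      # leave the DC component at zero, as the ramp filter does
    response[nz] = 1.0 / (1.0 / ramp[nz] + beta * h(w[nz]))
    return response
```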

The current version of the windowed FBP algorithm cannot handle data attenuation, scattering, system blurring, or irregular imaging/sampling geometries. An FBP algorithm that can compensate for nonuniform attenuation in single photon emission computed tomography is available (Novikov, 2002). It is likely that this FBP algorithm with attenuation-correction capability can also be modified with some window function so that its noise performance is improved. These open problems are left for further investigation.

ACKNOWLEDGMENTS

The author thanks Dr. Roy Rowley of the University of Utah for English editing.

APPENDIX

The proposed window function is motivated and inspired by the iterative Landweber algorithm, which solves a system of linear equations AX = P by the iterative procedure

X^{(k+1)} = X^{(k)} + a A^{T} S \left( P - A X^{(k)} \right), \qquad (A1)

where A is the projection matrix, A^{T} is the backprojection matrix, P is the projection data vector, X is the image vector, X^{(k)} is the estimated image at the k-th iteration, and a > 0 is the step size. In Eq. (A1), S is a lowpass filter; this lowpass filter is new and does not appear in the original Landweber algorithm. The recursion can be rewritten as a nonrecursive expression:

X^{(k+1)} = X^{(k)} + a A^{T} S (P - A X^{(k)})
          = a A^{T} S P + (I - a A^{T} S A) X^{(k)} \qquad (A2)
          = a A^{T} S P + (I - a A^{T} S A) \left[ a A^{T} S P + (I - a A^{T} S A) X^{(k-1)} \right]
          = a A^{T} S P + (I - a A^{T} S A) a A^{T} S P + (I - a A^{T} S A)^{2} X^{(k-1)}
          = a A^{T} S P + (I - a A^{T} S A) a A^{T} S P + (I - a A^{T} S A)^{2} \left[ a A^{T} S P + (I - a A^{T} S A) X^{(k-2)} \right]
          = a A^{T} S P + (I - a A^{T} S A) a A^{T} S P + (I - a A^{T} S A)^{2} a A^{T} S P + (I - a A^{T} S A)^{3} X^{(k-2)}
          = \cdots = \left[ I + (I - a A^{T} S A) + \cdots + (I - a A^{T} S A)^{k} \right] a A^{T} S P. \qquad (A3)

Thus, Eq. (A3) becomes

X^{(k)} = A^{T} a \left[ \sum_{n=0}^{k-1} (I - a S A^{T} A)^{n} \right] S P. \qquad (A4)

Notice that when a > 0 is small enough, as k \to \infty we have

a \left[ \sum_{n=0}^{k-1} (I - a S A^{T} A)^{n} \right] \to (S A^{T} A)^{-1} = (A^{T} A)^{-1} S^{-1}, \qquad (A5)

and

X^{(k)} \to A^{T} (A^{T} A)^{-1} S^{-1} S P = A^{T} (A^{T} A)^{-1} P. \qquad (A6)

From Eq. (A6) we observe that, as the iteration number tends to infinity, the Landweber algorithm resembles an FBP algorithm, with (A^{T} A)^{-1} playing the role of the ramp filter. This observation motivated us to design a window function for the FBP algorithm, making the windowed FBP act like an iterative algorithm, so that the performance of the FBP algorithm may be similar to that of an iterative algorithm.

For a finite k, we have

a \left[ \sum_{n=0}^{k-1} (I - a S A^{T} A)^{n} \right] S = (A^{T} A)^{-1} S^{-1} \left[ I - (I - a S A^{T} A)^{k} \right] S. \qquad (A7)

The proofs of the above relations are available in the review paper by Schafer et al. (1981).

In the Fourier domain, (A^{T} A)^{-1} can be approximated by the ramp filter |ω|, the effects of S^{-1} and S cancel each other, and \left[ I - (I - a S A^{T} A)^{k} \right] can be approximated by the window function

W_{k}(\omega) = 1 - \left\{ 1 - \frac{a \left[ 0.5 + 0.5 \cos(\omega) \right]^{g}}{|\omega|} \right\}^{k}. \qquad (A8)

In Eq. (A7), the multiple-raised-cosine lowpass filter [0.5 + 0.5 cos(ω)]^g is chosen to implement the lowpass filter S, where g is a non-negative real number. When g = 1, 0.5 + 0.5 cos(ω) is the Hann filter (Chesler and Riederer, 1975). In fact, any lowpass filter can be used to implement S.

REFERENCES

H.H. Barrett and K. Myers, Foundations of image science, Wiley, Hoboken, NJ, 2004.

S. Butterworth, On the theory of filter amplifiers, Exp Wireless Wireless Eng 7 (1930), 536–541.
