Long-Memory Time Series : Theory and Methods
Author(s): Palma, Wilfredo
ISBN No.: 9780470114025
Pages: 304
Year: 2007
Format: Trade Cloth (Hard Cover)
Price: $ 230.39
Dispatch delay: Dispatched within 7 to 15 days
Status: Available

Contents

Preface xiii
Acronyms xvii

1 Stationary Processes 1
1.1 Fundamental Concepts 2
1.1.1 Stationarity 4
1.1.2 Singularity and Regularity 5
1.1.3 Wold Decomposition Theorem 5
1.1.4 Causality 7
1.1.5 Invertibility 7
1.1.6 Best Linear Predictor 8
1.1.7 Szego-Kolmogorov Formula 8
1.1.8 Ergodicity 9
1.1.9 Martingales 11
1.1.10 Cumulants 12
1.1.11 Fractional Brownian Motion 12
1.1.12 Wavelets 14
1.2 Bibliographic Notes 15
Problems 16

2 State Space Systems 21
2.1 Introduction 22
2.1.1 Stability 22
2.1.2 Hankel Operator 22
2.1.3 Observability 23
2.1.4 Controllability 23
2.1.5 Minimality 24
2.2 Representations of Linear Processes 24
2.2.1 State Space Form to Wold Decomposition 24
2.2.2 Wold Decomposition to State Space Form 25
2.2.3 Hankel Operator to State Space Form 25
2.3 Estimation of the State 26
2.3.1 State Predictor 27
2.3.2 State Filter 27
2.3.3 State Smoother 27
2.3.4 Missing Observations 28
2.3.5 Steady State System 28
2.3.6 Prediction of Future Observations 30
2.4 Extensions 32
2.5 Bibliographic Notes 32
Problems 33

3 Long-Memory Processes 39
3.1 Defining Long Memory 40
3.1.1 Alternative Definitions 41
3.1.2 Extensions 43
3.2 ARFIMA Processes 43
3.2.1 Stationarity, Causality, and Invertibility 44
3.2.2 Infinite AR and MA Expansions 46
3.2.3 Spectral Density 47
3.2.4 Autocovariance Function 47
3.2.5 Sample Mean 48
3.2.6 Partial Autocorrelations 49
3.2.7 Illustrations 49
3.2.8 Approximation of Long-Memory Processes 55
3.3 Fractional Gaussian Noise 56
3.3.1 Sample Mean 56
3.4 Technical Lemmas 57
3.5 Bibliographic Notes 58
Problems 59

4 Estimation Methods 65
4.1 Maximum-Likelihood Estimation 66
4.1.1 Cholesky Decomposition Method 66
4.1.2 Durbin-Levinson Algorithm 66
4.1.3 Computation of Autocovariances 67
4.1.4 State Space Approach 69
4.2 Autoregressive Approximations 71
4.2.1 Haslett-Raftery Method 72
4.2.2 Beran Approach 73
4.2.3 A State Space Method 74
4.3 Moving-Average Approximation 75
4.4 Whittle Estimation 78
4.4.1 Other Versions 80
4.4.2 Non-Gaussian Data 80
4.4.3 Semiparametric Methods 81
4.5 Other Methods 81
4.5.1 A Regression Method 82
4.5.2 Rescaled Range Method 83
4.5.3 Variance Plots 85
4.5.4 Detrended Fluctuation Analysis 87
4.5.5 A Wavelet-Based Method 91
4.6 Numerical Experiments 92
4.7 Bibliographic Notes 93
Problems 94

5 Asymptotic Theory 97
5.1 Notation and Definitions 98
5.2 Theorems 99
5.2.1 Consistency 99
5.2.2 Central Limit Theorem 101
5.2.3 Efficiency 104
5.3 Examples 104
5.4 Illustration 108
5.5 Technical Lemmas 109
5.6 Bibliographic Notes 109
Problems 109

6 Heteroskedastic Models 115
6.1 Introduction 116
6.2 ARFIMA-GARCH Model 117
6.2.1 Estimation 119
6.3 Other Models 119
6.3.1 Estimation 121
6.4 Stochastic Volatility 121
6.4.1 Estimation 122
6.5 Numerical Experiments 122
6.6 Application 123
6.6.1 Model without Leverage 123
6.6.2 Model with Leverage 124
6.6.3 Model Comparison 124
6.7 Bibliographic Notes 125
Problems 126

7 Transformations 131
7.1 Transformation of Gaussian Processes 132
7.2 Autocorrelation of Squares 134
7.3 Asymptotic Behavior 136
7.4 Illustrations 138
7.5 Bibliographic Notes 142
Problems 143

8 Bayesian Methods 147
8.1 Bayesian Modeling 148
8.2 Markov Chain Monte Carlo Methods 149
8.2.1 Metropolis-Hastings Algorithm 149
8.2.2 Gibbs Sampler 150
8.2.3 Overdispersed Distributions 152
8.3 Monitoring Convergence 153
8.4 A Simulated Example 155
8.5 Data Application 158
8.6 Bibliographic Notes 162
Problems 162

9 Prediction 167
9.1 One-Step Ahead Predictors 168
9.1.1 Infinite Past 168
9.1.2 Finite Past 168
9.1.3 An Approximate Predictor 172
9.2 Multistep Ahead Predictors 173
9.2.1 Infinite Past 173
9.2.2 Finite Past 174
9.3 Heteroskedastic Models 175
9.3.1 Prediction of Volatility 176
9.4 Illustration 178
9.5 Rational Approximations 180
9.5.1 Illustration 182
9.6 Bibliographic Notes
Problems 184

10 Regression 187
10.1 Linear Regression Model 188
10.1.1 Grenander Conditions 188
10.2 Properties of the LSE 191
10.2.1 Consistency 192
10.2.2 Asymptotic Variance 193
10.2.3 Asymptotic Normality 193
10.3 Properties of the BLUE 194
10.3.1 Efficiency of the LSE Relative to the BLUE 195
10.4 Estimation of the Mean 198
10.4.1 Consistency 198
10.4.2 Asymptotic Variance 199
10.4.3 Normality 200
10.4.4 Relative Efficiency 200
10.5 Polynomial Trend 202
10.5.1 Consistency 203
10.5.2 Asymptotic Variance 203
10.5.3 Normality 204
10.5.4 Relative Efficiency 204
10.6 Harmonic Regression 205
10.6.1 Consistency 205
10.6.2 Asymptotic Variance 205
10.6.3 Normality 205
10.6.4 Efficiency 206
10.7 Illustration: Air Pollution Data 207
10.8 Bibliographic Notes 210
Problems 211

11 Missing Data 215
11.1 Motivation 216
11.2 Likelihood Function with Incomplete Data 217
11.2.1 Integration 217
11.2.2 Maximization 218
11.2.3 Calculation of the Likelihood Function 219
11.2.4 Kalman Filter with Missing Observations 219
11.3 Effects of Missing Values on ML Estimates 221
11.3.1 Monte Carlo Experiments 222
11.4 Effects of Missing Values on Prediction 223
11.5 Illustrations 227
11.6 Interpolation of Missing Data 229
11.6.1 Bayesian Imputation 234
11.6.2 A Simulated Example 235
11.7 Bibliographic Notes 239
Problems 239

12 Seasonality 245
12.1 A Long-Memory Seasonal Model 246
12.2 Calculation of the Asymptotic Variance 250
12.3 Autocovariance Function 252
12.4 Monte Carlo Studies 254
12.5 Illustration 258
12.6 Bibliographic Notes 260
Problems 261

References 265
Topic Index 279
Author Index 283
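The defining feature of the ARFIMA processes covered in Chapter 3 is an autocorrelation function that decays hyperbolically rather than exponentially. As a hedged illustration of that property (this sketch is not code from the book, and the helper names `arfima_psi` and `arfima_acf` are hypothetical), the ARFIMA(0, d, 0) weights and autocorrelations can be computed from the standard closed forms:

```python
# Illustrative sketch, not from the book. The ARFIMA(0, d, 0) process
# (1 - B)^d X_t = eps_t has MA(infinity) weights
#   psi_j = Gamma(j + d) / (Gamma(j + 1) * Gamma(d)),
# and its autocorrelations decay hyperbolically, rho(k) ~ C * k^(2d - 1),
# which is the long-memory property defined in Chapter 3.
from math import gamma

def arfima_psi(d, n):
    """MA(infinity) weights psi_0..psi_{n-1} via the recursion
    psi_j = psi_{j-1} * (j - 1 + d) / j, with psi_0 = 1."""
    psi = [1.0]
    for j in range(1, n):
        psi.append(psi[-1] * (j - 1 + d) / j)
    return psi

def arfima_acf(d, n):
    """Autocorrelations rho(0)..rho(n-1) of ARFIMA(0, d, 0) via
    rho(k) = rho(k-1) * (k - 1 + d) / (k - d), with rho(0) = 1."""
    rho = [1.0]
    for k in range(1, n):
        rho.append(rho[-1] * (k - 1 + d) / (k - d))
    return rho

if __name__ == "__main__":
    d = 0.3
    # Cross-check the recursion against the closed form at j = 3.
    psi = arfima_psi(d, 5)
    closed = gamma(3 + d) / (gamma(4.0) * gamma(d))
    print(abs(psi[3] - closed) < 1e-12)
    # Hyperbolic decay: rho(2k) / rho(k) -> 2^(2d - 1) for large k.
    rho = arfima_acf(d, 2001)
    print(abs(rho[2000] / rho[1000] - 2 ** (2 * d - 1)) < 0.01)
```

For 0 < d < 1/2 the autocorrelations are not summable, which is one of the equivalent definitions of long memory discussed in Section 3.1.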

