Time Series Analysis: Forecasting and Control (English Edition, 3rd Edition)

  • Authors: George E. P. Box (USA), Gwilym M. Jenkins (UK), Gregory C. Reinsel (USA)
  • Product details:
  • Series: Turing Original Mathematics and Statistics Series (图灵原版数学·统计学系列)
  • ISBN: 9787115137722
  • Publisher: Posts & Telecom Press (人民邮电出版社)
  • Publication date: 2005-09
  • Printing date: 2005-09-01
  • Edition: 1
  • Pages: 598
  • Paper: offset
  • Binding: paperback

About the Book

Since its first publication in 1970, this book has been repeatedly revised and reissued, and its classic, authoritative treatment has made it the standard reference on time series analysis. It covers the building of (statistical) time series models and their use in many important application areas, including forecasting; model specification, estimation, identification, and diagnostic checking; identification, fitting, and checking of transfer function models of dynamic relationships; modeling the effects of intervention events; and process control. The exposition is concise, emphasizes practical techniques, and is illustrated with many examples.

This book can serve as a textbook for advanced undergraduate or graduate students in statistics and related fields, and as a reference for practicing statisticians.

Editorial Review

Time series analysis is a highly practical and rapidly developing data-analysis discipline, now widely applied in industrial quality control, biological and genetic engineering, financial data analysis, and many other fields. None of this development can be discussed without mentioning G. E. P. Box and G. M. Jenkins and their book Time Series Analysis: Forecasting and Control, first published in 1970. In recognition of their great contributions to time series analysis, the ARIMA models introduced in this book are widely known as Box-Jenkins models.

In this classic work on time series analysis, these master statisticians use plain language and a wealth of examples to convey the essence of the subject clearly and intuitively, sparing the reader long chains of formula derivations while quickly imparting practical techniques and a deep, intuitive understanding. Every reader who studies this book carefully will benefit greatly.
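The Box-Jenkins cycle praised above, identifying a tentative model from sample autocorrelations and then estimating its parameters, can be illustrated on the simplest case. The sketch below is not from the book; it is a minimal plain-Python example (all names are ours) that simulates an AR(1) process and recovers its coefficient from the lag-1 sample autocorrelation, the one-parameter Yule-Walker estimate.

```python
import random

# Minimal sketch (not from the book): Box-Jenkins identification/estimation
# on its simplest case, an AR(1) process z_t = phi * z_{t-1} + a_t,
# where a_t is Gaussian white noise.

random.seed(42)
phi_true = 0.7
n = 5000

# Simulate the series.
z = [0.0]
for _ in range(n - 1):
    z.append(phi_true * z[-1] + random.gauss(0.0, 1.0))

# Sample autocovariances at lags 0 and 1.
mean = sum(z) / n
c0 = sum((x - mean) ** 2 for x in z) / n
c1 = sum((z[t] - mean) * (z[t + 1] - mean) for t in range(n - 1)) / n

# For an AR(1) process, rho_1 = phi, so the lag-1 sample
# autocorrelation is the Yule-Walker estimate of phi.
phi_hat = c1 / c0
print(f"estimated phi: {phi_hat:.2f} (true value {phi_true})")
```

With a longer simulated series the estimate tightens around the true value; in practice one would also inspect the partial autocorrelation function and residual diagnostics, as the book's identification and estimation chapters describe.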

About the Authors

George E. P. Box is a world-renowned statistician. He founded the Department of Statistics at the University of Wisconsin in 1960 and served as its chair; he is now an emeritus professor there. Box has published more than 200 papers and many important books, among which this book and Statistics for Experimenters are his best-known works. Gwilym M. Jenkins was a world-renowned statistician, now deceased. In 1966 he founded the UK…

Table of Contents

1 INTRODUCTION 1

1.1 Four Important Practical Problems 2

1.1.1 Forecasting Time Series 2

1.1.2 Estimation of Transfer Functions 3

1.1.3 Analysis of Effects of Unusual Intervention Events to a System 4

1.1.4 Discrete Control Systems 5

1.2 Stochastic and Deterministic Dynamic Mathematical Models 7

1.2.1 Stationary and Nonstationary Stochastic Models for Forecasting and Control 7

1.2.2 Transfer Function Models 12

1.2.3 Models for Discrete Control Systems 14

1.3 Basic Ideas in Model Building 16

1.3.1 Parsimony 16

1.3.2 Iterative Stages in the Selection of a Model 16

Part I Stochastic Models and Their Forecasting 19

2 AUTOCORRELATION FUNCTION AND SPECTRUM OF STATIONARY PROCESSES 21

2.1 Autocorrelation Properties of Stationary Models 21

2.1.1 Time Series and Stochastic Processes 21

2.1.2 Stationary Stochastic Processes 23

2.1.3 Positive Definiteness and the Autocovariance Matrix 26

2.1.4 Autocovariance and Autocorrelation Functions 29

2.1.5 Estimation of Autocovariance and Autocorrelation Functions 30

2.1.6 Standard Error of Autocorrelation Estimates 32

2.2 Spectral Properties of Stationary Models 35

2.2.1 Periodogram of a Time Series 35

2.2.2 Analysis of Variance 36

2.2.3 Spectrum and Spectral Density Function 37

2.2.4 Simple Examples of Autocorrelation and Spectral Density Functions 41

2.2.5 Advantages and Disadvantages of the Autocorrelation and Spectral Density Functions 43

A2.1 Link Between the Sample Spectrum and Autocovariance Function Estimate 44

3 LINEAR STATIONARY MODELS 46

3.1 General Linear Process 46

3.1.1 Two Equivalent Forms for the Linear Process 46

3.1.2 Autocovariance Generating Function of a Linear Process 49

3.1.3 Stationarity and Invertibility Conditions for a Linear Process 50

3.1.4 Autoregressive and Moving Average Processes 52

3.2 Autoregressive Processes 54

3.2.1 Stationarity Conditions for Autoregressive Processes 54

3.2.2 Autocorrelation Function and Spectrum of Autoregressive Processes 55

3.2.3 First-Order Autoregressive (Markov) Process 58

3.2.4 Second-Order Autoregressive Process 60

3.2.5 Partial Autocorrelation Function 64

3.2.6 Estimation of the Partial Autocorrelation Function 67

3.2.7 Standard Errors of Partial Autocorrelation Estimates 68

3.3 Moving Average Processes 69

3.3.1 Invertibility Conditions for Moving Average Processes 69

3.3.2 Autocorrelation Function and Spectrum of Moving Average Processes 70

3.3.3 First-Order Moving Average Process 72

3.3.4 Second-Order Moving Average Process 73

3.3.5 Duality Between Autoregressive and Moving Average Processes 75

3.4 Mixed Autoregressive-Moving Average Processes 77

3.4.1 Stationarity and Invertibility Properties 77

3.4.2 Autocorrelation Function and Spectrum of Mixed Processes 78

3.4.3 First-Order Autoregressive-First-Order Moving Average Process 80

3.4.4 Summary 83

A3.1 Autocovariances, Autocovariance Generating Function, and Stationarity Conditions for a General Linear Process 85

A3.2 Recursive Method for Calculating Estimates of Autoregressive Parameters 87

4 LINEAR NONSTATIONARY MODELS 89

4.1 Autoregressive Integrated Moving Average Processes 89

4.1.1 Nonstationary First-Order Autoregressive Process 89

4.1.2 General Model for a Nonstationary Process Exhibiting Homogeneity 92

4.1.3 General Form of the Autoregressive Integrated Moving Average Process 96

4.2 Three Explicit Forms for the Autoregressive Integrated Moving Average Model 99

4.2.1 Difference Equation Form of the Model 99

4.2.2 Random Shock Form of the Model 100

4.2.3 Inverted Form of the Model 106

4.3 Integrated Moving Average Processes 109

4.3.1 Integrated Moving Average Process of Order (0,1,1) 110

4.3.2 Integrated Moving Average Process of Order (0,2,2) 114

4.3.3 General Integrated Moving Average Process of Order (0,d,q) 118

A4.1 Linear Difference Equations 120

A4.2 IMA(0,1,1) Process With Deterministic Drift 125

A4.3 ARIMA Processes With Added Noise 126

A4.3.1 Sum of Two Independent Moving Average Processes 126

A4.3.2 Effect of Added Noise on the General Model 127

A4.3.3 Example for an IMA(0,1,1) Process with Added White Noise 128

A4.3.4 Relation Between the IMA(0,1,1) Process and a Random Walk 129

A4.3.5 Autocovariance Function of the General Model with Added Correlated Noise 129

5 FORECASTING 131

5.1 Minimum Mean Square Error Forecasts and Their Properties 131

5.1.1 Derivation of the Minimum Mean Square Error Forecasts 133

5.1.2 Three Basic Forms for the Forecast 135

5.2 Calculating and Updating Forecasts 139

5.2.1 Convenient Format for the Forecasts 139

5.2.2 Calculation of the ψ Weights 139

5.2.3 Use of the ψ Weights in Updating the Forecasts 141

5.2.4 Calculation of the Probability Limits of the Forecasts at Any Lead Time 142

5.3 Forecast Function and Forecast Weights 145

5.3.1 Eventual Forecast Function Determined by the Autoregressive Operator 146

5.3.2 Role of the Moving Average Operator in Fixing the Initial Values 147

5.3.3 Lead l Forecast Weights 148

5.4 Examples of Forecast Functions and Their Updating 151

5.4.1 Forecasting an IMA(0,1,1) Process 151

5.4.2 Forecasting an IMA(0,2,2) Process 154

5.4.3 Forecasting a General IMA(0,d,q) Process 156

5.4.4 Forecasting Autoregressive Processes 157

5.4.5 Forecasting a (1,0,1) Process 160

5.4.6 Forecasting a (1,1,1) Process 162

5.5 Use of State Space Model Formulation for Exact Forecasting 163

5.5.1 State Space Model Representation for the ARIMA Process 163

5.5.2 Kalman Filtering Relations for Use in Prediction 164

5.6 Summary 166

A5.1 Correlations Between Forecast Errors 169

A5.1.1 Autocorrelation Function of Forecast Errors at Different Origins 169

A5.1.2 Correlation Between Forecast Errors at the Same Origin with Different Lead Times 170

A5.2 Forecast Weights for Any Lead Time 172

A5.3 Forecasting in Terms of the General Integrated Form 174

A5.3.1 General Method of Obtaining the Integrated Form 174

A5.3.2 Updating the General Integrated Form 176

A5.3.3 Comparison with the Discounted Least Squares Method 176

Part II Stochastic Model Building 181

6 MODEL IDENTIFICATION 183

6.1 Objectives of Identification 183

6.1.1 Stages in the Identification Procedure 184

6.2 Identification Techniques 184

6.2.1 Use of the Autocorrelation and Partial Autocorrelation Functions in Identification 184

6.2.2 Standard Errors for Estimated Autocorrelations and Partial Autocorrelations 188

6.2.3 Identification of Some Actual Time Series 188

6.2.4 Some Additional Model Identification Tools 197

6.3 Initial Estimates for the Parameters 202

6.3.1 Uniqueness of Estimates Obtained from the Autocovariance Function 202

6.3.2 Initial Estimates for Moving Average Processes 202

6.3.3 Initial Estimates for Autoregressive Processes 204

6.3.4 Initial Estimates for Mixed Autoregressive-Moving Average Processes 206

6.3.5 Choice Between Stationary and Nonstationary Models in Doubtful Cases 207

6.3.6 More Formal Tests for Unit Roots in ARIMA Models 208

6.3.7 Initial Estimate of Residual Variance 211

6.3.8 Approximate Standard Error for w̄ 212

6.4 Model Multiplicity 214

6.4.1 Multiplicity of Autoregressive-Moving Average Models 214

6.4.2 Multiple Moment Solutions for Moving Average Parameters 216

6.4.3 Use of the Backward Process to Determine Starting Values 218

A6.1 Expected Behavior of the Estimated Autocorrelation Function for a Nonstationary Process 218

A6.2 General Method for Obtaining Initial Estimates of the Parameters of a Mixed Autoregressive-Moving Average Process 220

7 MODEL ESTIMATION 224

7.1 Study of the Likelihood and Sum of Squares Functions 224

7.1.1 Likelihood Function 224

7.1.2 Conditional Likelihood for an ARIMA Process 226

7.1.3 Choice of Starting Values for Conditional Calculation 227

7.1.4 Unconditional Likelihood; Sum of Squares Function; Least Squares Estimates 228

7.1.5 General Procedure for Calculating the Unconditional Sum of Squares 233

7.1.6 Graphical Study of the Sum of Squares Function 238

7.1.7 Description of "Well-Behaved" Estimation Situations; Confidence Regions 241

7.2 Nonlinear Estimation 248

7.2.1 General Method of Approach 248

7.2.2 Numerical Estimates of the Derivatives 249

7.2.3 Direct Evaluation of the Derivatives 251

7.2.4 General Least Squares Algorithm for the Conditional Model 252

7.2.5 Summary of Models Fitted to Series A to F 255

7.2.6 Large-Sample Information Matrices and Covariance Estimates 256

7.3 Some Estimation Results for Specific Models 259

7.3.1 Autoregressive Processes 260

7.3.2 Moving Average Processes 262

7.3.3 Mixed Processes 262

7.3.4 Separation of Linear and Nonlinear Components in Estimation 263

7.3.5 Parameter Redundancy 264

7.4 Estimation Using Bayes' Theorem 267

7.4.1 Bayes' Theorem 267

7.4.2 Bayesian Estimation of Parameters 269

7.4.3 Autoregressive Processes 270

7.4.4 Moving Average Processes 272

7.4.5 Mixed Processes 274

7.5 Likelihood Function Based on the State Space Model 275

A7.1 Review of Normal Distribution Theory 279

A7.1.1 Partitioning of a Positive-Definite Quadratic Form 279

A7.1.2 Two Useful Integrals 280

A7.1.3 Normal Distribution 281

A7.1.4 Student's t-Distribution 283

A7.2 Review of Linear Least Squares Theory 286

A7.2.1 Normal Equations 286

A7.2.2 Estima
