<p><strong>Preface</strong></p>
<p><strong>1 Introduction to Probability 1</strong></p>
<p>1.1 Introduction: Why Study Probability? 1</p>
<p>1.2 The Different Kinds of Probability 2</p>
<p>Probability as Intuition 2</p>
<p>Probability as the Ratio of Favorable to Total Outcomes (Classical Theory) 3</p>
<p>Probability as a Measure of Frequency of Occurrence 4</p>
<p>Probability Based on an Axiomatic Theory 5</p>
<p>1.3 Misuses, Miscalculations, and Paradoxes in Probability 7</p>
<p>1.4 Sets, Fields, and Events 8</p>
<p>Examples of Sample Spaces 8</p>
<p>1.5 Axiomatic Definition of Probability 15</p>
<p>1.6 Joint, Conditional, and Total Probabilities; Independence 20</p>
<p>Compound Experiments 23</p>
<p>1.7 Bayes’ Theorem and Applications 35</p>
<p>1.8 Combinatorics 38</p>
<p>Occupancy Problems 42</p>
<p>Extensions and Applications 46</p>
<p>1.9 Bernoulli Trials–Binomial and Multinomial Probability Laws 48</p>
<p>Multinomial Probability Law 54</p>
<p>1.10 Asymptotic Behavior of the Binomial Law: The Poisson Law 57</p>
<p>1.11 Normal Approximation to the Binomial Law 63</p>
<p>Summary 65</p>
<p>Problems 66</p>
<p>References 77</p>
<p><strong>2 Random Variables 79</strong></p>
<p>2.1 Introduction 79</p>
<p>2.2 Definition of a Random Variable 80</p>
<p>2.3 Cumulative Distribution Function 83</p>
<p>Properties of F<sub>X</sub>(x) 84</p>
<p>Computation of F<sub>X</sub>(x) 85</p>
<p>2.4 Probability Density Function (pdf) 88</p>
<p>Four Other Common Density Functions 95</p>
<p>More Advanced Density Functions 97</p>
<p>2.5 Continuous, Discrete, and Mixed Random Variables 100</p>
<p>Some Common Discrete Random Variables 102</p>
<p>2.6 Conditional and Joint Distributions and Densities 107</p>
<p>Properties of Joint CDF F<sub>XY</sub>(x, y) 118</p>
<p>2.7 Failure Rates 137</p>
<p>Summary 141</p>
<p>Problems 141</p>
<p>References 149</p>
<p>Additional Reading 149</p>
<p><strong>3 Functions of Random Variables 151</strong></p>
<p>3.1 Introduction 151</p>
<p>Functions of a Random Variable (FRV): Several Views 154</p>
<p>3.2 Solving Problems of the Type Y = g(X) 155</p>
<p>General Formula for Determining the pdf of Y = g(X) 166</p>
<p>3.3 Solving Problems of the Type Z = g(X, Y) 171</p>
<p>3.4 Solving Problems of the Type V = g(X, Y), W = h(X, Y) 193</p>
<p>Fundamental Problem 193</p>
<p>Obtaining f<sub>VW</sub> Directly from f<sub>XY</sub> 196</p>
<p>3.5 Additional Examples 200</p>
<p>Summary 205</p>
<p>Problems 206</p>
<p>References 214</p>
<p>Additional Reading 214</p>
<p><strong>4 Expectation and Moments 215</strong></p>
<p>4.1 Expected Value of a Random Variable 215</p>
<p>On the Validity of Equation 4.1-8 218</p>
<p>4.2 Conditional Expectations 232</p>
<p>Conditional Expectation as a Random Variable 239</p>
<p>4.3 Moments of Random Variables 242</p>
<p>Joint Moments 246</p>
<p>Properties of Uncorrelated Random Variables 248</p>
<p>Jointly Gaussian Random Variables 251</p>
<p>4.4 Chebyshev and Schwarz Inequalities 255</p>
<p>Markov Inequality 257</p>
<p>The Schwarz Inequality 258</p>
<p>4.5 Moment-Generating Functions 261</p>
<p>4.6 Chernoff Bound 264</p>
<p>4.7 Characteristic Functions 266</p>
<p>Joint Characteristic Functions 273</p>
<p>The Central Limit Theorem 276</p>
<p>4.8 Additional Examples 281</p>
<p>Summary 283</p>
<p>Problems 284</p>
<p>References 293</p>
<p>Additional Reading 294</p>
<p><strong>5 Random Vectors 295</strong></p>
<p>5.1 Joint Distribution and Densities 295</p>
<p>5.2 Multiple Transformation of Random Variables 299</p>
<p>5.3 Ordered Random Variables 302</p>
<p>Distribution of Area Random Variables 305</p>
<p>5.4 Expectation Vectors and Covariance Matrices 311</p>
<p>5.5 Properties of Covariance Matrices 314</p>
<p>Whitening Transformation 318</p>
<p>5.6 The Multidimensional Gaussian (Normal) Law 319</p>
<p>5.7 Characteristic Functions of Random Vectors 328</p>
<p>Properties of CF of Random Vectors 330</p>
<p>The Characteristic Function of the Gaussian (Normal) Law 331</p>
<p>Summary 332</p>
<p>Problems 333</p>
<p>References 339</p>
<p>Additional Reading 339</p>
<p><strong>6 Statistics: Part 1 Parameter Estimation 340</strong></p>
<p>6.1 Introduction 340</p>
<p>Independent, Identically Distributed (i.i.d.) Observations 341</p>
<p>Estimation of Probabilities 343</p>
<p>6.2 Estimators 346</p>
<p>6.3 Estimation of the Mean 348</p>
<p>Properties of the Mean-Estimator Function (MEF) 349</p>
<p>Procedure for Getting a δ-Confidence Interval on the Mean of a Normal Random Variable When σ<sub>X</sub> Is Known 352</p>
<p>Confidence Interval for the Mean of a Normal Distribution When σ<sub>X</sub> Is Not Known 352</p>
<p>Procedure for Getting a δ-Confidence Interval Based on n Observations on the Mean of a Normal Random Variable When σ<sub>X</sub> Is Not Known 355</p>
<p>Interpretation of the Confidence Interval 355</p>
<p>6.4 Estimation of the Variance and Covariance 355</p>
<p>Confidence Interval for the Variance of a Normal Random Variable 357</p>
<p>Estimating the Standard Deviation Directly 359</p>
<p>Estimating the Covariance 360</p>
<p>6.5 Simultaneous Estimation of Mean and Variance 361</p>
<p>6.6 Estimation of Non-Gaussian Parameters from Large Samples 363</p>
<p>6.7 Maximum Likelihood Estimators 365</p>
<p>6.8 Ordering, More on Percentiles, Parametric Versus Nonparametric Statistics 369</p>
<p>The Median of a Population Versus Its Mean 371</p>
<p>Parametric Versus Nonparametric Statistics 372</p>
<p>Confidence Interval on the Percentile 373</p>
<p>Confidence Interval for the Median When n Is Large 375</p>
<p>6.9 Estimation of Vector Means and Covariance Matrices 376</p>
<p>Estimation of μ 377</p>
<p>Estimation of the Covariance K 378</p>
<p>6.10 Linear Estimation of Vector Parameters 380</p>
<p>Summary 384</p>
<p>Problems 384</p>
<p>References 388</p>
<p>Additional Reading 389</p>
<p><strong>7 Statistics: Part 2 Hypothesis Testing 390</strong></p>
<p>7.1 Bayesian Decision Theory 391</p>
<p>7.2 Likelihood Ratio Test 396</p>
<p>7.3 Composite Hypotheses 402</p>
<p>Generalized Likelihood Ratio Test (GLRT) 403</p>
<p>How Do We Test for the Equality of Means of Two Populations? 408</p>
<p>Testing for the Equality of Variances for Normal Populations: The F-test 412</p>
<p>Testing Whether the Variance of a Normal Population Has a Predetermined Value 416</p>
<p>7.4 Goodness of Fit 417</p>
<p>7.5 Ordering, Percentiles, and Rank 423</p>
<p>How Ordering Is Useful in Estimating Percentiles and the Median 425</p>
<p>Confidence Interval for the Median When n Is Large 428</p>
<p>Distribution-free Hypothesis Testing: Testing If Two Populations Are the Same Using Runs 429</p>
<p>Ranking Test for Sameness of Two Populations 432</p>
<p>Summary 433</p>
<p>Problems 433</p>
<p>References 439</p>
<p><strong>8 Random Sequences 441</strong></p>
<p>8.1 Basic Concepts 442</p>
<p>Infinite-length Bernoulli Trials 447</p>
<p>Continuity of Probability Measure 452</p>
<p>Statistical Specification of a Random Sequence 454</p>
<p>8.2 Basic Principles of Discrete-Time Linear Systems 471</p>
<p>8.3 Random Sequences and Linear Systems 477</p>
<p>8.4 WSS Random Sequences 486</p>
<p>Power Spectral Density 489</p>
<p>Interpretation of the psd 490</p>
<p>Synthesis of Random Sequences and Discrete-Time Simulation 493</p>
<p>Decimation 496</p>
<p>Interpolation 497</p>
<p>8.5 Markov Random Sequences 500</p>
<p>ARMA Models 503</p>
<p>Markov Chains 504</p>
<p>8.6 Vector Random Sequences and State Equations 511</p>
<p>8.7 Convergence of Random Sequences 513</p>
<p>8.8 Laws of Large Numbers 521</p>
<p>Summary 526</p>
<p>Problems 526</p>
<p>References 541</p>
<p><strong>9 Random Processes 543</strong></p>
<p>9.1 Basic Definitions 544</p>
<p>9.2 Some Important Random Processes 548</p>
<p>Asynchronous Binary Signaling 548</p>
<p>Poisson Counting Process 550</p>
<p>Alternative Derivation of Poisson Process 555</p>
<p>Random Telegraph Signal 557</p>
<p>Digital Modulation Using Phase-Shift Keying 558</p>
<p>Wiener Process or Brownian Motion 560</p>
<p>Markov Random Processes 563</p>
<p>Birth–Death Markov Chains 567</p>
<p>Chapman–Kolmogorov Equations 571</p>
<p>Random Process Generated from Random Sequences 572</p>
<p>9.3 Continuous-Time Linear Systems with Random Inputs 572</p>
<p>White Noise 577</p>
<p>9.4 Some Useful Classifications of Random Processes 578</p>
<p>Stationarity 579</p>
<p>9.5 Wide-Sense Stationary Processes and LSI Systems 581</p>
<p>Wide-Sense Stationary Case 582</p>
<p>Power Spectral Density 584</p>
<p>An Interpretation of the psd 586</p>
<p>More on White Noise 590</p>
<p>Stationary Processes and Differential Equations 596</p>
<p>9.6 Periodic and Cyclostationary Processes 600</p>
<p>9.7 Vector Processes and State Equations 606</p>
<p>State Equations 608</p>
<p>Summary 611</p>
<p>Problems 611</p>
<p>References 633</p>
<p><strong>Chapters 10 and 11 are available as Web chapters on the companion Web site at http://www.pearsonhighered.com/stark.</strong></p>
<p><strong>10 Advanced Topics in Random Processes 635</strong></p>
<p>10.1 Mean-Square (m.s.) Calculus 635</p>
<p>Stochastic Continuity and Derivatives [10-1] 635</p>
<p>Further Results on m.s. Convergence [10-1] 645</p>
<p>10.2 Mean-Square Stochastic Integrals 650</p>
<p>10.3 Mean-Square Stochastic Differential Equations 653</p>
<p>10.4 Ergodicity [10-3] 658</p>
<p>10.5 Karhunen–Loève Expansion [10-5] 665</p>
<p>10.6 Representation of Bandlimited and Periodic Processes 671</p>
<p>Bandlimited Processes 671</p>
<p>Bandpass Random Processes 674</p>
<p>WSS Periodic Processes 677</p>
<p>Fourier Series for WSS Processes 680</p>
<p>Summary 682</p>
<p>Appendix: Integral Equations 682</p>
<p>Existence Theorem 683</p>
<p>Problems 686</p>
<p>References 699</p>
<p><strong>11 Applications to Statistical Signal Processing 700</strong></p>
<p>11.1 Estimation of Random Variables and Vectors 700</p>
<p>More on the Conditional Mean 706</p>
<p>Orthogonality and Linear Estimation 708</p>
<p>Some Properties of the Operator Ê 716</p>
<p>11.2 Innovation Sequences and Kalman Filtering 718</p>
<p>Predicting Gaussian Random Sequences 722</p>
<p>Kalman Predictor and Filter 724</p>
<p>Error-Covariance Equations 729</p>
<p>11.3 Wiener Filters for Random Sequences 733</p>
<p>Unrealizable Case (Smoothing) 734</p>
<p>Causal Wiener Filter 736</p>
<p>11.4 Expectation-Maximization Algorithm 738</p>
<p>Log-likelihood for the Linear Transformation 740</p>
<p>Summary of the E-M Algorithm 742</p>
<p>E-M Algorithm for Exponential Probability Functions 743</p>
<p>Application to Emission Tomography 744</p>
<p>Log-likelihood Function of Complete Data 746</p>
<p>E-step 747</p>
<p>M-step 748</p>
<p>11.5 Hidden Markov Models (HMM) 749</p>
<p>Specification of an HMM 751</p>
<p>Application to Speech Processing 753</p>
<p>Efficient Computation of P[E | M] with a Recursive Algorithm 754</p>
<p>Viterbi Algorithm and the Most Likely State Sequence for the Observations 756</p>
<p>11.6 Spectral Estimation 759</p>
<p>The Periodogram 760</p>
<p>Bartlett’s Procedure–Averaging Periodograms 762</p>
<p>Parametric Spectral Estimate 767</p>
<p>Maximum Entropy Spectral Density 769</p>
<p>11.7 Simulated Annealing 772</p>
<p>Gibbs Sampler 773</p>
<p>Noncausal Gauss–Markov Models 774</p>
<p>Compound Markov Models 778</p>
<p>Gibbs Line Sequence 779</p>
<p>Summary 783</p>
<p>Problems 783</p>
<p>References 788</p>
<p><strong>Appendix A Review of Relevant Mathematics A-1</strong></p>
<p>A.1 Basic Mathematics A-1</p>
<p>Sequences A-1</p>
<p>Convergence A-2</p>
<p>Summations A-3</p>
<p>Z-Transform A-3</p>
<p>A.2 Continuous Mathematics A-4</p>
<p>Definite and Indefinite Integrals A-5</p>
<p>Differentiation of Integrals A-6</p>
<p>Integration by Parts A-7</p>
<p>Completing the Square A-7</p>
<p>Double Integration A-8</p>
<p>Functions A-8</p>
<p>A.3 Residue Method for Inverse Fourier Transformation A-10</p>
<p>Fact A-11</p>
<p>Inverse Fourier Transform for psd of Random Sequence A-13</p>
<p>A.4 Mathematical Induction A-17</p>
<p>References A-17</p>
<p><strong>Appendix B Gamma and Delta Functions B-1</strong></p>
<p>B.1 Gamma Function B-1</p>
<p>B.2 Incomplete Gamma Function B-2</p>
<p>B.3 Dirac Delta Function B-2</p>
<p>References B-5</p>
<p><strong>Appendix C Functional Transformations and Jacobians C-1</strong></p>
<p>C.1 Introduction C-1</p>
<p>C.2 Jacobians for n = 2 C-2</p>
<p>C.3 Jacobian for General n C-4</p>
<p><strong>Appendix D Measure and Probability D-1</strong></p>
<p>D.1 Introduction and Basic Ideas D-1</p>
<p>Measurable Mappings and Functions D-3</p>
<p>D.2 Application of Measure Theory to Probability D-3</p>
<p>Distribution Measure D-4</p>
<p><strong>Appendix E Sampled Analog Waveforms and Discrete-time Signals E-1</strong></p>
<p><strong>Appendix F Independence of Sample Mean and Variance for Normal Random Variables F-1</strong></p>
<p><strong>Appendix G Tables of Cumulative Distribution Functions: the Normal, Student t, Chi-square, and F G-1</strong></p>
<p><strong>Index I-1</strong></p>