Boston University

College of Engineering

Department of Electrical and Computer Engineering

 

  

EC 381: PROBABILITY THEORY IN

ELECTRICAL AND COMPUTER ENGINEERING

SPRING 2014

(4 credits)

 http://people.bu.edu/cgc/ec381

 

Professor Christos G. Cassandras

Room 425, 8 St. Mary's St. (PHO Building)

TEL: 353-7154, E-MAIL: cgc@bu.edu, WWW: http://people.bu.edu/cgc

 


 

• Organization:      Lectures: M, W 12:00-2:00, PHO 210

 

• Prerequisites:     Multivariate calculus (CAS MA 22)

 

• Requirements:

1. Weekly Homework Assignments: 20%

2. Midterm Exam: 40%

3. Second Midterm (or Final Exam): 40%

 

• Objectives:          

1. Develop a solid foundation in the concepts of probability theory.

2. Learn fundamental probabilistic modeling and analysis techniques so you can use them in Electrical and Computer Engineering applications.

3. Learn basic techniques for managing and processing data, and apply them to estimation and hypothesis testing problems.

 

• Office Hours:       W: 2:00-3:00 pm.

 

• Graduate Teaching Fellow:    Michael George Sidhom Farag (mgsfarag@bu.edu), PHO 401A.

 

• Required Books: 1. Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers, R. Yates and D. Goodman, Wiley, Second Edition.

                        • In addition: Your own lecture notes! (Most will be provided)

 


 

• COURSE OUTLINE •

 

1. FOUNDATIONS OF PROBABILITY THEORY

1.1. What is “probability”?

1.2. Basic concepts (sample spaces, events, probability measures)

1.3. Review of set theory

1.4. Definition of probability and probability spaces

1.5. Probability axioms

1.6. Event independence

1.7. Conditional probability, Bayes’ Theorem

1.8. Counting methods: permutations, combinations, independent trials

 

2. DISCRETE RANDOM VARIABLES

2.1. Definition of random variable

2.2. Types of random variables

2.3. Probability mass functions (pmf)

2.4. Cumulative distribution functions (cdf)

2.5. Statistics of random variables: expectation (mean), variance

2.6. Families of useful discrete random variables

2.7. Functions of discrete random variables

2.8. Conditional probability mass functions (pmf), Conditional expectation

2.9. Random vectors

  

3. CONTINUOUS RANDOM VARIABLES

3.1. Cumulative distribution functions (cdf)

3.2. Probability density functions (pdf)

3.3. Expectation

3.4. Families of useful continuous random variables

3.5. Functions of random variables

3.6. Conditional probability density functions (pdf), Conditional expectation

3.7. Mixed (discrete and continuous) random variables

  

4. PAIRS OF RANDOM VARIABLES

4.1. Joint cumulative distribution functions (cdf) and mass functions (pmf)

4.2. Marginal probability mass functions

4.3. Joint probability density functions

4.4. Marginal probability density functions

4.5. Functions of two random variables

4.6. Expectation, Covariance, Correlation

4.7. Conditioning, conditional expectations

4.8. Bivariate Gaussian random variables

4.9. Random vectors of continuous/discrete random variables

 

5. SUMS OF RANDOM VARIABLES AND LIMIT THEOREMS

5.1. Sample averages

5.2. Moment generating functions

5.3. Markov and Chebyshev inequalities

5.4. Weak and strong laws of large numbers

5.5. Central Limit Theorem

 

6. PARAMETER ESTIMATION

6.1. Point estimation

6.2. Interval estimation

 

7. MARKOV CHAINS

7.1. Chapman-Kolmogorov equations

7.2. Transient analysis

7.3. State classification

7.4. Steady-state analysis

 

8. HYPOTHESIS TESTING

8.1. Significance testing

8.2. Binary hypothesis testing

8.3. Multiple hypothesis testing