Centro de Investigación y de Estudios Avanzados del
Instituto Politécnico Nacional

Tópicos selectos de matemáticas I / Introduction to Mathematics for Intelligent Systems

General information
Full name of the graduate program: Maestría en Ciencias en Ingeniería Eléctrica
Full course name: Tópicos selectos de matemáticas I / Introduction to Mathematics for Intelligent Systems
Course type: Elective
Credits: 8
Number of hours:
  • Theory: 60 (in person)
  • Practice: 20 (outside the classroom)
Instructor: Andrés Méndez Vázquez
The objective of this course is to cover three core areas of mathematics for intelligent systems:
• Theory and basic methods of linear algebra.
• Introduction to probability and statistical methods.
• Basic methods of optimization.

1. Linear Algebra [1, 2, 3]
1.1. Linear Equations, Matrices and Gaussian Elimination
1.1.1. (a) Linear Equations
1.1.2. (b) The Geometry of Linear Equations
1.1.3. (c) Matrix Notation and Matrix Multiplication
1.1.4. (d) Triangular Factors and Row Exchanges
1.1.5. (e) Inverses and Transposes
1.1.6. (f) Solving the Regression Problem
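As a quick illustration of the elimination topics in this unit, the following is a minimal Gaussian-elimination solver with partial pivoting, written in plain Python (an illustrative sketch only; the function name `solve` is our own, not part of any course material):

```python
# Minimal Gaussian elimination with partial pivoting for a small dense system Ax = b.
def solve(A, b):
    n = len(A)
    # Build the augmented matrix [A | b].
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    # Forward elimination: zero out entries below each pivot.
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))  # partial pivoting
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# 2x + y = 3 and x + 3y = 5 have the solution x = 0.8, y = 1.4.
x = solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
```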
1.2. Vector Spaces
1.2.1. (a) Space of Vectors and Subspaces
1.2.2. (b) Solving Homogeneous and Inhomogeneous Systems
1.2.3. (c) Linear Independence, Basis and Dimensions
1.2.4. (d) The Four Fundamental Subspaces
1.2.5. (e) Graphs and Networks
1.2.6. (f) Linear Transformations
1.3. Orthogonality
1.3.1. (a) Orthonormal Basis
1.3.2. (b) Projections onto Subspaces
1.3.3. (c) Least-Squares Regression
1.3.4. (d) Gram-Schmidt
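The Gram-Schmidt process in this unit can be sketched in a few lines of plain Python (illustrative only; vectors are represented as lists, and the function names are our own):

```python
# Classical Gram-Schmidt: orthonormalize a list of vectors.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = v[:]
        # Subtract the projection of w onto each already-computed basis vector.
        for q in basis:
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = dot(w, w) ** 0.5
        if norm > 1e-12:  # skip (nearly) linearly dependent vectors
            basis.append([wi / norm for wi in w])
    return basis

Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
```

The resulting vectors are unit length and mutually orthogonal, which is exactly what the projection formulas of this unit require.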
1.4. Determinants
1.4.1. (a) The Properties of Determinants
1.4.2. (b) Permutations
1.4.3. (c) Cofactors
1.4.4. (d) Cramer’s Rule
1.4.5. (e) Applications
1.5. Eigenvalues and Eigenvectors
1.5.1. (a) The Classical Eigenvalue Problem
1.5.2. (b) Diagonalization
1.5.3. (c) Difference Equations and Powers A^k
1.5.4. (d) Singular Value Decomposition
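A minimal sketch of how a dominant eigenpair can be computed, using power iteration in plain Python (illustrative only; the matrix and function names are our own choices, and the Rayleigh quotient estimates the eigenvalue):

```python
# Power iteration: estimate the dominant eigenpair of a small matrix.
def matvec(A, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def power_iteration(A, iters=200):
    x = [1.0] * len(A)
    for _ in range(iters):
        y = matvec(A, x)
        norm = sum(yi * yi for yi in y) ** 0.5
        x = [yi / norm for yi in y]  # renormalize each step
    # Rayleigh quotient x^T A x approximates the dominant eigenvalue.
    lam = sum(xi * yi for xi, yi in zip(x, matvec(A, x)))
    return lam, x

# [[2, 1], [1, 2]] has eigenvalues 3 and 1, so the iteration converges to 3.
lam, v = power_iteration([[2.0, 1.0], [1.0, 2.0]])
```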
1.6. Derivatives of Matrices
1.6.1. (a) Generalizing the concept of derivatives
1.7. Applications
1.7.1. (a) Solving the Linear Regression Problem
1.7.2. (b) Application: Principal Component Analysis
1.7.3. (c) Application to Image Processing: Eigenfaces
1.7.4. (d) Markov Matrices and the Google Matrix
1.7.5. (e) Linear Algebra in Probability and Statistics
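The linear regression application above reduces, in the one-variable case, to the closed-form normal equations. A plain-Python sketch (illustrative only; the function name is our own):

```python
# Least-squares line fit y ≈ a*x + b via the 1-D normal equations.
def fit_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Closed-form solution of the 2x2 normal equations.
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# The data lie exactly on y = 2x + 1, so the fit recovers a = 2, b = 1.
a, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```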
2. Probability and Statistical Theory [4, 5]
2.1. Introduction
2.1.1. (a) Probability Definition
2.1.2. (b) The Sample Space
2.1.3. (c) Basic Set Operations
2.1.4. (d) Counting i. How to produce probabilities?
2.2. Conditional Probability
2.2.1. (a) Definition and intuition
2.2.2. (b) Bayes’ Rule
2.2.3. (c) Conditional Probabilities are probabilities
2.2.4. (d) Independence of events
2.2.5. (e) The use of conditioning to solve problems
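Bayes' rule from this unit can be illustrated with the classic diagnostic-test calculation (an illustrative sketch; the numbers and function name are our own, not from the course):

```python
# Bayes' rule: P(D | +) from the prior P(D), sensitivity P(+ | D),
# and false-positive rate P(+ | not D).
def bayes(prior, sensitivity, false_positive_rate):
    num = sensitivity * prior
    den = num + false_positive_rate * (1.0 - prior)  # law of total probability
    return num / den

# A rare condition (1%) with a fairly accurate test still yields a
# surprisingly small posterior, about 0.16.
posterior = bayes(prior=0.01, sensitivity=0.95, false_positive_rate=0.05)
```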
2.3. Random Variables
2.3.1. (a) The basic intuition and definition
2.3.2. (b) Distributions i. Continuous ii. Discrete
2.3.3. (c) Cumulative distribution function
2.3.4. (d) Function of Random Variables
2.4. Expectation
2.4.1. (a) Define expectation as a weighted average
2.4.2. (b) Linearity of the expectation
2.4.3. (c) Examples, Geometric and Negative Binomial
2.4.4. (d) Indicator Random Variable and the Fundamental Bridge
2.4.5. (e) Variance
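The indicator-variable "fundamental bridge" in this unit has a classic consequence: the expected number of fixed points of a random permutation is exactly 1, since each of the n indicators has expectation 1/n and linearity sums them. A seeded simulation sketch in plain Python (illustrative only):

```python
# Estimate E[# fixed points of a random permutation of n elements] by simulation.
import random

def avg_fixed_points(n, trials, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    total = 0
    for _ in range(trials):
        p = list(range(n))
        rng.shuffle(p)
        total += sum(1 for i, pi in enumerate(p) if i == pi)
    return total / trials

est = avg_fixed_points(n=10, trials=20000)  # close to the exact answer, 1
```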
2.5. Moments
2.5.1. (a) Moments as summarizing a distribution
2.5.2. (b) How do we interpret moments?
2.5.3. (c) Sample moments i. The Sample Mean ii. The Sample Variance
2.5.4. (d) Moment generating functions
2.6. Joint distributions
2.6.1. (a) Joint, Marginal and Conditional
2.6.2. (b) Transformation of variables
2.6.3. (c) Covariance and Correlations
2.6.4. (d) Conditional Expectation
2.7. Inequalities and Limit Theorems
2.7.1. (a) Inequalities
2.7.2. (b) Law of Large Numbers
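The Law of Large Numbers in this unit can be demonstrated with a seeded coin-flip simulation in plain Python (illustrative only; the sample mean settles near the true mean 1/2):

```python
# Law of Large Numbers: the sample mean of fair-coin flips approaches 1/2.
import random

def sample_mean(n, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    s = 0
    for _ in range(n):
        s += rng.randint(0, 1)  # one fair Bernoulli(1/2) flip
    return s / n

m = sample_mean(100000)  # within about 0.005 of 0.5
```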
2.8. Applications
2.8.1. (a) Bayesian inference
2.8.2. (b) Generative Models vs. Discriminative Models i. Models that learn from the data
2.8.3. (c) Markov Chains
2.8.4. (d) Markov Chain Monte Carlo
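The Markov chain topics above can be illustrated with a two-state chain whose stationary distribution is solvable by hand (an illustrative sketch; the transition matrix is our own example). Iterating pi <- pi P converges to the stationary distribution (1/3, 2/3):

```python
# Two-state Markov chain: iterate the distribution until it reaches stationarity.
P = [[0.8, 0.2],   # row i holds P(next = j | current = i)
     [0.1, 0.9]]

pi = [1.0, 0.0]    # start entirely in state 0
for _ in range(500):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]
# Balance condition pi0 * 0.2 = pi1 * 0.1 gives pi = (1/3, 2/3).
```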
3. Optimization [6, 7]
3.1. Introduction
3.1.1. (a) Formulation
3.1.2. (b) Example: Least-Squares Error
3.1.3. (c) Continuous versus Discrete Optimization
3.1.4. (d) Constrained and Unconstrained Optimization
3.2. The Basics
3.2.1. (a) What is a solution?
3.2.2. (b) How to recognize a minimum?
3.2.3. (c) Overview: i. Line Search ii. Trust Region
3.3. Convex Functions
3.3.1. (a) Why Convex Functions?
3.3.2. (b) Differentiable Convex Functions
3.3.3. (c) Characterizing Convex Functions using differentiability
3.4. The Concept of an Algorithm
3.4.1. (a) Algorithms
3.4.2. (b) Convergence
3.4.3. (c) Comparison of Algorithms
3.5. Line Search Methods
3.5.1. (a) Introduction
3.5.2. (b) Line Search
3.5.3. (c) Search using Derivatives i. Gradient Descent ii. Newton’s Method
3.5.4. (d) Applications in the Perceptron Algorithm and Logistic Regression
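The gradient-descent ideas in this unit can be sketched on a one-parameter least-squares objective f(w) = (1/2) Σ (w·x_i − y_i)², whose gradient is Σ (w·x_i − y_i)·x_i (an illustrative sketch in plain Python; the step size and data are our own choices):

```python
# Gradient descent on a one-parameter least-squares objective.
def grad_descent(xs, ys, lr=0.05, steps=200):
    w = 0.0
    for _ in range(steps):
        g = sum((w * x - y) * x for x, y in zip(xs, ys))  # df/dw
        w -= lr * g  # step against the gradient
    return w

# The data satisfy y = 2x exactly, so descent converges to the slope w = 2.
w = grad_descent([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

With a fixed step size the iteration here contracts toward the minimizer geometrically; choosing the step size adaptively is exactly what the line-search conditions of this unit address.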
3.6. Trust-Region Methods
3.6.1. (a) Line Search vs. Trust-Region
3.6.2. (b) The Basics for the Trust-Region
3.6.3. (c) Basic Algorithms i. Levenberg-Marquardt algorithm ii. Dogleg Method
3.7. Specific Methods:
3.7.1. (a) Conjugate Gradient Methods
3.7.2. (b) Quasi-Newton Methods
3.8. Optimality Conditions and Duality
3.8.1. (a) Lagrange Multipliers
3.8.2. (b) The Karush-Kuhn-Tucker Conditions
3.8.3. (c) Constraint Qualifications
3.8.4. (d) Lagrangian Dual Problems
3.8.5. (e) Formulating the Dual Problem
3.8.6. (f) Linear and Quadratic Programming
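A short worked example of the Lagrange-multiplier method from this unit (our own illustration): minimize f(x, y) = x² + y² subject to x + y = 1.

```latex
\mathcal{L}(x, y, \lambda) = x^2 + y^2 - \lambda (x + y - 1)
\qquad
\frac{\partial \mathcal{L}}{\partial x} = 2x - \lambda = 0,
\quad
\frac{\partial \mathcal{L}}{\partial y} = 2y - \lambda = 0
\;\Rightarrow\; x = y = \tfrac{\lambda}{2}.
```

Substituting into the constraint gives λ = 1, hence x = y = 1/2 with minimum value f = 1/2: the closest point on the line x + y = 1 to the origin.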
  1. G. Strang, Introduction to Linear Algebra. Wellesley, MA: Wellesley-Cambridge Press, fourth ed., 2009.
  2. K. Hoffman and R. Kunze, Linear algebra. Prentice-Hall mathematics series, Prentice-Hall, 1971.
  3. S. Lang, Algebra. Graduate Texts in Mathematics, Springer New York, 2005.
  4. R. Ash, Basic Probability Theory. Dover Books on Mathematics Series, Dover Publications, Incorporated, 2012.
  5. J. Blitzstein and J. Hwang, Introduction to Probability. Chapman & Hall/CRC Texts in Statistical Science, CRC Press, 2014.
  6. J. Nocedal and S. J. Wright, Numerical Optimization. Springer Series in Operations Research, Springer, second ed., 2006.
  7. M. S. Bazaraa, H. D. Sherali, and C. M. Shetty, Nonlinear Programming: Theory and Algorithms. Wiley-Interscience, third ed., 2006.
  • Midterm 1: 10%
  • Midterm 2: 10%
  • Midterm 3: 10%
  • Final exam: 10%
  • 8 homework assignments: 60%
  • Total: 100%
  • Knowledge:
  • Skills:
  • Attitudes and values:
Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional
Av. del Bosque 1145, colonia El Bajío, CP 45019, Zapopan, Jalisco, México.
Tel: (33) 3777-3600 Fax: (33) 3777-3609