Mathematics for Machine Learning: PCA

Start date: 08/01/2020 Duration: Unknown

Platform: Coursera

Course category: Computer Science

University or institution: CourseraNew







This intermediate-level course introduces the mathematical foundations needed to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We'll cover basic statistics of data sets, such as mean values and variances; compute distances and angles between vectors using inner products; and derive orthogonal projections of data onto lower-dimensional subspaces. Using all these tools, we'll then derive PCA as the method that minimizes the average squared reconstruction error between data points and their reconstructions.

At the end of this course, you'll be familiar with important mathematical concepts and will be able to implement PCA by yourself. If you're struggling, you'll find a set of Jupyter notebooks that let you explore properties of the techniques and walk you through what you need to do to get back on track. If you are already an expert, this course may refresh some of your knowledge.

The lectures, examples and exercises require:

1. Some ability for abstract thinking
2. A good background in linear algebra (e.g., matrix and vector algebra, linear independence, basis)
3. A basic background in multivariate calculus (e.g., partial derivatives, basic optimization)
4. Basic knowledge of Python programming and numpy

Disclaimer: This course is substantially more abstract and requires more programming than the other two courses of the specialization. However, this type of abstract thinking, algebraic manipulation and programming is necessary if you want to understand and develop machine learning algorithms.

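To make the description concrete, here is a minimal numpy sketch of the kind of PCA the course derives: center the data, find the directions of maximal variance via an eigendecomposition of the covariance matrix, and reconstruct the data by orthogonal projection onto the top-k principal axes. The function name `pca` and the eigendecomposition route are illustrative choices, not the course's own code; the course derives the same result from the reconstruction-error objective.

```python
import numpy as np

def pca(X, k):
    """Project data onto its top-k principal components.

    X : (n_samples, n_features) data matrix
    k : number of principal components to keep
    Returns the low-dimensional codes Z and the reconstruction
    of the data in the original space.
    """
    mean = X.mean(axis=0)
    Xc = X - mean                       # center the data
    cov = (Xc.T @ Xc) / len(X)          # covariance matrix
    vals, vecs = np.linalg.eigh(cov)    # eigh: eigenvalues in ascending order
    W = vecs[:, ::-1][:, :k]            # top-k eigenvectors (principal axes)
    Z = Xc @ W                          # codes in the k-dimensional subspace
    X_rec = Z @ W.T + mean              # orthogonal projection back to data space
    return Z, X_rec
```

With k equal to the full dimensionality the principal axes form an orthonormal basis, so the reconstruction is exact; with smaller k the reconstruction minimizes the average squared error mentioned above.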







Tags: PCA, Imperial College London, Calculus, Mathematics for Machine Learning, Linear Algebra, Data Science, Principal Component Analysis, Mathematical Foundations of Machine Learning, Mathematics, Machine Learning, Multivariate Calculus