

Description
This textbook offers a fresh and balanced approach to the study of Linear Algebra in the context of modern Data Science. Whereas many existing texts either emphasize theory with little connection to practice or jump straight to applications with minimal mathematical explanation, this book provides equal weight to both foundations and applications.
Designed for undergraduates who have completed a proof-based Linear Algebra course, it introduces concepts and tools from Matrix Analysis that are essential for Data Science and Machine Learning. Topics include matrix multiplication and partitioned matrices; norms, distances, and similarities; the four fundamental subspaces of a matrix; the LU and QR factorizations; orthogonal projections and least squares; eigenvalues and eigenvectors; symmetric and positive definite matrices; the singular value decomposition; and nonnegative matrices and Perron theory.
To further support hands-on learning, the book is accompanied by a GitHub repository with Python labs, allowing students to implement the techniques covered and bridge the gap between theory and computation.
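As a taste of the kind of computation such Python labs typically target, here is a minimal sketch (illustrative only; not taken from the book's repository) that computes a singular value decomposition with NumPy and forms a best rank-1 approximation:

```python
import numpy as np

# A small data matrix: 4 samples, 3 features (illustrative values).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0],
              [1.0, 0.0, 1.0]])

# Thin SVD: A = U @ diag(s) @ Vt, singular values in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Truncating to the leading singular triplet gives the best rank-1
# approximation of A in the Frobenius norm (Eckart-Young theorem).
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])

print("singular values:", s)
print("rank-1 approximation error:", np.linalg.norm(A - A1))
```

Exercises like this let students check a theorem numerically, bridging the gap between proofs seen in class and computation.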
With its clear explanations, practical insights, and balance of theory and application, Matrix Methods in Data Analysis is an invaluable resource for courses in applied Linear Algebra, Data Science, and introductory Machine Learning.
- Illustrates how principles of Linear Algebra can be used to understand topics in Data Science
- Presents a balanced view of both theory and applications
- Includes supplementary GitHub content through Python labs
About the Authors
Maria Isabel Bueno is a Teaching Professor at the University of California, Santa Barbara, where she has served since 2006. She holds a Ph.D. from Universidad Carlos III de Madrid. Her research focuses on linear algebra and numerical linear algebra.
Javier Perez Alvaro is an Associate Professor at the University of Montana in Missoula, where he has served since 2017. He earned his Ph.D. from Universidad Carlos III de Madrid. His research focuses on numerical linear algebra and numerical analysis.
Contents
Part I: Linear Algebra and Machine Learning.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.
Part II: Matrix Multiplication and Partitioned Matrices.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part III: Norms, Distances, and Similarities.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part IV: The Four Fundamental Subspaces of a Matrix, and Gram Matrices.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part V: The LU Factorization of a Matrix.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part VI: Orthogonality and the QR Factorization.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part VII: Orthogonal Projections and the Least Squares Problem.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part VIII: Eigenvalues, Eigenvectors, and Algorithms.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part IX: Symmetric and Positive Definite Matrices.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part X: Singular Value Decomposition.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Part XI: Nonnegative Matrices and Perron Theory.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.
Index.
