Period: 2000 BCE to 300 BCE
Algebra in Mesopotamia
The Babylonians were pioneers in the development of algebra. They used a sexagesimal (base-60) number system and solved quadratic, cubic, and some higher-degree equations, working through geometric methods and algorithms that described step-by-step procedures. -
1800 BCE
Algebra in Egypt
The Egyptians developed methods to solve algebraic problems, although their approach was more arithmetic than algebraic. They used what today we would call linear equations in the context of practical problems such as distributing goods, calculating areas and volumes, and solving problems related to agriculture. Their approach was practical, relying on a technique known as the "method of false position," which involved assuming a trial value, calculating the resulting error, and then adjusting it to obtain the correct answer, as in the sketch below. -
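A minimal Python sketch of the method of false position for a linear problem of the form a·x = b; the worked example ("a quantity and its quarter together make 15") is typical of the style of the Rhind Papyrus, and the function name is ours, not historical.

```python
def false_position(a, b, guess=1.0):
    """Solve a*x = b by scaling an initial (deliberately 'false') guess."""
    result_of_guess = a * guess        # what the false guess actually produces
    correction = b / result_of_guess   # ratio between the target and that result
    return guess * correction          # the scaled guess is the exact answer

# "A quantity and its quarter together make 15"  ->  (1 + 1/4) * x = 15
print(false_position(1 + 1/4, 15))     # 12.0
```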
1650 BCE
Rhind Papyrus
One of the earliest written records of methods for solving simple linear equations.
Mathematician: Ahmes -
600 BCE
Algebra in Greece
Greek mathematicians such as Pythagoras, Euclid, and later Diophantus contributed to the development of algebra. However, the Greeks focused more on geometry, and their algebra was closely tied to it. -
Period: 206 BCE to 24 CE
Algebra in Chinese civilization
Algebra in Chinese civilization saw notable development, especially from the Han dynasty (206 BCE-24 CE) onward, with the treatise The Nine Chapters on the Mathematical Art, which addressed economic and administrative problems. Mathematicians such as Liu Hui improved on this treatise, developing an algorithmic method for solving systems of linear equations similar to Gaussian elimination (see the sketch below), which led them to recognize negative numbers, one of the main achievements of Chinese mathematics. -
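A minimal modern sketch of the elimination idea, in today's matrix notation rather than the original rod-calculus layout; the example system is the well-known grain problem from chapter 8 of the Nine Chapters.

```python
import numpy as np

def solve_by_elimination(A, b):
    """Solve A @ x = b by forward elimination and back substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n):                      # eliminate column k below the pivot
        for i in range(k + 1, n):
            factor = A[i, k] / A[k, k]
            A[i, k:] -= factor * A[k, k:]
            b[i] -= factor * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):          # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[3, 2, 1], [2, 3, 1], [1, 2, 3]])
b = np.array([39, 34, 26])                  # the classic three-grades-of-grain problem
print(solve_by_elimination(A, b))           # [9.25 4.25 2.75]
```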
820
"Al-jabr"
Fundamentals of algebra, including methods for solving linear and quadratic equations (one such method, completing the square, is sketched below).
Mathematician: Al-Khwarizmi -
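A minimal sketch of completing the square for equations of the form x² + bx = c, one of the cases treated in Al-jabr; the worked example x² + 10x = 39 is al-Khwarizmi's classic one, while the Python function itself is ours.

```python
import math

def complete_the_square(b, c):
    """Return the positive root of x**2 + b*x = c (b, c > 0)."""
    half_b = b / 2
    return math.sqrt(c + half_b**2) - half_b   # add (b/2)^2 to both sides, take the root

print(complete_the_square(10, 39))   # 3.0, the root of x**2 + 10*x = 39
```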
Cartesian geometry
The introduction by René Descartes of coordinates in geometry. In this new geometry, now called Cartesian geometry, lines and planes are represented by linear equations, and calculating their intersections amounts to solving systems of linear equations, as in the sketch below. -
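For instance, intersecting two lines written in the form ax + by = c is just a 2x2 linear system; the coefficients below are arbitrary illustrative values.

```python
import numpy as np

# Two lines in the plane: x + 2y = 4 and 3x - y = 5.
A = np.array([[1.0, 2.0],
              [3.0, -1.0]])    # coefficient matrix of the two line equations
c = np.array([4.0, 5.0])       # right-hand sides

intersection = np.linalg.solve(A, c)
print(intersection)            # [2. 1.]: the point (2, 1) lies on both lines
```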
Cramer's rule and the elimination method
Gabriel Cramer used determinants to give explicit solutions of linear systems, in what is now called Cramer's rule (sketched below). Gauss later further developed the method of elimination, which was initially presented as an advance in geodesy. -
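A minimal sketch of Cramer's rule for a square system A @ x = b: each unknown x_i equals det(A_i) / det(A), where A_i is A with column i replaced by b. The example values are illustrative only.

```python
import numpy as np

def cramer(A, b):
    """Solve A @ x = b using ratios of determinants (Cramer's rule)."""
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        A_i = A.copy().astype(float)
        A_i[:, i] = b                 # replace column i with the right-hand side
        x[i] = np.linalg.det(A_i) / det_A
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])
print(cramer(A, b))                   # [1. 3.]
print(np.linalg.solve(A, b))          # same result, for comparison
```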
"Die lineare Ausdehnungslehre"
Hermann Grassmann published his Theory of Extension (Die lineare Ausdehnungslehre), which included new foundational topics of what is today called linear algebra. -
Introduction of the term "matrix"
Formalization of matrix notation: Sylvester introduced the term "matrix", from the Latin word for womb.
Mathematician: Sylvester -
Matrix multiplication and matrix inverse
Arthur Cayley introduced matrix multiplication and the matrix inverse, making the general linear group possible (see the sketch below). The mechanism of group representation became available for describing complex and hypercomplex numbers. Crucially, Cayley used a single letter to denote a matrix, thus treating a matrix as an aggregate object. He also realized the connection between matrices and determinants, and wrote, "There would be many things to say about this theory of matrices which should, it seems to me, precede the theory of determinants." -
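A minimal sketch of the operations named above: treating a matrix as a single object lets us compose linear maps by multiplication and undo invertible ones with the inverse. The numerical values are arbitrary.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

product = A @ B                      # matrix multiplication (composition of linear maps)
A_inv = np.linalg.inv(A)             # exists because det(A) = 1*5 - 2*3 = -1 != 0

print(product)
print(A_inv @ A)                     # identity matrix, up to floating-point error
print(np.linalg.det(A))              # nonzero determinant: A belongs to the general linear group
```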
Invariant theory
Cayley developed matrix theory and its applications and, in addition to introducing matrix notation, developed the theory of invariants, which studies the properties of algebraic expressions that remain unchanged under certain transformations.
Mathematician: Cayley -
The consolidation of concepts
Georg Cantor and Richard Dedekind laid the foundations of set theory, which was fundamental for the formalization of the concepts of linear algebra.
Mathematicians: Georg Cantor and Richard Dedekind -
Linear Associative Algebra
Benjamin Peirce published his Linear Associative Algebra, and his son Charles Sanders Peirce later extended the work. -
Theory of force fields
The telegraph required an explanatory system, and the publication in 1873 of James Clerk Maxwell's A Treatise on Electricity and Magnetism instituted a theory of force fields and required differential geometry for its expression. Linear algebra is flat differential geometry and serves in the tangent spaces to manifolds. The electromagnetic symmetries of spacetime are expressed by Lorentz transformations, and much of the history of linear algebra is the history of Lorentz transformations (one is sketched below). -
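A minimal numerical sketch of a Lorentz boost as a linear map: the 2x2 matrix below acts on (ct, x) coordinates and preserves the Minkowski form; the velocity value is arbitrary and purely illustrative.

```python
import numpy as np

beta = 0.6                                   # velocity as a fraction of c (illustrative)
gamma = 1 / np.sqrt(1 - beta**2)

L = np.array([[gamma, -gamma * beta],
              [-gamma * beta, gamma]])       # boost matrix along the x-axis
eta = np.diag([1.0, -1.0])                   # Minkowski metric in 1+1 dimensions

print(L.T @ eta @ L)                         # equals eta: the Minkowski form is preserved
print(np.linalg.det(L))                      # ~1.0: the boost has determinant one
```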
Hilbert spaces
Fundamentals of functional analysis and quantum mechanics.
Mathematician: Hilbert -
Formalization of the theory of vector spaces
The vector space was given an abstract axiomatic definition, establishing the structure on which modern linear algebra is built.
Mathematicians: various -
Computational linear algebra, machine learning and artificial intelligence
Development of efficient algorithms and applications in various areas.
Mathematicians: various researchers -
Period: 2000 to 2006
Events of 2000-2006
• 2000-2005: Progress in the development of more efficient numerical algorithms for matrix decomposition and optimization methods, including advances in the power method and in factorization algorithms such as QR and SVD (Singular Value Decomposition); see the sketch after this list.
• 2006: Renewed interest in deep neural networks driven by Geoffrey Hinton and colleagues; architectures such as convolutional neural networks (CNNs) rest on linear algebra operations such as convolution and matrix multiplication. -
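A minimal sketch of two of the tools named above, using NumPy (which wraps LAPACK routines): the power method for a dominant eigenpair, and the QR and SVD factorizations; the test matrix is random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
S = A @ A.T                          # symmetric positive semi-definite test matrix

def power_method(M, iters=200):
    """Approximate the dominant eigenvector of M by repeated multiplication."""
    v = np.ones(M.shape[0])
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)
    return v, v @ M @ v              # eigenvector estimate and its Rayleigh quotient

v, lam = power_method(S)
print(lam, np.linalg.eigvalsh(S).max())   # both approximate the largest eigenvalue

Q, R = np.linalg.qr(A)               # QR factorization: Q orthogonal, R upper triangular
U, s, Vt = np.linalg.svd(A)          # SVD: A = U @ diag(s) @ Vt
print(np.allclose(Q @ R, A), np.allclose(U @ np.diag(s) @ Vt, A))
```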
Period: 2007 to 2009
Events of 2007-2009
• 2007: Increasing use of linear algebra in the analysis of large volumes of data, for example matrix decomposition for dimensionality reduction such as PCA (Principal Component Analysis); see the sketch after this list.
• 2009: Publication of key works applying linear algebra to graph theory, such as the expansion of Laplacian matrix theory and its use in graph algorithms. -
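A minimal sketch of PCA as dimensionality reduction via the SVD of mean-centered data; the data are random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))         # 200 samples, 10 features

X_centered = X - X.mean(axis=0)            # PCA requires mean-centered columns
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)

k = 2
components = Vt[:k]                        # top-k principal directions
X_reduced = X_centered @ components.T      # project the data onto a 2-D subspace

explained = s[:k]**2 / np.sum(s**2)        # fraction of variance each component explains
print(X_reduced.shape, explained)
```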
Period: 2012 to 2013
Events of 2012-2013
• 2012: Explosion of deep learning with the introduction of deep neural networks requiring intensive linear algebra operations for backpropagation and optimization; a toy example follows this list.
• 2013: Advances in computational linear algebra, with continued optimization of libraries such as BLAS (Basic Linear Algebra Subprograms) and LAPACK (Linear Algebra PACKage) for parallel hardware architectures. -
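A toy sketch of why training is dominated by linear algebra: the forward and backward passes of a single dense layer under a mean-squared-error loss are just matrix products. All values are random and illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((32, 4))          # batch of 32 inputs, 4 features
W = rng.standard_normal((4, 3)) * 0.1     # weights of a dense layer with 3 outputs
y = rng.standard_normal((32, 3))          # illustrative regression targets

for step in range(100):
    pred = X @ W                          # forward pass: one matrix multiplication
    grad_pred = 2 * (pred - y) / len(X)   # gradient of the mean squared error
    grad_W = X.T @ grad_pred              # backward pass: another matrix multiplication
    W -= 0.1 * grad_W                     # gradient-descent update

print(np.mean((X @ W - y) ** 2))          # loss decreases toward the least-squares optimum
```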
Period: 2015 to 2018
Events of 2015-2018
• 2015: Implementation of tensor decomposition methods, such as CP (CANDECOMP/PARAFAC) and Tucker decomposition, in big data and machine learning applications.
• 2018: Widespread use of word embeddings in Natural Language Processing (NLP), which depend on linear algebra to map words to vector spaces; see the sketch after this list. -
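A minimal sketch of word embeddings as vectors, with similarity measured by cosine similarity (a dot product divided by norms); the tiny 4-dimensional vectors are invented for illustration, not trained.

```python
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.1, 0.6, 0.2]),
    "queen": np.array([0.7, 0.2, 0.7, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.8]),
}

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(embeddings["king"], embeddings["queen"]))   # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))   # low: unrelated words
```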
Deep Learning
Publication of "Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, cementing the central role of linear algebra in the field of deep learning. -
the development of machine learning models
Linear algebra remains crucial in the development of machine learning models, especially in areas such as explainable artificial intelligence (XAI), where matrix factorization methods are used to interpret complex models. -
Advances in quantum computing
Advances in quantum computing, which uses linear algebra to model the behavior of quantum systems; matrix decompositions and the quantum Fourier transform are key examples (one is sketched below). -
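A minimal sketch of the quantum Fourier transform viewed purely as linear algebra: for n qubits it is the unitary N x N discrete Fourier transform matrix (N = 2^n) acting on a state vector; the 2-qubit size here is illustrative.

```python
import numpy as np

n = 2
N = 2**n
omega = np.exp(2j * np.pi / N)
QFT = np.array([[omega**(j * k) for k in range(N)] for j in range(N)]) / np.sqrt(N)

state = np.zeros(N, dtype=complex)
state[1] = 1.0                                       # basis state |01>

print(np.allclose(QFT.conj().T @ QFT, np.eye(N)))    # True: the matrix is unitary
print(QFT @ state)                                   # the transformed state vector
```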
Hardware optimization for linear algebra operations
Hardware optimization for linear algebra operations, with the proliferation of tensor processing units (TPUs) designed specifically to accelerate matrix calculations in AI applications. -
The expansion of linear algebra in emerging areas such as artificial general intelligence
The expansion of linear algebra in emerging areas such as artificial general intelligence (AGI) and simulation of complex systems, where new algorithms and approaches based on matrices and tensors are being developed. -
TEAM MEMBERS: MARIA YEPES, KANER MORALES, SAMUEL GARCIA, SAMUEL BOWDEN, GUSTAVO GONZALEZ