I know how basic operations are performed on matrices, I can do transformations, find inverses, etc. But now that I think about it, I actually don't "understand" or know what I've been doing all this time. Our teacher made us memorise some rules, and I've been following them like a machine.

So what exactly is a matrix? And what is a determinant?

What do they represent?

Is there a geometrical interpretation?

How are they used? Or rather, what are they used for?

How do I understand the "properties" of a matrix?

I just don't want to mindlessly cram all those properties; I want to understand them better.

Any links that would improve my understanding of determinants and matrices? Please use simpler words. Thanks :)

Yves Daoust 05/15/2018.

A matrix is a compact but general way to represent any linear transform. (Linearity means that the image of a sum is the sum of the images.) Examples of linear transforms are rotations, scalings, projections. They map points/lines/planes to points/lines/planes.

So a linear transform can be represented by an array of coefficients. The size of the matrix tells you the number of dimensions of the domain and the image spaces. The composition of two linear transforms corresponds to the product of their matrices. The inverse of a linear transform corresponds to the matrix inverse.
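The correspondence between composition and matrix product can be checked numerically. A minimal sketch (the particular matrices — a 90° rotation `R` and a scaling `S` — are my own illustrative choices):

```python
import numpy as np

# R rotates the plane by 90 degrees; S scales x by 2.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
S = np.array([[2.0, 0.0],
              [0.0, 1.0]])

v = np.array([1.0, 1.0])

# Applying S first, then R, one at a time ...
one_at_a_time = R @ (S @ v)
# ... equals applying the single composed transform R @ S.
composed = (R @ S) @ v

print(one_at_a_time)  # [-1.  2.]
print(composed)       # [-1.  2.]

# The inverse transform corresponds to the matrix inverse:
print(np.linalg.inv(R) @ (R @ v))  # recovers v: [1. 1.]
```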

A determinant measures the volume of the image of a unit cube under the transformation; it is a single number. (When the numbers of dimensions of the domain and image differ, this volume is zero, so such "determinants" are never considered.) For instance, a rotation preserves volumes, so the determinant of a rotation matrix is always 1. When a determinant is zero, the linear transform is "singular", which means that it loses some dimensions (the transformed volume is flat) and cannot be inverted.
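Both facts can be illustrated in a short sketch (the rotation angle and the choice of a projection as the singular example are mine):

```python
import numpy as np

# A rotation preserves volume, so its determinant is 1.
theta = 0.7
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
print(np.linalg.det(rotation))    # 1.0

# A projection onto the x-axis flattens the plane onto a line:
# the determinant is 0 and the matrix is singular (not invertible).
projection = np.array([[1.0, 0.0],
                       [0.0, 0.0]])
print(np.linalg.det(projection))  # 0.0
```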

Determinants are a fundamental tool in the solution of systems of linear equations.

As you will later learn, a linear transformation can be decomposed into a pure rotation, a pure (anisotropic) scaling, and another pure rotation. Only the scaling deforms volumes, and the determinant of the transform is the product of the scaling coefficients.

Yly 05/16/2018.

**1. Definition of a matrix.**

The question of what a matrix *is*, precisely, is one I had for a long time as a high school student. It took many tries to get a straight answer, because people tend to conflate "matrix" with "linear transformation". The two are closely related, but NOT the same thing. So let me start with the fully rigorous definition of a matrix:

An $m$ by $n$ matrix is a function of two variables, the first of which has domain $\{1,2,\dots,m\}$ and the second of which has domain $\{1,2,\dots,n\}$.

This is the formal definition of matrices, but it's not how we usually think about them. We have a special notation for matrices: the "box of numbers" you are familiar with, where the value of the function at $(1,1)$ is put in the top left corner, the value at $(2,1)$ just below it, and so on. We usually think of the matrix as just this box and forget that it is a function. However, sometimes you need to remember that a matrix has a more formal definition, for example when implementing matrices on a computer (many programming languages have built-in support for the arrays used to store them).

**2. What matrices represent.**

Matrices can represent different things in different contexts, but the most common application by far is linear transformations (a.k.a. linear maps). Before I get into that, let me briefly mention some other applications:

- Matrices can be used to store data. For example, images on a computer are often stored as a matrix, where the matrix's value at $(i,j)$ is the intensity of light on the camera pixel that is $i^{th}$ from the top and $j^{th}$ from the left.
- Matrices can be used as computational tools. For example, one way to compute the Fibonacci numbers is from powers of the matrix $$M = \begin{bmatrix} 1 & 1 \\ 1 & 0 \\ \end{bmatrix}$$ It turns out that, with the convention $F_1 = F_2 = 1$, the entry $(M^k)_{11}$ is $F_{k+1}$ (equivalently, $(M^k)_{12}$ is $F_k$).
- Matrices can be used to encode some mathematical structure. I'm going to be sort of hand-wavy about this, but an example of what I have in mind is an adjacency matrix for a graph or network, which tells you which nodes are connected to which.
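The Fibonacci bullet above can be checked with a short sketch. Pure Python with repeated squaring; the helper names (`mat_mult`, `mat_pow`, `fib`) and the indexing convention $F_1 = F_2 = 1$ are my own choices:

```python
def mat_mult(a, b):
    """Product of two 2x2 integer matrices."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def mat_pow(m, k):
    """m**k by repeated squaring."""
    result = [[1, 0], [0, 1]]  # identity matrix
    while k:
        if k & 1:
            result = mat_mult(result, m)
        m = mat_mult(m, m)
        k >>= 1
    return result

M = [[1, 1], [1, 0]]

def fib(k):
    # With F_1 = F_2 = 1, the off-diagonal entry of M**k is F_k.
    return mat_pow(M, k)[0][1]

print([fib(k) for k in range(1, 11)])
# [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```

Because the matrix power is computed by repeated squaring, this takes only $O(\log k)$ matrix multiplications.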

So the point is that a matrix can be used for lots of things. However, one usage prevails as most common, and that is representing **linear transformations**. The prevalence of this usage is why people often conflate the two concepts. A linear transformation is a function $f$ of vectors which has the following properties:

- $f(x+y) = f(x) + f(y)$ for any vectors $x$ and $y$.
- $f(ax) = af(x)$ for any vector $x$ and any scalar $a$.

These properties are what it takes to ensure that the function $f$ has "no curvature". So it's like a straight line, but possibly in higher dimensions.

The relationship between matrices and linear transformations comes from the fact that a linear transformation is completely specified by the values it takes on a *basis* for its domain. (I presume you know what a basis is.) To see how this works, suppose we have a linear transformation $f$ which has domain $V$ and range $W$, where $V$ is a vector space with basis $v_1,v_2,\dots, v_n$ and $W$ is a vector space with basis $w_1,w_2,\dots,w_m$. Then there is a matrix $M$ representing $f$ **with respect to these bases**, which has as element $(i,j)$ the coefficient of $w_i$ when you express $f(v_j)$ as a sum of basis elements in $W$.

The reason that this is a good idea is that if you have some miscellaneous vector $x = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n \in V$, then if you represent $x$ as a column vector $[a_1,a_2,\dots,a_n]^T$ and $f$ as its matrix $M$, then the value $f(x)$ is given by the matrix product of $M$ and $[a_1,a_2,\dots,a_n]^T$. So the matrix $M$ completely encodes the linear transformation $f$, and matrix multiplication tells you how to decode it, i.e. how to use the matrix to get values of $f$.
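A numerical sketch of this encoding, using the standard bases of $\mathbb{R}^2$ (the particular map $f$ is a hypothetical example of mine):

```python
import numpy as np

# A sample linear map f on R^2: f(x, y) = (x + 2y, 3x - y).
def f(v):
    x, y = v
    return np.array([x + 2*y, 3*x - y])

# The columns of M are the images of the basis vectors e1, e2:
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
M = np.column_stack([f(e1), f(e2)])
print(M)
# [[ 1.  2.]
#  [ 3. -1.]]

# Matrix multiplication then recovers f on any vector:
x = np.array([5.0, -4.0])
print(M @ x)  # [-3. 19.]
print(f(x))   # [-3. 19.]
```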

**3. Geometrical intuition.**

In my opinion, the most important theorem for getting intuition for matrices and linear transformations is the singular value decomposition theorem. This says that any linear transformation can be written as a sequence of three simple transformations: a rotation, a stretching, and another rotation. Note that the stretching operation can stretch by different amounts in different orthogonal directions. This tells you that all linear transformations are some combination of rotation and stretching.

Other properties of matrices often have a direct geometric interpretation, too. For example, the determinant tells you how a linear transformation changes volumes. By the singular value decomposition, a linear transformation turns a cube into some sort of stretched and rotated parallelepiped. The determinant is the ratio of the volume of the resulting parallelepiped to that of the cube you started with.
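Both points can be checked numerically with NumPy's SVD routine (the sample matrix `A` is my own; note the determinant relation holds up to sign, since the orthogonal factors may include reflections):

```python
import numpy as np

# Decompose a 2x2 transform into rotation * stretch * rotation.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
U, s, Vt = np.linalg.svd(A)

# U and Vt are orthogonal (rotations, possibly with a reflection);
# s holds the stretch factors along orthogonal directions.
print(s)
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True: A = U diag(s) Vt

# |det A| is the product of the stretch factors:
print(np.isclose(abs(np.linalg.det(A)), s.prod()))  # True
```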

Not all properties of a matrix can be easily associated with familiar geometric concepts, though. I don't know of a good geometric picture for the trace, for instance. That doesn't mean that the trace is any less useful or easy to work with, though!

**4. Other properties.**

Almost all of the "properties" and "operations" for matrices come from properties of linear maps and theorems about them. For example, the standard multiplication of matrices is designed specifically to give the values of linear maps as explained above. This is NOT the only type of multiplication that can be defined on matrices, and in fact there are other types of multiplication for matrices (for example, the Hadamard product and the Kronecker product). These other types of multiplication are sometimes useful, but generally not as useful as regular matrix multiplication, so people often don't know (or care) about them.
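A quick sketch of the contrast (the sample matrices are mine; in NumPy, `@` is the standard matrix product and `*` is the entrywise Hadamard product):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Standard matrix multiplication (composition of linear maps):
print(A @ B)
# [[19 22]
#  [43 50]]

# Hadamard (entrywise) product -- a different operation entirely:
print(A * B)
# [[ 5 12]
#  [21 32]]
```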

**5. TL;DR**

The moral of the story is that you can use matrices for whatever you want (and they are indeed used in many different ways), but the way that most people use them most of the time is to represent linear maps, and the standard definitions and "properties" of matrices reflect this bias. The study of linear maps goes by the name "linear algebra", and a textbook on this subject is a good place to start if you want to learn more about matrices. (Depending on your background, you may find some good reference suggestions here: link.)

Michael Hoppe 05/16/2018.

Too long for a comment ... Just to start:

One essential way to understand matrices is to consider them as a collection of column vectors.

- Now the multiplication of a matrix with a vector is a linear combination of those column vectors, that is, an element of the span of the column vectors.

For example: $$\begin{pmatrix}1&2\\3&4\\5&6\end{pmatrix}\begin{pmatrix}-3\\4\end{pmatrix}=-3\begin{pmatrix}1\\3\\5\end{pmatrix}+4\begin{pmatrix} 2\\4\\6\end{pmatrix}. $$
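The same example, checked numerically: the product $Av$ is exactly the stated combination of $A$'s columns.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
v = np.array([-3, 4])

print(A @ v)                       # [5 7 9]
print(-3 * A[:, 0] + 4 * A[:, 1])  # same: [5 7 9]
```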

From here, several properties of matrices become understandable:

- If the linear system $Ax=b$ is solvable, then the vector $b$ is contained in the span of the column vectors of $A$; this is a geometric interpretation of solvability.

- The solution set of $Ax=0$ is never empty, since $x=0$ is always a solution. If it has more than one solution, the zero vector is a nontrivial linear combination of the column vectors, i.e., they are linearly dependent.

- Convince yourself that the rank of a matrix is the dimension of the span of its column vectors. From here it's clear why a linear system is solvable iff the rank of the coefficient matrix equals the rank of the augmented matrix, and so on.
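The rank criterion in the last bullet can be sketched numerically (the matrix, test vectors, and the helper name `solvable` are my own illustrative choices):

```python
import numpy as np

# Rank criterion: Ax = b is solvable iff
# rank(A) equals the rank of the augmented matrix [A | b].
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

def solvable(A, b):
    """True iff b lies in the span of A's columns."""
    return (np.linalg.matrix_rank(A)
            == np.linalg.matrix_rank(np.column_stack([A, b])))

b_in_span = A @ np.array([-3.0, 4.0])   # [5, 7, 9], in the column span
b_off_span = np.array([1.0, 0.0, 0.0])  # not in the column span

print(solvable(A, b_in_span))   # True
print(solvable(A, b_off_span))  # False
```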

For me — some decades ago — it was very helpful to connect properties of matrices with linear systems.

yarchik 05/17/2018.

A matrix is a finite or infinite collection of entities arranged in rows and columns. The entities are typically numbers, symbols, or expressions.

The *determinant* is one of the basic operations that can be performed on a *square* matrix; it is therefore a more specialized concept. Matrices can be thought of as a generalization of the concept of number. Therefore, they can represent many things:

- Group operations
- Symmetry transformations
- Graphs
- Complex numbers, quaternions
- First derivatives of multivariate functions (the Jacobian matrix)
- Second derivatives of multivariate functions (the Hessian matrix)

Some matrices have a geometric interpretation, for instance those describing rotations, other linear transforms, etc.

Matrices are used whenever one needs a short-form representation of a finite or infinite collection of entities arranged in rows and columns.

Properties of a matrix are, first of all, the properties of its entries (for instance, $M(n, R)$ denotes the square $n$-by-$n$ matrices over a ring $R$). Second, properties can refer to the structure of the matrix itself: for instance, it can be symmetric, orthogonal, etc.


