Vector Spaces and Inner Products: Foundations of Linear Structure and Geometry

Table of Contents

  1. Introduction
  2. What Is a Vector Space?
  3. Axioms of Vector Spaces
  4. Subspaces and Spanning Sets
  5. Linear Independence and Basis
  6. Dimension and Coordinate Systems
  7. Inner Product: Definition and Properties
  8. Examples of Inner Products
  9. Norms and Angles
  10. Orthogonality and Orthonormal Sets
  11. Projection of Vectors
  12. The Gram-Schmidt Process
  13. Orthogonal Complements and Decompositions
  14. Inner Product Spaces vs Euclidean Spaces
  15. Applications in Physics and Machine Learning
  16. Conclusion

1. Introduction

Vector spaces and inner products form the conceptual core of linear algebra. A vector space provides the structure to perform operations like addition and scalar multiplication. An inner product allows us to measure lengths, angles, and define geometric notions such as orthogonality.

These concepts are essential across physics, mathematics, engineering, and data science.


2. What Is a Vector Space?

A vector space over a field \( \mathbb{F} \) (usually \( \mathbb{R} \) or \( \mathbb{C} \)) is a set \( V \) equipped with two operations:

  1. Vector addition: \( \vec{u} + \vec{v} \in V \)
  2. Scalar multiplication: \( a\vec{v} \in V \)

These operations must satisfy the axioms listed in the next section, such as associativity, distributivity, and the existence of a zero vector.


3. Axioms of Vector Spaces

A vector space \( V \) must satisfy:

  1. Closure under addition and scalar multiplication
  2. Associativity: \( (\vec{u} + \vec{v}) + \vec{w} = \vec{u} + (\vec{v} + \vec{w}) \)
  3. Commutativity: \( \vec{u} + \vec{v} = \vec{v} + \vec{u} \)
  4. Existence of additive identity: \( \vec{v} + \vec{0} = \vec{v} \)
  5. Existence of additive inverse: \( \vec{v} + (-\vec{v}) = \vec{0} \)
  6. Distributivity over vector addition: \( a(\vec{u} + \vec{v}) = a\vec{u} + a\vec{v} \)
  7. Distributivity over scalar addition: \( (a + b)\vec{v} = a\vec{v} + b\vec{v} \)
  8. Compatibility of scalar multiplication: \( a(b\vec{v}) = (ab)\vec{v} \)
  9. Scalar identity: \( 1\vec{v} = \vec{v} \)

4. Subspaces and Spanning Sets

A subspace is a subset of a vector space that is itself a vector space under the same operations. To verify that a subset is a subspace, it suffices to check that it contains the zero vector and is closed under addition and scalar multiplication.

Given vectors \( \vec{v}_1, \dots, \vec{v}_n \), the span is:

\[
\text{span}\{\vec{v}_1, \dots, \vec{v}_n\} = \left\{ \sum_{i=1}^n a_i \vec{v}_i \mid a_i \in \mathbb{F} \right\}
\]

It is the smallest subspace containing all \( \vec{v}_i \).
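To make the definition concrete, here is a minimal NumPy sketch (the helper name `in_span` is our own) that tests span membership with a rank argument: a vector lies in \( \text{span}\{\vec{v}_1, \dots, \vec{v}_n\} \) exactly when appending it as an extra column does not increase the rank of the matrix of spanning vectors.

```python
import numpy as np

def in_span(vectors, target, tol=1e-10):
    """Test whether `target` lies in span{vectors} via a rank comparison."""
    A = np.column_stack(vectors)                 # columns are the spanning vectors
    augmented = np.column_stack([A, target])
    # target is in the span iff appending it does not increase the rank
    return np.linalg.matrix_rank(augmented, tol) == np.linalg.matrix_rank(A, tol)

v1, v2 = np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])
print(in_span([v1, v2], np.array([2.0, 3.0, 5.0])))  # True: equals 2*v1 + 3*v2
print(in_span([v1, v2], np.array([0.0, 0.0, 1.0])))  # False
```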


5. Linear Independence and Basis

A set \( \{\vec{v}_1, \dots, \vec{v}_n\} \) is linearly independent if:

\[
a_1\vec{v}_1 + \dots + a_n\vec{v}_n = \vec{0} \Rightarrow a_i = 0 \ \forall i
\]

A basis is a linearly independent set that spans the space; equivalently, it is a minimal spanning set, or a maximal linearly independent set.
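A quick way to test linear independence numerically is to check that the matrix with the candidate vectors as columns has full column rank. A minimal NumPy sketch (the helper name `is_independent` is our own):

```python
import numpy as np

def is_independent(vectors, tol=1e-10):
    """Vectors are linearly independent iff the matrix having them
    as columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A, tol) == len(vectors)

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(is_independent([e1, e2]))           # True: the standard basis of R^2
print(is_independent([e1, e2, e1 + e2]))  # False: any 3 vectors in R^2 are dependent
```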


6. Dimension and Coordinate Systems

Any two bases of a vector space contain the same number of vectors; that common number is called the dimension of the space.

Every vector can be uniquely expressed as a linear combination of basis vectors:

\[
\vec{v} = a_1\vec{e}_1 + a_2\vec{e}_2 + \dots + a_n\vec{e}_n
\]

The coefficients \( a_i \) are the coordinates of \( \vec{v} \).
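Finding the coordinates amounts to solving a linear system: if the columns of \( B \) are the basis vectors, the coordinate vector \( a \) satisfies \( Ba = \vec{v} \). A small NumPy illustration (the basis and vector are our own example values):

```python
import numpy as np

# Coordinates of v in the basis {b1, b2}: solve B @ a = v,
# where the columns of B are the basis vectors.
b1, b2 = np.array([1.0, 1.0]), np.array([1.0, -1.0])
B = np.column_stack([b1, b2])
v = np.array([3.0, 1.0])

a = np.linalg.solve(B, v)        # solution is unique because {b1, b2} is a basis
print(a)                         # [2. 1.], i.e. v = 2*b1 + 1*b2
np.testing.assert_allclose(B @ a, v)
```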


7. Inner Product: Definition and Properties

An inner product on a real vector space \( V \) is a function:

\[
\langle \cdot, \cdot \rangle: V \times V \to \mathbb{R}
\]

that satisfies three properties (verified numerically in the sketch after this list):

  1. Linearity in the first argument
  2. Symmetry: \( \langle \vec{u}, \vec{v} \rangle = \langle \vec{v}, \vec{u} \rangle \)
  3. Positive-definiteness: \( \langle \vec{v}, \vec{v} \rangle \geq 0 \) and equals zero only when \( \vec{v} = \vec{0} \)
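As a sanity check on these axioms, the sketch below (our own illustration, not a standard library routine) defines a non-standard inner product \( \langle \vec{u}, \vec{v} \rangle = \vec{u}^T A \vec{v} \) with \( A \) symmetric positive definite and verifies the three properties on sample vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # symmetric, eigenvalues (5 +/- sqrt(5))/2 > 0

def inner(u, v):
    """A valid (non-standard) inner product on R^2: <u, v> = u^T A v."""
    return u @ A @ v

u, v, w = rng.standard_normal((3, 2))
a, b = 1.5, -0.7

# 1. Linearity in the first argument
assert np.isclose(inner(a*u + b*w, v), a*inner(u, v) + b*inner(w, v))
# 2. Symmetry (holds because A is symmetric)
assert np.isclose(inner(u, v), inner(v, u))
# 3. Positive-definiteness (holds because A's eigenvalues are positive)
assert inner(u, u) > 0
print("all three properties hold on this sample")
```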

8. Examples of Inner Products

  • Dot product in \( \mathbb{R}^n \):
    \[
    \langle \vec{u}, \vec{v} \rangle = \sum_{i=1}^n u_i v_i
    \]
  • Function inner product (approximated numerically in the sketch after this list):
    \[
    \langle f, g \rangle = \int_a^b f(x)g(x) \, dx
    \]
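Assuming SciPy is available, the function inner product can be approximated by numerical quadrature; the sketch below (the helper name `inner` is ours) recovers the classical fact that \( \sin \) and \( \cos \) are orthogonal on \( [-\pi, \pi] \):

```python
import numpy as np
from scipy.integrate import quad

def inner(f, g, a, b):
    """<f, g> = integral of f(x) g(x) over [a, b], by adaptive quadrature."""
    value, _error = quad(lambda x: f(x) * g(x), a, b)
    return value

# sin and cos are orthogonal on [-pi, pi] under this inner product
print(inner(np.sin, np.cos, -np.pi, np.pi))  # ~0.0
print(inner(np.sin, np.sin, -np.pi, np.pi))  # ~pi = <sin, sin>
```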

9. Norms and Angles

The norm (length) of a vector is:

\[
|\vec{v}| = \sqrt{\langle \vec{v}, \vec{v} \rangle}
\]

The angle \( \theta \) between two vectors is defined via:

\[
\cos \theta = \frac{\langle \vec{u}, \vec{v} \rangle}{|\vec{u}| |\vec{v}|}
\]
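By the Cauchy-Schwarz inequality, \( |\langle \vec{u}, \vec{v} \rangle| \leq |\vec{u}| |\vec{v}| \), so the quotient above always lies in \( [-1, 1] \) and \( \theta \) is well defined. In floating point the quotient can drift just outside that range, hence the clip in this minimal NumPy sketch:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

norm_u = np.sqrt(u @ u)          # |u| = sqrt(<u, u>)
norm_v = np.sqrt(v @ v)
cos_theta = (u @ v) / (norm_u * norm_v)

# clip guards against rounding pushing the value just past +/-1
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
print(np.degrees(theta))         # ~45.0
```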


10. Orthogonality and Orthonormal Sets

Two vectors are orthogonal if:

\[
\langle \vec{u}, \vec{v} \rangle = 0
\]

A set is orthonormal if (see the check sketched after this list):

  • All vectors have unit norm
  • All vectors are mutually orthogonal
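Both conditions can be checked at once: if the vectors form the columns of a matrix \( Q \), the set is orthonormal exactly when \( Q^T Q = I \). A minimal NumPy check (the helper name is our own):

```python
import numpy as np

def is_orthonormal(vectors, tol=1e-10):
    """Orthonormal iff Q^T Q = I: diagonal entries test unit norms,
    off-diagonal entries test mutual orthogonality."""
    Q = np.column_stack(vectors)
    return np.allclose(Q.T @ Q, np.eye(len(vectors)), atol=tol)

s = 1 / np.sqrt(2)
print(is_orthonormal([np.array([s, s]), np.array([s, -s])]))         # True
print(is_orthonormal([np.array([1.0, 0.0]), np.array([1.0, 1.0])]))  # False
```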

11. Projection of Vectors

The projection of \( \vec{v} \) onto \( \vec{u} \) is:

\[
\text{proj}_{\vec{u}} \vec{v} = \frac{\langle \vec{v}, \vec{u} \rangle}{\langle \vec{u}, \vec{u} \rangle} \vec{u}
\]

Projections are used throughout least squares fitting, geometry, and physics.
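A direct NumPy transcription of the formula (the helper name `proj` is ours); the printed residual confirms that \( \vec{v} - \text{proj}_{\vec{u}} \vec{v} \) is orthogonal to \( \vec{u} \):

```python
import numpy as np

def proj(v, u):
    """Orthogonal projection of v onto the line spanned by u."""
    return (v @ u) / (u @ u) * u

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])
p = proj(v, u)
print(p)            # [3. 0.]
print((v - p) @ u)  # 0.0: the residual is orthogonal to u
```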


12. The Gram-Schmidt Process

The Gram-Schmidt process transforms a linearly independent set \( \{ \vec{v}_1, \dots, \vec{v}_n \} \) into an orthonormal set \( \{ \vec{u}_1, \dots, \vec{u}_n \} \) with the same span. A code sketch follows the algorithm below.

Algorithm:

  1. Set \( \vec{u}_1 = \vec{v}_1 / |\vec{v}_1| \)
  2. Subtract projections from subsequent vectors
  3. Normalize at each step
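A compact NumPy sketch of the process (our own implementation; it uses the "modified" update that subtracts projections from the running residual, which behaves better numerically than the textbook order):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in basis:
            w = w - (w @ u) * u               # subtract projection onto u (unit length)
        basis.append(w / np.linalg.norm(w))   # normalize at each step
    return basis

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
Q = np.column_stack(gram_schmidt(vs))
print(np.round(Q.T @ Q, 10))   # 2x2 identity: the output is orthonormal
```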

13. Orthogonal Complements and Decompositions

For a subspace \( W \subseteq V \), the orthogonal complement \( W^\perp \) consists of all vectors orthogonal to every vector in \( W \).

Any vector \( \vec{v} \in V \) can be decomposed as:

\[
\vec{v} = \vec{w} + \vec{w}^\perp
\]

where \( \vec{w} \in W \) and \( \vec{w}^\perp \in W^\perp \); in finite dimensions this decomposition always exists and is unique.
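A small NumPy illustration of the decomposition (the values are our own example): take \( W \) to be the line spanned by a unit vector \( \vec{q} \) in \( \mathbb{R}^3 \), project to get \( \vec{w} \), and check that the remainder lies in \( W^\perp \):

```python
import numpy as np

# W = the line spanned by the unit vector q in R^3; decompose v = w + w_perp
q = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
v = np.array([2.0, 0.0, 1.0])

w = (v @ q) * q                       # component of v in W
w_perp = v - w                        # remainder lies in W^perp

print(w, w_perp)                      # [1. 1. 1.] [ 1. -1.  0.]
print(np.isclose(w_perp @ q, 0.0))    # True: w_perp is orthogonal to W
np.testing.assert_allclose(w + w_perp, v)
```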


14. Inner Product Spaces vs Euclidean Spaces

  • Euclidean spaces: equipped with standard dot product
  • Inner product spaces: abstract vector spaces with a defined inner product (can be infinite-dimensional)

All Euclidean spaces are inner product spaces, but not vice versa.


15. Applications in Physics and Machine Learning

  • Quantum mechanics: Hilbert spaces, bras and kets
  • Mechanics: orthogonality of modes in vibration
  • ML & AI: projections, distances, similarity (e.g., cosine similarity)
  • Signal processing: Fourier series as orthonormal expansions

16. Conclusion

Vector spaces provide a linear framework for abstract reasoning, and inner products bring in geometric structure — angles, lengths, and orthogonality. Together, they underpin a vast array of theoretical and applied sciences.

From quantum physics to machine learning, mastering vector spaces and inner products is fundamental to advanced mathematical and physical reasoning.

