Vector spaces are fundamental structures in linear algebra, providing a framework for understanding and manipulating vectors. They generalize the familiar Euclidean spaces (R2, R3) and are central to applications in mathematics, physics, engineering, and especially data analytics (e.g., principal component analysis and many machine learning algorithms operate on vector spaces). This introductory part defines what a vector space is, introduces its core properties (axioms), and explains the concept of a subspace, which is a vector space contained within a larger one. Understanding these basics is essential for building a strong foundation in linear algebra for the GATE DA exam.
Key Concepts
#### 1. Definition of a Vector Space

A vector space (or linear space) consists of:
A non-empty set V whose elements are called vectors.
A field F whose elements are called scalars (e.g., R for real vector spaces, C for complex vector spaces).
Two operations:
* Vector Addition: A rule that associates with each pair of vectors u,v∈V a vector u+v∈V.
* Scalar Multiplication: A rule that associates with each scalar c∈F and vector u∈V a vector c⋅u∈V.
These operations must satisfy the following ten axioms for all u,v,w∈V and c,d∈F:
Axioms of Vector Addition:
Closure under Addition: u+v∈V
Commutativity: u+v=v+u
Associativity: (u+v)+w=u+(v+w)
Additive Identity (Zero Vector): There exists a unique vector 0∈V such that u+0=u for all u∈V.
Additive Inverse: For every u∈V, there exists a unique vector −u∈V such that u+(−u)=0.
Axioms of Scalar Multiplication:
Closure under Scalar Multiplication: c⋅u∈V
Distributivity (scalar over vector addition): c⋅(u+v)=c⋅u+c⋅v
Distributivity (scalar multiplication over scalar addition): (c+d)⋅u=c⋅u+d⋅u
Associativity of Scalar Multiplication: c⋅(d⋅u)=(cd)⋅u
Multiplicative Identity: 1⋅u=u, where 1 is the multiplicative identity in F.
#### 2. Examples of Vector Spaces

* The set of all n-tuples of real numbers, Rn, over the field R.
* The set of all m×n matrices with real entries, Mm×n(R), over the field R.
* The set of all polynomials of degree less than or equal to n, Pn(R), over the field R.
* The set of all continuous real-valued functions on an interval [a,b], C[a,b], over the field R.
#### 3. Definition of a Vector Subspace

A subspace W of a vector space V over a field F is a non-empty subset of V that is itself a vector space under the same operations of vector addition and scalar multiplication defined on V.
To check if a non-empty subset W⊆V is a subspace, we only need to verify the following three conditions (the other axioms are inherited from V):
Subspace Test (Three Conditions):
Contains the Zero Vector: The zero vector of V, 0V, must be in W. That is, 0V∈W. (This also ensures W is non-empty).
Closure under Addition: For any u,v∈W, their sum u+v must also be in W.
Closure under Scalar Multiplication: For any u∈W and any scalar c∈F, the scalar multiple c⋅u must also be in W.
Alternative Subspace Test (One Condition): A non-empty subset W of a vector space V is a subspace if and only if for any u,v∈W and any scalars c,d∈F, the linear combination c⋅u+d⋅v is in W. This single condition implies both closure under addition (set c=1,d=1) and closure under scalar multiplication (set d=0).
#### 4. Examples of Subspaces

* The set {(x,y,z)∈R3∣x=0} (the yz-plane) is a subspace of R3.
* The set {(x,y,z)∈R3∣x+y+z=0} (a plane through the origin) is a subspace of R3.
* The set of all n×n symmetric matrices is a subspace of Mn×n(R).
* The set of all polynomials of degree ≤k is a subspace of Pn(R) for k≤n.
Important Formulas (Axioms/Conditions)
#### Vector Space Axioms

For u,v,w∈V and c,d∈F:
u+v∈V
u+v=v+u
(u+v)+w=u+(v+w)
∃!0∈V s.t. u+0=u
∀u∈V,∃!−u∈V s.t. u+(−u)=0
c⋅u∈V
c⋅(u+v)=c⋅u+c⋅v
(c+d)⋅u=c⋅u+d⋅u
c⋅(d⋅u)=(cd)⋅u
1⋅u=u
#### Subspace Test Conditions

A non-empty subset W⊆V is a subspace if:
0V∈W
∀u,v∈W⟹u+v∈W
∀u∈W,c∈F⟹c⋅u∈W
Or, equivalently:
W is non-empty.
∀u,v∈W,c,d∈F⟹c⋅u+d⋅v∈W
Examples
Example 1: Is W={(x,y,z)∈R3∣x+y+z=0} a subspace of R3? Let F=R.
Zero Vector: The zero vector (0,0,0) satisfies 0+0+0=0. So, (0,0,0)∈W.
Closure under Addition: Let u=(x1,y1,z1)∈W and v=(x2,y2,z2)∈W.
This means x1+y1+z1=0 and x2+y2+z2=0. Then u+v=(x1+x2,y1+y2,z1+z2). To check if u+v∈W, we sum its components:
(x1+x2)+(y1+y2)+(z1+z2)=(x1+y1+z1)+(x2+y2+z2)=0+0=0
Thus, u+v∈W.
Closure under Scalar Multiplication: Let u=(x,y,z)∈W and c∈R.
This means x+y+z=0. Then c⋅u=(cx,cy,cz). To check if c⋅u∈W, we sum its components:
cx+cy+cz=c(x+y+z)=c(0)=0
Thus, c⋅u∈W. Since all three conditions are met, W is a subspace of R3.
Example 2: Is W={(x,y,z)∈R3∣x+y+z=1} a subspace of R3?
Zero Vector: The zero vector (0,0,0) does not satisfy 0+0+0=1.
Since the zero vector is not in W, W is not a subspace of R3. (No need to check other conditions).
Example 3: Is W={(x,y,z)∈R3∣x≥0} a subspace of R3?
Zero Vector: The zero vector (0,0,0) satisfies 0≥0. So, (0,0,0)∈W.
Closure under Addition: Let u=(1,0,0)∈W and v=(2,0,0)∈W.
u+v=(3,0,0). Since 3≥0, u+v∈W. (This condition holds for these specific vectors).
Closure under Scalar Multiplication: Let u=(1,0,0)∈W and c=−1∈R.
Then c⋅u=(−1)⋅(1,0,0)=(−1,0,0). However, −1<0, so the condition x≥0 fails and c⋅u∉W. Since closure under scalar multiplication fails, W is not a subspace of R3.
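These membership arguments can also be spot-checked numerically. The sketch below assumes NumPy is available; the helper names (spot_check_subspace, sample_plane, etc.) are illustrative, not from any library. Note that passing random checks only suggests a set is a subspace, while a single failure definitively disproves it:

```python
import numpy as np

rng = np.random.default_rng(0)

def spot_check_subspace(sample, member, dim, trials=200):
    """Randomly spot-check the three subspace conditions.

    `sample()` returns a random element of the set; `member(v)` tests membership.
    """
    if not member(np.zeros(dim)):
        return False  # zero vector missing -> not a subspace
    for _ in range(trials):
        u, v, c = sample(), sample(), rng.normal()
        if not member(u + v) or not member(c * u):
            return False  # closure under + or scalar multiplication fails
    return True

# W = {(x, y, z) : x + y + z = 0}: parametrize members as (x, y, -x - y).
def sample_plane():
    x, y = rng.normal(), rng.normal()
    return np.array([x, y, -x - y])

def member_plane(v):
    return abs(v.sum()) < 1e-9

# W = {(x, y, z) : x >= 0}: closed under addition, but scaling by -1 escapes it.
def sample_halfspace():
    return np.array([abs(rng.normal()), rng.normal(), rng.normal()])

def member_halfspace(v):
    return v[0] >= 0

print(spot_check_subspace(sample_plane, member_plane, 3))          # True
print(spot_check_subspace(sample_halfspace, member_halfspace, 3))  # False
```

This mirrors the reasoning in Examples 1 and 3: the plane through the origin passes every check, while the half-space fails as soon as a negative scalar is drawn.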
Important Points/Tips for Exam Preparation
* Zero Vector is Key: The quickest way to rule out a set as a subspace is to check whether it contains the zero vector. If 0V∉W, then W is not a subspace.
* Closure is Essential: Subspaces must be "closed" under both vector addition and scalar multiplication. Performing these operations on elements within the subspace must always result in another element within the same subspace.
* Geometric Interpretation: In R2 and R3, subspaces are geometrically restricted:
  * In R2: the origin {(0,0)}, any line passing through the origin, and R2 itself.
  * In R3: the origin {(0,0,0)}, any line passing through the origin, any plane passing through the origin, and R3 itself.
  * Any set that does not pass through the origin (e.g., the line y=x+1) or is not "flat" (e.g., a sphere) cannot be a subspace.
* Linear Combinations: Remember the single, powerful test: a non-empty subset W is a subspace if and only if it is closed under linear combinations (c⋅u+d⋅v∈W). This is often more efficient for proofs.
* PYQ Context: GATE questions often involve identifying subspaces from a list of options (Multiple Select Questions - MSQ). Practice applying the three subspace conditions rigorously to various types of sets (e.g., sets defined by equations, inequalities, or specific properties of vectors/matrices).
* Field Matters: While most GATE questions use R as the field, be aware that the field F is part of the vector space definition.
* Trivial Subspaces: Every vector space V has at least two subspaces: the zero vector space {0V} and V itself. These are called the trivial subspaces.
---
Vector Spaces: Part 2 - Core Concepts
Chapter Overview
This section delves into the fundamental building blocks of linear algebra: vector spaces and their essential properties. We will define what constitutes a vector space, explore the concept of subspaces, and understand how vectors combine through linear combinations. Key topics include linear independence, basis, dimension, and the crucial fundamental subspaces associated with matrices, culminating in the Rank-Nullity Theorem. These concepts are vital for understanding the structure and properties of linear transformations and solving systems of linear equations.
Key Concepts
Vector Space Definition
A vector space V over a field F (typically R or C) is a set of objects called vectors, together with two operations:
* Vector Addition: For any u,v∈V, u+v∈V.
* Scalar Multiplication: For any c∈F and v∈V, cv∈V.
These operations must satisfy the following 10 axioms for all u,v,w∈V and c,d∈F:
1. Closure under addition: u+v∈V
2. Commutativity of addition: u+v=v+u
3. Associativity of addition: (u+v)+w=u+(v+w)
4. Existence of zero vector: There exists a zero vector 0∈V such that v+0=v for all v∈V.
5. Existence of additive inverse: For every v∈V, there exists an additive inverse −v∈V such that v+(−v)=0.
6. Closure under scalar multiplication: cv∈V
7. Distributivity over vector addition: c(u+v)=cu+cv
8. Distributivity over scalar addition: (c+d)v=cv+dv
9. Associativity of scalar multiplication: c(dv)=(cd)v
10. Identity for scalar multiplication: 1v=v (where 1 is the multiplicative identity in F).
Examples:
* Rn (the set of all n-tuples of real numbers)
* Pn(x) (the set of all polynomials of degree at most n)
* Mm×n(R) (the set of all m×n matrices with real entries)
Subspaces
A subset W of a vector space V is called a subspace of V if W itself is a vector space under the same operations of vector addition and scalar multiplication defined on V. To check if a non-empty subset W⊆V is a subspace, we only need to verify three conditions:
1. Zero vector: The zero vector of V is in W (0∈W). (This implies W is non-empty.)
2. Closure under addition: For any u,v∈W, u+v∈W.
3. Closure under scalar multiplication: For any c∈F and v∈W, cv∈W.
Examples:
* The set of all vectors in R3 of the form (a,b,0) is a subspace of R3.
* The set of all symmetric n×n matrices is a subspace of Mn×n(R).
* The set of all solutions to a homogeneous system of linear equations Ax=0 is a subspace (the null space).
Linear Combination
A vector v∈V is a linear combination of vectors v1,v2,…,vk∈V if there exist scalars c1,c2,…,ck∈F such that:
v=c1v1+c2v2+⋯+ckvk
Example: In R3, the vector (5,1,3) is a linear combination of (1,0,0), (0,1,0), and (0,0,1) because (5,1,3)=5(1,0,0)+1(0,1,0)+3(0,0,1).
Span of a Set of Vectors
The span of a set of vectors S={v1,v2,…,vk} in a vector space V, denoted as span(S) or span{v1,…,vk}, is the set of all possible linear combinations of these vectors.
span(S)={c1v1+c2v2+⋯+ckvk∣ci∈F for i=1,…,k}
Property: span(S) is always a subspace of V. It is the smallest subspace of V that contains all vectors in S.
Example: In R3, span{(1,0,0),(0,1,0)} is the xy-plane, which is a subspace of R3.
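Membership in a span reduces to a rank question: v∈span(S) if and only if appending v to the matrix whose columns are the spanning vectors leaves the rank unchanged. A NumPy sketch (in_span is an illustrative helper, not a library function):

```python
import numpy as np

def in_span(vectors, v, tol=1e-10):
    """v is in span(vectors) iff appending v does not increase the matrix rank."""
    S = np.column_stack(vectors)
    r_before = np.linalg.matrix_rank(S, tol=tol)
    r_after = np.linalg.matrix_rank(np.column_stack([S, v]), tol=tol)
    return bool(r_before == r_after)

e1, e2 = np.array([1., 0., 0.]), np.array([0., 1., 0.])
print(in_span([e1, e2], np.array([3., -2., 0.])))  # True: lies in the xy-plane
print(in_span([e1, e2], np.array([0., 0., 1.])))   # False: leaves the xy-plane
```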
Linear Independence and Dependence
A set of vectors S={v1,v2,…,vk} in a vector space V is said to be:
* Linearly Independent: If the only solution to the vector equation c1v1+c2v2+⋯+ckvk=0 is c1=c2=⋯=ck=0.
* Linearly Dependent: If there exist scalars c1,c2,…,ck, not all zero, such that c1v1+c2v2+⋯+ckvk=0. This means at least one vector in the set can be expressed as a linear combination of the others.
Properties:
* Any set containing the zero vector is linearly dependent.
* If a set of vectors is linearly independent, any subset of these vectors is also linearly independent.
* If a set of vectors is linearly dependent, any superset containing these vectors is also linearly dependent.
Examples:
* In R2, {(1,0),(0,1)} is linearly independent.
* In R2, {(1,0),(0,1),(2,3)} is linearly dependent because (2,3)=2(1,0)+3(0,1).
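For vectors in Rn, independence can be checked computationally: stack the vectors as columns and compare the matrix rank to the number of vectors. A NumPy sketch (the helper name is illustrative), reproducing the two examples above:

```python
import numpy as np

def is_linearly_independent(vectors):
    """A set is independent iff the matrix with the vectors as columns
    has full column rank."""
    A = np.column_stack(vectors)
    return bool(np.linalg.matrix_rank(A) == A.shape[1])

# {(1,0), (0,1)} is independent:
print(is_linearly_independent([np.array([1., 0.]), np.array([0., 1.])]))  # True

# {(1,0), (0,1), (2,3)} is dependent, since (2,3) = 2(1,0) + 3(0,1):
print(is_linearly_independent([np.array([1., 0.]),
                               np.array([0., 1.]),
                               np.array([2., 3.])]))                      # False
```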
Basis of a Vector Space
A set of vectors B={v1,v2,…,vn} is a basis for a vector space V if it satisfies two conditions:
1. B is linearly independent.
2. span(B)=V.
Every vector in V can then be expressed uniquely as a linear combination of the basis vectors.
Examples:
* Standard Basis for Rn: E={e1,e2,…,en}, where ei is a vector with 1 in the i-th position and 0 elsewhere. For R3, {(1,0,0),(0,1,0),(0,0,1)}.
* Standard Basis for Pn(x): {1,x,x²,…,xⁿ}.
* Standard Basis for Mm×n(R): The set of m×n matrices with a single 1 in one position and 0 elsewhere.
Dimension of a Vector Space
The dimension of a vector space V, denoted as dim(V), is the number of vectors in any basis for V. If V={0}, then dim(V)=0.
Properties:
* If dim(V)=n, then any set of n linearly independent vectors in V forms a basis for V.
* If dim(V)=n, then any set of more than n vectors in V is linearly dependent.
* If dim(V)=n, then any set of fewer than n vectors cannot span V.
Fundamental Subspaces of a Matrix
For an m×n matrix A:
* Column Space (Image Space): Col(A) or Im(A). This is the span of the column vectors of A. It is a subspace of Rm.
Col(A)={Ax∣x∈Rn}
* Row Space: Row(A). This is the span of the row vectors of A. It is a subspace of Rn.
Row(A)={ATy∣y∈Rm}
* Null Space (Kernel): Null(A) or Ker(A). This is the set of all solutions to the homogeneous equation Ax=0. It is a subspace of Rn.
Null(A)={x∈Rn∣Ax=0}
* Left Null Space: Null(AT). This is the null space of the transpose of A. It is the set of all solutions to ATy=0. It is a subspace of Rm.
Null(AT)={y∈Rm∣ATy=0}
Rank and Nullity
* Rank of A: rank(A) is the dimension of the column space of A, which is equal to the dimension of the row space of A. It is also the number of pivot positions in the row echelon form of A.
rank(A)=dim(Col(A))=dim(Row(A))
* Nullity of A: nullity(A) is the dimension of the null space of A. It is the number of free variables in the solution to Ax=0.
nullity(A)=dim(Null(A))
Rank-Nullity Theorem
For an m×n matrix A, the sum of its rank and nullity is equal to the number of columns (n).
rank(A)+nullity(A)=n
Important Relations: * rank(A)=rank(AT). * dim(Col(A))+dim(Null(AT))=m.
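The theorem is easy to verify numerically. A NumPy sketch (the example matrix is an arbitrary rank-deficient choice), counting nonzero singular values to get the rank:

```python
import numpy as np

# A deliberately rank-deficient 2x4 matrix: row 2 = 2 * row 1, so rank(A) = 1.
A = np.array([[1., 2., 0., 3.],
              [2., 4., 0., 6.]])

m, n = A.shape
rank = np.linalg.matrix_rank(A)

# nullity = n - (number of nonzero singular values) = n - rank
s = np.linalg.svd(A, compute_uv=False)
nullity = n - int(np.sum(s > 1e-10))

print(rank, nullity)        # 1 3
assert rank + nullity == n  # Rank-Nullity Theorem: rank + nullity = n
```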
Important Points/Tips for Exam Preparation
* Subspace Conditions: Thoroughly understand and be able to apply the three conditions for a subset to be a subspace. Many PYQs test this directly.
* Linear Independence: Practice determining if a set of vectors is linearly independent. This often involves solving a homogeneous system of equations.
* Basis and Dimension: Be proficient in finding a basis for a given vector space or subspace and determining its dimension. This might involve row reducing a matrix.
* Fundamental Subspaces: Clearly distinguish between the column space, row space, and null space. Understand their definitions, how to find a basis for each, and their respective dimensions.
* Rank-Nullity Theorem: This theorem is extremely powerful. Use it to relate the dimensions of the fundamental subspaces and to quickly find one dimension when the others are known.
* Matrix Operations: A strong grasp of matrix operations (row reduction, matrix multiplication) is essential for solving problems related to vector spaces.
* Conceptual Understanding: Don't just memorize definitions; understand the underlying concepts. For instance, why is the zero vector crucial for a subspace? Why does linear independence matter for a basis?
---
Part 3: Advanced Topics in Vector Spaces
Chapter Overview
This part delves into advanced concepts building upon the foundational understanding of vector spaces and subspaces. We will explore linear transformations, their properties, and associated spaces like the kernel and image. Furthermore, we will introduce eigenvalues and eigenvectors, crucial for understanding the intrinsic properties of linear operators. Finally, we will cover inner product spaces, which equip vector spaces with geometric notions of length, angle, and orthogonality.
Key Concepts
#### 1. Linear Transformations
A function T:V→W between two vector spaces V and W (over the same field F) is a linear transformation if for all u,v∈V and c∈F:
1. T(u+v)=T(u)+T(v) (Additivity)
2. T(c⋅u)=c⋅T(u) (Homogeneity)
* Kernel (Null Space) of T: The set of all vectors in V that are mapped to the zero vector in W.
Ker(T)={v∈V∣T(v)=0W}
Ker(T) is a subspace of V.
* Image (Range Space) of T: The set of all vectors in W that are images of some vector in V.
Im(T)={w∈W∣w=T(v) for some v∈V}
Im(T) is a subspace of W.
* Rank of T: The dimension of the image space.
rank(T)=dim(Im(T))
* Nullity of T: The dimension of the kernel space.
nullity(T)=dim(Ker(T))
* Rank-Nullity Theorem: For a linear transformation T:V→W, where V is a finite-dimensional vector space:
dim(V)=rank(T)+nullity(T)
* Matrix Representation of Linear Transformations: If V and W are finite-dimensional vector spaces with ordered bases BV={v1,…,vn} and BW={w1,…,wm} respectively, then T:V→W can be represented by an m×n matrix A. The j-th column of A is the coordinate vector of T(vj) with respect to BW.
[T(v)]BW=A[v]BV
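This recipe (the j-th column of A is the coordinate vector of T(vj)) can be illustrated in NumPy with standard bases, using the map T(x,y)=(x+y,x−y,2x) that is also worked through in the Examples below:

```python
import numpy as np

def T(v):
    """T: R^2 -> R^3, T(x, y) = (x + y, x - y, 2x)."""
    x, y = v
    return np.array([x + y, x - y, 2 * x])

# With standard bases, the j-th column of the matrix A is T(e_j).
A = np.column_stack([T(np.array([1.0, 0.0])), T(np.array([0.0, 1.0]))])
print(A)

# Check the defining identity [T(v)] = A [v] on an arbitrary vector:
v = np.array([3.0, -1.0])
assert np.allclose(A @ v, T(v))
```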
#### 2. Eigenvalues and Eigenvectors
* Definition: For a linear operator T:V→V (or an n×n matrix A), a non-zero vector v∈V is an eigenvector of T (or A) if T(v)=λv (or Av=λv) for some scalar λ. The scalar λ is called an eigenvalue corresponding to the eigenvector v.
* Characteristic Equation: For a matrix A, eigenvalues are the roots of the characteristic equation:
det(A−λI)=0
where I is the identity matrix.
* Eigenspace: For an eigenvalue λ, the set Eλ={v∈V∣Av=λv} is a subspace of V, called the eigenspace corresponding to λ. It consists of all eigenvectors corresponding to λ and the zero vector.
* Algebraic Multiplicity (AM): The multiplicity of λ as a root of the characteristic polynomial.
* Geometric Multiplicity (GM): The dimension of the eigenspace Eλ, i.e., dim(Eλ)=nullity(A−λI).
* Properties:
  * For any eigenvalue λ, 1≤GM(λ)≤AM(λ).
  * Eigenvectors corresponding to distinct eigenvalues are linearly independent.
  * A matrix A is diagonalizable if and only if for every eigenvalue λ, AM(λ)=GM(λ).
  * The sum of the eigenvalues (counting algebraic multiplicities) is the trace of the matrix:
∑λi=tr(A)
  * The product of the eigenvalues (counting algebraic multiplicities) is the determinant of the matrix:
∏λi=det(A)
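Both relations are easy to sanity-check numerically, for instance with NumPy:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
eigvals = np.linalg.eigvals(A)

print(np.round(np.sort(eigvals.real), 6))  # [1. 3.]

# Sum of eigenvalues = trace (here 2 + 2 = 4):
assert np.isclose(eigvals.sum().real, np.trace(A))
# Product of eigenvalues = determinant (here 2*2 - 1*1 = 3):
assert np.isclose(eigvals.prod().real, np.linalg.det(A))
```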
#### 3. Inner Product Spaces
* Definition: An inner product on a vector space V over R (or C) is a function ⟨⋅,⋅⟩:V×V→R (or C) satisfying, for all u,v,w∈V and c∈F:
  1. ⟨u+v,w⟩=⟨u,w⟩+⟨v,w⟩ (Additivity in the first argument)
  2. ⟨cu,v⟩=c⟨u,v⟩ (Homogeneity in the first argument)
  3. ⟨u,v⟩ equals the complex conjugate of ⟨v,u⟩ (Conjugate symmetry; for real spaces this reduces to ⟨u,v⟩=⟨v,u⟩)
  4. ⟨u,u⟩≥0, and ⟨u,u⟩=0 ⟺ u=0 (Positive-definiteness)
A vector space equipped with an inner product is called an inner product space.
* Standard Inner Product (Dot Product) in Rn: For u=(u1,…,un) and v=(v1,…,vn):
⟨u,v⟩ = u⋅v = u1v1 + u2v2 + ⋯ + unvn
* Norm (Length) of a Vector: Induced by the inner product:
∥v∥ = √⟨v,v⟩
* Distance between Vectors:
d(u,v)=∥u−v∥
* Orthogonality: Two vectors u,v are orthogonal if ⟨u,v⟩=0.
* Orthonormal Set: A set of vectors {v1,…,vk} is orthonormal if ⟨vi,vj⟩=δij (Kronecker delta, which is 1 if i=j and 0 if i≠j).
* Gram-Schmidt Orthonormalization Process: A method to construct an orthonormal basis {e1,…,ek} from any given basis {v1,…,vk} of an inner product space.
1. Set u1=v1.
2. For j=2,…,k, compute:
uj = vj − (⟨vj,u1⟩/⟨u1,u1⟩)u1 − ⋯ − (⟨vj,uj−1⟩/⟨uj−1,uj−1⟩)uj−1
3. Normalize each uj to get ej = uj/∥uj∥. The set {e1,…,ek} is an orthonormal basis.
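The process is short to implement. A NumPy sketch of classical Gram-Schmidt (the function name is illustrative; it assumes the input vectors are linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: returns an orthonormal basis for span(vectors).
    Assumes the inputs are linearly independent."""
    ortho = []
    for v in vectors:
        u = v.astype(float)
        for e in ortho:
            u = u - np.dot(v, e) * e  # remove the component along each earlier e_i
        ortho.append(u / np.linalg.norm(u))
    return ortho

basis = [np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])]
E = gram_schmidt(basis)

# Verify orthonormality: the Gram matrix <e_i, e_j> should be the identity.
gram = np.array([[np.dot(a, b) for b in E] for a in E])
assert np.allclose(gram, np.eye(3))
```

Since each projection is taken against already-normalized vectors, the division by ⟨ui,ui⟩ from the formula above simplifies away.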
Examples
* Linear Transformation: Let T:R2→R3 be defined by T(x,y)=(x+y,x−y,2x).
  * To find Ker(T), set T(x,y)=(0,0,0): x+y=0, x−y=0, 2x=0. Solving these equations yields x=0, y=0. Thus, Ker(T)={(0,0)}, and nullity(T)=0.
  * To find Im(T), consider the images of the standard basis vectors: T(1,0)=(1,1,2) and T(0,1)=(1,−1,0). These vectors span Im(T). Since they are linearly independent, rank(T)=2.
  * Verify Rank-Nullity Theorem: dim(R2)=2, and rank(T)+nullity(T)=2+0=2. The theorem holds.
* Eigenvalues/Eigenvectors: Consider the matrix
A = [ 2  1 ]
    [ 1  2 ]
* Characteristic equation:
det(A−λI) = det [ 2−λ   1  ]
                [  1   2−λ ] = (2−λ)² − 1 = 0
(2−λ)² = 1
2−λ = ±1
This gives eigenvalues λ1=1 and λ2=3.
* For λ1=1: Solve (A−I)v=0
[ 1  1 ] [x]   [0]
[ 1  1 ] [y] = [0]
This implies x+y=0. An eigenvector is
v1 = [  1 ]
     [ −1 ]
* For λ2=3: Solve (A−3I)v=0
[ −1   1 ] [x]   [0]
[  1  −1 ] [y] = [0]
This implies −x+y=0. An eigenvector is
v2 = [ 1 ]
     [ 1 ]
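The worked example can be cross-checked with NumPy's eigendecomposition:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
w, V = np.linalg.eig(A)
w, V = w.real, V.real          # A is symmetric, so the eigenvalues are real
idx = np.argsort(w)            # order the eigenvalues as (1, 3)
w, V = w[idx], V[:, idx]

assert np.allclose(w, [1., 3.])
for lam, v in zip(w, V.T):
    assert np.allclose(A @ v, lam * v)  # A v = lambda v for each pair

# The lambda = 1 eigenvector is proportional to (1, -1): its components sum to 0.
assert np.isclose(V[:, 0].sum(), 0.0)
```

NumPy returns unit-length eigenvectors, so they match (1,−1) and (1,1) only up to scaling, which is exactly the freedom an eigenvector has.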
* Gram-Schmidt Process: Given the basis {(1,1,0),(1,0,1),(0,1,1)} for R3, let v1=(1,1,0), v2=(1,0,1), v3=(0,1,1).
1. Set u1=v1=(1,1,0).
2. u2 = v2 − (⟨v2,u1⟩/⟨u1,u1⟩)u1 = (1,0,1) − (1/2)(1,1,0) = (1/2, −1/2, 1).
3. u3 = v3 − (⟨v3,u1⟩/⟨u1,u1⟩)u1 − (⟨v3,u2⟩/⟨u2,u2⟩)u2 = (0,1,1) − (1/2)(1,1,0) − (1/3)(1/2,−1/2,1) = (−2/3, 2/3, 2/3).
4. Normalizing each uj gives the orthonormal basis e1 = (1/√2)(1,1,0), e2 = (1/√6)(1,−1,2), e3 = (1/√3)(−1,1,1).
Important Points/Tips for Exam Preparation
* Understand Definitions: Be precise with the definitions of linear transformation, kernel, image, eigenvalues, eigenvectors, and inner product. Many questions test conceptual understanding.
* Subspace Verification: Remember that Ker(T) and Im(T) are always subspaces. This is a common type of question (as seen in PYQs).
* Rank-Nullity Theorem: This is a fundamental theorem. Know how to apply it to find the dimension of the kernel or image, or to relate properties of the domain and codomain.
* Eigenvalue Properties: The relations between eigenvalues and the trace/determinant of a matrix are very useful for quick calculations and verification.
* Diagonalization: Understand the conditions for a matrix to be diagonalizable, especially the relationship between algebraic and geometric multiplicities.
* Orthogonality: This is a crucial concept in inner product spaces. Be familiar with the Gram-Schmidt orthonormalization process and its application.
* Practice Problems: Work through various examples involving finding kernels, images, and eigenvalues, and applying Gram-Schmidt.
* PYQ Analysis: Previous Year Questions often test the understanding of definitions and basic properties, for instance identifying subspaces (which Ker(T) and Im(T) are) or properties of linear transformations. Pay attention to the wording of "select all choices that are subspaces" or "which of the following statements is/are correct".
---
Part 4: Applications (Examples)
Chapter Overview
This section delves into the practical application of vector space theory, focusing on how to identify and construct vector spaces and, more commonly, subspaces within larger vector spaces like Rn. Understanding these applications is crucial for solving problems related to linear systems, transformations, and data analysis. The primary focus will be on providing concrete examples and methods to verify if a given set is indeed a subspace, a frequently tested concept in exams.
Key Concepts
Subspace Definition: A non-empty subset W of a vector space V is a subspace of V if it satisfies the following three conditions:
* Zero Vector: The zero vector of V is in W.
* Closure under Addition: For any u,v∈W, their sum u+v is also in W.
* Closure under Scalar Multiplication: For any u∈W and any scalar c∈R, the scalar multiple cu is also in W.
Span of a Set of Vectors: The set of all possible linear combinations of a given set of vectors {v1,v2,…,vk} in a vector space V forms a subspace of V. This is denoted as span{v1,v2,…,vk}.
Null Space (Kernel) of a Matrix: For an m×n matrix A, the set of all solutions to the homogeneous linear system Ax=0 forms a subspace of Rn. This is called the null space of A, denoted as Null(A) or Ker(A).
Column Space (Image) of a Matrix: For an m×n matrix A, the set of all linear combinations of the column vectors of A forms a subspace of Rm. This is called the column space of A, denoted as Col(A) or Im(A).
Important Formulas
* Subspace Conditions: Let W⊆V be a non-empty subset. W is a subspace if:
  1. 0V∈W
  2. ∀u,v∈W, u+v∈W
  3. ∀u∈W, ∀c∈R, cu∈W
* Linear Combination: A vector v is a linear combination of vectors v1,…,vk if there exist scalars c1,…,ck such that:
v=c1v1+c2v2+⋯+ckvk
* Span of a Set: The span of a set of vectors S={v1,…,vk} is defined as:
span(S)={c1v1+⋯+ckvk∣c1,…,ck∈R}
* Null Space: For an m×n matrix A:
Null(A)={x∈Rn∣Ax=0}
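Numerically, an orthonormal basis for Null(A) can be read off from the singular value decomposition: the right singular vectors belonging to (numerically) zero singular values span the null space. A NumPy sketch (null_space_basis is an illustrative helper; SciPy offers a similar scipy.linalg.null_space):

```python
import numpy as np

def null_space_basis(A, tol=1e-10):
    """Orthonormal basis of Null(A): right singular vectors whose
    singular values are numerically zero."""
    A = np.atleast_2d(np.asarray(A, dtype=float))
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T  # columns span Null(A)

# The plane x - 2y + 3z = 0 from Example 2 below is Null([1 -2 3]).
A = np.array([[1., -2., 3.]])
N = null_space_basis(A)
assert N.shape == (3, 2)      # nullity = n - rank = 3 - 1 = 2
assert np.allclose(A @ N, 0)  # every basis vector solves Ax = 0
```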
Examples
Here are various examples illustrating how to determine if a given set is a subspace of Rn.
Example 1: Line through the origin in R2. Let W={(x,y)∈R2∣y=2x}.
Zero Vector: (0,0) satisfies 0=2(0), so (0,0)∈W.
Closure under Addition: Let u=(x1,y1) and v=(x2,y2) be in W. Then y1=2x1 and y2=2x2.
u+v=(x1+x2,y1+y2). We check if y1+y2=2(x1+x2). y1+y2=2x1+2x2=2(x1+x2). So, u+v∈W.
Closure under Scalar Multiplication: Let u=(x,y)∈W and c∈R. Then y=2x.
cu=(cx,cy). We check if cy=2(cx). cy=c(2x)=2(cx). So, cu∈W. Since all three conditions are met, W is a subspace of R2.
Example 2: Plane through the origin in R3. Let W={(x,y,z)∈R3∣x−2y+3z=0}. This is the solution set of a homogeneous linear equation.
Zero Vector: (0,0,0) satisfies 0−2(0)+3(0)=0, so (0,0,0)∈W.
Closure under Addition: Let u=(x1,y1,z1) and v=(x2,y2,z2) be in W.
Then x1−2y1+3z1=0 and x2−2y2+3z2=0. u+v=(x1+x2,y1+y2,z1+z2). (x1+x2)−2(y1+y2)+3(z1+z2)=(x1−2y1+3z1)+(x2−2y2+3z2)=0+0=0. So, u+v∈W.
Closure under Scalar Multiplication: Let u=(x,y,z)∈W and c∈R.
Then x−2y+3z=0. cu=(cx,cy,cz). (cx)−2(cy)+3(cz)=c(x−2y+3z)=c(0)=0. So, cu∈W. Thus, W is a subspace of R3.
Example 3: Span of vectors. Let W = span{(1,0,1), (0,1,1)} in R3, and write v1=(1,0,1), v2=(0,1,1). By definition, the span of any set of vectors is always a subspace.
Zero Vector:
0⋅v1 + 0⋅v2 = (0,0,0) ∈ W
Closure under Addition: Let u,v∈W. Then u=c1v1+c2v2 and v=d1v1+d2v2 for some scalars c1,c2,d1,d2.
u+v = (c1+d1)v1 + (c2+d2)v2, which is a linear combination of v1,v2, so u+v∈W.
Closure under Scalar Multiplication: Let u∈W and k∈R. Then u=c1v1+c2v2.
ku=k(c1v1+c2v2)=(kc1)v1+(kc2)v2
This is a linear combination of v1,v2, so ku∈W. Thus, W is a subspace of R3.
Example 4: Non-subspace (missing zero vector). Let W={(x,y)∈R2∣y=2x+1}.
Zero Vector: For (0,0), 0=2(0)+1 is false (0≠1). So, (0,0)∉W.
Since the zero vector is not in W, W is not a subspace of R2. (No need to check other conditions).
Example 5: Non-subspace (not closed under scalar multiplication). Let W={(x,y)∈R2∣x≥0,y≥0} (the first quadrant).
Zero Vector: (0,0) satisfies 0≥0,0≥0, so (0,0)∈W.
Closure under Addition: Let u=(1,2) and v=(3,4). Both are in W.
u+v=(1+3,2+4)=(4,6). Since 4≥0 and 6≥0, u+v∈W. (In fact, closure under addition holds for all of W, since sums of non-negative numbers are non-negative; the failure lies in scalar multiplication.)
Closure under Scalar Multiplication: Let u=(1,2)∈W and c=−1∈R.
cu=(−1)(1,2)=(−1,−2). Since −1<0 and −2<0, cu∉W. Since W is not closed under scalar multiplication, W is not a subspace of R2.
Example 6: Non-subspace (not closed under addition and scalar multiplication). Let W={(x,y)∈R2∣y=x²}.
Zero Vector: (0,0) satisfies 0=02, so (0,0)∈W.
Closure under Addition: Let u=(1,1) and v=(2,4). Both are in W (1=1², 4=2²).
u+v=(1+2,1+4)=(3,5). For (3,5) to be in W, 5 must equal 3²=9. Since 5≠9, u+v∉W. Since W is not closed under addition, W is not a subspace of R2.
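The one-step linear-combination test also lends itself to a quick computational counterexample search: any combination c⋅u+d⋅v that leaves the set disproves the subspace claim. The helper and sample points below are illustrative choices, assuming NumPy:

```python
import numpy as np

def find_counterexample(member, samples, coeffs):
    """Search for a violation of the one-step test: c*u + d*v must stay in the set.
    Returning a witness proves the set is NOT a subspace."""
    for u in samples:
        for v in samples:
            for c, d in coeffs:
                if not member(c * u + d * v):
                    return c, u, d, v
    return None

def on_parabola(p):
    """Membership test for W = {(x, y) : y = x^2}."""
    return bool(np.isclose(p[1], p[0] ** 2))

points = [np.array([1., 1.]), np.array([2., 4.])]  # both lie on the parabola
witness = find_counterexample(on_parabola, points, [(1., 1.), (-1., 0.)])
# First hit: (1,1) + (1,1) = (2,2), and 2 != 2^2 = 4, so W fails closure.
assert witness is not None
```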
Important Points/Tips for Exam Preparation
* Always Check All Three Conditions: For a set to be a subspace, it must satisfy all three conditions (zero vector, closure under addition, closure under scalar multiplication). If any one fails, it is not a subspace.
* Zero Vector is Key: The easiest condition to check first is often the zero vector. If 0V∉W, then W is immediately not a subspace.
* Homogeneous vs. Non-homogeneous: The solution set of a homogeneous linear system (Ax=0) is always a subspace (the null space). The solution set of a non-homogeneous linear system (Ax=b with b≠0) is never a subspace (because it does not contain the zero vector).
* Geometric Intuition:
  * In R2, subspaces are the origin, lines through the origin, and R2 itself.
  * In R3, subspaces are the origin, lines through the origin, planes through the origin, and R3 itself.
  * Any set that does not pass through the origin (e.g., y=mx+c with c≠0) cannot be a subspace.
* Span is a Subspace: The span of any set of vectors is inherently a subspace. You don't need to prove it from scratch every time; just state it.
* Common Non-Subspaces: Be familiar with common examples that fail the conditions:
  * Sets not containing the origin (e.g., x=1).
  * Sets defined by inequalities (e.g., x≥0).
  * Sets defined by non-linear equations (e.g., y=x², xy=0).
* Practice with R3: Many PYQs involve identifying subspaces of R3. Practice applying the conditions to lines, planes, and other subsets of R3.
* Vector Notation: Be comfortable with both coordinate notation (x,y,z) and column vector notation (x, y, z)ᵀ.
---
Key Points
* Vector Space Definition: A set V is a vector space over a field F if it is closed under vector addition and scalar multiplication and satisfies 10 axioms (associativity, commutativity, existence of a zero vector, additive inverses, distributive properties, etc.).
* Subspace Definition: A non-empty subset W of a vector space V is a subspace if it is itself a vector space under the same operations.
* Subspace Test (Two-Step): A non-empty subset W⊆V is a subspace if:
  1. For any u,v∈W, u+v∈W (closure under addition).
  2. For any u∈W and scalar c∈F, c⋅u∈W (closure under scalar multiplication).
* Subspace Test (One-Step): A non-empty subset W⊆V is a subspace if for any u,v∈W and scalars c1,c2∈F, c1u+c2v∈W (closure under linear combinations).
* Important Note: The zero vector 0V must always be in any subspace W. If 0V∉W, then W is not a subspace.
* Span of a Set: The span of a set of vectors S={v1,v2,…,vk} in V, denoted span(S), is the set of all possible linear combinations of vectors in S. span(S) is always a subspace of V.
span(S)={c1v1+c2v2+⋯+ckvk∣ci∈F}
* Linear Independence and Dependence:
  * A set of vectors {v1,…,vk} is linearly independent if the only solution to c1v1+⋯+ckvk=0 is c1=⋯=ck=0.
  * A set of vectors is linearly dependent if there exist scalars c1,…,ck, not all zero, such that c1v1+⋯+ckvk=0.
* Basis of a Vector Space: A set of vectors B={b1,b2,…,bn} is a basis for a vector space V if:
  1. B is linearly independent.
  2. B spans V (i.e., span(B)=V).
* Dimension of a Vector Space: The number of vectors in any basis for V is called the dimension of V, denoted dim(V).
* Fundamental Subspaces of a Matrix: For an m×n matrix A:
  * Column Space (Image): Col(A)={Ax∣x∈Rn}, a subspace of Rm. Its dimension is the rank of A.
  * Null Space (Kernel): Null(A)={x∈Rn∣Ax=0}, a subspace of Rn. Its dimension is the nullity of A.
  * Row Space: Row(A)=Col(AT), a subspace of Rn. Its dimension is also the rank of A.
* Rank-Nullity Theorem: For an m×n matrix A, the sum of the dimension of its column space (rank) and the dimension of its null space (nullity) equals the number of columns (n).
rank(A)+nullity(A)=n
* Direct Sum: If W1 and W2 are subspaces of V, their sum is W1+W2={w1+w2∣w1∈W1,w2∈W2}. V is the direct sum of W1 and W2, denoted V=W1⊕W2, if V=W1+W2 and W1∩W2={0}. In this case, dim(V)=dim(W1)+dim(W2).
* Quotient Space: If W is a subspace of V, the quotient space V/W is the set of all cosets v+W={v+w∣w∈W} for v∈V. V/W is a vector space with operations (v1+W)+(v2+W)=(v1+v2)+W and c(v+W)=(cv)+W, and dim(V/W)=dim(V)−dim(W).
* Isomorphism: Two vector spaces V and W are isomorphic if there exists a bijective linear transformation (an isomorphism) between them. Isomorphic vector spaces have the same dimension, and any n-dimensional vector space over a field F is isomorphic to Fn.
🎯 Key Points to Remember
✓ Master the core concepts in Vector Spaces before moving to advanced topics
✓ Practice with previous year questions to understand exam patterns
✓ Review short notes regularly for quick revision before exams