CS2900 2024 Past Paper Solution


Question 1

The following questions require short definitions of technical terms in the context of vectors and matrices. Component-wise definitions are sufficient. You may assume that $\underline{u}$ and $\underline{v}$ are real valued column vectors in $N$-dimensions ($N$ is finite) and that $\mathbf{M}$ is a real valued $N \times N$ matrix.

(a) List three properties of the dot product of two vectors that are true in any finite number of dimensions. [9 marks]

Answer:

  1. Commutative: $\underline{u} \cdot \underline{v} = \underline{v} \cdot \underline{u}$
  2. Distributive: $\underline{u} \cdot (\underline{v} + \underline{w}) = \underline{u} \cdot \underline{v} + \underline{u} \cdot \underline{w}$
  3. Scalar multiplication: $(a\underline{u}) \cdot \underline{v} = a(\underline{u} \cdot \underline{v}) = \underline{u} \cdot (a\underline{v})$

Explanation:

  • The dot product does not depend on the order of the vectors (commutative).
  • It distributes over vector addition.
  • Scaling either vector by a number $a$ scales the dot product by $a$.
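These three properties can be spot-checked numerically; a small NumPy sketch, with the vectors and the scalar chosen arbitrarily for illustration:

```python
import numpy as np

# Spot-check the three dot-product properties with arbitrary sample data.
u = np.array([1.0, -2.0, 3.0])
v = np.array([4.0, 0.0, -1.0])
w = np.array([2.0, 5.0, 1.0])
a = 3.0

commutative = np.isclose(u @ v, v @ u)
distributive = np.isclose(u @ (v + w), u @ v + u @ w)
scalar_rule = (np.isclose((a * u) @ v, a * (u @ v))
               and np.isclose((a * u) @ v, u @ (a * v)))
```

A proof, of course, works component-wise for all vectors; the code only illustrates the statements.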

(b) If $\underline{u} = \mathbf{M}\underline{v}$ then what is the corresponding expression for the $i$-th component of $\underline{u}$, $u_i$? [3 marks]

Answer:
For $\underline{u} = \mathbf{M}\underline{v}$, the $i^{\text{th}}$ component is:
$$ u_i = \sum_{j=1}^{N} M_{ij} v_j $$

Explanation:
To get the $i^{\text{th}}$ entry of $\underline{u}$, take the $i^{\text{th}}$ row of $\mathbf{M}$, multiply it entry-by-entry with $\underline{v}$, and sum the products.
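As an illustration (the matrix and vector here are invented for the example), the component formula agrees with NumPy's built-in matrix-vector product:

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([5.0, 6.0])

# u_i = sum over j of M_ij * v_j: dot the i-th row of M with v.
u = np.array([sum(M[i, j] * v[j] for j in range(v.size))
              for i in range(M.shape[0])])
```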


(c) Define the following:

i. The length of a vector. [3 marks]
Answer:
If $\underline{v} = (v_1, v_2, \dots, v_N)$, its length is:
$$ \|\underline{v}\| = \sqrt{v_1^2 + v_2^2 + \dots + v_N^2} $$

ii. A vector multiplied by a scalar. [3 marks]
Answer:
Scalar $a$ times vector $\underline{v}$ gives $a\underline{v} = (a v_1, a v_2, \dots, a v_N)$: each component is scaled by $a$.

iii. Diagonal Matrix [3 marks]
Answer:
A matrix where all non-diagonal entries are zero. Example:
$$ \begin{pmatrix} 2 & 0 \\ 0 & -1 \end{pmatrix} $$

iv. Symmetric Matrix [3 marks]
Answer:
A matrix equal to its transpose ($\mathbf{M} = \mathbf{M}^\intercal$), i.e. $M_{ij} = M_{ji}$ for all $i, j$. Example:
$$ \begin{pmatrix} 1 & 3 \\ 3 & 2 \end{pmatrix} $$



Question 2

(a) What is the null vector in 6 dimensions?

Answer:
The null vector is $(0, 0, 0, 0, 0, 0)^\intercal$.

Explanation:
A null vector has all components zero.


(b) Compute the dot product of the following vectors:

$$ \underline{u} = \begin{pmatrix} -\sqrt{2} \\ 1 \\ 0 \\ 1 \\ \sqrt{2} \end{pmatrix}, \quad \underline{v} = \begin{pmatrix} 0 \\ -1 \\ \sqrt{5} \\ -1 \\ 1 \end{pmatrix} $$

Answer:
$$ \underline{u} \cdot \underline{v} = (-\sqrt{2})(0) + (1)(-1) + (0)(\sqrt{5}) + (1)(-1) + (\sqrt{2})(1) = \sqrt{2} - 2 $$

Explanation:
Multiply corresponding components and sum them: $0 - 1 + 0 - 1 + \sqrt{2} = \sqrt{2} - 2 \approx -0.586$.
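The arithmetic can be double-checked with NumPy:

```python
import numpy as np

u = np.array([-np.sqrt(2), 1, 0, 1, np.sqrt(2)])
v = np.array([0, -1, np.sqrt(5), -1, 1])

dot = u @ v  # sqrt(2) - 2, roughly -0.586
```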


(c) Show that these vectors are orthogonal to each other:

$$ \underline{a} = \begin{pmatrix} -3 \\ 0 \\ 1 \\ 4 \\ 1 \end{pmatrix}, \quad \underline{b} = \begin{pmatrix} 2 \\ 6 \\ 6 \\ 1 \\ -4 \end{pmatrix} $$ [4 marks]

Answer:
$$ \underline{a} \cdot \underline{b} = (-3)(2) + (0)(6) + (1)(6) + (4)(1) + (1)(-4) = -6 + 0 + 6 + 4 - 4 = 0 $$
They are orthogonal since their dot product is $0$.


(f) A ship has travelled from the origin to a point represented by the vector:

$$ \binom{5}{4}, $$
where $\binom{1}{0}$ represents travel of one kilometre in the due-north direction and $\binom{0}{1}$ represents travel of one kilometre in the due-east direction. Compute the length travelled along the North-East direction using vector notation. [6 marks]

Answer:

  1. Unit vector in the NE direction: $\underline{e} = \frac{1}{\sqrt{2}} \binom{1}{1}$.
  2. Projection length:
    $$ \text{Distance} = \binom{5}{4} \cdot \underline{e} = \frac{5 \times 1 + 4 \times 1}{\sqrt{2}} = \frac{9}{\sqrt{2}} \approx 6.364 \ \text{km} $$
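A quick numerical check of the projection:

```python
import numpy as np

p = np.array([5.0, 4.0])               # 5 km north, 4 km east
e = np.array([1.0, 1.0]) / np.sqrt(2)  # unit vector pointing north-east

distance = p @ e                       # 9 / sqrt(2) km
```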

(g) Matrix Multiplications

Matrices:
$$ \mathbf{M_1} = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 1 & -1 \end{pmatrix}, \quad \mathbf{M_2} = \begin{pmatrix} 1 & 2 \\ -1 & 1 \\ 1 & -1 \end{pmatrix}, \quad \mathbf{M_3} = \begin{pmatrix} 1 \\ 1 \end{pmatrix} $$

Answer:

  1. $\mathbf{M_1 M_2}$:

    • Possible ($2 \times 3$ times $3 \times 2$ → $2 \times 2$). Result:
      $$ \begin{pmatrix} 1 \times 1 + 2 \times (-1) + 3 \times 1 & 1 \times 2 + 2 \times 1 + 3 \times (-1) \\ 2 \times 1 + 1 \times (-1) + (-1) \times 1 & 2 \times 2 + 1 \times 1 + (-1) \times (-1) \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 0 & 6 \end{pmatrix} $$
  2. $\mathbf{M_2 M_3}$:

    • Possible ($3 \times 2$ times $2 \times 1$ → $3 \times 1$). Result:
      $$ \begin{pmatrix} 1 \times 1 + 2 \times 1 \\ -1 \times 1 + 1 \times 1 \\ 1 \times 1 + (-1) \times 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 0 \\ 0 \end{pmatrix} $$
  3. $\mathbf{M_1 M_3}$:

    • Not possible: the number of columns of $\mathbf{M_1}$ (3) does not match the number of rows of $\mathbf{M_3}$ (2).

Explanation:
Two matrices can be multiplied only when the number of columns of the first equals the number of rows of the second.
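NumPy enforces exactly this rule: the two valid products above can be verified, and the invalid one raises an error.

```python
import numpy as np

M1 = np.array([[1, 2, 3], [2, 1, -1]])     # 2x3
M2 = np.array([[1, 2], [-1, 1], [1, -1]])  # 3x2
M3 = np.array([[1], [1]])                  # 2x1

P12 = M1 @ M2   # 2x3 @ 3x2 -> 2x2
P23 = M2 @ M3   # 3x2 @ 2x1 -> 3x1
try:
    M1 @ M3     # inner dimensions 3 and 2 do not match
    shapes_ok = True
except ValueError:
    shapes_ok = False
```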


Question 3

Q3(a)(i) - Orthonormal Basis Check

Problem:
Demonstrate that the unit vectors in the directions (-1,1,1,1) and (1,1,1,-1) form an orthonormal basis for the plane they span.

Answer:

  1. Compute lengths:

    • Length of (-1,1,1,1) = √((-1)² + 1² + 1² + 1²) = √4 = 2
    • Length of (1,1,1,-1) = √(1² + 1² + 1² + (-1)²) = √4 = 2
  2. Create unit vectors:

    • u = (-1/2, 1/2, 1/2, 1/2)
    • v = (1/2, 1/2, 1/2, -1/2)
  3. Check orthogonality:
    Dot product u · v =
    (-1/2)(1/2) + (1/2)(1/2) + (1/2)(1/2) + (1/2)(-1/2)
    = (-1/4) + 1/4 + 1/4 - 1/4
    = 0

Explanation:
Orthonormal vectors have unit length and are orthogonal. Since u and v satisfy both properties, they form an orthonormal basis.
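The same checks in NumPy:

```python
import numpy as np

u = np.array([-1, 1, 1, 1]) / 2
v = np.array([1, 1, 1, -1]) / 2

is_unit_u = np.isclose(np.linalg.norm(u), 1.0)
is_unit_v = np.isclose(np.linalg.norm(v), 1.0)
is_orthogonal = np.isclose(u @ v, 0.0)
```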


Q3(a)(ii) - Projection Calculation

Problem:
Project a = (1, -1, 2, 1) onto the plane spanned by (-1,1,1,1) and (1,1,1,-1).

Answer:

  1. Orthonormal basis vectors:
    Use u and v from Q3(a)(i).

  2. Projection formula:
    proj = (a · u)u + (a · v)v

  3. Compute dot products:

    • a · u = (1)(-1/2) + (-1)(1/2) + (2)(1/2) + (1)(1/2) = -0.5 - 0.5 + 1 + 0.5 = 0.5
    • a · v = (1)(1/2) + (-1)(1/2) + (2)(1/2) + (1)(-1/2) = 0.5 - 0.5 + 1 - 0.5 = 0.5
  4. Result:
    proj = 0.5(-1/2, 1/2, 1/2, 1/2) + 0.5(1/2, 1/2, 1/2, -1/2)
    = (-0.25 + 0.25, 0.25 + 0.25, 0.25 + 0.25, 0.25 - 0.25)
    = (0, 0.5, 0.5, 0)

Explanation:
The projection combines contributions from both basis vectors using their coefficients.
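The projection formula translates directly into NumPy:

```python
import numpy as np

a = np.array([1.0, -1.0, 2.0, 1.0])
u = np.array([-1, 1, 1, 1]) / 2
v = np.array([1, 1, 1, -1]) / 2

# proj = (a . u) u + (a . v) v
proj = (a @ u) * u + (a @ v) * v
```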


Q3(b) - Matrix Rank

Problem:
Find ranks of:
$$ A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 0 \\ 2 & 2 & 2 \end{pmatrix}, \quad B = \begin{pmatrix} 2 & 1 \\ -1 & 1 \\ 0 & 1 \end{pmatrix} $$

Answer:

  1. Matrix A:

    • Row 3 = 2 × Row 1 → Row 3 is redundant
    • Remaining rows: Row 1 and Row 2 are linearly independent.
      Rank(A) = 2
  2. Matrix B:

    • The two columns are linearly independent (neither is a scalar multiple of the other).
      Rank(B) = 2

Explanation:
Rank is the number of linearly independent rows/columns. For A, redundancy reduces rank. For B, full column rank.
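NumPy's `matrix_rank` (which computes rank via the SVD) agrees with the hand calculation:

```python
import numpy as np

A = np.array([[1, 1, 1],
              [1, 1, 0],
              [2, 2, 2]])
B = np.array([[2, 1],
              [-1, 1],
              [0, 1]])

rank_A = np.linalg.matrix_rank(A)  # row 3 = 2 * row 1, so rank 2
rank_B = np.linalg.matrix_rank(B)  # independent columns, so rank 2
```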


Q3(c)(i) - Graph Drawing

Adjacency matrix:
$$ \mathbf{A} = \begin{pmatrix} 0 & 1 & 1 & 0 \\ 1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix} $$

Graph:

Edges: 1–2, 1–3, 2–4. The adjacency matrix is symmetric, so the graph is undirected.


Q3(c)(ii) - Path Analysis

Problem:
List vertex pairs without paths of length 2.

Steps:

  1. Compute A²:
    Entry (i,j) of A² = number of 2-step paths from i to j.

  2. A² result:
    $$ \mathbf{A}^2 = \begin{pmatrix} 2 & 0 & 0 & 1 \\ 0 & 2 & 1 & 0 \\ 0 & 1 & 1 & 0 \\ 1 & 0 & 0 & 1 \end{pmatrix} $$

  3. Zero off-diagonal entries of A² (no 2-step path): (1,2), (1,3), (2,4), (3,4), plus their transposes, since A² is symmetric.

Pairs:
The vertex pairs with no path of length 2 between them are {1,2}, {1,3}, {2,4} and {3,4}.
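As a check, A² and its zero off-diagonal entries can be recomputed with NumPy:

```python
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])

A2 = A @ A  # entry (i, j) counts walks of length 2 from i to j
no_2_path = sorted((i + 1, j + 1)
                   for i in range(4) for j in range(4)
                   if i < j and A2[i, j] == 0)
```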


Question 4 (SVD)

Q4(a)(i) - Validating SVD

Problem:
Verify if the given decomposition is SVD.

Conditions to check:

  1. U and V are orthogonal: confirm that their columns (the singular vectors) are orthonormal.
  2. Σ is diagonal: Check non-negative diagonal entries.
  3. X = UΣVᵀ: Multiply matrices to verify equality.

Explanation:
SVD requires orthogonal matrices and singular values in Σ. This structure aligns with definitions from topic4_md.pdf.


Q4(a)(ii) - Condition Number

Σ matrix:
$$ \Sigma = \begin{pmatrix} 20 & 0 & 0 \\ 0 & 4 & 0 \end{pmatrix} $$

Condition number κ:
= Largest singular value / Smallest non-zero singular value
= 20 / 4 = 5


Q4(a)(iii) - Pseudo-Inverse

SVD structure:
X = UΣVᵀ → X⁺ = VΣ⁺Uᵀ

Steps:

  1. Compute Σ⁺ from Σ:

    • Take reciprocals of the non-zero entries: diag(1/20, 1/4).
    • Resize (transpose the shape) to 3×2.
  2. Result:
    $$ \mathbf{X}^+ = \frac{1}{20}\begin{pmatrix} 1 & 0 \\ 0 & \sqrt{2}/2 \\ 0 & \sqrt{2}/2 \end{pmatrix} + \frac{1}{4}\begin{pmatrix} 0 & 0 \\ -\sqrt{2}/2 & \sqrt{2}/2 \\ \sqrt{2}/2 & \sqrt{2}/2 \end{pmatrix} $$

Explanation:
Pseudo-inverse generalizes inverses for rectangular matrices. Formula is given in topic4_md.pdf.
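Since the full U and V from the question are not reproduced here, the sketch below verifies just the Σ⁺ step using NumPy's `pinv`:

```python
import numpy as np

sigma = np.array([[20.0, 0.0, 0.0],
                  [0.0, 4.0, 0.0]])

# Pseudo-inverse of a rectangular diagonal matrix: transpose the shape
# and take reciprocals of the non-zero diagonal entries.
sigma_pinv = np.linalg.pinv(sigma)
```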


https://blog.pandayuyu.zone/2025/05/15/CS2900-2024-Past-Paper-Solution/
Author: Panda
Posted on: May 15, 2025