Multivariable Calculus

Vectors

  • Magnitude
  • Direction

Sample 3D vector notation:

$\vec{A} = a_1\hat{i} + a_2\hat{j} + a_3\hat{k}$, $\vec{B} = b_1\hat{i} + b_2\hat{j} + b_3\hat{k}$

Length of the vector: $|\vec{A}| = \sqrt{a_1^2 + a_2^2 + a_3^2}$
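
The length formula can be checked numerically. A minimal sketch in plain Python, using an arbitrary example vector:

```python
import math

# Magnitude of A = 1 i + 2 j + 2 k (example values chosen so the answer is exact)
A = [1.0, 2.0, 2.0]
magnitude = math.sqrt(sum(a * a for a in A))  # sqrt(a1^2 + a2^2 + a3^2)
print(magnitude)  # → 3.0
```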

Addition

The sum of two vectors is the diagonal of the parallelogram formed by those two vectors.

$\vec{A} + \vec{B} = (a_1 + b_1)\hat{i} + (a_2 + b_2)\hat{j} + (a_3 + b_3)\hat{k}$
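
Component-wise addition is straightforward to sketch in plain Python (example values only):

```python
A = [1, 2, 3]
B = [4, 5, 6]
# (a1 + b1, a2 + b2, a3 + b3)
S = [a + b for a, b in zip(A, B)]
print(S)  # → [5, 7, 9]
```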

Dot Product

It is a scalar

$\vec{A} \cdot \vec{B} = \sum_i a_i b_i$

Geometrically, $\vec{A} \cdot \vec{B} = |\vec{A}||\vec{B}|\cos(\theta)$
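
Combining the two formulas gives a way to recover the angle between two vectors, $\theta = \arccos\big(\vec{A}\cdot\vec{B} / (|\vec{A}||\vec{B}|)\big)$. A small sketch with made-up vectors:

```python
import math

def dot(A, B):
    return sum(a * b for a, b in zip(A, B))

def norm(A):
    return math.sqrt(dot(A, A))

A = [1.0, 0.0, 0.0]
B = [1.0, 1.0, 0.0]
# theta = arccos(A.B / (|A||B|))
theta = math.acos(dot(A, B) / (norm(A) * norm(B)))
print(math.degrees(theta))  # ≈ 45 degrees
```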

Cross Product

It is a vector

$\vec{A} \times \vec{B}$ can be computed as a symbolic $3 \times 3$ determinant (shown below)

$|\vec{A} \times \vec{B}|$ equals the area of the parallelogram formed by those two vectors

Magnitude: $|\vec{A}||\vec{B}|\sin(\theta)$. Direction: perpendicular to both vectors (right-hand thumb rule).

In three dimensions, the scalar triple product $\vec{A} \cdot (\vec{B} \times \vec{C})$ gives the (signed) volume of the parallelepiped formed by $\vec{A}, \vec{B}, \vec{C}$

Cross product of two vectors in 3D space: $\vec{A} \times \vec{B} = \begin{vmatrix} \hat{i} & \hat{j} & \hat{k} \\ a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \end{vmatrix}$
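
Expanding that determinant along the top row gives the component formula. A minimal sketch in plain Python:

```python
def cross(A, B):
    a1, a2, a3 = A
    b1, b2, b3 = B
    # Cofactor expansion of the symbolic determinant along the i, j, k row
    return [a2 * b3 - a3 * b2,
            a3 * b1 - a1 * b3,
            a1 * b2 - a2 * b1]

# Sanity check on the standard basis: i x j = k
print(cross([1, 0, 0], [0, 1, 0]))  # → [0, 0, 1]
```

Swapping the arguments flips the sign, reflecting the anti-commutativity of the cross product.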

Planes

Given three points $P_1, P_2, P_3$ in a plane, let $P$ be a point in the plane. Then,

$\vec{P_1P} \cdot (\vec{P_1P_2} \times \vec{P_1P_3}) = 0$

$\equiv \det(\vec{P_1P}, \vec{P_1P_2}, \vec{P_1P_3}) = 0$
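
This triple-product condition gives a direct membership test for a plane. A sketch in plain Python, with the $z = 0$ plane as an arbitrary example:

```python
def sub(P, Q):
    return [p - q for p, q in zip(P, Q)]

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

P1, P2, P3 = [0, 0, 0], [1, 0, 0], [0, 1, 0]   # three points in the plane z = 0
P_in  = [2, 3, 0]                               # lies in the plane
P_out = [2, 3, 1]                               # does not

def triple(P):
    # P1P . (P1P2 x P1P3): zero exactly when P lies in the plane
    return dot(sub(P, P1), cross(sub(P2, P1), sub(P3, P1)))

print(triple(P_in), triple(P_out))  # → 0 1
```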

Equations of planes

$ax + by + cz = d$

Normal vector to the plane: $\langle a, b, c \rangle$

Parametric equations of line

$x(t) = a_1t + b_1,\quad y(t) = a_2t + b_2,\quad z(t) = a_3t + b_3$
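
Here $\langle a_1, a_2, a_3 \rangle$ is the direction vector and $(b_1, b_2, b_3)$ a point on the line at $t = 0$. A small sketch with example values:

```python
A = [1, 2, 0]   # direction vector (arbitrary example)
B = [0, 1, 5]   # point on the line at t = 0

def point(t):
    # (a1 t + b1, a2 t + b2, a3 t + b3)
    return [a * t + b for a, b in zip(A, B)]

print(point(0), point(2))  # → [0, 1, 5] [2, 5, 5]
```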

Matrices

$AX = B \Rightarrow X = A^{-1}B$ (when $A$ is invertible)

$A^{-1} = \operatorname{adj}(A)/\det(A)$
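
The adjugate formula can be applied column by column without building $A^{-1}$ explicitly; doing so is exactly Cramer's rule. A sketch in plain Python for a $3 \times 3$ system (the three planes below are a made-up example with solution $(5, 3, -2)$):

```python
from fractions import Fraction

def det3(M):
    # Cofactor expansion along the first row
    return (M[0][0] * (M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1] * (M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2] * (M[1][0]*M[2][1] - M[1][1]*M[2][0]))

def solve3(A, b):
    # Cramer's rule: x_i = det(A with column i replaced by b) / det(A)
    d = det3(A)
    x = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        x.append(Fraction(det3(Ai), d))
    return x

# Three planes: x + y + z = 6,  2y + 5z = -4,  2x + 5y - z = 27
A = [[1, 1, 1], [0, 2, 5], [2, 5, -1]]
b = [6, -4, 27]
x, y, z = solve3(A, b)
print(x, y, z)  # → 5 3 -2
```

Using `Fraction` keeps the arithmetic exact; for large systems Gaussian elimination is preferred over determinant formulas.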

In a 3D system, in general two planes intersect in a line, and the third plane intersects that line at a point

Other possible solution sets are a line or a plane (or no solution at all)

Rank of a Matrix

The rank of a matrix is the number of linearly independent columns, which equals the number of linearly independent rows

Trace of a Matrix

The trace of a matrix is the sum of its diagonal elements

$\operatorname{trace}(A) = \sum_{i} A_{ii} = \sum_{i} \lambda_i$

Here the $\lambda_i$ are the eigenvalues of the matrix

Inverse of a Matrix

$A^{-1} = \operatorname{adj}(A)/\det(A)$

Determinant of a Matrix

$\det(A) = \prod_{i} \lambda_i$

Orthogonal Matrix

$AA^T = I = A^TA$
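
Rotation matrices are a standard example of orthogonal matrices, so the defining identity can be checked numerically. A sketch in plain Python (the 30° angle is an arbitrary choice):

```python
import math

t = math.radians(30)  # example rotation angle
A = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

AT = [list(col) for col in zip(*A)]  # transpose
P = matmul(A, AT)
# Check A A^T = I up to floating-point error
is_identity = all(abs(P[i][j] - (1 if i == j else 0)) < 1e-12
                  for i in range(2) for j in range(2))
print(is_identity)  # → True
```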

Eigen values and Eigen vectors of a Matrix

For an $n \times n$ square matrix $A$, $e$ is an eigenvector of $A$ with eigenvalue $\lambda$ if

$Ae = \lambda e \Rightarrow (A - \lambda I)e = 0 \Rightarrow \det(A - \lambda I) = 0$ (for $e \neq 0$)
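
For a $2 \times 2$ matrix the characteristic equation is the quadratic $\lambda^2 - \operatorname{trace}(A)\,\lambda + \det(A) = 0$, which also lets us verify the trace and determinant identities above. A sketch with an arbitrary example matrix:

```python
import math

A = [[2.0, 1.0],
     [1.0, 2.0]]  # example symmetric matrix; eigenvalues are 3 and 1
tr  = A[0][0] + A[1][1]                       # trace
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]       # determinant
# Roots of lambda^2 - tr*lambda + det = 0 via the quadratic formula
disc = math.sqrt(tr*tr - 4*det)
l1, l2 = (tr + disc) / 2, (tr - disc) / 2
print(l1, l2)            # → 3.0 1.0
print(l1 + l2, l1 * l2)  # sum and product recover trace and determinant: 4.0 3.0
```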
