In the previous section, we saw that the scalar product of vectors obeys the distributive law. In this section, we will see a practical application of that law. We will also see a few more properties of the scalar product.
We will write the steps:
1. Consider the scalar product $\mathbf\small{\vec{A}.\vec{B}}$
2. Let the rectangular components of $\mathbf\small{\vec{B}}$ be $\mathbf\small{B_x\,\hat{i}\;\; \text{and}\;\;B_y\,\hat{j}}$
3. Then we can write: $\mathbf\small{\vec{A}.\vec{B}=\vec{A}.[B_x\,\hat{i}+B_y\,\hat{j}]}$
• Applying distributive law, we get: $\mathbf\small{\vec{A}.\vec{B}=\vec{A}.(B_x\,\hat{i})+\vec{A}.(B_y\,\hat{j})}$
4. Now, let the rectangular components of $\mathbf\small{\vec{A}}$ be $\mathbf\small{A_x\,\hat{i}\;\; \text{and}\;\;A_y\,\hat{j}}$
• Then the result in (3) becomes: $\mathbf\small{\vec{A}.\vec{B}=[A_x\,\hat{i}+A_y\,\hat{j}].(B_x\,\hat{i})+[A_x\,\hat{i}+A_y\,\hat{j}].(B_y\,\hat{j})}$
5. Applying distributive law, we get:
$\mathbf\small{\vec{A}.\vec{B}=(A_x\,\hat{i}.B_x\,\hat{i})+(A_y\,\hat{j}.B_x\,\hat{i})+(A_x\,\hat{i}.B_y\,\hat{j})+(A_y\,\hat{j}.B_y\,\hat{j})}$
6. There are 4 terms on the right side. We will analyse each of them separately.
First term is $\mathbf\small{(A_x\,\hat{i}.B_x\,\hat{i})}$
• This is a dot product of two vectors $\mathbf\small{A_x\,\hat{i}\;\; \text{and}\;\;B_x\,\hat{i}}$
• To calculate the dot product, we need three items:
(i) The magnitude of the first vector
• In this case, it is $\mathbf\small{A_x}$
(ii) The magnitude of the second vector
• In this case, it is $\mathbf\small{B_x}$
(iii) Cosine of the angle θ between the two vectors
• In this case, both the vectors lie along the x axis. That means they are parallel
• The angle θ between them will be 0
• So cos θ = cos 0 = 1
■ Thus we get: $\mathbf\small{(A_x\,\hat{i}.B_x\,\hat{i})=A_x \times B_x \times \cos 0 =A_x \times B_x \times 1 = A_x \times B_x }$
Second term is $\mathbf\small{(A_y\,\hat{j}.B_x\,\hat{i})}$
• This is a dot product of two vectors $\mathbf\small{A_y\,\hat{j}\;\; \text{and}\;\;B_x\,\hat{i}}$
• To calculate the dot product, we need three items:
(i) The magnitude of the first vector
• In this case, it is $\mathbf\small{A_y}$
(ii) The magnitude of the second vector
• In this case, it is $\mathbf\small{B_x}$
(iii) Cosine of the angle θ between the two vectors
• In this case, the first vector lies along the y axis and the second vector lies along the x axis. That means they are perpendicular
• The angle θ between them will be 90°
• So cos θ = cos 90° = 0
(This is expected: we cannot project a vector onto another vector that is perpendicular to it)
■ Thus we get: $\mathbf\small{(A_y\,\hat{j}.B_x\,\hat{i})=A_y \times B_x \times \cos 90^{\circ} =A_y \times B_x \times 0 = 0}$
Third term is $\mathbf\small{(A_x\,\hat{i}.B_y\,\hat{j})}$
• This is a dot product of two vectors $\mathbf\small{A_x\,\hat{i}\;\; \text{and}\;\;B_y\,\hat{j}}$
• To calculate the dot product, we need three items:
(i) The magnitude of the first vector
• In this case, it is $\mathbf\small{A_x}$
(ii) The magnitude of the second vector
• In this case, it is $\mathbf\small{B_y}$
(iii) Cosine of the angle θ between the two vectors
• In this case, the first vector lies along the x axis and the second vector lies along the y axis. That means they are perpendicular
• The angle θ between them will be 90°
• So cos θ = cos 90° = 0
(This is expected: we cannot project a vector onto another vector that is perpendicular to it)
■ Thus we get: $\mathbf\small{(A_x\,\hat{i}.B_y\,\hat{j})=A_x \times B_y \times \cos 90^{\circ} =A_x \times B_y \times 0 = 0}$
Fourth term is $\mathbf\small{(A_y\,\hat{j}.B_y\,\hat{j})}$
• This is a dot product of two vectors $\mathbf\small{A_y\,\hat{j}\;\; \text{and}\;\;B_y\,\hat{j}}$
• To calculate the dot product, we need three items:
(i) The magnitude of the first vector
• In this case, it is $\mathbf\small{A_y}$
(ii) The magnitude of the second vector
• In this case, it is $\mathbf\small{B_y}$
(iii) Cosine of the angle θ between the two vectors
• In this case, both the vectors lie along the y axis. That means they are parallel
• The angle θ between them will be 0
• So cos θ = cos 0 = 1
■ Thus we get: $\mathbf\small{(A_y\,\hat{j}.B_y\,\hat{j})=A_y \times B_y \times \cos 0 =A_y \times B_y \times 1 = A_y \times B_y }$
7. Out of the four terms, the second and third terms become zero. So the final result is:
$\mathbf\small{\vec{A}.\vec{B}=(A_x\,\hat{i}.B_x\,\hat{i})+(A_y\,\hat{j}.B_y\,\hat{j})}$
That is: $\mathbf\small{\vec{A}.\vec{B}=A_xB_x+A_yB_y}$
■ We will write it as a result for easy reference
Eq.6.5:
$\mathbf\small{\text{If}\;\;\vec{A}=A_x\,\hat{i}+A_y\,\hat{j}}$
$\mathbf\small{\text{And}\;\;\vec{B}=B_x\,\hat{i}+B_y\,\hat{j}}$
$\mathbf\small{\text{Then}\;\;\vec{A}.\vec{B}=A_xB_x+A_yB_y}$
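We can verify Eq.6.5 numerically: the componentwise formula must agree with the definition $\mathbf\small{|\vec{A}| \times |\vec{B}| \times \cos \theta}$. Below is a minimal Python sketch; the component values are assumed here purely for illustration:

```python
import math

# Arbitrary example components (not from the text; chosen for illustration)
Ax, Ay = 3.0, 4.0      # A = 3i + 4j, so |A| = 5
Bx, By = 5.0, 12.0     # B = 5i + 12j, so |B| = 13

# Eq.6.5: dot product from the rectangular components
dot_components = Ax * Bx + Ay * By                 # 3*5 + 4*12 = 63

# Definition of the dot product: |A| * |B| * cos(theta)
theta = math.atan2(By, Bx) - math.atan2(Ay, Ax)    # angle between A and B
dot_definition = math.hypot(Ax, Ay) * math.hypot(Bx, By) * math.cos(theta)

print(dot_components)              # 63.0
print(round(dot_definition, 9))    # 63.0
```

Both routes give the same value, as Eq.6.5 requires.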
8. From the above steps, we get two important results:
(i) The scalar product of two parallel vectors is simply the product of their magnitudes
• This is clear from the analysis of first and fourth terms
(ii) The scalar product of two perpendicular vectors is zero
• This is clear from the analysis of second and third terms
■ We will write them as results for easy reference
Eq.6.6:
$\mathbf\small{\text{If}\;\;\vec{A}\;\;\text{and}\;\;\vec{B}\;\;\text{are parallel}}$
$\mathbf\small{\text{Then}\;\;\vec{A}.\vec{B}=|\vec{A}|\times |\vec{B}|}$
Eq.6.7:
$\mathbf\small{\text{If}\;\;\vec{A}\;\;\text{and}\;\;\vec{B}\;\;\text{are perpendicular}}$
$\mathbf\small{\text{Then}\;\;\vec{A}.\vec{B}=0}$
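The two special cases Eq.6.6 and Eq.6.7 can also be checked with the componentwise formula (a minimal sketch; the component values are assumed):

```python
# Parallel case: both vectors lie along the x axis
Ax, Ay = 2.0, 0.0      # |A| = 2
Bx, By = 7.0, 0.0      # |B| = 7
dot_parallel = Ax * Bx + Ay * By       # 14.0 = |A| * |B|  (Eq.6.6)

# Perpendicular case: one vector along x, the other along y
Px, Py = 3.0, 0.0
Qx, Qy = 0.0, 4.0
dot_perpendicular = Px * Qx + Py * Qy  # 0.0  (Eq.6.7)

print(dot_parallel, dot_perpendicular)  # 14.0 0.0
```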
9. The result in (7) can be extended to three dimensions also. We get:
Eq.6.8:
$\mathbf\small{\text{If}\;\;\vec{A}=A_x\,\hat{i}+A_y\,\hat{j}+A_z\,\hat{k}}$
$\mathbf\small{\text{And}\;\;\vec{B}=B_x\,\hat{i}+B_y\,\hat{j}+B_z\,\hat{k}}$
$\mathbf\small{\text{Then}\;\;\vec{A}.\vec{B}=A_xB_x+A_yB_y+A_zB_z}$
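Eq.6.8 gives a practical way to find the angle between two vectors in three dimensions: compute the dot product from the components, then rearrange $\mathbf\small{|\vec{A}| \times |\vec{B}| \times \cos \theta = \vec{A}.\vec{B}}$. A short sketch, with assumed example components:

```python
import math

A = (1.0, 2.0, 2.0)   # |A| = 3
B = (4.0, 0.0, 3.0)   # |B| = 5

# Eq.6.8: componentwise dot product
dot = sum(a * b for a, b in zip(A, B))    # 1*4 + 2*0 + 2*3 = 10

# Rearranging the definition gives the angle between the vectors
mag_A = math.sqrt(sum(a * a for a in A))
mag_B = math.sqrt(sum(b * b for b in B))
theta = math.degrees(math.acos(dot / (mag_A * mag_B)))

print(dot)              # 10.0
print(round(theta, 2))  # 48.19
```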
Just as we calculated $\mathbf\small{\vec{A}.\vec{B}}$ above, we can calculate $\mathbf\small{\vec{A}.\vec{A}}$ also
We will write the steps:
1. Consider the scalar product $\mathbf\small{\vec{A}.\vec{A}}$
2. Let the rectangular components of $\mathbf\small{\vec{A}}$ be $\mathbf\small{A_x\,\hat{i}\;\; \text{and}\;\;A_y\,\hat{j}}$
3. Then we can write: $\mathbf\small{\vec{A}.\vec{A}=\vec{A}.[A_x\,\hat{i}+A_y\,\hat{j}]}$
Applying distributive law, we get: $\mathbf\small{\vec{A}.\vec{A}=\vec{A}.(A_x\,\hat{i})+\vec{A}.(A_y\,\hat{j})}$
4. Now, expand the other $\mathbf\small{\vec{A}}$ into its components as well
Then the result in (3) becomes: $\mathbf\small{\vec{A}.\vec{A}=[A_x\,\hat{i}+A_y\,\hat{j}].(A_x\,\hat{i})+[A_x\,\hat{i}+A_y\,\hat{j}].(A_y\,\hat{j})}$
5. Applying distributive law, we get:
$\mathbf\small{\vec{A}.\vec{A}=(A_x\,\hat{i}.A_x\,\hat{i})+(A_y\,\hat{j}.A_x\,\hat{i})+(A_x\,\hat{i}.A_y\,\hat{j})+(A_y\,\hat{j}.A_y\,\hat{j})}$
6. There are 4 terms on the right side
• Each term is a dot product of two vectors
• If the two vectors in a term are perpendicular to each other, that term will become zero
• Thus the second and third terms will become zero
7. So the final result is:
$\mathbf\small{\vec{A}.\vec{A}=(A_x\,\hat{i}.A_x\,\hat{i})+(A_y\,\hat{j}.A_y\,\hat{j})}$
That is: $\mathbf\small{\vec{A}.\vec{A}=A_xA_x+A_yA_y}$
That is: $\mathbf\small{\vec{A}.\vec{A}=A_x^2+A_y^2}$
■ We will write it as a result for easy reference
Eq.6.9:
$\mathbf\small{\text{If}\;\;\vec{A}=A_x\,\hat{i}+A_y\,\hat{j}}$
$\mathbf\small{\text{Then}\;\;\vec{A}.\vec{A}=A_x^2+A_y^2}$
8. The result in (7) can be extended to three dimensions also. We get:
$\mathbf\small{\vec{A}.\vec{A}=(A_x\,\hat{i}+A_y\,\hat{j}+A_z\,\hat{k}).(A_x\,\hat{i}+A_y\,\hat{j}+A_z\,\hat{k})=A_x^2+A_y^2+A_z^2}$
■ We will write it as a result for easy reference
Eq.6.10:
$\mathbf\small{\text{If}\;\;\vec{A}=A_x\,\hat{i}+A_y\,\hat{j}+A_z\,\hat{k}}$
$\mathbf\small{\text{Then}\;\;\vec{A}.\vec{A}=A_x^2+A_y^2+A_z^2}$
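Eq.6.10 is, in fact, how the magnitude of a vector is often computed in practice: dot the vector with itself and take the square root. A minimal sketch with assumed components:

```python
import math

A = (2.0, 3.0, 6.0)                  # assumed example components
self_dot = sum(a * a for a in A)     # Eq.6.10: 4 + 9 + 36 = 49
magnitude = math.sqrt(self_dot)      # |A| = 7.0
print(self_dot, magnitude)           # 49.0 7.0
```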
Next we will prove that, if $\mathbf\small{\vec{B}=B_x\:\hat{i}+B_y\:\hat{j}}$,
Then: $\mathbf\small{\lambda\vec{B}=\lambda(B_x\:\hat{i}+B_y\:\hat{j})=(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$
Where $\mathbf\small{\lambda}$ is a scalar quantity
• A graphic representation of this property is shown in fig.6.5 below:
• $\mathbf\small{B_x\,\hat{i}\; \text{and}\;B_y\,\hat{j}}$ are the rectangular components of $\mathbf\small{\vec{B}}$
• $\mathbf\small{B_x\,\hat{i}}$ is multiplied by a scalar $\mathbf\small{\lambda}$
♦ This gives a new vector
• $\mathbf\small{B_y\,\hat{j}}$ is multiplied by the same scalar $\mathbf\small{\lambda}$
♦ This gives another new vector
• There will be a resultant for these two new vectors
• This new resultant is $\mathbf\small{\vec{B}}$ multiplied by $\mathbf\small{\lambda}$
■ That means:
♦ The magnitude of the new resultant is: $\mathbf\small{\lambda |\vec{B}|}$
♦ The direction of the new resultant is same as that of $\mathbf\small{\vec{B}}$
We will write the steps to prove it:
1. On the left side, we are multiplying a vector by a scalar $\mathbf\small{\lambda}$ (we take $\mathbf\small{\lambda}$ to be positive here; for a negative $\mathbf\small{\lambda}$, the magnitude becomes $\mathbf\small{|\lambda|\, |\vec{B}|}$ and the direction reverses)
• Then the magnitude of the new vector will become ${\mathbf\small{\lambda |\vec{B}|}}$
• The direction of the new vector will be the same as that of the original
• We have to arrive at the above results from the right side also. Let us try:
2. On the right side, we have a vector: $\mathbf\small{(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$
• The magnitude of this vector will be: $\mathbf\small{\sqrt{(\lambda B_x)^2+(\lambda B_y)^2}=\sqrt{\lambda^2 B_x^2+\lambda^2 B_y^2}}$ (Using Eq.4.2)
$\mathbf\small{=\sqrt{\lambda^2 (B_x^2+ B_y^2)}=\lambda \sqrt{(B_x^2+ B_y^2)}=\lambda |\vec{B}|}$
• So magnitude of $\mathbf\small{(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$ is same as the magnitude of $\mathbf\small{\lambda\vec{B}}$
3. The direction of $\mathbf\small{(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$ is given by:
$\mathbf\small{\theta =\tan ^{-1}\frac{\lambda B_y}{\lambda B_x}=\tan ^{-1}\frac{B_y}{B_x}}$
• So direction of $\mathbf\small{(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$ is same as the direction of $\mathbf\small{\lambda\vec{B}}$
■ Thus we proved that: $\mathbf\small{\lambda\vec{B}=\lambda(B_x\:\hat{i}+B_y\:\hat{j})=(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$
■ We will write it as a result for easy reference
Eq.6.11:
$\mathbf\small{\text{If}\;\;\vec{B}=B_x\,\hat{i}+B_y\,\hat{j}}$
$\mathbf\small{\text{Then}\;\;\lambda\vec{B}=(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$
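The two claims of the proof above can be checked numerically (a sketch; $\mathbf\small{\lambda}$ and the components are assumed values, with $\mathbf\small{\lambda}$ taken positive as in the proof):

```python
import math

lam = 2.5               # positive scalar, as assumed in the proof
Bx, By = 3.0, 4.0       # |B| = 5

# Eq.6.11: multiply each rectangular component by the scalar
Cx, Cy = lam * Bx, lam * By

mag_scaled = math.hypot(Cx, Cy)          # should equal lam * |B| = 12.5
direction_original = math.atan2(By, Bx)
direction_scaled = math.atan2(Cy, Cx)    # unchanged by the scaling

print(mag_scaled)                                          # 12.5
print(math.isclose(direction_scaled, direction_original))  # True
```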
Next we will prove that $\mathbf\small{\vec{A}.(\lambda \vec{B})=\lambda(\vec{A}.\vec{B})}$
We will write the steps:
1. First we will work on the left side
$\mathbf\small{\lambda\vec{B}=(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}}$
[Using Eq.6.11]
• Now, taking the dot product of both sides with $\mathbf\small{\vec{A}}$, we get:
$\mathbf\small{\vec{A}.(\lambda\vec{B})=\vec{A}.[(\lambda B_x)\:\hat{i}+(\lambda B_y)\:\hat{j}]}$
• Applying distributive law, we get:
$\mathbf\small{\vec{A}.(\lambda\vec{B})=\vec{A}.[(\lambda B_x)\:\hat{i}]+\vec{A}.[(\lambda B_y)\:\hat{j}]}$
• Now we put $\mathbf\small{\vec{A}=(A_x\:\hat{i}+A_y\:\hat{j})}$. We get:
$\mathbf\small{\vec{A}.(\lambda \vec{B})=(A_x\:\hat{i}+A_y\:\hat{j}).[(\lambda B_x)\:\hat{i}]+(A_x\:\hat{i}+A_y\:\hat{j}).[(\lambda B_y)\:\hat{j}]}$
• Applying distributive law, we get:
$\mathbf\small{\vec{A}.(\lambda \vec{B})=A_x\:\hat{i}.[(\lambda B_x)\:\hat{i}]+A_y\:\hat{j}.[(\lambda B_x)\:\hat{i}]+A_x\:\hat{i}.[(\lambda B_y)\:\hat{j}]+A_y\:\hat{j}.[(\lambda B_y)\:\hat{j}]}$
2. There are 4 terms in the above result
• Each term is a dot product of two vectors
• If the two vectors in a term are perpendicular to each other, that term will become zero
• Thus the second and third terms will become zero
• We get: $\mathbf\small{\vec{A}.(\lambda \vec{B})=A_x\:\hat{i}.[(\lambda B_x)\:\hat{i}]+A_y\:\hat{j}.[(\lambda B_y)\:\hat{j}]}$
$\mathbf\small{=A_x(\lambda B_x)+A_y(\lambda B_y)}$
$\mathbf\small{=\lambda A_xB_x+\lambda A_yB_y}$
3. Now we work on the right side
• On the right side, we have $\mathbf\small{\lambda(\vec{A}.\vec{B})}$
• But using Eq.6.5, $\mathbf\small{\vec{A}.\vec{B}=A_xB_x+A_yB_y}$
• So the right side becomes: $\mathbf\small{\lambda (\vec{A}.\vec{B})=\lambda (A_xB_x+A_yB_y)=\lambda A_xB_x+\lambda A_yB_y}$
• This is same as the result in (2)
• So L.H.S = R.H.S
• Thus we proved that $\mathbf\small{\vec{A}.(\lambda \vec{B})=\lambda(\vec{A}.\vec{B})}$
■ We will write it as a result for easy reference
Eq.6.12:
$\mathbf\small{\vec{A}.(\lambda \vec{B})=\lambda(\vec{A}.\vec{B})}$
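Eq.6.12 can be confirmed with a quick numerical check (the scalar and components below are assumed example values):

```python
lam = 3.0
Ax, Ay = 2.0, -1.0
Bx, By = 4.0, 5.0

lhs = Ax * (lam * Bx) + Ay * (lam * By)   # A . (λB), using Eq.6.5 on λB
rhs = lam * (Ax * Bx + Ay * By)           # λ (A . B)
print(lhs, rhs)                           # 9.0 9.0
```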