
“Simplicity is the ultimate sophistication.” — Leonardo da Vinci
Contact me: sreramk360@gmail.com

Sunday 22 February 2015

Proof that the vector dot product and cross product are distributive

Proof that vector dot product is distributive

By definition, we may write the dot product $\vec{a}.\vec{b}$ as $|\vec{a}||\vec{b}|\cos\theta$. In an n-dimensional Euclidean space we may express a vector in summation notation as $\vec{a} = \sum \limits_{i=0}^n a_i \hat{k}_i$, where the $\hat{k}_i$ are mutually perpendicular unit vectors. The magnitude of a vector $|\vec{a}|$ can then be expressed as $|\vec{a}|^2 = \sum\limits_{i=0}^n {a_i}^2$. The proof of this expression follows directly from the Pythagorean theorem, which states that the length of the line segment joining the free ends of two line segments emerging mutually perpendicularly from a point equals the square root of the sum of the squares of their lengths. Two mutually perpendicular lines are taken to start with; to extend this to three-dimensional space, we introduce a third axis perpendicular to the plane on which the right-angled triangle lies, and draw a line along this third axis from the point we initially considered. The hypotenuse of the triangle, whose length we computed with the Pythagorean theorem, is perpendicular to this newly drawn line, because the third axis is perpendicular to the plane on which we constructed our first triangle.

Because the hypotenuse lies on that plane, it is perpendicular to the third line. Now the new line and the hypotenuse together lie on another plane, one that cuts obliquely through the three-dimensional coordinate system. In this new plane we again have two perpendicular segments, so taking the square root of the sum of the squares of their lengths gives the length of the hypotenuse of this new right triangle. If the point we initially considered is the origin of the three-dimensional coordinate system, this new hypotenuse is the length of the line joining the origin and the point $(x,y,z)$. We may extend this to higher dimensions by introducing another axis, defined to be perpendicular to all three existing axes. Just as we extended the two-dimensional analysis to three dimensions by considering only the previous hypotenuse and the new axis, introducing yet another axis does not complicate the generalization: at each step we need only the single line obtained previously and the new axis.

The same procedure extends to still higher dimensions. It is practically impossible to visualize a fourth spatial axis, but we do not need to: we can find the magnitude of a four-dimensional line from the expression $l_4 = \sqrt{{l_3}^2+{k_4}^2}$, where $l_3 = \sqrt{{k_1}^2+{k_2}^2+{k_3}^2}$. The fourth axis is defined to be perpendicular to the first three mutually perpendicular axes, so the segment of length $k_4$ and the line of length $l_3$ determine a plane that cuts obliquely through the 4D coordinate system; once again we have a plane on which the Pythagorean theorem applies freely.
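A quick way to convince ourselves of this axis-by-axis construction is to compute a vector's magnitude both ways and compare. A minimal Python sketch (the function names and sample vector are mine, not from the post):

```python
import math

def magnitude_iterative(components):
    # Apply Pythagoras one axis at a time: the partial length l and the
    # next component k are perpendicular, so l_new = sqrt(l^2 + k^2).
    l = 0.0
    for k in components:
        l = math.sqrt(l * l + k * k)
    return l

def magnitude_direct(components):
    # The closed form: |v|^2 equals the sum of the squared components.
    return math.sqrt(sum(k * k for k in components))

v = [1.0, 2.0, 3.0, 4.0]  # an arbitrary 4-dimensional vector
print(magnitude_iterative(v))  # sqrt(30), about 5.4772
print(magnitude_direct(v))     # the same value
```

Both routes agree in any number of dimensions, which is exactly the inductive argument made above.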

So now let's proceed to prove the distributive property of the vector dot product. To prove it, we have to show that $\vec{c}.(\vec{a}+\vec{b}) = |\vec{c}||\vec{a}+\vec{b}|\cos{\theta}_{a+b,c} = |\vec{c}||\vec{a}|\cos{\theta}_{a,c} + |\vec{c}||\vec{b}|\cos{\theta}_{b,c}$. To proceed with this proof, we first show that $|\vec{a}+\vec{b}|^2 = |\vec{a}|^2 + |\vec{b}|^2 + 2|\vec{a}||\vec{b}|\cos{\theta}_{a,b}$ and then show that $\vec{a}.\vec{b} = \sum\limits_{i=0}^n a_ib_i$.

The parallelogram law of vector addition can be proved in the following way:


   Let $OA$ represent the vector $\vec{a}$, let $OC$ represent the vector $\vec{b}$, and let $OB$ represent the vector $\vec{a}+\vec{b}$, so that $OABC$ is a parallelogram, as in the diagram above; let $D$ be the foot of the perpendicular from $B$ to the extension of $OC$. Now let's express the length $OB$ in terms of $|\vec{a}|$, $|\vec{b}|$ and the angle $\angle{BCD}$. Since the figure is a parallelogram, $\angle{BCD} = \angle{AOC}$. We have $OA = CB = |\vec{a}|$, so $CD = CB\cos\angle{BCD} = |\vec{a}|\cos\angle{BCD}$. Now, as we have $CD$ and $OC$, we can write $OD = OC + CD = |\vec{b}| + |\vec{a}|\cos\angle{BCD}$. Likewise, $BD = |\vec{a}|\sin\angle{BCD}$.

$${OB}^2 = |\vec{a}+\vec{b}|^2 = {BD}^2 + {OD}^2 $$
So we can write $$|\vec{a}+\vec{b}|^2 = (|\vec{a}|\sin\angle{BCD})^2 + (|\vec{b}| + |\vec{a}|\cos\angle{BCD})^2$$

After simplification (using $\sin^2\angle{BCD} + \cos^2\angle{BCD} = 1$ and $\angle{BCD} = \theta_{a,b}$) we get $$|\vec{a}+\vec{b}|^2 = |\vec{a}|^2 + |\vec{b}|^2 + 2|\vec{a}||\vec{b}|\cos{\theta}_{a,b}$$
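This identity (the law of cosines in vector form) is easy to sanity-check numerically with planar vectors; a minimal Python sketch (the helper name and sample values are mine):

```python
import math

def parallelogram_identity_holds(a_len, b_len, theta):
    # Place vector a along the x-axis and b at angle theta to it, then
    # compare |a+b|^2 computed from components against
    # |a|^2 + |b|^2 + 2|a||b|cos(theta).
    ax, ay = a_len, 0.0
    bx, by = b_len * math.cos(theta), b_len * math.sin(theta)
    lhs = (ax + bx) ** 2 + (ay + by) ** 2
    rhs = a_len ** 2 + b_len ** 2 + 2 * a_len * b_len * math.cos(theta)
    return abs(lhs - rhs) < 1e-9

print(parallelogram_identity_holds(3.0, 4.0, math.pi / 3))  # True
print(parallelogram_identity_holds(1.5, 2.5, 2.0))          # True
```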

Proof that $\vec{a}.\vec{b} = \sum\limits_{i=0}^n a_ib_i$:

Now, let's proceed with proving the distributive property of the vector dot product. We first need to show that $$\vec{a}.\vec{b} = \sum\limits_{i=0}^n a_ib_i$$ From the parallelogram law of vector addition we have $$\cos\theta_{a,b} = \frac {|\vec{a}+\vec{b}|^2 - |\vec{a}|^2 - |\vec{b}|^2}{2|\vec{a}||\vec{b}|}$$ This can be rewritten in its summation form as,

$$\cos\theta_{a,b} =
\frac {
  \left|\sum \limits_{i=0}^n a_i \hat{k}_i + \sum \limits_{i=0}^n b_i \hat{k}_i\right|^2 -
  \left|\sum \limits_{i=0}^n a_i \hat{k}_i\right|^2 -
  \left|\sum \limits_{i=0}^n b_i \hat{k}_i\right|^2
}
{
  2\left|\sum \limits_{i=0}^n a_i \hat{k}_i\right|
   \left|\sum \limits_{i=0}^n b_i \hat{k}_i\right|
}
$$ Or,

$$\cos\theta_{a,b} =
\frac {
  \left|\sum \limits_{i=0}^n (a_i+b_i) \hat{k}_i\right|^2 -
  \left|\sum \limits_{i=0}^n a_i \hat{k}_i\right|^2 -
  \left|\sum \limits_{i=0}^n b_i \hat{k}_i\right|^2
}
{
  2\left|\sum \limits_{i=0}^n a_i \hat{k}_i\right|
   \left|\sum \limits_{i=0}^n b_i \hat{k}_i\right|
}
$$ Or,
$$\cos\theta_{a,b} =
\frac {
  \sum \limits_{i=0}^n (a_i+b_i)^2 -
  \sum \limits_{i=0}^n {a_i}^2 -
  \sum \limits_{i=0}^n {b_i}^2
}
{
  2\sqrt{\sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2}
}
$$ We know that,

$$\vec{a}.\vec{b} = |\vec{a}||\vec{b}|\cos\theta_{a,b}$$ Or,

$$\vec{a}.\vec{b} = \left(\frac {
  \sum \limits_{i=0}^n (a_i+b_i)^2 -
  \sum \limits_{i=0}^n {a_i}^2 -
  \sum \limits_{i=0}^n {b_i}^2
}
{
  2\sqrt{\sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2}
}\right) \sqrt{\sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2}$$ Or,
$$\vec{a}.\vec{b} = \frac{\sum \limits_{i=0}^n ({a_i}^2 + {b_i}^2 + 2a_ib_i) - \sum \limits_{i=0}^n {a_i}^2 - \sum \limits_{i=0}^n {b_i}^2}{2}$$ Or,
$$\vec{a}.\vec{b} = \sum\limits_{i=0}^n a_ib_i$$
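The chain of identities above can be checked numerically: compute $\cos\theta_{a,b}$ from the parallelogram-law rearrangement and confirm that $|\vec a||\vec b|\cos\theta_{a,b}$ equals $\sum a_ib_i$. A small Python sketch (the sample vectors are mine):

```python
import math

def dot(x, y):
    # Component form of the dot product: sum of x_i * y_i.
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    # |x| = sqrt(sum of squared components).
    return math.sqrt(dot(x, x))

a = [1.0, -2.0, 3.0]
b = [4.0, 0.5, -1.0]
a_plus_b = [ai + bi for ai, bi in zip(a, b)]

# cos(theta) from the parallelogram-law rearrangement used in the post.
cos_theta = (norm(a_plus_b) ** 2 - norm(a) ** 2 - norm(b) ** 2) / (2 * norm(a) * norm(b))

print(abs(norm(a) * norm(b) * cos_theta - dot(a, b)) < 1e-12)  # True
```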

Proof that $(\vec{a} + \vec{b}) .\vec{c} = \vec{a}.\vec{c} + \vec{b}.\vec{c}$:

On substituting the result $\vec{x}.\vec{y} = \sum\limits_{i=0}^n x_iy_i$ in the expression we get,

$$
(\vec{a} + \vec{b}) .\vec{c} =
              \sum\limits_{i=0}^n (a_i+b_i)c_i =
                              \sum\limits_{i=0}^n (a_ic_i+b_ic_i) =
                                          \sum\limits_{i=0}^n (a_ic_i) + \sum\limits_{i=0}^n (b_ic_i)
$$
Now consider the RHS of the equation $(\vec{a} + \vec{b}).\vec{c} = \vec{a}.\vec{c} + \vec{b}.\vec{c}$; again, on substituting the same result into it, we get:
$$\vec{a}.\vec{c} + \vec{b}.\vec{c} = \sum\limits_{i=0}^n (a_ic_i) + \sum\limits_{i=0}^n (b_ic_i)$$ which is the same as the LHS. Hence we have proved that LHS = RHS, or,
$(\vec{a} + \vec{b}).\vec{c} = \vec{a}.\vec{c} + \vec{b}.\vec{c}$
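Since the proof rests entirely on the component formula, a three-line numerical check of the distributive law itself may help; a Python sketch (vectors chosen arbitrarily):

```python
def dot(x, y):
    # Component form of the dot product proved above.
    return sum(xi * yi for xi, yi in zip(x, y))

a = [1.0, 2.0, -3.0]
b = [0.5, -1.0, 4.0]
c = [2.0, 2.0, 1.0]

lhs = dot([ai + bi for ai, bi in zip(a, b)], c)  # (a + b) . c
rhs = dot(a, c) + dot(b, c)                      # a . c + b . c
print(abs(lhs - rhs) < 1e-12)  # True
```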

Now let's show that $\vec{c}\times(\vec{a}+\vec{b}) = \vec{c}\times \vec{a} + \vec{c}\times \vec{b}$. By proving this, we prove that the vector cross product is distributive over addition.

A vector cross product is defined as $\vec{a}\times\vec{b} = |\vec{a}||\vec{b}|\sin\theta_{a,b}\hat n$. This expression, like the expression for the vector dot product, is taken as a basic definition. The unit vector $\hat n$ points in the direction perpendicular to the plane containing $\vec{a}$ and $\vec{b}$. We know that
$\vec{a}.\vec{b} = |\vec{a}||\vec{b}|\cos\theta_{a,b}$, so we may obtain $|\vec{a}||\vec{b}|\sin\theta_{a,b}$ by squaring the dot product expression shown above and subtracting it from $(|\vec{a}||\vec{b}|)^2$. I.e.,

$$(|\vec{a}||\vec{b}|)^2 - (\vec{a}.\vec{b})^2 = (|\vec{a}||\vec{b}|)^2 - |\vec{a}|^2|\vec{b}|^2\cos^2\theta_{a,b}$$ which is the same as,

$$(|\vec{a}||\vec{b}|)^2 - (\vec{a}.\vec{b})^2 = |\vec{a}|^2|\vec{b}|^2\sin^2\theta_{a,b}$$

Taking the square root of both sides and attaching the direction $\hat n$, we may write,

$$\vec{a}\times\vec{b} = |\vec{a}||\vec{b}|\sin\theta_{a,b}\hat n = \sqrt{(|\vec{a}||\vec{b}|)^2 - (|\vec{a}||\vec{b}|\cos\theta_{a,b})^2}\,\hat n$$
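In three dimensions this relation, $|\vec a\times\vec b|^2 = (|\vec a||\vec b|)^2 - (\vec a.\vec b)^2$, can be verified against the familiar component formula for the cross product; a Python sketch (sample vectors mine):

```python
def dot(x, y):
    # Component form of the dot product.
    return sum(xi * yi for xi, yi in zip(x, y))

def cross(a, b):
    # Standard 3-D component formula for the cross product.
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

a = [1.0, 2.0, 3.0]
b = [-1.0, 0.5, 2.0]

lhs = dot(cross(a, b), cross(a, b))           # |a x b|^2
rhs = dot(a, a) * dot(b, b) - dot(a, b) ** 2  # (|a||b|)^2 - (a.b)^2
print(abs(lhs - rhs) < 1e-9)  # True
```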

From the parallelogram law of vector addition, we know that $\cos\theta_{a,b} = \frac {|\vec{a}+\vec{b}|^2 - |\vec{a}|^2 - |\vec{b}|^2}{2|\vec{a}||\vec{b}|}$, so $\sin\theta_{a,b} = \sqrt{1 - \left[\frac {|\vec{a}+\vec{b}|^2 - |\vec{a}|^2 - |\vec{b}|^2}{2|\vec{a}||\vec{b}|}\right]^2}$. Now, we may write this as,
$$\sin\theta_{a,b} = \sqrt{
\frac {
  4\sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2 -
  \left(\sum \limits_{i=0}^n (a_i+b_i)^2 -
        \sum \limits_{i=0}^n {a_i}^2 -
        \sum \limits_{i=0}^n {b_i}^2\right)^2
}
{
  4\sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2
}
}
$$ Or,
$$|\vec a||\vec b|\sin\theta_{a,b} =
\frac {\sqrt{
  4\sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2 -
  \left(\sum \limits_{i=0}^n (a_i+b_i)^2 -
        \sum \limits_{i=0}^n {a_i}^2 -
        \sum \limits_{i=0}^n {b_i}^2\right)^2
}}
{2}
$$ Or,
$$|\vec a||\vec b|\sin\theta_{a,b} =
\frac {\sqrt{
  4\sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2 -
  \left(2\sum \limits_{i=0}^n a_ib_i\right)^2
}}
{2}
$$ Or,
$$|\vec a||\vec b|\sin\theta_{a,b} =
\sqrt{
  \sum \limits_{i=0}^n {a_i}^2 \sum \limits_{i=0}^n {b_i}^2 -
  \left(\sum \limits_{i=0}^n a_ib_i\right)^2
}
$$ Or,

$$|\vec a||\vec b|\sin\theta_{a,b} =
\sqrt{
  \sum \limits_{i=0}^n\sum \limits_{j=0}^n {a_i}^2{b_j}^2 -
  \sum \limits_{i=0}^n\sum \limits_{j=0}^n (a_ib_i)(a_jb_j)
}
$$

On simplification (this is Lagrange's identity: the $i = j$ terms cancel, and the remaining terms pair off into perfect squares), we may write this as,

$$|\vec a||\vec b|\sin\theta_{a,b} =
\sqrt{
  \sum \limits_{i<j} (a_ib_j - a_jb_i)^2
}
$$ Now, multiplying both sides by $\hat n$ we get, $$\vec{a}\times\vec{b} = \left(\sqrt{\sum \limits_{i<j} (a_ib_j - a_jb_i)^2}\right)\hat n$$ In three dimensions the quantities $a_ib_j - a_jb_i$ are exactly the components of $\vec{a}\times\vec{b}$, and each of them is linear in the components of either factor; hence $\vec{c}\times(\vec{a}+\vec{b}) = \vec{c}\times\vec{a} + \vec{c}\times\vec{b}$, which is the distributive property we set out to show.
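The distributive law for the cross product can itself be confirmed numerically with the 3-D component formula; a short Python sketch (vectors chosen arbitrarily):

```python
def cross(a, b):
    # Standard 3-D component formula for the cross product.
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

a = [1.0, -2.0, 0.5]
b = [3.0, 1.0, -1.0]
c = [0.25, 2.0, 4.0]

lhs = cross(c, [ai + bi for ai, bi in zip(a, b)])        # c x (a + b)
rhs = [x + y for x, y in zip(cross(c, a), cross(c, b))]  # c x a + c x b
print(all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs)))  # True
```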







Copyright (c) 2015 K Sreram. You may not distribute this article without permission from K Sreram.
