Problem

Given the vectors $a = (1, 0, 0)$, $b = (1, 1, 0)$, and $c = (1, 1, 1)$, find an orthonormal basis using the Gram–Schmidt process.

Answer

The resulting orthonormal basis is $u_1 = (1, 0, 0)$, $u_2 = (0, 1, 0)$, and $u_3 = (0, 0, 1)$.

Steps

Step 1: First, normalize the vector $a$. Its magnitude is $\|a\| = \sqrt{1^2 + 0^2 + 0^2} = 1$, so the first unit vector is $u_1 = \frac{a}{\|a\|} = (1, 0, 0)$.

Step 2: Next, subtract from $b$ its projection onto $a$ to get a vector orthogonal to $a$. The projection is $\frac{b \cdot a}{\|a\|^2}\,a = (1, 0, 0)$, so the orthogonal vector is $v = b - \frac{b \cdot a}{\|a\|^2}\,a = (0, 1, 0)$. Normalizing gives $u_2 = \frac{v}{\|v\|} = (0, 1, 0)$.

Step 3: Finally, subtract from $c$ its projections onto the orthonormal vectors $u_1$ and $u_2$ (not onto the original $b$, which is not orthogonal to $a$). The combined projection is $(c \cdot u_1)\,u_1 + (c \cdot u_2)\,u_2 = (1, 0, 0) + (0, 1, 0) = (1, 1, 0)$, so the orthogonal vector is $w = c - (c \cdot u_1)\,u_1 - (c \cdot u_2)\,u_2 = (0, 0, 1)$. Since $\|w\| = 1$, normalizing gives $u_3 = \frac{w}{\|w\|} = (0, 0, 1)$.
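As a sanity check, the three steps above can be sketched in Python with NumPy. The helper function `gram_schmidt` is an illustrative name, not part of any library; it applies the same subtract-projection-then-normalize loop to each vector in turn.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors via classical Gram-Schmidt."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Subtract the projection of v onto each previously built unit vector.
        for u in basis:
            w -= np.dot(w, u) * u
        # Normalize the remaining orthogonal component.
        basis.append(w / np.linalg.norm(w))
    return basis

a, b, c = (1, 0, 0), (1, 1, 0), (1, 1, 1)
u1, u2, u3 = gram_schmidt([a, b, c])
print(u1, u2, u3)  # → [1. 0. 0.] [0. 1. 0.] [0. 0. 1.]
```

Because the projections are taken against the already-orthonormalized vectors, each `np.dot(w, u) * u` term reduces to $(c \cdot u_i)\,u_i$ with no division by $\|u_i\|^2$ needed.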
