
    Linear Algebra, Vector Space and Mapping


    See the attached file.

    Let Beta = {x_1, ..., x_n} be a basis for a vector space V, and for a fixed k with 1 ≤ k ≤ n let P be the mapping defined by P(a_1 x_1 + ... + a_n x_n) = a_1 x_1 + ... + a_k x_k.

    a) Show that Ker(P) = Span({x_(k+1), ..., x_n}) and Im(P) = Span({x_1, ..., x_k})

    b) Show that P^2 = P

    c) Show conversely that if P: V --> V is any linear mapping such that P^2 = P, then there exists a basis Beta for V such that P takes the form given in part a). (Hint: Show that P^2 = P implies that V = Ker(P) ⊕ Im(P), a direct sum. These mappings are called projections; the orthogonal projections we studied in Chapter 4 are special cases.)
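
    As a quick numeric illustration (not part of the original problem), here is a minimal NumPy sketch assuming V = R^4 with the standard basis and k = 2; in that basis the matrix of P is diagonal with k ones followed by zeros, and the checks below mirror parts a) and b).

    import numpy as np

    # Illustrative assumption: V = R^4, standard basis, k = 2.
    n, k = 4, 2
    P = np.diag([1.0] * k + [0.0] * (n - k))   # matrix of P in the basis Beta

    # b) P^2 = P (P is idempotent)
    assert np.allclose(P @ P, P)

    # a) Im(P) = Span{x_1, x_2}: P has rank k
    print(np.linalg.matrix_rank(P))            # 2

    # a) Ker(P) = Span{x_3, x_4}: a vector whose first k coordinates vanish maps to 0
    v = np.array([0.0, 0.0, 3.0, -5.0])
    print(P @ v)                               # [0. 0. 0. 0.]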

    Solution Preview

    Indeed,
    Ker(P) = {a_1 x_1 + ⋯ + a_n x_n | a_1, ..., a_n ∈ R, a_1 = ⋯ = a_k = 0} = {a_(k+1) x_(k+1) + ⋯ + a_n x_n | a_(k+1), ..., a_n ∈ R} = Span{x_(k+1), ..., x_n},
    since P(a_1 x_1 + ⋯ + a_n x_n) = a_1 x_1 + ⋯ + a_k x_k = 0 forces a_1 = ⋯ = a_k = 0 by the linear independence of x_1, ..., x_k.
    Obviously,
    Im(P) = {a_1 x_1 + ⋯ + a_k x_k | a_1, ..., a_k ∈ R} = Span{x_1, ..., x_k}.
    Indeed,
    P^2(a_1 x_1 + ⋯ + a_n x_n) = P(P(a_1 x_1 + ⋯ + a_n x_n)) = P(a_1 x_1 + ⋯ + a_k x_k) = a_1 x_1 + ⋯ + a_k x_k = P(a_1 x_1 + ⋯ + a_n x_n).
    This implies that ...
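
    To illustrate the hint in part c) numerically (a hedged sketch under assumed data, not the attached solution): any idempotent P can be written as S D S^(-1) with D = diag(1, ..., 1, 0, ..., 0), so P has the form of part a) in the basis Beta given by the columns of S, and every v decomposes as v = P(v) + (v - P(v)) with P(v - P(v)) = P(v) - P^2(v) = 0.

    import numpy as np

    # Assumed example: an oblique (non-orthogonal) projection of R^4 with k = 2.
    rng = np.random.default_rng(0)
    S = rng.normal(size=(4, 4))            # columns of S: a generic basis Beta of R^4
    D = np.diag([1.0, 1.0, 0.0, 0.0])      # the form of P from part a)
    P = S @ D @ np.linalg.inv(S)           # idempotent, but not an orthogonal projection

    assert np.allclose(P @ P, P)           # part b) still holds

    # Hint of part c): v = P(v) + (v - P(v)), and v - P(v) lies in Ker(P),
    # so V = Ker(P) ⊕ Im(P) is a direct sum.
    v = rng.normal(size=4)
    w = v - P @ v
    print(np.allclose(P @ w, 0.0))         # True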

    Solution Summary

    This solution shows the step-by-step calculations in an attached Word document.
