Linear Algebra

Linear Algebra -- Orthonormal Sets

Please see problem #1 of the attachment. If you show me how to do #1 (the answers are a and d, by the way) I'll probably be able to do #2. Thanks!

Linear Algebra -- Linear Transformations

Determine whether the following are linear transformations from C[0,1] into R^1:
(a) L(f) = |f(0)|
(b) L(f) = [f(0) + f(1)]/2
(c) L(f) = {integral from 0 to 1 of [f(x)]^2 dx}^(1/2)
Thanks so much. :)
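A quick numerical spot-check can suggest which functionals fail linearity before you write the proof. The sketch below (an illustration, not a proof — the test functions f and g are arbitrary choices, and the integral is approximated on a grid) tests additivity, L(f + g) = L(f) + L(g), for each functional; a single failed identity is enough to rule out linearity.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 10001)  # grid for approximating the integral on [0,1]

def L1(f): return abs(f(0.0))
def L2(f): return (f(0.0) + f(1.0)) / 2.0
def L3(f): return np.sqrt(np.mean(f(x) ** 2))  # Riemann approximation of the L^2 norm

# Arbitrary test functions (chosen so f(0) and g(0) have opposite signs).
f = lambda t: np.sin(3 * t) + 1.0
g = lambda t: 0.0 * t - 1.0

def additive(L):
    h = lambda t: f(t) + g(t)
    return np.isclose(L(h), L(f) + L(g))

print(additive(L1))  # False: |f(0) + g(0)| != |f(0)| + |g(0)| in general
print(additive(L2))  # True: averaging endpoint values respects addition
print(additive(L3))  # False: the L^2 norm is subadditive, not additive
```

Passing the check for (b) does not prove linearity, but it matches the fact that f ↦ [f(0) + f(1)]/2 respects both addition and scalar multiples.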

Matrix Theory

A) Let A be a positive definite matrix. Show that A has a unique positive square root. That is, show that there exists a unique positive matrix X such that X^2 = A. B) How many square roots can a positive definite matrix have?
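The existence half of (A) has a standard construction worth seeing numerically: diagonalize A = Q D Q^T with Q orthogonal and D positive diagonal, and set X = Q D^(1/2) Q^T. The sketch below checks this on a small example (the matrix A is an illustrative choice, not from the problem's attachment).

```python
import numpy as np

def positive_sqrt(A):
    """Positive square root of a symmetric positive definite matrix."""
    w, Q = np.linalg.eigh(A)              # w: eigenvalues > 0, Q: orthonormal columns
    return Q @ np.diag(np.sqrt(w)) @ Q.T  # X = Q sqrt(D) Q^T

A = np.array([[5.0, 2.0],
              [2.0, 2.0]])                # positive definite (eigenvalues 1 and 6)
X = positive_sqrt(A)

print(np.allclose(X @ X, A))              # True: X^2 = A
print(np.all(np.linalg.eigvalsh(X) > 0))  # True: X is itself positive definite
```

For intuition on (B): flipping the sign of the square root on any eigenvalue still gives a matrix whose square is A, so positivity is exactly what pins down uniqueness in (A).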

Matrix Theory

Suppose A is diagonalizable with distinct eigenvalues... See attached file for full problem description.

Antisymmetric relations

Let R and S be antisymmetric relations on a set A. Does R union S have to be antisymmetric also? Give a counterexample if the answer is no and proof if it is yes.
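The answer can be probed with a tiny concrete example. The sketch below (R and S are illustrative choices on A = {1, 2}) checks antisymmetry of each relation and of their union.

```python
def antisymmetric(rel):
    # Antisymmetric: (a, b) and (b, a) with a != b never both occur.
    return all(not ((b, a) in rel and a != b) for (a, b) in rel)

R = {(1, 2)}   # antisymmetric on A = {1, 2}
S = {(2, 1)}   # antisymmetric on A = {1, 2}
U = R | S      # union contains both (1, 2) and (2, 1)

print(antisymmetric(R), antisymmetric(S), antisymmetric(U))  # True True False
```

Since each of R and S contains only one ordered pair, each is trivially antisymmetric, yet their union contains both (1, 2) and (2, 1), so the union is not.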

Matrix Theory/ Isometries

Suppose A is a unitary matrix. (a) Show that there exists an orthonormal basis B of eigenvectors for A. (b) Let P be the associated change-of-basis matrix. Explain how to alter B such that P lies in SU(n).
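Both parts can be seen numerically. Since a unitary matrix is normal, its eigenvectors for distinct eigenvalues are orthogonal, so (for a matrix with distinct eigenvalues) the change-of-basis matrix P is unitary; multiplying one basis vector by a suitable phase removes the phase of det(P) and lands P in SU(n). The rotation matrix A below is an illustrative choice, and this sketch assumes distinct eigenvalues.

```python
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # unitary, eigenvalues e^{±i theta}

w, P = np.linalg.eig(A)                 # columns of P: unit eigenvectors
print(np.allclose(P.conj().T @ P, np.eye(2)))   # True: P is unitary

# det(P) = e^{i phi}; rescale the first basis vector by e^{-i phi} so det(P) = 1.
phase = np.linalg.det(P)
P[:, 0] *= phase.conjugate() / abs(phase)

print(np.isclose(np.linalg.det(P), 1.0))        # True: P now lies in SU(2)
print(np.allclose(A @ P, P @ np.diag(w)))       # columns are still eigenvectors of A
```

The rescaling works because multiplying an eigenvector by a unit-modulus scalar keeps it a unit eigenvector and keeps the basis orthonormal, while multiplying det(P) by that scalar.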

Linear Algebra - Vector Spaces

Let P be the set of all polynomials. Show that P, with the usual addition and scalar multiplication of functions, forms a vector space. I'm just no good at proofs. I know we are supposed to go through and prove the Vector Space Axioms and the C1 and C2 closure properties; I just don't think I'm doing it successfully.
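The computational content of the closure properties C1 and C2 is that adding two polynomials, or scaling one by a real number, yields another polynomial; the remaining axioms are inherited from pointwise arithmetic of functions. The sketch below checks closure on concrete polynomials (the examples p and q are arbitrary choices, not part of the problem).

```python
import numpy as np

p = np.polynomial.Polynomial([1.0, 0.0, 3.0])   # 1 + 3x^2
q = np.polynomial.Polynomial([0.0, 2.0])        # 2x

s = p + q          # C1: the sum of two polynomials is a polynomial
t = 5.0 * p        # C2: a scalar multiple of a polynomial is a polynomial

print(isinstance(s, np.polynomial.Polynomial))  # True
print(isinstance(t, np.polynomial.Polynomial))  # True
print(s(2.0))      # (1 + 3*4) + 2*2 = 17.0
```

Of course the proof itself must argue for arbitrary polynomials (the sum of a degree-m and a degree-n polynomial has degree at most max(m, n)); the code only illustrates what the closure axioms assert.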