Linear Algebra

Euclidean Linear Dependence

Is K = {f_1(x) = 1, f_2(x) = sin x, f_3(x) = cos x} ⊂ C[0,1] linearly independent or linearly dependent? Justify your answer.

Euclidean Linear Dependence

1. Is G = { [1 -1; -1 0], [1 -4; 1 0], [1 -6; 1 0], [0 0; 1 0] } ⊂ M_2(R) linearly independent or linearly dependent? Justify your answer. (Each 2x2 matrix is written row by row, with rows separated by semicolons.)

Normal Subgroups

Please see the attached file for the fully formatted problem. Let G be a group and let D = {(a, a, a) : a ∈ G}. Prove that D is a normal subgroup of G × G × G if and only if G is Abelian.


Skew Product of Vectors

Suppose that a "skew" product of vectors in R^2 is defined by (u, v) = u_1 v_1 - u_2 v_2. Prove that (u, v)^2 >= (u, u)(v, v). (Note: this is just the reverse of the Cauchy-Schwarz inequality for the ordinary dot product.)

Linear Algebra: Eigenvalues

Find the eigenvalues and eigenvectors of the matrix A = [2 1; 9 2]. By transforming the matrix into the basis of eigenvectors, show explicitly that the matrix can be diagonalized in the eigenvector basis.
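As a numerical check of this exercise, the eigenpairs of a 2x2 matrix can be read off the characteristic polynomial. The helper names below (`eig2x2`, `eigvec2x2`) are illustrative, not from any library, and the sketch assumes real eigenvalues:

```python
import math

# Hypothetical helpers for a 2x2 matrix [[a, b], [c, d]]:
# eigenvalues solve lam^2 - (a + d)*lam + (a*d - b*c) = 0.
def eig2x2(a, b, c, d):
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)   # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

def eigvec2x2(a, b, c, d, lam):
    # A nonzero solution of (A - lam*I)v = 0, assuming b != 0.
    return (b, lam - a)

l1, l2 = eig2x2(2, 1, 9, 2)          # lam = 5 and lam = -1
v1 = eigvec2x2(2, 1, 9, 2, l1)       # direction (1, 3)
v2 = eigvec2x2(2, 1, 9, 2, l2)       # direction (1, -3)

# With P = [v1 v2] as columns, P^{-1} A P = diag(5, -1).
```

The diagonalization claim can then be confirmed by multiplying out P^{-1} A P by hand.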

Linear Algebra : Norms

For any x = (x_1, ..., x_n), let us try to define a norm in two ways. Consider (a) ||x||_1 = sum_{i=1}^n |x_i| and (b) ||x||_b = sum_{i,j=1}^n |x_i - x_j|. Does either of these formulas define a norm? If yes, show that all three axioms of a norm hold; if no, demonstrate which axiom fails.
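A quick numerical probe (not a proof) of the two candidate formulas. The positivity axiom requires ||x|| = 0 only when x = 0, so a nonzero constant vector is a natural test case:

```python
def norm_a(x):
    # candidate (a): sum_i |x_i|
    return sum(abs(xi) for xi in x)

def norm_b(x):
    # candidate (b): sum over all pairs (i, j) of |x_i - x_j|
    return sum(abs(xi - xj) for xi in x for xj in x)

x = [1.0, 1.0, 1.0]   # a nonzero constant vector
print(norm_a(x))      # positive, as a norm requires
print(norm_b(x))      # zero on a nonzero vector: positivity fails for (b)
```

The probe suggests where to look when writing the proof; the axiom check itself must still be done symbolically.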

Linear Algebra : Orthogonal Projection

Consider the vector space C[0,1] with the scalar product (f, g) = integral from 0 to 1 of f(x) g(x) x dx for any two functions f(x), g(x). Find the orthogonal projection p of e^x onto x. Also find the norms of e^x and x.
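The projection onto a single function g is ((f, g)/(g, g)) g, which can be sanity-checked with a simple midpoint-rule quadrature. This is a minimal numerical sketch (the helper `ip` is illustrative, not a library function):

```python
import math

def ip(f, g, n=10000):
    # scalar product (f, g) = integral_0^1 f(x) g(x) x dx, midpoint rule
    h = 1.0 / n
    return h * sum(f((k + 0.5) * h) * g((k + 0.5) * h) * ((k + 0.5) * h)
                   for k in range(n))

def f(x): return math.exp(x)
def g(x): return x

coef = ip(f, g) / ip(g, g)      # projection is p(x) = coef * x
norm_f = math.sqrt(ip(f, f))    # norm of e^x in this scalar product
norm_g = math.sqrt(ip(g, g))    # norm of x; (x, x) = integral x^3 dx = 1/4
```

Analytically, (e^x, x) = integral of x^2 e^x dx from 0 to 1 = e - 2 and (x, x) = 1/4, so the coefficient should come out near 4(e - 2).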

Linear algebra

Suppose S is a linear space defined below. Are the following mappings L linear transformations from S into itself? If the answer is yes, find the matrix representation of the transformation (in the standard basis):
(a) S = P4, L(p(x)) = p(0) + x p(1) + x^2 p(2) + x^3 p(4)
(b) S = P4, L(p(x)) = x^3 + x p'(x) + p(0)
(c) S is a subspace of C[0,1] formed by

Multiplicative Inverse

By brute force, find a multiplicative inverse of 31 mod 100. Is there only one, or can you find more?
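The brute-force search is a one-liner: every residue class mod 100 is a candidate. A minimal sketch:

```python
# Try every residue x in 0..99 and keep those with 31*x = 1 (mod 100).
inverses = [x for x in range(100) if (31 * x) % 100 == 1]
print(inverses)
```

Since gcd(31, 100) = 1, exactly one residue class should turn up, which answers the uniqueness part of the question.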

Solutions of Linear Equations

I would like a short explanation of Gaussian elimination with partial pivoting and of the Gauss-Seidel method. Also, explain when each applies, or when one is better than the other. Please include some examples.
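The contrast between the two methods can be sketched in a few lines: elimination with partial pivoting is a direct method that finishes in finitely many steps, while Gauss-Seidel is iterative and converges (for example) when the matrix is diagonally dominant. The function names here are illustrative, not from any library:

```python
def gauss_pp(A, b):
    """Gaussian elimination with partial pivoting (direct method)."""
    n = len(A)
    A = [row[:] for row in A]
    b = b[:]
    for k in range(n):
        # Partial pivoting: bring the row with the largest |entry| in
        # column k to the pivot position, for numerical stability.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def gauss_seidel(A, b, iters=50):
    """Iterative method; uses updated components within each sweep."""
    n = len(A)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

A = [[4.0, 1.0], [2.0, 5.0]]   # diagonally dominant: Gauss-Seidel converges
b = [1.0, 2.0]
xd = gauss_pp(A, b)
xi = gauss_seidel(A, b)
print(xd, xi)                  # both approach the exact solution (1/6, 1/3)
```

Roughly: direct elimination is the default for dense systems of moderate size, while Gauss-Seidel pays off for large sparse systems where each sweep is cheap and convergence is guaranteed (e.g. diagonal dominance).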

Laplace equation with mixed boundary conditions

U_xx means the second derivative with respect to x; U_yy means the second derivative with respect to y.

U_xx + U_yy = 0, 0 < x < pi, 0 < y < 1
U_x(0,y) = 0 = U(pi,y), 0 < y < 1
U(x,0) = 1, U(x,1) = 0, 0 < x < pi

Please show all work, including how the eigenvalues and eigenfunctions are derived. Thank you.

Linear Algebra: Vectors - Inner Product

Show that the functions x and x^2 are orthogonal in P5 with the inner product defined by <p, q> = sum_{i=1}^5 p(x_i) q(x_i), where x_i = (i - 3)/2 for i = 1, ..., 5. Also show that ||x||_1 = sum_{i=1}^n |x_i| and ||x||_infinity = max_{1<=i<=n} |x_i| are norms. Thank you for your explanation.
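The orthogonality claim can be checked numerically before proving it: with the given points, <x, x^2> is just the sum of x_i^3 over the five points. A minimal sketch:

```python
# The five evaluation points x_i = (i - 3)/2 for i = 1..5.
pts = [(i - 3) / 2 for i in range(1, 6)]   # -1, -0.5, 0, 0.5, 1

# <p, q> = sum_i p(x_i) q(x_i) with p(x) = x, q(x) = x^2.
ip = sum(x * x**2 for x in pts)
print(pts, ip)   # the points are symmetric about 0 and x^3 is odd
```

The symmetric point set plus the odd integrand is exactly the mechanism the proof should exploit.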

Linear Algebra : Vectors - Inner Products

Given a weight vector w, an inner product on R^n is defined by <x, y> = sum_{i=1}^n x_i y_i w_i. (a) Use this equation with weight vector w = (1/4, 1/2, 1/4)^T to define an inner product for R^3, and let x = (1, 1, 1)^T and y = (-5, 1, 3)^T. Show that x and y are orthogonal with respect to this inner product. Compute the values of
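The weighted inner product for part (a) is a single sum, so the orthogonality claim is easy to verify directly. A minimal sketch with the given data:

```python
# <x, y> = sum_i x_i * y_i * w_i with the stated weight vector.
w = [0.25, 0.5, 0.25]
x = [1.0, 1.0, 1.0]
y = [-5.0, 1.0, 3.0]

ipxy = sum(xi * yi * wi for xi, yi, wi in zip(x, y, w))
print(ipxy)   # -5/4 + 1/2 + 3/4 cancels to 0, so x and y are orthogonal
```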

Linear Algebra: Linear Mapping

Consider the following linear mapping from C[-pi, pi] into itself: L(f)(x) = integral from -pi to pi of G(x) H(y) f(y) dy for any function f(x) in C[-pi, pi]. Here G(x), H(x) are given continuous functions. Find a function f such that L f = lambda f for some lambda, and find the value of lambda. This is a generalization of the notion

Linear Algebra: Vector Spaces

Consider R^2 with the following rules of addition and multiplication: for each x = (x_1, x_2), y = (y_1, y_2), x + y = (x_2 + y_2, x_1 + y_1), and for any scalar alpha, alpha*x = (alpha*x_1, alpha*x_2). Is it a vector space? If not, demonstrate which axioms fail to hold. Also, show that P_n, the space of polynomials of degree less than n, is a vector space.
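A numerical probe (not a proof) of the modified addition: trying a few concrete vectors quickly exposes an axiom that breaks. This sketch tests associativity:

```python
def add(x, y):
    # the problem's rule: the components are swapped after adding
    return (x[1] + y[1], x[0] + y[0])

x, y, z = (1.0, 0.0), (0.0, 2.0), (3.0, 0.0)
lhs = add(add(x, y), z)   # (x + y) + z
rhs = add(x, add(y, z))   # x + (y + z)
print(lhs, rhs)           # they differ, so associativity fails
```

A counterexample found this way can be copied straight into the written answer; the remaining axioms still need to be checked one by one.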

Linear Algebra: Matrix of Transformation

Are the following examples linear transformations from P3 to P4? If yes, compute the matrix of the transformation in the standard bases of P3 {1, x, x^2} and P4 {1, x, x^2, x^3}.
(a) L(p(x)) = x^3 p''(x) + x^2 p'(x) - x p(x)
(b) L(p(x)) = x^2 p''(x) + p(x) p''(x)
(c) L(p(x)) = x^3 p(1) + x p(0)
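The matrix of a linear map is built column by column from the images of the basis vectors. As an illustration for case (c), a minimal sketch (polynomials in P3 are represented by coefficient lists [c0, c1, c2]):

```python
def L_coords(p):
    # L(p) = x^3 * p(1) + x * p(0), written in the P4 basis {1, x, x^2, x^3}
    p0 = p[0]                   # p(0)
    p1 = p[0] + p[1] + p[2]     # p(1)
    return [0.0, p0, 0.0, p1]

# Columns of the matrix are the images of the P3 basis vectors 1, x, x^2.
basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
cols = [L_coords(e) for e in basis]
matrix = [[cols[j][i] for j in range(3)] for i in range(4)]
for row in matrix:
    print(row)
```

The same column-by-column recipe applies to case (a); case (b) contains the product p(x)p''(x), so linearity should be examined before any matrix is attempted.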

Linear Algebra: Find a Vector in a Basis

In the standard basis of P3 (i.e., {1, x, x^2}), p(x) = 3 - 2x + 5x^2; that is, it has coordinates p = (3, -2, 5). Find the coordinates of this vector (polynomial) in the basis {1 - x, 1 + x, x^2 - 1}.
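Writing a(1 - x) + b(1 + x) + c(x^2 - 1) = 3 - 2x + 5x^2 and matching coefficients of 1, x, x^2 gives the system a + b - c = 3, -a + b = -2, c = 5, which solves by hand in three steps. A minimal sketch of that elimination:

```python
c = 5.0               # coefficient of x^2 comes only from c*(x^2 - 1)
b = (1.0 + c) / 2.0   # adding the first two equations gives 2b - c = 1
a = b + 2.0           # from the second equation: a = b + 2
print(a, b, c)        # the coordinates in the new basis
```

Substituting back, 5(1 - x) + 3(1 + x) + 5(x^2 - 1) = 3 - 2x + 5x^2, confirming the answer.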

Linear Algebra: Dimension of Span

What is the dimension of the span of the following sets of vectors?
(a) over P3: v1 = x^2, v2 = 1 - x^2, v3 = 1
(b) over C[0,1]: v1 = cos x, v2 = cos 2x, v3 = 1
(c) over R^3: v1 = (2, 2, 1), v2 = (-3, 0, -1), v3 = (-4, 2, -1)
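For the R^3 case, the dimension of the span is the rank of the matrix whose rows are the vectors, which row reduction computes directly. A minimal sketch (the `rank` helper is illustrative, not a library function):

```python
def rank(rows, eps=1e-12):
    # Row-reduce 3-component rows and count the pivots.
    rows = [r[:] for r in rows]
    rk, col = 0, 0
    while rk < len(rows) and col < 3:
        piv = max(range(rk, len(rows)), key=lambda i: abs(rows[i][col]))
        if abs(rows[piv][col]) < eps:
            col += 1                      # no pivot in this column
            continue
        rows[rk], rows[piv] = rows[piv], rows[rk]
        for i in range(rk + 1, len(rows)):
            m = rows[i][col] / rows[rk][col]
            rows[i] = [rows[i][j] - m * rows[rk][j] for j in range(3)]
        rk += 1
        col += 1
    return rk

vs = [[2.0, 2.0, 1.0], [-3.0, 0.0, -1.0], [-4.0, 2.0, -1.0]]
print(rank(vs))   # fewer than 3 pivots means the vectors are dependent
```

Cases (a) and (b) call for the same dependence check done symbolically, by looking for a nontrivial linear combination equal to zero.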

Finding the slope of a linear equation.

Find the slope of this equation: 8x - 2y = -48. Is the answer 4, -4, -6, or 6? Please explain how to solve the equation step by step, and also how to find the slope.
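The method behind this exercise: put the line a*x + b*y = c into slope-intercept form y = (c - a*x)/b (valid when b != 0), so the slope is -a/b. A minimal sketch with the given coefficients:

```python
# 8x - 2y = -48 has a = 8, b = -2; solving for y gives y = 4x + 24.
a, b = 8.0, -2.0
slope = -a / b
print(slope)
```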

Nonisomorphic Central Extensions

Describe all nonisomorphic central extensions of Z_2 x Z_2 by a cyclic group Z_n for arbitrary n, meaning central extensions of the form: 1 --> Z_n --> G --> Z_2 x Z_2 --> 1