Linear Algebra

For what value of the parameter b will the following system of equations fail to have a unique solution? (Hint: do not attempt to actually solve the equations!)

x + 2by - z = 2
2bx + 3y - bz = 3
x + 2y + z = 0
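
A square system fails to have a unique solution exactly when the determinant of its coefficient matrix vanishes, so one way to sanity-check an answer is to solve det(A) = 0 symbolically. A minimal sympy sketch (it just transcribes the coefficients above):

```python
import sympy as sp

b = sp.symbols('b')
# Coefficient matrix of the system; the solution fails to be unique
# exactly when this determinant is zero.
A = sp.Matrix([[1, 2*b, -1],
               [2*b, 3, -b],
               [1, 2, 1]])
print(sp.solve(sp.Eq(A.det(), 0), b))
```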

Linear Algebra

Compute the determinants of the following matrices: 0 2 1 -1 4 3 -2 1 -4 3 0 1 -4 1 -1 2 3 0 4 2 0 -2 1 0 1
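
The row grouping of the numbers above was lost in transcription, so the code below does not reproduce those matrices; it is only a generic sketch of cofactor (Laplace) expansion, which works for any square matrix. The sample 3x3 is hypothetical:

```python
def det(M):
    # Recursive cofactor expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(len(M)))

print(det([[0, 2, 1], [-1, 4, 3], [-2, 1, -4]]))  # hypothetical example matrix
```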

Linear algebra problems

I have two questions that I need help with. 1) How would you find a basis of the kernel, a basis of the image, and determine the dimension of each for this matrix? (The matrix is in the attachment.) 2) Are the following 3 vectors linearly dependent? (See the attachment for the three vectors.) How can you decide?
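
Since the actual matrix is in an unavailable attachment, here is a sketch of the general method with a made-up placeholder matrix: nullspace() gives a kernel basis, columnspace() an image basis, and the dimensions follow from rank-nullity.

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 0, 1]])          # placeholder; the real matrix is in the attachment
print(A.nullspace())                # basis of the kernel
print(A.columnspace())              # basis of the image
print(A.rank(), A.cols - A.rank())  # dim(image), dim(kernel)
# For question 2: vectors are linearly dependent iff the rank of the
# matrix having them as columns is less than the number of vectors.
```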

Linear functions and equations

1. Determine whether each of the following is a function or not.
(a) f(x) = 1 if x > 1; 0 otherwise
(b) f(x) = 2 if x > 0; -2 if x < 0; 2 or -2 if x = 0; 0 otherwise
(c) f(x) = 5/x
2. Suppose you have a lemonade stand, and when you charge $1 per cup of lemonade you se...
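
For part 1(b), the rule assigns two different values at x = 0, so it cannot be a function. The defining test is mechanical, as this small sketch shows (the sample relation is hypothetical, chosen only for illustration):

```python
# A relation is a function iff no input is paired with two different outputs.
def is_function(pairs):
    seen = {}
    for x, y in pairs:
        if x in seen and seen[x] != y:
            return False
        seen[x] = y
    return True

# Rule (b) pairs 0 with both 2 and -2, so it is not a function:
print(is_function([(0, 2), (0, -2), (1, 2), (-1, -2)]))   # False
```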

Systems of Linear Equations : 3 Unknowns (Echelon Method)

The problem is to find all the possible solutions to the following:
Eq 1: x + y = 2
Eq 2: y + z = 3
Eq 3: x + 2y + z = 5

I set up my matrices as follows:
1 1 0 | 2
0 1 1 | 3
1 2 1 | 5
Operation 1: (-1*row 1 + row 3)
1 1 0 | 2
0 1 1 | 3
0 1 1 | 3
Operation 2: (-1*row 2 + row 3)
1 1 0 | 2
0 1 1 | 3
0 0 0 | 0
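
The last row reduces to all zeros (Eq 3 is just Eq 1 + Eq 2), so the system has a one-parameter family of solutions rather than a unique one. A minimal sympy cross-check:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
aug = sp.Matrix([[1, 1, 0, 2],
                 [0, 1, 1, 3],
                 [1, 2, 1, 5]])       # augmented matrix of the system
print(aug.rref()[0])                  # reduced row echelon form
print(sp.linsolve(aug, x, y, z))      # {(z - 1, 3 - z, z)}: infinitely many solutions
```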

Generating Linear Algebra

Vector Space and Subspaces, Euclidean 3-space. Problem: Show that the vectors u1 = (1,2,3), u2 = (0,1,2), u3 = (2,0,1) generate R3(R).
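
Three vectors generate R3 exactly when the matrix with those vectors as rows has rank 3, i.e., a nonzero determinant. A quick numpy check, as a sketch:

```python
import numpy as np

U = np.array([[1, 2, 3],
              [0, 1, 2],
              [2, 0, 1]])
print(np.linalg.det(U))            # ~3.0 (nonzero), so u1, u2, u3 span R^3
print(np.linalg.matrix_rank(U))    # 3
```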

Euclidean Linear Dependence

Is K = {f_1(x) = 1, f_2(x) = sin x, f_3(x) = cos x} ⊂ C[0,1] linearly independent or linearly dependent? Justify your answer.
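
One standard test: if the Wronskian of the three functions is not identically zero, they are linearly independent. A sympy sketch:

```python
import sympy as sp

x = sp.symbols('x')
W = sp.wronskian([1, sp.sin(x), sp.cos(x)], x)
print(sp.simplify(W))   # -1 (not identically zero) => linearly independent
```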

Euclidean Linear Dependence

1. Is G = { [1 -1; -1 0], [1 -4; 1 0], [1 -6; 1 0], [0 0; 1 0] } ⊂ M^2(R) linearly independent or linearly dependent? Justify your answer. (Each 2x2 matrix is written row by row, with rows separated by semicolons.)
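
Flattening each 2x2 matrix to a vector in R^4 turns this into a rank question: four matrices are linearly dependent iff the rank of the stacked vectors is below 4. A numpy sketch, assuming the reconstruction of the matrices above is right:

```python
import numpy as np

# Each 2x2 matrix flattened row by row into a vector in R^4.
V = np.array([[1, -1, -1, 0],
              [1, -4,  1, 0],
              [1, -6,  1, 0],
              [0,  0,  1, 0]])
print(np.linalg.matrix_rank(V))  # 3 < 4 => the set G is linearly dependent
```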

Normal Subgroups

Please see the attached file for the fully formatted problem. Let G be a group and let D = {(a,a,a) : a ∈ G}. Prove that D is a normal subgroup of G ⊕ G ⊕ G if and only if G is Abelian.
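
The statement is a proof exercise, but it can be sanity-checked by brute force on a small non-abelian group. The sketch below represents S3 as permutation tuples (an assumed example, not part of the original problem) and confirms that the diagonal D is not normal there:

```python
from itertools import permutations, product

G = list(permutations(range(3)))           # S3, the smallest non-abelian group

def comp(p, q):                            # composition: (p o q)(i) = p[q[i]]
    return tuple(p[q[i]] for i in range(3))

def inv(p):                                # inverse permutation
    r = [0] * 3
    for i, pi in enumerate(p):
        r[pi] = i
    return tuple(r)

D = {(a, a, a) for a in G}                 # the diagonal subgroup
normal = all(
    tuple(comp(comp(g[k], d[k]), inv(g[k])) for k in range(3)) in D
    for g in product(G, repeat=3) for d in D
)
print(normal)   # False: consistent with "D is normal iff G is abelian"
```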


Reverse Cauchy-Schwarz for a Skew Product

Suppose that a "skew" product of vectors in R2 is defined by (u,v) = u1v1 - u2v2. Prove that (u,v)^2 >= (u,u)(v,v). (Note: this is just the reverse of the Cauchy-Schwarz inequality for the ordinary dot product.)
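
The inequality follows by expanding both sides: the difference turns out to be a perfect square. A sympy sketch confirming the algebra:

```python
import sympy as sp

u1, u2, v1, v2 = sp.symbols('u1 u2 v1 v2', real=True)
skew = lambda a1, a2, b1, b2: a1*b1 - a2*b2   # the "skew" product
diff = skew(u1, u2, v1, v2)**2 - skew(u1, u2, u1, u2)*skew(v1, v2, v1, v2)
print(sp.factor(diff))   # (u1*v2 - u2*v1)**2, which is always >= 0
```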

Linear Algebra: Eigenvalues

Find the eigenvalues and eigenvectors of the matrix

A = (2 1)
    (9 2)

By transforming the matrix to the basis of eigenvectors, show explicitly that the matrix can be diagonalized in the eigenvector basis.
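
A sympy cross-check, as a minimal sketch: diagonalize() returns P (eigenvector columns) and D (eigenvalues on the diagonal), and P^-1 A P recovers D:

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [9, 2]])
P, D = A.diagonalize()
print(A.eigenvects())          # eigenvalues 5 and -1 with their eigenvectors
print(P.inv() * A * P == D)    # True: A is diagonal in the eigenvector basis
```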

Linear Algebra: Norms

For any x = (x1,...,xn), let us try to define a norm in two ways. Consider
(a) ||x||_1 = summation of |xi| from i=1 to n
(b) ||x||_b = summation of |xi - xj| from i,j=1 to n
Does either one of these formulas define a norm? If yes, show that all three axioms of a norm hold. If no, demonstrate which axiom fails.
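
Formula (a) is the usual 1-norm; formula (b) fails positive definiteness, since any constant vector gets "norm" zero. A small Python sketch exhibiting the failing axiom:

```python
def norm_a(x):                 # candidate (a): the 1-norm
    return sum(abs(xi) for xi in x)

def norm_b(x):                 # candidate (b): sum of pairwise differences
    return sum(abs(xi - xj) for xi in x for xj in x)

x = [1.0, 1.0, 1.0]            # a nonzero vector
print(norm_a(x), norm_b(x))    # 3.0 0.0 -> (b) gives 0 for x != 0, not a norm
```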

Linear Algebra: Orthogonal Projection

Consider the vector space C[0,1] with the scalar product: for any two functions f(x), g(x), (f,g) = integral from 0 to 1 of f(x)g(x)x dx. Find the orthogonal projection p of e^x onto x. Also find the norms of e^x and x.
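
With this weighted scalar product, the projection of e^x onto x is ((e^x, x)/(x, x)) x, and the norms are square roots of self-products. A sympy sketch:

```python
import sympy as sp

x = sp.symbols('x')
ip = lambda f, g: sp.integrate(f * g * x, (x, 0, 1))   # (f,g) with weight x

p = ip(sp.exp(x), x) / ip(x, x) * x        # orthogonal projection of e^x onto x
print(sp.simplify(p))                      # 4*(E - 2)*x
print(sp.sqrt(ip(sp.exp(x), sp.exp(x))))   # ||e^x||
print(sp.sqrt(ip(x, x)))                   # ||x|| = 1/2
```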

Linear algebra

Suppose S is a linear space defined below. Are the following mappings L linear transformations from S into itself? If the answer is yes, find the matrix representation of the transformation (in the standard basis): (a) S = P4, L(p(x)) = p(0) + x*p(1) + x^2*p(2) + x^3*p(4) (b) S = P4, L(p(x)) = x^3 + x*p'(x) + p(0) (c) S is a subspace of C[0,1] formed by...
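
For part (a), the matrix in the standard basis {1, x, x^2, x^3} (assuming this is the intended basis for P4) has as its columns the coordinate vectors of L applied to each basis element. A sympy sketch:

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2, x**3]          # assumed standard basis of P4
L = lambda p: (p.subs(x, 0) + x * p.subs(x, 1)
               + x**2 * p.subs(x, 2) + x**3 * p.subs(x, 4))
cols = [[sp.expand(L(b)).coeff(x, k) for k in range(4)] for b in basis]
M = sp.Matrix(cols).T    # column j holds the coefficients of L(basis[j])
print(M)
```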

Multiplicative Inverse

By brute force, find a multiplicative inverse to 31 mod 100. Is there only one, or can you find more?
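
A one-line brute-force search in Python; since gcd(31, 100) = 1 there is exactly one inverse among the residues 0..99 (namely 71), though every integer congruent to it mod 100 also works:

```python
# Try every residue class mod 100 and keep those that invert 31.
print([a for a in range(100) if (31 * a) % 100 == 1])   # [71]
```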

Solutions of Linear Equations

I would like a short explanation of Gaussian Elimination with partial pivoting and Gauss-Seidel. Also, explain when each applies or when one is better than the other. Please include some examples.
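
Roughly: Gaussian elimination with partial pivoting is a direct method (a fixed number of steps, a good general-purpose default), while Gauss-Seidel is iterative and suits large sparse systems that are, for example, strictly diagonally dominant. A minimal sketch comparing the two on a small dominant system (numpy's solve performs LU factorization with partial pivoting under the hood):

```python
import numpy as np

def gauss_seidel(A, b, iters=50):
    # Sweep row by row, always reusing the newest values of x;
    # converges for strictly diagonally dominant A.
    x = np.zeros_like(b)
    for _ in range(iters):
        for i in range(len(b)):
            s = A[i] @ x - A[i, i] * x[i]
            x[i] = (b[i] - s) / A[i, i]
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])      # strictly diagonally dominant
b = np.array([6.0, 8.0, 4.0])
print(gauss_seidel(A, b))            # iterative answer
print(np.linalg.solve(A, b))         # direct (LU with partial pivoting)
```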

Laplace equation with mixed boundary conditions

Uxx means the second derivative with respect to x; Uyy means the second derivative with respect to y.
Uxx + Uyy = 0, 0 < x < pi, 0 < y < 1
Ux(0,y) = 0 = U(pi,y), 0 < y < 1
U(x,0) = 1, U(x,1) = 0, 0 < x < pi
Please show all work, including how the eigenvalues and eigenfunctions are derived. Thank you.
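
As a pointer on where the eigenvalues come from (a sketch, not the full solution): writing U = X(x)Y(y), the homogeneous boundary conditions in x give the Sturm-Liouville problem below, whose solutions are cosines with half-integer frequencies.

```latex
X'' + \lambda X = 0,\quad X'(0) = 0,\quad X(\pi) = 0
\;\Longrightarrow\;
X_n(x) = \cos\big((n + \tfrac{1}{2})x\big),\qquad
\lambda_n = \big(n + \tfrac{1}{2}\big)^2,\qquad n = 0, 1, 2, \dots
```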

Linear Algebra: Vectors - Inner Product

Show that the functions x and x^2 are orthogonal in P5 with the inner product defined by <p,q> = summation from i=1 to 5 of p(xi)*q(xi), where xi = (i-3)/2 for i = 1,...,5. Show that ||x||_1 = summation from i=1 to n of |xi|. Show that ||x||_infinity = max over 1 <= i <= n of |xi|. Thank you for your explanation.
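
For the first part, orthogonality reduces to checking that the sum of xi^3 over the five sample points vanishes, which it does by symmetry about 0. An exact-arithmetic sketch:

```python
from fractions import Fraction

xs = [Fraction(i - 3, 2) for i in range(1, 6)]   # -1, -1/2, 0, 1/2, 1
print(sum(x * x**2 for x in xs))                 # <x, x^2> = sum of xi^3 = 0
```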

Linear Algebra: Vectors - Inner Products

Given a weight vector w, an inner product on R^n is defined by: <x,y> = summation from i=1 to n of xi*yi*wi. [a] Using this equation with weight vector w = (1/4, 1/2, 1/4)^T to define an inner product for R^3, let x = (1,1,1)^T and y = (-5,1,3)^T. Show that x and y are orthogonal with respect to this inner product. Compute the values of...
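
Part [a] is a one-line check: the weighted inner product of x and y comes out to zero. A numpy sketch:

```python
import numpy as np

w = np.array([0.25, 0.5, 0.25])        # weight vector
x = np.array([1.0, 1.0, 1.0])
y = np.array([-5.0, 1.0, 3.0])
print(np.sum(w * x * y))               # 0.0 -> x and y are orthogonal
```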

Linear Algebra: Linear Mapping

Consider the following linear mapping from C[-pi,pi] into itself: L(f) = integral from -pi to pi of G(x)*H(y)*f(y) dy, for any function f(x) in C[-pi,pi]. Here G(x), H(x) are given continuous functions. Find a function f such that L(f) = lambda*f for some lambda, and find the value of lambda. This is a generalization of the notion...
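
The natural candidate is f = G, which gives L(f) = (integral of H*G) * G, i.e., lambda = integral from -pi to pi of H(y)G(y) dy. A sympy sketch with assumed example kernels G = cos, H = cos (these are not from the original problem):

```python
import sympy as sp

x, y = sp.symbols('x y')
G, H = sp.cos(x), sp.cos(y)            # assumed example kernels
f = G                                  # candidate eigenfunction: f = G
Lf = G * sp.integrate(H * f.subs(x, y), (y, -sp.pi, sp.pi))
lam = sp.integrate(H * G.subs(x, y), (y, -sp.pi, sp.pi))
print(sp.simplify(Lf - lam * f))       # 0, so L(f) = lambda*f with lambda = pi
```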

Linear Algebra: Vector Spaces

Consider R2 with the following rules of addition and scalar multiplication: for each x = (x1,x2), y = (y1,y2), x + y = (x2+y2, x1+y1), and for any scalar alpha, alpha*x = (alpha*x1, alpha*x2). Is it a vector space? If not, demonstrate which axioms fail to hold. Also, show that Pn, the space of polynomials of order less than n, is a vector space.
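
A concrete counterexample makes the failure visible: with the swapped addition, (0,0) is not an additive identity, and no fixed vector can be, since the identity that x + e = x requires would depend on x. A small sketch:

```python
def add(x, y):                  # the modified addition on R^2
    return (x[1] + y[1], x[0] + y[0])

x = (1, 2)
print(add(x, (0, 0)))           # (2, 1) != (1, 2): the zero-vector axiom fails
```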