
Linear Algebra

Eigenvector Estimation : Inverse Power Method

Please see the attached file for the fully formatted problems. Use the inverse power method to estimate the eigenvector corresponding to the eigenvalue with smallest absolute value for the matrix

A = [ -1  -2  -1 ]
    [ -2  -4  -3 ]
    [  2   2   1 ]

where x0 = [1, 1, -1]. In finding A^-1, use exact arithmetic with fractions. In applyi...
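
A minimal numerical sketch of the inverse power iteration (in Python with numpy, using floating point and a linear solve in place of the exact fractional A^-1 the problem asks for):

```python
import numpy as np

# Matrix and starting vector as given in the posting
A = np.array([[-1.0, -2.0, -1.0],
              [-2.0, -4.0, -3.0],
              [ 2.0,  2.0,  1.0]])
x = np.array([1.0, 1.0, -1.0])

# Inverse power method: repeatedly solve A y = x and normalize.
# The iterates converge (up to sign) to the eigenvector belonging to
# the eigenvalue of A with smallest absolute value.
for _ in range(20):
    y = np.linalg.solve(A, x)
    x = y / np.linalg.norm(y, np.inf)

print("estimated eigenvector:", x)
print("Rayleigh quotient (eigenvalue estimate):", x @ A @ x / (x @ x))
```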

Cosets Groups

Note: C means set containment (not necessarily proper), |G:H| means the index of the subgroup H in G, U means union of sets, and E means set membership. Let K C H C G be groups. Show that both |G:H| and |H:K| are finite if and only if |G:K| is finite, and that in that case |G:K| = |G:H||H:K|. Hint: if |H:K| = n, let Kh1, Kh2, ..., Khn be the distinct cosets of ...
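
For orientation, the counting step behind the formula, assuming |G:H| = m and |H:K| = n are both finite, with Hg_1, ..., Hg_m the distinct cosets of H in G and Kh_1, ..., Kh_n the distinct cosets of K in H, runs roughly as:

```latex
G=\bigcup_{i=1}^{m} Hg_i=\bigcup_{i=1}^{m}\bigcup_{j=1}^{n} Kh_j g_i ,
\qquad\text{the } mn \text{ cosets } Kh_j g_i \text{ pairwise distinct}
\;\Longrightarrow\; |G:K| = |G:H|\,|H:K| .
```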

Linear Optimization Applications

Please see the attached file for full problem description. The local drug store sells a wide variety of cold medications. During the particularly harsh winter, the only three types left on the shelf were in the children's section. These are:

Signals - System Properties

I have difficulty in determining whether the signals are memoryless or causal. Please see the attached file for full problem description.

Linear velocity

Find the linear velocity of a point on the edge of a drum rotating 52 times per minute. The diameter of the wheel is 16.0 in. Please show all the steps. Thank you.
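
For reference, the usual computation (assuming the point travels on a circle of radius 8.0 in, half the stated diameter) is:

```latex
v = r\omega = (8.0\ \mathrm{in})\left(52 \cdot 2\pi\ \tfrac{\mathrm{rad}}{\mathrm{min}}\right)
  \approx 2.61\times 10^{3}\ \tfrac{\mathrm{in}}{\mathrm{min}}
  \approx 218\ \tfrac{\mathrm{ft}}{\mathrm{min}} .
```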

Proof : Diagonalization of Matrices

Please see the attached file for full problem description. Write a proof for the following statement: If A is an n x n upper triangular matrix with no two diagonal elements the same, then A is similar to a diagonal matrix. Show work.
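
One standard line of argument, sketched (the details, such as producing the eigenvectors by back-substitution, belong in the full proof):

```latex
% The eigenvalues of an upper triangular A are its diagonal entries
% a_{11}, \dots, a_{nn}; if these are pairwise distinct, A has n distinct
% eigenvalues and hence n linearly independent eigenvectors v_1, \dots, v_n.
% With P = [\,v_1 \ \cdots \ v_n\,]:
P^{-1} A P = \operatorname{diag}(a_{11}, \dots, a_{nn}) .
```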

Transformations : Diagonalization of Matrices

Please see the attached file for the full problem description. The linear operator T: R^3 -> R^3 is defined by T(x_1, x_2, x_3) = (x_1 - 3x_3, x_1 + 2x_2 + x_3, x_3 - 3x_1). Determine whether or not there is a basis F for R^3 relative to which the transformation T can be represented by a diagonal matrix D = [T]_F. If there is, ...
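
A quick numerical cross-check of the hand computation (the matrix below is the standard-basis matrix of T; this assumes numpy is acceptable as a checking tool, not as the requested solution):

```python
import numpy as np

# Matrix of T(x1, x2, x3) = (x1 - 3*x3, x1 + 2*x2 + x3, x3 - 3*x1)
# with respect to the standard basis of R^3
A = np.array([[ 1.0, 0.0, -3.0],
              [ 1.0, 2.0,  1.0],
              [-3.0, 0.0,  1.0]])

# If A has 3 linearly independent eigenvectors, their coordinate vectors
# form a basis F and D = P^{-1} A P is diagonal.
eigvals, P = np.linalg.eig(A)
D = np.linalg.inv(P) @ A @ P
print("eigenvalues:", eigvals)       # expect 4, 2, -2 (in some order)
print("D rounded:", np.round(D, 10))
```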

Matrices : Finding the Rank

How to find the rank of a matrix: definitions and an example (a 4x4 matrix) with detailed explanations. Find the rank of

A = [ 1   0   2  0 ]
    [ 4   0   3  0 ]
    [ 5   0  -1  0 ]
    [ 2  -3   1  1 ]

Show all work.
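
A one-line numerical cross-check (the hand elimination should give rank 3, since rows 1-3 involve only columns 1 and 3 while row 4 brings in column 2):

```python
import numpy as np

A = np.array([[1,  0,  2, 0],
              [4,  0,  3, 0],
              [5,  0, -1, 0],
              [2, -3,  1, 1]])

# numpy estimates the rank from the singular values; it should agree with
# the count of nonzero rows left after Gaussian elimination done by hand.
print(np.linalg.matrix_rank(A))
```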

Linear algebra problems

I have two questions that I need help with. 1) How would you find a basis of the kernel, a basis of the image, and determine the dimension of each for this matrix? The matrix is in the attachment. 2) Are the following 3 vectors linearly dependent? (See the attachment for the three vectors.) How can you decide? I hope y...

Linear functions and equations

1. Determine whether each of the following is a function or not.
(a) f(x) = 1 if x > 1; 0 otherwise
(b) f(x) = 2 if x > 0; -2 if x < 0; 2 or -2 if x = 0; 0 otherwise
(c) f(x) = 5/x
2. Suppose you have a lemonade stand, and when you charge $1 per cup of lemonade you se...

Systems of Linear Equations : 3 Unknowns (Echelon Method)

The problem is to find all the possible solutions to the following:
Eq 1: x + y = 2
Eq 2: y + z = 3
Eq 3: x + 2y + z = 5
I set up my matrices as follows:
[ 1 1 0 | 2 ]
[ 0 1 1 | 3 ]
[ 1 2 1 | 5 ]
Operation 1 (-1*row 1 + row 3):
[ 1 1 0 | 2 ]
[ 0 1 1 | 3 ]
[ 0 1 1 | 3 ]
Operation 2 (-1*row 2 + row 3):
[ 1 1 0 | 2 ]
[ 0 1 1 | 3 ]
[ 0 0 ...
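
Finishing the reduction the poster started: the last operation produces a zero row, so the system is consistent with one free variable, and all solutions can be written parametrically, for example as

```latex
\left[\begin{array}{ccc|c}1&1&0&2\\0&1&1&3\\0&0&0&0\end{array}\right]
\;\Longrightarrow\;
x = 2 - t,\quad y = t,\quad z = 3 - t,\qquad t \in \mathbb{R}.
```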

Generating Linear Algebra

Vector Spaces and Subspaces; Euclidean 3-space. Problem: Show that the vectors u1 = (1,2,3), u2 = (0,1,2), u3 = (2,0,1) generate R^3 (as a vector space over R).
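
A short numerical check of the same fact (the three vectors generate R^3 exactly when the matrix having them as columns is invertible):

```python
import numpy as np

# u1, u2, u3 as columns of U
U = np.array([[1, 0, 2],
              [2, 1, 0],
              [3, 2, 1]])

print(np.linalg.det(U))  # nonzero (it equals 3), so the vectors span R^3
```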

Euclidean Linear Dependence

1. Is H = the column vectors of the matrix
M = [ 1  0   2  0 ]
    [ 4  0   3  0 ]
    [ 5  0  -1  0 ]
c R^4 linearly independent or linearly dependent? Justify your answer.

Linear algebra

Suppose S is a linear space defined below. Are the following mappings L linear transformations from S into itself? If the answer is yes, find the matrix representations of the transformations (in the standard basis): (a) S = P4, L(p(x)) = p(0) + x*p(1) + x^2*p(2) + x^3*p(4) (b) S = P4, L(p(x)) = x^3 + x*p'(x) + p(0) (c) S is a subspace of C[0,1] formed by ...
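
For part (a) only, a small sympy sketch of how the matrix representation can be assembled column by column (taking the formula exactly as written in the post, including p(4)):

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2, x**3]  # standard basis of P4

# Part (a): L(p)(x) = p(0) + x*p(1) + x^2*p(2) + x^3*p(4)
def L(p):
    return p.subs(x, 0) + x*p.subs(x, 1) + x**2*p.subs(x, 2) + x**3*p.subs(x, 4)

# Column j of [L] holds the coordinates of L applied to the j-th basis polynomial
cols = [[sp.expand(L(b)).coeff(x, k) for k in range(4)] for b in basis]
print(sp.Matrix(cols).T)
```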

Solutions for equations

Consider the equation Ax = b, with
A = [ 1   1   a ]
    [ 1  -1   1 ]
    [ 2  -1  -1 ]
b = [ 6+b ]
    [  b  ]
    [  b  ]
For which values of a and b does this system have no solution? Infinitely many solutions? A unique solution? If possible, find the solution x explicitly in terms of a and b.
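
A symbolic sketch of the same analysis with sympy (the generic solution it prints is only valid when det(A) is nonzero; the degenerate case must be examined by hand):

```python
import sympy as sp

a, b = sp.symbols('a b')
x1, x2, x3 = sp.symbols('x1 x2 x3')

A = sp.Matrix([[1,  1,  a],
               [1, -1,  1],
               [2, -1, -1]])
rhs = sp.Matrix([6 + b, b, b])

# A unique solution exists exactly when det(A) != 0; the value of a that
# makes det(A) = 0 then has to be checked separately for consistency in b.
print(sp.factor(A.det()))                      # a + 5

eqs = list(A * sp.Matrix([x1, x2, x3]) - rhs)  # the three linear equations
print(sp.solve(eqs, [x1, x2, x3], dict=True))  # generic solution (a != -5)
```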

Solutions of Linear Equations

I would like a short explanation of Gaussian Elimination with partial pivoting and Gauss-Seidel. Also, explain when each applies or when one is better than the other. Please include some examples.
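
As a toy illustration of the iterative method, here is a minimal Gauss-Seidel sweep on a small diagonally dominant system (an assumption made so that convergence is guaranteed), compared against a direct solve of the kind a pivoting elimination would produce:

```python
import numpy as np

# Toy diagonally dominant system (Gauss-Seidel converges here);
# partial-pivoting elimination would instead factor A directly.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])

x = np.zeros(3)
for _ in range(25):                  # Gauss-Seidel sweeps
    for i in range(3):
        s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
        x[i] = (b[i] - s) / A[i, i]

print("Gauss-Seidel:", x)
print("direct solve:", np.linalg.solve(A, b))
```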

Wave equation with mixed boundary conditions

Uxx means the second derivative with respect to x; Uyy means the second derivative with respect to y.
Uxx + Uyy = 0, 0 < x < pi, 0 < y < 1
Ux(0,y) = 0 = U(pi,y), 0 < y < 1
U(x,0) = 1, U(x,1) = 0, 0 < x < pi
Please show all work, including how the eigenvalues and eigenfunctions are derived. Thank you.
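
Separating variables U = X(x)Y(y), the boundary conditions at x = 0 and x = pi produce the eigenvalue problem below (a sketch of the first step only):

```latex
X'' + \lambda X = 0,\quad X'(0) = 0,\ X(\pi) = 0
\;\Longrightarrow\;
X_n(x) = \cos\!\left(\left(n+\tfrac12\right)x\right),\quad
\lambda_n = \left(n+\tfrac12\right)^{2},\quad n = 0, 1, 2, \dots
```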

Linear Algebra: Vectors - Inner Product

Show that the functions x and x^2 are orthogonal in P5 with the inner product defined by <p,q> = sum from i=1 to n of p(xi)*q(xi), where xi = (i-3)/2 for i = 1,...,5. Show that ||x||_1 = sum from i=1 to n of the absolute value of x_i. Show that ||x||_infinity = max over 1 <= i <= n of the absolute value of x_i. Thank you for your explanation.
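
For the orthogonality part, the computation reduces to a symmetric sum of cubes over the nodes x_i in {-1, -1/2, 0, 1/2, 1}:

```latex
\langle x,\,x^{2}\rangle = \sum_{i=1}^{5} x_i \cdot x_i^{2} = \sum_{i=1}^{5} x_i^{3}
= (-1)^{3} + \left(-\tfrac12\right)^{3} + 0^{3} + \left(\tfrac12\right)^{3} + 1^{3} = 0 .
```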

Linear Algebra : Vectors - Inner Products

Given a weight vector w, an inner product on R^n is defined by: <x,y> = summation from i=1 to n of x_i*y_i*w_i. [a] Use this equation with weight vector w = (1/4, 1/2, 1/4)^T to define an inner product for R^3, and let x = (1,1,1)^T and y = (-5,1,3)^T. Show that x and y are orthogonal with respect to this inner product. Compute the values of ...
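
For part [a], the orthogonality check is a three-term sum:

```latex
\langle x, y\rangle = \sum_{i=1}^{3} x_i\, y_i\, w_i
= (1)(-5)\tfrac14 + (1)(1)\tfrac12 + (1)(3)\tfrac14
= -\tfrac54 + \tfrac12 + \tfrac34 = 0 .
```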

Linear Algebra and Numerical Analysis Polynomials

Questions on a sequence of polynomials. See the attached file for the full problem description; the defining formulas appear only there. 1) Show that ... is a polynomial of degree k. Calculate the coefficient of ... of ... . 2) Show by induction that ... for all real ... . 3) Deduce that if ..., ... . 4) Show that for all whole nat...

Linear Algebra: Linear Mapping

Consider the following linear mapping from C[-pi,pi] into itself: L(f)(x) = integral from -pi to pi of G(x)*H(y)*f(y) dy for any function f(x) in C[-pi,pi]. Here G(x) and H(x) are given continuous functions. Find a function f such that L(f) = lambda*f for some lambda, and find the value of lambda. This is a generalization of the notion ...
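
Since L(f)(x) = G(x) times a number depending on f, every image of L is a scalar multiple of G, which suggests trying f = G:

```latex
L(G)(x) = G(x)\int_{-\pi}^{\pi} H(y)\, G(y)\, dy = \lambda\, G(x),
\qquad \lambda = \int_{-\pi}^{\pi} H(y)\, G(y)\, dy .
```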

Linear Algebra: Vector Spaces

Consider R^2 with the following rules of addition and scalar multiplication: for each x = (x1,x2), y = (y1,y2), x + y = (x2+y2, x1+y1), and for any scalar alpha, alpha*x = (alpha*x1, alpha*x2). Is it a vector space? If not, demonstrate which axioms fail to hold. Also, show that Pn, the space of polynomials of order less than n, is a vector space ...
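
One axiom that fails quickly under the modified addition (with the standard scalar multiplication) is (alpha + beta)x = alpha*x + beta*x; for example:

```latex
x = (1,2),\ \alpha = \beta = 1:\qquad
(\alpha+\beta)x = 2x = (2,4),
\qquad
\alpha x + \beta x = x + x = (x_2 + x_2,\; x_1 + x_1) = (4,2) .
```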

Linear Algebra: Matrix of Transformation

Are the following examples linear transformations from P3 to P4? If yes, compute the matrix of the transformation in the standard bases of P3 {1, x, x^2} and P4 {1, x, x^2, x^3}. (a) L(p(x)) = x^3*p''(x) + x^2*p'(x) - x*p(x) (b) L(p(x)) = x^2*p''(x) + p(x)*p''(x) (c) L(p(x)) = x^3*p(1) + x*p(0)
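
For part (b), a two-line sympy check that homogeneity fails (so (b) is not linear); the helper name L_b is just for this illustration:

```python
import sympy as sp

x = sp.symbols('x')

def L_b(p):
    """Candidate (b): L(p) = x^2 * p'' + p * p''."""
    return sp.expand(x**2 * sp.diff(p, x, 2) + p * sp.diff(p, x, 2))

p = x**2
print(L_b(2*p))   # 12*x**2
print(2*L_b(p))   # 8*x**2  -> L(2p) != 2*L(p), so (b) is not linear
```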

Linear Algebra: Find a Vector in a Basis

In the standard basis of P3 (i.e. {1, x, x^2}), p(x) = 3 - 2x + 5x^2 has coordinates p = (3, -2, 5). Find the coordinates of this vector (polynomial) in the basis {1-x, 1+x, x^2-1}.
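
The coordinate change amounts to solving a small linear system obtained by matching coefficients:

```latex
a(1-x) + b(1+x) + c(x^{2}-1) = 3 - 2x + 5x^{2}
\;\Longrightarrow\;
\begin{cases} a + b - c = 3 \\ -a + b = -2 \\ c = 5 \end{cases}
\;\Longrightarrow\; (a,b,c) = (5,3,5) .
```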