(See attached file for full problem description with image)
Given our function f(x) = x^3 + 2x^2 - 5x - 6, the shaded region required is the sum of two areas, which we will call A (the shaded region above the x-axis) and B (the shaded region below the x-axis).
To find A we need to integrate f(x) with respect to x between two limits: from the lowest value of x in region A to the greatest value of x in region A.
To find these limits we must calculate the values of x at which f(x) crosses the x-axis, and, as you know, these are the roots of our cubic equation.
There are a number of ways to find the roots of a cubic equation; the easiest approach in this case is to factorise it.
To factorise f(x) we can use the polynomial factor theorem, which tells us that:
if f(a) = 0 then (x-a) is a factor of f(x).
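The factor theorem also suggests a quick way to hunt for roots by hand or by machine: try small integer values of a and keep those where f(a) = 0. A minimal Python sketch (the function name f and the search range are my own choices, not part of the original problem):

```python
# Factor theorem root search: if f(a) == 0, then (x - a) is a factor.
def f(x):
    return x**3 + 2*x**2 - 5*x - 6

# Try small integer candidates; a cubic has at most three real roots.
roots = [a for a in range(-10, 11) if f(a) == 0]
print(roots)  # -> [-3, -1, 2]
```

This only finds integer roots, of course, but for a textbook cubic whose graph clearly crosses the axis at whole numbers it is usually enough.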
From your graph, you might try the value a = -1:
f(-1) = (-1)^3 + 2(-1)^2 - 5(-1) - 6 = -1 + 2 + 5 - 6 = 0
Since f(-1) = 0, the condition of the factor theorem is satisfied, so:
=> (x - (-1)), i.e. (x + 1), is a factor of f(x).
We can use this to factorise f(x) completely and so find all the roots:
Dividing f(x) by (x + 1) gives a quadratic that factorises easily:
x^3 + 2x^2 - 5x - 6 = (x + 1)(x^2 + x - 6) = (x + 1)(x + 3)(x - 2)
So the roots are x = -3, x = -1 and x = 2.
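It is always worth checking a factorisation by expanding it again. One way is to multiply the coefficient lists of the factors; the helper below is a sketch (poly_mul is my own name, not a library function):

```python
def poly_mul(p, q):
    # Multiply two polynomials given as coefficient lists,
    # highest degree first, e.g. [1, 1] represents x + 1.
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

# Expand (x + 1)(x + 3)(x - 2):
coeffs = poly_mul(poly_mul([1, 1], [1, 3]), [1, -2])
print(coeffs)  # -> [1, 2, -5, -6], i.e. x^3 + 2x^2 - 5x - 6
```

The expansion recovers the original coefficients, confirming the factorisation.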
From your graph, f(x) lies above the x-axis between x = -3 and x = -1 (region A) and below it between x = -1 and x = 2 (region B). So A is the integral of f(x) from -3 to -1, and B is the integral of f(x) from -1 to 2. Because B lies below the x-axis its integral comes out negative, so the total shaded area is A + |B|.
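To carry out the integration, the antiderivative of f(x) is F(x) = x^4/4 + 2x^3/3 - 5x^2/2 - 6x, and each area is a difference of F at the limits. A short sketch using exact rational arithmetic (the name F and the use of Python's fractions module are my own choices):

```python
from fractions import Fraction

def F(x):
    # Antiderivative of f(x) = x^3 + 2x^2 - 5x - 6:
    # F(x) = x^4/4 + 2x^3/3 - 5x^2/2 - 6x
    x = Fraction(x)
    return x**4 / 4 + Fraction(2, 3) * x**3 - Fraction(5, 2) * x**2 - 6 * x

A = F(-1) - F(-3)   # region above the axis, from x = -3 to x = -1
B = F(2) - F(-1)    # region below the axis, from x = -1 to x = 2 (negative)
total = A + abs(B)  # total shaded area
print(A, B, total)  # -> 16/3 -63/4 253/12
```

Using Fraction avoids floating-point rounding, so the answer comes out exactly: A = 16/3, |B| = 63/4, giving a total shaded area of 253/12.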