By considering the integral of (z^2+1)^(-a) around a suitable contour C, prove:
Integral from x=0 to x=infinity of dx/(x^2+1)^a = sin(pi*a) Integral from u=1 to u=infinity of du/(u^2-1)^a
where 1/2 < a < 1.
(Include proofs that the integrals over any large or small circular arcs tend to
zero as their radii tend to infinity or zero, whichever applies. Observe that
(z^2 + 1)^(-a) has branch points at z = ±i and z = infinity.)
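Before constructing the proof, it is worth sanity-checking the claimed identity numerically. The sketch below (assuming scipy is available; the value a = 0.75 is an arbitrary choice in the allowed range 1/2 < a < 1) evaluates both sides with adaptive quadrature:

```python
import numpy as np
from scipy.integrate import quad

a = 0.75  # any value with 1/2 < a < 1 will do

# Left-hand side: integral of (x^2+1)^(-a) from x = 0 to infinity.
lhs, _ = quad(lambda x: (x * x + 1.0) ** (-a), 0.0, np.inf)

# Right-hand side: sin(pi*a) times the integral of (u^2-1)^(-a) from
# u = 1 to infinity.  The integrand has an integrable singularity at
# u = 1 (since a < 1), so we split the range there to help quad.
i1, _ = quad(lambda u: (u * u - 1.0) ** (-a), 1.0, 2.0)
i2, _ = quad(lambda u: (u * u - 1.0) ** (-a), 2.0, np.inf)
rhs = np.sin(np.pi * a) * (i1 + i2)

print(lhs, rhs)  # the two sides agree to quadrature accuracy
```

Both integrals converge precisely because 1/2 < a < 1: the large-u behaviour u^(-2a) needs a > 1/2, and the endpoint singularity (u-1)^(-a) needs a < 1.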
You first have to define the function f(z) = (z^2+1)^(-a) as a single-valued function. To see how this is done, consider the simpler case of z^p for fractional p. In polar coordinates we can write this as:
z^p = r^p exp(i p theta)
The problem is then how to define the polar angle theta unambiguously: if you go once round the origin, theta increases by 2 pi, so the function picks up a factor exp(2 pi i p), yet you are back where you started, so the function value is no longer unique. To fix this problem, we introduce a branch cut. This is a line (or a curve) that starts at the origin and (in this case) runs to infinity. When you go round the origin anti-clockwise and cross this line, theta jumps by minus 2 pi, so after a full circuit theta is back at its starting value. Every point in the complex plane is thus assigned a unique value of theta, and the function z^p is defined unambiguously.
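This recipe can be made concrete in a few lines. The sketch below (function name and the tiny offsets are my own choices) takes theta = arg(z) in (-pi, pi], which places the branch cut along the negative real axis, and then exhibits the jump exp(-2 pi i p) across that cut:

```python
import cmath

def z_to_p(z, p):
    """z**p = r**p * exp(i*p*theta), with theta = arg(z) in (-pi, pi].

    This choice of angle puts the branch cut along the negative real
    axis: crossing it makes theta jump by 2*pi, so z**p jumps by a
    factor exp(2j*pi*p) there, but everywhere else the function is
    single valued and continuous.
    """
    r = abs(z)
    theta = cmath.phase(z)  # phase() returns arg(z) in (-pi, pi]
    return r ** p * cmath.exp(1j * p * theta)

p = 0.5
above = z_to_p(complex(-1.0, +1e-12), p)  # just above the cut: ~ i
below = z_to_p(complex(-1.0, -1e-12), p)  # just below the cut: ~ -i
jump = below / above                      # ~ exp(-2j*pi*p) = -1
```

Away from the cut the two factors agree, which is exactly the statement that the branch cut has made z^p single valued on the cut plane.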
The point z = 0, where the function fails to be analytic, is called a branch point, and the line across which theta jumps is called a branch cut. Making the function single valued with a branch cut necessarily makes it discontinuous along that cut, so care is needed in contour integration, where we want the function to be analytic inside the contour: the contour must not cross a branch cut and must not enclose a branch point singularity.
In the case of our function, f(z) = (z^2+1)^(-a), we have branch point singularities at z = i and z = -i (and also at z = infinity, but that is of no concern in this problem). To define the function unambiguously, we use local polar coordinates relative to each branch point and introduce branch cuts that fix the two polar angles. Let's write our function as:
f(z) = (z^2+1)^(-a) = (z+i)^(-a) (z-i)^(-a)
What we need to do, then, is define the two factors (z+i)^(-a) and (z-i)^(-a). Before committing to a precise definition, let's think about this in general terms. The first factor, (z+i)^(-a), has a branch point singularity at z = -i, and we need a branch cut starting there. Then we can write z+i as r exp(i theta) unambiguously. The polar angle theta is zero when we are directly to the right of the point z = -i, i.e. z = -i + (a positive real number), and it increases as you move anti-clockwise around z = -i until you cross the branch cut.
We treat the factor (z-i)^(-a) in the same way: it has a branch point singularity at z = i, and we need a branch cut starting there.