% LaTeX
\documentclass[leqno]{article}
\usepackage[utf8x]{inputenc}
\usepackage{ucs}
\usepackage{soul}
\usepackage{url}
\usepackage{braket}
\usepackage{graphicx}
\usepackage[intlimits]{amsmath}
\usepackage{amssymb}
\usepackage{amsfonts}
\usepackage{bm}
\usepackage{gellmu2}
\usepackage[margin=100bp,nohead]{geometry}
\setlength{\parskip}{6bp}
\setlength{\parindent}{0bp}
\pagestyle{plain}
\thispagestyle{empty}
\title{About the Gamma Function}
\newlength{\centerskip}
\setlength{\centerskip}{\topsep}
\newcommand{\hsf}{\hspace*{\fill}}
\newcommand{\tdbc}[1]{\hsf\textbf{#1}\hsf}
\begin{document}
\begin{center}\LARGE\bfseries{}
About the Gamma Function
\end{center}
\begin{center}\large\bfseries{}
Notes for Honors Calculus II, \\[0.25\baselineskip] Originally Prepared in Spring 1995
\end{center}
\medskip
\section*{1\ \ \label{SU-1}Basic Facts about the Gamma Function}
\par{The Gamma function is defined by the improper integral \[ \Gamma{}(x) \ = \ \int_{0}^{\infty{}} \nolimits {t^{x} e^{-t} \frac{dt}{t} } \ . \] The integral is absolutely convergent for \( x \geq{} 1 \) since \[ t^{x-1} e^{-t} \leq{} e^{-t/2} \ , \quad{} t \gg{} 1 \ \] and \(\int_{0}^{\infty{}} \nolimits {e^{-t/2} dt } \) is convergent. The preceding inequality is valid, in fact, for all \(x\). But for \( x < 1 \) the integrand becomes infinitely large as \(t\) approaches \(0\) through positive values. Nonetheless, the limit \[ \lim_{r \rightarrow{} 0^{+}} \int_{r}^{1} \nolimits {t^{x-1} e^{-t} dt } \] exists for \( x > 0 \) since \[ t^{x-1} e^{-t} \leq{} t^{x-1} \] for \(t > 0\), and, therefore, the limiting value of the preceding integral is no larger than that of \[ \lim_{r \rightarrow{} 0^{+}} \int_{r}^{1} \nolimits {t^{x-1} dt } \ = \ \frac{1}{x} \ . \] Hence, \(\Gamma{}(x)\) is defined by the first formula above for all values \( x > 0 \).
}
\par{If one integrates by parts the integral \[ \Gamma{}(x + 1) \ = \ \int_{0}^{\infty{}} \nolimits {t^{x} e^{-t} dt } \ , \] writing \[ \int_{0}^{\infty{}} \nolimits {udv } \ = \ u(\infty{})v(\infty{}) - u(0)v(0) - \int_{0}^{\infty{}} \nolimits {vdu } \ ,\] with \(dv \, = \, e^{-t}dt\) and \(u \, = \, t^{x}\), one obtains the \emph{functional equation} \[ \Gamma{}(x+1) \ = \ x \Gamma{}(x) \ , \ \ x > 0 \ . \]
}
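\par{Spelled out, with \(u \, = \, t^{x}\) and \(v \, = \, -e^{-t}\), the boundary terms vanish, since \(t^{x} e^{-t}\) approaches \(0\) both as \(t \rightarrow{} 0+\) (because \(x > 0\)) and as \(t \rightarrow{} \infty{}\), and one has \[ \Gamma{}(x+1) \ = \ \left[-t^{x} e^{-t}\right]_{t=0}^{t=\infty{}} + \int_{0}^{\infty{}} \nolimits {x t^{x-1} e^{-t} dt } \ = \ x \Gamma{}(x) \ . \]
}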
\par{Obviously, \(\Gamma{}(1) \, = \, \int_{0}^{\infty{}} \nolimits {e^{-t} dt } \, = \, 1 \), and, therefore, \(\Gamma{}(2) \, = \, 1 \cdot{} \Gamma{}(1) \, = \, 1\), \ \(\Gamma{}(3) \, = \, 2 \cdot{} \Gamma{}(2) \, = \, 2!\), \ \(\Gamma{}(4) \, = \, 3 \cdot{} \Gamma{}(3) \, = \, 3!\), \ldots{}, and, finally, \[ \Gamma{}(n+1) \ = \ n! \] for each integer \(n > 0\).
}
\par{Thus, the gamma function provides a way of giving a meaning to the ``factorial'' of any positive real number.
}
\par{Another reason for interest in the gamma function is its relation to integrals that arise in the study of probability. The graph of the function \(\varphi{}\) defined by \[ \varphi{}(x) \ = \ e^{-x^{2}} \] is the famous ``bell-shaped curve'' of probability theory. It can be shown that the anti-derivatives of \(\varphi{}\) are not expressible in terms of elementary functions. On the other hand, \[ \Phi{}(x) \ = \ \int_{-\infty{}}^{x} \nolimits {\varphi{}(t) dt } \] is, by the fundamental theorem of calculus, an anti-derivative of \(\varphi{}\), and information about its values is useful. One finds that \[ \Phi{}(\infty{}) \ = \ \int_{-\infty{}}^{\infty{}} \nolimits {e^{-t^{2}} dt } \ = \ \Gamma{}(1/2) \] by observing that \[ \int_{-\infty{}}^{\infty{}} \nolimits {e^{-t^{2}} dt } \ = \ 2 \cdot{} \int_{0}^{\infty{}} \nolimits {e^{-t^{2}} dt } \ , \] and that upon making the substitution \(t\, = \,u^{1/2}\) in the latter integral, one obtains \(\Gamma{}(1/2)\).
}
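\par{In detail, the substitution \(t \, = \, u^{1/2}\) gives \(dt \, = \, \frac{1}{2} u^{-1/2} du\), so that \[ 2 \cdot{} \int_{0}^{\infty{}} \nolimits {e^{-t^{2}} dt } \ = \ 2 \cdot{} \int_{0}^{\infty{}} \nolimits {e^{-u} \cdot{} \frac{1}{2} u^{-1/2} du } \ = \ \int_{0}^{\infty{}} \nolimits {u^{1/2} e^{-u} \frac{du}{u} } \ = \ \Gamma{}(1/2) \ . \]
}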
\par{To have some idea of the size of \(\Gamma{}(1/2)\), it will be useful to consider the qualitative nature of the graph of \(\Gamma{}(x)\). For that one wants to know the derivative of \(\Gamma{}\).
}
\par{By definition \(\Gamma{}(x)\) is an integral (a definite integral with respect to the dummy variable \ \(t\)) of a function of \(x\) and \(t\). Intuition suggests that one ought to be able to find the derivative of \(\Gamma{}(x)\) by taking the integral (with respect to \(t\)) of the derivative with respect to \(x\) of the integrand. Unfortunately, there are examples where this fails to be correct; on the other hand, it is correct in most situations where one is inclined to do it. The methods required to justify ``differentiation under the integral sign'' will be regarded as slightly beyond the scope of this course. A similar stance will be adopted also for differentiation of the sum of a convergent infinite series.
}
\par{Since \[ \frac{d}{dx} t^{x} \ = \ t^{x}(\log t) \ , \] one finds \[ \frac{d}{dx} \Gamma{}(x) \ = \ \int_{0}^{\infty{}} \nolimits {t^{x} (\log t) e^{-t} \frac{dt}{t} }\ , \] and, differentiating again, \[ \frac{d^{2}}{dx^{2}} \Gamma{}(x) \ = \ \int_{0}^{\infty{}} \nolimits {t^{x} (\log t)^{2} e^{-t} \frac{dt}{t} } \ . \] One observes that in the integrals for both \(\Gamma{}\) and the second derivative \(\Gamma{}^{\prime\prime{}}\) the integrand is always positive. Consequently, one has \(\Gamma{}(x) > 0\) and \(\Gamma{}^{\prime\prime{}}(x) > 0\) for all \(x > 0\). This means that the derivative \(\Gamma{}^{\prime{}}\) of \(\Gamma{}\) is a strictly increasing function; one would like to know where it becomes positive.
}
\par{If one differentiates the functional equation \[ \Gamma{}(x+1) \ = \ x \Gamma{}(x) \ , \ \ x > 0 \ , \] one finds \[ \psi{}(x+1) \ = \ \frac{1}{x} + \psi{}(x) \ , \ \ x > 0 \ , \] where \[ \psi{}(x) \ = \ \frac{d}{dx} \log \Gamma{}(x) \ = \ \frac{\Gamma{}^{\prime{}}(x)}{\Gamma{}(x)} \ , \] and, consequently, \[ \psi{}(n+1) \ = \ \psi{}(1) + \sum_{k=1}^{n} {\frac{1}{k} }\ . \] Since the harmonic series diverges, its partial sum in the foregoing line approaches \(\infty{}\) as \(n \rightarrow{} \infty{}\). Inasmuch as \(\Gamma{}^{\prime{}}(x) \, = \, \psi{}(x)\Gamma{}(x)\), it is clear that \(\Gamma{}^{\prime{}}\) approaches \(\infty{}\) as \(x \rightarrow{} \infty{}\) since \(\Gamma{}^{\prime{}}\) is steadily increasing and its integer values \((n-1)!\,\psi{}(n)\) approach \(\infty{}\). Because \(2 \, = \, \Gamma{}(3) > 1 \, = \, \Gamma{}(2)\), it follows that \(\Gamma{}^{\prime{}}\) cannot be negative everywhere in the interval \(2 \leq{} x \leq{} 3\), and, therefore, since \(\Gamma{}^{\prime{}}\) is increasing, \(\Gamma{}^{\prime{}}\) must be always positive for \(x \geq{} 3\). As a result, \(\Gamma{}\) must be increasing for \(x \geq{} 3\), and, since \(\Gamma{}(n + 1) \, = \, n!\), one sees that \(\Gamma{}(x)\) approaches \(\infty{}\) as \(x \rightarrow{} \infty{}\).
}
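\par{The formula for \(\psi{}(n+1)\) comes from iterating the functional equation for \(\psi{}\): \[ \psi{}(n+1) \ = \ \frac{1}{n} + \psi{}(n) \ = \ \frac{1}{n} + \frac{1}{n-1} + \psi{}(n-1) \ = \ \cdots{} \ = \ \psi{}(1) + \sum_{k=1}^{n} {\frac{1}{k}} \ . \]
}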
\par{It is also the case that \(\Gamma{}(x)\) approaches \(\infty{}\) as \(x \rightarrow{} 0\). To see this divergence one observes that the integral from \(0\) to \(\infty{}\) defining \(\Gamma{}(x)\) is greater than the integral from \(0\) to \(1\) of the same integrand. Since \(e^{-t} \geq{} 1/e\) for \(0 \leq{} t \leq{} 1\), one has \[\Gamma{}(x)>\int_{0}^{1} \nolimits {(1/e)t^{x-1}dt} \ = \ (1/e)\left[\frac{t^{x}}{x}\right]_{t=0}^{t=1}\ = \ \frac{1}{ex}\ .\] It then follows from the mean value theorem, combined with the fact that \(\Gamma{}^{\prime{}}\) always increases, that \(\Gamma{}^{\prime{}}(x)\) approaches \(-\infty{}\) as \(x \rightarrow{} 0\).
}
\par{Hence, there is a unique number \(c > 0\) for which \(\Gamma{}^{\prime{}}(c) \, = \, 0\), and \(\Gamma{}\) decreases steadily from \(\infty{}\) to the minimum value \(\Gamma{}(c)\) as \(x\) varies from \(0\) to \(c\) and then increases to \(\infty{}\) as \(x\) varies from \(c\) to \(\infty{}\). Since \(\Gamma{}(1) \, = \, 1 \, = \, \Gamma{}(2)\), the number \(c\) must lie in the interval from \(1\) to \(2\) and the minimum value \(\Gamma{}(c)\) must be less than \(1\).
}
\par{\begin{center}
\includegraphics[width=0.3\textwidth]{grmplgamma}\\[0.125\baselineskip] Figure~1: Graph of the Gamma Function
\end{center}
}
\par{Thus, the graph of \(\Gamma{}\) (see Figure~1) is concave upward and lies entirely in the first quadrant of the plane. It has the \(y\)-axis as a vertical asymptote. It falls steadily for \(0 < x < c\) to a positive minimum value \(\Gamma{}(c) < 1\). For \(x > c\) the graph rises rapidly.
}
\section*{2\ \ \label{SU-2}Product Formulas}
\par{It will be recalled, as one may show using l'H\^{o}pital's Rule, that \[ e^{-t} \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}} \left(1-\frac{t}{n}\right)^{n} \ . \] From the original formula for \(\Gamma{}(x)\), using an interchange of limits that in a more careful exposition would receive further comment, one has \[ \Gamma{}(x) \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}} \Gamma{}(x,n) \ , \] where \(\Gamma{}(x,n)\) is defined by \[\Gamma{}(x,n)\ = \ \int_{0}^{n} \nolimits {t^{x-1}\left(1-\frac{t}{n}\right)^{n} dt}\ ,\ n\geq{} 1\ .\] The substitution in which \ \(t\) \ is replaced by \ \(nt\) \ leads to the formula \[ \Gamma{}(x,n) \ = \ n^{x} \int_{0}^{1} \nolimits {t^{x-1} (1 - t)^{n} dt} \ . \] This integral for \(\Gamma{}(x,n)\) is amenable to integration by parts. One finds thereby: \[ \Gamma{}(x,n)\ = \ \frac{1}{x}\left(\frac{n}{n-1}\right)^{x+1}\Gamma{}(x+1,n-1)\ ,n \geq{} 2 \ . \] For the smallest value of \(n\), \ \(n \, = \, 1\) \ , integration by parts yields: \[ \Gamma{}(x,1) \ = \ \frac{1}{x(x+1)} \ . \] Iterating \(n-1\) times, one obtains: \[ \Gamma{}(x,n) \ = \ n^{x} \frac{n!}{x(x+1)(x+2)\cdots{}(x+n)} \ , \ n \geq{} 1 \ . \] Thus, one arrives at the formula \[ \Gamma{}(x) \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}} n^{x} \frac{n!}{x(x+1)(x+2)\cdots{}(x+n)} \ . \]
}
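\par{The integration by parts behind the recursion, with \(u \, = \, (1-t)^{n}\) and \(v \, = \, t^{x}/x\), runs as follows: \[ \Gamma{}(x,n) \ = \ n^{x} \left[\frac{t^{x}}{x}(1-t)^{n}\right]_{t=0}^{t=1} + \ n^{x} \frac{n}{x} \int_{0}^{1} \nolimits {t^{x}(1-t)^{n-1} dt} \ = \ \frac{n^{x+1}}{x}\cdot{}\frac{\Gamma{}(x+1,n-1)}{(n-1)^{x+1}} \ , \] since \(\Gamma{}(x+1,n-1) \, = \, (n-1)^{x+1}\int_{0}^{1} \nolimits {t^{x}(1-t)^{n-1} dt}\); the bracketed term vanishes at both endpoints when \(x > 0\) and \(n \geq{} 2\).
}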
\par{This last formula is not exactly in the form of an infinite product \[ \prod_{k=1}^{\infty{}} {p_{k} } \ = \ \lim_{n \rightarrow{} \infty{}}{\prod_{k=1}^{n} {p_{k}}} \ . \] But a simple trick enables one to maneuver it into such an infinite product. One writes \ \(n+1\) \ as a ``collapsing product'': \[n+1\ = \ \frac{n+1}{n}\cdot{}\frac{n}{n-1}\cdot{} \cdots{} \cdot{}\frac{3}{2}\cdot{}\frac{2}{1} \] or \[ n+1 \ = \ \prod_{k=1}^{n} {\left(1 + \frac{1}{k}\right)} \ , \] and, taking the \(x\)th power, one has \[ (n+1)^{x} \ = \ \prod_{k=1}^{n} {\left(1 + \frac{1}{k}\right)^{x}} \ . \] Since \[ \lim_{n \rightarrow{} \infty{}}\frac{n^{x}}{(n+1)^{x}} \ = \ 1 \ , \] one may replace the factor \(n^{x}\) by \((n+1)^{x}\) in the last expression above for \(\Gamma{}(x)\) to obtain \[ \Gamma{}(x) \ = \ \frac{1}{x}\lim_{n \rightarrow{} \infty{}}{\prod_{k=1}^{n} {\frac{\left(1 + \frac{1}{k}\right)^{x}}{\left(1 + \frac{x}{k}\right)}}} \ , \] or \[\Gamma{}(x)\ = \ \frac{1}{x}{\prod_{k=1}^{\infty{}} {\frac{\left(1 + \frac{1}{k}\right)^{x}}{\left(1 + \frac{x}{k}\right)}}}\ .\]
}
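\par{The passage from the limit formula to the infinite product rests on rewriting \[ \frac{n!}{x(x+1)(x+2)\cdots{}(x+n)} \ = \ \frac{1}{x}\prod_{k=1}^{n} {\frac{k}{x+k}} \ = \ \frac{1}{x}\prod_{k=1}^{n} {\frac{1}{1 + \frac{x}{k}}} \ , \] so that, with \((n+1)^{x}\) written as the product of the factors \(\left(1+\frac{1}{k}\right)^{x}\), the general factor of the product for \(\Gamma{}(x)\) becomes \(\left(1+\frac{1}{k}\right)^{x} \big/ \left(1+\frac{x}{k}\right)\).
}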
\par{The convergence of this infinite product for \(\Gamma{}(x)\) when \(x > 0\) is a consequence, through the various maneuvers performed, of the convergence of the original improper integral defining \(\Gamma{}(x)\) for \(x > 0\).
}
\par{It is now possible to represent \(\log \Gamma{}(x)\) as the sum of an infinite series by taking the logarithm of the infinite product formula. But first it must be noted that \[ \frac{(1+t)^{r}}{1 + rt} > 0 \ \ \mbox{for} \ t > 0 \ , \ \ r > 0 \ . \] Hence, the logarithm of each term in the preceding infinite product is defined when \(x > 0\).
}
\par{Taking the logarithm of the infinite product one finds: \[ \log \Gamma{}(x) \ = \ - \log x + \sum_{k=1}^{\infty{}} {u_{k}(x)} \ , \] where \[ u_{k}(x) \ = \ x\log \left(1 + \frac{1}{k}\right)-\log \left(1 + \frac{x}{k}\right) \ . \] It is, in fact, almost true that this series converges absolutely for \emph{all} real values of \(x\). The only problem with non-positive values of \(x\) lies in the fact that \(\log (x)\) is meaningful only for \(x > 0\), \ and, therefore, \(\log (1+x/k)\) is meaningful only for \(k > \vert{}x\vert{}\). For fixed \(x\), if one excludes the finite set of terms \(u_{k}(x)\) for which \(k \leq{} \vert{}x\vert{}\), \ then the remaining ``tail'' of the series is meaningful and is absolutely convergent. To see this one applies the ``ratio comparison test'' which says that an infinite series converges absolutely if the limit of the ratio of the absolute value of its general term to the general term of a convergent positive series exists and is finite. For this one may take as the ``test series'' the series \[ \sum_{k=1}^{\infty{}} {\frac{1}{k^{2}}} \ . \] Now as \ \(k\) \ approaches \ \(\infty{}\), \ \(t \, = \, 1/k\) \ approaches \(0\), and so, by one application of l'H\^{o}pital's Rule followed by algebraic simplification, \begin{align*}\lim_{k \rightarrow{} \infty{}}\frac{u_{k}(x)}{1/k^{2}} & \ = \ \lim_{t \rightarrow{} 0}\frac{x\log (1+t)-\log (1+xt)}{t^{2}}\\ {}~ & \ = \ \lim_{t \rightarrow{} 0}\frac{\frac{x}{1+t}-\frac{x}{1+xt}}{2t}\\ {}~ & \ = \ \lim_{t \rightarrow{} 0}\frac{x[(1+xt)-(1+t)]}{2t(1+t)(1+xt)}\\ {}~ & \ = \ \frac{x(x-1)}{2} \ \,.\end{align*} Hence, the limit of \(\vert{}u_{k}(x)/k^{-2}\vert{}\) is \(\vert{}x(x-1)/2\vert{}\), and the series \(\sum {u_{k}(x)}\) is absolutely convergent for all real \(x\). The absolute convergence of this series foreshadows the possibility of defining \(\Gamma{}(x)\) for all real values of \(x\) other than non-positive integers. 
This may be done, for example, by using the functional equation \[ \Gamma{}(x+1) \ = \ x \Gamma{}(x) \] or \[ \Gamma{}(x) \ = \ \frac{1}{x} \Gamma{}(x + 1) \] to define \(\Gamma{}(x)\) for \(-1 < x < 0\) and from there to \(-2 < x < -1\), etc.
}
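\par{For example, taking \(x \, = \, -1/2\), one has \[ \Gamma{}(-1/2) \ = \ \frac{\Gamma{}(1/2)}{-1/2} \ = \ -2 \, \Gamma{}(1/2) \ . \] At \(x \, = \, 0\) the factor \(\frac{1}{x}\) has no meaning, and, hence, by iteration, \(\Gamma{}\) remains undefined at every non-positive integer.
}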
\par{Taking the derivative of the series for \(\log \Gamma{}(x)\) term-by-term -- once again a step that would receive justification in a more careful treatment -- and recalling the previous notation \(\psi{}(x)\) for the derivative of \(\log \Gamma{}(x)\), one obtains \begin{align*}\psi{}(x) + \frac{1}{x} & \ = \ \sum_{k=1}^{\infty{}} {\left\{\log \left(1 + \frac{1}{k}\right) -\frac{\frac{1}{k}}{\left(1 + \frac{x}{k}\right)}\right\}}\\ {}~ & \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}}\sum_{k=1}^{n} {\left\{\log \frac{k+1}{k}-\frac{1}{x+k}\right\}}\\ {}~ & \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}}\left\{\log (n+1)-\sum_{k=1}^{n} {\frac{1}{x+k}}\right\}\\ {}~ & \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}}\left\{\log (n+1)-\sum_{k=1}^{n} {\frac{1}{k}} +\sum_{k=1}^{n} {\left(\frac{1}{k}-\frac{1}{x+k}\right)}\right\}\\ {}~ & \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}}\left\{\log (n+1)-\sum_{k=1}^{n} {\frac{1}{k}} +x\sum_{k=1}^{n} {\frac{1}{k(x+k)}}\right\}\\ {}~ & \ = \ -\gamma{} + x\sum_{k=1}^{\infty{}} {\frac{1}{k(x+k)}} \ ,\end{align*} where \(\gamma{}\) denotes Euler's constant \[ \gamma{} \ = \ \underset{n \rightarrow{} \infty{}}{\mbox{lim}}\left(\sum_{k=1}^{n} {\frac{1}{k}} - \log n\right) \ . \]
}
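\par{The passage to the last line above uses the observation that \[ \log (n+1) - \sum_{k=1}^{n} {\frac{1}{k}} \ = \ \left(\log n - \sum_{k=1}^{n} {\frac{1}{k}}\right) + \log \frac{n+1}{n} \ , \] where the first parenthesized quantity approaches \(-\gamma{}\) and the second term approaches \(0\) as \(n \rightarrow{} \infty{}\).
}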
\par{When \(x \, = \, 1\) one has \[ \psi{}(1) \ = \ -1 - \gamma{} + \sum_{k=1}^{\infty{}} {\frac{1}{k(k+1)}} \ , \] and since \[ \frac{1}{k(k+1)} \ = \ \frac{1}{k} - \frac{1}{k+1} \ , \] this series collapses and, therefore, is easily seen to sum to \(1\). Hence, \[ \psi{}(1) \ = \ - \gamma{} \ , \ \ \psi{}(2) \ = \ \psi{}(1) + 1/1 \ = \ 1 - \gamma{} \ . \] Since \ \(\Gamma{}^{\prime{}}(x) \, = \, \psi{}(x)\Gamma{}(x)\), \ one finds: \[ \Gamma{}^{\prime{}}(1) \ = \ - \gamma{} \ , \] and \[ \Gamma{}^{\prime{}}(2) \ = \ 1 - \gamma{} \ . \]
}
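\par{The collapsing is visible in the partial sums: \[ \sum_{k=1}^{n} {\left(\frac{1}{k} - \frac{1}{k+1}\right)} \ = \ 1 - \frac{1}{n+1} \ , \] which approaches \(1\) as \(n \rightarrow{} \infty{}\).
}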
\medskip
\hspace*{\fill}\rule[1bp]{0.8\linewidth}{0.3bp}\hspace*{\fill}
\medskip
\par{These course notes were prepared while consulting standard references in the subject, which included those that follow.
}
\begin{thebibliography}{MM}
\label{SU-TheBibLiog}\bibitem[{1}]{courant}
R.~Courant, \textsl{Differential and Integral Calculus} (2 volumes), English translation by E.~J. McShane, Interscience Publishers, New York, 1961. \bibitem[{2}]{whittakerWatson}
E.~T. Whittaker \& G.~N. Watson, \textsl{A~Course of Modern Analysis}, 4th edition, Cambridge University Press, 1969. \bibitem[{3}]{widder}
David Widder, \textsl{Advanced Calculus}, 2nd edition, Prentice Hall, 1961. \end{thebibliography}
\end{document}