
Algebraic Companions and Linearizations

Eunice Y. S. Chan, The University of Western Ontario

Abstract

In this thesis, we look at a novel way of finding roots of a scalar polynomial using eigenvalue techniques, and we extend this method to the polynomial eigenvalue problem (PEP). PEPs arise in many science and engineering applications, such as vibration of structures, computer-aided geometric design, robotics, and machine learning. This thesis explains the idea in the order in which we discovered it.
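As background for this approach, the following minimal sketch (in Python, with a hypothetical example polynomial; this illustrates the classical idea, not the thesis's new construction) shows how the roots of a monic scalar polynomial appear as the eigenvalues of its Frobenius companion matrix.

\begin{verbatim}
import numpy as np

# p(lambda) = lambda^3 - 6 lambda^2 + 11 lambda - 6 = (lambda-1)(lambda-2)(lambda-3)
coeffs = [-6.0, 11.0, -6.0]              # a_0, a_1, a_2 of the monic polynomial

n = len(coeffs)
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)               # ones on the subdiagonal
C[:, -1] = [-c for c in coeffs]          # last column: -a_0, -a_1, -a_2
print(np.linalg.eigvals(C))              # approximately 1, 2, 3
\end{verbatim}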

In Chapter 2, a new kind of companion matrix is introduced for scalar polynomials of the form $c(\lambda) = \lambda a(\lambda)b(\lambda)+c_0$, where upper Hessenberg companion matrices are known for the polynomials $a(\lambda)$ and $b(\lambda)$. This construction can generate companion matrices with smaller entries than the Fiedler or Frobenius forms, and it generalizes Piers Lawrence's companion matrix for the Mandelbrot polynomials. The construction was motivated by the Narayana-Mandelbrot polynomials.
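For contrast, a baseline approach (sketched below with hypothetical data) expands $c(\lambda)$ in the monomial basis and uses the Frobenius companion of the expanded coefficients; the construction of this chapter instead works directly with the companion matrices of $a(\lambda)$ and $b(\lambda)$, without expanding the product.

\begin{verbatim}
import numpy as np

# Hypothetical example: a(lambda) = lambda^2 + 1, b(lambda) = lambda + 2, c_0 = 3
a = np.array([1.0, 0.0, 1.0])            # coefficients, highest degree first
b = np.array([1.0, 2.0])
c0 = 3.0

c = np.convolve(np.convolve(a, b), [1.0, 0.0])   # lambda * a(lambda) * b(lambda)
c[-1] += c0                                      # ... + c_0
print(np.roots(c))   # roots via the Frobenius companion of the expanded c(lambda)
\end{verbatim}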

In Chapter 3, we define the Euclid polynomials $E_{k+1}(\lambda) = E_{k}(\lambda)\left(E_{k}(\lambda) - 1\right) + 1$ with $E_{1}(\lambda) = \lambda + 1$, in analogy to the Euclid numbers $e_k = E_{k}(1)$. We show how to construct companion matrices $\mat{E}_{k}$ with $E_{k}(\lambda) = \det(\lambda I - \mat{E}_{k})$ that are of height 1 (and thus of minimal height over all integer companion matrices for $E_{k}(\lambda)$). We prove various properties of these objects and give experimental confirmation of some unproved properties.
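The recurrence is straightforward to reproduce; the following short sketch (Python with SymPy, assuming only the definitions above) expands $E_k(\lambda)$ and checks that $E_k(1)$ recovers the Euclid numbers $2, 3, 7, 43, 1807, \ldots$

\begin{verbatim}
import sympy as sp

lam = sp.symbols('lambda')

def euclid_polynomial(k):
    # E_1(lambda) = lambda + 1,  E_{k+1}(lambda) = E_k(lambda)(E_k(lambda) - 1) + 1
    E = lam + 1
    for _ in range(k - 1):
        E = sp.expand(E * (E - 1) + 1)
    return E

for k in range(1, 6):
    print(k, euclid_polynomial(k).subs(lam, 1))   # Euclid numbers 2, 3, 7, 43, 1807
\end{verbatim}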

In Chapter 4, we show how to construct linearizations of the matrix polynomials $z\mat{a}(z)\mat{d}_0 + \mat{c}_0$, $\mat{a}(z)\mat{b}(z)$, $\mat{a}(z) + \mat{b}(z)$ (when $\deg(\mat{b}(z)) < \deg(\mat{a}(z))$), and $z\mat{a}(z)\mat{d}_0\mat{b}(z) + \mat{c}_0$ from linearizations of the component matrix polynomials $\mat{a}(z)$ and $\mat{b}(z)$. This extends the companion matrix construction introduced in Chapter 2 to matrix polynomials.
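For readers unfamiliar with linearizations, the classical first companion (monomial-basis) linearization illustrates the underlying idea. The sketch below (Python, with random hypothetical data; not the new construction of this chapter) checks that the pencil $z\mat{X} + \mat{Y}$ has the same determinant, and hence the same eigenvalues, as a quadratic matrix polynomial $\mat{P}(z)$.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n = 3
A2, A1, A0 = (rng.standard_normal((n, n)) for _ in range(3))

# First companion linearization L(z) = z*X + Y of P(z) = A2 z^2 + A1 z + A0
Z = np.zeros((n, n))
X = np.block([[A2, Z], [Z, np.eye(n)]])
Y = np.block([[A1, A0], [-np.eye(n), Z]])

z = 0.7 + 0.3j
print(np.linalg.det(z * X + Y))                # these two determinants agree,
print(np.linalg.det(A2 * z**2 + A1 * z + A0))  # so L(z) shares the eigenvalues of P(z)
\end{verbatim}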

In Chapter 5, we define ``generalized standard triples'', which can be used to construct algebraic linearizations; for example, for $\mat{h}(z) = z \mat{a}(z)\mat{b}(z) + \mat{c}_0$ from linearizations of $\mat{a}(z)$ and $\mat{b}(z)$. For convenience, we tabulate generalized standard triples for orthogonal polynomial bases, the monomial basis, and Newton interpolational bases; for the Bernstein basis; for Lagrange interpolational bases; and for Hermite interpolational bases. We account for the possibility of common similarity transformations, and we give explicit proofs for the less familiar bases.
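As a concrete instance of what a standard triple provides, the classical monomial-basis case (under one common companion convention; a hypothetical numerical check, not a tabulated result from this chapter) satisfies the resolvent identity $\mat{P}(z)^{-1} = \mat{X}\,(z\mat{I} - \mat{C})^{-1}\,\mat{Y}$:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n = 3
A1, A0 = rng.standard_normal((n, n)), rng.standard_normal((n, n))

# Monic quadratic P(z) = I z^2 + A1 z + A0 and its block Frobenius companion C
C = np.block([[-A1, -A0], [np.eye(n), np.zeros((n, n))]])
X = np.hstack([np.zeros((n, n)), np.eye(n)])   # selects the second block row
Y = np.vstack([np.eye(n), np.zeros((n, n))])   # selects the first block column

z = 1.2 - 0.5j
lhs = X @ np.linalg.inv(z * np.eye(2 * n) - C) @ Y
rhs = np.linalg.inv(np.eye(n) * z**2 + A1 * z + A0)
print(np.allclose(lhs, rhs))                   # True: resolvent reproduces P(z)^{-1}
\end{verbatim}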

In Chapter 6, we investigate the numerical stability of algebraic linearization, which re-uses linearizations of the matrix polynomials $\mat{a}(\lambda)$ and $\mat{b}(\lambda)$ to make a linearization of the matrix polynomial $\mat{P}(\lambda) = \lambda \mat{a}(\lambda)\mat{b}(\lambda) + \mat{c}_0$. Such re-use \textsl{seems} more likely to produce a well-conditioned linearization, and thus the implied algorithm for finding the eigenvalues of $\mat{P}(\lambda)$ seems likely to be more numerically stable than one that expands out the product $\mat{a}(\lambda)\mat{b}(\lambda)$ (in whatever polynomial basis is being used). We investigate this question experimentally using pseudospectra.
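Pseudospectra can be estimated on a grid by computing the smallest singular value of the evaluated matrix at each grid point; the following generic sketch (Python, with a hypothetical small matrix; not the specific experiments of this chapter) outlines that computation for $z\mat{I} - \mat{A}$, and the same idea applies to an evaluated pencil or matrix polynomial.

\begin{verbatim}
import numpy as np

def pseudospectrum_grid(A, re, im):
    # sigma_min(z*I - A) on a grid; level sets of small values outline the
    # eps-pseudospectra.  For a pencil z*D - B, replace z*I - A by z*D - B.
    n = A.shape[0]
    S = np.empty((len(im), len(re)))
    for i, y in enumerate(im):
        for j, x in enumerate(re):
            z = x + 1j * y
            S[i, j] = np.linalg.svd(z * np.eye(n) - A, compute_uv=False)[-1]
    return S

A = np.array([[0.0, 1.0], [-2.0, -3.0]])       # hypothetical small example
grid = pseudospectrum_grid(A, np.linspace(-4, 1, 50), np.linspace(-2, 2, 50))
print(grid.min())
\end{verbatim}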