Abstract
Rutile (TiO2) single-crystal plates have been reduced in hydrogen at about 700°C for several minutes to make them semiconducting. The concentration of oxygen vacancies was controlled by varying the reduction time and temperature. The infrared absorption of a series of plane-parallel plates with electrical resistivities ranging from 3 to 0.01 ohm-m has been examined. It is postulated that the electrical conductivity arises from the ionization of either one or two trapped electrons from each oxygen vacancy.
In samples with electrical resistivity (⊥ to the c axis) greater than 0.04 ohm-m, the optical absorption at room temperature peaks at about 0.75 ev. For samples with electrical resistivity less than 0.03 ohm-m, the optical absorption shows a new maximum at 1.18 ev. The decrease of thermal activation energy with increasing oxygen vacancy concentration is invoked to explain the shift of the optical transition from 0.75 to 1.18 ev. The ionization energies agree reasonably well with those calculated for a helium-atom model of a doubly ionizable donor immersed in a dielectric medium [], namely 0.73 ev and 1.64 ev. A modification of this theory is also indicated which predicts the second ionization energy as 1.41 ev, in better agreement with the experimental value of 1.18 ev.
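The internal consistency of the helium-atom scaling can be checked with a short arithmetic sketch (Python used only for illustration). In such a model both ionization energies of free helium are reduced by a single factor combining the carrier effective mass and an effective dielectric constant, here written (m*/m)/eps². The scale factor below is inferred from the abstract's calculated 1.64 ev second ionization energy; it is an assumption, not a value stated in the source:

```python
# Sketch: helium-atom model of a doubly ionizable donor in a dielectric.
# Both helium ionization energies are multiplied by one scale factor,
# (m*/m) / eps**2 (assumed form; the numeric value is inferred, not sourced).

E1_HE = 24.587  # eV, first ionization energy of free helium
E2_HE = 54.418  # eV, second ionization energy (He+, Z^2 * 13.606 eV)

# Assumed: fix the scale so the second ionization energy reproduces
# the paper's calculated 1.64 ev.
scale = 1.64 / E2_HE

E2 = E2_HE * scale  # 1.64 ev by construction
E1 = E1_HE * scale  # comes out near 0.74 ev

print(f"scale (m*/m)/eps^2 = {scale:.4f}")
print(f"first ionization  = {E1:.2f} ev (observed absorption peak: 0.75 ev)")
print(f"second ionization = {E2:.2f} ev (measured 1.18 ev; modified theory 1.41 ev)")
```

The point of the sketch is that a single screening factor maps both helium levels close to the quoted 0.73 and 1.64 ev pair, which is why the discrepancy at the second level (1.18 ev measured) motivates the modified theory mentioned above.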
- Received 7 May 1958
DOI: https://doi.org/10.1103/PhysRev.113.1222
©1959 American Physical Society