Solutions to the energy-independent (gray) radiative transfer equations are compared to results of Monte Carlo simulations of the ⁵⁶Ni and ⁵⁶Co radioactive-decay γ-ray energy deposition in supernovae. The comparison shows that an effective, purely absorptive, gray opacity, K_γ ≈ (0.06 ± 0.01) Y_e cm² g⁻¹, where Y_e is the total number of electrons per baryon, accurately describes the interaction of γ-rays with the cool supernova gas and the local γ-ray energy deposition within the gas. The nature of the γ-ray interaction process (dominated by Compton scattering in the relativistic regime) creates a weak dependence of K_γ on the optical thickness of the (spherically symmetric) supernova atmosphere: the maximum value of K_γ applies during optically thick conditions, when individual γ-rays undergo multiple scattering encounters, and the lower bound is reached at the phase characterized by a total Thomson optical depth to the center of the atmosphere τ_e ≲ 1. However, the constant asymptotic value, K_γ = 0.050 Y_e cm² g⁻¹, reproduces the thermal light curve due to γ-ray deposition for Type Ia supernova models to within 10% for the epoch from maximum light to t = 1200 days. Our results quantitatively confirm that the quick and efficient solution to the gray transfer problem provides an accurate representation of γ-ray energy deposition for a broad range of supernova conditions.
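The asymptotic opacity quoted above, K_γ = 0.050 Y_e cm² g⁻¹, is simple to evaluate. The following minimal sketch is not from the paper; the function name and the assumed electron fraction Y_e = 0.5 (typical of iron-group ejecta, where Z/A ≈ 1/2) are illustrative choices only:

```python
def gray_gamma_opacity(y_e: float) -> float:
    """Effective purely absorptive gray gamma-ray opacity, in cm^2 g^-1,
    using the asymptotic coefficient K_gamma = 0.050 * Y_e from the text."""
    return 0.050 * y_e


if __name__ == "__main__":
    y_e = 0.5  # assumed electron fraction for iron-group-dominated ejecta
    kappa = gray_gamma_opacity(y_e)
    print(f"K_gamma = {kappa:.3f} cm^2/g")  # 0.025 cm^2/g for Y_e = 0.5
```

The resulting value, ≈ 0.025 cm² g⁻¹ for Y_e = 0.5, scales linearly with Y_e because Compton scattering acts on electrons, so the opacity per gram tracks the electron number per baryon.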