Indexed on: 02 Oct '96 · Published on: 02 Oct '96 · Published in: High Energy Physics - Phenomenology
Gauge theories broken by a single Higgs field are known to have first-order phase transitions in temperature if $\lambda/g^2 \ll 1$, where $g$ is the gauge coupling and $\lambda$ the Higgs self-coupling. If the theory is extended from one to $N$ Higgs doublets, with U($N$) flavor symmetry, the transition is known to be second order for $\lambda/g^2 \gtrsim 1$ in the $N\to\infty$ limit. We show that one can in principle compute the tricritical value of $\lambda/g^2$, separating first- from second-order transitions, to any order in $1/N$. In particular, scalar fluctuations at the transition damp away the usual problems with the infrared behavior of high-temperature non-Abelian gauge theories. We explicitly compute the tricritical value of $\lambda/g^2$ for U(1) and SU(2) gauge theory to next-to-leading order in $1/N$.