One of the prime applications of squeezed light is enhancing the sensitivity of an interferometer below the quantum shot-noise limit, but to date no such experimental demonstration using the optical Kerr effect has been reported. In prior setups involving Kerr-squeezed light, the interferometer served merely to characterize the noise pattern. The lack of such a demonstration was largely due to the cumbersome task of tilting the squeezing ellipse in phase space. Here, we present the first experimental observation of phase-sensitivity enhancement in an interferometer using Kerr squeezing.