# Path length difference and Diffraction

1. The problem statement, all variables and given/known data
A double slit experiment is set up using a helium-neon laser (wavelength 633 nm). Suppose we add a small piece of glass (n = 1.50) over one of the slits. Then, the central point on the screen is occupied by what had been the m = 10 dark fringe. Determine the thickness t of the glass.

2. Relevant equations

3. The attempt at a solution
I’m trying to understand the solution to this problem. The solution uses the attached diagram as a reference. What I don’t understand is why adding the glass would shift the rays like that. Is there a known explanation for it? How do we know that the interference that produces the central fringe is shifted?

Then, the solution finds the number of wavelengths that fit in the glass and in an equal thickness of air (the "no glass" case). The equations are m1 = t/λ and m2 = (nt)/λ, since the wavelength inside the glass shrinks to λ/n. It then states that the path length has increased by Δm wavelengths. Why is this the case? I thought the path length difference was the extra distance that one of the rays had travelled with respect to the other. Is there a way to say in this case that the path length difference is Δm wavelengths?
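Just to check that I'm following the arithmetic, here is a quick sketch of the numbers using the solution's equations. I'm assuming the dark-fringe convention where the m-th dark fringe corresponds to an extra path of (m − 1/2) wavelengths, so Δm = m2 − m1 = (n − 1)t/λ = m − 1/2; if your textbook uses a different convention the value of Δm (and hence t) would change.

```python
# Sketch of the thickness calculation, assuming the extra optical
# path introduced by the glass is Delta_m = m - 1/2 wavelengths
# (one common dark-fringe convention; check your textbook's).

lam = 633e-9   # He-Ne laser wavelength, in meters
n = 1.50       # refractive index of the glass
m = 10         # order of the dark fringe given in the problem

# Wavelengths in a thickness t of air vs. glass:
#   m1 = t / lam,  m2 = n * t / lam,  so  Delta_m = (n - 1) * t / lam
delta_m = m - 0.5                # assumed extra path, in wavelengths
t = delta_m * lam / (n - 1)      # solve (n - 1) * t = Delta_m * lam for t
print(f"t = {t * 1e6:.2f} micrometers")
```

With these assumptions the script gives a thickness of about 12 μm, which matches the order of magnitude I'd expect for a thin piece of glass.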

I am sorry if I am not being clear, but the topic is very confusing for me right now (especially understanding how the glass shifts the fringe pattern and what the path length difference means here). Thanks for your patience.