This happens because shader rendering is optimised where possible. In the simplest case the object is drawn directly to the screen with the shader applied, which is the fastest path. However, if other shaders or effects are involved - including setting the opacity - the shader has no knowledge of them. To include them in the visual result, it falls back to full-chain rendering via an intermediate surface (a transparent render texture the size of the screen). The rendering process then goes like this:
1. Render the object with its effect to the intermediate surface.
2. Render the object from the intermediate surface to the screen using the opacity.
This ensures the shader effect has the opacity applied. If there were a second shader instead of an opacity change, the second pass would render to the screen with that shader. If you have two shaders and opacity, the process goes:
1. Render the object with effect1 to surface1.
2. Render the object from surface1 with effect2 to surface2.
3. Render the object from surface2 to the screen with the opacity.
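The pass chains above can be sketched as a small planner. This is purely illustrative - the function and surface names are hypothetical, not an actual engine API - but it reproduces the fast path and the chained passes described above:

```python
def plan_passes(effects, opacity=1.0):
    """Return the list of draw passes as (source, target, applied) tuples.

    Each shader effect is one step; a non-default opacity counts as a
    final step. (Names here are illustrative, not a real engine API.)
    """
    steps = list(effects)
    if opacity != 1.0:
        steps.append(f"opacity={opacity}")
    if len(steps) <= 1:
        # Fast path: draw the object straight to the screen,
        # with its single shader applied if it has one.
        return [("object", "screen", steps[0] if steps else None)]
    passes, source = [], "object"
    for i, step in enumerate(steps[:-1], start=1):
        target = f"surface{i}"  # transparent, screen-sized render texture
        passes.append((source, target, step))
        source = target
    # The last remaining step (an effect or the opacity) renders to screen.
    passes.append((source, "screen", steps[-1]))
    return passes

# Two shaders plus opacity -> the three passes described above.
print(plan_passes(["effect1", "effect2"], 0.5))
```

Note that adding a step never changes the earlier passes - it just redirects the previous final pass into another intermediate surface.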
This does mean that foreground texture co-ordinates work differently at different stages of the effect pipeline. The workaround is to use the destStart and destEnd shader parameters, which give texture co-ordinates relative to the background draw area (always a screen-sized texture) rather than the foreground (which could be either an object's own texture or an intermediate screen-sized texture).
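As a rough model of that workaround, given a normalised 0-1 position across the object's draw area, destStart and destEnd can be interpolated to get stable background co-ordinates. This Python sketch only mirrors the shader parameter names - in an actual shader the equivalent would be a mix() between destStart and destEnd:

```python
def background_uv(tx, ty, dest_start, dest_end):
    """Map a 0-1 position across the draw area to background texture
    co-ordinates, by interpolating destStart -> destEnd per axis.

    (Illustrative sketch; dest_start/dest_end mirror the destStart and
    destEnd shader parameters.)
    """
    u = dest_start[0] + tx * (dest_end[0] - dest_start[0])
    v = dest_start[1] + ty * (dest_end[1] - dest_start[1])
    return (u, v)
```

Because these co-ordinates are always relative to the screen-sized background texture, the same calculation works whether the foreground input is the object's own texture or an intermediate surface.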