Heat transfer rate, as I learned in Navy nuclear power school, is Qdot = U·A·ΔT. I imagine it could apply to the troposphere-to-stratosphere interface to some extent. U is the overall heat transfer coefficient (heat transfer rate per unit area per degree of temperature difference), and this wouldn't change. A is area, in square kilometers, which would change slightly. ΔT is the temperature difference. As the troposphere warms, the increased ΔT would allow increased heat leakage to the sky. But in order to support that increased heat transfer rate, the temperature must settle at a new steady state slightly higher than before, to maintain the increased heat loss. Sorta like an object with an internal heat source: if it produces more heat, the surface temperature must rise in order to blow off the increased heat flow.
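The steady-state idea above can be sketched with a few lines of Python. This is just an illustration of the Qdot = U·A·ΔT relation with made-up numbers, not real atmospheric parameters:

```python
# At steady state, heat generated equals heat rejected: Qdot = U * A * dT,
# so the temperature difference must be dT = Qdot / (U * A).
# U and A below are arbitrary illustrative values (assumptions).

U = 5.0  # overall heat transfer coefficient, W/(m^2 * K) (assumed)
A = 2.0  # surface area, m^2 (assumed)

def steady_state_delta_t(q_dot, u=U, a=A):
    """Temperature difference needed to reject q_dot watts at steady state."""
    return q_dot / (u * a)

# If the internal source goes from 100 W to 120 W, the object must run
# hotter relative to its surroundings to shed the extra heat:
dt_before = steady_state_delta_t(100.0)  # 10.0 K
dt_after = steady_state_delta_t(120.0)   # 12.0 K
```

So a 20% increase in heat production forces a 20% larger steady-state temperature difference, which is the "new steady state slightly higher" point above.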
In reference to how they know where the interface is and how they measure it: no idea. I knew the troposphere-to-stratosphere transition is fairly narrow, though I don't know how or why (logic suggests it would be a smooth transition from ground to space), but I didn't know it was so well defined that it could be measured in meters or feet.
OK, I just skimmed the troposphere article on Wikipedia. Its height can vary from 6 km to 18 km, so it seems weird that they can measure the increase in meters or feet.