The size of a transformer decreases as we increase frequency, because the magnetic flux in the core is proportional to the applied voltage and inversely proportional to the frequency.
This means that at higher frequencies, the same voltage requires less flux per cycle, so a smaller core cross-section can carry that flux without saturating. This follows from Faraday’s law of induction, which relates the voltage to the number of turns and the rate of change of flux in a winding¹.
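This inverse relationship can be sketched with the sinusoidal form of Faraday’s law, the transformer EMF equation V_rms ≈ 4.44 · f · N · B_max · A_core. The numeric values below (a 230 V winding, 100 turns, 1.2 T peak flux density) are illustrative assumptions, not figures from the text:

```python
# Required core cross-section from the transformer EMF equation:
#   V_rms = 4.44 * f * N * B_max * A_core
#   =>  A_core = V_rms / (4.44 * f * N * B_max)
# All numeric values below are illustrative assumptions.

def core_area_m2(v_rms: float, freq_hz: float, turns: int, b_max_t: float) -> float:
    """Core cross-sectional area (m^2) needed to support v_rms at freq_hz
    without exceeding the peak flux density b_max_t."""
    return v_rms / (4.44 * freq_hz * turns * b_max_t)

for f in (50, 400):
    a = core_area_m2(230, f, 100, 1.2)
    print(f"{f:>3} Hz -> core area ~ {a * 1e4:.1f} cm^2")
```

With these assumed values, raising the frequency from 50 Hz to 400 Hz cuts the required core area by a factor of eight, which is one reason aircraft transformers (400 Hz supplies) are much smaller than mains-frequency ones.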
One way to understand this intuitively is to imagine that the transformer core is like a sponge that can absorb and release magnetic energy. At lower frequencies, the sponge has more time to fill up and empty out with each cycle of the alternating current.
This means that the sponge needs to be bigger to store more energy. At higher frequencies, the sponge has less time to fill up and empty out, so it can be smaller and still transfer the same amount of energy².

Another way to look at it is that transformer size is related to the window area–core area product, a figure of merit for how much power a given core geometry can handle.
The window area is the space available for the windings, and the core area is the cross-sectional area of the core. The window area-core area product depends on the frequency, the voltage, the current, and the power factor of the load.
As we increase frequency, we can reduce the core area for a given voltage and current, which means we can also reduce the window area and use less copper wire. This makes the transformer smaller and more efficient³.
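That scaling can be sketched with a common form of the area-product sizing estimate, Ap = Aw · Ac ≈ P / (4.44 · Ku · J · B_max · f). The fill factor, current density, flux density, and power rating used here are illustrative assumptions, not values from the text:

```python
# Area product Ap = Aw * Ac (window area x core area, in m^4).
# A common sizing estimate is:
#   Ap ~ P / (4.44 * Ku * J * B_max * f)
# where Ku = window fill factor, J = winding current density,
# and B_max = peak flux density. All numeric values are assumptions.

def area_product_m4(p_w: float, freq_hz: float,
                    k_u: float = 0.4,
                    j_a_per_m2: float = 3e6,
                    b_max_t: float = 1.2) -> float:
    """Estimated window area-core area product (m^4) for a p_w-watt transformer."""
    return p_w / (4.44 * k_u * j_a_per_m2 * b_max_t * freq_hz)

for f in (50, 400):
    ap = area_product_m4(1000, f)
    print(f"{f:>3} Hz -> area product ~ {ap * 1e8:.0f} cm^4")
```

Because frequency appears in the denominator, the required area product (and so the overall size) shrinks in direct proportion as frequency rises, all else held equal.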