I have returned to my 150W inverter simulation and have a new problem I can't decipher. This is the design:
It runs, it appears to have good efficiency, and it produces a stable 180V output voltage. I've included a leakage inductance of 198nH on each primary winding to simulate energy stored in the transformer windings. I'm getting ringing on the drains, which I expect, and I've added an RC snubber to reduce it. However, at MOSFET turn-off there is an enormous drain voltage spike up to 100V. The Vds breakdown of the MOSFET models is 100V, so I suspect the real spike is even higher and the models are simply clamping it there.
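For reference, a stripped-down single-switch jig along these lines shows the mechanism I think I'm seeing. This is only a sketch, not my actual schematic; every value except the 198nH leakage is a placeholder. At turn-off the current in the leakage inductance has no path except the snubber, so the drain flies up:

    * Single-leg leakage-spike jig (placeholder values except the 198nH leakage)
    Vin   vin  0    12
    Vg    g    0    PULSE(0 10 0 50n 50n 4u 10u)
    * coupled primary/secondary with a resistive load on the secondary
    Lp    vin  sw   20u
    Ls    sec  0    2m
    Kps   Lp   Ls   0.999
    Rload sec  0    500
    * explicit leakage in series with the drain
    Llk   sw   d    198n
    * idealised level-1 switch standing in for the real MOSFET model
    M1    d    g    0   0   SWFET
    .model SWFET NMOS(Vto=3 Kp=20)
    * RC snubber across drain-source
    Rsn   d    dsn  10
    Csn   dsn  0    1n
    .tran 10n 100u
    .end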
I have experimented with diodes across drain-source and freewheeling diodes from drain to Vin, but the former does nothing and the latter severely distorts the switching waveform. When I probe the transformer primary waveform after the leakage inductance, it does not show these spikes:
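Added to a jig like the one above, those two clamp arrangements would look roughly like this (DCLAMP is a placeholder diode model, and the orientations shown are the usual ones; they are added one at a time):

    * (a) diode in parallel with drain-source, same orientation as the
    *     body diode, so it never conducts on a positive drain spike
    Dds  0   d    DCLAMP
    * (b) freewheeling diode from drain to Vin, which clamps the drain
    *     near Vin + Vf and so also clips the normal drain swing
    Dfw  d   vin  DCLAMP
    .model DCLAMP D(Is=1e-14 Rs=10m)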
I'm not sure how else to dissipate this stored leakage energy without interfering with normal operation. Since it happens at turn-off, I wonder if there is a better drive scheme I could be using? The whole point of simulating was to identify issues that could destroy components, but I'm not sure how to reduce this spike. Any help is appreciated.
Those 1u resistors can't be realistic. And why waste a full bridge when you could have had only two diodes? Also, do yourself, and others, a favour and use the existing symbols for diode and MOSFET; it'll make for a much clearer schematic. As you probably know, if they're subcircuits instead of models, you need to change the instance prefix to X; otherwise it works just like a .model.
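For example, with placeholder names and values (LTspice/ngspice-style syntax):

    * prefix illustration (placeholder names and values)
    Vd  d  0  10
    Vg  g  0  5
    * defined by a .model card: the instance prefix matches the device letter
    M1  d  g  0  0  NFET_MDL
    .model NFET_MDL NMOS(Vto=2 Kp=1)
    * delivered as a .subckt: the instance line must use the X prefix
    X1  d  g  0  NFET_SUB
    .subckt NFET_SUB dd gg ss
    M2  dd gg ss ss NFET_MDL
    .ends
    .op
    .end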