I like your approach, Kevin, and yes, on the low-voltage DC side the regulator circuits should be OK even with a 15...20% higher input voltage.
However, the main risk here is on the transformer side. Not sure about Sony specifically, but in the commercial goods world it is common to squeeze as much performance as possible out of as little material as possible.
When it comes to an AC power transformer, that means a lot of iron and copper to be saved. And when you consider how a transformer works, a 10...20% increase in the AC voltage can result in a much larger increase in the energy "pumped" into the transformer, and that "extra" converts into different kinds of losses (iron core loss, copper wire loss, etc.) and, ultimately, heat.
Normally, such a transformer is designed to work near the limit of magnetic saturation of the core, to keep the size (= material cost) small. If you go over that limit, that's when problems start. The same goes for the copper windings: the wire diameter is chosen to carry the maximum current that particular device is expected to draw. But if you increase the voltage, with the same copper resistance, the current will increase, and so will the heat generated inside the copper. On top of all that, this is a non-linear relationship :-( . I.e., 10% more current (or voltage) means roughly 21% more dissipated power, and 20% more current means roughly 44% more.
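To put rough numbers on that quadratic relationship, here's a tiny back-of-the-envelope sketch. It assumes the load looks approximately resistive, so the current scales in proportion to the applied voltage; a real rectifier-plus-capacitor load is messier, but the trend is the same:

```python
# Copper loss goes as P = I^2 * R. If current scales roughly with the
# applied voltage, the loss scales with the square of the overvoltage
# ratio. (Back-of-the-envelope only; assumes a near-resistive load.)
for overvoltage_pct in (10, 15, 20, 25):
    ratio = 1 + overvoltage_pct / 100          # e.g. 1.10 for +10%
    extra_loss_pct = (ratio ** 2 - 1) * 100    # I^2 * R scaling
    print(f"+{overvoltage_pct}% voltage -> ~+{extra_loss_pct:.0f}% copper loss")
```

That prints about +21% loss for +10% voltage and about +44% for +20%, which is where the "non-linear" warning comes from.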
And we haven't even talked about the electrical and magnetic (also audible) noise of an over-saturated transformer, or about voltage swings on the AC grid that can take the input voltage well above the nominal value. Example: it is not uncommon on 230V grids for the voltage to occasionally go up to 238...242V. I believe it is similar on the 100/110/120V grids too. Imagine if a device meant for a 100V line gets, say, 125...126V AC on a 120V grid.
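Just to make that last case concrete, a quick sketch of how far above its design voltage a 100V-rated device would be running (the grid figures are the illustrative ones from above, not any spec):

```python
# How far over its design voltage a 100 V-rated device runs on a 120 V
# grid, both at nominal voltage and during a typical upward swing.
design_v = 100.0
for grid_v in (120.0, 126.0):          # nominal, and a high-line swing
    over_pct = (grid_v / design_v - 1) * 100
    print(f"{grid_v:.0f} V in -> {over_pct:.0f}% above the 100 V design point")
```

That's 20% over at nominal and roughly 26% over during a swing, which is well past the 10...20% margin discussed above.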