It's a lot harder to switch off DC current (i.e. extinguish/break the arc) than AC current, because DC has no zero-voltage (or zero-current) crossing, whereas a 60 Hz sine wave crosses zero 120 times every second. That's why the DC rating of a switch is usually lower than its AC rating.
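Just to put a number on that "crossing" point, here is a trivial arithmetic sketch (assuming 60 Hz mains; the variable names are mine):

```python
# A sine wave passes through zero twice per cycle, so at 60 Hz the arc
# gets an extinguishing opportunity 2 * 60 = 120 times per second.
# A steady DC source never offers such a crossing.
line_frequency_hz = 60
zero_crossings_per_second = 2 * line_frequency_hz
print(zero_crossings_per_second)  # 120
```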
Interesting question as to whether a lamp rated 27 watts at 12 V draws less current when the voltage is 14 V. Why would the wattage stay constant? It's not like a mechanical load that holds steady.
Not sure about LEDs, but with incandescent lamps one would expect the resistance (R) of the filament to stay roughly constant. We know R = V/I, or I = V/R; and power is P = V x I, so substituting V/R for I gives P = V^2/R. If R is constant, the power goes up with the square of the voltage. So for a 27 W lamp at 12 V: 27 W = 12^2/R; solving for R gives R = 12^2/27 W = 5.33 ohms.
At 14 volts the power is 14^2 / 5.33 ohms = 36.75 watts, which at 14 volts is 36.75 W / 14 V = 2.62 amps.
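For anyone who wants to plug in other ratings, here is a small sketch of that same constant-resistance estimate (the lamp values are the ones above; treating the filament as a fixed resistor is the assumption being tested):

```python
rated_power_w = 27.0     # lamp rating
rated_voltage_v = 12.0   # voltage the rating applies at
actual_voltage_v = 14.0  # charging-system voltage

# Resistance implied by the rating: P = V^2 / R  ->  R = V^2 / P
r_ohms = rated_voltage_v ** 2 / rated_power_w           # ~5.33 ohms

# Power and current at the higher voltage, if R stays fixed
power_at_14_w = actual_voltage_v ** 2 / r_ohms           # ~36.75 W
current_at_14_a = power_at_14_w / actual_voltage_v       # ~2.62 A

print(r_ohms, power_at_14_w, current_at_14_a)
# -> 5.333... ohms, 36.75 W, 2.625 A
```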
So the question is: are LEDs constant power, or do they get brighter (draw more power/watts) when the voltage increases, the way an incandescent does?