There is no difference in how well you can see under 5500K versus 3000K light.
There's a significant difference when the light sources in question are discontinuous emitters like LEDs, HIDs, or fluorescents. These light sources all emit (though it's especially bad with LED and HID) excessive amounts of far-blue light that the human eye has difficulty (really, it's impossible) bringing into focus at the same time as the other wavelengths; because of the eye's longitudinal chromatic aberration, short blue wavelengths focus in front of the retina when the rest of the spectrum is in focus on it. This results in a net loss of visual acuity. Additionally, the discontinuous spectra make color differentiation more difficult and, in some cases, impossible, which can have a significant negative effect on your ability to visually determine what something is.
The blue crap you are referencing is higher in the spectrum (around 6500K). There is, however, some correlation in that blue is a harder color to see, so subjectively you feel it is not as strong of a light.
First of all, you can't say 6500K is "higher in the spectrum." Color temperature and the position of a color within the spectrum have nothing to do with each other. Color temperature is an artificial scale correlating the overall emission spectrum of a radiator to the emission spectrum of an ideal blackbody radiator. So a continuous emitter of any color temperature contains all "colors" of light; it's the relative intensities of those wavelengths that determine the color temperature.
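To make that concrete, here's a minimal sketch using Planck's law (the standard blackbody formula) to compare the blue-to-red balance of ideal 2800K and 5500K radiators. The two sample wavelengths and temperatures are just illustrative picks:

```python
import math

# Physical constants for Planck's law
H = 6.626e-34   # Planck constant (J*s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T) in W * sr^-1 * m^-3."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

# A *continuous* emitter of any color temperature contains both
# wavelengths; only the ratio between them changes.
for temp in (2800, 5500):
    ratio = planck(450e-9, temp) / planck(650e-9, temp)  # blue vs. red
    print(f"{temp} K blackbody: blue(450nm)/red(650nm) = {ratio:.2f}")
```

Run it and the 2800K radiator still emits blue, just roughly a fifth as much blue as red, while at 5500K the two are nearly equal. Same spectrum shape family, different ratios.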
Second, because the color temperature scale looks at the total emission spectrum of an emitter, even yellower color temperatures can contain significant amounts of blue light. In fact, the nature of discontinuous emitters like LED and HID leads to a HUGE amount of blue light in a fairly narrow wavelength range "balanced" by a wider, lower-amplitude hump of warmer colors. It's this strong blue spike, even in a yellower emitter rated at 5000K or below, that causes vision problems.
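A toy model shows the shape of the problem. The spectrum below is a hypothetical phosphor-white LED, approximated as a narrow Gaussian pump at 450 nm plus a broad phosphor hump centered at 570 nm (the positions, widths, and amplitudes are made-up illustrative values, not any real part's datasheet), compared against a 5000K blackbody:

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

def planck(wl_nm, temp_k):
    """Relative blackbody radiance at a wavelength given in nanometers."""
    wl = wl_nm * 1e-9
    return (2 * H * C**2 / wl**5) / (math.exp(H * C / (wl * KB * temp_k)) - 1)

def gaussian(wl_nm, center, fwhm, amp):
    sigma = fwhm / 2.355  # convert FWHM to standard deviation
    return amp * math.exp(-((wl_nm - center) ** 2) / (2 * sigma**2))

def toy_led(wl_nm):
    """Hypothetical phosphor-white LED: narrow 450 nm pump + broad 570 nm hump."""
    return gaussian(wl_nm, 450, 20, 1.0) + gaussian(wl_nm, 570, 120, 0.55)

def blue_fraction(spectrum, band=(435, 465), visible=(380, 730)):
    """Share of visible-band power that lands inside the narrow blue band."""
    total = sum(spectrum(wl) for wl in range(*visible))
    return sum(spectrum(wl) for wl in range(*band)) / total

print(f"toy LED:          {blue_fraction(toy_led):.0%} of visible power in 435-465 nm")
print(f"5000 K blackbody: {blue_fraction(lambda wl: planck(wl, 5000)):.0%}")
```

With these made-up numbers, the toy LED puts roughly a quarter of its visible power into that 30 nm blue band, versus under a tenth for the blackbody. Real parts differ, but that spike-plus-hump structure is exactly how a "warm"-rated LED can still be blue-heavy.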
Even a 5000K, >95 CRI light source contains enough blue light relative to the other colors to potentially cause problems (in large part because we never do reach direct-sunlight intensity with our automotive/agricultural lighting). This is made MUCH worse when you finally beat spec sheets out of people and find out the 5000K-6500K LEDs they're using for these light pods and bars are in the ~70 CRI range. It's abysmal.
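You can't compute a real CRI without the CIE test-color procedure and a measured SPD, but extending the toy model above gives a directional feel for it. Thinning the phosphor hump (used here as a crude, hypothetical stand-in for a cheaper, lower-CRI, cooler-rated emitter) concentrates even more of the output into the blue spike:

```python
import math

def gaussian(wl, center, fwhm, amp):
    sigma = fwhm / 2.355  # convert FWHM to standard deviation
    return amp * math.exp(-((wl - center) ** 2) / (2 * sigma**2))

def blue_share(phosphor_amp, band=(435, 465), visible=(380, 730)):
    """Blue-band power share for a toy LED with a given phosphor amplitude."""
    spectrum = lambda wl: (gaussian(wl, 450, 20, 1.0)              # blue pump
                           + gaussian(wl, 570, 120, phosphor_amp))  # phosphor hump
    total = sum(spectrum(wl) for wl in range(*visible))
    return sum(spectrum(wl) for wl in range(*band)) / total

# Shrinking the phosphor hump (a rough proxy for cheaper, bluer parts)
# pushes more of the total output into the narrow blue spike.
for amp in (0.70, 0.55, 0.35):
    print(f"phosphor amplitude {amp}: {blue_share(amp):.0%} in the blue band")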
Anyway, white light is a relative term. We (the US population) have been programmed to accept the warmth (2800K) of an incandescent bulb as the standard way things should look under artificial illumination. This is all psychological, a programming you have had since you were a child. We accept 5500K as the proper look for daylight. When faced with a 5500K bulb at night, when we are "programmed" for a 2800K bulb, we find it blue and "offensive." That is not the case in an analytical world. In daylight, we would find the 5500K bulb an augmentation and not offensive.
It's not simply a conditioned response. In the absence of a reference light source, the color temperature people consider neutral drops as illuminance drops. In other words, the dimmer the light gets (starting from bright sunlight), the yellower (lower color temperature) the light has to be for people to call it "white," even when you eliminate the other ways we can be tricked into thinking a light source is warmer or cooler than it really is. That points to an actual physiological mechanism in our visual system that causes color perception to vary with intensity.