All of it matters, including wiring. Small wires cause more voltage drop than big wires, and incandescent bulbs are very sensitive to voltage drop: a 10% drop in voltage leads to roughly a 33% drop in light output. When you go to higher-wattage bulbs, you may see an increase in brightness on the stock harness, but nowhere near what they can put out, because the voltage drop grows as you try to draw more current through the stock wires.
Daniel Stern Lighting Consultancy and Supply
Assuming the stock wiring is 16 gauge, which has a resistance of 0.00473 ohms per foot, a cable length of 6 feet, and a current draw at 100 watts under 14.4 volts of 100/14.4 = 6.9 A, we get a total voltage drop in the wire of 6 x 0.00473 x 6.9 = 0.19 V, or approximately 1.3%: nowhere near the 10% you quote.
Even assuming 18 gauge, you still don't get above a 2% voltage drop, which translates to an output ratio of no less than 0.98^2, or about 0.96, meaning at most a 4% drop in luminous output (output is pretty much proportional to the square of the voltage).
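For anyone who wants to re-run the numbers, here's a minimal sketch of the same arithmetic in Python. The 16 AWG resistance is the figure quoted above; the 18 AWG value is an assumed handbook figure, since the post doesn't give one, and the square-law output relation is the approximation stated above.

```python
# Voltage-drop sketch for the wiring discussion above.
# 16 AWG resistance is from the post; 18 AWG is an assumed handbook value.
OHMS_PER_FOOT = {
    16: 0.00473,  # ohms/ft, quoted in the post
    18: 0.00639,  # ohms/ft, assumed value for 18 AWG copper
}

V_SUPPLY = 14.4    # running voltage, volts
POWER = 100.0      # bulb draw, watts
LENGTH_FT = 6.0    # cable length, feet

current = POWER / V_SUPPLY  # ~6.9 A

for gauge, r_per_ft in OHMS_PER_FOOT.items():
    v_drop = LENGTH_FT * r_per_ft * current
    drop_frac = v_drop / V_SUPPLY
    # Output taken as roughly proportional to V^2, per the post.
    output_loss = 1 - (1 - drop_frac) ** 2
    print(f"{gauge} AWG: drop {v_drop:.2f} V ({drop_frac:.1%}), "
          f"output down ~{output_loss:.1%}")
```

This prints about a 1.4% drop (and ~2.7% output loss) for 16 AWG and under 2% (and ~3.7% output loss) for 18 AWG, matching the figures above.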
IMO it's more important to make sure your voltage supply is properly regulated for the bulb type you are using (standard running voltage is 14.4 V; with the engine stopped it's around 12 V, depending on load), and that the connectors are free of corrosion and well protected. That means every connection between the battery and the light socket. Granted, if you measure the voltage at the socket in use, you will probably see a significant drop, on the order of a volt or so.
Using gold-plated connectors and spot welds between wire and connector, with dielectric lubricant, would certainly help make the connections truly low loss.
My JDM still has shiny brass surfaces visible in the connectors. Not too shabby after 17.5 years...
Anyway, I'm going to measure these voltage drops tonight, check the housing temperature, and come back with real-life figures.