So, then. This Project A119. Just how bright would the flash have been?
Fortunately, we have a pretty nifty benchmark. On August 11, 2004, astronomers observed the first known Perseid impact on the moon. An impactor calculated to be about 12 g hit the moon at 61,000 meters per second (approximately 137,000 mph), creating a flash lasting 1/30th of a second with a visual magnitude of 9.5.
Each visual magnitude is 2.5* times fainter than the magnitude below it, and lower numbers are brighter. Thus, Jupiter at magnitude -4 is 100 times brighter than a magnitude 1 star.
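As a quick sanity check, that magnitude-to-brightness conversion fits in a couple of lines of Python (the function name here is mine, not any standard API):

```python
# Each magnitude step is the fifth root of 100 (~2.512),
# and lower magnitudes are brighter.
def brightness_ratio(mag_bright, mag_faint):
    """How many times brighter the first object is than the second."""
    return 100 ** ((mag_faint - mag_bright) / 5)

# Jupiter at magnitude -4 vs. a magnitude 1 star: 5 steps apart.
print(brightness_ratio(-4, 1))  # 100.0
```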
E = ½ × mass × velocity², so E = ½ × 0.012 kg × (61,000 m/s)² ≈ 22 megajoules (= 2.2 × 10⁷ J).
The largest US nuclear test was the Castle Bravo test of 1954, with a yield of 15 megatons of TNT, which equals 63 petajoules (= 6.3 × 10¹⁶ J).
Assuming the energy-to-light ratio is similar, we can calculate just how bright the flash would have been.
6.3E16** / 2.2E7 = 2.9E9 times brighter. That's 2.9 BILLION*** times brighter.
log base 2.512 of 2.9E9 ≈ 23.7 magnitudes brighter than 9.5. 9.5 - 23.7 = magnitude -14.2.
The full moon, by comparison, is magnitude -12.7. The nuke would be roughly 4 times brighter than the full moon.
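The whole chain can be run end to end in Python. I've assumed the standard conversion of 4.184 × 10⁹ J per ton of TNT, which the text doesn't state; carrying full precision instead of the rounded intermediate figures lands at about magnitude -14.1, within rounding of the numbers above:

```python
import math

impactor_j = 0.5 * 0.012 * 61_000 ** 2   # Perseid impactor, ~2.2e7 J
bravo_j = 15e6 * 4.184e9                 # 15 Mt of TNT, ~6.3e16 J
ratio = bravo_j / impactor_j             # ~2.9e9 times more energy

# A brightness ratio converts to magnitudes as 2.5 * log10(ratio),
# equivalent to log base 2.512 of the ratio.
delta_mag = 2.5 * math.log10(ratio)      # ~23.6 magnitudes
flash_mag = 9.5 - delta_mag              # ~ -14.1

# Compare with the full moon at magnitude -12.7.
vs_moon = 100 ** ((-12.7 - flash_mag) / 5)
print(round(flash_mag, 1), round(vs_moon, 1))
```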
* actually the fifth root of 100, 2.512, so that 5 magnitudes equals a 100-times difference in brightness
** Exponential notation. 1.5E4 = 1.5 × 10⁴ = 1.5 × 10,000 = 15,000
*** American billion: 10⁹