In a sense, aurorae are brightenings in the sky caused by fluctuations in solar activity. But I presume what you're asking is whether the average surface brightness of the sky changes as a result of solar variability. The sun's total output does vary, at roughly the 0.1% level over the solar cycle, in step with the sunspot count (this was measured by the Solar Maximum Mission back in the '80s - see for example:
http://solarscience.msfc.nasa.gov/SMM.shtm ). Counterintuitively, it is actually brightest when there are more spots, because the bright faculae that accompany active regions more than make up for the dark spots, and most of the fluctuation comes in the UV/X-rays. In the optical the variability is much lower (much less than 0.1% IIRC - see for example:
http://aanda.u-strasbg.fr:2002/articles/aa/full/2004/38/aa0028-04/aa0028-04.html ). Even a 1% variation is too small to detect by eye - the best visual variable-star observers can pick out changes of a few percent at most - and the sun's actual optical variation is far smaller than that.
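To put numbers on that, here's a quick back-of-the-envelope sketch using the standard magnitude relation (the only inputs are the percentages quoted above):

```python
import math

def delta_mag(flux_ratio):
    """Magnitude change for a given flux ratio: dm = 2.5 * log10(F2/F1)."""
    return 2.5 * math.log10(flux_ratio)

# A 1% brightness change is only about a hundredth of a magnitude...
print(round(delta_mag(1.01), 4))  # ~0.0108 mag
# ...while a "few percent" visual detection threshold is a few hundredths.
print(round(delta_mag(1.03), 4))  # ~0.0321 mag
```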
Re. the full moon brightness changes: there are a number of different factors contributing, but solar variability is not one of them. See for example:
http://curious.astro.cornell.edu/question.php?number=529

I'm not really sure the variations are that big an effect, though. I think a lot of it could be what time of night you happen to go out and look: the moon doesn't look quite as bright when it's closer to the horizon, since its light is passing through more atmosphere there.
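One of those factors is the moon's varying distance. As a rough inverse-square sketch (the perigee/apogee distances below are typical values I'm assuming, not figures from the Cornell page):

```python
import math

# Brightness ratio of a perigee full moon vs. an apogee one,
# from the inverse-square law. Distances are typical assumed values.
perigee_km = 363300.0
apogee_km = 405500.0

ratio = (apogee_km / perigee_km) ** 2    # ~1.25: ~25% brighter at perigee
print(round(ratio, 3))                   # ~1.246
print(round(2.5 * math.log10(ratio), 3)) # ~0.239 mag
```

So on paper, distance alone can swing the full moon's brightness by a couple of tenths of a magnitude between extremes.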