Relationship between apparent magnitude & app. brightness?



Hi, this is a question related to my Astronomy class. I would like to use the inverse square law for light to determine the luminosity of a star. The only values I have are the apparent magnitude and the distance. I am confused about the relationship between magnitude and brightness; reading the descriptions of both, it seems there should be a simple equation relating them, but I am finding that is not the case.

Help please!


What you want to first compute is the star's absolute visual magnitude. You can do that with the parameters you have, apparent magnitude and distance.

Absolute visual magnitude (or simply absolute magnitude) compares the brightness of objects as if they were all placed at the same distance of 10 parsecs from the observer. This allows comparing the true brightness of stars.

You can compute absolute magnitude using formulas and examples from here:

If you need to know the intensity, or amount of radiation the star emits, you would first compute the star's absolute magnitude and then compare it to our own star, whose luminosity is known. However, to get the total output (radiation emitted at all wavelengths, not just visible) you would need the bolometric magnitude of the star, which accounts for every wavelength at which the star radiates.
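To make the two steps above concrete, here is a minimal sketch: first get the absolute magnitude M from the apparent magnitude m and the distance d in parsecs, then compare against the Sun's absolute visual magnitude (taken here as roughly 4.83) to get a visual-band luminosity in solar units. The example numbers for Sirius are illustrative, not the thread's own data.

```python
import math

M_SUN_V = 4.83  # Sun's absolute visual magnitude (approximate)

def absolute_magnitude(m, d_parsec):
    """Absolute magnitude from apparent magnitude and distance in parsecs.

    M = m - 5 * log10(d / 10 pc)
    """
    return m - 5 * math.log10(d_parsec / 10.0)

def luminosity_solar(M):
    """Visual-band luminosity in solar units from absolute magnitude."""
    return 10 ** ((M_SUN_V - M) / 2.5)

# Illustrative example: Sirius, m ≈ -1.46 at d ≈ 2.64 pc
M = absolute_magnitude(-1.46, 2.64)
print(M)                     # ≈ 1.43
print(luminosity_solar(M))   # ≈ 23 (visual band only, not bolometric)
```

Note the result is a visual-band luminosity; as the post above says, a bolometric correction would be needed for the star's total output.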

Hope this helps you to get started.


Well, we'll ignore for now that magnitudes are an anachronistic measurement that should no longer be used...but astronomers are a highly traditional lot...

The basic idea is that the magnitude is a unit of brightness calibrated so that a difference of 1 magnitude is the smallest change discernible by the average human eye. I.e., you see two stars of similar brightness, but you can distinguish, barely, that one is dimmer than the other, and that should be a single magnitude...

Granted, this doesn't work well in practice, and now that it's been quantified so we can use it as a real unit of measurement, it works less well than it did. But that was the origin of the screwy "magnitude" system.

So the relationship between magnitude and brightness is pretty straightforward: they're the same quantity, just measured on different scales. Magnitude is a logarithmic scale, so equal steps in magnitude correspond to equal ratios in brightness.
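The quantitative version of that link is the Pogson relation: 5 magnitudes is defined as a factor of exactly 100 in brightness, so m1 − m2 = −2.5 log10(F1/F2). A quick sketch:

```python
def flux_ratio(m1, m2):
    """Brightness (flux) ratio F1/F2 for two apparent magnitudes.

    Pogson relation: m1 - m2 = -2.5 * log10(F1 / F2),
    so F1/F2 = 10 ** (-0.4 * (m1 - m2)).
    """
    return 10 ** (-0.4 * (m1 - m2))

# A 5-magnitude difference is a factor of exactly 100 in brightness;
# the lower magnitude is the brighter star:
print(flux_ratio(1.0, 6.0))  # 100.0
```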

Now the big thing to actually worry about here is the difference between absolute, and apparent brightness/magnitude.

Absolute magnitude is a measure of how bright the star REALLY is, and this number is fixed: it doesn't matter how far away the star is, the absolute magnitude never changes, because it's based on the star's actual output. This figure is determined by calculating what the brightness/magnitude of the star would be if we viewed it from 10 parsecs away.

Apparent magnitude is how bright the star appears to be. If the star is 10 parsecs away, this figure is equal to the absolute magnitude. If the star is closer, it will appear brighter (so lower magnitude); if it's further, it will appear dimmer (higher magnitude).

So the only thing that, theoretically, causes a difference between these two figures is the distance at which the star is viewed. The difference between the two magnitudes is the "distance modulus", and the equations aphh provided relate it directly to distance, so knowing any two of apparent magnitude, absolute magnitude, and distance gives you the third.
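That relationship, m − M = 5 log10(d) − 5 with d in parsecs, can be rearranged either way; a minimal sketch of both directions:

```python
import math

def distance_from_modulus(m, M):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d) - 5."""
    return 10 ** ((m - M + 5) / 5)

def absolute_from_distance(m, d_parsec):
    """Absolute magnitude from apparent magnitude and a known distance."""
    return m - 5 * math.log10(d_parsec) + 5

# At exactly 10 parsecs the modulus is zero, so m and M coincide:
print(distance_from_modulus(4.83, 4.83))      # 10.0
print(absolute_from_distance(4.83, 10.0))     # 4.83
```

For the original question, where the distance is already known, it's the second function that applies: it recovers the absolute magnitude without touching the inverse square law directly.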

So, now that you've read all that, I'll give you a clue as to where you've gone wrong: you're trying to work directly from the logical "inverse square law". You'll want to use the distance modulus equations aphh provided instead, as they fit the structure of the question far better.