Perhaps the easiest measurement to make of a star is its apparent brightness. I am purposely being careful about my choice of words. When I say *apparent brightness*, I mean how bright the star appears to a detector here on Earth. The **luminosity** of a star, on the other hand, is the amount of light it emits from its surface. The difference between luminosity and apparent brightness depends on distance. Another way to look at these quantities is that the luminosity is an intrinsic property of the star, which means that everyone who has some means of measuring the luminosity of a star should find the same value. However, apparent brightness is *not* an intrinsic property of the star; it depends on your location. So, everyone will measure a different apparent brightness for the same star if they are all different distances away from that star.

For an analogy with which you are familiar, consider again the headlights of a car. When the car is far away, even if its high beams are on, the lights will not appear too bright. However, when the car passes you within 10 feet, its lights may appear blindingly bright. To think of this another way, given two light sources with the same luminosity, the closer light source will appear brighter. However, not all light bulbs have the same luminosity. If you put an automobile headlight 10 feet away and a flashlight 10 feet away, the flashlight will appear fainter because its luminosity is smaller.

Stars have a wide range of apparent brightness measured here on Earth. The variation in their brightness is caused by both variations in their luminosity and variations in their distance. An intrinsically faint, nearby star can appear to be just as bright to us on Earth as an intrinsically luminous, distant star. There is a mathematical relationship among these three quantities (apparent brightness, luminosity, and distance) that holds for all light sources, including stars.

Why do light sources appear fainter as a function of distance? The reason is that as light travels towards you, it is spreading out and covering a larger area. This idea is illustrated in this figure:

Again, think of the luminosity (the energy emitted per second by the star) as an intrinsic property of the star. As that energy gets emitted, you can picture it passing through spherical shells centered on the star. In the above image, the entire spherical shell isn't illustrated, just a small section. Each shell should receive the same total amount of energy per second from the star, but since each successive sphere is larger, the light hitting an individual section of a more distant sphere will be diluted compared to the amount of light hitting an individual section of a nearby sphere. The amount of dilution is related to the surface area of the spheres, which is given by:

$A = 4\pi d^{2}$.

How bright will the same light source appear to observers fixed to a spherical shell with a radius twice as large as the first shell? Since the radius of the first sphere is $d$, and the radius of the second sphere would be $2d$, the surface area of the larger sphere is larger by a factor of $2^{2} = 4$. If you triple the radius, the surface area of the larger sphere increases by a factor of $3^{2} = 9$. Since the same total amount of light is illuminating each spherical shell, the light has to spread out to cover 4 times as much area for a shell twice as large in radius. The light has to spread out to cover 9 times as much area for a shell three times as large in radius. So, a light source will appear four times fainter if you are twice as far away from it as someone else, and it will appear nine times fainter if you are three times as far away from it as someone else.
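The dilution factors above can be checked with a few lines of arithmetic. Here is a minimal Python sketch (the function name is my own choice, not from the text) that scales brightness by the inverse square of the distance:

```python
# Inverse-square dilution: the same luminosity spreads over spheres of
# area 4*pi*d**2, so apparent brightness falls off as 1/d**2.

def relative_brightness(distance_factor):
    """Brightness relative to an observer at the original distance (factor = 1)."""
    return 1.0 / distance_factor**2

for n in (1, 2, 3):
    print(f"At {n}x the distance: {relative_brightness(n):.4f} of the original brightness")
# At 2x the distance the source is 1/4 as bright; at 3x, 1/9 as bright.
```

Doubling the distance gives a factor of 0.25 and tripling gives about 0.1111, matching the factors of 4 and 9 in the text.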

Thus, the equation for the apparent brightness of a light source is given by the luminosity divided by the surface area of a sphere with radius equal to your distance from the light source, or

$F = \dfrac{L}{4\pi d^{2}}$, where $d$ is your distance from the light source.

The apparent brightness is often referred to more generally as the flux, and is abbreviated $F$ (as I did above). In practical terms, flux is given in units of energy per unit time per unit area (e.g., Joules / second / square meter). Since luminosity is defined as the amount of energy emitted by the object per unit time, it is given in units of energy per unit time (e.g., Joules / second, where 1 Joule / second = 1 Watt). The distance between the observer and the light source is $d$, and should be in distance units, such as meters. You are probably familiar with the luminosity of light bulbs given in Watts (e.g., a 100 W bulb), and so you could, for example, refer to the Sun as having a luminosity of $3.9 \times 10^{26}\ \mathrm{W}$. Given that value for the luminosity of the Sun and adopting the distance from the Sun to the Earth of $1\ \mathrm{AU} = 1.5 \times 10^{11}\ \mathrm{m}$, you can calculate the flux received at Earth from the Sun, which is:

$F = \dfrac{3.9 \times 10^{26}\ \mathrm{W}}{4\pi \left(1.5 \times 10^{11}\ \mathrm{m}\right)^{2}} \approx 1{,}379\ \mathrm{W}$ per square meter
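This calculation is easy to reproduce. A short Python sketch of the flux formula, using the luminosity and distance values quoted above (the function and variable names are mine):

```python
import math

# Flux from luminosity and distance: F = L / (4 * pi * d**2)
def flux(luminosity_watts, distance_m):
    """Apparent brightness (W/m^2) of a source at the given distance."""
    return luminosity_watts / (4 * math.pi * distance_m**2)

L_sun = 3.9e26    # W, luminosity of the Sun (value used in the text)
d_earth = 1.5e11  # m, Earth-Sun distance (1 AU)

F = flux(L_sun, d_earth)
print(f"{F:.0f} W per square meter")  # prints: 1379 W per square meter
```

The result, about 1,379 W per square meter, agrees with the value worked out above.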

This value is usually referred to as the **solar constant**. However, as you might guess, since the Earth/Sun distance varies and the Sun's luminosity varies during the solar cycle, there is a few percent dispersion around the mean value of the solar "constant" over time.