This is not it (edit: “this” referring to atmospheric scattering), because the region in shadow is significantly larger than the portion of the sky you can actually see. If you could see that far, people in a roughly 50% coverage zone would see the sky darken dramatically in one direction while staying bright in another. What we saw instead was the entire sky darkening evenly.
The real answer lies mostly in our nonlinear perception of light: we’re much more sensitive to an absolute change in the amount of light when there’s little light than when there’s a lot. So the difference between 100% bright and 50% bright looks much smaller to us than the difference between 50% bright and 0% bright.
Try turning on a flashlight during the day and again at night and you can see the difference the same absolute change in brightness makes under different lighting conditions. Say a flashlight is 5% as bright as the sun: going from 100% to 105% feels like nothing, but going from 0% to 5% is massive.
In fact, you can model this difference using existing perceptually accurate color spaces. Take the CIE L*a*b* color space. To find the perceived lightness (L*), you take Y, the absolute brightness in the CIE XYZ color space, and run it through L* = 116 f(Y/Yn) - 16, where Yn is the brightness of some predefined white point and f is effectively the cube root (it switches to a linear segment when its argument drops below (6/29)^3, i.e. below about 1%).
If you look at the perceived lightness at 20% absolute brightness, it comes out to an L* of roughly 52, which is much closer to the ~75% brightness OP was describing than the raw 20% would suggest.
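If you want to sanity-check those numbers, here’s a quick Python sketch of that L* curve (the function name and the sample values are just mine, for illustration):

```python
# CIE L* (perceived lightness) from relative luminance Y/Yn (1.0 = the white point).
def cielab_lightness(y_rel: float) -> float:
    delta = 6 / 29
    if y_rel > delta ** 3:                      # above ~0.9%: cube-root region
        f = y_rel ** (1 / 3)
    else:                                       # below that: linear segment
        f = y_rel / (3 * delta ** 2) + 4 / 29
    return 116 * f - 16

# The flashlight example: the same 5% absolute change in light
print(cielab_lightness(1.05) - cielab_lightness(1.00))  # ~1.9  (barely noticeable)
print(cielab_lightness(0.05) - cielab_lightness(0.00))  # ~26.7 (huge jump)

# 20% of full sunlight still reads as roughly half as bright
print(cielab_lightness(0.20))  # ~51.8
```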
I imagine there are other factors at play, but this is probably the biggest one.
This article actually talks about how your eyes adjust during an eclipse, which is a bit different from what I was talking about, but likely just as important.
You’re probably right; I have edited my comment to reflect this. source