It doesn't show that. It's literally numerical: dark skin reflects less light than light skin, so the sensors report lower values across the entire face, which reduces the contrast that recognition systems rely on.
Brown eyebrows on brown skin = low contrast.
Brown eyebrows on pale skin = high contrast.
If our races were dark purple hair on bright green skin and bright green hair on dark purple skin, facial recognition systems would have no trouble with either. But that's not how humans render, so our contrast-based systems struggle with low contrast.
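To put rough numbers on it, here's a minimal sketch using Weber contrast (feature vs. background); the luminance values are made up for illustration, on a hypothetical 0-255 sensor scale:

```python
def weber_contrast(feature: float, background: float) -> float:
    """Relative contrast of a feature against its background."""
    return abs(feature - background) / background

brow = 60        # dark brown eyebrow as read by the sensor (hypothetical)
pale_skin = 200  # light skin reflects more light -> higher sensor reading
brown_skin = 80  # dark skin reflects less light -> lower sensor reading

print(weber_contrast(brow, pale_skin))   # 0.7  -> high contrast, easy to detect
print(weber_contrast(brow, brown_skin))  # 0.25 -> low contrast, easy to miss
```

Same eyebrow, same sensor; the only thing that changed is how much light the surrounding skin reflects back.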
It's like you're confusing a software/data problem with a photon/physics problem because you're thinking in your box.
It's a design problem. If they had tested it with POC, they would have noted "well, our primitive algorithm works well for light-skinned individuals but not others."
And hopefully someone wouldn't have said "hmm good enough for me, let's ship it!"
My webcam has an advanced option panel that lets me edit both the brightness and the exposure time. I can turn it up so bright that you can't even make out any of my facial features, and I'm in a somewhat dark room lit by a single floor lamp.
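That washed-out effect is just sensor saturation: crank the gain enough and every pixel clips at the sensor's ceiling, so all contrast disappears. A tiny sketch with made-up 8-bit values:

```python
def expose(value: float, gain: float, max_value: int = 255) -> int:
    """Apply an exposure/brightness gain, clipping at the 8-bit ceiling."""
    return min(int(value * gain), max_value)

skin, brow = 200, 60            # hypothetical sensor readings
print(expose(skin, 5.0))        # 255 -> clipped
print(expose(brow, 5.0))        # 255 -> also clipped: features gone
```

Once both values hit 255, skin and eyebrow are indistinguishable, which is exactly the "can't make out any facial features" situation.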
> it shows that they didn't test with anyone with a darker skin tone
Are you disagreeing and saying they did test on people with darker skin tone, found the issue, and decided to ship anyway? You realize that either way, it doesn't make them look good?
Anyway, leaving all that aside, the article, which interviews an actual face recognition software expert, shows that your guesses here are incorrect.