I've heard that darker skin was historically associated with poverty, even in Europe, but I've also heard that men prefer pale skin while women don't have a strong preference either way. Tanned skin in Europe and North America now seems to be associated with sun, health, and vacation.
I don't like to criticize people about this superficial stuff because I have some weird preferences myself, although they don't follow mainstream culture and I really don't care about skin tone or what shade a vagina is. I've seen how these things form and change. "Feminists" blame companies, but it's rare for a product to generate demand, even with the best advertising in the world. This kind of thing existed in the past, and it was much worse: it included foot binding and corsets. The truth is that, because of gender roles, women bear an unfair share of this burden, but superficiality and fashion trends aren't going away, and they're not driven by marketing departments.
I think this product may have been created in part because of their preference for fairer skin, since that creates contrast with the sexual organs, which are usually darker. I think that's another reason tanned skin is preferred in Europe and North America: the skin tone is more uniform, unless you get tan lines, which look terrible.