Why Do Women Wear Bikinis?
Women wear bikinis to cover the areas that modesty and decency call for while still allowing freedom at the beach or pool. The bikini's cut is meant to resemble underwear, giving coverage where it matters without the bulk of a full swimsuit. This helps people who are self-conscious about their bodies, or worried about sunburn, feel more comfortable, while also letting others show off their figures without being overly exposed. A common misconception is that bikinis are worn simply out of habit or convenience, but for most wearers the choice is a deliberate balance between comfort, coverage, and confidence.