From my own experience and observations, I don't think it's a logical assumption at all, at least not in mainstream Western culture. I can see how it might be a problem in some other countries, or in areas with large immigrant or first-generation populations from countries with a very gender-divided culture, but otherwise no. In the US, and probably in other Western cultures as well, we are raised to see a doctor as a doctor... and stereotypically (still true in most fields), most doctors of all types are male.

Of the OB/GYNs I personally know, five are male and one is female, and the female is the only one still accepting new patients. Among their patients that I know, the choice isn't gender-driven in the slightest... three of the male docs have been in practice 30+ years, and another is constantly updating his techniques to offer the best treatments available. The female is a good doctor, and the only one I knew professionally before I knew her personally, but she has only been in practice for eight years, so many women opt for the more experienced doctors.

Personally, I saw the aforementioned female doc with my first pregnancy and thought she did a good job overall, but I was not at all happy with a couple of things she did. For this pregnancy I am seeing the least experienced of the male docs, but his technique is the best in the area and he is a lot gentler than my previous female doctor. Overall, the only people I know who have a hang-up about male GYNs are some of the less-secure boyfriends and husbands of the women who go to them.