I don't think so. I would challenge these polls, the demographic groups they targeted, the inherent bias involved, etc.
In my career, I have always seen women trusting and preferring to work with women rather than men. In fact, I have explicitly been told that multiple times. I quote a female colleague I once had: "Women prefer working with women because we are not afraid to be vulnerable with each other, which creates more trust."
I have also almost always seen that in organizations where the leader is a woman, the subsequent positions are all filled by women, not men.
What would you like me to do, find a way to scrub the countless polls on the internet that support this finding because you feel there is inherent bias?
I can offer an anecdote of my own. I manage large teams, and I always have to be cognizant of female-manager-to-female-staff dynamics. Guys don't give a shit about liking their boss, being friends with their boss, etc. They just get past that. I would say it's getting better because women are getting better at what I call team dynamics. No, we don't all have to be buddies to do the job. Still, I can take a group of employees (including the female employees, of course) who aren't friendly out for drinks. Women employees are more finicky about interpersonal relationships with female co-workers.
This is a generalization, of course, just like yours.
By the way, this seems like a bit of projection:
"Women prefer working with women because we are not afraid to be vulnerable with each other, which creates more trust". Somehow the polling doesn't pick this up. I don't think you can find one study that supports your POV.
This video is in regard to a study in Australia. This female Australian researcher found that 80% of women preferred working for men, so I know it's not just an American thing. In fact, she said:
".......if you go to the literature, there's huge studies all replicating these findings. Gallup in the U.S. has tracked this for sixty years and we're still not shifting the data too much."