I understand, but that’s what a feminist is!
These words have been co-opted. Especially in this country. Men call women feminists as a pejorative when they feel like a woman is mean to them, or when they don’t like their female boss. Historically, feminists have been painted as women with an agenda of, like, world domination.
A feminist, by dictionary definition, is a woman who does exactly what you said: fights for her own basic human rights and advocates for other women.
We’re all political actors just by existing. You don’t need to participate in collective action to do political work. I’m doing feminism right now by getting an abdominal ultrasound rather than a transvaginal one, because I’m protecting myself while still getting my needs met.
I want feminism to be inclusive, because I think women like you are right about so much, and I think we all just need to talk to each other more, and with more kindness.