noun
Definition: A belief or practice that centers on men and their experiences, often ignoring or minimizing the perspectives and needs of women and people of other genders.
Example: The company's policies reflected a clear androcentrism, prioritizing male employees' needs over those of their female counterparts.