Women in American Indian Society delves into an area that has long been misrepresented, if not entirely neglected, by mainstream scholars. Traditionally, native women played important roles in their societies, often determining the course of history, but as soon as Europeans set foot on Indian soil, these women began losing ground. Denied their rightful positions of responsibility, excluded from tribal councils, and stripped of their property, Indian women felt sorely the chauvinism that whites forced upon their culture. As they lost status among their own people, many found themselves to be prized in the eyes of white men, who needed local assistance in order to survive and profit from their ventures in the New World. Others experienced greater denigration at the hands of non-Indians. In the 19th century, after being placed on reservations and forced to learn the ways of whites, native women wielded both newfound and traditional knowledge to surmount U.S. government efforts to obliterate Indian cultures. Today native women continue to take matters into their own hands by revitalizing their heritage and rewriting American history.