Updated: Jun 14
Ever since there have been babies, there have been conflicting views on how to feed them. Today, breastfeeding is generally recognized in our country as the gold standard for infant nutrition. The American Academy of Pediatrics, American College of Obstetricians and Gynecologists, and World Health Organization all wholeheartedly encourage breastfeeding. Its benefits for both mom and baby are scientifically proven, and its adoption in our culture is now widespread.
But breastfeeding hasn’t always been the prevailing baby feeding practice in the U.S. At times, it was socially sneered at and even medically discouraged. Beyond its checkered past, breastfeeding has always been a loaded topic that stirs up complicated emotions for mothers who don’t have the ability or desire to breastfeed. Let’s look at how breastfeeding has fared in American culture over the centuries and why it warrants a dedicated week – World Breastfeeding Week – observed annually from August 1st to 7th.
Colonial America
As millions of immigrants came to America, they left behind extended families who would typically have shared child-rearing responsibilities and inter-generational breastfeeding wisdom. In England, the homeland of most colonial immigrants, upper-class women employed wet nurses to breastfeed their children. Wet nurses were far scarcer, but highly sought-after, in colonial America. These arrangements were typically temporary, and wet nurses were either formally employed (e.g. a well-off family would hire a woman who was nursing her own infant or whose infant had died) or casually engaged (e.g. a nursing neighbor would volunteer to take in a newborn until the mother was able to breastfeed).
Up until the 18th century, physicians subscribed to the humoral theory of medicine, which held that the human body consists of four humors: blood, yellow bile, black bile, and phlegm. Breast milk was believed to be menstrual blood that changed color in the womb and flowed into the breasts after birth. At this time, doctors supported breastfeeding based on the observation that maternal milk gave infants the best chance of survival. Colostrum, however, was believed to be toxic and was commonly rejected as a baby’s first food. Therefore, most newborns were either nursed by mothers of older infants or given alternative sources of nourishment until their mother’s mature milk came in.
Due to the general scarcity of women available to serve as wet nurses in colonial America and legal prohibitions against sexual relations with nursing women, mothers frequently flouted their doctors’ advice and hand-fed their babies prepared foods as a substitute for breast milk. The most common were mixtures of flour or breadcrumbs cooked in water or in milk from cows or goats. Desperate times called for desperate measures, and it was not unheard of for human babies to suckle directly from animals’ teats. Direct suckling was actually safer than other substitutes, which ran a high risk of contamination. All in all, these early attempts to feed babies on animal milk and other substitutes frequently ended in disease or death, often from infection, dehydration, or malnutrition.
The 19th Century
In a stark change from the colonial era, when mothers typically breastfed at least through their babies’ second summer, many mothers began to supplement their own breast milk with cow’s milk shortly after birth and to wean their babies from the breast before the age of three months. Throughout the 1800s, lower-class women flocked to the factories that cropped up during the Industrial Revolution, and middle-class women became more active in social organizations. Breastfeeding was considered a private affair, so these women sought out breast milk substitutes to feed their babies while they were outside the home. In large part because these social forces drew mothers away from their babies during the day, many mothers didn’t breastfeed longer than a few weeks or months.
As many infants were left with caregivers (e.g. servants of upper-class mothers, older daughters of working-class mothers) who could not nurse them, their diets consisted mostly of homemade alternatives. Some mothers concocted their own feeding mixtures at home, combining liquids such as cow’s milk, melted butter, or meat-based broth with flour or soaked bread. Babies were often hand-fed spoiled cow’s milk from unsanitary pap boats or makeshift bottles that were hard to clean, and hand-fed babies died at alarming rates. In some cities, one-third of children died before the age of five.
In the late 1800s, the infant mortality rate climbed so high that physicians sought a medically sound method of feeding babies in the absence of breastfeeding. Medical researchers determined that questionable nutrition (compounded by poverty and unsanitary conditions) was the main culprit in infant deaths and turned a critical eye on cow’s milk. Mothers began to rely more on the expert opinions of physicians in their choice of feeding methods, and commercial alternatives gained ground; the first artificial formula had been introduced in 1856.
The end of the 19th century brought major scientific advances in the arena of infant feeding. The discoveries of French scientist Louis Pasteur proved the existence of potentially contaminating microorganisms in food and introduced the process of pasteurization, which heats milk to a specific temperature to destroy harmful bacteria. Sterilizing infant formula and bottles made it much safer to feed babies breast milk substitutes.
First Half of the 20th Century
In the early 20th century, the American medical community made strides to improve the country’s abysmal infant outcomes. The United States Children’s Bureau, the first federal agency devoted to the welfare of mothers and children, was founded in 1912. One year prior, S.W. Newmayer was asked to guide Philadelphia’s child health and welfare reform. He discovered that the United States ranked 18th among developed nations when comparing infant mortality rates. At that time, 135 out of 1,000 live births resulted in death. Medical researchers agreed that artificially fed infants died at a much higher rate from gastrointestinal diseases than breastfed infants and concluded that nine-tenths of these deaths were from preventable causes.
As part of a national campaign to lower infant mortality, the medical community orchestrated two sets of public health campaigns in the early 20th century: one encouraging mothers to breastfeed as long as possible and the other crusading for clean cow’s milk. Public health officials across the country hung posters in urban neighborhoods urging mothers, “To lessen baby deaths let us have more mother-fed babies. You can’t improve on God’s plan. For your baby’s sake—nurse it!” and “Give the Bottle-Fed Baby a Chance For Its Life!”
Breastfeeding among the middle and upper classes had become the ideal at the turn of the century; yet many mothers, whether out of need or preference, continued to turn to breast milk alternatives. Heeding medical advice that human milk should be chosen over cow’s milk and other artificial substitutes, mothers who could not or chose not to breastfeed hired wet nurses to do it for them. Hospitals in the U.S. had entire wings of wet nurses, who were usually desperate, destitute women. It was an unpleasant occupation that often required the wet nurse to abandon her own child to ensure the survival of a wealthier mother’s child. By the 1920s, instead of nursing babies in hospitals, households, orphanages, and other institutions, most wet nurses began to sell their breast milk to be bottled.
The decline of wet nursing coincided with the rise of bottle-feeding in the 1910s. With the widespread availability of clean cow’s milk, bottle-feeding quickly gained popularity because it was perceived as convenient, scientific, and modern. The prevailing opinion of the day was that breastfeeding was an antiquated, primitive practice for unenlightened women. Infant formula manufacturers began to market synthesized infant formula not only as a necessity for women working outside the home but also as a convenience for women who wanted a freer lifestyle. By the 1950s, even physicians had gone from viewing commercial formula as a healthy supplement or replacement for breastfeeding to believing it was actually a superior means of baby feeding. Trusting in science, and therefore in the safety and superiority of scientifically manufactured artificial food, trumped trusting in maternal instinct.
Beginning in the 1930s and continuing for an entire generation, hospitals allowed mothers who had just delivered to have very limited access to their babies. For the first week or more after birth, babies spent their days and nights in the hospital nursery and were only allowed to be with their mothers for a few minutes every four hours of the day. These practices made the establishment of successful breastfeeding nearly impossible. At the time, this was of no concern to physicians, who had wholeheartedly embraced the ease and healthfulness of medically-directed bottle-feeding. By the middle of the 20th century, an estimated 80% of American mothers bottle-fed their babies.
Second Half of the 20th Century
As the medical community began to believe that pasteurization made cow’s milk just as good for a baby as human milk, a new generation of doctors turned their backs on breast milk and mothers increasingly eschewed breastfeeding altogether. During the 1930s, more than 70% of firstborn infants were breastfed at birth, and 45% of these babies were still feeding at the breast at three months of age. By 1965, breastfeeding rates at birth had dropped to 38%, and only 12% of these babies continued to breastfeed at the three-month mark. In 1970, breastfeeding initiation rates dipped to 28%, and only 8% of infants were still breastfeeding at three months of age. The bottle had fully displaced the breast.
Breastfeeding’s downward trend hit rock bottom in 1972, when only 22% of mothers breastfed their babies. At this point, a combination of socio-political factors and improved medical technologies caused the breastfeeding pendulum to swing back in the other direction. Family life became a nationwide priority, and political reform was passed to support mothers.
As the feminist-inspired women’s health reform movement rekindled an interest in breastfeeding, mothers abandoned the bottle and shared baby-nursing duties with other moms in their neighborhood. A companion to the rising women’s movement, the natural childbirth movement fostered greater family participation in the birth experience through education classes, which included an emphasis on breastfeeding. Feminists’ efforts to empower women and change the delivery of health care dramatically improved breastfeeding rates.
Physicians in the first half of the century had been ill-prepared by their medical education to help women make informed feeding choices, and many stocked their offices with infant formula samples, discount coupons, gifts, and promotional literature, all of which contributed to shorter durations of breastfeeding. In the 1970s, breastfeeding research bore out the benefits of breast milk for infants. Analyzing the composition of human milk, studies identified its short-term and long-term nutritional, immunologic, psychological, physiological, and neuro-cognitive benefits, as well as its effect on reducing respiratory illnesses, gastrointestinal illnesses, and inflammatory diseases. Towards the end of the century, a consensus emerged among health professionals that exclusive breastfeeding for the first six months should be universally recommended for optimal infant nutrition and health.
During the 1980s, an entire profession developed around breastfeeding. La Leche League International, founded with the goal of providing accurate information to breastfeeding women through a mother-to-mother model, developed uniform standards for the growing field of skilled lactation consultants. International Board Certified Lactation Consultants took on the task of facilitating breastfeeding in hospitals, public health clinics, private practice, and pediatric and obstetrical offices. Efforts to increase breastfeeding rates in the U.S. were paying off. By 1984, 61% of American babies were being breastfed at birth.
Nonetheless, American society was not fully ready to embrace breastfeeding as a mainstream practice outside the home. In the 1990s, many states passed legislation so that mothers could breastfeed in public places. The landmark Innocenti Declaration on the Protection, Promotion, and Support of Breastfeeding, developed by the WHO and UNICEF and co-sponsored by the United States and Sweden in 1990, set ambitious standards for national support of breastfeeding, including ensuring that its Ten Steps to Successful Breastfeeding would be followed by every maternal care facility and enacting legislation to protect the breastfeeding rights of working women. The “Breast Is Best” campaign was born, putting breast milk on a pedestal above all other forms of infant nutrition.
The Last Two Decades to Today
The breastfeeding movement has continued to make progress through political reform. In 2008, the Department of Health and Human Services developed a mass media campaign to promote breastfeeding, and in 2011, the U.S. Surgeon General's Call to Action to Support Breastfeeding identified key areas of support for breastfeeding, including health care, families, communities, and employment. The Centers for Disease Control and Prevention has supported quality improvement initiatives around maternity care practices to help promote breastfeeding. Perhaps the biggest boon for breastfeeding in modern history has been the Affordable Care Act, which passed in 2010 and requires health insurance plans to provide coverage for breastfeeding support and supplies.
Overall, efforts to provide moms who want to breastfeed with adequate resources and support and to dismantle barriers to breastfeeding in public places have been largely successful. Breastfeeding rates have steadily climbed over the past two decades, as women have decided to nurse in higher numbers and for a longer duration. Almost 70% of American mothers initiated breastfeeding in 2001, and that figure rose to 82.5% in 2014.
Nonetheless, glaring disparities persist along socioeconomic and demographic lines. Research shows that hospital maternity wards serving predominantly Black communities are less likely to help Black mothers initiate breastfeeding after birth or to offer lactation support, instead giving their babies infant formula. Gaps in the coverage mandated by the Affordable Care Act have posed another barrier: many families have been saddled with unreimbursed claims after paying out of pocket for lactation services. Ultimately, breastfeeding remains a sensitive topic that can conjure feelings of shame or guilt.
Controversies over baby feeding methods might seem like a modern problem, given the continued shaming of public breastfeeding at home and abroad and the guilt many moms carry over how they feed their babies. But history shows that successful breastfeeding has long been fraught with challenges. The questions American women faced in past centuries aren’t so different from those facing women today: how should they feed their babies, and what will everyone else think about it?
World Breastfeeding Week is an opportunity to celebrate the achievements of the past and ensure a promising foundation for the future of moms and babies around the world.
With Warmth and Wellness,
Your EmmaWell Team