A Brief History of Baby Feeding



Ever since there have been babies, there have been conflicting views on how to feed them. Today, breastfeeding is generally recognized in our country as the gold standard for infant nutrition. The American Academy of Pediatrics, American College of Obstetricians and Gynecologists, and World Health Organization all wholeheartedly encourage breastfeeding. Its benefits for both mom and baby are scientifically proven, and its adoption in our culture is now widespread.

But breastfeeding hasn’t always been the prevailing baby feeding practice in the U.S. At times, it was socially sneered at and even medically discouraged. Beyond its checkered past, breastfeeding has always been a loaded topic that stirs up complicated emotions for mothers who lack the ability or the desire to breastfeed. Let’s look to history to examine how breastfeeding has fared in American culture over the centuries and why it warrants a dedicated week – World Breastfeeding Week – every August 1-7.


18th Century

As millions of immigrants came to America, they left behind extended families who would typically have shared child-rearing responsibilities and inter-generational breastfeeding wisdom. In England, the homeland of most colonial immigrants, upper-class women employed wet nurses to breastfeed their children. In colonial America, wet nurses were far scarcer but highly sought after. These arrangements were typically temporary and ranged from formal employment (e.g. a well-off family hiring a woman who was nursing her own infant or whose infant had died) to casual agreements (e.g. a nursing neighbor volunteering to take in a newborn until the mother was able to breastfeed herself).


Up until the 18th century, physicians subscribed to the humoral theory of medicine, which held that the human body consists of four humors: blood, yellow bile, black bile, and phlegm. Breast milk was believed to be menstrual blood that changed color in the womb and flowed into the breasts after birth. Doctors of the era supported breastfeeding on the grounds that maternal milk gave infants the best chance of survival. Colostrum, however, was believed to be toxic and was commonly rejected as a baby’s first food. As a result, most newborns were either nursed by mothers of older infants or given alternative sources of nourishment until their own mother’s mature milk came in.


Due to the general scarcity of women available to serve as wet nurses in colonial America and to legal prohibitions against sexual relations with nursing women, mothers frequently flouted their doctors’ advice and hand-fed their babies prepared foods as a substitute for breast milk. The most common was a mixture of flour or breadcrumbs cooked in water or in cow’s or goat’s milk. Desperate times called for desperate measures, and it was not unheard of for human babies to suckle directly from animals’ teats; feeding this way was actually safer than other substitutes, which carried a high risk of contamination. All in all, these early efforts to feed babies on animal milk frequently ended in disease or death, most often from infection, dehydration, or malnutrition.



19th Century


In a stark change from the colonial era, when mothers typically breastfed at least through their babies’ second summer, many mothers began to supplement their own breast milk with cow’s milk shortly after birth and to wean their babies from the breast before the age of three months. Throughout the 1800s, lower-class women flocked to the factories that sprang up during the Industrial Revolution, and middle-class women became more active in social organizations. Breastfeeding was considered a private affair, so these women sought out breast milk substitutes to feed their babies while they were away from home. Largely because of these social forces drawing mothers away from their babies during the day, many mothers didn’t breastfeed for more than a few weeks or months.


As many infants were left with caregivers (e.g. servants of upper-class mothers, older daughters of working-class mothers) who could not nurse them, their diets consisted mostly of homemade alternatives. Some mothers concocted their own feeding mixtures at home from a peculiar assortment of liquids such as cow’s milk, melted butter, or meat-based broth, thickened with flour or soaked bread. Babies were often hand-fed spoiled cow’s milk from unsanitary pap boats or makeshift bottles that were hard to clean, and hand-fed babies died at alarming rates. In some cities, one-third of children died before the age of five.


In the late 1800s, infant mortality had climbed so high that physicians began searching for a medically sound way to feed babies in the absence of breastfeeding. Medical researchers identified questionable nutrition (compounded by poverty and unsanitary conditions) as the main culprit in infant deaths and turned a critical eye on cow’s milk. Mothers came to rely more heavily on physicians’ expert opinions in choosing feeding methods, and the first artificial formula was introduced in 1856.


The end of the 19th century brought major scientific advances in the arena of infant feeding. The discoveries of French scientist Louis Pasteur proved the existence of potentially contaminating microorganisms in food and introduced the process of pasteurization, which heats milk to a specific temperature to destroy harmful bacteria. Sterilizing infant formula and bottles made it much safer to feed babies breast milk substitutes.



First Half of the 20th Century


In the early 20th century, the American medical community made strides to improve the country’s abysmal infant outcomes. The United States Children’s Bureau, the first federal agency devoted to the welfare of mothers and children, was founded in 1912. One year earlier, S.W. Newmayer had been asked to guide Philadelphia’s child health and welfare reform. He found that the United States ranked 18th among developed nations in infant mortality: at the time, 135 of every 1,000 babies born alive died. Medical researchers agreed that artificially fed infants died of gastrointestinal diseases at a much higher rate than breastfed infants and concluded that nine-tenths of these deaths were from preventable causes.


As part of a national campaign to lower infant mortality, the medical community mounted two public health campaigns in the early 20th century: one encouraging mothers to breastfeed as long as possible, the other crusading for clean cow’s milk. Public health officials across the country hung posters in urban neighborhoods urging mothers, “To lessen baby deaths let us have more mother-fed babies. You can’t improve on God’s plan. For your baby’s sake—nurse it!” and “Give the Bottle-Fed Baby a Chance For Its Life!”


Breastfeeding among the middle and upper classes had become the ideal at the turn of the century, yet many mothers, whether out of need or preference, continued to turn to breast milk alternatives. Heeding medical advice that human milk should be chosen over cow’s milk and other artificial substitutes, mothers who could not or chose not to breastfeed hired wet nurses to do it for them. U.S. hospitals maintained entire wings of wet nurses, who were usually desperate, destitute women. It was an unpleasant occupation that often required a wet nurse to abandon her own child to ensure the survival of a wealthier mother’s child. By the 1920s, rather than nursing babies in hospitals, households, orphanages, and other institutions, most wet nurses had begun selling their breast milk to be bottled.