No Antibiotics!

written by PtF Farm Store

posted on March 26, 2024

At Pasture to Fork, we like to point out that we don’t use antibiotics in the production of the foods we raise. Why is this important? Weren’t antibiotics deemed a miracle drug that proved to be a powerful life-saving tool when they first became available?

Yes, that is true, and I’ll even go so far as to say they are still a major lifesaver in human medical care eighty years later. However, today we’ll talk about how they have been abused and why it’s crucial that we limit their use to human medicine.

The first known antibiotic, dubbed penicillin, was discovered accidentally by a researcher named Alexander Fleming in 1928. Though discovered in the twenties, it wasn’t widely produced until the 1940s, when—after saving lives miraculously in instances like the Cocoanut Grove nightclub fire—the US government invested in its production for use on the war front in WWII.

It’s hard for us to imagine a world before antibiotics. Everything from paper cuts to childbirth had the potential to kill via bacterial infection. Even minor wounds to soldiers in warfare—once bacterially infected—were cause for amputation, extreme illness, and death, which is why the government had an interest in the mass production of penicillin.

Penicillin was quickly followed by other antibiotics such as aureomycin, tetracycline, and the like—and in the late 1940s it quickly became a race between pharmaceutical companies to develop the next family of antibiotics that could then be patented. The formula for penicillin was never proprietary—it was seen as a product for the greater good of society—but pharmaceutical companies, of course, each wanted their own piece of the pie.

In 1948, a researcher named Thomas Jukes, an animal nutrition specialist working for Lederle Laboratories, discovered (again, by accident) a ground-breaking new twist on antibiotic use. Searching for ways to cut costs for poultry farmers after demand dropped sharply post-WWII, his research used laboratory waste from antibiotic manufacturing as a supplement in chicken feed. When early results hinted at growth-promoting properties, Jukes performed one of the first controlled research projects on chickens and found the group fed antibiotic waste to be markedly bigger at the end of the 25-day feeding trial—establishing the growth-promoting effect of feeding antibiotics to animals.

This discovery opened a whole new frontier for the pharmaceutical industry and a tremendous new market for growth promoters in animal agriculture. It also bred an unprecedented hubris in the world of animal agriculture, which in turn led to the confinement animal feeding operations (CAFOs) of today. But the chickens had yet to come home to roost (pun intended).

In the early years of antibiotics as growth promoters, the common mentality among both manufacturers and farmers was “if a little is good, more is better,” and growth-promoter antibiotics were largely unregulated, which resulted in heavy use, inexact dosages, and the like. Regulators, as enamored as the industry, looked the other way and didn’t interfere. For context, let’s remember this was the era when the chemical DDT was considered a marvelous and life-changing invention, only to be banned later.

Even early on there was concern among a few (only a few) scientists about the prospect of antibiotic resistance. In his 1945 Nobel Prize acceptance speech, Alexander Fleming warned that the development of resistance had the potential to ruin the miracle of antibiotics. Resistance is the term used to describe the ability of bacteria to mutate and overcome the killing power of antibiotics. The thesis is that an application of antibiotics never kills all the bacteria, allowing the survivors to gain genetic resistance not only to that particular antibiotic but to other antibiotics as well.
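To make that selection logic concrete, here is a toy simulation, a minimal sketch in Python. The population size, mutation rate, and kill rates are invented purely for illustration; they come from no study, and certainly not from McKenna's book.

```python
import random

# Toy model of selection under repeated low-dose antibiotic exposure.
# Every number here is an illustrative assumption.
POPULATION = 100_000
MUTATION_RATE = 1e-4      # chance a susceptible bacterium mutates to resistant
KILL_SUSCEPTIBLE = 0.90   # a low dose kills most susceptible bacteria...
KILL_RESISTANT = 0.05     # ...but barely touches the resistant ones

random.seed(1)
susceptible, resistant = POPULATION, 0

for generation in range(8):
    # Random mutation converts a few susceptible cells to resistant.
    mutants = sum(random.random() < MUTATION_RATE for _ in range(susceptible))
    susceptible -= mutants
    resistant += mutants

    # The dose never kills everything; it merely picks the survivors.
    susceptible = int(susceptible * (1 - KILL_SUSCEPTIBLE))
    resistant = int(resistant * (1 - KILL_RESISTANT))

    # Survivors regrow to carrying capacity, keeping their proportions.
    total = susceptible + resistant
    susceptible = round(POPULATION * susceptible / total)
    resistant = POPULATION - susceptible

    print(f"generation {generation}: {resistant / POPULATION:.1%} resistant")
```

Run it and the resistant fraction climbs from a rounding error to essentially the whole population within a handful of doses, exactly the dynamic Fleming warned about.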

Throughout the sixties and into the early seventies, despite those few warnings about antibiotic resistance, cases of mass bacterial outbreaks were already occurring in which antibiotics proved ineffective—including a 1967 stomach bug in Yorkshire, England, where 15 babies and young children died because resistance robbed them of effective and timely antibiotic treatment.

Until 1974, little to no effort had been put into measuring how quickly resistance builds. That year, an independent study took place that remains little publicized even today. Participating in the study was the Downing family of Boston, who had ten children and a small farmstead. The researcher in charge of the study, Dr. Stuart Levy, designed it around six batches of young chickens, half fed antibiotic-free feed and half fed growth-promoter antibiotics. The oldest Downing child, Mary—a sophomore in college—cared for the birds, which were housed in the Downings’ barn in separate pens 50 feet apart. A precise chore routine was adopted: the antibiotic-free birds were fed and cared for first, then—after a change of boots and a washing of hands—the flocks fed growth promoters. Birds from each flock were swabbed once a week, as were (via fecal swabs) each member of the Downing family and a number of neighbors, with the objective of learning how quickly antibiotic resistance spread through the flocks and through the people participating in the study.

The results came quickly. Samples taken at the beginning of the experiment showed very few bacteria with defenses against tetracycline (the drug used in the chicken feed) in the guts of the chickens, the family, or the neighbors. That was to be expected, given the random roulette of mutation. But within 36 hours those resistant bacteria had multiplied in the antibiotic-fed flock, while the drug-free flocks remained clean for a few weeks longer.

Then things changed. First, the bacteria in the antibiotic-fed flock became resistant to multiple drugs, including other families of antibiotics such as sulfas and streptomycin. Then the multidrug-resistant bacteria appeared in the flocks that had never received antibiotics and had no contact with the birds that did. And soon after, the same multidrug resistance showed up in the Downings’ fecal samples.

To the disappointment of his sponsors, Levy had demonstrated what they had hoped to disprove. Even though the feed contained just tiny doses of antibiotics, those doses selected for resistant bacteria—which not only flourished in the animals’ systems but left the animals, moved through the farm’s environment, and entered the systems of other animals and of humans in close proximity (though not of any of the neighbors, who served as the control group). This reinforced some early scientists’ concerns that these altered bacteria were an untrackable, unpredictable form of pollution.

In her comprehensive book Big Chicken, Maryn McKenna—from whom I learned much of what is written in this article—eloquently relates not only the facts given above but also the story of years and decades of industry and regulator pushback against the idea of restricting farm-use antibiotics, even into the 2000s. She shares stories of horrible illnesses and epidemic-proportion bacterial outbreaks that cost the lives of people unknowingly harboring antibiotic resistance quietly transmitted from farm to food to consumer—stories even of outbreaks traced backward by epidemiologists from the victims to the farms on which the meat was produced. The resulting reports and database entries, by the way, were then ignored and buried by regulators and industry leaders.

In the book, McKenna does an outstanding job of presenting antibiotic resistance for what it is: a silent, threatening contaminant that moves through a largely unaware society, looking for its next victim or victims. Even today it’s not a subject well covered by the media, largely due to pharmaceutical interests in keeping it quiet.

And this is where we find ourselves today, fifty years later. Although farm- and food-related antibiotic use has garnered far more attention in recent times than at any point in history, antibiotics are still widely used in the poultry, pork, and beef industries, both as growth promoters and as preventative doses to head off illness on factory farms. Several years ago, some of the major poultry providers—including Perdue—made a PR effort in the direction of “antibiotic-free.” The reason I say a PR (public relations) effort is that it was driven, at least in part, by increasing public concern about human-medical-use antibiotics used in agriculture and the subsequent risk of antibiotic resistance.

Born out of that effort—which was also driven by recognition within the industry that growth-promoter antibiotics were losing their effect—came a family of drugs called “ionophores,” which were not quite typical antibiotics and were not classified as antibiotics (conveniently?), but were essentially the industry’s stand-in for traditional antibiotics. They allowed the meat industry to advertise its product as “antibiotic-free” without risking the lost production that giving up both growth-promoter and preventative antibiotics would bring. Granted, ionophores were not used—at least not heavily—in human medicine, but that doesn’t change the farce of “antibiotic-free” in the meat industry.

Vaccines have also been adopted in the meat industries as a sort of replacement for sub-therapeutic antibiotics. Modern vaccines—including mRNA technology—have been used increasingly in recent years as an answer to the rising pushback against, and the declining effectiveness of, antibiotics used in meat production.

To summarize: the discovery of antibiotics changed life as we know it to a degree we can hardly imagine, astonishingly mitigating the risk of bacterial infection. However, the advent of antibiotics in animal agriculture quickly threatened the efficacy of human-use antibiotics through the rapid rise of resistance to early antibiotics—and, even faster, to other families of antibiotics. From the mid-1940s into the 2000s, the meat industries, pharmaceutical companies, and even regulators ignored and suppressed concerns about the threat of antibiotic resistance. Despite the attempts and posturing of certain players in the meat industries, even today the mass-producing meat purveyors appear unwilling and/or unable to completely free themselves from antibiotics in the production of human food, which only furthers the hazard of superbug infections resistant to nearly all common medical-use antibiotics.

Until the industry becomes willing to abandon its intense confinement production model, I don’t see the antibiotic story changing. The upside, however, is that farmers willing to adopt a more natural template, like the outdoor pasture-based model, can eschew antibiotics completely—which is the grassroots future of clean eating for those who know and care about the antibiotic issue.

At Pasture to Fork, we are unwavering in our stance against using antibiotics to produce your food, even making "No Antibiotics" one of our stated protocols. While we believe the risk of regularly consuming antibiotics is great enough for adults, it’s even greater for children, and investing in future generations is paramount in our opinion. And that’s The View from the Country.

P.S. I highly recommend reading Maryn McKenna's book, Big Chicken. This article does not do her work justice; it is a mere sneak peek.

More from the blog

The Folly Of the Calorie

At Pasture to Fork, we have plenty to say about Corporate Food’s sleazy labeling tricks—tricks that magically make the pseudo-food produced in corporate laboratories and food factories out to be not only desirable but quite healthy as well. And I must say, even for a real-food-passionate person like me, a walk down grocery store aisles—especially at mealtime—instigates a level of desire for even the most processed items on the shelf (I too grew up consuming these food-like substances and developed a palate memory for their allure).

The greatest advantage Big Food enjoys—which allows it to hide behind glitzy labels and wordy claims—is the disconnect between the farm and the eater. While this is convenient and desirable to many consumers as well as farmers, more and more people are waking up to the fact that their food may be vastly compromised, and that we’re increasingly a weakened species for consuming it. Convenience is addictive, however, and determining to source food locally and directly requires dedicated effort—although I would suggest it also brings considerable satisfaction and empowerment.

Direct-to-consumer farms like us, on the other hand, have little use for fancy labels. Perhaps the number one reason is that the consumer—in most cases—either visits the farm in person or browses our website seeking a trustworthy source. These are people who want to connect with the producer’s vision and philosophy. Food produced and marketed in this manner doesn’t need much of a label—only true in-person representation and quality packaging to preserve freshness and quality.

Given the attitude of acceptance among many Americans, I continue to be amazed at how few years have elapsed since the advent of government control in the food sector. Most mandatory food laws in this country are quite young and have not proven themselves capable of adding value or benefiting society. For example, the Nutrition Facts label we now take for granted was not required until 1994. With this being 2024, that makes ’94 exactly thirty years ago. Not a long time! How did people possibly know how or what to eat prior to ’94? I’m sure people did know, and maybe, just maybe, were more in tune with their food for the lack of labeling and government direction.

We believe most so-called “nutrition labeling”—especially the Nutrition Facts panel—offers less value than most of us realize. For example, the measure of calories has almost no relation to real nutrition and may cause more distraction than assistance. Yet calories are listed first on the Nutrition Facts label, in bold print. If tracking calories is of such utmost importance—or of significant value—why are two-thirds of Americans now overweight or obese? Clearly, this exemplifies how a count of calories does not equate to food quality, with Americans more overfed and undernourished than ever.

But the food police are doubling down: under a rule that took effect in 2018, the FDA requires any restaurant with more than 20 locations to provide customers with a calorie count on their menu items. Is this anything more than a perpetuation of nutrition distraction? As Einstein said, “Insanity is doing the same thing over and over again and expecting different results.”

As you may know, I’m not a fan of government attempting to influence societal behavior. But what really bothers me about the government-mandated calorie rule is that it assumes “a calorie is a calorie” regardless of its origin.
If you ate 500 calories of soda and 500 calories of broccoli, would your body respond similarly? Of course not! They may be calorically the same but are a world apart nutritionally. Don’t think your body doesn’t know the difference.

So, how is a calorie determined? First of all, it’s a unit of energy—the amount of energy needed to raise the temperature of a gram of water one degree Centigrade, to be exact. Or, in the modern version, simply 4.1868 joules of energy. That’s all, merely energy. Obviously, a calorie of gasoline energy will not serve my body like a calorie of pork chop will, and perhaps the foremost reason is that a calorie of pork chop also provides a lot of other value besides X amount of energy. Which brings home the point of the discussion: calories are such a tiny portion of the measure of a food item that they’re practically unimportant.

Perhaps the most disturbing part of all this is that it’s the best we can do in modern America. Is this really the brightest and best in food science? Please tell me it’s not. As a matter of fact, we know it’s not. Private-sector doctors and nutritionists—and perhaps everyday people who take an interest in food and how it affects us—now know far more about food and nutrition than anything coming from the ivory towers of government. Or at least they are willing to have the discussion and publish their findings. Goodness, we’re still using a nearly 150-year-old method to determine caloric content.

Besides, are we as humans not more than test tubes? Do we not break down food in a far more complex manner than a bomb calorimeter, which is how calories per gram of food are determined? We digest food efficiently or inefficiently depending on stress, nutrient deficiency, digestive enzymes, composition of gut flora, timing of the previous meal, and on and on. One day you may be able to harness 300 calories from a meal but only 200 calories from the same meal the next day, depending on your environment and individual state of being.

There are so many different diets on the market because no one really knows what you should eat. There are probably as many opinions and disagreements as there are dieticians and nutritionists. However, one thing almost universally agreed upon in the diet world is whole foods raised without chemicals and antibiotics. Eating clean whole foods comes with a lot of advantages, and literally no disadvantages. When you switch from a processed-food diet to whole foods, you don’t have to worry about counting calories because your body self-regulates. It works the way it’s designed to work. You stop over-eating because you are no longer blocking the hormonal signal that tells your body you are full. When I say whole foods, I’m not necessarily saying raw food—although that can be included. I'm saying food that has nothing in the ingredient list except that food—or very few other ingredients.

Do yourself a favor and simply stop counting calories. Stop listening to governmental guidance as to what foods you should or shouldn’t eat. Don’t choose your food based on an inaccurate label that perpetuates the myth that all calories are created equal. It’s simply not true. Let the stress of calorie counting go from your mind and body. Instead, invest in and enjoy clean whole foods—the food God intended for you to eat—and eat in a guilt-free state of mind without being all wound up about the number of calories you’re consuming. Your body recognizes whole foods and knows how to digest and metabolize them for your health and benefit.
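As a closing footnote, the conversion behind that label number is trivial arithmetic, which is rather the point. Here is a minimal sketch in Python; the helper name and the 500-Calorie examples are mine, purely for illustration (note that food “Calories” are actually kilocalories):

```python
# One calorie: the energy needed to raise 1 gram of water by 1 degree C,
# defined in modern terms as 4.1868 joules. Food "Calories" are kilocalories.
JOULES_PER_CALORIE = 4.1868

def food_calories_to_joules(food_calories: float) -> float:
    """Convert food Calories (kilocalories) to joules of raw energy."""
    return food_calories * 1_000 * JOULES_PER_CALORIE

# 500 Calories of soda and 500 Calories of broccoli: identical raw energy.
print(f"soda:     {food_calories_to_joules(500):,.0f} J")  # 2,093,400 J
print(f"broccoli: {food_calories_to_joules(500):,.0f} J")  # 2,093,400 J
```

Same number, wildly different foods: the label’s lead statistic measures heat, not nourishment, and literally cannot tell soda from broccoli.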
And that’s The View from the Country.

Climate Change

While we try to stay out of politics, occasionally we come to a point where a hot-button issue just needs to be talked about. Today, climate change is one of them. Because of the esoteric ideas surrounding this subject, much of the language used in political circles is lost on many people, which creates a vast demand for it to be discussed from an everyday, common-sense, let’s-solve-the-problem-realistically approach.

Here’s where we are, the way I see it: the political left seemingly blames climate change for everything. From inflation to an unreliable supply chain to the price of gasoline, climate change is thrown into the word salad at every turn. It’s the burgeoning apocalypse that justifies massive spending plans (often with additional legislation silently piggybacked onto them). At the same time, the political right seems to completely downplay—even scoff at—the potential for human-induced climate factors and any attempts to address them. Who’s right? I suggest the truth resides somewhere between these two positions, and that nuances exist in this discussion that are completely ignored by both sides in the scuffle for political power and influence.

I’m not a scientist or any sort of climate change expert. What I am, though, is a farmer who studies nature, keeps his ear to the ground for truth and alternative opinions, and endeavors to find common-sense middle ground. Moving forward, we’ll talk about the flawed science surrounding climate change, discuss some of the angles not mentioned in the over-politicized conversation, and consider how human activity affects the climate. Here goes:

Flawed Science – None of the computer models used have been able to function in reverse. In other words, if the models used to measure the timeline until the apocalypse are run backwards, they show us all going extinct 200 years ago. This, of course, raises questions about the ability of the models to make accurate predictions for the future. Perhaps it’s a reminder to pause in our hubris and remember that technology can only lead us so far. We must recognize that systems dependent upon information plugged in by fallible humans can come to flawed conclusions.

That said, we know that some of the arctic glaciers are receding. For example, in Alaska there are now interstates where glaciers stood only 40 years ago. The question, however, is whether this is new. Has it ever happened before? We don’t know.

And then there’s the argument about cows inducing climate change. We have literature suggesting—with some certainty—that the planet carried far more animal weight 1,000 years ago than it does today, even with factory chicken houses and multi-thousand-cow feedlots. It should give us all pause to realize that earth’s abundance is not tied to modern machinery, thousands of acres of annual crops, or 10-10-10 fertilizer. It must be tied to something else. Is there a way we can resurrect—domestically, regeneratively resurrect—that abundance?

The herbivores that were here 1,000 years ago ate only plants. They didn’t eat corn or soybeans (monoculture), and they didn’t eat fermented plants like silage or rendered processing waste (such as man has devised for feedlot cattle). Herbivores—having more than one stomach—essentially carry a fermentation vat in their gut, and feeding them fermented feeds acidifies that gut and doubles the methane produced. As you may know, cow farts—or burps (they haven’t decided which yet :)—are blamed for causing climate change, which many of us think sounds far-fetched.
If indeed it does, feeding concentrated grain diets to herbivores exacerbates the problem.

Viewpoints Climate Extremists Never Mention (and perhaps don’t know about) – A diversified plantscape (prairie) stimulates the production of methanotrophic bacteria (yes, they’re real; you can look them up). These bacteria—in a healthy, diversified ecosystem—absorb methane equivalent to that produced by over 1,000 cows per acre. The problem is that today very little of the acreage devoted to herbivores (livestock) is healthy perennial prairie ecosystem, where herbivores prune and move according to the template provided in nature (wild herds chased by predators). Methanotrophic bacteria don’t grow under corn or monoculture; they don’t grow under overgrazed land; they don’t grow under asphalt; they don’t grow under feedlots or factory farms. They require a diversified perennial landscape. This, once again, speaks to how nature always provides checks and balances in the ecosystem, if only we lay down our hubris long enough to notice. The problem is that the scientists who study these things study extremely dysfunctional ecosystems, and then extrapolate data from this completely inappropriate, dysfunctional database. Science often is not objective but is approached with intent to prove a viewpoint.

The Australian scientist Walter Jehne—along with scientists around the world—has determined that the temperature regulator of the planet has little to do with greenhouse gases (GHGs), which is what climate change “experts” have fixated on for many years. Rather, it’s about water condensation. The truth is, only 5% of planetary temperature is regulated by GHGs; 95% is the energy it takes to condense water. In order to condense, water must have a particle to condense on—it can’t just condense on nothing. The main thing it condenses on is bacteria, specifically bacteria exuded by foliage. Have you ever noticed, in areas of heavy foliage—such as mountainous or heavily wooded areas—how in the early morning a cloud, or mist, rises and hangs heavy during the temperature inversion as the sun begins to heat the atmosphere? This is water that, after marrying to bacteria, is condensing and vaporizing into the atmosphere, which in turn creates clouds that bring rainfall, which cools the earth.

This explains why, as the climate changes, the dry areas are getting drier and the wet areas are getting wetter—something that bewilders even climate scientists. In Jehne’s condensation theory, the planet is essentially a big radiator. The physics of the planet is that it wants to be balanced. So if agriculture destroys vegetation in one area (via plowing or overgrazing), the planet must cool itself somewhere else, and it does so in places where vegetation exists. There the moisture can condense because of the presence of bacteria from foliage, which vaporizes to form clouds and precipitation. In other words, the moisture is concentrated there.

Does Human Activity Affect Climate (and if so, how)? – In my opinion, it’s no longer a question whether humans are affecting the climate. Allen Williams, from the regenerative farming consulting group Understanding Ag, relates their experience working with the 30,000-acre Las Damas ranch in the Chihuahuan desert of Mexico. The area gets only about 8 inches of rain a year—and still has horrible erosion. For as long as any living generation remembers, the desert there has grown rather than receded.
Starting in 2010, Understanding Ag worked closely with the ranch to develop cattle water and fencing in some of the worst areas of the ranch, essentially to expand the areas where vegetation exists. At the conference where he and I met last winter, Allen showed pictures of a decade of progress since they began working with this ranch. Not only has the amount of plant material increased dramatically—what were large areas of desert devoid of grass are now a sea of green—but more importantly, after only ten years they’re seeing changes to the microclimate, to where Las Damas now gets rainfall that seems to follow the property line. In other words, they get more rain than the neighboring ranches do. This is due to the amount of grass and other plant material (think bacteria exuded from green foliage) on the ranch compared to their neighbors, who are not using regenerative practices.

I suggest that if the microclimate of a 30,000-acre region can be influenced in a decade, then little doubt remains about whether human activity can affect the overall climate of the earth. As of 2019, the USDA had recorded 897,400,000 acres of farmland in the US—nearly thirty thousand times the acreage of Las Damas. Most of these acres are either in monocrops or in miserably mismanaged grazing land. Monocropping, by design, requires either tillage or heavy applications of chemicals—both of which destroy soil. The same is true for unmanaged grazing land—meaning land not managed to prevent overgrazing or under-grazing, both of which have negative effects on the soil and water cycle and cause desertification.

In the span of about 200 years, the soils of the American Midwest went from what we think was about 8% organic matter (which is carbon) to an average of 1.5%. Where did the carbon go? By and large, it was released into the atmosphere because humans uncovered the soil via tillage in order to grow annual monoculture crops. Not only are our soils down to bare bones, but our air is polluted with carbon that needs to be returned to the soil in order to have a healthy ecosystem. Never before in history have humans had the means of raping the soil to this extent—made possible by mechanical tillage as well as chemical technology.

What if all climate change funds and efforts were channeled into growing a managed, diversified, perennial plantscape on 70 percent of these nearly 900 million acres? Imagine how much carbon could be sequestered from the atmosphere, not to mention the methanotrophic bacteria produced to sequester methane from the air. This may sound like a pipe dream. But maybe it isn’t, in light of the fact that 70% of all grain grown in the US goes to feed herbivores that are not designed to metabolize it.

Although human-induced climate change is a very real possibility, I don’t see it as a burgeoning apocalypse. However, it is a very real threat to our domestic ability to feed ourselves. This is not a problem government can fix via massive spending bills. Yes, they could stop throwing taxpayer money around in the form of crop subsidies, which would take away the incentive for the overproduction of monocrops such as corn and soybeans. But government will not stop or even slow climate change by limiting the use of fossil fuels or eliminating animal agriculture. The solution must come from the people. Each of us has a responsibility to decrease or eliminate our own portion of the demand for annual-crop-based foods and to create demand for regeneratively produced perennial-crop-based foods.
If humans have created this problem, then humans will have to fix it. We don’t have to look to the ivory towers and the “experts” to do it. And that’s The View from the Country.

Can the Food and Farming Crisis be Resolved?

Given the modern worldview that independence of the individual is everything, it’s probably a bit of a shocker when I say I view human independence as an illusion—a mirage in the distance that will always remain just that: in the distance. Yes, we have an aura of independence. We have mechanized transportation that’s as easy as getting in and turning the key, and we have devices in the palms of our hands that literally give us access to the knowledge of the world in milliseconds. That creates an impression of independence in the sense that we can go places, do things, and know things that were impossible for most of human history. But even with these modern technologies, we are dependent on other people. No man is an island. And I’m not even referring to the psychological aspect of human nature that wants to be connected to other people. It’s simply an irrefutable fact that humans have always been dependent on each other—community and kin—as well as on the surrounding ecosystem for survival.

But as life became easier due to industrial and technological advancements, many of us became at least a little obsessed with the idea of being our own person apart from others. That in itself may be okay, but I’ve come to agree with Stephen Covey in The 7 Habits of Highly Effective People, where he says: “Independence is the paradigm of I—I can do it; I am responsible; I am self-reliant; I can choose. [On the other hand] Interdependence is the paradigm of we—we can do it; we can cooperate; we can combine talents and abilities and create something greater together….” A little later in the same chapter he writes: “Life is, by nature, highly interdependent. To try to achieve maximum effectiveness through independence is like trying to play tennis with a golf club—the tool is not suited to the reality. Interdependence is a far more mature, more advanced concept.”

Granted, the book Covey wrote is about effectiveness, and that may not be the goal of some people today—although I say it should be. What greater aspiration could we have than to be effective in our lives? Whether in our work, with our family, or even in our faith, we should aspire to be effective, which is defined as “adequate to accomplish a purpose; producing the intended or expected result.”

Like always, I would like to turn this discussion to the food arena. How does it apply? Number one, through the industrialization of human food within the last fifty to seventy-five years—not least the emergence of a food-processing industry that brought a great many convenience foods into existence, distributing them to every local grocery and supermarket. Foods that traditionally were sourced directly from local farms and home gardens now come from nameless, faceless entities and conjure a fantasy not only of human independence from the natural elements man has always relied on for sustenance, but also a false sense of food security and independence. Modern society forgets that food and farming are inextricably linked, regardless of whether food comes from the supermarket in a plastic package, from the garden in the back yard, or from a local farm. That man would no longer be bound—yes, helplessly dependent—to the natural elements of soil, air, and water is one of the biggest myths of all time.

In his writings, Joel Salatin often refers to our interaction with the earth and our dependence on its fruits as our “ecological umbilical.” At first I thought it too strong a term, but I’ve changed my mind.
Our dependence on the earth and its natural elements is not unlike the utter reliance of an unborn baby on the flow of nutrients through the umbilical connection with its mother. In the foreword to Forrest Pritchard’s excellent book Gaining Ground, Joel penned these words: “We cannot escape our responsibilities to, nor our interactions with, soil, air, and water—the basic ingredients in the farmer’s alchemy…. Unlike other vocations that are arguably more or less necessary, farming is basic to human existence. Because it is at the root of civilization, it has the greatest capacity to either heal or hurt humankind’s planetary nest. As co-stewards of this great creation, we all owe future generations the benefit of knowing something about farming, food production, and land care. Few intellectual journeys could be this necessary and far-reaching.” Isn’t that an irrefutable truth?

As the farmer population continues to decline—largely due to either age or bankruptcy—it will become more obvious than ever how dependent society is on farming and food production. Agricultural statistics are concerning in terms of farmer age, although it’s a little-known concern in society and is not touched by the mainstream news.

One of the most abnormal aspects of modern America is the fact that many regions are literal food deserts, meaning no food is being raised in the vicinity. This is true not only in cities and urban areas but in many rural areas as well. To be sure, rural areas may have farms—even active, working farms—but they are usually in the commodity business and are not raising actual food for sale to the local populace. Whether they have corn, soybeans, wheat, or hay in the fields, it’s a commodity that goes for animal feed. They may have hogs, dairy cows, beef steers, or a barn full of chickens, but there’s no food to be obtained from the farm. In this country, by and large, food is acquired from grocery stores and supermarkets, not from farms.

Most farmers today contract with a grain, meat, or dairy processor and are merely producers of commodities—feudal serfs who dance to a corporate whistle. Major multinational corporations like Cargill, ADM (Archer Daniels Midland), Tyson, and Perdue purchase the majority of raw materials entering the food production stream. Rural farming communities throughout the United States have dwindled to near ghost towns, and most farm commodities are subsidized with tax revenue to prop up less-than-sustainable farm income streams—which in turn benefits the corporate buyers of raw farm commodities because they can purchase at the cost of production or less.

New census data released by the USDA in February provides reason for concern, again. The number of farms operating in the US and the number of farm acres have both fallen significantly. There were 141,733 fewer farms in 2022 than in 2017, and the number of farm acres shrank by a whopping 20 million in the same five-year period. This is very disturbing! Yes, we can shrug our shoulders and say there’s still plenty of food in the supermarket, and that’s true. But that food is increasingly not produced domestically. As a nation we now import 20% of our food. That’s one out of every five bites. If that doesn’t pose a national security concern, I don’t know what does. What’s the solution to this?
While it’s a complex problem—particularly on a national scale (I happen to think most large-scale problems are best solved on a local or regional level)—I believe step number one is to de-corporatize farming and food production. While a number of small farms have effectively exited the corporate commodity system, they are few and far between, and we need many more to make this move. The problem with being in the commodity system is that the corporate aggregators who buy raw farm commodities hold farms and farmers hostage via price. Given that most farmers have little to no control over the price they’re paid for their goods, farming has become the hard-scrabble vocation it is, which then turns the next generation away. Thus we have an unprecedentedly aged farmer demographic, which means that in the next 15-20 years over fifty percent of our privately held farmland will change hands not by choice, but by death. Who will take it over? Will they know how to manage it? If this land is not taken over by people who know how to produce food from it, we’ll undoubtedly import even more food from foreign interests.

Throughout history, people—individuals—have always teamed up to instigate change. And they still do. Such as the small-scale food producers who take the path of lunatics and are driven to build a different system by producing real food for real people within their region. That’s us. But more importantly, change is being instigated by people who are sick (literally) and tired of being victims of Big Food and its unpronounceable ingredients, empty claims, and tasteless pseudo-food, and who opt out to find real food-producing farms in their region. That’s you.

This food partnership is the crux of interdependence. Small-scale farms like us cannot be independent, any more than today’s society is independent in food acquisition. To me, the folks who recognize the reality of this opportunity—and leverage it—portray quite well the irrefutable law of interdependent community and become the solution to one of the foremost threats facing us as an independent western nation. As always, the people hold the solution, in the form of a food revolution. Let’s hope it comes quickly. And that’s The View from the Country.