
Additional Corona Thoughts

written by Sam Fisher
posted on March 13, 2020

Just a few additional calming thoughts on the corona scare. I’m not superstitious, but I am making an attempt to keep things in perspective while still being fully realistic on this Friday the 13th. Here goes…

To be honest, I have a general distrust of most info coming out of major national news channels, simply because of their reputation for partial truths, propaganda, and partisan partiality. I’m not necessarily a Trump supporter, but neither do I approve of our largely liberal news media, and I suspect this “pandemic” is being overwrought purely for political purposes. It certainly wouldn’t be the first time politics manufactured or embellished an event to sway public opinion. Part of the reason I feel this way is that the majority of information being put out there comes from major news channels, as well as the CDC and WHO. In most cases, balanced perspective or truth is not found at the center of attention, but at the fringes and edges, away from it.

Everyone is talking about it being highly contagious, and that may be true, but contagion is 100% predicated on the germ theory, which prevails in modern medicine. However, what we haven’t been taught is that there is always a terrain in which germs thrive or die. This competing view was first promoted in 19th-century France. Louis Pasteur (inventor of pasteurization) advocated the notion of germs as the cause of disease, while another French scientist, Antoine Bechamp, promoted a conflicting theory known as the “cellular theory” or “terrain theory” of disease. Bechamp’s argument was that the germs Pasteur was so terrified of were opportunistic in nature. This caused quite a controversy, and the two men were bitter foes.

To prevent illness, Bechamp advocated not the killing of germs but the cultivation of health through diet, hygiene, and healthy lifestyle practices such as fresh air, exercise, adequate hydration, a good diet, etc. The idea is that if the person has a strong immune system and good tissue quality (or “terrain” as Bechamp called it), the germs will not manifest in the person, and they will have good health. It is only when their health starts to decline (due to personal neglect and poor lifestyle choices) that they become victim to infections.

To treat illness, Bechamp’s cellular theory also applied. Bechamp was less concerned with killing the infection and focused more on restoring the health of the patient’s body through healthy lifestyle choices and proper immune support. Bechamp saw the infection as a footnote to the state of illness and not the primary cause. As the person restored health through diet, hygiene, and detoxification, the infection went away on its own–without needing measures to kill it.

Pasteur and Bechamp had a long and often bitter rivalry regarding who was right about the true cause of illness. Ultimately Pasteur’s ideas were accepted by society, and Bechamp was pretty much forgotten. The practice of Western medicine is based on Pasteur’s germ phobia, which gives rise to the use of vaccinations, antibiotics, and other anti-microbials.

The irony is that towards the end of his life, Pasteur renounced the germ theory and admitted that Bechamp was right all along. History has it that his last words on his death bed were “The microbe is nothing, the terrain is everything.” In the 1920’s medical historians also discovered that most of Pasteur’s theories were plagiarized from Bechamp’s early research work.

I unashamedly embrace the terrain theory, although most of western society has been taught to believe the germ theory. I have seen too many examples both as a caretaker of animals and an observer of human health to second guess it. In observing the animals in my care I have often noticed afflictions in one individual when the majority are free of it. This is due to terrain. The one had weaknesses that allowed opportunistic disease to proliferate. Germs are everywhere and even exist inside of us in a symbiotic relationship. Like Bechamp noticed in his research, it is only when the tissue of the host becomes damaged or compromised that these germs begin to manifest as a prevailing symptom (not cause) of disease.

Because of my belief in the terrain, I doubt the claim that no natural immunity exists. Sure, it’s supposedly a new virus, but have the thousands who recovered all been hospitalized and on antibiotics? Doubtful. Corona is, yes, believed to be highly contagious, but the media and modern medicine–due to adherence to the germ theory–treat everyone as susceptible, with no regard for the human immune system’s ability to ward off illness, yes, even new strains. The dire prediction of hospitals overflowing and medical professionals having to decide who gets oxygen and who dies is all based on the theory that everyone who comes in contact with the virus will succumb to it. In the mid-’90s, when the UK culled thousands of cattle and sheep due to mad cow disease, it, too, was said to allow no natural immunity. However, not all farm animals contracted the disease. In fact, Newman Turner–a farmer who strongly adhered to natural farming and animal husbandry practices–invited the government to expose his cows to the disease. He wanted to prove that immunity existed, and that the disease was due to poor animal husbandry practices. Of course they refused, but he felt sure–as do I–that lack of immunity is a physical health (terrain) issue rather than the inevitable result of a new germ strain.

I realize the terrain in many Americans is likely compromised due to poor diet and lifestyle choices, but this is why I am passionate about advising people not to panic. When in a panic people don’t think clearly, and there are measures–many measures–we can take to strengthen our immune systems and protect ourselves. For example, numerous reports are swirling around that things as simple as high doses of vitamin C and elderberry juice can knock out coronavirus. But as always, folks are already pointing fingers at our president and his administration because of the way they handle things. It seems to be the American way now: regardless of what comes and how it’s handled, there will always be people who bash the current administration for not “saving” us. This is salvation-by-legislation thinking, and it causes us to increasingly lose our freedoms because we look to government to “save” us. If coronavirus comes anywhere close to being the pandemic it’s predicted to be, be assured, we’ll see more regulation become mandatory in the name of healthcare, national security, food safety, etc. Being of libertarian inclination, that bothers me.

With all this being said, we still don’t know if corona is the threat it’s forecast to be. And that’s where the fear and paranoia come in: in not knowing. This is why society is practically going crazy–the media plays it up hour by hour, minute by minute, and no one knows how bad it will be. Plus, there is so little actual information available on real-life cases or the extent to which patients suffer. The news has little to say of the 68,000 who recovered from the virus, which, again, gives me reason to think it’s played up for a larger agenda. If Americans were given a balanced perspective on recovery rates, real-life patient experiences, how many hospitalizations occurred, and how it feels when one contracts the virus, the public reaction would likely be much more sane. But as it is, most of what we hear and see in the news is based on rising confirmed cases and numbers of deaths, mixed in with global numbers (for effect), and in general, gloom and doom–and we gullible Americans suck it up. This results in widespread fear and panic, which I suspect is the intent of the “larger agenda” behind this.

Perhaps I have too dour an attitude toward mainstream media and our government, but for what it’s worth, it’s The View from the Country.

Quote Worth Re-Quoting –
“I have yet to see a piece of writing, political or non-political, that does not have a slant. All writing slants the way a writer leans, and no man is born perpendicular.”  ~ E. B. White

More from the blog

Climate Change

While we try to stay out of politics, occasionally we come to the point where a hot-button issue just needs to be talked about. Today, climate change is one of them. Because of the esoteric ideas surrounding this subject, much of the language used in political circles is lost on many people, which creates a vast demand for it to be discussed from an everyday, common-sense, let’s-solve-the-problem-realistically approach.

Here’s where we are, the way I see it: the political left seemingly blames climate change for everything. From inflation to an unreliable supply chain to the price of gasoline, climate change is thrown into the word salad at every turn. It’s the burgeoning apocalypse that justifies massive spending plans (often with additional legislation silently piggybacked into them). At the same time, the political right seems to completely downplay—even scoff at—the potential for human-induced climate factors and/or attempts to address them. Who’s right? I suggest the truth resides somewhere between these two positions, and that nuances exist in the discussion that are completely ignored by both sides in the scuffle for political power and influence. I’m not a scientist or any sort of climate change expert. What I am, though, is a farmer who studies nature, keeps his ear to the ground for truth and alternative opinions, and endeavors to find common-sense middle ground. Moving forward we’ll talk about the flawed science surrounding climate change, discuss some of the angles not mentioned in the over-politicized conversation, and consider how human activity affects the climate. Here goes:

Flawed Science – None of the computer models used have been able to function in reverse. In other words, if the models used to measure the timeline until the apocalypse are run backwards, they show us all going extinct 200 years ago. This, of course, raises questions about the ability of the models to make accurate predictions for the future.
Perhaps it’s a reminder to pause in our hubris and remember that technology can only lead us so far. We must recognize that systems dependent upon information plugged in by fallible humans can come to flawed conclusions. That said, we know that some of the arctic glaciers are receding. For example, in Alaska there are now interstates where glaciers were only 40 years ago. The question, however, is whether or not it’s new. Has it ever happened before? We don’t know.

And then there’s the argument about cows inducing climate change. We have literature suggesting—with some certainty—that the planet carried far more animal weight 1000 years ago than it does today, even with factory chicken houses and multi-thousand-cow feedlots. It should give us all pause to realize that earth’s abundance is not tied to modern machinery, thousands of acres of annual crops, or 10-10-10 fertilizer. It must be tied to something else. Is there a way we can resurrect—domestically, regeneratively resurrect—that abundance? The herbivores that were here 1000 years ago only ate plants. They didn’t eat corn or soybeans (monoculture), and they didn’t eat fermented plants like silage or rendered processing waste (such as man has devised for feedlot cattle). Herbivores—having more than one stomach—essentially have a fermentation vat in their gut, and when fed fermented feeds the gut acidifies and methane production doubles. As you may know, cow farts—or burps (they haven’t decided which yet :)—are blamed for causing climate change, which many of us think sounds far-fetched. But if they do contribute, feeding concentrated grain diets to herbivores only exacerbates the problem.

Viewpoints Climate Extremists Never Mention (and perhaps don’t know about) – A diversified plantscape (prairie) stimulates the production of methanotrophic bacteria (yes, they’re real; you can look them up). These bacteria—in a healthy diversified ecosystem—absorb methane equivalent to that produced by over 1000 cows per acre.
The problem is, today very little acreage devoted to herbivores (livestock) is a healthy perennial prairie ecosystem, where herbivores prune and move according to the template provided in nature (wild herds chased by predators). Methanotrophic bacteria don’t grow under corn or monoculture, they don’t grow under overgrazed land, they don’t grow under asphalt, and they don’t grow under feedlots or factory farms. They require a diversified perennial landscape. This, once again, speaks to how nature always provides checks and balances in the ecosystem, if only we lay down our hubris long enough to notice. The problem is that the scientists who study these things study extremely dysfunctional ecosystems, and then extrapolate data based on this completely inappropriate, dysfunctional database. Science often is not objective, but is approached with intent to prove a viewpoint.

The Australian scientist Walter Jennings—along with scientists around the world—has determined that the temperature regulator of the planet has little to do with greenhouse gases (GHG’s), which is what climate change “experts” have been fixated on for many years. Rather, it’s about water condensation. The truth is, only 5% of planetary temperature is regulated by GHG’s; 95% is the energy it takes to condense water. In order to condense, water must have a particle to condense on—it can’t just condense on nothing. The main thing it condenses on is bacteria, specifically the bacteria that are an exudate from foliage. Have you ever noticed how, in areas of heavy foliage—such as mountainous or heavily wooded areas—a cloud, or mist, rises in the early morning and hangs heavy during the time of temperature inversion as the sun begins to heat the atmosphere? This is water that, after marrying to bacteria, is condensing and vaporizing into the atmosphere, which in turn creates clouds that bring rainfall, which cools the earth.
This explains why in climate change the dry areas are getting drier and the wet areas are getting wetter. Even climate scientists are bewildered by this. But in Jennings’ condensation theory the planet is essentially a big radiator. The physics of the planet is that it wants to be balanced. So, if agriculture destroys vegetation in one area (via plowing or overgrazing), the planet must cool itself somewhere, and does so in places where vegetation exists. There the moisture can condense because of the presence of bacteria from foliage, which vaporizes to form clouds and precipitation. In other words, the moisture is concentrated there.

Does Human Activity Affect Climate (if so, how)? – In my opinion, it’s no longer a question whether or not humans are affecting the climate. Allen Williams, from the regenerative farming consulting group Understanding Ag, relates their experience in working with the 30,000-acre Las Damas ranch in the Chihuahuan desert of Mexico. The area gets only about 8 inches of rain a year—and still has horrible erosion. For as long as any living generation remembers, the desert has grown rather than receded. Starting in 2010, Understanding Ag worked closely with the ranch to develop cattle watering and fencing in some of the worst areas of the ranch, essentially to expand the areas where vegetation exists. At the conference where he and I met last winter, Allen showed pictures of a decade of progress since they began working with this ranch. Not only has the amount of plant material increased dramatically—what were large areas of desert devoid of grass are now a sea of green—but more importantly, after only ten years they’re seeing changes to the microclimate, to where Las Damas now gets rainfall that seems to follow the property line. In other words, they get more rain than the neighboring ranches do.
This is due to the amount of grass and other plant material (think bacteria exudates from green foliage) on the ranch compared to their neighbors who are not using regenerative practices. I suggest that if the micro-climate can be influenced in a 30,000-acre region in a decade, then little doubt remains whether or not human activity can affect the overall climate of the earth.

As of 2019, the USDA had recorded 897,400,000 acres of farmland in the US, which is nearly thirty thousand times the acreage at Las Damas. Most of these acres are either in monocrop or in miserably mismanaged grazing land. Monocrop, by design, requires either tillage or heavy applications of chemicals—both of which destroy soil. The same is true for unmanaged grazing land—meaning land not managed to prevent overgrazing or under-grazing, both of which have negative effects on the soil and water cycle and cause desertification. In the span of about 200 years, the soils of the American Midwest went from what we think was about 8% organic matter (which is carbon) to an average of 1.5%. Where did the carbon go? By and large, it was released into the atmosphere because humans uncovered the soil via tillage in order to grow annual monoculture crops. Not only are our soils down to bare bones, but our air is polluted with carbon that needs to be returned to the soil in order to have a healthy ecosystem. Never before in history have humans had the means of raping the soil to this extent—made possible by mechanical tillage as well as chemical technology.

What if all climate change funds and efforts were channeled into growing a managed, diversified perennial plantscape on 70 percent of these nearly 900 million acres? Imagine how much carbon could be sequestered from the atmosphere, not to mention the methanotrophic bacteria produced to sequester methane from the air. This may sound like a pipe dream.
But maybe it isn’t, in light of the fact that 70% of all grain grown in the US goes to feed herbivores that are not designed to metabolize it. Although human-induced climate change is a very real possibility, I don’t see it as a burgeoning apocalypse. However, it’s a very real threat to our domestic ability to feed ourselves. This is not a problem government can fix via massive spending bills. Yes, they could stop throwing taxpayer money around in the form of crop subsidies, which would take away the incentive for the overproduction of monocrops such as corn and soybeans. But government will not stop or even slow climate change by limiting the use of fossil fuels or eliminating animal agriculture. The solution must come from the people. Each of us has a responsibility to decrease or eliminate our own portion of the demand for annual-crop-based foods and create demand for regeneratively produced perennial-crop-based foods. If humans have created this problem, then humans will have to fix it. We don’t have to look to the ivory towers and the “experts” to do it. And that’s The View from the Country.
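For readers who like to check the math, the acreage comparison above is simple arithmetic using only the two figures quoted in the article (the 2019 USDA farmland total and the size of Las Damas); a quick sketch:

```python
# Back-of-the-envelope check of the acreage comparison in the article.
us_farmland_acres = 897_400_000  # USDA 2019 US farmland figure quoted above
las_damas_acres = 30_000         # size of the Las Damas ranch

# How many Las Damas-sized ranches fit into US farmland?
ratio = us_farmland_acres / las_damas_acres
print(f"US farmland is about {ratio:,.0f} times the size of Las Damas")  # ~29,913

# The proposed 70% perennial-plantscape scenario:
perennial_acres = 0.70 * us_farmland_acres
print(f"70% of US farmland is about {perennial_acres:,.0f} acres")
```

So the correct comparison is roughly thirty thousand Las Damas-sized ranches, and the 70% scenario works out to a bit over 628 million acres.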

No Antibiotics!

At Pasture to Fork, we like to point out that we don’t use antibiotics in the production of the foods we raise. Why is this important? Weren’t antibiotics deemed a miracle drug that proved to be a powerful life-saving tool when they first became available? Yes, that is true, and I’ll even go so far as to say they are still a major lifesaver in human medical care eighty years later. However, today we’ll talk about how they have been abused and why it’s crucial that we limit their use to human medicine.

The first known antibiotic, dubbed penicillin, was discovered accidentally by a researcher named Alexander Fleming in 1928. Though discovered in the twenties, it wasn’t widely propagated until the 1940’s when—after saving lives miraculously in instances like the Cocoanut Grove nightclub fire—the US government invested in its production to be used on the war front in WWII. It’s hard for us to imagine a world before antibiotics. Everything from paper cuts to childbirth had the potential to kill via bacterial infection. Even minor wounds to soldiers in warfare—upon becoming bacterially infected—were cause for amputations, extreme illness, and death, which is why the government had an interest in the mass production of penicillin. Penicillin was quickly followed by other antibiotics such as aureomycin, tetracycline, and the like—and it quickly became a race between pharmaceutical companies in the late 1940’s to develop the next family of antibiotics that could then be patented. The formula for penicillin was never proprietary—being seen as a product for the greater good of society—but pharmaceutical companies, of course, each wanted their own piece of the pie.

In 1948, a researcher named Thomas Jukes, who specialized in animal nutrition while working for Lederle Laboratories, discovered (again, by accident) a ground-breaking new twist to antibiotic use.
In an effort to find new ways to cut costs for poultry farmers following the sharp post-WWII decrease in demand, his research entailed using laboratory waste from antibiotic production as a supplement in chicken feed. Having had indications suggesting growth-promoter properties, Jukes performed one of the first controlled research projects on chickens and found the group fed antibiotic waste to be markedly bigger at the end of the 25-day feeding trial—the discovery of the growth-promoter effect of feeding antibiotics to animals. This discovery opened a whole new frontier for the pharmaceutical industry and a tremendous new market for growth promoters in animal agriculture. It also made for an unprecedented hubristic attitude in the world of animal agriculture, which in turn led to the confinement animal feeding operations (CAFOs) of today. But the chickens had yet to come home to roost (pun intended).

In the early years of antibiotics as growth promoters, the common mentality among both manufacturers and farmers was “if a little is good, more is better,” and growth-promoter antibiotics were largely unregulated, which resulted in heavy use, inexact dosages, and the like. Regulators, being enamored along with the industry, looked the other way and didn’t interfere. For context, let’s remember this was the era when the chemical DDT was considered a marvelous and life-changing invention, only to be banned later.

Even early on there was concern among a few (only a few) scientists about the prospect of antibiotic resistance. In his 1945 Nobel prize acceptance speech, Alexander Fleming warned that the development of resistance had the potential to ruin the miracle of antibiotics. Resistance is the term used to describe the ability of bacteria to mutate and overcome the proficiency of antibiotics.
The thesis is that an application of antibiotics never kills all the bacteria, allowing the survivors to gain genetic resistance not only to that particular antibiotic, but to other antibiotics as well. Throughout the sixties and into the early seventies, despite the scant warnings about antibiotic resistance, cases of mass bacterial outbreaks in which antibiotics proved ineffective were already occurring—including a 1967 stomach bug in Yorkshire, England, where 15 babies and young children died because resistance robbed them of effective and timely antibiotic treatment. Still, little to no effort was put into measuring how quickly resistance builds—until 1974, when an independent study took place that even today remains little-publicized.

Participating in the study was the Downing family from Boston, who had ten children and a small farmstead. The researcher in charge of the study, Dr. Stuart Levy, designed it to include six batches of young chickens, half fed antibiotic-free feed and half fed growth-promoter antibiotics. The oldest Downing child, Mary—a sophomore in college—cared for the birds, which were housed in the Downings’ barn in separate pens 50 feet apart. A precise chore routine was adopted whereby the antibiotic-free birds were fed and cared for first, then the flocks fed growth promoters, after a change of boots and washing of hands. Birds from each flock were swabbed once a week, as were fecal samples from each member of the Downing family—and a number of neighbors as well—with the objective of learning how quickly antibiotic resistance spread through the flocks, as well as through the people participating in the study.

The results came quickly. Samples taken at the beginning of the experiment showed very few bacteria in the guts of the chickens, family, or neighbors containing defenses against tetracycline (the drug used in the chicken feed). That was to be expected, given the random roulette of mutation.
But within 36 hours, those bacteria multiplied in the antibiotic-fed flock, while the drug-free flocks remained clean for a few weeks longer. Then things changed. First the bacteria in the antibiotic-fed flock became resistant to multiple drugs, including other families of antibiotics like sulfas, streptomycin, etc. Then the multidrug-resistant bacteria appeared in the flocks that never received antibiotics and had no contact with the birds that did. And soon after, the same multidrug resistance showed up in the Downings’ fecal samples. To the disappointment of his sponsors, Levy had demonstrated what they had hoped to disprove. Even though the feed contained just tiny doses of antibiotics, those doses selected for resistant bacteria—which not only flourished in the animals’ systems, but left the animals, moved through the farm’s environment, and entered the systems of other animals and of humans in close proximity (but did not spread to any of the neighbors—who served as the control group). This served to reinforce some of the early scientists’ concerns that these altered bacteria were an untrackable, unpredictable form of pollution.

In her comprehensive book, Big Chicken (where I learned much of what is written in this article), Maryn McKenna eloquently relates not only the facts given above, but also tells the story of years and decades of industry and regulator pushback against the idea of restricting farm-use antibiotics—even into the 2000’s. She shares stories of horrible illnesses and epidemic-proportion bacterial outbreaks costing the lives of people who were unknowingly harboring antibiotic resistance quietly transmitted from farm to food to consumer—stories even of outbreaks traced backward by epidemiologists from the victims to the farms on which the meat was produced. The resulting reports and database entries, by the way, were then ignored and buried by regulators and industry leaders.
In the book, McKenna does an outstanding job of presenting antibiotic resistance for what it is: a silent, threatening contaminant that moves through a largely unaware society, looking for its next victim or victims. Even today, it’s not a subject well covered by the media, largely due to the pharmaceutical interest in keeping it hush.

And this is where we find ourselves today, fifty years later. Although farm- and food-related antibiotic use has garnered far more attention in recent times than at any time in history, antibiotics are still widely used in the poultry, pork, and beef industries, both as growth promoters and as preventative doses to ward off illness on factory farms. Several years ago, some of the major poultry providers—including Perdue—made a PR effort in the direction of “antibiotic-free.” The reason I call it a PR (public relations) effort is that it was driven, at least in part, by an increasing concern among the people regarding human-medical-use antibiotics used in agriculture and the subsequent risk of antibiotic resistance. Born out of that effort—which was also driven by recognition within the industry that growth-promoter antibiotics were losing their effect—came a family of drugs called “ionophores,” which were not quite the typical antibiotic and were not classified as antibiotics (conveniently?), but were essentially the industry’s stand-in for traditional antibiotics. This allowed the meat industry to advertise their product as “antibiotic-free” without taking the risk of losing production due to the loss of both growth-promoter and preventative antibiotics. Granted, ionophores were not used—at least not as heavily—in humans, but that doesn’t change the farce of “antibiotic-free” in the meat industry. Vaccines have also been adopted in the meat industries as a sort of replacement for sub-therapeutic antibiotics.
Modern vaccines—including mRNA technology—have been used increasingly in recent years as a solution to the rising pushback against—and loss of effectiveness of—antibiotics used in meat production.

To summarize: the discovery of antibiotics changed life as we know it to a degree we can hardly imagine, mitigating the risk of bacterial infection astonishingly. However, the advent of antibiotics used in animal agriculture quickly threatened the efficacy of human-use antibiotics due to the rapid rise of resistance to early antibiotics—and even faster, to other families of antibiotics. From the mid-1940’s into the 2000’s, the meat industries, pharmaceutical companies, and even regulators ignored and repressed concerns involving the threat of antibiotic resistance. Despite the attempts and posturing of certain players in the meat industries, even today it appears as if the mass-producing meat purveyors are unwilling and/or unable to completely wean themselves off antibiotics in the production of human food, which only furthers the hazard of superbug infections that are resistant to nearly all common medical-use antibiotics. Until the industry becomes willing to abandon its intense confinement production model, I don’t see the antibiotic story changing.

However, the upside to this is that farmers who are willing to adopt a more natural template, like the outdoor pasture-based model, can completely eschew antibiotics, which is the grassroots future of clean eating for those who know and care about the antibiotic issue. At Pasture to Fork, we are unwavering in our stance against using antibiotics to produce your food, even making “No Antibiotics” one of our stated protocols. While we believe the risk of regularly consuming antibiotics is great enough for adults, it’s even greater for children, and investing in future generations is paramount in our opinion. And that’s The View from the Country.

P.S. I highly recommend reading Maryn McKenna’s book, Big Chicken.
This article does not do her work justice but is a mere sneak peek.

Can the Food and Farming Crisis be Resolved?

Given the modern worldview that independence of the individual is everything, it’s probably a bit of a shocker when I say I view human independence as an illusion—a mirage in the distance that will always be just that—in the distance. Yes, we have an aura of independence, given that we have mechanized transportation that’s as easy as getting in and turning the key, and devices in the palm of our hand that literally give us access to the knowledge of the world in milliseconds. That creates an impression of independence in the sense that we can go places, do things, and know things that were impossible for most of human history. But even with these modern technologies, we are dependent on other people. No man is an island. And I’m not even referring to the psychological aspect of human nature that wants to be connected to other people. It’s just an irrefutable fact that humans have always been dependent on each other—community and kin—as well as the surrounding ecosystem for survival. But as life has become easier due to industrial and technological advancements, many of us have grown at least a little obsessed with the idea of being our own person apart from others. That in itself may be ok, but I’ve come to agree with Stephen Covey in his The 7 Habits of Highly Effective People, where he says:

Independence is the paradigm of I—I can do it; I am responsible; I am self-reliant; I can choose. [On the other hand] Interdependence is the paradigm of we—we can do it; we can cooperate; we can combine talents and abilities and create something greater together….

A little later in the same chapter he writes:

Life is, by nature, highly interdependent. To try to achieve maximum effectiveness through independence is like trying to play tennis with a golf club—the tool is not suited to the reality. Interdependence is a far more mature, more advanced concept.
Granted, Covey's book is about effectiveness, and that may not be the goal of some people today—although I say it should be. What greater aspiration could we have than to be effective in our lives? Whether in our work, with our family, or even in our faith, we should aspire to be effective, which is defined as "adequate to accomplish a purpose; producing the intended or expected result."

As always, I'd like to turn this discussion to the food arena. How does it apply? Consider the industrialization of human food within the last fifty to seventy-five years, not least the emergence of a food processing industry that brought a great many convenience foods into existence and distributed them to every local grocery and supermarket. Foods that traditionally were sourced directly from local farms and home gardens now come from nameless, faceless entities, and they conjure a fantasy not only of human independence from the natural elements man has always relied on for sustenance, but also a false semblance of food security and independence. Modern society forgets that food and farming are inextricably linked, regardless of whether food comes from the supermarket in a plastic package, from the garden in the back yard, or from a local farm. That man would no longer be bound—yes, helplessly dependent—to the natural elements of soil, air, and water is one of the biggest myths of all time.

In his writings, Joel Salatin often refers to our interaction with the earth and our dependence on its fruits as our "ecological umbilical." At first I thought it too strong a term, but I've changed my mind. Our dependence on the earth and its natural elements is not unlike the utter reliance of an unborn baby on the flow of nutrients through the umbilical connection with its mother.
In the foreword of Forest Pritchard's excellent book, Gaining Ground, Joel penned these words: "We cannot escape our responsibilities to, nor our interactions with, soil, air, and water – the basic ingredients in the farmer's alchemy…. Unlike other vocations that are arguably more or less necessary, farming is basic to human existence. Because it is at the root of civilization, it has the greatest capacity to either heal or hurt humankind's planetary nest. As co-stewards of this great creation, we all owe future generations the benefit of knowing something about farming, food production, and land care. Few intellectual journeys could be this necessary and far-reaching." Isn't that an irrefutable truth?

As the farmer population continues to decline, largely due to age or bankruptcy, it will become more obvious than ever how dependent society is on farming and food production. Agricultural statistics on farmer age are concerning, although it's a little-known issue in society, untouched by the mainstream news.

One of the most abnormal aspects of modern America is that many regions are literal food deserts, meaning no food is being raised in the vicinity. This is true not only in cities and urban areas, but in many rural areas as well. To be sure, rural areas may have farms, even active working farms, but they are usually in the commodity business and are not raising actual food for sale to the local populace. Whether they have corn, soybeans, wheat, or hay in the fields, it's a commodity that goes for animal feed. They may have hogs, dairy cows, beef steers, or a barn full of chickens, but there's no food to be obtained from the farm. In this country, by and large, food is acquired from grocery stores or supermarkets, not from farms. Most farmers today contract with a grain, meat, or dairy processor and are merely producers of commodities—feudal serfs who dance to a corporate whistle.
Major multinational corporations like Cargill, ADM (Archer Daniels Midland), Tyson, and Perdue purchase the majority of raw materials entering the food production stream. Rural farming communities throughout the United States have dwindled to near ghost towns, and most farm commodities are subsidized with tax revenue to prop up less-than-sustainable farm incomes, which in turn benefits the corporate buyers of raw farm commodities because they can purchase at the cost of production or less.

New census data released by the USDA in February gives reason for concern once again. The number of farms operating in the US and the number of farm acres have both fallen significantly: there were 141,733 fewer farms in 2022 than in 2017, and farm acreage shrank by a whopping 20 million acres in the same five-year period. This is very disturbing! Yes, we can shrug our shoulders and say there's still plenty of food in the supermarket, and that's true. But that food is increasingly not produced domestically. As a nation we now import 20% of our food. That's one out of every five bites. If that doesn't pose a national security concern, I don't know what does.

What's the solution? It's a complex problem, particularly on a national scale (I happen to think most large-scale problems are best solved on a local or regional level), but I believe step one is to de-corporatize farming and food production. While a number of small farms have effectively exited the corporate commodity system, they are few and far between, and we need many more to make this move. The problem with the commodity system is that the corporate aggregators who buy raw farm commodities hold farms and farmers hostage via price. Because most farmers have little to no control over the price they're paid for their goods, farming has become the hard-scrabble vocation it is, which turns the next generation away.
Thus we have an unprecedented aging farmer demographic, which means that in the next 15-20 years over fifty percent of our privately held farmland will change hands not by choice but by death. Who will take it over? Will they know how to manage it? If this land is not taken over by people who know how to produce food from it, we'll undoubtedly import even more food from foreign interests.

Throughout history, individuals have always teamed up to instigate change, and they still do. Change is being instigated by small-scale food producers who take the path of lunatics and build a different system by producing real food for real people within their region. That's us. But more importantly, change is being instigated by people who are sick (literally) and tired of being victims of Big Food, with its unpronounceable ingredients, empty claims, and tasteless pseudo-food, and who opt out to find real food-producing farms in their region. That's you.

This food partnership is the crux of interdependence. Small-scale farms like ours cannot be independent, any more than today's society is independent in its food acquisition. To me, the folks who recognize this opportunity, and leverage it, embody the irrefutable law of interdependent community and become the solution to one of the foremost threats facing us as an independent western nation. As always, the people hold the solution, in the form of a food revolution. Let's hope it comes quickly.

And that's the View from the Country.