
Making Your Vote Count

written by Sam Fisher

posted on November 15, 2023

As you know, last Tuesday was Election Day. According to FairVote, presidential elections bring out about 60% of the voting population, while mid-term elections attract only about 40%, and odd-year elections even less.

While I have a lot of thoughts on the act of voting and elections that I won’t go into in this post, I believe America has become overly obsessed with seats on the federal level and the elections thereof, while paying too little attention to local seats, where votes arguably have more influence. Until about sixty years ago, most Americans did not pay much attention to who was president and what he did; they were far more connected to and involved in what was happening in local politics. That said, even in local elections we have to ask whether our life habits and worldviews align with the principles we pretend to vote for. In other words, are we aiming to change the world via the ballot box, or are we actively seeking to change the world with our lifestyles, spending habits, and whatever cultural influence we have?

Due to an avid interest in the power of commerce in general, and the food industry in particular, I am increasingly distrustful of corporate business, especially multi-national chains such as Walmart, McDonald’s, Amazon, and the like, and have arrived at the point where I don’t patronize them more than I absolutely have to, which is to say not at all. Why the dislike? While there’s more than one reason, to put it in a nutshell: corporations that reach the size of the ones I just mentioned attain monetary clout and, consequently, policy-making influence that skews the playing field of commerce in their own favor, to the detriment of local and national economies. As Natalie Winch writes in Ditching the Drive-Thru: “We may have separation of church and state in this country, but we do not have a separation of government and capitalism.”

A few years ago, I walked past a man outside of Walmart (yes, I was walking into Walmart while he was walking out) who was wearing an “I Voted” sticker on his coat. Noticing the sticker, I was struck by the irony of it. While he may have voted at the polls earlier in the day, he also voted with his dollar at one of the largest supermarket chains in the world. In all reality, we have to question which vote carried more weight and influence, especially given the fact that we vote so often with our dollars between the times we vote on a ballot.

The irony is this: each time we cast our “monetary vote” we not only create demand for more of the item we just voted for, but we also support the manufacturer, every middleman or broker who drew an income from handling or brokering the item, and most of all, the retailer. This—in the case of mass importers like Walmart—has far-reaching tentacles that cross oceans and cultures into foreign lands with values very different from our own, often even opposing ours. While that is heavy in and of itself, even heavier is the fact that because we chose to patronize that multi-national giant, we kept those dollars from supporting a domestic company that may be far more aligned with our values.

The same could be said of corporate restaurant chains such as McDonald’s, which is not only the world’s largest purchaser of beef but also the number one buyer of pork, potatoes, lettuce, and tomatoes, and the second-biggest buyer of chicken, after KFC. Do we know how that food is produced? Very commercially, for the lowest price possible, and to the detriment of our greatest national resource: soil. Plus, buying clout of this magnitude is bad for any semblance of free markets, and for the other purveyors in the market. Corporations of this size switch suppliers over fractions of a penny. I may have a thorn in my flesh, but that kind of buying power is bad for the world.

So then, should we ask the feds to intervene? Should McDonald’s be brought before the courts for creating a monopoly? In my opinion, no. While they play by the same questionable business ethics many multi-national corporations do, they’re not solely at fault for their position and power. The consuming populace has given it to them. Should knowledgeable consumers like you and me boycott them? Absolutely! The food they serve isn’t good for us, their buying habits support terrible food production models, their presence in every town and village displaces smaller businesses founded on better values, and their corporate clout sways lawmakers, regulators, and agricultural policy alike.

At the risk of sounding crass about electing government leaders, let me say that I feel most Americans place far too much emphasis on it. True, we need good leaders who have the good of their constituents and the good of the nation at heart. Yes, I am highly concerned for the future of our country if we stay on our current trajectory. But I’m far more concerned about the state of our nation’s culture today than I am about voting “the right person” into office. I believe politics is downstream from culture, and acting out good common-sense values in culture today is urgently needed. Which is to say that the way we spend our money, and the way we live out a visceral example of what we stand for, is a cultural influence perhaps as needed today as voting at the ballot box, especially if we only vote every four years. As consumers of earthly goods, we have a responsibility to the culture to act and spend in accordance with responsible values. While I’m not opposed to prayerful voting at the polls and would certainly encourage incessant appeal to God for godly leaders, I’m more concerned about careful daily consideration of who and what we support with our monetary vote and how we act out our cultural influence. And that’s the View from the Country.

More from the blog

No Antibiotics!

At Pasture to Fork, we like to point out that we don’t use antibiotics in the production of the foods we raise. Why is this important? Weren’t antibiotics deemed a miracle drug that proved to be a powerful life-saving tool when they first became available? Yes, that is true, and I’ll even go so far as to say they are still a major lifesaver in human medical care eighty years later. However, today we’ll talk about how they have been abused and why it’s crucial that we limit their use to human medicine.

The first known antibiotic, dubbed penicillin, was discovered accidentally by a researcher named Alexander Fleming in 1928. Though discovered in the twenties, it wasn’t widely propagated until the 1940s when—after saving lives miraculously in instances like the Cocoanut Grove nightclub fire—the US government invested in its production to be used on the war front in WWII. It’s hard for us to imagine a world before antibiotics. Everything from paper cuts to childbirth had the potential to kill via bacterial infection. Even minor wounds to soldiers in warfare, upon becoming infected, were cause for amputations, extreme illness, and death, which is why the government had an interest in the mass production of penicillin.

Penicillin was quickly followed by other antibiotics such as Aureomycin, tetracycline, and the like, and it quickly became a race between pharmaceutical companies in the late 1940s to develop the next family of antibiotics that could then be patented. The formula for penicillin was never proprietary—being seen as a product for the greater good of society—but pharmaceutical companies, of course, each wanted their own piece of the pie.

In 1948, a researcher named Thomas Jukes, who specialized in animal nutrition working for Lederle Laboratories, discovered (again, by accident) a ground-breaking new twist to antibiotic use. In an effort to find new ways to cut costs for poultry farmers following the sharp decrease in demand post-WWII, his research entailed using the laboratory waste from antibiotic manufacturing as a supplement in chicken feed. Having had indications suggesting growth-promoting properties, Jukes performed one of the first controlled research projects on chickens and found the group fed antibiotic waste to be markedly bigger at the end of the 25-day feeding trial, thereby discovering the growth-promoting effect of feeding antibiotics to animals.

This discovery opened a whole new frontier for the pharmaceutical industry and a tremendous new market for growth promoters in animal agriculture. It also made for an unprecedented hubristic attitude in the world of animal agriculture, which in turn led to the confinement animal feeding operations (CAFOs) of today. But the chickens had yet to come home to roost (pun intended).

In the early years of antibiotics as growth promoters, the common mentality among both manufacturers and farmers was “if a little is good, more is better,” and growth-promoter antibiotics were largely unregulated, which resulted in heavy use, inexact dosages, and the like. Regulators, being enamored along with the industry, looked the other way and didn’t interfere. For context, let’s remember this was the era when the chemical DDT was considered a marvelous and life-changing invention, only to be banned later. Even early on there was concern among a few (only a few) scientists about the prospect of antibiotic resistance.
In his 1945 Nobel Prize acceptance speech, Alexander Fleming warned that the development of resistance had the potential to ruin the miracle of antibiotics. Resistance is the term used to describe the ability of bacteria to mutate and overcome the effectiveness of antibiotics. The thesis is that an application of antibiotics never kills all the bacteria, allowing the survivors to gain genetic resistance not only to that particular antibiotic, but to other antibiotics as well.

Throughout the sixties and into the early seventies, even though there was little warning of antibiotic resistance, cases of mass bacterial outbreaks were already occurring in which antibiotics proved ineffective, including a 1967 stomach bug in Yorkshire, England, where 15 babies and young children died because resistance robbed them of effective and timely antibiotic treatment. Up to that point, little to no effort had been put into measuring how quickly resistance is built. That changed in 1974, when an independent study took place that even today remains little-publicized.

Participating in the study was the Downing family from Boston, who had ten children and a small farmstead. The researcher in charge of the study, Dr. Stuart Levy, designed it to include six batches of young chickens, half fed antibiotic-free feed and half fed growth-promoter antibiotics. The oldest Downing child, Mary, a sophomore in college, cared for the birds, which were housed in the Downings’ barn in separate pens 50 feet apart. A precise chore routine was adopted in which the antibiotic-free birds were fed and cared for first, and then, after a change of boots and a washing of hands, the flocks fed growth promoters. Birds from each flock were swabbed once a week, as were each of the Downing family members and a number of neighbors (via fecal swabs), with the objective of learning how quickly antibiotic resistance spread through the flocks, as well as through the people participating in the study.

The results came quickly. Samples taken at the beginning of the experiment showed very few bacteria with defenses against tetracycline (the drug used in the chicken feed) in the guts of the chickens, family, or neighbors. That was to be expected, given the random roulette of mutation. But within 36 hours, those bacteria multiplied in the antibiotic-fed flock, while the drug-free flocks remained clean for a few weeks longer. Then things changed. First the bacteria in the antibiotic-fed flock became resistant to multiple drugs, including other families of antibiotics like sulfas, streptomycin, and the like. Then the multidrug-resistant bacteria appeared in the flocks that never received antibiotics and had no contact with the birds that did. And soon after, the same multidrug resistance showed up in the Downings’ fecal samples.

To the disappointment of his sponsors, Levy had demonstrated what they had hoped to disprove. Even though the feed contained just tiny doses of antibiotics, those doses selected for resistant bacteria, which not only flourished in the animals’ systems but left the animals, moved through the farm’s environment, and entered the systems of other animals and of humans in close proximity (though they did not spread to any of the neighbors, who served as the control group). This served to reinforce some of the early scientists’ concerns that these altered bacteria were an untrackable, unpredictable form of pollution.
In her comprehensive book Big Chicken (from which I learned much of what is written in this article), Maryn McKenna not only eloquently relates the facts given above, but also tells the story of years and decades of industry and regulator pushback against the idea of restricting farm-use antibiotics—even into the 2000s. She shares stories of horrible illnesses and epidemic-proportion bacterial outbreaks costing the lives of people who unknowingly harbored antibiotic-resistant bacteria quietly transmitted from farm to food to consumer. Stories even of outbreaks traced backward by epidemiologists from the victims to the farms on which the meat was produced. The resulting reports and database entries, by the way, were then ignored and buried by regulators and industry leaders. In the book, McKenna does an outstanding job of presenting antibiotic resistance for what it is: a silent, threatening contaminant that moves through a largely unaware society, looking for its next victim or victims. Even today, it’s not a subject well covered by the media, largely due to pharmaceutical interests in keeping it hush.

And this is where we find ourselves today, fifty years later. Although farm- and food-related antibiotic use has garnered far more attention in recent times than at any time in history, antibiotics are still widely used in the poultry, pork, and beef industries, both as growth promoters and as preventative doses to ward off illness on factory farms.

Several years ago, some of the major poultry providers—including Perdue—made a PR effort in the direction of “antibiotic-free.” The reason I say a PR (public relations) effort is because it was driven, at least in part, by an increasing concern among the people regarding human-medical-use antibiotics used in agriculture and the subsequent risk of antibiotic resistance. Born of that effort—which was also driven by recognition within the industry that growth-promoter antibiotics were losing their effect—came a family of drugs called “ionophores,” which were not quite the typical antibiotic and were not classified as antibiotics (conveniently?), but were essentially the industry’s stand-in for traditional antibiotics. This allowed the meat industry to advertise its products as “antibiotic-free” without taking the risk of losing production due to the loss of both growth-promoter and preventative antibiotics. Granted, ionophores were not used—at least not as heavily—on humans, but that doesn’t change the farce of “antibiotic-free” in the meat industry. Vaccines have also been adopted in the meat industries as a sort of replacement for sub-therapeutic antibiotics; modern vaccines—including mRNA technology—have been used increasingly in recent years as a solution to the rising pushback against, and loss of effectiveness of, antibiotics used in meat production.

To summarize, the discovery of antibiotics changed life as we know it to a degree we cannot imagine, mitigating the risk of bacterial infection astonishingly. However, the advent of antibiotics used in animal agriculture quickly threatened the efficacy of human-use antibiotics due to the rapid rise of resistance to early antibiotics—and even faster, to other families of antibiotics. From the mid-1940s into the 2000s, the meat industries, pharmaceutical companies, and even regulators ignored and repressed concerns involving the threat of antibiotic resistance.
Despite the attempts and posturing of certain players in the meat industries, even today it appears as if the mass-producing meat purveyors are unwilling and/or unable to completely free themselves from antibiotics in the production of human food, which only furthers the hazard of superbug infections that are resistant to nearly all common medical-use antibiotics. Until the industry becomes willing to abandon its intense confinement production model, I don’t see the antibiotic story changing. However, the upside is that farmers who are willing to adopt a more natural template like the outdoor, pasture-based model can completely eschew antibiotics, which is the grassroots future of clean eating for those who know and care about the antibiotic issue.

At Pasture to Fork, we are unwavering in our stance against using antibiotics to produce your food, even making "No Antibiotics" one of our stated protocols. While we believe the risk of regularly consuming antibiotics is great enough for adults, it’s even greater for children, and investing in future generations is paramount in our opinion. And that’s The View from the Country.

P.S. I highly recommend reading Maryn McKenna's book, Big Chicken. This article does not do her work justice but is a mere sneak peek.

Can the Food and Farming Crisis be Resolved?

Given the modern worldview that independence of the individual is everything, it’s probably a bit of a shocker when I say I view human independence as an illusion—a mirage in the distance that will always be just that: in the distance. Yes, we have an aura of independence: we have mechanized transportation that’s as easy as getting in and turning the key, and we have devices in the palm of our hand that literally give us access to the knowledge of the world in milliseconds. That creates an impression of independence in the sense that we can go places, do things, and know things that were impossible for most of human history. But even with these modern technologies, we are dependent on other people. No man is an island. And I’m not even referring to the psychological aspect of human nature that wants to be connected to other people. It’s just an irrefutable fact that humans have always been dependent on each other—community and kin—as well as the surrounding ecosystem for survival.

But as life became easier due to industrial and technological advancements, many of us have become at least a little obsessed with the idea of being our own person apart from others. That in itself may be okay, but I’ve come to agree with Stephen Covey in The 7 Habits of Highly Effective People, where he says: “Independence is the paradigm of I—I can do it; I am responsible, I am self-reliant, I can choose. [On the other hand] Interdependence is the paradigm of we—we can do it; we can cooperate; we can combine talents and abilities and create something greater together….” A little later in the same chapter he writes: “Life is, by nature, highly interdependent. To try to achieve maximum effectiveness through independence is like trying to play tennis with a golf club—the tool is not suited to the reality. Interdependence is a far more mature, more advanced concept.”

Granted, the book Covey wrote is about effectiveness, and that may not be the goal of some people today—although I say it should be. What greater aspiration could we have than to be effective in our lives? Whether it be in our work, with our family, or even in our faith, we should aspire to be effective, which is defined as “adequate to accomplish a purpose, producing the intended or expected result.”

Like always, I would like to turn this discussion to the food arena. How does it apply? Consider, first, the industrialization of human food within the last fifty to seventy-five years, not least the emergence of a food-processing industry that brought a great many convenience foods into existence and distributed them to every local grocery and supermarket. Foods that traditionally were sourced directly from local farms and home gardens now come from nameless, faceless entities and conjure a fantasy not only of human independence from the natural elements man has always relied on for sustenance, but also a false semblance of food security and independence. Modern society forgets that food and farming are inextricably linked, regardless of whether the food comes from the supermarket in a plastic package, from the garden in the back yard, or from a local farm. The notion that man is no longer bound (yes, helplessly dependent) to the natural elements of soil, air, and water is one of the biggest myths of all time.

In his writings, Joel Salatin often refers to our interaction with the earth and our dependence on its fruits as our “ecological umbilical.” At first I thought it too strong a term, but I’ve changed my mind.
Our dependence on the earth and its natural elements is not unlike the utter reliance of an unborn baby on the continuation of nutrients through the umbilical connection with its mother. In the foreword of Forrest Pritchard’s excellent book Gaining Ground, Joel penned these words: “We cannot escape our responsibilities to, nor our interactions with, soil, air, and water – the basic ingredients in the farmer’s alchemy…. Unlike other vocations that are arguably more or less necessary, farming is basic to human existence. Because it is at the root of civilization, it has the greatest capacity to either heal or hurt humankind’s planetary nest. As co-stewards of this great creation, we all owe future generations the benefit of knowing something about farming, food production, and land care. Few intellectual journeys could be this necessary and far-reaching.” Isn’t that an irrefutable truth?

As the farmer population continues to decline – largely due to either age or bankruptcy – it will become more obvious than ever how dependent society is on farming and food production. Agricultural statistics on farmer age are alarming, although it’s a little-known issue in society and is not touched by the mainstream news.

One of the most abnormal aspects of modern America is the fact that many regions are literal food deserts, meaning there’s no food being raised in the vicinity. This is true not only in cities and urban areas, but in many rural areas as well. To be sure, rural areas may have farms – even active working farms – but they are usually in the commodity business and are not raising actual food for local sale to the local populace. Whether they have corn, soybeans, wheat, or hay in the fields, it’s a commodity that goes for animal feed. They may have hogs, dairy cows, beef steers, or a barn full of chickens, but there’s no food to be obtained from the farm. In this country, by and large, food is acquired from grocery stores or supermarkets, not from farms.

Most farmers today contract with a grain, meat, or dairy processor and are merely producers of commodities—feudal serfs who dance to a corporate whistle. Major multi-national corporations like Cargill, ADM (Archer Daniels Midland), Tyson, and Perdue purchase the majority of raw materials entering the food production stream. Rural farming communities throughout the United States have dwindled to near ghost towns, and most farm commodities are subsidized with tax revenue to support less-than-sustainable farm income streams, which in turn benefits the corporate buyers of raw farm commodities because they can purchase at the cost of production or less.

New census data released by the USDA in February provides reason for concern, again. The number of farms operating in the US and the number of farm acres have both fallen significantly. There were 141,733 fewer farms in 2022 than in 2017, and the number of farm acres shrank by a whopping 20 million in the same five-year period. This is very disturbing! Yes, we can shrug our shoulders and say there’s still plenty of food in the supermarket, and that’s true. But that food is increasingly not produced domestically. As a nation we now import 20% of our food. That’s one out of every five bites. If that doesn’t pose a national security concern, I don’t know what does. What’s the solution to this?
While it’s a complex problem, particularly on a national scale (I happen to think most large-scale problems are best solved at a local or regional level), I believe step number one is to de-corporatize farming and food production. While there are a number of small farms that have effectively exited the corporate commodity system, they are few and far between, and we need many more to make this move. The problem with being in the commodity system is that the corporate aggregators who buy raw farm commodities hold farms and farmers hostage via price. Given that most farmers have little to no control over the price they’re paid for their goods, farming has become the hard-scrabble vocation it is, which then turns the next generation away. Thus we have an unprecedented aging farmer demographic, which means that in the next 15-20 years, over fifty percent of our privately held farmland will change hands not by choice, but by death. Who will take it over? Will they know how to manage it? If this land is not taken over by people who know how to produce food from it, we’ll undoubtedly import even more food from foreign interests.

Throughout history, people—individuals—have always teamed up to instigate change. And they still do. Think of the small-scale food producers who take the path of lunatics and are driven to a different system, producing real food for real people within their region. That’s us. But more importantly, change is being instigated by people who are sick (literally) and tired of being victims of Big Food, with its unpronounceable ingredients, empty claims, and tasteless pseudo-food, and who opt out to find real-food-producing farms in their region. That’s you.

This food partnership is the crux of interdependence. Small-scale farms like ours cannot be independent, any more than today’s society is independent in food acquisition. To me, the folks who recognize the reality of this opportunity—and leverage it—portray quite well the irrefutable law of interdependent community and become the solution to one of the foremost threats facing us as an independent Western nation. As always, the people hold the solution in the form of a food revolution. Let’s hope it comes quickly. And that’s the View from the Country.

Do you trust "Certified Organic"?

The birth of organic food is one of humble beginnings and grassroots effort, an attempt to regenerate farmland and provide clean food for those who sought it over sixty years ago. By the late 1960s, the moment was ripe for turning back to nature: DDT was in the news, an oil spill off the coast of Santa Barbara had blackened California’s coastline, and Cleveland’s Cuyahoga River had caught fire due to chemical pollution. “Ecology” was on everyone’s lips and was closely followed by “organic.”

Early on, “organic” carried connotations far exceeding mere chemical-free food production. It implied a disdain for and rejection of the war machine (also a hot-button issue of the Vietnam era), since the same corporations—Dow, Monsanto—that manufactured pesticides also made napalm as well as Agent Orange, the herbicide with which the U.S. military was waging war against nature in Southeast Asia. This correlation was very real in the minds of the early adopters, a group largely made up of young people who decried the war.

The early efforts at growing food organically were trial and error by scattered amateurs who were poorly connected and had almost no support network. In fact, the USDA was actively hostile to these efforts, viewing them as a critique—which they were—of the industrialized agriculture it promoted. Largely due to these factors, the organic food and farming model stayed relatively small and obscure (compared to the industrialized food sector) in the twenty-five-year span from the mid-1960s until about 1990. It did, however, grow in a sort of behind-the-scenes manner driven by increasing consumer demand.

Because of this burgeoning consumer demand, by 1990 organic agriculture caught the eye of some of the largest food corporations in America—and, subsequently, the eye of food governance. A bill called the Organic Foods Production Act (OFPA) was passed in Congress. Ironically, it instructed the Department of Agriculture—the same agency that had treated organic agriculture with undisguised contempt—to establish a set of national standards. Even today, many of us in the beyond-organic food world look back on that moment in time and see that move as a grave mistake.

Since the early ’90s, and even more in the 2000s, the organic food movement has seen consistent growth and is lauded by many as the complete answer to corruption and runaway industrialization in the food industry. From a marketing standpoint, as well as from a consumer view, organic was the unalloyed good in a food world gone awry. But what many people don’t know is that the growth was fueled by corporate buy-in from the largest food corporations in the world, who wanted a piece of the pie and who viewed organics through a dollars-and-cents lens rather than the pure-food-and-farming vision the early adopters had. In my view, corporate buy-in is what gave the organic food movement its rise to prominence; however, it also caused it to become corrupted. Consequently, the organic food movement—like its cousin, Industrialized Food—has become riddled with fraud and deceit.

In an effort to maintain purity and consumer trust, the National Organic Program (NOP), which was the result of the USDA’s “national standards,” set up a third-party review board that allows or disallows materials (products used in both production and processing) into the NOP. This review board is called the Organic Materials Review Institute (OMRI).
Over the course of the past 20-30 years, as more large-scale growers have sought entrance into the organic marketplace, OMRI has been pressured—and I suspect bribed and bought—into allowing more and more questionable materials into the NOP. As a result, a number of “organic version” pesticides, herbicides, and fungicides (the suffix “-cide” comes from the Latin for “to kill”) are now allowed in the organic program that would certainly have been shunned by the early pioneers. But they are a necessary ingredient in the industrialization of organics.

Similarly, certifiers are localized third-party groups, usually with several in each state, although they are allowed to certify in other states as well. For example, here in PA one of the certifiers is PCO (Pennsylvania Certified Organic), which is known to adhere to stricter rules and regulations than some certifiers. For this reason, some organic farmers choose to certify with the less rigid certifiers (which is always the result of a pass/fail system). You see where this is going: as organic became mainstream and therefore mass-produced, it became increasingly similar to its conventional counterpart, a race to the bottom in terms of quality, the only difference being that little green “USDA Organic” logo to buoy consumer confidence.

Perhaps one of the most glaring cases of fraud in the organic sector involves imported grain. In most areas the world over where organic agriculture is practiced, there’s a deficit of organic grains. For example, here in the US we grow only 20% of the organic grains we use and import 80%. To make up this deficit, the “Stan” countries of Central Asia (Afghanistan, Pakistan, Uzbekistan, Kazakhstan, etc.) have stepped in to fill the void. The grain from this region is largely exported through the port in Istanbul, Turkey. In March 2018, a shipment of “organic” grain from these countries was found to be fraudulent, and 25,000 metric tons of corn were refused entry into the U.S. Although the NOP issued a memo four months later (in July 2018) warning organic certifiers to be wary of these high-risk countries for grain fraud, no further action was taken to limit imports from the “Stan” region.

There’s evidence of domestic fraud as well, as one can well expect given the significant price premium of organic over conventional. In 2019, an Iowa commodities broker, Randy Constant, admitted to more than $142 million in “organic” grain sales, the vast majority of which were fraudulent. During the years 2010 to 2017, he sold over 11 million bushels of grain, with more than 90% of it falsely marketed as organic, some of which included grain grown from genetically modified (GMO) seed, which is banned by the NOP. While there has been some action taken to bring the perpetrators of these fraud schemes to justice, especially here in the States, much of it has been slow and complicated, with almost a sense of reluctance from the NOP. Which begs the questions: How much “organic” grain has been both imported and sold domestically since the above cases were uncovered? And how many crops are entering the organic food and feed sector daily that are not truly organic?

Given the fraud that has taken place, not to mention the host of questionable “-cide” materials allowed in the organic sector, the “certified organic” food movement is something of a house of cards in terms of consumer confidence.
I purposely say the “certified organic” movement because I believe the original vision for food and farming that’s natural and chemical-free is still alive among many farmers and eaters alike. Truly organic food (even more so if it’s beyond organic) is still the most viable alternative to mainstream food in the marketplace, which is to say the alternative to GMOs, glyphosate (a known carcinogen), synthetic fertilizers, antibiotics, growth hormones, and the like. But “certified organic” has become a mere shadow of what it was meant to be and may be the most misleading label out there today, given the fraudulent activity found in many corners of the program.

Some of the early pioneers in the organic movement suggested that the organic food chain couldn’t expand into America’s supermarkets and fast-food outlets without sacrificing its ideals, and it appears they were right. The industrialization of organics has sparked a dramatic shift away from the founders’ vision of small, locally oriented farms producing high-quality food to what is now a subset of Big Industrialized Food procuring pseudo-organic food on a large-scale, globally oriented business model. When organic food appeared in big-box stores, it became just another label designed to bolster the confidence of eaters who were distanced from the producers of their food. As always, distance obscures transparency and accountability. With transparency and accountability missing, deception is easy—because regulators are easily bought and sold at the corporate level. And then it’s only a matter of time before consumers see through the illusion and trust is lost.

The good news is there’s a rising revolution of farms and food producers who are serving the “brightest and best” of the consuming populace with “beyond organic” vegetables, meat, and dairy products. It’s a revolution of sorts that includes small-scale, direct-to-eater farms and clusters of concerned, educated consumers. Bypassing the need for organic certification with direct consumer relationships, this growing know-your-farmer, know-your-food movement is the future of trust and transparency in the food supply.

At Pasture to Fork, we place little to no emphasis on organic certification, mostly because all our production models far exceed the requirements of the NOP. We do, however, emphasize producer/consumer relationships, localized foodscapes, full transparency, and optimal food quality and nutrition. What’s more, we believe truly organic food production is more about the producer’s beliefs, thoughts, and worldview than it is about a pass/fail certification process that can be fudged or a set of rules that can be bent. Which is to say you can learn more about me by seeing my reading material (which indicates my interests and worldview) than by having me fill out a bunch of certification forms. And that’s the View from the Country.

This image depicts the books that have had the most impact on my views in farming and food production, health, and business, but not only that, on my life and worldview overall. Or see our short essay dubbed "Beyond Organic" for more outside-the-box views on the "organic" discussion.