In Sputnik’s Orbit

A few thoughts to tide you over…

 

Hey, Hi, Hello GUTGAAgians

This is my meet & greet intro for GUTGAA. If you don’t know what that means then either you aren’t in the writing biz or you need to follow this link.

Either way, Hi! I’m C. Stuart Hardwick. Welcome to my little sliver of the Internet. I grew up in South Dakota, and my writing springs from the intersection between the Wild West and the Space Age, the History of Earth and the dreams of mankind. After decades of technical writing, I turned seriously to fiction just a few years ago and write science fiction principally, but also historical tales and all sorts of other things. I won the Colonnade Writing Contest in December, and am pursuing a graduate certificate in writing from UC Berkeley.

Q: Where do you write?
A: Well, mostly Earth, but… No, seriously, I write everywhere: the bus, coffee shop, back yard swing, standing up, sitting down, in a boat, with a goat. Well, no…goats are annoying.

Q: Quick. Go to your writing space, sit down and look to your left. What is the first thing you see?
A: A cup of coffee. No, wait. The dog. Who gave the dog a cup of coffee?

Q: Favorite time to write?
A: When I’m awake. Seriously, I write whenever I have time, but different writing tasks require different circumstances. If I had to pick a favorite time, though, I would say “whenever the alternative is waiting on something.” Writing can turn an hour stuck on the bus and in traffic into a blessing.

Q: Drink of choice while writing?
A: Seriously? Actually, I drink mostly decaf, and to keep from tanning my insides, sometimes a nice flavored sparkling water. When I can get them, I like to nibble on cherries while I edit.

Q: When writing, do you listen to music or do you need complete silence?
A: Music generally, and specific music. When writing sci-fi, I generally listen to Tangerine Dream and Daft Punk. Recently I altered the tone of an entire short story in response to its resonance with a particular piece of jazz. Never, ever, anything with lyrics.

Q: What was your inspiration for your latest manuscript and where did you find it?
A: If I told you that, I’d have to kill you and write a scene about it. Usually, I start with scenes or characters. I write a few scenes and the story and the relationships emerge from that. Then I stop and plan out the story and start looking at structure and pacing and balance.

Q: What’s your most valuable writing tip?
A: Writing takes humility and arrogance, and maniacal patience. Have faith in your voice and vision, but make sure you learn continuously. Be prepared to wait indefinitely, but work aggressively toward your goals.

Fun facts:

* I’m left handed and have been known to paint with both hands simultaneously.

* In high school, I wrote a computer program to train myself to touch type.

* I once worked with John Carmack–a couple of years before he wrote the video game Doom.

* I know how to juggle, and prefer clubs.

* I play piano a little better than I juggle.

One Great Man, One Giant Legacy

The first ape to leave his planet of origin and go for a walk on another is remembered today as a “great man”. Perhaps, and the honor is certainly well deserved, but if Armstrong was great, it was more for his conduct on the ground than for his exploits in space.

Humanity’s considerable success does not arise only from our intelligence or the dexterity of our opposable thumb. We have diversified, colonized, and advanced because of our unique balance of aggression and cooperation. Arguably, nowhere in our entire history is this better illustrated than in the Space Race of which Armstrong became such a key part.

We went to the moon for science and exploration and adventure, but we signed the checks to stick it to the Russkies. We went because the two most powerful nations the world had ever known were locked in a stalemate of nuclear hair triggers that—once or twice that we know of—had brought us within hours of potential extinction. And yet, at this pinnacle of barbarism, we did what our ape family has been doing for over a million years: we hatched a bold plan, put together a team, and pulled off the win. At the height of the Cold War, we unleashed the combined creativity and dedication of 150,000 American engineers, scientists, managers, and laborers to build a system of machines whose complexity makes the Great Pyramid seem a pile of rocks by comparison.

Then we put together the procedures, policies, communications networks, and contingencies needed to test, perfect, and utilize this monster to do something that, throughout history and until the last decade, had seemed impossible. We even broke the rules and put together a back-door alliance when it turned out that radio signals used by Soviet espionage vessels off the Florida coast had the potential to compromise the moon shots (in response to a long relay of unofficial personal pleas, the Soviet radios were silenced).

Armstrong, too, illustrated this human balance. He is remembered (rightly so) for his humility, but he didn’t get to the moon by being a wallflower. He was smart and sociable, but he was neither particularly well connected nor an academic superstar. He was, however, reliable. He made good grades and he did his job. When opportunities arose, he jumped on them with both feet. He fought in Korea, then he volunteered to be a military test pilot. Then he went to Edwards AFB, where he took the very unglamorous job of flying chase planes and the bombers that dropped the test aircraft. He went on to fly more than 200 different types of aircraft, many of them experimental. At Edwards, he regularly risked his life and just as regularly came back alive. Famously, when he ejected from a failed Lunar Landing Training Vehicle, he hitched a ride back to the office and started on the paperwork while some of the other astronauts looked on in awe.

He made mistakes. He got a test plane stuck in the mud. He bumped into the ground with another and–through a series of “bad day” challenges familiar to us all–ended up stranding three test pilots at another base. But when things went wrong, he handled them. He volunteered for Apollo, but was late getting his paperwork in. They took his packet anyway—they knew his reputation.

Neil Armstrong didn’t just go to the moon, he took us to the moon–all of us–and he saw his role in history with a clarity and humility that allowed him to step back and let us enjoy the ride. His passing, after 82 years, is a loss and sadness for his family, but his life will remain with us as a heroic example from a heroic time in our human journey. Neil Armstrong was indeed a great man, not because he was better than so many others, but because he was the sort of human being that any of us can be with a little bit of moxie, a little bit of smarts, and a whole lot of effort. He was a true hero, because more than anything else in this life, we all need to be reminded that we are all of us capable of greatness. Time and micrometeorites will erode the prints men left on the moon, but the down-to-Earth life of the first man who made them will forever be recorded as truly a giant leap for mankind.

If Man Evolved From Apes, Why Are There Still Apes?

Evolution is not a religious issue. It isn’t. If you make it a religious issue by pitting your religion against science, your religion loses. Period. That’s not an atheistic science conspiracy, it’s just a predictable byproduct of mistaking for divine revelation what are actually stories passed down from people living in the Iron Age. Maybe the authors of Genesis were inspired by God, but they clearly weren’t taking shorthand.

Maybe God made us in his image, but he took 4.6 billion years to do it and by “his image” is probably meant something other than “an old guy up in the clouds”. God or no God, evolution is how we got to be what we are, and if that seems to contradict some of the stories in scripture, that’s okay. God may have inspired the scripture, but he MADE the world, and this is it, right here holding up all these fossils.

Still, evolution is a vast and fascinating field, and there are a few things that are understandably confusing to the layman. This question, though, shouldn’t be one of them.

Australia was colonized by the British. Why are there still British? Because Australia was a penal colony, and most of the Brits stayed home and worked on bits for what would one day become Monty Python. Get it? Notice how Australians have a new dialect that is quite distinct from their ancestors’? And yet, the Queen’s English is alive and well. Get it now?

No? I know, I know, you weren’t paying attention. The dog is chewing on the table leg, and somebody called you a monkey’s uncle, and you just have one question: If we evolved from apes, why are there still apes?

Because the apes that were living 6-8 million years ago didn’t all line up and march through the mouth of Vaal (Star Trek reference) and come out human. There were thousands of these apes, you see. Tens of thousands, in fact, and lots of different groups and kinds—far more than today, partially because we weren’t around with our Land Rovers and our taste for bush-meat and penchant for taking away everything from everyone all the time except in church when we remember that the Big Man is watching. But I digress.

There were all these apes see, and some of them lived over there under those trees and they were just okay. And some of them lived yonder in the valley and they were cool with that. And a lot of them lived way over through the mountains and they don’t ever call or write. But this other group here, let’s call them the Skins, they kept rubbing elbows with those ugly bad-tempered dudes at the edge of the jungle who ate all the bananas, the Shirts.

The Shirts really stunk. No really, they smelled of bananas and Old Spice, and between you and me, they were bullies anyway. So the Skins, they started foraging out into the savanna a bit. Now, the African savanna was as dangerous then as it is today. They have lions and tigers and bears, oh my. Well, they have lions. And stuff. Lions loved eating them some Skins, but like every other mammal, lions have to chill through the heat of the day or they—oh what is the technical term? Oh yeah—die.

So the Skins did okay. They weren’t exactly sprinters, but if you keep following an animal through the heat of the day—keep making him run—eventually he’ll keel over (there are modern humans who still hunt this way today). This worked pretty well, though the hairier guys couldn’t take the heat. Those guys would pass out and get eaten, or they would go off and join the Shirts bowling league. After a while, no one with much fur was left among the Skins. Life on the savanna worked out pretty well, because it was getting hotter all the time and the savanna was getting larger and larger. Also, tracking prey and pacing yourself is not the easiest work. The groups with the best planning and tracking skills got more food and less, um, eaten. So, by the time the sea level dropped enough to create a pathway up into Europe and Asia, the Skins were much smarter and taller and faster and sweatier than the Shirts, who still got together Thursdays to shake down the bananas, and if anything were even bigger bullies than they had been.

“But,” I hear you asking, “if the Skins evolved from the Shirts, why were there still Shirts?”

Put down the bananas and pay attention will you? The Skins didn’t evolve from the Shirts, they both evolved alongside each other. After a while, none of the lady Shirts wanted to hang with those sweaty Skins, and the Skins hated the way the Shirts beat the crap out of them for showing off their times in the 200 meter sprint, so they just sort of left each other alone. They had become separate species—though not by much.

Migration and isolation are key parts of evolution. Forget about gorillas and chimps for a moment. Look at our more immediate ancestors. Homo erectus migrated out of Africa 1.8 million years ago, eventually migrating up into Europe and evolving (over more than a million years) into the Neanderthals. Meanwhile, the original population of H. erectus still existed in Africa, continuing to evolve into H. sapiens. When sapiens later migrated out of Africa, they out-competed the Neanderthals and spread around the world. But sapiens still existed in Africa–and still exist there today.

If that’s all too much to wrap your brain around, here are some simpler examples: Branches can grow from a tomato plant while the plant is still there sending off more branches in other directions. We humans bred domesticated corn from a bushy grass called teosinte, but teosinte still grows wild. Televangelists evolved from the Catholic scholars of medieval Europe, but there are still smart people in Europe. Okay, that’s not really an example; that’s cultural evolution.

There are still other apes (besides us) because we evolved alongside them, from the same ancestral stock. And we know for a fact that we evolved together.

And no, we did not descend from monkeys. We descended from an ancestral population of early apes that lived 6-8 million years ago. Our last common ancestor with the monkeys lived tens of millions of years earlier still. I think they still wore tunics or something like that.

Have Proper Noun, Will Capitalize

Thou Shalt Capitalize Proper Nouns.

I don’t make the rules, folks, but we all benefit from them, and my fellow writers, well, criminy—look them up, will you?

Earlier, I ran across a thread in a writer’s forum—a well-respected writer’s forum, mind you—that went on through page after page and month after month of ignorant prattle about whether to capitalize “bible” or “the Bible” or “God” or “gods”. Every single post, it seems, missed the point utterly. You capitalize proper nouns: God, Zeus, Elvira, Mistress of the Dark, Scooby-Doo, what have you. It has nothing to do with whether you believe in God or whether you want to pay respect or reflect the importance of a figure.

We don’t capitalize God out of deference to God. We don’t do it because we believe and fear his wrath. This is not a question of style or belief or fashion. We do it out of deference to our reader, because we believe and fear his scorn. Even Christopher Hitchens would write, “God is not great”. We don’t capitalize Charles Manson because he is so influential (well, I certainly hope not!) but because that’s his name. Piss off, Charlie, you git.

We also generally capitalize adjectives derived from proper nouns, such as Malthusian or Reaganesque. Oddly enough, notable exceptions to this rule include “biblical” (generally not capitalized anywhere in the English-speaking world except the editorial department of the local Baptist church), “vedic”, and “talmudic”. For the record, neither theist nor atheist is capitalized, but Baptist is. The latter is a religion, a proper noun and derived from a proper noun; the former are states of being (adjectives), like “agnostic” or “fed up with people who can’t be bothered with an Internet search before stating an opinion on the Internet”. Oh, and “Internet” is capitalized because American dictionarists are under the collective misapprehension that it’s a proper noun instead of a noun meaning “a network connecting computers in two or more installations”, as opposed to “intranet”.

So atheists, you still have to capitalize God and the Bible. Theists, you still have to capitalize Wiccan and Galilean, and Darwinian. Anything else just wouldn’t be cricket, Cricket.

The Certain Fool

It is a peculiar form of arrogance that leads from “I don’t know” to “those who claim to know are liars, conspirators, and scoundrels.” I once knew a fellow who believed that the transistor was (and could only be) the product of alien intervention. It’s unclear why he found this explanation more reasonable than simple human inventiveness, but I suspect it’s because in some primordial way, he placed aliens in the metaphysical realm of myths and gods with dominion over the unknown (and suspiciously complex). God couldn’t have done it because transistors brought rock & roll to America and millionaires to Silicon Valley, and there is nothing less godly (apparently) than a machine that gets people tapping their toes and buying things, so it must have been the aliens.

Miraculous as its impact has been, though, the origin of the transistor is quite down to earth. It was the product of a very human team of scientists (led by William Shockley at Bell Labs) who set out to find a faster, more reliable alternative to the triode tube used in war-time radar sets. The triode (and other tubes) had evolved from attempts during the 1880s to extend the operating life of Edison’s new light bulb. Edison, in turn, was building on earlier work by Humphry Davy, Warren De la Rue, and James Bowman Lindsay, whose own efforts derived from simple experiments with electricity and magnetism, including the observation that lightning strikes cause a magnetic compass needle to jump (God did it after all, he just takes his sweet time). All of this is known and well documented. But none of it mattered to my acquaintance. He seemed to believe that what he didn’t understand couldn’t be understood, and that attempts to explain it could only be the work of tricksters, out to conceal the real and simple truth: the government is conspiring to hide little green men! One wonders what he would think of the idea that God (or at least the author of the Bible) is necessarily in on the ruse? A healthy skepticism is vital, but the key to skepticism is diligent, objective study, not paranoia and infantile rationalization.

Everything we humans do develops in this way, step by step, one generation building on the shoulders of the last. It has taken millennia to build the modern world, and it is natural that we sometimes find it as overwhelming and inexplicable as our ancestors must have found the elements of nature. But we have more than technology: we have the way of thinking that swept us, wave after wave and revolution after revolution, from beast to astronaut in less time than it took wolves to become Pekingese.

Our problem, of course, is that we are all doomed to live and die within Plato’s allegorical cave. We know of the world only by the shadows on the walls—that is, through our imperfect senses. Empirical study may not reveal all that we would like, but it provides the only answers in which we can justify any confidence. Science cannot tell us why the earth exists, but it can tell us how it formed and how long it has been here. We are free to believe as we like, but only within the constraints set by what we can see and test. When we speculate (or accept the speculation of others) in the absence of evidence, we are literally “taking leave of our senses.” When we accept it in the presence of contradictory evidence, we are mad.

Of course, we can’t investigate everything for ourselves, so we are forced to rely on the testimony of experts. This presents a problem. How can we evaluate the expertise of someone who knows what we don’t? More to the point, what do we believe when our doctors, priests, administrators and scientists are at odds? Sadly, “the truth, the whole truth, and nothing but the truth” is not something any of us will ever have access to. We have facts, but we can only measure them against the ruler of reality, and since we must measure the ruler as well, we must accept a degree of uncertainty.

Not all possibilities are equally likely, though. We can approximate truth like an archer zeroing in on the bullseye. By testing what we can test, we can sort our beliefs until we develop confidence as to which are closest to the truth. This is the key insight behind the scientific method, and it is the key to assessing the claims that clutter our modern world. If science is a game of archery, our individual quest is more like pyramid building, where facts form a broad and shifting base, people who make claims about fact hold up the middle, and the truth, distant and precarious, balances somewhere at the apex. This is a difficult and imperfect way to understand our world, but it works.

Millions of Americans believe that the Apollo program was a hoax. They are all wrong, and I am about as confident of that as I am that those millions of people (most of whom I have not personally met) exist. This is possible because I know enough about science and engineering and human fallibility to recognize the veracity of the evidence supporting the landings, and at the same time, the laughable ignorance behind “moon hoax” arguments. When a “hoaxer” shows a flag waving as it could only wave in a vacuum and then claims this as evidence for a breezy sound stage, I naturally grow suspicious. When his arguments reveal ignorance of how dust falls in a vacuum, of optical photographic artifacts, and of basic physics; when he reveals omissions, flaws, and shortcuts in his reasoning and research and then responds to correction with anger and appeals to authority instead of gratitude and reconsideration, I can hurl his bricks away with confidence.

Consider claims that the earth is only six thousand years old and that its surface was once covered in a single flood event. Many accept these claims based on their understanding of Christian scripture. But in fact, neither claim is explicitly made in the Bible, and neither can be true based on dozens of overlapping lines of evidence and thousands of physical observations. Some of these observations are simple for even the layman to understand. Dendrochronology (tens of thousands of overlapping living and fossil tree rings) and geo-stratigraphy (millions of layers of sedimentary rock) have complex names, but are easy to understand and to see. If we assert (as some do) that God simply popped everything into existence just as it is, with the ancient sediments, the tree rings, and photons sailing in from the spaces between the stars—just to test our faith—then we are in philosophically deep water indeed. If we can’t accept basic measurements of parallax collected through telescopes, then neither can we accept anything else gleaned by our senses, including the stories in the Bible. This sort of solipsism leads nowhere, which is why even the Catholic Church, having burned itself before, has acknowledged the antiquity of the earth. Besides, if the entire universe is a fraud, what does that make its creator?
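
Cross-dating, the method at the heart of dendrochronology, is simple enough to sketch: slide a sample's ring-width sequence along a master chronology and keep the offset where the patterns correlate best. Here's a toy illustration in Python (the data is synthetic, invented for the demo, not real ring measurements):

```python
import random

def correlate(a, b):
    """Pearson correlation of two equal-length ring-width sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def cross_date(master, sample):
    """Slide `sample` along `master`; return the offset (starting year
    index) where the ring-width patterns line up best."""
    return max(range(len(master) - len(sample) + 1),
               key=lambda off: correlate(master[off:off + len(sample)], sample))

# Synthetic demo: a 200-year master chronology and a 40-ring sample
# cut from years 120-159, with a little measurement noise added.
random.seed(42)
master = [random.uniform(0.5, 2.5) for _ in range(200)]
sample = [w + random.gauss(0, 0.05) for w in master[120:160]]
print("sample dates to year offset:", cross_date(master, sample))
```

Real dendrochronology adds detrending and significance tests, but the core idea, overlapping patterns pinning floating sequences to an absolute timeline, is just this.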

Anyone can sell magazines and books making bold claims. Here are just a few that are bouncing around our world right now: 1) Conspirators tell us the World Trade Center towers were brought down in a controlled demolition because “no building could ever fall into its own footprint on its own.” They do, of course, as happened in January of this year, when one did exactly that in Rio de Janeiro after a structural failure. 2) “Psychics” offer “readings” on late night radio, even though precognition violates the laws of physics as we know them, and anyway would presumably have given the Cold War to the Soviet Union (which invested heavily and consistently in occult research). If for three dollars, a gypsy woman with a pack of cards and a creepy disposition can foresee the woman you are destined to marry, surely for a Château on the Baltic she will give you a schedule of spy-plane overflights so you can disguise your missile launchers as a Cuban bazaar! 3) The local pharmacy has an extensive selection of pricey homeopathic remedies, even though these are just highly distilled water—often with real medicine added as “inactive ingredients”. Homeopathic medicine would also violate the laws of physics (or at least everything we know about chemistry and life—which is quite a lot). 4) A casino paid $28,000 for a partially eaten cheese sandwich bearing an image claimed to resemble the Virgin Mary (though in fact, no one knows what she looked like, and for a short time afterward, the Internet was abuzz with images of foods and nature scenes depicting (with sufficient credulity) various rude acts and anatomical parts).

We are all entitled to our opinions, but none of these claims is worthy of serious contemplation by anyone with a command of our shared facts. Not everything can be observed directly, but we must never be too sure of anything that can’t. When we forget this, we can fall for anything—literally. One consequence is religious fanaticism; it is just that sort of certainty that leads people to strap explosives to their bodies before visiting the local market. But such misplaced certainty does more than justify extremist violence; it subverts the ability of people and cultures to manage the resources upon which their survival depends. Children can learn much from the beautiful story of Genesis, but to combat disease, they need genetics, and with it, the knowledge that our last common male and female ancestors lived at least 60,000 years apart.

History shows us to be an adaptable and clever race. In an age in which we alone among God’s creation have ventured beyond our world, we must add nuclear war, pandemic, overpopulation, climate change, genocide and eugenics to an already long list of known challenges. If ever a being had the tools to face these challenges, that being is man. But how will we face our future? One possibility is to throw up our hands in prayer and hope we are delivered from this world before it comes crashing down around us. A more intelligent, and frankly, a more spiritually responsible approach, is to learn to govern ourselves as our ancient advisers could not, and use our greatest gift—reasoning—to its fullest.

We don’t have to choose between faith and science; we can reconcile the one to the other. We don’t need to seek the fantastic; the real world is fantastic enough. We don’t need to pretend to certainty; a well-founded approximation of truth is more valuable. Thomas Paine warned us, “The word of God is the creation we behold, and it is in this word, which no human invention can counterfeit or alter, that God speaks universally to man.” More than ever before, perhaps, we are assaulted today by claims (counterfeit and otherwise) from those who would manipulate us or lighten our purse. We don’t have to give in to these claims, but neither should we see conspiracy and alien intervention in every unknown. Our nation is the fruit of the age of reason. It will survive only so long as science and clever human investigation are permitted to outstrip the darkness that came before it. Empirical thinking, if we will but trust it, will sweep us yet to new heights, whatever those may be.

Dear Moon Hoax Conspiracy Nuts

Dear Moon Hoax Conspiracy Nuts:

Here is how you know when a moon landing is faked: In “Transformers: Dark of the Moon,” they didn’t properly account for lunar gravity or for the vacuum, and so they animated all the dust wrong. In every single image recorded by NASA on the moon, the dust behaves as it only could on the moon.

The spaceship impact at the beginning of the movie is WRONG. First, a ship traveling at that speed would have rebounded in the weak lunar gravity, and would almost certainly have cartwheeled as it plowed through the lunar soil.

Second, dust CAN NOT BILLOW in a vacuum. On Earth, dust billows (that is, roils out in overlapping spherical clouds) because it is running into and dragging against the air. Likewise, dust lingers in the air because there IS air to linger in. On the moon, every dropped object, from a spaceship to a mote of talcum powder, travels along a ballistic trajectory with zero resistance. (This is actually one of the classic arguments through which conspiracy advocates shoot themselves in the foot. The Apollo lander didn’t create a dust cloud BECAUSE IT WAS ON THE MOON, WHERE DUST CLOUDS ARE IMPOSSIBLE!)

When a ship plows up dust in a vacuum, the dust grains travel out in flattened arcs and are gone. A dust cloud cannot rise, because there is no air to push against and suspend the particles. Dust clouds CAN NOT HAPPEN in a vacuum (except in orbit, where everything is in free fall together, but that’s a very different type of cloud). In the Apollo landing footage, ejecta from the engine can clearly be seen through the window flying out in rays, just as it should, and leaving no cloud.

When an astronaut kicks up dust on the moon, the dust DOES NOT linger around his foot as it does in the movie—it immediately falls to the ground as it does in all the NASA footage of the Apollo landings. There are only two ways this footage could have been produced in 1969: 1) on the moon, 2) on a sound stage built into a cargo plane that can simulate lunar gravity during a dive.
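
The dust argument reduces to drag-free ballistics: with no air, a grain kicked to height h falls straight back in t = sqrt(2h/g), and the same kick carries it roughly six times farther on the moon than on Earth. Here's a minimal sketch (standard g values; the half-meter kick height and 2 m/s launch speed are just illustrative numbers):

```python
import math

G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2

def fall_time(height_m, g):
    """Time for a dust grain to fall from rest through height_m
    along a drag-free ballistic path (valid in vacuum)."""
    return math.sqrt(2.0 * height_m / g)

def ballistic_range(speed, angle_deg, g):
    """Horizontal distance a kicked grain travels before landing,
    launched from ground level with no air resistance."""
    a = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * a) / g

h = 0.5  # a grain kicked half a meter up
print(f"fall time on Earth: {fall_time(h, G_EARTH):.2f} s")
print(f"fall time on Moon:  {fall_time(h, G_MOON):.2f} s")

v, ang = 2.0, 45.0
print(f"range on Earth: {ballistic_range(v, ang, G_EARTH):.2f} m")
print(f"range on Moon:  {ballistic_range(v, ang, G_MOON):.2f} m")
```

Note that even on the moon the grain is back on the ground in under a second; slow-motion lingering clouds require air, which is exactly what the Apollo footage doesn't show.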

Finally, when the astronauts in the movie investigate the lunar crash site, they disturb dust which falls down through openings, producing a slick reveal. Trouble is, it was shot on a sound stage, and the dust accelerates under normal Earth gravity.

So there you go. NASA: real deal. Transformers: phony baloney. If you still can’t tell the difference, go back to third grade and spend more time in science class. In the meantime, DON’T VOTE, because if you aren’t scientifically literate, you aren’t any kind of literate.

The Eye

She was the old one once. I buried her long ago, and now another runs and wags the bushy tuft of tail and stares demonic—the eye.

I drag my bones and stoop and lift. She dodges and shakes and will not relent. I wait and lift and throw (a ball or stick) and burn down her candle a bit until, the pastime secured from view, she finds a twig or pine cone or frog and pesters for a while, and comes and lies across the deck and probes the breeze for hinted wonders while I write or read or swing.

She does not wonder, will I rise or stoop again? I wonder though–I know–the anguish she will feel one day when the demon eye is cast once more, and with expectant glee, on some new toy that still has voice: not me, but in a stranger’s hand.

Dog Eat Dog

I killed a mouse today. He made the mistake of letting me see him run into the garage, so I left a trap baited with peanut butter and he couldn’t resist. I use a live trap because it’s much more reliable and can’t flip over into attic insulation.

So he was alive; more tired than frightened. I killed him. A drop in the grand ocean of life’s ill winds. I don’t regret it. He and his kind spread disease and damage my roof, my siding, and my insulation. Nothing can restrain them, not machine cloth, not flashing, not properly trimmed foliage, not cats, dogs, nor owls. Still they come from flowerbeds dripping with irrigation, along fencelines and through pipes. They chew through ventilators and squeeze between boards. They creep through the attic and sometimes the walls.

He had to go. I am glad to be rid of him, and I’ll kill his relatives if they show their mousy feet ’round here. But I don’t hate him. He was only a beast, doing what nature has shaped him to do. He didn’t mean to be a pest. He had no manifesto, no agenda. He would not have gotten drunk and slashed my tires or my throat with a broken bottle. His kind will never bomb my cities to take my provisions or worse, to silence my voice.

He was beautiful.

The Descent of Man’s Diet

I was reading Mark Oppenheimer’s “Daddy Eats Dead Cows” on Slate, and he’s right about one thing. Teaching children to be vegetarian on ethical grounds just gives them a distorted view of ethics that may come back to haunt you. I’m sorry, animals are not people and I am evolved to eat them and, while I’d like the process to be as humane as practical, I’m just not going to lose any sleep over whether livestock finds its time on this earth fulfilling.

Nor is health a very good reason to eschew the beef, for although we eat way, way, way more meat than is good for us in the West, we need some meat and have a hard time getting a balanced diet without it. Do some research and find out how many kidney beans you need to replace the digestible iron you get from a single ground beef taco.
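
The kidney-bean arithmetic is easy to rough out. All the figures below are assumed ballpark values for illustration only (heme iron from meat absorbs several times better than non-heme iron from beans); look up real nutrient data before relying on any of them:

```python
# Assumed, approximate figures -- for illustration, not dietary advice.
BEEF_IRON_MG_PER_100G = 2.6  # cooked ground beef (assumed)
BEEF_ABSORPTION = 0.20       # heme iron absorbs well (assumed)
BEAN_IRON_MG_PER_100G = 2.2  # cooked kidney beans (assumed)
BEAN_ABSORPTION = 0.05       # non-heme iron absorbs poorly (assumed)

taco_beef_g = 60.0           # beef in one taco (assumed)
absorbed_from_taco = (taco_beef_g / 100) * BEEF_IRON_MG_PER_100G * BEEF_ABSORPTION

# Grams of cooked beans delivering the same absorbed iron
beans_needed_g = absorbed_from_taco / (BEAN_IRON_MG_PER_100G * BEAN_ABSORPTION) * 100

print(f"absorbed iron from taco: {absorbed_from_taco:.2f} mg")
print(f"cooked kidney beans needed for the same: {beans_needed_g:.0f} g")
```

With these assumptions, matching one modest taco's absorbed iron takes several times the taco's weight in beans, which is the point of the exercise.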

But there are very good reasons to be vegetarian, or at least to curtail meat consumption to a healthy level. Factory farms are major polluters, disease incubators, and energy hogs. Meat takes a dozen times as much energy to produce, harvest, and store as the equivalent veg, and that’s the only reason I need. Besides, I never had a steak that could compare with a nice red curry with tofu.

Wrong is Wrong

I am currently enrolled in a graduate certificate program which involves a formal study of grammar and mechanics. In this endeavor, I am reminded of a story my wife tells about her first week in school after moving from California to Louisiana. In her old school, students were expected to diagram sentences of ever-increasing complexity. After a week in Louisiana, my wife had figured out that the material was below her level, so she stayed after class one day to ask the teacher when they would start diagramming. He told her they would not be doing so, and when she protested, he cut her off. “These students can’t handle diagramming,” he said. The truth, of course, is that the students very well could have—should have—handled it and more, but they would have complained to their parents, who would have complained to the principal, who…well, you get the idea.
I think of this now because I was one of those kids who should have diagrammed sentences but was never taught how. I have since found the exercise instructive and helpful, and as I study advanced grammar, I sometimes regret that I was not afforded the opportunity to master this skill as a child. More to the point, I regret that I was never correctly taught grammar, punctuation, mechanics, and a host of related subjects. Oh, we had the lessons—every year for twelve years—and I made A’s and B’s. I also read so widely that I was able to get by quite well until I started preparing manuscripts and book proposals. But the fact is, the grammar I learned in school was superficial and, in many key respects, simply wrong. It was wrong because somewhere in the educational system in this country, it was decided that school children can only be taught lessons so watered down as to lose all meaning.
I am reminded of another story, one told by the Nobel Prize-winning physicist Richard Feynman about his experience serving on a textbook adoption committee. He agreed to serve and received a few elementary-grade science texts to review. In each, in the first chapter, he found the same words: “Energy is the ability to do work.” “No!” argued Feynman. “It isn’t.” They all had it wrong, and so, being a good citizen, he called them up to explain the three or four ways in which this apparently universal statement was mistaken. The reply was unanimous: “Calm down, Mr. Feynman. You can’t expect elementary children to understand college physics. We give them explanations they can understand.” His answer was direct: “How can they understand anything if what you teach them is WRONG?”
Perhaps not everyone need understand physics, but everyone ought to be able to use his own language correctly. Children do, of course, have to be taught at their level, but Feynman had a point. English grammar isn’t quantum mechanics. Diana Hacker managed to cover it pretty well in only five chapters of her 540-page “Writer’s Reference.” On reading it and similar texts, it quickly becomes clear that most of the sticking points that English-speaking adults stumble over—indeed, have come to see as intractable—have less to do with the subject matter and more to do with the quality of education.
You cannot punctuate a sentence correctly unless you know a subordinate clause when you see one, understand why a preposition is called a preposition, and know the difference between coordinate and cumulative adjectives. When you know that “who” is a subject and “whom” is an object, you don’t need silly rules about prepositions, or to remember lines from Hemingway. There are exceptions, but most of it makes pretty good sense once you understand all the detail and terminology that, in school, was replaced with rote memorization, rules that aren’t really rules, and alternative terms designed to keep us from having to learn the Latin and Greek roots upon which our language is built.
My recollection of college is of discovering that we could easily cover in one quarter the material presented over an entire year of high school. We had twelve years in which to master our language, but wasted much of it wading through twelve repetitions of the same coddling. In life, the only things I really regret are squandered learning opportunities. The only things that really anger me are those ruined through ineffective instruction. I can’t get too riled, though, for my schooling taught me one thing very well—something that has served me faithfully and that every child should learn as early as possible: no one is responsible for your education, ultimately, but you.