Why I have returned to vegetarianism

When I was a teenager, I was for some years a vegetarian. It was ill thought out - I was always fond of animals (thanking my lucky stars that Charlotte was able to pluck Wilbur from his baconward trajectory). I was also probably attracted to the lifestyle's aesthetic properties - stricter, more virtuous, less mainstream. I lapsed from vegetarianism for equally specious reasons. My boyfriend at the time, who was an animal-activist vegetarian, left me the day before Thanksgiving, and I consoled myself with turkey.

In the meantime, I've become something of a foodie and avid home cook. Mmmmm, meat!

But I have always felt that if I do anything immoral, anything I should regret, it is eating animals. I also think that if there's any widely held cultural value right now that is ripe for moral re-evaluation, it's eating animals. Recently, I was considering watching Food, Inc. I didn't want to see it because I was worried it would make me feel guilty, and I didn't want to feel guilted out of eating meat. Which, of course, meant that I really needed to take a look at what I was doing.

The cause of animal rights suffers from its adherents. PETA's antics attract attention - not converts. "Meat is Murder!" and "Love animals, don't eat them" are question-begging. It is obvious to me that a duck hunter who kills a few dozen ducks is not the moral equivalent of Timothy McVeigh. People who claim otherwise, who claim that animals are fully persons in the moral sense of the term, fail to convince me or the vast majority of people without some further argument.

The main moral defense of eating non-human animals is that they are not persons, since they lack certain crucial cognitive abilities - most notably the ability to respect rights. The main response to this from the animal rights folks is that we don't accord rights on a sliding scale based on intelligence. A whip-smart doctor does not deserve more moral consideration than a somewhat slower, more plainspoken farmer.

And we accord rights to humans with severe cognitive disabilities.

So I thought about all this quite seriously in the days after the birth of my newest little guy. For those who are not regular readers of the blog (a group who should live and be well to 120), Edmund (aka the Mound) has Cri du Chat Syndrome, and is likely severely cognitively disabled. It seems perfectly obvious to me that he deserves love and attention equal to what my typical kid gets, even if his treatment must necessarily differ. (As it happens, he is ridiculously easy to love, as he is always either smiling and delighted, apparently made entirely of baby fat and dimples and glee, or sleeping.) One might attribute my obligations to the special responsibilities of being a parent. But it also seems to me that he deserves respect from everyone else, too (warning to future parents of kids who tease the Mound - if you do not sufficiently intervene to prevent your child's bullying, I will do something we all will regret). Of course, I can't treat him exactly as I would a typical human. I can't grant him, say, autonomy over medical decisions. But nor can I mistreat him, and I am obligated to maximize his talents and be respectful of him.

Now the fact that we have moral obligations to Edmund does not necessarily mean we have the same obligations as we do to a similarly-abled animal (of course, Edmund has abilities that even the most able ape lacks, and lacks abilities that my cats have -- but you get the point, I hope). Species membership in itself does, as I have suggested, count for something morally.

However, it does not count for everything. I do think it's warranted to treat a fertilized human egg and a brain-damaged person in a permanent coma somewhat differently than I would a fully able adult (that doesn't mean I may treat them without respect). I also think we would be required to treat, say, a chimp who went through some neuron-growing process and had all the cognitive abilities of a typical human adult as a full person in the moral sense. Same with ET. Cognitive abilities in themselves, even without species membership, also seem to count morally.

So. I think neither cognitive abilities nor species membership are necessary for moral consideration, but each is sufficient. Sufficient, that is, for some moral consideration - not for full personhood. I don't know how much cognition warrants moral consideration (so, say, a thermostat doesn't count). It may well be that something like the ability to suffer, or having interests, puts you in the moral-consideration camp but is not enough for full personhood.

In light of the fact that I don't know how much cognition is required for moral consideration, and animals have quite a bit of cognition, I think I ought not to eat animals. (I don't promise I will never slip up and eat a bite or two of prosciutto at my in-laws' Christmas antipasto.) At the very least, it is absolutely clear to me that factory farming is an absolute moral outrage, one I want no part of perpetuating. And I made the leap in honor of Edmund, who, once upon a time, would have been thought unworthy of any moral consideration. I do not believe he is the moral equivalent of a non-human animal. But he has taught me to value more fully everything that a being can be without intellectual abilities. He has taught me that my love and treatment of him does not depend on what he can do. And he makes me want to treat all those who can feel and suffer, but can't do algebra, better.


  1. You would grant an artificial intelligence, equivalent to a typical human in cognitive ability, full moral personhood? Would it be ethical to continue development on such an AI absent permission from the AI itself? And would society be obligated to fund any project that produced such an AI in order to keep the machinery from being turned off? Inquiring minds want to know!

  2. Great post, epic. I've replied to this at embarrassing length over at my place. Cheers!

  3. If I only ate strict Vegetarians, what does that make me?

    "So I think neither cognitive abilities nor species membership are necessary for moral consideration, but each is sufficient."

    I hate to get into the weeds of morality; I think legality is a better approach. Legal reasoning is based on a host of factors, morality being just a portion of them. I accept the right of society, through the democratic process, to create the laws which deal with such issues. Your own moral considerations are your business; as long as you are willing to abide by the democratic process in lawmaking, I have no problems. If you don't want it to go to that extent, by say abolishing factory farming, then persuasion is your only other route.

    Not trying to sound crude, but I don't think you mean this: "But he has taught me to value more fully everything that a being can be without intellectual abilities."

    Certainly your son has intellectual abilities; they are just impaired. Someone with no intellectual abilities would be in a vegetative state. At that point we get into really difficult terrain.

    gj, we are a long, long way away from AI, so don't fret about turning off your mac at night just yet.


  4. One other thing: I just reread my posting and find it a bit rough. Years ago I worked at a camp for ARC. One of the campers was a S&P boy, born without eyes (one of the things I had to do was put salve into his sockets to keep them moist; on the plus side, I learned from that that I can handle pretty much anything) and who had a shunt to drain fluid from his brain. He had to be treated with great care.
    I remember one day I took him into the pool. He stood there with the sun on his face and the water lapping at his midriff, and he had a beautiful smile on his face; he literally radiated. Buddhist monks can train their whole lives to come to a moment that came naturally to him. He was completely of the moment and his place. As far as I know, the smartest chimp cannot do this. Seeing him, I sensed he possessed something beyond my understanding, so how could I for a moment not consider him as "worthy" as I am? It has been 30 years this summer since then, and that memory is as strong as yesterday. The smartest teacher I have had has not taught me as much or as vividly.


  5. Charo - I was loose with language. By "intellectual," I meant higher cognitive function - like language, abstract thought, reasoning, planning, etc. Edmund will have little to none of that. There was a point at which it appeared that Edmund would remain basically non-responsive to the environment, and that was indeed really difficult terrain.

    And, as I said, there are things that Edmund can do, even now, that the smartest ape can't. And that is a lovely story.

    Bone - thanks! Have some thoughts on yours that I'll post!

    GJ: Assuming enough relevant cognitive similarity (including emotions, the ability to suffer (not just detect pain), self-awareness, consciousness, having qualia)... then yes. Full moral personhood.

    We develop people without permission from the person itself. That said, I think we would need to tread carefully before creating someone to whom we would hold such obligations. It is hoped we could get a lot out of AI using substitutes of some kind for emotions and suffering.

    "And would society be obligated to fund any project that produced such an AI in order to keep the machinery from being turned off?" Those seem like two different questions. What must society pony up to start such a project? No obligations, there, except to tread carefully. Would we need to spend any amount of money to keep it from being unplugged? We don't spend all the money we could to save every human life now. Many lives could be saved with certain road improvements, say, that are too expensive for society to bear. Aren't there government cost/benefit analyses of the value of human life?

  6. We develop people without permission from the person itself.

    Yes, an excellent point. I'm not sure there can be substitutes for emotions and suffering, however. These may be inevitable products of consciousness and self-awareness.

    I'm only suggesting society would be obligated once a moral-person AI was activated. We wouldn't be obligated to build it, but once it is here, we might not be able to turn it off without an ethically compelling reason.

    And if I may suggest it, AI entities would very quickly 'evolve' well past human abilities to understand or control them. I believe such intelligences may be the next evolutionary phase of intelligence on this planet. We aren't close today -- a cockroach brain is around the best we can do at the moment -- but perhaps in our lifetime things will begin to snowball, and when it happens, it will go very quickly, IMVVHO.