Obligation to write


Forever as draft

I’ve come to realise that writing out of some kind of felt obligation is a terrible motivation for writing.

For the last few weeks, I’ve been stuck on a draft about empathy in autism. With various misconceptions about empathy on both sides, I felt there was some value in presenting a more balanced view that considered both perspectives as fairly as possible.

But… as I mentioned in the original post where I talked about my autism diagnosis, I don’t want to talk about autism. I don’t find talking about autism intrinsically valuable. This isn’t to say that talking about autism isn’t valuable, just that I don’t find myself very motivated when there are so many other things I could be writing about instead, mostly about science as I understand it.

My views on scientific issues usually align with the scientific consensus, but taking that for granted is an unhealthy way to think. Imagine if we started assigning uncontested authority to certain individuals solely based on their credentials or their post history! Everything they ever said would have to be accepted as indisputably true on the basis of that granted authority. This isn’t how science works, and the day science becomes like this is the day I assume many people will begin to protest, though for scientists that usually comes in the form of writing more books that people won’t read.

One common misconception many in academia seem to hold is that people are willing to read—be it articles, books, or even scientific papers. For someone who spends most of their life reading, it may not always be easy to fully appreciate a life that has mostly consisted of not reading. However, according to a recent Gallup poll, only about 17% of Americans do not read books (based on a survey asking whether an individual has read anything in the last year; the individuals themselves may change year after year, but the overall percentage remains relatively stable).

But this poll has some quirks. For one, audiobooks are considered a form of reading, which I find quite peculiar since we already have a word for that: “listening”. In addition, the question only asks individuals whether they have “read, either all or part of the way through” a book, so it is unclear how many people are just picking up books and giving up after the first chapter. It also doesn’t differentiate between fiction and non-fiction; for all we know, most people are reading for pleasure rather than for function: more to entertain than to educate, with titles like Diary of a Wimpy Kid, Demon Slayer, or Keeper of the Lost Cities.

Though there are some works of fiction, such as George Orwell’s 1984, that lean so far towards philosophy they might as well be philosophical non-fiction, most popular works of fiction, especially manga and comics (which are surprisingly also considered forms of reading in several surveys), hardly train one in critical reading and thinking skills, let alone the linguistic competence usually required to comfortably access denser academic texts.

Many people even accuse the relatively educated of using “big words”, as if their own shortcomings were something to be proud of. To clarify, I’m not referring to obscure words or specific terminology, such as prattling about “deoxyribonucleic acid” instead of just saying “DNA”, but rather the common vocabulary you find in most non-fiction books. Examples are difficult because competence varies among individuals, but I’ve seen people treat “hypothesis”, “fallacy”, and “ostensibly” as “big words” when they’re used virtually everywhere. The very concept of “big words”, as if there were a part of English vocabulary that’s somehow undesirable to be familiar with, is in itself a pernicious idea.

Woah that’s a derail

But why am I talking about this? This isn’t even remotely close to the topic of autism; it reads more like the precursor to a rant centred on my subjective definition of illiteracy, which cannot possibly be true because… well… they can technically read. Illiteracy is no longer as ubiquitous a problem as it was in the past.

The real reason, perhaps, is that I wanted to demonstrate—at least to myself—that I still had the motivation to write. At least, as long as I got to write about the things I was interested in, as opposed to the things I felt obligated to write about. The problem is twofold: not only do I have less motivation to write about things I’m less interested in, I also feel like whatever I do write has to be perfect, precisely because I’m writing it out of that sense of duty or obligation.

As a result, these posts end up spending months as drafts, sometimes never getting published at all, precisely because the motivation required to write a post that meets these unrealistic requirements is dramatically higher than the amount of interest I actually have in the topic, especially autism, which I really don’t care about enough to write on extensively.

A side note on autism

I don’t think autism exists in the common Vitalist sense, where “autism” is analogous to a form of “essence” that exists in or defines certain people. My definition of neurodiversity is just that: everybody is different. While it helps to draw (blurred) lines between the majority and the minority according to shared traits, or even biological reasons (there are neurological differences in those with neurodevelopmental disorders such as ADHD and ASD, though this is only one of many examples and only concerns inhibition response), the fact remains that there is no “essence” to these conditions, because the majority of people also vary regardless (this is an oversimplification, but bear with me for a bit).

An analogy would be checking products for defects. Say we’re building electronic circuits from a blueprint. During quality checks, we would compare the finished products with the original blueprint to make sure everything is in place. But this isn’t how it works with people, because there is no fixed blueprint. Though our DNA can be considered a kind of blueprint, differences in gene expression, for example, can greatly alter the “final product”. In addition, DNA changes with each generation via mutations (by design), making it difficult to pinpoint whether there really is a “correct” blueprint.

A blueprint that keeps changing will hardly be useful for determining if something is defective. Going back to our manufacturing analogy, we would be comparing the finished products with a blueprint that was completely different each time—there’d be nothing worth checking, because it would always be a mismatch!

How can we even define a “correct” human being? The idea is in and of itself absurd, given both the nature of our DNA and our evolutionary (genetic) history. Evolution works by a mechanism that may be summarised as: “well, this creature didn’t die before reproducing, whatever the reason for that may be”. By this definition, a “correct” human may plausibly be defined as “if it works, it works”. And that is a very broad definition.

Given the incomprehensible complexity of the world (Laplace’s demon hasn’t introduced itself to us yet), it would be very difficult to determine whether someone is excluded from the “if it works, it works” criterion of a “correct” human being, even if we restrict our range to a single generation. We could broaden our scope to define reproductive success as having grandchildren, or even great-grandchildren, but this just throws us off a cliff (or at least a really steep slippery slope that might as well be a cliff) and thus isn’t very useful.

As some people like to quip: “it ain’t over till it’s over”. As long as one is still alive, there’s always hope. And if one does die, then, well, life doesn’t apply to them anymore, does it? The nature of the loss completely erases the consequences of losing; only people who are alive are afraid of death—those who are already dead do not fear death. Optimism isn’t always about denying reality—one can be realistic and optimistic at the same time.

But, going back to what I think about my own diagnosis of autism, I don’t think it really changes anything—this is just how I am, different from others. Given what I know about human variation, it’s difficult to pigeonhole it as either “good” or “bad”, since doing so implies knowing how all these complex variables interact with one another in an already complex world.

This doesn’t apply to more severely impairing forms of autism (hence the word “impair”, making this a bit of a tautology), and I still subscribe to the belief that being different is, in and of itself, a form of disability, but it would be too hasty to conclude that my autism in particular is necessarily a bad thing, if it even is “real” (again, not that the condition isn’t real, but that there is no Vitalistic “essence” that defines autism; it’s still part of the spectrum of natural human variation).

I define what autism and its implications are for me; autism doesn’t define me. Many advocates think their autism defines them. I wholeheartedly and perhaps sometimes even bitterly disagree. Autism should not define an individual as a whole; it can be part of one’s identity, but it should not be one’s identity.

As neurologically inaccurate as the expression “having autism” is, I find “autistic person” no more helpful when it comes to personal identity and just living as an individual—as just somebody on this planet who falls within our contemporary definition of our “species” (which has changed repeatedly throughout the course of evolution). Like the sorites paradox, it may be unclear when we will eventually decide to redefine what it means to be Homo sapiens, but that’s a problem that won’t realistically concern any of us anytime soon. Or at least, this website will have long been shut down by then, perhaps a hundred or even a thousand millennia later.

That is, assuming we don’t experience a mass extinction event along the way, such as nuclear war, climate change, or even both at the same time. This list is non-exhaustive. Perhaps our future generations will become so coddled that spontaneous suicide becomes socially acceptable after hearing just one insult or “politically incorrect” phrase, and people go around effectively murdering others by calling them fat or something.

Closing

I really like rants. I also really like coming up with ridiculous stories and testing the boundaries of the logically plausible.

Forcing myself to write out of a sense of obligation was never going to work from the start. The fact that this post was written in just a day, instead of being spellchecked or factchecked into oblivion, proves it. Any grammatical errors or typos are unintentional, yet at the same time fully intentional: they show how little effort I’ve put into this, compared to obsessing over it for months and re-reading each passage 20 times to ensure it is absolutely flawless.

I don’t have ads; I’m not making any money off any readers of this website, so there’s nothing stopping me from writing whatever I want in whatever style I feel like on any given day, as long as it’s within reason. Indeed, reason is very important. Science misinformation (and disinformation) is rife. I do not wish to become yet another contributor to the widespread belief in pseudoscience and disregard of critical thinking.

Though it is important to be exposed to bad research and low-quality information in order to learn that we can’t and shouldn’t trust anything based on source or reputation alone, it is equally important to eventually reveal why good studies are good and why bad studies are bad, after letting readers try to guess independently. After all, if you take an exam and never receive the results, what’s the point? You wouldn’t know where your mistakes (if any) were, and you wouldn’t know why you got certain questions correct. In modern high school systems, the latter would be equivalent to praising a student for parroting a specific answer well, rather than for their ability to think critically and come up with their own answers; better still if they can spot an error and criticise the question itself, going against and questioning authority figures in a way that may have positive consequences later, as opposed to the blind trust today’s youth tend to place in irrelevant or questionable figures.