Friday, November 30, 2012

You can be stubborn as a poem
Refusing to yield and budge
To anyone but your clairvoyant admirer
Who would look past the black ink
And glean out meaning from the white of the page
Like a poem, you'd wait an eternity
Cradling the unsaid in your lap
And I
In my impatient seduction
Will take of you what I can
To transmute into a poem
One as stubborn as you

Wednesday, November 21, 2012

How effortlessly emotions manipulate reason to serve their ends, and how laboriously reason rebels and dies a martyr's death!

Saturday, November 17, 2012

In Queering the Quran, Michael Muhammad Knight discusses some attempts at reconciliation between Islam and homosexuality. With regards to Quranic interpretation he writes:

"As with anti-queer readers of the Bible, anti-queer readers of the Qur’an mention the fate of Lot’s people as proof that God hates same-sex desire. However, there are also readers of the Qur’an who attempt to produce new meanings from the episode. Among progressive Muslims, an argument exists that the story of Lot does not discuss men who want consensual sex with other men, but rather men who intend to commit rape. I have to confess that this argument strikes me as a bit of a reach, but I do appreciate the effort, if only so that I can say that alternative readings do exist.

Unfortunately, when I read the Qur’an, I find it mocking men who want to have sex with men. This is not what I want to see, and I hope to someday find an interpretation that will change this for me. I appreciate the need for queer Muslims to find new meanings in the words, and I'm on their side, but the project remains a matter of making the Qur’an say something other than what it appears to be obviously saying."

I won't delve into whether Quran condemns homosexuality or not; rather, I want to draw attention to a broader aspect of what such progressive attempts at interpreting Quran entail.

There is a fine but important difference between:
1) Quran does not condemn homosexuality.
2) Quran supports homosexuality as a morally valid religious practice.

The truth of the first statement does not by itself imply the truth of the second statement. The former may be the case, but there is simply no scriptural evidence or support for the latter. (Here I might add that Quran and Hadith unambiguously disapprove of sexual relations outside marriage. So even if homosexuality is allowed within Islam, it would have to be within the fold of marriage. In this regard the historical support is abysmal, as I am not aware of a single homosexual marriage that took place in the days of the Prophet and early Islam.)

If the first statement alone is true, then it shows that Quran and homosexuality can be reconciled. However, it would also imply that the source of the moral validity of homosexuality is extra-Quranic. This, of course, is linked to the larger philosophical debate regarding the source of morality and the foundation of human ethics. Secular philosophy proves at least this much: that moral reasoning is not dependent on locating moral injunctions in the scriptures. In fact, moral reasoning and scriptural injunctions can often be in conflict.

If moral reasoning is not dependent on scripture, what then is the validity of scripture as a source of morality? This, I believe, is a larger theological question that most progressive Muslims simply do not seem to be aware of. Are there two sources of morality, one scriptural and the other extra-scriptural? Does secular moral reasoning lack something that scriptural morality offers? If so, what does it lack? And if it doesn't lack anything, what need is there for scriptural morality?

"Derrida’s argument was that Western thought from Plato to Rousseau to Lévi-Strauss had been hopelessly entangled in the illusion that language might provide us with access to a reality beyond language, beyond metaphor: an unmediated experience of truth and being which he called ‘presence’. Even Heidegger, a radical critic of metaphysics, had failed to escape its snares. This illusion, according to Derrida, was the corollary of a long history of ‘logocentrism’: a privileging of the spoken word as the repository of ‘presence’, at the expense of writing, which had been denigrated as a ‘dangerous supplement’, alienated from the voice, secondary, parasitic, even deceitful.

Derrida wanted not only to liberate writing from the ‘repression’ of speech, but to demonstrate that speech itself was a form of writing, a way of referring to things that aren’t there. If logocentrism was a ‘metaphysics of presence’, what he proposed was a poetics of absence – a philosophical echo of Mallarmé’s remark that what defines ‘rose’ as a word is ‘l’absence de toute rose’. Derrida, a passionate reader of Mallarmé, made a similar argument about language by drawing on – and radicalising – Saussure’s Course in General Linguistics. Saussure had argued that words acquire their meaning through their difference from other words – specifically from the differences between phonemes – rather than from their referents. Derrida went a step further, arguing that meaning itself is subject to what he deliberately misspelled as différance, a pun on the verb différer, which means both ‘to differ’ and ‘to defer’. (He spelled différance with an ‘a’ rather than an ‘e’ because it could only be read, not heard: a mark of the primacy of writing over speech.) The meaning of what we say, or write (a distinction without a difference, for Derrida), is always ‘undecidable’; it hardly takes shape before it dissolves again in an endless process of differing and deferring."

Excerpt from Not in the Mood by Adam Shatz at London Review of Books.

Clothing can be a concealment offering the promise of nudity. It harnesses aesthetic appeal in the service of sexual appeal, such that sartorial decoration becomes an extension of physical beauty. It calls on the imagination for aid by denying direct vision of bare actuality. Conceal enough, however, and the promise is suffocated.

This play of concealment and possibility is a function of the context. To see a possibility where it is not offered is to risk being guilty of sexual objectification.

Wednesday, November 14, 2012

K. K. Shahid's Don't Blame The Taliban series of articles/blogs makes for pretty interesting reading. I criticized one particular aspect of Part I recently. Parts II and III are relatively more sophisticated in their arguments, and I think some of them are quite valid, especially the ones directed at the apologists of Islam who continue to maintain that Islam is 'inherently peaceful' and dismiss any criticism on the basis of context. I believe K. K. Shahid is quite justified in being irked by such apologists and their tactics, and his articles are a desperate attempt of sorts to deflate this balloon of apologism. It is a balloon that I wish to deflate as well. I will present my own criticism of Islamic apologists separately, because it is a topic that needs addressing.

What bothers me about K. K. Shahid's articles is his 'defence' of Taliban as the true followers of Islam. This is how he describes them in Part III:

"The Taliban understandably aren’t particularly fond of Muslim apologists, as asserted by Ehsanullah Ehsan, the Pakistani Taliban spokesman, following the Malala attack. They probably don’t understand how they – the Taliban: the students of Islam, who do nothing but study Islam and its scriptures and endeavor to personify them – are misapprehending the ideology, while most Muslim apologists – who quite often don’t even border on following the basic Islamic tenets – seem to understand Islam better than them."

"The Taliban are merely striving to propagate the message of the Quran and of the prophet how it’s said to have expanded in the 7th century AD. Islam orders the true Muslims to wage war against those who spread Fitna, which is described in (2:217) as disbelieving Allah and not following his path. This basically means that the Muslims are ordered to ensure that every part of the world follows the Islamic way of life, and use violence – if need be – to ascertain Islamic supremacy. The Taliban understand the meaning and act accordingly, the Islamic scholars throughout the past 1400 years comprehend it and elaborate it accordingly in their tafsirs and literature, but the apologists are hell bent on claiming, and perhaps believing, what they want the teachings to articulate – not what they actually proclaim – at the cost of multitudinous lives."

[My italics]

Whatever my own feuds with Muslims, I genuinely don't think that this is the case. The Taliban are not well-intentioned students of Islam who are trying to follow the true spirit of their religion with pure hearts, but are led to commit horrific violence just because the religion they happen to believe in is intrinsically violent. If only we could show them the errors of their religion, they would return to being the peace-loving people that they are! The Taliban are students only in name. They are a Frankenstein army of artificially bred religious warriors. Their version of Islam is not traditional Islam; it is a malignant form in which the checks and balances and the emphasis on due legal process are markedly absent. It is a vigilante form of Islam in which the notion of fitna and enemies has been expanded to include even civilian Muslims who differ from them in how to practice their religion. By way of analogy, they are a cancer. Just as a clump of cells in the body loses all its inhibitions and begins to proliferate abnormally, invading the normal tissues around it, so the Taliban have pumped up the inherent growth potential of their religion, discarded all its inhibitions, and are now waging war against other Muslims. Yes, blame the religion, if you wish, for possessing that inherent growth potential, but do blame the Taliban, for they are surely not without blame.

Conor Gearty talks about how real scholarship is gained not from Twitter and blogs, but from old-fashioned hard work. He speaks with reference to public law, but I think the point is applicable generally:

"Every working day brings some excitement in public law. The Twitter/Blog mind embraces the daily frenzy, scans the raw material to hand and produces an instant judgment – as though its owner were some kind of perpetually available law specialist on Radio 5 Live. The old fashioned hard work – quiet; library-based; thoughtful – that made the writer/speaker an expert in the first place gradually drifts off the daily agenda. At first because of time constraints and then – well – because it’s boring, like returning to decaff coffee after an espresso. Twitter/Blog erodes our confidence in the deeper stuff without which we would never have become experts in the first place."

Saturday, November 10, 2012

Herman Cappelen: I had Jonathan Barnes as tutor for most of my courses. Barnes was an important influence – though I remember asking him whether it was worth going on with philosophy professionally and he said, ‘Only if there’s absolutely nothing else you can see yourself doing and you think you can do it better than anyone else’. I worked hard to ignore that advice or put severe restrictions on the domain of ‘anyone.’ That said, I think he was right. However much one loves doing philosophy, entering into the profession is for most lovers of philosophy probably a bad choice (which is just an instance of a more general point: making a profession out of something one loves because one loves it, is probably a bad choice).

My op-ed published in The News on 7 November 2012:

Outrage and Reality

Awais Aftab

Dr Peter Sandman is a risk communication specialist and a prominent international consultant with regards to outrage and crisis management. He is well known for his conceptual formula ‘Risk = Hazard + Outrage’, and he strives to educate the public on the relationship between hazard and outrage.

Hazard refers to how much harm a risk actually does; outrage denotes how upset people get about it. The most striking feature of the relationship is the abysmally low correlation between the two: numerically, about 0.2. In simpler words, the risks that actually harm people and the risks that upset people are largely unrelated to each other. If you know what harms people, you cannot tell whether people are upset about it. If people are upset about something, you know almost nothing about how dangerous it really is. This low correlation applies to all sorts of harms, be they medical, ecological or economic.
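To make the statistic concrete, here is a minimal sketch in Python of what such a correlation coefficient measures. The scores below are entirely hypothetical, invented purely for illustration; they are not Sandman's data.

```python
# Hypothetical 0-10 scores for ten risks: how harmful each actually is
# (hazard) versus how upset people are about it (outrage).
hazard = [9, 7, 8, 2, 1, 6, 3, 5, 4, 2]
outrage = [2, 1, 9, 8, 3, 4, 9, 2, 6, 7]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

r = pearson(hazard, outrage)  # a value near zero means the two are unrelated
```

A coefficient of 1 would mean hazard perfectly predicts outrage; at around 0.2, knowing one score tells you almost nothing about the other, which is precisely Sandman's point.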

In contrast to actual hazard, outrage is associated with a perceived hazard, that is, what people think to be detrimental. Sandman discusses the interesting question of what causes what. Do people get upset about something because they think it is dangerous, or does something appear hazardous to them because they are upset? The reality is that it is a cyclical process; the arrow of causality goes both ways, but it is also true that one of these arrows is strong and the other is weak. Surprisingly, or perhaps unsurprisingly, the stronger direction of causality is from outrage to perceived hazard. People tend to believe something is hazardous because they’re upset about it.

Understanding this dynamic is important if we wish to alter the state of affairs. If you try to correct the hazard perception, the outrage will be minimally reduced. People will respond with denial or they will alter the hazard perception in such a way as to counter your correction, and they will remain upset. The best way to reduce perceived hazard in cases where actual hazard is low is to try to calm people down. This is outrage management. On the other hand, if the perceived hazard is low and the actual hazard is high, people need to be made more upset to take the matter seriously.

I am frequently reminded of Dr Sandman’s work, as our country is very fond of sudden outrage, and this becomes all the more pertinent as we approach the upcoming elections. Media and journalists play the eager role of amplifying this vicious cycle of indignation and hazard misperception, while we lack public specialists who might bring some semblance of sanity and balance to it.

To take a prominent example, the actual hazard of blasphemy is practically non-existent, yet the outrage it evokes is volcanic. The actual hazard of factory safety conditions is alarmingly high, but the outrage is proportionately very low. Recently, a Pakistani blogger who goes by the name of ‘Sky is Neela’ analysed the available terrorism data and showed that the Karachi terrorists have killed approximately the same number of people as have been killed by suicide bombings and drone attacks combined, yet the associated public and political outrage is highly unequal.

With regards to the coming elections, both the masses and the intellectuals appear to be very concerned with the ideological leanings of the political parties in deciding whom to vote for. To my mind, however, the actual top priorities should be policies related to health, education, energy and economics, yet hardly anyone ever talks about them.

The political parties and the media have vested interests in this disproportion between hazard and outrage, because they utilise the outrage to garner more political support and to sell their newspapers, respectively. Risk communication can be made effective only if the media, intellectuals and politicians act selflessly to determine the actual dangers that threaten our country and communicate them to the public, while at the same time managing the public's outrage in matters where it is not warranted. However, given our nation's lifelong love for hysterical theatrics in matters of politics, the prospects are despairing.

The writer is a doctor based in Lahore. Email:; Twitter: @awaisaftab

Tuesday, November 6, 2012

"What all this boils down to is that science has never given up on the Whiggish view of history that historians have long since abandoned: a triumphant voyage out of the dark ages of ignorance and superstition into the light of reason. In this view, all we really care about in historical scientists is which of their ideas survived, not how they thought and why. All the stuff that was of its time — Kepler’s cosmic harmonies, Newton’s alchemy and eschatology, Faraday’s religiosity — must then become a curious aberration: ‘Isn’t it strange that such great minds held such weird ideas?’ It isn’t strange at all if you truly care about history."

excerpt from the article Science fictions by Philip Ball.

Monday, November 5, 2012

In the November/December 2012 issue of Orion magazine, there is a wonderfully informative and delightfully engaging article by Charles C. Mann, State of the Species, in which he discusses the history and fate of Homo sapiens as a biological species. Please do read the entire article. Below I am going to present an outline of the salient features, frequently relying on the words of the author. Consider them as abridged excerpts.

Homo sapiens emerged on Earth around 200,000 years ago. Those humans were anatomically modern, but not behaviorally modern: they possessed no language, no clothing, no art, no religion, and only the simplest of tools. Those early humans had so little capacity for innovation that for the first 100,000 years of their existence we find no evidence of any significant cultural or social change. Furthermore, humans were confined geographically to a small area in East Africa (and possibly another area in South Africa).

Then, mysteriously, 50,000 years later, humans were drawing cave art, using specialized tools, wearing clothing, practicing religious worship and rituals, and communicating via language. How did this sudden transformation come about in what is, geologically, a finger snap?

We are not really sure what happened. It is possible that favorable genetic mutations swept through the species, leading to an increase in mental capacities and abstract thinking. It may have come about by interbreeding with Neanderthals. It may simply be a result of the invention of language.

It may also have been related to a significant geological event that happened 75,000 years ago: a huge volcano exploded on the island of Sumatra, creating Lake Toba, the world’s biggest crater lake, and ejected the equivalent of as much as 3,000 cubic kilometers of rock. Dust hid the sun for as much as a decade, plunging the earth into a years-long winter accompanied by widespread drought. This predictably threatened the survival of countless species, including Homo sapiens: the number of humans shrank dramatically, perhaps to a few thousand people. This bottleneck is evident from the remarkable genetic uniformity of humans. This bottleneck may also have helped alter the genetic composition of Homo sapiens, because in small populations favorable mutations can spread very rapidly, and uncommon variants can become dominant.

How it happened is unclear, but around the time of Toba behaviorally modern humans emerged and spread out into the world; human footprints appeared in Australia within as few as 10,000 years.

What does the growth curve of a typical species look like? Take a species of bacteria. Its population would be kept steady and in check by the size of its habitat, by the limited food supply and by competing organisms. However, put a small population in a petri dish and the behavior changes radically. There is no competition, and the food supply and habitat appear endless. At first the population grows slowly; then it hits an inflection point, and there is a frenzy of exponential growth. The petri dish is swamped. The rapid growth continues until it hits the second inflection point: the edge of the petri dish. As the food supply is exhausted and waste materials accumulate, the bacteria begin to die, and ultimately the population falls to zero.
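The curve described above is, up to the edge of the dish, the classic S-shaped logistic curve. Here is a minimal sketch in Python, assuming a simple discrete logistic model with arbitrary parameter values (the die-off phase after the second inflection point is not modeled):

```python
def logistic_growth(p0, rate, capacity, steps):
    """Discrete logistic growth: a slow start, a frenzy of near-exponential
    growth after the first inflection point, then flattening as the
    population approaches the dish's carrying capacity."""
    populations = [p0]
    for _ in range(steps):
        p = populations[-1]
        populations.append(p + rate * p * (1 - p / capacity))
    return populations

curve = logistic_growth(p0=1.0, rate=0.5, capacity=1000.0, steps=40)
```

With these values the population creeps along for the first dozen steps, explodes through the middle of the run, and then stalls just below the capacity of 1000: the two inflection points of the petri-dish story.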

From a biological perspective, Homo sapiens looks like one of these briefly fortunate species: for modern humans, the post-Toba world presented itself like a petri dish. Around 10,000 years ago, with the invention of agriculture, we hit our first inflection point. There has been no stopping us since. In just the past two hundred years, the human population has exploded from one billion to seven billion. In 2000, the chemist Paul Crutzen gave a name to our time: the “Anthropocene,” the era in which Homo sapiens became a force operating on a planetary scale.

Like a typical biological species, it can be expected that our
"growth will continue at a delirious speed until we hit the second inflection point. At that time we will have exhausted the resources of the global petri dish, or effectively made the atmosphere toxic with our carbon-dioxide waste, or both. After that, human life will be, briefly, a Hobbesian nightmare, the living overwhelmed by the dead. When the king falls, so do his minions; it is possible that our fall might also take down most mammals and many plants. Possibly sooner, quite likely later, in this scenario, the earth will again be a choir of bacteria, fungi, and insects, as it has been through most of its history."
Is a different outcome possible? It may be. Unlike other species, we are characterized by remarkable behavioral plasticity: the ability to change our behavior not just at an individual level but also at a social one. This plasticity has conferred on us great powers of adaptation, instrumental to our biological expansion. Can we as a species constrain our own growth before we hit the second inflection point? To do so would be biologically unprecedented. Is our behavior plastic enough to take on such a challenge? The possibility exists, but a possibility is no predictor of the outcome. Time will tell whether Homo sapiens ends up any different from bacteria in a petri dish.

Religion sets out to reform brutes and ends up becoming a tool in their hands.

Sunday, November 4, 2012

Reading a post on the blog Zunn left me thinking about the relationship between language, thoughts and emotions. In a beautiful description she states:
"It is a Faustian bargain that humanity has to make - without words, there would be a chaos of thoughts and feelings where everything might appear to be in a flux. But with words, we reduce the vastness of experience into manageable cubby holes for ourselves and others to fit oh-so-snugly into."
I have tried to gather my own thoughts on the cognitive relationship between them in this post. Since I am not well-versed in linguistics or cognitive science, what is presented below are my own speculations and impressions, built on my limited knowledge; they may or may not be valid from a scientific point of view.

Can emotions be experienced without being translated into thoughts? Yes, I believe so. While it seems obvious enough to me from introspection, it is also supported by the fact that different regions of the brain are involved in the production and processing of emotions and of conscious thoughts.

Can we have thoughts without a language? Yes, I believe so. Language cannot exist without thought, but it appears to me that thought can exist without language. After all, higher animals and babies do have thoughts, even though they don't have a language. Similarly, the faculty of language can be destroyed in the brain by stroke or other injuries, without completely destroying the faculty of thought. 

We can think without language, and we can think in a language. However, it seems to me that while the faculty of language is functioning, it completely dominates the subjective conscious experience, such that we cannot will ourselves to think without a language.

Both thoughts-in-a-language and thoughts-without-language can be non-verbal. Non-verbal modes include the visual (said to be predominant in autistic individuals), the tactile, the musical, and the mathematical. Perhaps there are non-verbal modes of thought-without-language for which there is no corresponding language, such as, I imagine, the olfactory (in animals).

Language is verbal, that is clear enough, but can thoughts without language be verbal as well? It is a matter of some controversy, but perhaps it is possible that thoughts without language exist in a sort of proto-language. (Some call it 'mentalese').

The scope of thoughts-without-language is considerably limited compared to thoughts-in-a-language: limited not just in complexity and abstraction, but also in the extent of communication possible. This would seem tautological once we consider the definition of a language: a system of complex communication. The point at which tactile thoughts-without-language become tactile thoughts-in-a-language is set by a certain level of complexity.

The following diagram shows the interaction of emotions and thoughts. Emotions can be translated into thoughts, which may be thoughts-without-language or thoughts-in-a-language. I do not think thoughts get converted into emotions, but thoughts can definitely induce and modify emotions (hence the leftward arrow).

In the majority of cases we do not experience difficulty in converting emotions into thoughts, but occasionally emotions arise that cannot be so converted, and if forced, the conversion takes place only with a distortion of the emotion. (Like a key that doesn't fit a lock: you chip away the edges of the key until it slides in.) One may call it a failure or distortion of communication between the limbic system and the neocortex: we experience an emotion and struggle to express it even to ourselves.

It is also possible that an emotion may get converted into a thought, but the thought fails to be converted into language. Such a thought may be unconscious or subconscious, and may find other expressions, such as in the symbolism of dreams.

Coming back to the post on Zunn, the author thinks that in cases where emotions cannot be converted into language, we adopt two strategies:

1) We forcefully distort the emotions into fitting the language and what cannot be distorted is suppressed. The advantage is that we can now communicate. The disadvantage is that the real emotions are no longer there.

2) We hold on to the emotions, and try to communicate partially what we can. The advantage is that we are still in touch with our emotional reality. The disadvantage is that the communication has been compromised.

Please experience it in the poetic words of the author:
"[W]ords are an ideal outlet, because they immediately offer a common ground. [...] [But people subscribing to this sort of strategy] also have to give up on the range of disparate and unrelated emotions they might feel, because they need to whittle these all down to one coherent idea. So yes, feelings need to get suppressed, or better yet, processed, because even if they ring true, to continuously unleash them would apparently lead to what is perceived through such a lens as chaos.
The other extreme in this approach of dealing with communication is to hold on to the entire range of emotions, feelings, ideas, thoughts *insert synonymies aplenty* because they offer a wider array of information, and can be understood at a raw, unprocessed, unfiltered form.
Unfortunately, communication of this sort ends up relying only partially on words. Which means that two people communicating using a whole array of levels of communication are rarely going to end up having a coherent, shared platform, or context. [...]
[B]ecause they hold onto the primacy of the feelings over the importance of words, they keep searching for words that never quite fit. And so even though they might have extremely intense connections in the brief moments that the ranges of their emotions find multiple overlaps, they struggle to form extremely loyal bonds that become more important than anything else, because they just can’t have that continued shared context."

She goes on to link these strategies to the differences in gender communication styles. I do not agree with her on the application of these ideas to gender, but her views on emotions, thoughts and language were expressed with such vividness and style that I was forced to ponder on their cognitive background.

Saturday, November 3, 2012

“Our lives are not our own. From womb to tomb, we are bound to others, past and present, and by each crime and every kindness, we birth our future.” Cloud Atlas

The fictional setting of Cloud Atlas allows you to suspend skepticism just long enough for your mind to drink in 'a tapestry sewn from universality of human feeling', and your emotions to be swept by mystical tidings: 'we’re not just bodies, but also souls; the choices we make in one life affect who we become in another; we’re all connected to each other and to something bigger than ourselves.' The end result is an inspired state of mind that leaves you yearning for “the great Perhaps”. 'Belief in the great Perhaps suffuses Cloud Atlas the novel; the misstep of Cloud Atlas the film is to try to turn Perhaps into Certainty.'

It's an artistic vision of a world where we do not know the possible far-reaching consequences of our actions, which may transcend our individual lives and ripple across centuries to form a grander pattern. The limitedness of a singular life covets a moral justification, and other than the bland idea of heaven and hell, a Cloud Atlas sort of karmic recurrence is of some possible consolation.

Skepticism kicks in soon after, bringing with it the despair of a world that has no plan in it; but if you are lucky, the film will have done its job of stretching your imagination enough that your whole existence quivers with the unheard symphonies of infinite possibilities.


Aati: In your own quiet, understated way, you're brilliant not just in philosophy and science, which is the niche where people like to conveniently stash you, maybe because there are so few others they can stick with those labels, but also as an observer, as a humorist, as a poet. People like to reduce you to a mind, if you understand what I'm saying, because your mind baffles them the most and because your mind they can easily distance themselves from fully understanding. But you're not just a thinker, and your feelings are brilliant and lucid, and so people like to pretend they haven't seen them perhaps, and underplay them because they have feelings too and because feelings make them your equal and you theirs and feelings bring you too close for them to pass off their lack of understanding as merely natural. And sometimes I get the feeling you go right along with them too, like a boat carried by the tide.

'The Thinker' is your stereotype, it is your niche and all yours; people fawn over you for it, and when they have to describe you, they use it as a convenient shorthand to get out of actually describing a human being, because that is difficult. And I think you've experienced this so long and so pervasively/from every person in your life, that you just go along with it, because hey it does speak to a very vital part of your core, right, even if not all of it? You accept it, resigned in a way, and you give them more of what you want and in return they see more of what they want to see, and so you have an image. An image, like a hologram, not entirely fake because of the photons and atoms that make it up, but not entirely true either because it is an image but its object is not there, except in the eyes and minds of its beholders. You're a brain. And a heart, a secret heart.

Thursday, November 1, 2012

'One of the most famous stories of H. G. Wells, “The Country of the Blind” (1904), depicts a society, enclosed in an isolated valley amid forbidding mountains, in which a strange and persistent epidemic has rendered its members blind from birth. Their whole culture is reshaped around this difference: their notion of beauty depends on the feel rather than the look of a face; no windows adorn their houses; they work at night, when it is cool, and sleep during the day, when it is hot. A mountain climber named Nunez stumbles upon this community and hopes that he will rule over it: “In the Country of the Blind the One-Eyed Man is King,” he repeats to himself. Yet he comes to find that his ability to see is not an asset but a burden. The houses are pitch-black inside, and he loses fights to local warriors who possess extraordinary senses of touch and hearing. The blind live with no knowledge of the sense of sight, and no need for it. They consider Nunez’s eyes to be diseased, and mock his love for a beautiful woman whose face feels unattractive to them. When he finally fails to defeat them, exhausted and beaten, he gives himself up. They ask him if he still thinks he can see: “No,” he replies, “That was folly. The word means nothing — less than nothing!” They enslave him because of his apparently subhuman disability. But when they propose to remove his eyes to make him “normal,” he realizes the beauty of the mountains, the snow, the trees, the lines in the rocks, and the crispness of the sky — and he climbs a mountain, attempting to escape.'

Aaron Rothstein summarizes this story at the beginning of his article on neurodiversity.

[hat-tip: 3 Quarks Daily]


Copyright 2013 A Myth in Creation.
