7 ways to make your assignment different from others


Nowadays, obtaining high grades in your academics requires writing assignments that are unique and distinct from everyone else's. One of the biggest adjustments students have to make from high school to college is learning how to draft a college assignment that stands out.

The assignment topic is the same for all students, so it becomes hard to score above average. Here are certain points that show how presenting the same content in a distinctive way makes your assignment more rewarding and helps it stand apart from the others.

Here are some useful tips that can help you produce high-quality assignment writing and also assist you in standing out from others.

  • An impressive cover page: – A good cover page always attracts the reader's attention and encourages them to read further, so make your cover page attractive and impressive. Employ your own ideas and creativity, but not in excess; keep the format simple and readable. Overusing decorative flourishes is not a good option: if you want the reader to read on, keep your cover page simple and clear.
  • Organised index page: – A systematic index page makes it simple for the reader to flip through the topics. As a student, it also conveys the discipline of your academic work.
  • Be clear and concise: – While writing a college assignment, make sure you use the correct words, and take care with spelling and grammatical mistakes. Avoid obsolete and invented words. To write briefly, cut unnecessary repetition and redundancy.
  • Keep paragraphs short: – While writing any assignment, keep each paragraph short and precise; a long paragraph quickly bores the reader, who will never read further. To hold the reader's attention, keep every paragraph short and clear.
  • Highlighting: – Generally, highlighting words and phrases is the finest way to gain the reader's attention. Whenever we read any kind of article, we are drawn to the highlighted words. So keep highlighting the important terms and clauses.
  • Suitable conclusion: – Last but not least is the closure of your assignment. A convincing conclusion leaves a lasting impression on the reader. This includes:
  1. Briefly restating the key points
  2. Conveying the final message to the reader by summing up the overall discussion

The Founding Fathers v. The Climate Change Skeptics

When claims from Europe accused British America of being inferior on account of its colder weather, Thomas Jefferson and his fellow Founding Fathers responded with patriotic zeal that their settlement was actually causing the climate to warm. Raphael Calel explores how, in contrast to today’s common association of the U.S. with climate change skepticism, it was a very different story in the 18th century.

Portrait of Thomas Jefferson by Rembrandt Peale (1805). Note the furs – Source.

The United States has in recent years become a stronghold for climate change skepticism, especially since the country’s declaration in 2001 that it would not participate in the Kyoto Protocol. Nevertheless, though it is a well-documented fact, it might surprise you to learn that the country’s founders, far from sharing the United States’ recent ambivalence with respect to the modern scientific theory of man-made climate change, were keen observers of climatic trends and might even be counted among the first climate change advocates.

From the start, the project to colonize North America had proceeded on the understanding that climate followed latitude; so dependent was climate on the angle of the sun to the earth’s surface, it was believed, that the word ‘climate’ was defined in terms of parallels of latitude. New England was expected to be as mild as England, and Virginia as hot as Italy and Spain. Surprised by harsh conditions in the New World, however, a great number of the early settlers did not outlast their first winter in the colonies. Many of the survivors returned to Europe, and in fact, the majority of 17th-century colonies in North America were abandoned.

Jamestown colonists endured a severe winter in 1607-1608, black and white copy of a painting by Sidney King for the Colonial National Historical Park – Source.

A view formed in Europe that the New World was inferior to the Old. In particular, medical lore still held that climate lay behind the characteristic balance of the Hippocratic humors – it explained why Spaniards were temperamental and Englishmen reserved – and it was believed that the climate of the colonies caused physical and mental degeneration. Swedish explorer Pehr Kalm, who had travelled to North America on a mission from Carl Linnaeus, observed in his travel diary that the climate of the New World caused life – plants and animals, including humans – to possess less stamina, stature, and longevity than in Europe. The respected French naturalist Georges-Louis Leclerc, Comte de Buffon, explained in his encyclopaedia of natural history that “all animals of the New World were much smaller than those of the Old. This great diminution in size, whatever may be the cause, is a primary kind of degeneration.” He speculated that the difference in climate might be the cause:

It may not be impossible, then, without inverting the order of nature, that all the animals of the new world originated from the same stock as those of the old; that having been afterwards separated by immense seas or impassable lands, they, in course of time, underwent all the effects of a climate which was new to them, and which must also have had its qualities changed by the very causes which produced its separation; and that they, in consequence, became not only inferior in size, but different in nature.

Dutch philosopher Cornelius de Pauw believed that “The Europeans who pass into America degenerate, as do the animals; a proof that the climate is unfavorable to the improvement of either man or animal.” Scientific and artistic genius, according to a prominent theory put forth by the French intellectual Jean-Baptiste Dubos, only flourished in suitable climates – climate accounted for the marvels of Ancient Greece, the Roman Empire, the Italian Renaissance, and, thanks to rising temperatures on the European continent that Dubos thought he observed, the Enlightenment. French writer Guillaume Raynal agreed, and made a point of saying that “America has not yet produced one good poet, one able mathematician, one man of genius in a single art or a single science.”

In this edition of Cornelius de Pauw’s Recherches Philosophiques sur les Américains, the usual ornamentation preceding the chapter on the American climate is sardonically replaced with this apparently frozen landscape – Source.

In the New World, refuting such theories became a matter of patriotism. In the rousing conclusion to one of the Federalist Papers, Alexander Hamilton wrote:

Men admired as profound philosophers have, in direct terms, attributed to her inhabitants a physical superiority, and have gravely asserted that all animals, and with them the human species, degenerate in America–that even dogs cease to bark after having breathed awhile in our atmosphere. Facts have too long supported these arrogant pretensions of the Europeans. It belongs to us to vindicate the honor of the human race, and to teach that assuming brother, moderation. Union will enable us to do it. Disunion will add another victim to his triumphs. Let Americans disdain to be the instruments of European greatness!

Building on the theories of John Evelyn, John Woodward, Jean-Baptiste Dubos, and David Hume – who all believed that the clearing and cultivation of land in Europe accounted for the temperate climate that had enabled the Enlightenment – the colonists set about arguing that their settlement was causing a gradual increase in temperatures and improvement of the flora and fauna of North America.

Hugh Williamson, American politician and a signatory of the Constitutional Convention, believed that “within the last forty or fifty years there has been a very great observable change of climate, that our winters are not so intensely cold, nor our summers so disagreeably warm as they have been,” a fact he attributed to the clearing of forests. “The change of climate which has taken place in North America, has been a matter of constant observation and experience,” wrote Harvard professor Samuel Williams. Benjamin Franklin wrote of the “common Opinion, that the Winters in America are grown milder.” Measurements were as yet inadequate to the task of proving this, he said, but he found the proposed mechanism (i.e. clearing and cultivation) sufficiently persuasive that, even if the winters were not milder already, he could not “but think that in time they may be so.” Benjamin Rush, physician and signatory of the Declaration of Independence, speculated that, if cultivation kept pace with clearing of new lands, climate change might even reduce the incidence of fevers and disease.

Thomas Jefferson was especially eager to rebut Buffon and the proponents of the theory of climatic degeneracy. He expended substantial efforts to this effect in his Notes on the State of Virginia (1785), with page after page of animal measurements showing that the American animals were not inferior to their European counterparts. He also had help from James Madison, who shared his own measurements with Jefferson, urging him to use them in his arguments against Buffon.

Thomas Jefferson compared the weight of European and American animals, in order to disprove Buffon’s claims that the animals of the New World were smaller degenerate forms of their Old World counterparts. Notice that he includes the Mammoth at the top of his list. – Source.

Their impassioned advocacy would occasionally lead them astray, though. Samuel Williams claimed that winter temperatures in Boston and eastern Massachusetts had risen by 10-12˚F in the previous century and a half, a climatic transformation too rapid to be believed perhaps. Jefferson, convinced that the American climate could sustain large animals too, insisted to a friend that “The Indians of America say [the Mammoth] still exists very far North in our continent.” Anxious to disprove claims of degeneracy, he wrote a letter to the American Philosophical Society in which he openly speculated that elephants, lions, giant ground sloths, and mammoths still lived in the interior of the continent. Later, believing he was on the verge of proving the skeptical Europeans wrong, he wrote a letter to the French naturalist Bernard Germain de Lacépède boasting that “we are now actually sending off a small party to explore the Missouri to it’s source,” referring to Lewis’ and Clark’s expedition. “It is not improbable that this voyage of discovery will procure us further information of the Mammoth, & of the Megatherium,” Jefferson continued, concluding “that there are symptoms of [the Megalonyx’s] late and present existence.”

The Founders did not settle for mere advocacy, though. Keen to present as strong a case for climate change as possible, and moderated by their scientific temperament perhaps, they wanted more and better evidence. To decide the issue of lions and mammoths, Jefferson instructed Lewis and Clark to pay special attention to “the animals of the country generally, & especially those not known in the U.S. the remains and accounts of any which may [be] deemed rare or extinct.” Although they didn’t find mammoths, they discovered many animals and plants previously unknown to science.

On the question of whether the winters were getting milder, Franklin wrote to Ezra Stiles, president of Yale University, encouraging him to make “a regular and steady Course of Observations on a Number of Winters in the different Parts of the Country you mention, to obtain full Satisfaction on the Point.” Madison made regular observations at his estate, which he assiduously entered into his meteorological journals. Jefferson, too, kept meticulous records, and encouraged his friends and colleagues to submit their measurements to the American Philosophical Society, “and the work should be repeated once or twice in a century, to show the effect of clearing and culture towards the changes of climate.” Jefferson himself made significant contributions to the development of modern meteorology. In 1778, for instance, Jefferson and the Reverend James Madison, president of The College of William & Mary and cousin of the fourth President of the United States, made the first simultaneous meteorological measurements. Jefferson promoted methodological standardization and expansion of geographical coverage, and was an early proponent of establishing a national meteorological service.

Detail from a page of Thomas Jefferson’s “Weather Record (1776-1818)”, in which he meticulously and somewhat obsessively notes down the temperature on every day of the year. In this detail, from the year 1777, we see evidence of a particularly cold spring in Virginia, with frost on the ground as late as early April – Source.

One need hardly belabor the point that the early climate change advocates were wrong. Modern climate reconstructions show there was a brief warming period in New England during the late 1700s, but Jefferson’s and Williams’ measurements predate any actual man-made climate change. Their theories were pre-scientific in the specific sense that they predate a scientific understanding of the greenhouse effect. It is true that the French scientist Edme Mariotte had, as early as 1681, noticed the greenhouse effect, but it was not until the 1760s and 1770s that the first systematic measurements were made, and it would still be another century before anyone imagined that human activities might influence atmospheric composition to such an extent that the climate might be modified by this mechanism. Their pre-scientific theories also led them to believe that a changing climate would necessarily be beneficial, whereas today we are much more aware of the dangers of climate change.

Yet one should not belittle the efforts of these early climate change advocates. Fighting back against the European ‘degeneracy theory’ was necessitated by pride as much as by a concern that these ideas might negatively affect immigration and trade from Europe. Their search for evidence, moreover, resulted in substantial contributions to zoology, and was instrumental to the foundation of modern meteorology and climatology. One might speculate, even, that a belief in degeneracy contributed to England’s refusal to afford its North American colonial subjects representation in parliament, and so helped spark the American revolution. In this case, one might construe the Founders’ climate change advocacy partly as an attempt to facilitate a peaceful resolution of their grievances with the Crown. Indeed, so politically important were their advocacy efforts thought at the time that Senator Sam Mitchell of New York, in his eulogy at Thomas Jefferson’s funeral, raised them to the same level as the American revolution itself.

Portrait of Thomas Jefferson by his close friend, soldier and part-time painter, Tadeusz Kościuszko (1746-1817). Jefferson’s turn to the hot-coloured break in the clouds perhaps not entirely devoid of symbolism – Source.

It is an interesting historical footnote that, during a visit to London, Benjamin Franklin met and became friends with Horace Benedict de Saussure, the Swiss scientist credited with the first systematic measurements of the greenhouse effect. Franklin exchanged letters with Saussure, and encouraged his experiments on electricity. So impressed by Saussure’s work was Jefferson that he would later write to George Washington to suggest recruiting Saussure to a professorship at the University of Virginia, which was then under construction.

Far from a stronghold of climate change skepticism, as the United States is sometimes seen today, the country’s founders were vocal proponents of early theories of man-made climate change. They wrote extensively in favor of the theory that settlement was improving the continent’s climate, and their efforts helped to lay the foundation of modern meteorology. Much of the climate change skepticism of the day, on the other hand, was based on the second- and third-hand accounts of travelers, and the skeptics rarely made efforts to further develop the science. In addition, one cannot ignore its political convenience for many in Europe; for instance, Cornelius de Pauw was even hired by the King of Prussia to discourage Prussian citizens from emigrating or investing their capital in the New World.

Even if the parallels between the past and present are too obvious to spell out, they can be of some use to us today. While modern climate change advocates and skeptics have become experts at pointing to each other’s errors, we are usually the last to notice our own faults. An episode in our history that bears such strong resemblance to our present provides a rare opportunity to examine ourselves as if through the eyes of another. Today’s climate change advocates may recognize in themselves some of the overzealousness of the Founding Fathers, and therefore better guard against potential fallacies. Skeptics may recognize in themselves the often anti-scientific spirit of the degeneracy-theorists, and hopefully make greater efforts to engage constructively in the scientific enterprise today. One can hope, at least.

Dr. Raphael Calel is a Ciriacy-Wantrup Postdoctoral Fellow at UC Berkeley, and a Visiting Fellow at the London School of Economics. His research has looked at the history of climate change politics, the effects of current policies, and how climate forecasts can be used to inform future action. More information and links to his other writings are available from his personal website.

Links To Works

  • Buffon’s Natural history, containing a theory of the earth, a general history of man, of the brute creation, and of vegetables, minerals, &c. &c. (1797), by Georges Louis Leclerc, Comte de Buffon.
  • A general history of the Americans, of their customs, manners, and colours: an history of the Patagonians, of the Blafards, and white negroes : history of Peru : an history of the manners, customs, &c. of the Chinese and Egyptians (1806), by Cornelius de Pauw.
  • Notes on the state of Virginia (1801), by Thomas Jefferson.
  • See also hyperlinks embedded in the article itself.

The Dissembler



The following Arabic fable is found in al-Qazwini’s classic Wondrous Creatures.

It is said that once a pious man heard about a community that worshipped a tree instead of Allah. He picked up an axe and went off, intent on chopping the tree down. Satan met him on the road in the form of an old man and said to him: “Where are you going, and what do you want?”

The man replied: “I want to cut down the tree that people are worshipping instead of Allah.”

Satan said: “What does this have to do with you? You left your own worship to involve yourself with this. If you cut down their tree, they will find something else to worship.”

The man said: “No, I must cut it down.”

Satan said: “I forbid you to cut it down!”

The man wrestled Satan to the ground and sat on his chest. Satan said: “Let me go so I can speak to you.” When the man did so, Satan said to him: “Allah has not placed this burden upon you. Had He wished, He has many servants on the Earth He could order to cut it down.”

The man said: “No, I must cut it down.”

Satan then asked: “Would you let there be something between me and you that is better than what you want?”

The man said: “I’m listening.”

Satan said: “You are a poor man. Maybe you would like to provide something in charity for your brothers and your neighbours and become independent of the people.”

The man said: “Certainly.”

Satan said: “Desist from what you are doing, and I will place two gold coins under your pillow every night. You can use them to support your family and spend something in charity. This is better for you and for the people than cutting down that tree.”

The man thought about it and said: “What you say is true. Swear to me on what you say.” So Satan swore an oath to him, and the man returned to his personal worship. When he woke up the next morning, he found two gold coins under his pillow. He took them. The same thing happened on the second day.

When he woke up on the third morning, he did not find the coins under his pillow. He became angry, picked up his axe, and went out intent on chopping down the tree. Satan met him in the form of the old man he assumed before. He asked: “Where are you headed?”

The man replied: “I am going to cut down that tree.”

Satan said: “You do not have the ability to do so.”

The man then reached out to strike Satan, but Satan struck him down instead and said: “If you do not desist, I will slaughter you.”

The man cried: “Let me go and tell me how you overpowered me.”

Satan said: “When your anger was for Allah’s sake, Allah submitted me to you and brought me down before you. But now, you are angry for your own sake on account of worldly desires, so I am able to subdue you.”

This is, of course, just a story, but like any good fable, it has an important moral.

When we do things for the sake of the people and abstain from them for their sake, we cease doing so for Allah’s sake. This is why the Prophet said: “The thing I fear most for you is the lesser polytheism… which is showing off.” [Musnad Ahmad]

This does not mean it is wrong for a believer to enjoy the praise of others. Showing off is only where the intention behind the person’s action is for other than Allah, so that if the person was not being seen by others, he or she would not act. There are many ways to show off:

1. Showing off in one’s belief. This is hypocrisy, where a person makes a public show of faith while concealing their real disbelief.

2. Showing off in one’s appearance. This is to make oneself look like someone who exerts a lot of effort in worship. This is like someone who cultivates a prostration mark on the forehead to make it seem like they pray a lot, or someone who cultivates dry lips to make it look like they are fasting. It also includes bowing the head in false humility while walking or keeping dishevelled hair to appear ascetic.

3. Showing off through what one says. Prophet Muhammad (peace be upon him) said: “Whoever calls attention to himself, Allah will call attention to how he really is.” [Sahīh al-Bukhari and Sahīh Muslim]

This includes quoting wise sayings, exhorting people to righteousness, and quoting hadith to bolster one’s “pious” reputation. It also includes moving one’s lips to give the appearance of being engaged in God’s remembrance.

4. Showing off through one’s deeds. This includes spending an extra-long time standing, bowing or prostrating in prayer when others are watching.

5. Showing off through association. This is like making sure to be seen with prominent scholars and pious people in order to be associated with them in the people’s eyes.

In all cases, the crux of the matter is not the action itself, but what motivates the action in the first place. The motivation for showing off stems from either a desire for praise, an aversion for people’s low opinion, or a covetousness for what other people possess. If something of this nature comes into a person’s heart while the person is already engaged in an act of piety, then it does not nullify the blessings of that act.

Some people abstain from doing good deeds fearing that they will fall into the sin of showing off. This is a mistake that leads people to lose out on a lot of virtue and many blessings. As long as your original intention is for Allah, then you should go forward with the good deed you intend. Do not let fear of showing off keep you from doing something good.

Al-Fudayl b. `Iyād said: “Engaging in acts of worship for the people’s sake is polytheism. Abandoning acts of worship for the people’s sake is showing off. Sincerity is for Allah to spare you from both concerns.”

A student at an early stage in his studies rushes ahead and starts issuing Islamic legal verdicts, heads a study circle, and walks about with a regal demeanor. He has a hard time admitting when he does not know something. He speaks as if he is a leading authority, making statements like:

“In my considered opinion…”

“What has become evident to me…”

“I have come to the overwhelming conclusion that…”

“What a person’s heart feels secure with is…”

At the same time, he is harsh and long-winded when he comes across someone else’s mistake, though he cannot tolerate their pointing out any error of his.

Another person gives preference to carrying out public duties at the expense of his own individual obligations. He might even spend excessive time engaged in things that the community might only rarely need, simply because they make him look important.

A third person is overjoyed with any chance he has to argue and debate with people. In the heat of an argument, he is quick to challenge people to invoke Allah’s curse upon whichever one of them is in the wrong. Often, his point of contention is no more than hair-splitting and his only purpose is to publicly expose his opponent as wrong or misguided. It infuriates him when he knows his opponent has made a good point. On the other hand, if his opponent concedes a point to him, he says: “Now you have come to my point of view and my way of thinking” as if he has had a monopoly on the truth all along and decides who can partake of it with him.

In almost all cases, what the Prophet said about debates holds true: “Base motives are obeyed, passions are followed, and each holds fast to his own opinion.” [Sunan Abī Dāwūd and Sunan al-Tirmidhī]

A wise man was once asked: “Why is it that the words of the Pious Predecessors are more beneficial than what we say?”

The wise man replied: “The words of the Pious Predecessors are better than our words, because they spoke to promote Islam, please Allah, and guide people to salvation, whereas we speak to promote ourselves, please the people, and achieve worldly success.”

Some people like to dig up strange opinions and resurrect old arguments to make it seem like they are resuscitating a Sunnah that has been forgotten or neglected. Our scholars had a different attitude about such things. They warned against strange and unusual opinions that show themselves in their very strangeness to be suspect.

It is possible, nevertheless, to go to the opposite extreme. When a well-known custom or tradition is called into question, some people come forward as self-proclaimed defenders of tradition, hoping to earn a prominent position in society by doing so, even if they know that tradition is misguided, does not serve the public welfare, or is contrary to the teachings of Islam.

Another mode of conduct in this vein is to seek after a lot of followers, pit people against one another, erect obstacles to reconciliation, and stake loyalty on adherence to a bunch of secondary controversial opinions.

Al-Dhahabī said: “You can be an oppressor and believe that you are the one who is oppressed. You can be consuming unlawful wealth and fancy yourself to be abstentious. You can be a sinner and think you are righteous and just. You can be seeking religious knowledge for worldly benefit but see yourself as doing so for Allah’s sake.” [Siyar A`lām al-Nubalā’]

Abū Dāwūd, speaking about his Sunan, told Imam Ahmad: “This is something I have done for Allah’s sake.”

Ahmad said to him: “As for saying it is for Allah’s sake, that is a serious claim. Rather say: ‘This is something my heart has been made to incline towards, so I did it’.”

It is also related as something Ahmad admonished himself with. Indeed, identifying your inner motives is one of the subtlest but most crucial ways of being honest.

By Shaykh Salman Al Odah

What are tachyons?


Tachyons are hypothetical particles resulting from what physicists call a thought experiment. Back in the 1960s, some physicists wondered what would happen if matter could travel faster than the speed of light, something that is supposed to be impossible according to the theory of relativity. So these particles may or may not exist, because they have not yet been proven or disproven by real experiment. What people have done is apply existing formulas to the unique properties of tachyons (like imaginary mass!). What comes out are particles that go faster when they lose energy, with a MINIMUM velocity of the speed of light and a maximum velocity of infinity! Hope that helps, Ben; theoretical physics is a weird place and is not too far off from philosophy.

In Einstein’s theory of relativity, the “mass” of an object increases as it goes faster, becoming infinite at the speed of light, so it takes an infinite amount of energy (remember E=mc^2 means energy and mass are the same) to reach the speed of light. This is why special relativity says we cannot go faster than the speed of light. So what we talk about in physics is the mass of the object when it is sitting still, the “rest mass.” If an object has a positive rest mass, it goes slower than the speed of light; if it is like light with a zero rest mass, it moves at light speed. What we call a tachyon is a particle (a fundamental particle, like an electron) that has an _imaginary_ rest mass.
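To make the divergence concrete, here is a minimal numerical sketch (Python, in units where c = 1; the function name is just for illustration) of the total energy E = m/√(1 − (v/c)²) blowing up as v approaches c:

```python
import math

def energy(m, v, c=1.0):
    """Total relativistic energy E = m*c**2 / sqrt(1 - (v/c)**2)."""
    return m * c**2 / math.sqrt(1.0 - (v / c) ** 2)

m = 1.0  # rest mass, in units where c = 1
for v in (0.5, 0.9, 0.99, 0.999):
    # E grows without bound as v -> c
    print(f"v = {v}: E = {energy(m, v):.3f}")
```

At v = 0 this reduces to the rest energy; each step closer to c costs more energy, with no finite amount ever sufficing to reach the speed of light.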

There was a young lady named Bright,
                Whose speed was far faster than light.
                She went out one day,
                In a relative way,
                And returned the previous night!

                        — Reginald Buller

Draw a graph, with momentum (p) on the x-axis, and energy (E) on the y-axis.  Then draw the “light cone”, two lines with the equations E = ±p.  This divides our 1+1 dimensional space-time into two regions.  Above and below are the “timelike” quadrants, and to the left and right are the “spacelike” quadrants.

Now the fundamental fact of relativity is that

E² − p² = m²

where E is an object’s energy, p is its momentum, and m is its rest mass, which we’ll just call ‘mass’.  In case you’re wondering, we are working in units where c=1.  For any non-zero value of m, this is a hyperbola with branches in the timelike regions.  It passes through the point (p,E) = (0,m), where the particle is at rest.  Any particle with mass m is constrained to move on the upper branch of this hyperbola.  (Otherwise, it is “off shell”, a term you hear in association with virtual particles — but that’s another topic.)  For massless particles, E² = p², and the particle moves on the light-cone.

These two cases are given the names tardyon (or bradyon in more modern usage) and luxon, for “slow particle” and “light particle”.  Tachyon is the name given to the supposed “fast particle” which would move with v > c. Tachyons were first introduced into physics by Gerald Feinberg, in his seminal paper “On the possibility of faster-than-light particles” [Phys. Rev. 159, 1089–1105 (1967)].

Now another familiar relativistic equation is

E = m[1 − (v/c)²]^(−½).

Tachyons have v > c.  This means that E is imaginary!  Well, what if we take the rest mass m, and take it to be imaginary?  Then E is real, and E² − p² = m² < 0.  Or, p² − E² = M², where M is real.  This is a hyperbola with branches in the spacelike region of spacetime.  The energy and momentum of a tachyon must satisfy this relation.

You can now deduce many interesting properties of tachyons.  For example, they accelerate (v goes up) if they lose energy (E goes down).  Furthermore, a zero-energy tachyon is “transcendent”, or moves infinitely fast.  This has profound consequences.  For example, let’s say that there were electrically charged tachyons.  Since they would move faster than the speed of light in the vacuum, they should produce Cherenkov radiation.  This would lower their energy, causing them to accelerate more!  In other words, charged tachyons would probably lead to a runaway reaction releasing an arbitrarily large amount of energy.  This suggests that coming up with a sensible theory of anything except free (noninteracting) tachyons is likely to be difficult.  Heuristically, the problem is that we can get spontaneous creation of tachyon-antitachyon pairs, then do a runaway reaction, making the vacuum unstable.  To treat this precisely requires quantum field theory, which gets complicated.  It is not easy to summarize results here.  However, one reasonably modern reference is Tachyons, Monopoles, and Related Topics, E. Recami, ed. (North-Holland, Amsterdam, 1978).
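These properties follow directly from the formulas above. The sketch below (Python, units where c = 1, with M = 1 an arbitrary illustrative value) parametrizes a tachyon by its speed v > 1 and shows that p² − E² stays fixed at M² while E falls toward zero as v grows without bound:

```python
import math

def tachyon_E_p(M, v):
    """Energy and momentum of a tachyon with |rest mass| M at speed v > 1 (c = 1)."""
    g = 1.0 / math.sqrt(v * v - 1.0)
    return M * g, M * v * g  # E = M/sqrt(v^2 - 1), p = M*v/sqrt(v^2 - 1)

M = 1.0
for v in (1.1, 2.0, 10.0, 100.0):
    E, p = tachyon_E_p(M, v)
    # p**2 - E**2 == M**2 in every case, and E shrinks as v grows
    print(f"v = {v}: E = {E:.4f}, p = {p:.4f}, p^2 - E^2 = {p*p - E*E:.6f}")
```

The zero-energy, “transcendent” tachyon corresponds to the limit v → ∞, where E → 0 while p → M.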

However, tachyons are not entirely invisible.  You can imagine that you might produce them in some exotic nuclear reaction.  If they are charged, you could “see” them by detecting the Cherenkov light they produce as they speed away faster and faster.  Such experiments have been done but, so far, no tachyons have been found.  Even neutral tachyons can scatter off normal matter with experimentally observable consequences.  Again, no such tachyons have been found.

How about using tachyons to transmit information faster than the speed of light, in violation of Special Relativity?  It’s worth noting that when one considers the relativistic quantum mechanics of tachyons, the question of whether they “really” go faster than the speed of light becomes much more touchy!  In this framework, tachyons are waves that satisfy a wave equation.  Let’s treat free tachyons of spin zero, for simplicity.  We’ll set c = 1 to keep things less messy.  The wavefunction of a single such tachyon can be expected to satisfy the usual equation for spin-zero particles, the Klein-Gordon equation:

(□ + m²)φ = 0

where □ is the D’Alembertian, which in 3+1 dimensions is just

□ = ∂²/∂t² − ∂²/∂x² − ∂²/∂y² − ∂²/∂z².

The difference with tachyons is that m² is negative, and so m is imaginary.

To simplify the math a bit, let’s work in 1+1 dimensions with co-ordinates x and t, so that

□ = ∂²/∂t² − ∂²/∂x².

Everything we’ll say generalizes to the real-world 3+1-dimensional case.  Now, regardless of m, any solution is a linear combination, or superposition, of solutions of the form

φ(t,x) = exp(−iEt + ipx)

where E² − p² = m².  When m² is negative there are two essentially different cases.  Either | p | ≥ | E |, in which case E is real and we get solutions that look like waves whose crests move along at the rate | p/E | ≥ 1, i.e., no slower than the speed of light.  Or | p | < | E |, in which case E is imaginary and we get solutions that look like waves that amplify exponentially as time passes!
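
These two cases can be verified directly from the dispersion relation.  A small sketch (an illustration, not from the FAQ; the choice m² = −1 is arbitrary) in units where c = 1:

```python
import cmath

m2 = -1.0   # tachyonic mass-squared, m^2 < 0

def dispersion(p, m2):
    """Solve E^2 = p^2 + m^2 for E; an imaginary E signals a growing mode."""
    return cmath.sqrt(p**2 + m2)

# |p| >= |m|: E is real, and the crests move at the rate |p/E| >= 1.
E = dispersion(2.0, m2)
print(E, abs(2.0 / E))       # E = sqrt(3); crest speed ~ 1.155 > 1

# |p| < |m|: E is imaginary, so exp(-iEt) = exp(|E| t) amplifies with time.
print(dispersion(0.5, m2))   # purely imaginary
```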

We can decide as we please whether or not we want to consider the second type of solution.  They seem weird, but then the whole business is weird, after all.

(1)  If we do permit the second type of solution, we can solve the Klein-Gordon equation with any reasonable initial data — that is, any reasonable values of φ and its first time derivative at t = 0.  (For the precise definition of “reasonable”, consult your local mathematician.)  This is typical of wave equations.  And, also typical of wave equations, we can prove the following thing: if the solution φ and its time derivative are zero outside the interval [−L, L] when t = 0, they will be zero outside the interval [−L− | t |, L + | t |] at any time t.  In other words, localized disturbances do not spread with speed faster than the speed of light!  This seems to go against our notion that tachyons move faster than the speed of light, but it’s a mathematical fact, known as “unit propagation velocity”.
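
Unit propagation velocity can even be seen in a direct numerical experiment.  The sketch below (our illustration; the grid spacing, bump profile, and m² = −1 are arbitrary choices) integrates the 1+1-dimensional Klein-Gordon equation by leapfrog finite differences, with dt = dx so that the numerical domain of dependence coincides exactly with the light cone:

```python
import numpy as np

# phi_tt = phi_xx - m^2 phi with m^2 < 0: the growing modes blow up inside
# the light cone, yet the support still spreads no faster than c = 1.
m2 = -1.0
dx = dt = 0.05
x = np.arange(-20, 20 + dx, dx)
L = 1.0

# Initial data: a smooth bump supported in [-L, L], zero initial velocity.
phi = np.where(np.abs(x) < L, np.cos(np.pi * x / (2 * L))**2, 0.0)
phi_prev = phi.copy()     # encodes phi_t = 0 at t = 0

t = 0.0
while t < 5.0:
    lap = np.zeros_like(phi)
    lap[1:-1] = phi[2:] - 2 * phi[1:-1] + phi[:-2]
    phi_next = 2 * phi - phi_prev + (dt / dx)**2 * lap - dt**2 * m2 * phi
    phi_prev, phi = phi, phi_next
    t += dt

# Inside the cone the m^2 < 0 modes have amplified exponentially...
print(np.abs(phi).max())
# ...yet outside |x| > L + t the field is still identically zero.
outside = np.abs(x) > L + t + 2 * dx
print(np.abs(phi[outside]).max())
```

The first printed value is large (the exponentially amplifying solutions are at work), while the second is exactly zero: localized initial data have not escaped the light cone.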

(2)  If we don’t permit the second sort of solution, we can’t solve the Klein-Gordon equation for all reasonable initial data, but only for initial data whose Fourier transforms vanish in the interval [−| m |, | m |].  By the Paley-Wiener theorem this has an odd consequence: it becomes impossible to solve the equation for initial data that vanish outside some interval [−L, L]!  In other words, we can no longer “localize” our tachyon in any bounded region in the first place, so it becomes impossible to decide whether or not there is “unit propagation velocity” in the precise sense of part (1).  Of course, the crests of the waves exp(−iEt + ipx) move faster than the speed of light, but these waves were never localized in the first place!

The bottom line is that you can’t use tachyons to send information faster than the speed of light from one place to another.  Doing so would require creating a message encoded some way in a localized tachyon field, and sending it off at superluminal speed toward the intended receiver.  But as we have seen you can’t have it both ways: localized tachyon disturbances are subluminal and superluminal disturbances are nonlocal.

The futility of Islamophobia

What Islam is going through right now is not at all different from the process of reformation that Christianity underwent during the 16th century. Those who are so impatient with reform would do well to read that history.


The First Amendment to the Constitution of the US is unique in the world. Besides guaranteeing religious freedom and barring Congress from respecting an establishment of religion, this amendment provides for an unfettered right to freedom of speech and press. Ideally this is how it should be everywhere but, unfortunately, we live in a less than ideal world. The advantages of having an unfettered right to freedom of speech and press are too numerous to list. Primarily, though, it creates a society where true scholarship and bona fide research into even the most taboo of topics is possible. This leads to a marketplace of ideas that creates a national intellectual economy so essential to a progressive society. Yet it can also mean that the same freedom is abused. The right to speak is often conflated with the right to offend. Offensive speech, it follows, is protected speech. However, where hate speech leads to hate violence it becomes fighting words.

Now consider the ongoing Islamophobia debate in the US. It is one of the most divisive and polarising debates in that country. Leading this debate are people like Sam Harris and Ayaan Hirsi Ali, who vehemently insist that extremists and terrorists are intellectually honest when committing crimes against humanity in the name of Islam. Their target, without exception, is not extremists or terrorists but moderate Muslims, who they contend are intellectually dishonest, naïve or both. Furthermore, they contend that the only way to be a good Muslim is to be a Muslim in name only. A corollary is that they believe moderate Muslims shield extremists because they make it impossible to criticise Islam’s true doctrine (ironically laying claim to be the true experts of Islamic doctrine themselves). This, they say, is not Islamophobia but legitimate criticism of an ideology that is inherently violent. Needless to say, this abrasive rhetoric is counterproductive to any of the stated objectives of this camp. They can claim as many times as they want that their target is a set of ideas and not Muslim people, but the Chapel Hill shootings showed that this rhetoric also translates into hate violence.

Perhaps the biggest problem with their penchant for painting the diversity that is Islam with one broad brush is that they forget one fundamental truth: there are close to 1.5 billion people on this Earth who identify themselves as Muslims, and modernisation of the Muslim narrative can only happen if you show that their lifeblood, which is their faith, is completely compatible with such modernity. What Islam is going through right now is not at all different from the process of reformation that Christianity underwent during the 16th century. Those who are so impatient with reform would do well to read that history. Indeed, the world of Islam is reforming at a faster rate because of the times we live in. More and more women are part of the workforce. At the very basic level there is a realisation that religious freedom is a good thing, and civil society in many Muslim countries, especially Pakistan, is very active in speaking out for civil liberties, women’s rights and other issues germane to the modern age. Yes, there are fundamentalists and fanatics creating problems but then what do you think Martin Luther, the father of the Christian Reformation, was? Lutherans and Catholics both burnt each other at the stake during the 16th century.

The discourse, rightly labelled as Islamophobia, does not aid or speed up the process of Muslim reformation. It hinders it, especially since Sam Harris and company go after not the extremists in the Muslim world but the moderates. When painted into a corner, even a moderate Muslim has to make a choice: to give up his identity and his way of life or to resist. If human history is any indication, nine out of 10 moderates will resist. And there are many achievements in Islamic history that moderates are rightly proud of. The civilisation that Islam ushered in produced Avicenna, Averroes, Rhazes, Al Khawarzimi and countless other men of science and philosophy, who have enriched the human consciousness. Averroes, for example, was precisely the kind of person who would be called an “Islamic apologist” by the Islamophobes of today. He had attempted to reconcile Aristotelian ideas with the Islamic faith. Yet it is Averroes who features prominently in the artwork of the Renaissance period. His influence over western thought cannot be overstated.

The main objection raised against Islam by its critics today pertains to the Islamic legal system. The criticism holds water because Islamic jurisprudence has remained static since the 12th century. There is no denying of course that the major preoccupation of Islam has been the law. However, it must be said that compared to the legal systems that existed at the time, i.e. from 650 AD to 1250 AD, Islamic jurisprudence was far more progressive. Then it all came to a halt around the time the great Muslim seat of Islamic learning, Baghdad, the capital of knowledge in the world, was burnt down by Hulagu Khan. Islamic law was ossified and limited to dogma. What the critics of Islamic jurisprudence today attack is a corpse rather than a living system. A legal system has to constantly evolve. After all, how does one explain the West’s evolution from a society that burnt women at the stake to one that is subject to the highest principles of human conduct and civil rights?

What is certain, however, is that one cannot hope to reform the Islamic world until and unless one enlists Islam and its doctrine in one’s aid. That is just the way it is. Therefore, one really questions whether piling humiliation on moderate Muslims and insulting them, instead of welcoming them with open arms, serves the forward march of humanity.

By Yasser Latif Hamdani who is a lawyer based in Lahore and the author of the book Mr Jinnah: Myth and Reality. He can be contacted via twitter @therealylh and through his email address yasser.hamdani@gmail.com

Travel Lets You See Yourself Differently


Travel is a source of inspiration and deep insights. You meet new people and encounter different social circles. You get to see other parts of Allah’s creation. You also get to see new dimensions of human ingenuity and experience the rich history of other parts of the world.

When you stand at the foot of a great mountain, you get a sense of how small you are in the grand scale of things. When you stand next to an immense old Roman column in Lebanon or Morocco, or by the pyramids in Egypt, you get a sense of the immensity of history and the passage of time.

Travellers get the chance to shed formality and pretense in dealing with people. They can eat what they want and dress how they like, without worrying about how it impacts people’s opinions. They can walk down the roads and alleyways, contemplating Allah’s creation as they like, glorifying Him for the wonders that they see.

They are among people who do not know them. Since travellers are just anonymous people, they can interact with the locals without any pretension. This can be a liberating but also humbling experience, especially for those who enjoyed a degree of fame or prestige back home. No one comes up to shake their hands, stop to speak with them, or even give them a second glance.

One such person entered a library and saw someone coming up to him as if he wanted to greet him. The man was used to receiving such attention back home, so he prepared himself for the encounter. He stood up straighter and turned squarely into the path of the approaching man. He was shocked when the person said to him: “Could you please move over. You’re in my way.”

Travellers are sometimes scolded and criticised because they do not know the local customs and cause offense.

No matter how much prestige they enjoy back home, travellers might find themselves asked by a lorry driver to help him push his stalled vehicle.

Travel writing is a special art, especially when it conveys inspired feelings, noble sentiments, and a renewed awareness of things. The location could be a public park or the view from a balcony or a crowded train, but the writer feels compelled to record the experience and human encounters of that particular place at that precise time.

Dr Salman Al Auda

The Fascinating Origins of Religion — and Why It’s Deeply Intertwined With Violence

The relationship between warfare and religion is infinitely more complex than most of us realize.

Excerpted from Fields of Blood by Karen Armstrong. Copyright © 2014 by Karen Armstrong. Excerpted by permission of Random House Audio, a division of Random House LLC. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Every year in ancient Israel the high priest brought two goats into the Jerusalem temple on the Day of Atonement. He sacrificed one to expiate the sins of the community and then laid his hands on the other, transferring all the people’s misdeeds onto its head, and sent the sin-laden animal out of the city, literally placing the blame elsewhere. In this way, Moses explained, “the goat will bear all their faults away with it into a desert place.” In his classic study of religion and violence, René Girard argued that the scapegoat ritual defused rivalries among groups within the community. In a similar way, I believe, modern society has made a scapegoat of faith.

In the West the idea that religion is inherently violent is now taken for granted and seems self-evident. As one who speaks on religion, I constantly hear how cruel and aggressive it has been, a view that, eerily, is expressed in the same way almost every time: “Religion has been the cause of all the major wars in history.” I have heard this sentence recited like a mantra by American commentators and psychiatrists, London taxi drivers and Oxford academics. It is an odd remark. Obviously the two world wars were not fought on account of religion. When they discuss the reasons people go to war, military historians acknowledge that many interrelated social, material, and ideological factors are involved, one of the chief being competition for scarce resources. Experts on political violence or terrorism also insist that people commit atrocities for a complex range of reasons. Yet so indelible is the aggressive image of religious faith in our secular consciousness that we routinely load the violent sins of the twentieth century onto the back of “religion” and drive it out into the political wilderness.

Even those who admit that religion has not been responsible for all the violence and warfare of the human race still take its essential belligerence for granted. They claim that “monotheism” is especially intolerant and that once people believe that “God” is on their side, compromise becomes impossible. They cite the Crusades, the Inquisition, and the Wars of Religion of the sixteenth and seventeenth centuries. They also point to the recent spate of terrorism committed in the name of religion to prove that Islam is particularly aggressive. If I mention Buddhist nonviolence, they retort that Buddhism is a secular philosophy, not a religion. Here we come to the heart of the problem. Buddhism is certainly not a religion as this word has been understood in the West since the seventeenth and eighteenth centuries. But our modern Western conception of “religion” is idiosyncratic and eccentric. No other cultural tradition has anything like it, and even premodern European Christians would have found it reductive and alien. In fact, it complicates any attempt to pronounce on religion’s propensity to violence.

To complicate things still further, for about fifty years now it has been clear in the academy that there is no universal way to define religion. In the West we see “religion” as a coherent system of obligatory beliefs, institutions, and rituals, centering on a supernatural God, whose practice is essentially private and hermetically sealed off from all “secular” activities. But words in other languages that we translate as “religion” almost invariably refer to something larger, vaguer, and more encompassing. The Arabic din signifies an entire way of life. The Sanskrit dharma is also “a ‘total’ concept, untranslatable, which covers law, justice, morals, and social life.” The Oxford Classical Dictionary firmly states: “No word in either Greek or Latin corresponds to the English ‘religion’ or ‘religious.’” The idea of religion as an essentially personal and systematic pursuit was entirely absent from classical Greece, Japan, Egypt, Mesopotamia, Iran, China, and India. Nor does the Hebrew Bible have any abstract concept of religion; and the Talmudic rabbis would have found it impossible to express what they meant by faith in a single word or even in a formula, since the Talmud was expressly designed to bring the whole of human life into the ambit of the sacred.

The origins of the Latin religio are obscure. It was not “a great objective something” but had imprecise connotations of obligation and taboo; to say that a cultic observance, a family propriety, or keeping an oath was religio for you meant that it was incumbent on you to do it. The word acquired an important new meaning among early Christian theologians: an attitude of reverence toward God and the universe as a whole. For Saint Augustine (c. 354–430 CE), religio was neither a system of rituals and doctrines nor a historical institutionalized tradition but a personal encounter with the transcendence that we call God as well as the bond that unites us to the divine and to one another. In medieval Europe, religio came to refer to the monastic life and distinguished the monk from the “secular” priest, someone who lived and worked in the world (saeculum).

The only faith tradition that does fit the modern Western notion of religion as something codified and private is Protestant Christianity, which, like religion in this sense of the word, is also a product of the early modern period. At this time Europeans and Americans had begun to separate religion and politics, because they assumed, not altogether accurately, that the theological squabbles of the Reformation had been entirely responsible for the Thirty Years’ War. The conviction that religion must be rigorously excluded from political life has been called the charter myth of the sovereign nation-state. The philosophers and statesmen who pioneered this dogma believed that they were returning to a more satisfactory state of affairs that had existed before ambitious Catholic clerics had confused two utterly distinct realms. But in fact their secular ideology was as radical an innovation as the modern market economy that the West was concurrently devising. To non-Westerners, who had not been through this particular modernizing process, both these innovations would seem unnatural and even incomprehensible. The habit of separating religion and politics is now so routine in the West that it is difficult for us to appreciate how thoroughly the two co-inhered in the past. It was never simply a question of the state “using” religion; the two were indivisible. Dissociating them would have seemed like trying to extract the gin from a cocktail.

In the premodern world, religion permeated all aspects of life. We shall see that a host of activities now considered mundane were experienced as deeply sacred: forest clearing, hunting, football matches, dice games, astronomy, farming, state building, tugs-of-war, town planning, commerce, imbibing strong drink, and, most particularly, warfare. Ancient peoples would have found it impossible to see where “religion” ended and “politics” began. This was not because they were too stupid to understand the distinction but because they wanted to invest everything they did with ultimate value. We are meaning-seeking creatures and, unlike other animals, fall very easily into despair if we fail to make sense of our lives. We find the prospect of our inevitable extinction hard to bear. We are troubled by natural disasters and human cruelty and are acutely aware of our physical and psychological frailty. We find it astonishing that we are here at all and want to know why. We also have a great capacity for wonder. Ancient philosophies were entranced by the order of the cosmos; they marveled at the mysterious power that kept the heavenly bodies in their orbits and the seas within bounds and that ensured that the earth regularly came to life again after the dearth of winter, and they longed to participate in this richer and more permanent existence.

They expressed this yearning in terms of what is known as the perennial philosophy, so called because it was present, in some form, in most premodern cultures. Every single person, object, or experience was seen as a replica, a pale shadow, of a reality that was stronger and more enduring than anything in their ordinary experience but that they only glimpsed in visionary moments or in dreams. By ritually imitating what they understood to be the gestures and actions of their celestial alter egos—whether gods, ancestors, or culture heroes—premodern folk felt themselves to be caught up in their larger dimension of being. We humans are profoundly artificial and tend naturally toward archetypes and paradigms. We constantly strive to improve on nature or approximate to an ideal that transcends the day-to-day. Even our contemporary cult of celebrity can be understood as an expression of our reverence for and yearning to emulate models of “superhumanity.” Feeling ourselves connected to such extraordinary realities satisfies an essential craving. It touches us within, lifts us momentarily beyond ourselves, so that we seem to inhabit our humanity more fully than usual and feel in touch with the deeper currents of life. If we no longer find this experience in a church or temple, we seek it in art, a musical concert, sex, drugs— or warfare. What this last may have to do with these other moments of transport may not be so obvious, but it is one of the oldest triggers of ecstatic experience. To understand why, it will be helpful to consider the development of our neuroanatomy.

Each of us has not one but three brains that coexist uneasily. In the deepest recess of our gray matter we have an “old brain” that we inherited from the reptiles that struggled out of the primal slime 500 million years ago. Intent on their own survival, with absolutely no altruistic impulses, these creatures were solely motivated by mechanisms urging them to feed, fight, flee (when necessary), and reproduce. Those best equipped to compete mercilessly for food, ward off any threat, dominate territory, and seek safety naturally passed along their genes, so these self-centered impulses could only intensify. But sometime after mammals appeared, they evolved what neuroscientists call the limbic system, perhaps about 120 million years ago. Formed over the core brain derived from the reptiles, the limbic system motivated all sorts of new behaviors, including the protection and nurture of young as well as the formation of alliances with other individuals that were invaluable in the struggle to survive. And so, for the first time, sentient beings possessed the capacity to cherish and care for creatures other than themselves.

Although these limbic emotions would never be as strong as the “me first” drives still issuing from our reptilian core, we humans have evolved a substantial hard-wiring for empathy for other creatures, and especially for our fellow humans. Eventually, the Chinese philosopher Mencius (c. 371–288 BCE) would insist that nobody was wholly without such sympathy. If a man sees a child teetering on the brink of a well, about to fall in, he would feel her predicament in his own body and would reflexively, without thought for himself, lunge forward to save her. There would be something radically wrong with anyone who could walk past such a scene without a flicker of disquiet. For most, these sentiments were essential, though, Mencius thought, somewhat subject to individual will. You could stamp on these shoots of benevolence just as you could cripple or deform yourself physically. On the other hand, if you cultivated them, they would acquire a strength and dynamism of their own.

We cannot entirely understand Mencius’s argument without considering the third part of our brain. About twenty thousand years ago, during the Paleolithic Age, human beings evolved a “new brain,” the neocortex, home of the reasoning powers and self-awareness that enable us to stand back from the instinctive, primitive passions. Humans thus became roughly as they are today, subject to the conflicting impulses of their three distinct brains. Paleolithic men were proficient killers. Before the invention of agriculture, they were dependent on the slaughter of animals and used their big brains to develop a technology that enabled them to kill creatures much larger and more powerful than themselves. But their empathy may have made them uneasy. Or so we might conclude from modern hunting societies. Anthropologists observe that tribesmen feel acute anxiety about having to slay the beasts they consider their friends and patrons and try to assuage this distress by ritual purification. In the Kalahari Desert, where wood is scarce, bushmen are forced to rely on light weapons that can only graze the skin. So they anoint their arrows with a poison that kills the animal—only very slowly. Out of ineffable solidarity, the hunter stays with his dying victim, crying when it cries, and participating symbolically in its death throes. Other tribes don animal costumes or smear the kill’s blood and excrement on cavern walls, ceremonially returning the creature to the underworld from which it came.

Paleolithic hunters may have had a similar understanding. The cave paintings in northern Spain and southwestern France are among the earliest extant documents of our species. These decorated caves almost certainly had a liturgical function, so from the very beginning art and ritual were inseparable. Our neocortex makes us intensely aware of the tragedy and perplexity of our existence, and in art, as in some forms of religious expression, we find a means of letting go and encouraging the softer, limbic emotions to predominate. The frescoes and engravings in the labyrinth of Lascaux in the Dordogne, the earliest of which are seventeen thousand years old, still evoke awe in visitors. In their numinous depiction of the animals, the artists have captured the hunters’ essential ambivalence. Intent as they were to acquire food, their ferocity was tempered by respectful sympathy for the beasts they were obliged to kill, whose blood and fat they mixed with their paints. Ritual and art helped hunters express their empathy with and reverence (religio) for their fellow creatures—just as Mencius would describe some seventeen millennia later—and helped them live with their need to kill them.

In Lascaux there are no pictures of the reindeer that featured so largely in the diet of these hunters. But not far away, in Montastruc, a small sculpture has been found, carved from a mammoth tusk in about 11,000 BCE, at about the same time as the later Lascaux paintings. Now lodged in the British Museum, it depicts two swimming reindeer. The artist must have watched his prey intently as they swam across lakes and rivers in search of new pastures, making themselves particularly vulnerable to the hunters. He also felt a tenderness toward his victims, conveying the unmistakable poignancy of their facial expressions without a hint of sentimentality. As Neil MacGregor, director of the British Museum, has noted, the anatomical accuracy of this sculpture shows that it “was clearly made not just with the knowledge of a hunter but also with the insight of a butcher, someone who had not only looked at his animals but had cut them up.” Rowan Williams, the former archbishop of Canterbury, has also reflected insightfully on the “huge and imaginative generosity” of these Paleolithic artists: “In the art of this period, you see human beings trying to enter fully into the flow of life, so that they become part of the whole process of animal life that’s going on all around them . . . and this is actually a very religious impulse.” From the first, then, one of the major preoccupations of both religion and art (the two being inseparable) was to cultivate a sense of community—with nature, the animal world, and our fellow humans.

We would never wholly forget our hunter-gatherer past, which was the longest period in human history. Everything that we think of as most human—our brains, bodies, faces, speech, emotions, and thoughts—bears the stamp of this heritage. Some of the rituals and myths devised by our prehistoric ancestors appear to have survived in the practices of later, literate cultures. In this way, animal sacrifice, the central rite of nearly every ancient society, preserved prehistoric hunting ceremonies and the honor accorded the beast that gave its life for the community. Much of what we now call “religion” was originally rooted in an acknowledgment of the tragic fact that life depended on the destruction of other creatures; rituals were addressed to helping human beings face up to this insoluble dilemma. Despite their real respect, reverence, and even affection for their prey, however, ancient huntsmen remained dedicated killers. Millennia of fighting large aggressive animals meant that these hunting parties became tightly bonded teams that were the seeds of our modern armies, ready to risk everything for the common good and to protect their fellows in moments of danger. And there was one more conflicting emotion to be reconciled: they probably loved the excitement and intensity of the hunt.

Here again the limbic system comes into play. The prospect of killing may stir our empathy, but in the very acts of hunting, raiding, and battling, this same seat of emotions is awash in serotonin, the neurotransmitter responsible for the sensation of ecstasy that we associate with some forms of spiritual experience. So it happened that these violent pursuits came to be perceived as sacred activities, however bizarre that may seem to our understanding of religion. People, especially men, experienced a strong bond with their fellow warriors, a heady feeling of altruism at putting their lives at risk for others and of being more fully alive. This response to violence persists in our nature. The New York Times war correspondent Chris Hedges has aptly described war as “a force that gives us meaning”:

War makes the world understandable, a black and white tableau of them and us. It suspends thought, especially self-critical thought. All bow before the supreme effort. We are one. Most of us willingly accept war as long as we can fold it into a belief system that paints the ensuing suffering as necessary for a higher good, for human beings seek not only happiness but meaning. And tragically war is sometimes the most powerful way in human society to achieve meaning.

It may be too that as they give free rein to the aggressive impulses from the deepest region of their brains, warriors feel in tune with the most elemental and inexorable dynamics of existence, those of life and death. Put another way, war is a means of surrender to reptilian ruthlessness, one of the strongest of human drives, without being troubled by the self-critical nudges of the neocortex.

The warrior, therefore, experiences in battle the transcendence that others find in ritual, sometimes to pathological effect. Psychiatrists who treat war veterans for post-traumatic stress disorder (PTSD) have noted that in the destruction of other people, soldiers can experience a self-affirmation that is almost erotic. Yet afterward, as they struggle to disentangle their emotions of pity and ruthlessness, PTSD sufferers may find themselves unable to function as coherent human beings. One Vietnam veteran described a photograph of himself holding two severed heads by the hair; the war, he said, was “hell,” a place where “crazy was natural” and everything “out of control,” but, he concluded:

The worst thing I can say about myself is that while I was there I was so alive. I loved it the way you can like an adrenaline high, the way you can love your friends, your tight buddies. So unreal and the realest thing that ever happened. . . . And maybe the worst thing for me now is living in peacetime without a possibility of that high again. I hate what that high was about but I loved that high.

“Only when we are in the midst of conflict does the shallowness and vapidness of much of our lives become apparent,” Hedges explains. “Trivia dominates our conversation and increasingly our airwaves. And war is an enticing elixir. It gives us a resolve, a cause. It allows us to be noble.” One of the many, intertwined motives driving men to the battlefield has been the tedium and pointlessness of ordinary domestic existence. The same hunger for intensity would compel others to become monks and ascetics.

The warrior in battle may feel connected with the cosmos, but afterward he cannot always resolve these inner contradictions. It is fairly well established that there is a strong taboo against killing our own kind—an evolutionary stratagem that helped our species to survive. Still, we fight. But to bring ourselves to do so, we envelop the effort in a mythology—often a “religious” mythology—that puts distance between us and the enemy. We exaggerate his differences, be they racial, religious, or ideological. We develop narratives to convince ourselves that he is not really human but monstrous, the antithesis of order and goodness. Today we may tell ourselves that we are fighting for God and country or that a particular war is “just” or “legal.” But this encouragement doesn’t always take hold. During the Second World War, for instance, Brigadier General S. L. A. Marshall of the U.S. Army and a team of historians interviewed thousands of soldiers from more than four hundred infantry companies that had seen close combat in Europe and the Pacific. Their findings were startling: only 15 to 20 percent of infantrymen had been able to fire at the enemy directly; the rest tried to avoid it and had developed complex methods of misfiring or reloading their weapons so as to escape detection.

It is hard to overcome one’s nature. To become efficient soldiers, recruits must go through a grueling initiation, not unlike what monks or yogins undergo, to subdue their emotions. As the cultural historian Joanna Bourke explains the process:

Individuals had to be broken down to be rebuilt into efficient fighting men. The basic tenets included depersonalization, uniforms, lack of privacy, forced social relationships, tight schedules, lack of sleep, disorientation followed by rites of reorganization according to military codes, arbitrary rules, and strict punishment. The methods of brutalization were similar to those carried out by regimes where men were taught to torture prisoners.

So, we might say, the soldier has to become as inhuman as the “enemy” he has created in his mind. Indeed, we shall find that in some cultures, even (or perhaps especially) those that glorify warfare, the warrior is somehow tainted, polluted, and an object of fear—both an heroic figure and a necessary evil, to be dreaded, set apart.

Our relationship to warfare is therefore complex, possibly because it is a relatively recent human development. Hunter-gatherers could not afford the organized violence that we call war, because warfare requires large armies, sustained leadership, and economic resources that were far beyond their reach. Archaeologists have found mass graves from this period that suggest some kind of massacre, yet there is little evidence that early humans regularly fought one another. But human life changed forever in about 9000 BCE, when pioneering farmers in the Levant learned to grow and store wild grain. They produced harvests that were able to support larger populations than ever before and eventually they grew more food than they needed. As a result, the human population increased so dramatically that in some regions a return to hunter-gatherer life became impossible. Between about 8500 BCE and the first century of the Common Era—a remarkably short period given the four million years of our history—all around the world, quite independently, the great majority of humans made the transition to agrarian life. And with agriculture came civilization; and with civilization, warfare.

In our industrialized societies, we often look back to the agrarian age with nostalgia, imagining that people lived more wholesomely then, close to the land and in harmony with nature. Initially, however, agriculture was experienced as traumatic. These early settlements were vulnerable to wild swings in productivity that could wipe out the entire population, and their mythology describes the first farmers fighting a desperate battle against sterility, drought, and famine. For the first time, backbreaking drudgery became a fact of human life. Skeletal remains show that plant-fed humans were a head shorter than meat-eating hunters, prone to anemia, infectious diseases, rotten teeth, and bone disorders. The earth was revered as the Mother Goddess and her fecundity experienced as an epiphany; she was called Ishtar in Mesopotamia, Demeter in Greece, Isis in Egypt, and Anat in Syria. Yet she was not a comforting presence but extremely violent. The Earth Mother regularly dismembered consorts and enemies alike—just as corn was ground to powder and grapes crushed to unrecognizable pulp. Farming implements were depicted as weapons that wounded the earth, so farming plots became fields of blood. When Anat slew Mot, god of sterility, she cut him in two with a ritual sickle, winnowed him in a sieve, ground him in a mill, and scattered his scraps of bleeding flesh over the fields. After she slaughtered the enemies of Baal, god of life-giving rain, she adorned herself with rouge and henna, made a necklace of the hands and heads of her victims, and waded knee-deep in blood to attend the triumphal banquet.

These violent myths reflected the political realities of agrarian life. By the beginning of the ninth millennium BCE, the settlement in the oasis of Jericho in the Jordan valley had a population of three thousand people, which would have been impossible before the advent of agriculture. Jericho was a fortified stronghold protected by a massive wall that must have consumed tens of thousands of hours of manpower to construct. In this arid region, Jericho’s ample food stores would have been a magnet for hungry nomads. Intensified agriculture, therefore, created conditions that could endanger everyone in this wealthy colony and transform its arable land into fields of blood. Jericho was unusual, however—a portent of the future. Warfare would not become endemic in the region for another five thousand years, but it was already a possibility, and from the first, it seems, large-scale organized violence was linked not with religion but with organized theft.

Agriculture had also introduced another type of aggression: an institutional or structural violence in which a society compels people to live in such wretchedness and subjection that they are unable to better their lot. This systemic oppression has been described as possibly “the most subtle form of violence,” and, according to the World Council of Churches, it is present whenever “resources and powers are unequally distributed, concentrated in the hands of the few, who do not use them to achieve the possible self-realization of all members, but use parts of them for self-satisfaction or for purposes of dominance, oppression, and control of other societies or of the underprivileged in the same society.” Agrarian civilization made this systemic violence a reality for the first time in human history.

Paleolithic communities had probably been egalitarian because hunter-gatherers could not support a privileged class that did not share the hardship and danger of the hunt. Because these small communities lived at near-subsistence level and produced no economic surplus, inequity of wealth was impossible. The tribe could survive only if everybody shared what food they had. Government by coercion was not feasible because all able-bodied males had exactly the same weapons and fighting skills. Anthropologists have noted that modern hunter-gatherer societies are classless, that their economy is “a sort of communism,” and that people are honored for skills and qualities, such as generosity, kindness, and even-temperedness, that benefit the community as a whole. But in societies that produce more than they need, it is possible for a small group to exploit this surplus for its own enrichment, gain a monopoly of violence, and dominate the rest of the population.

As we shall see in Part One, this systemic violence would prevail in all agrarian civilizations. In the empires of the Middle East, China, India, and Europe, which were economically dependent on agriculture, a small elite, comprising not more than 2 percent of the population, with the help of a small band of retainers, systematically robbed the masses of the produce they had grown in order to support their aristocratic lifestyle. Yet, social historians argue, without this iniquitous arrangement, human beings would probably never have advanced beyond subsistence level, because it created a nobility with the leisure to develop the civilized arts and sciences that made progress possible. All premodern civilizations adopted this oppressive system; there seemed to be no alternative. This inevitably had implications for religion, which permeated all human activities, including state building and government. Indeed, we shall see that premodern politics was inseparable from religion. And if a ruling elite adopted an ethical tradition, such as Buddhism, Christianity, or Islam, the aristocratic clergy usually adapted their ideology so that it could support the structural violence of the state.

In Parts One and Two we shall explore this dilemma. Established by force and maintained by military aggression, warfare was essential to the agrarian state. When land and the peasants who farmed it were the chief sources of wealth, territorial conquest was the only way such a kingdom could increase its revenues. Warfare was, therefore, indispensable to any premodern economy. The ruling class had to maintain its control of the peasant villages, defend its arable land against aggressors, conquer more land, and ruthlessly suppress any hint of insubordination. A key figure in this story will be the Indian emperor Ashoka (c. 268–232 BCE). Appalled by the suffering his army had inflicted on a rebellious city, he tirelessly promoted an ethic of compassion and tolerance but could not in the end disband his army. No state can survive without its soldiers. And once states grew and warfare had become a fact of human life, an even greater force—the military might of empire—often seemed the only way to keep the peace.

So necessary to the rise of states and ultimately empires is military force that historians regard militarism as a mark of civilization. Without disciplined, obedient, and law-abiding armies, human society, it is claimed, would probably have remained at a primitive level or have degenerated into ceaselessly warring hordes. But like our inner conflict between violent and compassionate impulses, the incoherence between peaceful ends and violent means would remain unresolved. Ashoka’s dilemma is the dilemma of civilization itself. And into this tug-of-war religion would enter too. Since all premodern state ideology was inseparable from religion, warfare inevitably acquired a sacral element. Indeed, every major faith tradition has tracked that political entity in which it arose; none has become a “world religion” without the patronage of a militarily powerful empire, and, therefore, each would have to develop an imperial ideology. But to what degree did religion contribute to the violence of the states with which it was inextricably linked? How much blame for the history of human violence can we ascribe to religion itself? The answer is not as simple as much of our popular discourse would suggest.


Our world is dangerously polarized at a time when humanity is more closely interconnected—politically, economically, and electronically—than ever before. If we are to meet the challenge of our time and create a global society where all peoples can live together in peace and mutual respect, we need to assess our situation accurately. We cannot afford oversimplified assumptions about the nature of religion or its role in the world. What the American scholar William T. Cavanaugh calls “the myth of religious violence” served Western people well at an early stage of their modernization, but in our global village we need a more nuanced view in order to understand our predicament fully.