Category Archives: Essays

Paracelsus: The Man Beneath the Myth

Abstract: Paracelsus was thought of as an agent of the Devil: his unorthodox beliefs and volatile temperament saw him ostracized by his contemporaries in the tightly knit academic and medical communities, where gossip and scandal circulated with relative ease despite the spatial limitations under which mail couriers then operated. Though history has been kinder to him, his association with the black arts has proved difficult to shake––as late as 1942, in a speech before the Royal Society of Medicine, H. P. Bayon described Paracelsus as “not a harbinger of light.” This paper seeks to uncover the man beneath the myth and, in doing so, to lay to rest the idea that Paracelsus was anything but an ordinary (and godly) man whose ideas and ideals were ahead of his time and, unfortunately for his reputation, unsettling to his contemporaries. More broadly, this paper also attempts to demonstrate the folly of basing one’s opinion of someone or something on a reputation that, more often than not, is fabricated from half-baked rumors and ill-conceived exaggerations.

Akhenaten, Egypt’s Heretic Pharaoh, and the Amarna Revolution

Akhenaten (Amenhotep IV)

Lost to humanity for three millennia until “[t]he Prussian exploration expedition of 1842-45 gave special attention” (Niebuhr & Hutchison, 1901, p. 1) to the ruins of a great city on the eastern bank of the Nile at what is now el-Amarna in Middle Egypt, Amenhotep IV––or Akhenaten, as he was later called––was a figure unlike any other in Egyptian history. Though he ruled Egypt as Pharaoh for less than two decades, ca. 1353-1336 B.C., Akhenaten distinguished himself as an apostate who discarded a spiritual tradition that had stretched unbroken for nearly two thousand years. Rejecting the Theban god Amun––who, joined with Re at the start of the New Kingdom era (ca. 1539 B.C.), became Amun-Re––Akhenaten is said to have devoted himself to the worship of “a manifestation of the sun god” (Murnane & Meltzer, 1995, p. 4), the Aten. Since his rediscovery at el-Amarna, where he built a city called Akhet-Aten (which translates as “The Horizon of the Aten” [Brewer, 2012, p. 163]), Akhenaten’s devotion to the Aten has branded him a monotheist and invited speculation about his motives for shunning Egyptian polytheism. Such is the public’s interest in Akhenaten that some, myself included, wonder about a connection between Atenism and the genesis of the Judaic-Christian-Islamic monotheistic tradition; it should be noted, however, that though the biblical Exodus from Egypt is said to have occurred only a century after Akhenaten’s reign (ca. 1250-1200 B.C.), and Moses himself is said to have lived during that reign, there exists nothing but circumstantial evidence (if even that) to support any such connection. In any case, Egyptologists have since cast doubt on Akhenaten’s status as a monotheist, postulating instead that he was a henotheist (someone who worships one god while accepting the existence of others) or even an atheist. But whatever his religious beliefs, Akhenaten remains an enigmatic figure worthy of further study.
Akhenaten and his short-lived religious movement, now known as the Amarna revolution, are thus the subject of this paper.

A Surrender by Atomic Means?

At dawn on August 5, 1945, after “General [Curtis] LeMay finalized the take-off time [of three B-29 bombers], final assembly of the [atomic] bomb proceeded[,] and take-off [from Tinian] . . . occurred on schedule,”[1] a portentous chain of events was set in motion that has since received an inordinate amount of scrutiny. At approximately a quarter past eight on the morning of August 6, the Enola Gay released its load––a single, gun-type atomic bomb––over Hiroshima, Japan, an industrial and military center. The ten-foot-long weapon, christened “Little Boy,” detonated some two thousand feet in the air, leveling a sizable percentage of the city and killing between 70,000 and 80,000 soldiers and civilians; a further 100,000 are said to have perished from acute radiation sickness in the subsequent weeks, months, and years. Three days after the attack on Hiroshima, a second atomic bomb, larger and more powerful than the first, was loosed over the Japanese port city of Nagasaki. Fortunately, due to Nagasaki’s uneven terrain, “Fat Man” inflicted fewer casualties (40,000-75,000). In any case, Japan’s Supreme War Council, having convened hours before the Nagasaki bombing at Emperor Hirohito’s bidding, announced on August 14 the country’s surrender as per the terms outlined in the Potsdam Declaration (though only upon receiving a guarantee of immunity for the kokutai, Japan’s monarchy). Given the timing of the surrender––less than a week after Nagasaki––and Emperor Hirohito’s suggestion that the enemy’s possession of “a new and terrible weapon with the power to destroy many innocent lives” rendered further resistance useless, it is understandable that the world readily accepted America’s use of atomic weapons as the sole reason for Japan’s swift and orderly capitulation. And yet one would be remiss to leave out of this equation the Soviet Union, whose presence grew increasingly ominous as spring turned to summer in 1945.
A close reading of the circumstances surrounding Japan’s surrender suggests that the ever-looming threat of communism, rather than U.S. President Harry Truman’s decision to use atomic weapons, dictated Japan’s actions in this final, terrible chapter of mankind’s bloodiest war.

Freedom in Post-World War II America

In his 1941 Annual Message to Congress (aka the State of the Union address), President Franklin Roosevelt proclaimed, “In the future days, which we seek to make secure, we look forward to a world founded upon four essential human freedoms.”[1] Christened the Four Freedoms, they comprised the freedoms of speech and worship and the freedoms from want and fear. The first two, their meanings self-evident, need not be defined; the latter two, however, are less explicit. Roosevelt, in the same address, described those latter two––the freedoms from want and fear, respectively––as follows: “the economic understandings which will secure to every nation a healthy peacetime life for its inhabitants . . . [and] the world-wide reduction of armaments to such a point and in such a thorough fashion that no nation will be in a position to commit an act of physical aggression . . .” The Four Freedoms, American leaders made abundantly clear, embodied that which set the U.S. apart from and above its enemies in Europe (Nazi Germany, Fascist Italy) and the Far East (Imperial Japan), as well as its grudging communist ally, the Soviet Union. They also “provided a crucial language of national unity . . . [for t]he message seemed to be that Americans were fighting to preserve freedoms enjoyed individually or within the family rather than in the larger public world.”[2] Once World War II ended, however, the Four Freedoms were subsumed into the larger framework of America’s blueprint for winning the Cold War against its ideological counterpart, the Soviet Union. Freedom, at least in the initial stages of the Cold War, became less an empowerment of the individual and his or her civil liberties and more a habit of unquestioned devotion to the American cause, and of cultural and political conformity in the face of encroaching communism.
Unchecked material consumption, perhaps buoyed by the U.S.’s postwar economic prosperity, became a staple of the American way of life––so much so that Vice President Richard Nixon, in 1959, described to Soviet premier Nikita Khrushchev “a conception of freedom centered on economic abundance and consumer choice . . .”[3] Indeed, one might argue that this consumer culture sweetened America’s otherwise unpalatable crackdown on homegrown communism. In any case, a backlash against this constraining statism soon followed, and Americans rediscovered their political autonomy––a development that culminated in the successive civil rights and anti-Vietnam War movements of the 1950s and ’60s.

The Death of Henry Ford’s America

Born in 1863 to an Irish immigrant farmer, Henry Ford left his parents’ farm as a young man and became an engineer at the Edison Illuminating Company, where, prone to experimentation, he grew intrigued by the automobile. In 1903, he founded the Ford Motor Company, which, through an innovative manufacturing process christened “Fordism,” made cars accessible to ordinary Americans. Ford, amongst others, was to blame for the indulgent consumerism that overtook the U.S. in the teens and twenties; such was America’s fascination with and dependence on the car that “[m]any families . . . [didn’t] spend anything on recreation except for the car.”[1] Only after the New York Stock Exchange crashed in late October 1929 did Ford’s star wane, as Americans disposed of their misguided regard for such technological wonders as the car, the vacuum cleaner, and the electric sewing machine. By the 1940s, that which Ford represented and stood for––the so-called “business consensus” of the Roaring Twenties, American prosperity amid economic inequality, opposition to labor, welfare, and immigration, white supremacy, and isolationism––was less prevalent in the U.S. President Franklin D. Roosevelt’s Depression-era populist reforms and the U.S.’s precipitous entry into World War II convinced America of the need to settle its domestic affairs and assume a more prominent international role as the West’s guiding light; Henry Ford’s America, as morally dubious as it was, was unprepared to discharge such momentous responsibilities.
