Sunday, August 6, 2023

 On This Date In History

On August 6, 1945, the United States becomes the first and only nation to use atomic weaponry during wartime when it drops an atomic bomb on the Japanese city of Hiroshima. Approximately 80,000 people are killed as a direct result of the blast, and another 35,000 are injured. At least another 60,000 would be dead by the end of the year from the effects of the fallout.
Though the dropping of the atomic bomb on Japan marked the end of World War II, many historians argue that it also ignited the Cold War.
Since 1940, the United States had been working on developing an atomic weapon, after having been warned by Albert Einstein that Nazi Germany was already conducting research into nuclear weapons. By the time the United States conducted the first successful test (an atomic bomb was exploded in the desert in New Mexico in July 1945), Germany had already been defeated. The war against Japan in the Pacific, however, continued to rage. President Harry S. Truman, warned by some of his advisers that any attempt to invade Japan would result in horrific American casualties, ordered that the new weapon be used to bring the war to a speedy end.
On August 6, 1945, the American bomber Enola Gay dropped a five-ton bomb over the Japanese city of Hiroshima. A blast equivalent to the power of 15,000 tons of TNT reduced four square miles of the city to ruins and immediately killed 80,000 people. Tens of thousands more died in the following weeks from wounds and radiation poisoning. Three days later, another bomb was dropped on the city of Nagasaki, killing nearly 40,000 more people. A few days later, Japan announced its surrender.

In the years since the two atomic bombs were dropped on Japan, a number of historians have suggested that the weapons had a two-pronged objective. First, of course, was to bring the war with Japan to a speedy end and spare American lives. It has been suggested that the second objective was to demonstrate the new weapon of mass destruction to the Soviet Union.
By August 1945, relations between the Soviet Union and the United States had deteriorated badly. The Potsdam Conference between U.S. President Harry S. Truman, Soviet leader Joseph Stalin, and British Prime Minister Winston Churchill (replaced midway through the conference by Clement Attlee) ended just four days before the bombing of Hiroshima. The meeting was marked by recriminations and suspicion between the Americans and Soviets. Soviet armies were occupying most of Eastern Europe. Truman and many of his advisers hoped that the U.S. atomic monopoly might offer diplomatic leverage with the Soviets. In this fashion, the dropping of the atomic bomb on Japan can be seen as the first shot of the Cold War.
If U.S. officials truly believed that they could use their atomic monopoly for diplomatic advantage, they had little time to put their plan into action. By 1949, the Soviets had developed their own atomic bomb and the nuclear arms race began.
Soon after arriving at the Potsdam Conference in July 1945, U.S. President Harry S. Truman received word that the scientists of the Manhattan Project had successfully detonated the world’s first nuclear device in a remote corner of the New Mexico desert.
On July 24, eight days after the Trinity test, Truman approached Soviet Premier Joseph Stalin, who along with Truman and British Prime Minister Winston Churchill (soon to be succeeded by Clement Attlee) made up the “Big Three” Allied leaders gathered at Potsdam to determine the post-World War II future of Germany.
According to Truman, he “casually mentioned” to Stalin that the United States had “a new weapon of unusual destructive force,” but Stalin didn’t seem especially interested. “All he said was that he was glad to hear it and hoped we would make ‘good use of it against the Japanese,’” Truman later wrote in his memoir, Year of Decisions.
For Truman, news of the successful Trinity test set up a momentous choice: whether or not to deploy the world’s first weapon of mass destruction. But it also came as a relief, as it meant the United States wouldn’t have to rely on the increasingly adversarial Soviet Union to enter World War II against Japan.
Truman never mentioned the words “atomic” or “nuclear” to Stalin, and the assumption on the U.S. side was that the Soviet premier didn’t know the exact nature of the new weapon. In fact, while Truman himself had first learned of the top-secret U.S. program to develop atomic weapons just three months earlier, after Franklin D. Roosevelt’s death, Soviet intelligence had begun receiving reports about the project as early as September 1941.
While Stalin didn’t take the atomic threat as seriously during wartime as some of his spies did—he had other problems on his hands, thanks to the German onslaught and occupation—Truman’s words at Potsdam made more of an impact than the president realized.
“We now know that Stalin immediately went to his subordinates and said, we need to get Kurchatov working faster on this,” says Gregg Herken, emeritus professor of U.S. diplomatic history at the University of California and the author of The Winning Weapon: The Atomic Bomb in the Cold War and Brotherhood of the Bomb. Igor Kurchatov was the nuclear physicist who headed up the Soviet atomic bomb project, the Soviet equivalent, in other words, of Manhattan Project mastermind J. Robert Oppenheimer.
On August 6, 1945, just days after the Potsdam Conference ended, the U.S. bomber Enola Gay dropped the uranium bomb known as “Little Boy” on the Japanese city of Hiroshima. Despite its devastating effects, Japan didn’t offer unconditional surrender right away, as the United States had hoped. Then on August 8, Soviet forces invaded Japanese-occupied Manchuria, violating an earlier non-aggression pact signed with Japan.
Herken argues that the Soviet invasion may have had at least as great an effect on Japanese morale as the first atomic bomb. “The last hope for the Japanese government, the peace faction, was that the Soviet Union might actually agree to negotiate a peace with the United States as a neutral party,” he explains. “But once the Soviets invaded Manchuria, it was clear that was not going to happen.”
On August 9, U.S. forces dropped “Fat Man,” a plutonium bomb, on Nagasaki. Together, the two bombs dropped in Japan would kill more than 300,000 people, including those who died instantly and those who perished from radiation and other lingering effects of the explosions.
Emperor Hirohito announced Japan’s unconditional surrender via radio address on August 15, bringing World War II to a close. At the Yalta and Potsdam conferences, the ideological gulf between the Soviet Union and its Western allies had already solidified, particularly when it came to the fate of Eastern Europe.
Even today, historians continue to disagree over whether or not the Truman administration made the decision to drop the atomic bomb for political reasons, namely, to intimidate the Soviet Union, rather than strictly military ones.
“The bomb was so top secret that there were no formal meetings about it, there was no official discussion about what to do, there wasn't the kind of decision-making process that we have with most kinds of policy,” says Campbell Craig, professor of international relations in the School of Law and Politics at Cardiff University and co-author of The Atomic Bomb and the Origins of the Cold War (with Sergey Radchenko). “So a lot of our opinions about what really drove the United States to drop the bomb is guesswork.”
Whatever the U.S. intention had been at Hiroshima and Nagasaki, Stalin certainly saw U.S. possession of the atomic bomb as a direct threat to the Soviet Union and its place in the post-war world, and he was determined to level the playing field. Meanwhile, thanks to atomic espionage, Soviet scientists were well on their way to developing their own bomb.
Some members of Truman’s administration would argue in favor of cooperation with the Soviets, seeing it as the only way to avoid a nuclear arms race. But an opposing view, articulated by State Department official George Kennan in his famous “Long Telegram” in early 1946, would prove far more influential, inspiring the Truman Doctrine and the “containment” policy toward Soviet and communist expansionism around the globe.
Later in 1946, during the first meeting of the United Nations Atomic Energy Commission (UNAEC), the United States presented the Baruch Plan, which called for the Soviets to share every detail of their atomic energy program, including opening their facilities to international inspectors, before the United States would share anything with them. Surprising no one, the Soviets rejected these terms.
“The Baruch Plan would have required the Soviets to basically surrender their sovereignty for them to have any share in atomic energy,” Herken says. “Stalin was the last person to want to do that.”
By 1949, all thoughts of cooperation were off the table: On August 29, the Soviets successfully tested their own nuclear device, producing a 20-kiloton blast roughly equal to the Trinity test. The nuclear arms race that would define the rest of the Cold War was on, as the two superpowers battled to see who could amass the most weapons of mass destruction, and figure out how to deploy them most effectively.

On August 6, 1787, in Philadelphia, delegates to the Constitutional Convention begin debating the first complete draft of the proposed Constitution of the United States.
The Articles of Confederation, ratified several months before the British surrender at Yorktown in 1781, provided for a loose confederation of U.S. states, which were sovereign in most of their affairs. On paper, Congress, the central authority, had the power to govern foreign affairs, conduct war, and regulate currency, but in practice these powers were sharply limited because Congress was given no authority to enforce its requests to the states for money or troops. By 1786, it was apparent that the Union would soon break up if the Articles of Confederation were not amended or replaced. Five states met in Annapolis, Maryland, to discuss the issue, and all the states were invited to send delegates to a new constitutional convention to be held in Philadelphia.
On May 25, 1787, delegates representing every state except Rhode Island convened at Philadelphia’s Pennsylvania State House for the Constitutional Convention. The building, which is now known as Independence Hall, had earlier seen the drafting of the Declaration of Independence and the signing of the Articles of Confederation. The assembly immediately discarded the idea of amending the Articles of Confederation and set about drawing up a new scheme of government. Revolutionary War hero George Washington, a delegate from Virginia, was elected convention president.
During an intensive debate, the delegates devised a brilliant federal framework characterized by an intricate system of checks and balances. The convention was divided over the issue of state representation in Congress, as more-populated states sought proportional representation, and smaller states wanted equal representation. The problem was resolved by the Connecticut Compromise, which proposed a bicameral legislature with proportional representation in the lower house (House of Representatives) and equal representation of the states in the upper house (Senate).
On September 17, 1787, the Constitution of the United States of America was signed by 38 of the 41 delegates present at the conclusion of the convention. As dictated by Article VII, the document would not become binding until it was ratified by nine of the 13 states.
Beginning on December 7, five states (Delaware, Pennsylvania, New Jersey, Georgia, and Connecticut) ratified it in quick succession. However, other states, especially Massachusetts, opposed the document, as it failed to reserve undelegated powers to the states and lacked constitutional protection of basic political rights, such as freedom of speech, religion, and the press. In February 1788, a compromise was reached under which Massachusetts and other states would agree to ratify the document with the assurance that amendments would be immediately proposed. The Constitution was thus narrowly ratified in Massachusetts, followed by Maryland and South Carolina. On June 21, 1788, New Hampshire became the ninth state to ratify the document, and it was subsequently agreed that government under the U.S. Constitution would begin on March 4, 1789. In June, Virginia ratified the Constitution, followed by New York in July.
On September 25, 1789, the first Congress of the United States adopted 12 proposed amendments to the U.S. Constitution and sent them to the states for ratification. The ten that were ratified in 1791 became known as the Bill of Rights. In November 1789, North Carolina became the 12th state to ratify the U.S. Constitution. Rhode Island, which opposed federal control of currency and was critical of compromise on the issue of slavery, resisted ratifying the Constitution until the U.S. government threatened to sever commercial relations with the state. On May 29, 1790, Rhode Island voted by a margin of two votes to ratify the document, and the last of the original 13 colonies joined the United States. Today, the U.S. Constitution is the oldest written constitution in operation in the world.

On August 6, 1890, at Auburn Prison in New York, the first execution by electrocution in history is carried out against William Kemmler, who had been convicted of murdering his lover, Matilda Ziegler, with an axe.
Electrocution as a humane means of execution was first suggested in 1881 by Dr. Albert Southwick, a dentist. Southwick had witnessed an elderly drunkard “painlessly” killed after touching the terminals of an electrical generator in Buffalo, New York. In the prevalent form of execution at the time, death by hanging, the condemned were known to hang by their broken necks for up to 30 minutes before succumbing to asphyxiation.
In 1889, New York’s Electrical Execution Law, the first of its kind in the world, went into effect, and Edwin R. Davis, the Auburn Prison electrician, was commissioned to design an electric chair. Closely resembling the modern device, Davis’ chair was fitted with two electrodes, which were composed of metal disks held together with rubber and covered with a damp sponge. The electrodes were to be applied to the criminal’s head and back.
On August 6, 1890, William Kemmler became the first person to be sent to the chair. After he was strapped in, a charge of approximately 700 volts was delivered for only 17 seconds before the current failed. Although witnesses reported smelling burnt clothing and charred flesh, Kemmler was far from dead, and a second shock was prepared. The second charge was 1,030 volts and applied for about two minutes, whereupon smoke was observed coming from the head of Kemmler, who was clearly deceased. An autopsy showed that the electrode attached to his back had burned through to the spine.
Dr. Southwick applauded Kemmler’s execution with the declaration, “We live in a higher civilization from this day on,” while American inventor George Westinghouse, an innovator of the use of electricity, remarked, “They would have done better with an axe.”

On August 6, 1965, President Lyndon Baines Johnson signs the Voting Rights Act, guaranteeing African Americans the right to vote. The bill made it illegal to impose restrictions on federal, state and local elections that were designed to deny the vote to blacks.
Johnson assumed the presidency in November 1963 upon the assassination of President John F. Kennedy. In the presidential race of 1964, Johnson was elected in his own right in a landslide victory and used this mandate to push for legislation he believed would improve the American way of life, which included stronger voting-rights laws. A march in Selma, Alabama, earlier that year in support of voting rights, during which black demonstrators were beaten by state troopers, shamed Congress and the president into passing the law, which was meant to enforce the 15th Amendment to the Constitution, ratified in 1870.
In a speech to Congress on March 15, 1965, Johnson had outlined the devious ways in which election officials denied African-American citizens the vote. Blacks attempting to vote were often told by election officials that they had gotten the date, time or polling place wrong, that the officials were late or absent, that they possessed insufficient literacy skills or had filled out an application incorrectly. Often African Americans, whose population suffered a high rate of illiteracy due to centuries of oppression and poverty, would be forced to take literacy tests, which they inevitably failed. Johnson also told Congress that voting officials, primarily in southern states, had been known to force black voters to “recite the entire constitution or explain the most complex provisions of state laws,” a task most white voters would have been hard-pressed to accomplish. In some cases, even blacks with college degrees were turned away from the polls.
Although the Voting Rights Act passed, state and local enforcement of the law was weak and it was often outright ignored, mainly in the South and in areas where the proportion of blacks in the population was high and their vote threatened the political status quo. Still, the Voting Rights Act gave African-American voters the legal means to challenge voting restrictions and vastly improved voter turnout. In Mississippi alone, voter turnout among blacks increased from 6 percent in 1964 to 59 percent in 1969. In 1970, President Richard Nixon extended the provisions of the Voting Rights Act and lowered the eligible voting age for all voters to 18.
