Did Germany Declare War on the United States in WW2?

Hitler publicly blamed the war on the failure of FDR's New Deal.

The United States entered World War II on the Pacific front following the Japanese attack on the Pearl Harbor naval base in Hawaii on December 7, 1941. Four days later, Nazi Germany and Fascist Italy, Japan's allies, declared war on the United States, officially drawing America into the war's European theater.

1 A Fateful Move

In September 1940, a year after the Nazis had started World War II by invading Poland, Japan, Germany and Italy signed the Tripartite Pact, pledging military assistance to one another if any of them was attacked by a country not yet involved in the war. The agreement was meant to discourage the U.S., which had declared neutrality, from entering the war. Though Germany was not obligated to declare war on the U.S. after the assault on Pearl Harbor, because Japan was the aggressor, Nazi leader Adolf Hitler decided to make the first move toward what he believed was an inevitable conflict. Angered by American attacks on German U-boats and banking on Japan's ability to prevail in the Pacific, Hitler declared war on the United States on December 11, 1941.

2 Hitler's Speech to the Reichstag

The same day Germany declared war on America, Hitler addressed his parliament, the Reichstag. In a now famous speech, he blamed the Japanese attack and World War II on President Franklin Delano Roosevelt. Influenced by Jewish interests, Hitler claimed, FDR had instigated the war to deflect attention from the New Deal's failure to lift the American economy out of the Great Depression.
