The United States officially entered World War II on December 8, 1941. The war in the Pacific formally ended on September 2, 1945. A recent documentary on one of the history channels chronicled the path the United States took from a nation with an underdeveloped…