There is ongoing debate over what pulled the United States out of the Great Depression. Some argue that World War II marked the end of the Great Depression, while others credit Roosevelt's New Deal policies with finally putting the U.S. back on track. In a one- to two-paragraph response, what do you think helped pull the U.S. out of the Depression? What is your reasoning?