The history of the American city is, in many ways, the history of the United States. Although rural traditions have also left their mark on the country, cities and urban living have been vital components of America for centuries, and an understanding of the urban experience is essential to comprehending the nation's history.