Liberalism in the United States
Liberalism in the United States is a broad political philosophy centered on the unalienable rights of the individual, including such ideals as consent of the governed, freedom of speech, freedom of the press, freedom of religion, the right to due process, and equality before the law.
Since the 1930s, the term liberalism (without a qualifier) usually refers in the United States to modern liberalism, a variant of social liberalism.
According to political scientist Ian Adams, all major American parties "are liberal and always have been. Essentially they espouse classical liberalism, that is a form of democratized Whig constitutionalism plus the free market. The point of difference comes with the influence of social liberalism."
The origins of American liberalism lie in the political ideals of the Age of Enlightenment.
In the late 18th and 19th centuries, the United States extended liberty to ever broader classes of people. The states abolished many restrictions on voting for white males in the early 19th century. The Constitution was amended in 1865 to abolish slavery and in 1870 to extend the vote to black men.[14]