
Florida Western

The term Florida Western describes a small number of films and literary works set in 19th-century Florida, particularly around the time of the Second Seminole War. Few such films have been made, as Hollywood and other genre Westerns are usually set in other regions of the United States, particularly the former frontier territories of "the West".
