I grew up in Maryland and recently moved to lower Delaware to escape the over-development of the D.C. area. I'm thinking of heading somewhere warm for the winter and am researching Florida, particularly the Gulf Coast, where I hear there is a lot of nature and wildlife. I've visited Miami, West Palm Beach, and Orlando, but never the Gulf Coast. As someone who used to live there, can you offer any advice?