While I don't think individuals or families should be responsible for growing everything they eat, I think there's real benefit to knowing where your food comes from. We've become disconnected from the fact that all that stuff in the meat department used to be living animals, and that our produce comes from all over the world rather than from the farm up the road. This is especially true of children, who often have no clue that tomatoes grow on vines and potatoes grow under the soil.
Also, there are health benefits to being outside with your hands in the dirt. As someone who grows some of her own vegetables and herbs, I feel better and happier after working with my hands in the soil, and there are studies suggesting this may have something to do with beneficial bacteria in the soil.
And really, there's nothing better than a ripe tomato you've grown yourself, still warm from the sun.
(No comment on chickens here. I'm a vegan.)