These recent Netflix documentaries about the agricultural world will have you thinking twice about how the farming industry is portrayed to the American public.
Some things to consider while watching these documentaries:
- Are some of the facts they present exaggerated or too extreme? Or do they simply expose more than we're used to seeing?
- Do you think documentaries like these will really change the way people eat in the United States?
What do you really think of documentaries like these? Tell us in the comments section below!