Having talked to a lot of people about this show, I’ve received mixed opinions about it. You either love it or you hate it, but I’ve noticed that those who don’t watch it tend to dislike it for the wrong reasons.
The most common reason is that “it’s just another show about zombies”. While zombies (or “walkers”, as the show likes to call them) are part of the story, they are an element of the environment, not the main focus. The Walking Dead focuses on the moral and social implications of what humanity would become were civilized society and law replaced by the fundamental need for survival. It goes so far as to show how deceptive and untrustworthy people can become towards one another. If it were just another zombie survival show, it wouldn’t have grabbed my attention. The Walking Dead, however, has maintained my interest because of the inner struggles the group has faced when it comes to leadership, dealing with other survivors, and ultimately trying to build a life for themselves.
While the show hasn’t always done this flawlessly (Season 2…), it has for the most part never failed to entertain. The characters experience a vicious cycle where things go from good to bad, to worse, to even worse. Sometimes you wish they would just catch a break! I can’t wait to see what happens next.