I dunno, as much as I love The Walking Dead, I have to admit it becomes less appealing with each season. I loved season 1, even though it was way too short; season 2 was just as good as the first in my opinion; then season 3 not so much. It was still good, but not even close to as good as the previous seasons, and season 4 took another step in the wrong direction for me. I'm hoping season 5 will pick up again, and if not, that the spin-off is better.
I've been bored with all the seasons so far, and now even the comic is getting all soap opera-y. The best thing about the comic was the central question: if there were no social rules, then what happens? The TV show has been sanitised, so all those ideas are gone and the same old social rules are in play.