Here are some of the issues that may prevent your pages from being indexed -
Robots.txt - If your robots.txt file contains the lines "User-agent: * Disallow: /", you are telling every crawler to take a hike and stay away from ANY of your site's content, so none of it gets crawled or indexed.
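As a quick sanity check, you can test your robots.txt rules with Python's standard urllib.robotparser before blaming anything else. This is a minimal sketch; the rules and URL below are hypothetical placeholders, not taken from any real site.

```python
# Check whether a robots.txt ruleset blocks a crawler from a URL,
# using only the Python standard library.
from urllib.robotparser import RobotFileParser

# These two lines block ALL crawlers from the ENTIRE site:
blocking_rules = ["User-agent: *", "Disallow: /"]

parser = RobotFileParser()
parser.parse(blocking_rules)

# Any crawler (Googlebot included) is refused every URL:
allowed = parser.can_fetch("Googlebot", "https://example.com/any-page")
print(allowed)  # False
```

Paste your real robots.txt lines into the list above and test the URLs that aren't getting indexed; if can_fetch returns False, you've found your culprit.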
.htaccess - This is a hidden server configuration file (most modern text editors and FTP clients let you toggle the visibility of hidden files so you can see it). If it is not configured properly, it can create infinite redirect loops that won't let your site load at all.
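For illustration, here is a hedged sketch of how a redirect loop can creep into an .htaccess file (Apache mod_rewrite syntax; example.com is a placeholder domain):

```apache
# BAD: every request is redirected to the same host, so the rule
# matches its own result and loops forever.
RewriteEngine On
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# BETTER: only redirect when the host is NOT already www, so the
# redirected request no longer matches and the loop ends.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The fix is the RewriteCond line: without a condition that excludes the redirect target, the browser bounces between redirects until it gives up with a "too many redirects" error.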
Meta tags - Make sure the pages that aren't getting indexed don't have this Meta tag in the source code: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
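To make the difference concrete, here is what to look for in the page's <head> (view the page source in your browser to check):

```html
<!-- This tag tells crawlers NOT to index the page or follow its links: -->
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

<!-- Indexing is the default, so simply removing the tag is enough;
     spelling it out explicitly looks like this: -->
<META NAME="ROBOTS" CONTENT="INDEX, FOLLOW">
```

If you find the NOINDEX version on a page you want indexed, delete the tag and request a recrawl.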
Sitemaps - Your sitemap isn't regenerating for some reason, and you keep submitting the old/broken one in Webmaster Tools.
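A quick way to spot a stale sitemap is to open it and check the dates. Below is a minimal valid sitemap following the sitemaps.org protocol; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
</urlset>
```

If the <lastmod> values never change even though you've updated the site, or new pages never appear in the <url> entries, your sitemap generator is the problem - fix it, then resubmit the sitemap in Webmaster Tools.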
URL parameters - In Webmaster Tools you can set URL parameters that tell Google which dynamic links you do not want indexed. However, configuring parameters incorrectly can cause pages from your site to be dropped from the index, so it is recommended not to use this feature unless necessary.