Google notes that this feature is useful for diagnosing why a page performs poorly in search results, since it lets you spot crawling errors. If Google cannot render the page the way you intend Googlebot to see it, that could hurt your ranking on search engine results pages.

The new Fetch and Render option takes longer than the standard Fetch option, which simply checks whether a URL is crawlable and then lets you submit the page to Google's index.

Fetch As Google will not render anything blocked by robots.txt. If you disallow crawling of some of your files, Google cannot show them to you in the rendered view.
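
For example, a robots.txt along these lines (the directory paths here are hypothetical) would leave gaps in the rendered view wherever the page depends on the blocked stylesheets or scripts:

```
# Hypothetical robots.txt illustrating the issue described above.
# Googlebot is told not to crawl the asset directories, so Fetch and
# Render cannot fetch the CSS or JavaScript needed to draw the page.
User-agent: Googlebot
Disallow: /css/
Disallow: /js/
```

Unblocking those resources (or removing the Disallow rules entirely for assets the page needs) lets Fetch and Render display the page as Googlebot would actually see it.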