This page protects the domain from so-called bad web crawlers (spambots). It is often the case that personal data, e.g. bank account details, appear on a web page. Having such data end up in search engines is a high risk. For this reason the major search engines have agreed on a standard for excluding pages from their indexes. *
You followed a link that was generated specifically for these spambots, and so your IP address has been banned.
To access the site again, enter the numbers shown in the picture into the field and submit; you will then immediately regain access to this website.
* To exclude pages, you create a file called robots.txt in the root directory of your site. In this file you define rules that specify which pages may be indexed and which may not. Search engines such as MSN, Google and Yahoo handle these rules correctly, but bad robots do not. To deny such spambots access to the content, those spiders are banned by IP address. More information about the robots.txt standard can be found at https://www.robotstxt.org/robotstxt.html
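For illustration, a minimal robots.txt might look like the following sketch; the /trap/ and /private/ paths are hypothetical examples, not directories named anywhere on this site:

    User-agent: *         # these rules apply to all crawlers
    Disallow: /trap/      # hypothetical trap directory; well-behaved robots skip it
    Disallow: /private/   # hypothetical directory containing personal data

A well-behaved search engine reads this file and never requests the disallowed paths. A bad robot ignores the rules and follows links into them anyway, which is how its IP address ends up banned as described above.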