
Site Report

This tab shows site quality issues, including broken links and server configuration problems.

  • Failure: Broken links - Some pages contain links that don't work.
  • Success: Server configuration - No issues found.
  • Success: ASP, ASP.NET and PHP script errors - No issues found.
  • Success: Internet RFCs - No issues found.
Priority | Description and URL | Guideline and Line# | Count

Priority 1

1 issue on 10 pages

Critical: This link is broken. The URL scheme is unrecognized. Guideline: Broken Link. 10 pages
The URL scheme should be http:, https:, file:, mailto:, data:, or javascript:. Links using the skype: scheme only work when the user has Skype installed, and don't work with Skype Lite. mailto: should not be followed by // or contain more than one colon. (A sketch of this check appears after the link list below.)
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/
Lines 90, 93, 96, 1508, 1510, ...
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/contacts
Lines 101, 104, 107, 405, 411, ...
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/delivery
Lines 94, 97, 100
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/faq
Lines 94, 97, 100
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/info
Lines 94, 97, 100
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/price
Lines 98, 101, 104
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/professii
Lines 100, 103, 106
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/rewiews
Lines 94, 97, 100
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/warranty
Lines 94, 97, 100
Link: tg://resolve?domain=+74996495662
Page: http://diploms-service.com/zakazat
Lines 94, 97, 100
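
The check behind this guideline is straightforward to reproduce outside the scanner. Below is a minimal Python sketch, not PowerMapper's own implementation, that flags anchors whose scheme falls outside the allowed list; the tg: scheme used by these Telegram links fails it, and the class name LinkChecker is illustrative only.

    from html.parser import HTMLParser
    from urllib.parse import urlsplit

    # Schemes the guideline accepts; "" covers relative links.
    ALLOWED = {"", "http", "https", "file", "mailto", "data", "javascript"}

    class LinkChecker(HTMLParser):
        """Collects (line, href) pairs for links with unrecognized schemes."""
        def __init__(self):
            super().__init__()
            self.problems = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href") or ""
            scheme = urlsplit(href).scheme.lower()
            if scheme not in ALLOWED:
                # tg:, skype: and other app-specific schemes end up here
                self.problems.append((self.getpos()[0], href))
            elif scheme == "mailto" and (
                href.count(":") > 1 or href.lower().startswith("mailto://")
            ):
                # mailto: must not be followed by // or contain extra colons
                self.problems.append((self.getpos()[0], href))

    checker = LinkChecker()
    checker.feed('<a href="tg://resolve?domain=+74996495662">Telegram</a>')
    print(checker.problems)  # [(1, 'tg://resolve?domain=+74996495662')]

Running this against the pages above would report each tg: link with the line numbers listed in the table.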

Informative

These messages are for information only and do not indicate errors or conformance problems.

Information: Some links were not visited because they were blocked by a robots.txt file or robots meta tag. Guideline: Blocked Links. 4 pages
A robots.txt file is a digital "keep out" sign placed on a web server by the server administrator. Adding the following entries to the robots.txt file will bypass any blocks intended for other web crawlers (a way to verify the block is sketched after the URL list below):
User-agent: PowerMapper
Allow: /
http://diploms-service.com/media/plg_jchoptimize/assets2/jscss.php?f=42202e41771b42ff61b711548c4cbf83&type=js&gz=gz&i=0 (Line 1)
http://diploms-service.com/media/plg_jchoptimize/assets2/jscss.php?f=79961f29ea34ef83b714747080364a95&type=css&gz=gz&i=0 (Line 1)
http://diploms-service.com/media/plg_jchoptimize/assets2/jscss.php?f=bfa3b6a6e4540b978535f718db257603&type=css&gz=gz&i=0 (Line 1)
http://diploms-service.com/media/plg_jchoptimize/assets2/jscss.php?f=ebb13412f6dcffa602c70fa5341bf256&type=css&gz=gz&i=0 (Line 1)
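
Whether a given URL is blocked for a given user agent can be verified with Python's standard robotparser. This is a sketch assuming the site's robots.txt is live at the site root, not part of the report's own tooling:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("http://diploms-service.com/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    url = ("http://diploms-service.com/media/plg_jchoptimize/assets2/"
           "jscss.php?f=42202e41771b42ff61b711548c4cbf83&type=js&gz=gz&i=0")
    print(rp.can_fetch("PowerMapper", url))
    # False while the file blocks this path; True once the
    # User-agent: PowerMapper / Allow: / entries above are added

The scanner performs the same can-fetch test before visiting each link, which is why these four URLs were skipped rather than scanned.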
Information: Spell checking was not enabled for this scan. 1 page
If you want to check spelling, set the language using the Edit Scan command in OnDemand.
http://diploms-service.com/ (Line 1)