Website analysis examines the website's domain, the URL's ranking, the volume and quality of visitor traffic, the construction of the website, the links to and from other web pages, and any operational or software problems.
Review of Domain Issues
Domains rank globally based on the strength of the domain’s authority and relevance. Domain authority increases when other domains that have a high authority link to a domain. Websites also have a page ranking that shows the popularity of the website when compared to other websites.
A crawler tool scan of the website shows any broken links. The links to and from a website are evaluated for quality. Quality links, for example from noted authority websites, help improve the ranking on the search engine results page (SERP). Outgoing links that look like spam, or inbound links from poor-quality websites, actually harm SERP rankings.
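The link-extraction step of such a crawl can be sketched with Python's standard library alone. The class and function names below are illustrative, not taken from any particular crawler tool:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Returns all anchor hrefs found in an HTML document."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

A full broken-link check would then request each extracted URL (for example with `urllib.request`) and flag any 4xx or 5xx responses as broken.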
Another domain issue is the DMOZ category. DMOZ is a web directory whose editors decide the category for a website. Besides DMOZ, the website should be listed in the Yahoo Directory. It should also appear on the web archive at www.archive.org, as well as in the indexes of the major search engines: Google, Yahoo, and Bing.
The WHOIS record for the domain should be up-to-date with accurate information. The load time for the web pages needs to be as short as possible, because excess latency degrades the visitor's experience of the website.
The website needs to have a good appearance and readability on all major browsers, including Chrome, Firefox, Edge, Internet Explorer, and Safari. Additionally, the website display on mobile devices needs checking as part of the website analysis.
Website analysis of the traffic to the website considers a number of factors, including:
- Alexa ranking of traffic
- Number of visitors
- Number of unique visitors
- Page views
- Bounce rate
- Time spent on website
Trend analysis shows the changes over time and in response to certain media campaigns.
Analysis of each web page determines the number of links on the page and the page ranking. Each web page undergoes keyword analysis, including a check of keyword frequency rates. The social media popularity of each web page is evaluated using a measure called the “WuzzRank.”
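A keyword-frequency check of this kind can be sketched in a few lines of standard-library Python. The function name and the five-word cutoff are illustrative choices, not part of any specific analysis tool:

```python
import re
from collections import Counter

def keyword_frequency(text, top_n=5):
    """Counts how often each word appears in page text, ignoring case,
    and returns the top_n most frequent words with their counts."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return Counter(words).most_common(top_n)
```

Running this over a page's visible text gives the raw counts behind a keyword-frequency report or a keyword cloud.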
The report includes all the errors found during the website analysis. It also shows potential problems that negatively impact search engine results and recommends steps for improvement.
Website Analysis Tools
Here are some useful website analysis tools, available for free or as a free trial, that are easy enough for beginners to use yet provide plenty of good information for advanced webmasters:
1. Moz Pro Tools (30-day free trial) give information about website rankings, links, and SEO problems using the Crawl-Test Tool. Other Moz tools help with keyword searches and page optimization, and give an action list of things to do to improve website functionality, performance, and ranking.
2. The Screaming Frog SEO Spider (lite version is free) is a handy tool for locating and fixing SEO problems.
3. The Google Location Changer makes it easy to change locations temporarily while doing keyword research.
4. To see a website as the Google search engine sees it, use the Fetch as Google tool.
5. Google Webmaster Tools are very helpful and free.
6. SEO SiteCheckup has dozens of free software tools that address specific issues. A webmaster can use the free tools individually from the website or register for a free account to download all of them.
Here are the specialized tools:
Backlinks – Checks the quality of backlinks to avoid problems with search engines.
Broken Links – Tests for any broken links.
Canonical Tag Checker, URL and IP Canonicalization – These tools check to see if similar URLs or IP addresses show the same web page, such as www.website.com and website.com, which may confuse search engines. The canonical tag tells the search engines which page to index when there is more than one web page with duplicate content.
Common Keywords – Shows the keywords used most frequently on the web page.
CSS and JS Minification – Checks whether CSS and JS files are minified to a minimal size.
Directory Browsing – Disabling directory browsing is a good idea for better security.
Disallow Directive Checker – Shows which parts of the website search engines are not indexing.
Doctype – Checks for the doctype declaration, which tells each browser the version of HTML used.
Favicon – Checks to see if the website is using favicons properly to make it easier to find the web page as a bookmark.
Flash Test – Checks whether the web page uses Flash, whose functions newer HTML5 features have replaced.
Frameset – Checks to see if frames are used, which may harm search engine rankings.
Google Analytics – Checks to see if the web pages connect properly to Google Analytics.
Google Search Results – This shows the web page title (should be less than 70 characters), web page URL, and Meta description (should be less than 160 characters).
H1 and H2 Tags – Shows the header tags used by the website.
HTML Compression/GZIP – Checks to see if the web page is using HTML compression to make the web page smaller and load faster.
HTML Page Size – Checks the overall page size of the HTML code.
HTTPS – This test checks for the use of Secure Sockets Layer (SSL) encryption.
Image Alt – Checks to see if any images are missing their “alt” attributes. Search engines use the alt information that describes the photos or graphics.
Image Expires Tag – Checks whether the website uses an image Expires tag, which sets a date when an image expires. Browsers use a cached version of the image until the expiration date, so the web pages load more quickly on a subsequent visit.
Inline CSS – Checks for inline CSS properties, which are style attributes on specific tags. These inline CSS attributes should be moved to one external file so that individual web pages load more quickly.
JS Error – Checks for any JavaScript errors.
Keywords Cloud – Gives a quick visual diagram of keyword usage. Keywords displayed in larger fonts are more frequently used.
Libwww-perl Access – Checks whether access by the libwww-perl user agent is blocked, which eliminates a common security vulnerability.
Media Query Responsiveness – This checks for the ability to display content optimized for different devices, so that the website looks good and is readable on all platforms and all devices.
Meta Tags Analysis – Checks for Meta description, Meta titles, Meta keywords, and Meta robot tags.
Microdata Schema – The use of microdata in a web page helps search engines create more detailed search results.
Mobile Snapshot – Shows how the website displays on a mobile device.
Nested Tables – Checks for use of nested tables that slow down web page loading speed.
Nofollow Tag Checker – Checks for nofollow tags, which tell a search engine not to follow outgoing links from web pages.
Noindex Tag Checker – Checks for noindex tags, which tell search engines not to show the web page in search engine results.
Outdated HTML Code – This test finds old HTML code for replacement with better code to work with newer browsers.
Page Cache Test – Checks whether frequently used elements are stored in the server cache for more rapid presentation.
Page Objects – Checks to see if the objects requested by a web page exist and are retrievable.
Plaintext Emails – Identifies email addresses published in plain text. Fixing this problem prevents automatic web-crawler bots from harvesting email addresses to spam.
Robots.txt – Tests for the presence of the robots.txt file, which tells search engines not to crawl certain files.
Safe Browsing – Checks for malware on the website or security risks.
SEO Friendly URL – Checks the web page URLs to see if the web page links are SEO friendly with no spaces, underscores, or unusual characters in the URL title. It is best to use dashes in the web page URLs, which the Google search engine recognizes as spaces between the words.
Server Signature – Checks to make sure the server signature is set to OFF to prevent public disclosure of software versions that are in use on the server.
Site Loading Test – Checks to see how fast the site loads.
Sitemap – Tests for a sitemap file showing the index of all the web pages on the website.
Social Media – Checks to see if a web page links to social media.
SPF records – Checks for an SPF record on the DNS server. This tells if email is coming from a legitimate source.
URL Redirects – Finds any redirects that cause search engine ranking problems.
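Several of the on-page checks listed above (title length, Meta description length, missing image alt attributes) can be approximated with standard-library Python. The class, function, and thresholds below are an illustrative sketch, not code from SEO SiteCheckup:

```python
from html.parser import HTMLParser

class OnPageAuditor(HTMLParser):
    """Gathers the page title, meta description, and a count of
    images that lack alt text."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    """Returns a list of on-page issues found in an HTML document."""
    a = OnPageAuditor()
    a.feed(html)
    issues = []
    if len(a.title) > 70:
        issues.append("title over 70 characters")
    if len(a.meta_description) > 160:
        issues.append("meta description over 160 characters")
    if a.images_missing_alt:
        issues.append(f"{a.images_missing_alt} image(s) missing alt text")
    return issues
```

The 70- and 160-character limits mirror the Google Search Results guidance given earlier for page titles and Meta descriptions.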
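The robots.txt and Disallow Directive checks can be reproduced with Python's built-in `urllib.robotparser`. The rules shown here are a hypothetical example; a live check would point `RobotFileParser.set_url()` at the site's real robots.txt and call `.read()`:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, supplied inline for illustration.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Pages under /private/ are disallowed; everything else is crawlable.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

Comparing these results against the pages a webmaster actually wants indexed reveals whether the Disallow directives are blocking more (or less) than intended.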
Many software tools are useful for fine-tuning a website. A comprehensive website analysis is the first step to identify problems before making the necessary changes.