Domain Trust (Trust Rank) and Domain Authority
- Who links to you? Use Google Webmaster Tools
- Who do you link to?
- Registration Info
- User Data Signals
- Link Juice/PageRank - DmR (Domain mozRank)
- Diversity of Link Sources
- Temporal Analysis
- Distribution Analysis
Domain Authority is the best prediction of how a website will perform in search engine rankings, i.e. how likely it is to rank in Google.com’s search results. Use Domain Authority when comparing one site to another or when tracking the “strength” of your website over time. The metric is calculated by combining all of Moz’s other link metrics (linking root domains, number of total links, mozRank, mozTrust, etc.) into one single score; over 150 signals are included in this calculation. Your website’s Domain Authority score will often fluctuate. For this reason, it’s best to use Domain Authority as a competitive metric against other sites, rather than as a historical measure of your internal SEO efforts. Whereas Domain Authority measures the predictive ranking strength of an entire domain or subdomain, Page Authority measures the strength of an individual page; the same distinction applies to metrics such as MozRank and MozTrust. You can measure Domain Authority using Open Site Explorer or the MozBar, SEOmoz’s free SEO toolbar.
It is based on the Linkscape web index and includes link counts, mozRank, mozTrust, and dozens of other factors. A machine-learning model finds the algorithm that best correlates these metrics with rankings across thousands of search results. While more specific metrics like mozRank can answer questions of raw link popularity, and link counts can show the raw quantity of pages/sites linking, the authority numbers are high-level metrics that attempt to answer the question “How strong are this page’s/site’s links in terms of helping them rank for queries in Google.com?” This means the best way to influence this metric is to improve your overall SEO. In particular, focus on your link profile (which influences mozRank and mozTrust) by getting more links from other well-linked-to pages.
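The idea of combining link metrics into one score that correlates with rankings can be sketched as a toy model. This is illustrative only, not Moz’s actual algorithm: the metric names, weights, and site data below are all made up, and the real model fits 150+ signals with machine learning rather than fixed weights.

```python
# Toy sketch: combine link metrics into a single authority score and check
# whether the score's ordering matches observed search rankings.
# Illustrative only; not Moz's actual Domain Authority calculation.

def authority_score(metrics, weights):
    """Weighted combination of link metrics into one score."""
    return sum(weights[k] * metrics[k] for k in weights)

def rank_order(values):
    """Return the rank (1 = highest value) of each item."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    ranks = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return ranks

# Hypothetical sites with made-up link metrics.
sites = [
    {"linking_root_domains": 120, "total_links": 900,  "mozrank": 5.1},
    {"linking_root_domains": 40,  "total_links": 300,  "mozrank": 3.8},
    {"linking_root_domains": 10,  "total_links": 2000, "mozrank": 2.2},
]
observed_ranks = [1, 2, 3]  # hypothetical Google positions

# Assumed weights for illustration; a real model would learn these.
weights = {"linking_root_domains": 1.0, "total_links": 0.01, "mozrank": 10.0}
scores = [authority_score(s, weights) for s in sites]
predicted_ranks = rank_order(scores)
print(predicted_ranks)  # [1, 2, 3], matching observed_ranks
```

A learning step would then adjust the weights until the predicted order best matches the observed rankings across many result pages.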
What do we want our visitors to do once they get to our site?
- Sign up for your email newsletter, fill out a bid request, or ‘Like’ us on Facebook?
- Decide on one main goal and make it obvious.
- Who is my primary audience?
- What type of leads do I want to capture?
- Of those leads, what is their primary function on my site?
- What would best motivate them to leave their contact information behind? You need to incentivize customers to complete your lead capture system. Spend some time considering what they will value. A free estimate? A free digital download with tips? What will strike a chord with them and help them cross the finish line?
Top performing keywords (in terms of rank, traffic and lead generation)/ Current SEO rankings for important keywords
Measure individual keyword rank, group keywords together into indexes, and then track the average rank of those indexes over time, all with Google Analytics:
1. Someone clicks on an organic search result
2. The user lands on your site
3. A custom piece of code that you install on your site collects the keyword and the rank of the result using a Google Analytics event
4. Google Analytics automatically calculates the average position for the result
5. You create indexes of keywords and analyze the data using the Event reports, or Excel and the GA API
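Step 5 above can be sketched in Python. The event rows below stand in for keyword/rank data exported from Google Analytics events, and the index groupings are hypothetical; a real workflow would pull this from the Event reports or the GA API.

```python
# Sketch: average the rank of keyword indexes from exported GA event data.
# The events and index groupings below are made-up placeholders.

# (keyword, rank of the organic result that was clicked)
events = [
    ("seo audit", 3), ("seo checklist", 5), ("seo audit", 4),
    ("link building tips", 8), ("buy backlinks", 12),
]

# Group keywords into named indexes (assumed groupings for illustration).
indexes = {
    "audit terms": {"seo audit", "seo checklist"},
    "link terms": {"link building tips", "buy backlinks"},
}

# Average rank per index; tracking this number over time shows whether a
# whole group of keywords is trending up or down in the rankings.
avg_rank = {}
for name, keywords in indexes.items():
    ranks = [r for kw, r in events if kw in keywords]
    avg_rank[name] = sum(ranks) / len(ranks)

print(avg_rank)  # {'audit terms': 4.0, 'link terms': 10.0}
```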
Number of inbound linking domains - TBD
Total number of pages indexed
In general, the term indexing means analyzing large amounts of data and building some sort of index to access the data in a more efficient way based on some search criteria. Compare it with database indexes. Google servers are constantly visiting pages on the Internet (crawling) and reading their contents. Based on the contents Google builds an internal index, which is basically a data structure mapping from keywords to pages containing them (very simplified). Also when the crawler discovers hyperlinks, it will follow them and repeat the process on linked pages. This process happens all the time on thousands of servers.
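The keyword-to-pages structure described above can be illustrated with a minimal inverted index. The pages and their contents here are made up, and a real search index is vastly more sophisticated:

```python
# Minimal inverted index: map each keyword to the set of pages containing it,
# a (greatly simplified) version of the structure described above.
pages = {
    "https://example.com/a": "cheap flights to rome",
    "https://example.com/b": "rome travel guide",
    "https://example.com/c": "cheap hotels",
}

index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# Lookup is now a fast dictionary access instead of scanning every page.
print(sorted(index["rome"]))
# ['https://example.com/a', 'https://example.com/b']
```

A crawler extends this loop by also collecting hyperlinks from each page and queueing them for the same treatment.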
Total number of pages that receive traffic - TBD
What is your most shared or viewed content? What are your most trafficked pages? What are your highest-ranking pages? TBD
Ranking on first page
- Provide high-quality content on your site. Use the Search Analytics report to see which queries lead to your pages, and what the click-through rate is for links to your site.
- Make your site mobile friendly.
- Use informative titles and snippets. Good, clear titles and accurate meta tag descriptions help us understand the purpose of a page and generate useful snippets in our search results.
- Add structured data to enable additional search result features such as stars, event information, or site search boxes, which add to the user experience and make your site more valuable to readers.
What is a sitemap?
A sitemap is a file where you can list the web pages of your site to tell Google and other search engines about the organization of your site content. Search engine web crawlers like Googlebot read this file to more intelligently crawl your site. Also, your sitemap can provide valuable metadata associated with the pages you list in that sitemap: Metadata is information about a webpage, such as when the page was last updated, how often the page is changed, and the importance of the page relative to other URLs in the site.
You can use a sitemap to provide Google with metadata about specific types of content on your pages, including video and image content. For example, you can give Google the information about video and image content:
- A sitemap video entry can specify the video running time, category, and age appropriateness rating.
- A sitemap image entry can include the image subject matter, type, and license.
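Following the sitemaps.org protocol and Google’s image/video sitemap extensions, an entry carrying this metadata looks roughly like the sketch below. All URLs and values are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/page.html</loc>
    <lastmod>2016-04-01</lastmod>
    <changefreq>weekly</changefreq>
    <image:image>
      <image:loc>https://www.example.com/photo.jpg</image:loc>
      <image:title>Product photo</image:title>
      <image:license>https://www.example.com/license</image:license>
    </image:image>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumb.jpg</video:thumbnail_loc>
      <video:title>Product demo</video:title>
      <video:description>Short demo of the product.</video:description>
      <video:duration>120</video:duration>
      <video:family_friendly>yes</video:family_friendly>
    </video:video>
  </url>
</urlset>
```

`lastmod`, `changefreq`, and the relative-importance `priority` tag are the page-level metadata mentioned earlier; the `image:` and `video:` elements carry the media-specific details.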
Do I need a sitemap?
If your site’s pages are properly linked, our web crawlers can usually discover most of your site. Even so, a sitemap can improve the crawling of your site, particularly if your site meets one of the following criteria:
- Your site is really large. As a result, Google’s web crawlers are more likely to overlook some of your new or recently updated pages.
- Your site has a large archive of content pages that are isolated or not well linked to each other. If your site’s pages do not naturally reference each other, you can list them in a sitemap to ensure that Google does not overlook some of your pages.
- Your site is new and has few external links to it. Googlebot and other web crawlers crawl the web by following links from one page to another. As a result, Google might not discover your pages if no other sites link to them.
- Your site uses rich media content, is shown in Google News, or uses other sitemaps-compatible annotations. Google can take additional information from sitemaps into account for search, where appropriate.
Search Console dashboard; the dashboard is the simplest way to get a quick health check on your site:
- Make sure that you aren’t experiencing an increase in errors for your site.
- Check that you don’t have any unusual dips in your click counts. Note that a weekly rhythm of weekend dips, or dips or spikes over holidays, is normal.
When your content changes
- Test that Google can access your pages using the Fetch as Google tool.
- Tell Google which pages to crawl by updating your sitemap.
- Tell Google which pages not to crawl using robots.txt or noindex tags.
- A few weeks after you post content, confirm that the number of indexed pages in your site is rising and that you don’t have any blocked resources that might impair Google’s crawling of your pages.
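For the crawl-blocking step above, a robots.txt file looks like the fragment below; the path is a placeholder, and the `Sitemap:` line is how a sitemap is referenced from robots.txt:

```
# robots.txt: keeps compliant crawlers out of a directory (blocks crawling)
User-agent: *
Disallow: /private/

# Points crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

To block indexing rather than crawling, a page can instead carry a `<meta name="robots" content="noindex">` tag in its `<head>`; note that robots.txt stops crawling while noindex stops the page from appearing in results.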
Adding new properties:
- New international content: Be sure to target the correct country with your site and add hreflang link tags to your pages.
If you change your site’s domain name:
- Use the change of address tool to point Google search to your new location.
Removing a page from search results:
- Use the URL removal tool and take other appropriate steps to block crawling and/or indexing.
Build and submit a sitemap:
- Decide which pages on your site should be crawled by Google, and determine the canonical version of each page.
- Decide which sitemap format you want to use. You can create your sitemap manually or choose from a number of third-party tools to generate your sitemap for you.
- Test your sitemap using the Search Console Sitemaps testing tool.
- Make your sitemap available to Google by adding it to your robots.txt file and submitting it to Search Console.
Google supports several sitemap formats (XML, RSS/Atom, and plain text).
All formats limit a single sitemap to 50MB (uncompressed) and 50,000 URLs. If you have a larger file or more URLs, you will have to break your list into multiple sitemaps. You can optionally create a sitemap index file (a file that points to a list of sitemaps) and submit that single index file to Google. You can submit multiple sitemaps and/or sitemap index files to Google.
The three key processes in delivering search results to you are:
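The 50,000-URL split and the sitemap index can be handled with a small script like this sketch; the domain, URL pattern, and file names are placeholders:

```python
# Sketch: split a URL list into sitemaps of at most 50,000 entries each
# and build a sitemap index file that points to all of them.
MAX_URLS = 50_000

def build_sitemap(urls):
    """Render one sitemap file for up to MAX_URLS URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n")

def build_index(sitemap_urls):
    """Render a sitemap index file pointing at each sitemap."""
    entries = "\n".join(f"  <sitemap><loc>{u}</loc></sitemap>"
                        for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</sitemapindex>\n")

# Placeholder URL list; a real site would collect its canonical URLs here.
all_urls = [f"https://www.example.com/page-{i}" for i in range(120_000)]

chunks = [all_urls[i:i + MAX_URLS] for i in range(0, len(all_urls), MAX_URLS)]
sitemaps = [build_sitemap(c) for c in chunks]
index = build_index(f"https://www.example.com/sitemap-{n}.xml"
                    for n in range(len(sitemaps)))
print(len(sitemaps))  # 120,000 URLs -> 3 sitemaps
```

The single index file is then the one you submit to Search Console, as described above.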