
Checklist for the modern-day website owner, Part 1

The domain has been purchased, the name has been registered and the content has been built, so here are a few easy steps to follow to ensure Search Engine Optimization rules your world.

1) Install the right tools!

Any discerning workman will tell you that the right tools make the difference between a good job and a bad one. To ensure your domain wins the SEO battle you will need the right weapons up your sleeve, and I would recommend you install Google Analytics and Google Webmaster Tools: two programs that we at Expired Domains constantly use to track errors, visitors, traffic sources, keyword impressions and much more. The Google Analytics phone app is neat too, which means you can monitor your site anytime and anywhere.
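To give you a flavour, here is the kind of tracking snippet Google Analytics asks you to paste into the <head> of every page. This is only an illustrative sketch of the current gtag.js flavour; use the exact code your own Analytics account gives you, and note that the measurement ID shown (G-XXXXXXX) is a placeholder:

    <!-- Google Analytics tag (illustrative; copy the exact snippet and ID from your own account) -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'G-XXXXXXX');  /* placeholder measurement ID */
    </script>

Once that is on every page, the data starts flowing into both the web dashboard and the phone app mentioned above.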

 

2) Identify your target Keywords.

Keywords are king and beyond important in ensuring your domain becomes an SEO success story. Focus on high-volume but low-competition keywords, but keep adding a smaller tail of keywords to your content to ensure you have saturated your target market.


3) Cloaking is only a good thing when it comes to an interstellar battle in Star Trek!

Cloaking, in domain terms, is showing one version of a page to users but a different version to search engines. Google, Bing and the rest want to see the identical results users are seeing and will penalize any site that uses these outdated black-hat tactics. If in doubt, just ensure that any hidden text, hidden links or cloaking is avoided.


4) Speed Demon 

Go faster, faster, faster when it comes to your website’s server speed and page loading time (commonly called “site performance”); the speed will impact the user experience and SEO as well. A site’s accessibility for users and spiders is crucial: the longer the web server response time, the longer it takes for your web pages to load. Slow page-loading times can lose you customers and increase bounce rates as your site visitors get bored and leave, and they will also slow down search engine spiders so less of your site gets indexed.

You need a fast, high-performance server that allows search engine spiders to crawl more pages on each visit and that satisfies your human visitors as well. Keep in mind that bad website design can play a key role in site performance, so ensure this is a key focus for you once you have bought that domain. Google has an awesome free tool, PageSpeed Insights, to analyze a site’s performance. Last but not least, in this competitive world of SEO, site performance is now seen by Google as a tie-breaker between otherwise equal web or mobile results.
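As a rough illustration of the kind of server-side tweak that helps, here is a minimal .htaccess sketch, assuming an Apache server with the mod_deflate and mod_expires modules enabled (your host may differ); it compresses text responses and lets browsers cache static assets, both of which cut page-loading time:

    # Compress common text-based responses (requires mod_deflate)
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

    # Let browsers cache static assets for a month (requires mod_expires)
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType text/css "access plus 1 month"
    </IfModule>

PageSpeed Insights will tell you whether compression and caching like this are actually missing before you touch anything.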

 

5) Redirect, redirect, redirect…

Yes, this can get confusing, especially when you are managing a website with a lot of content, but be careful: one wrong move here and you may feel the penalties for a long time to come. Firstly, when you move a web page to a different URL, make sure you’re using the right type of redirect, and that you’re also redirecting users to the most appropriate page. A 301 tells the search engine to drop the old page from its index and replace it with the new URL. Search engines transfer most of the link equity from the old page to the new one, so you won’t suffer a loss in rankings.

Mistakes are common with redirects. A webmaster, for example, might delete a web page but neglect to set up a redirect; this causes users to get a “Page Not Found” 404 error. Furthermore, sneaky redirects in any form, whether they are user agent/IP-based or redirects through JavaScript or meta refreshes, frequently cause ranking penalties. In addition, we recommend avoiding 302 redirects altogether. A 302 (temporary) redirect signals that the move will be short-lived, and therefore search engines do not transfer link equity to the new page. Both the lack of link equity and the potential filtering of the duplicated page can hurt your rankings.
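For example, on an Apache server a permanent redirect can be declared with a single mod_alias line in the .htaccess file; the paths below are purely illustrative:

    # Permanently (301) redirect the old URL to its replacement
    Redirect 301 /old-page.html https://www.example.com/new-page/

Note that a plain Redirect directive with no status code defaults to a temporary 302, which is exactly what the paragraph above warns against for a permanent move.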

6) Keep your content fresh and original on each page.

This can actually be tricky, especially if you have tag lines which you want repeated on multiple pages. Here at Expired Domains we have made that mistake, and once it was fixed we almost immediately reaped the Google rewards. So if you haven’t already, fix and prevent duplicate content issues within your site. Search engines get confused about which version of a page to index and rank if the same content appears on multiple pages. Ideally, you should have only one URL for one piece of content. When you have duplicated pages, search engines pick the version they think is best and filter out all the rest. You lose out on having more of your content ranked, and also risk having “thin or duplicated” content, something Google’s Panda algorithm penalizes.

If your duplicate content is internal, such as multiple URLs leading to the same content, then you can decide for the search engines by deleting and 301-redirecting the duplicate page to the original page. Alternatively, you can use a canonical link element, also known as a canonical tag, to communicate which is the primary URL.
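A canonical tag is just a single line in the <head> of each duplicate page, pointing at whichever URL you want search engines to treat as the primary version; the address below is a placeholder:

    <!-- Tell search engines which URL is the preferred version of this content -->
    <link rel="canonical" href="https://www.example.com/original-page/">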

 

7) Error pages are important too!

I learned this a long time ago in my CRM days, but error pages really are important: they convey a message that resonates with the end user. A generic “Page Not Found” HTML 404 error message will send your site visitors away, never to come back.


 

Most website visitors simply click the Back button when they see that standard 404 error, leaving your site forever. Since it’s inevitable that mistakes happen, and people will get stuck sometimes, you need a way to help them at their point of need. To keep people from jumping ship, create a custom 404 error page for your website.

First, make the page. A custom 404 page should do more than just say the URL doesn’t exist. While some kind of polite error feedback is necessary, your customized page can also help steer people toward pages they may want with links and other options. Additionally, you want your 404 page to reassure wayward visitors that they’re still on your site, so make the page look just like your other pages (using the same colors, fonts and layout) and offer the same side and top navigation menus. In the body of the 404 page, here are some helpful items you might include:

  • Apology for the error
  • Home page link
  • Links to your most popular or main pages
  • Link to view your sitemap
  • Site-search box
  • Image or other engaging element

Since your 404 page may be accessed from anywhere on your website, be sure to make all links fully qualified (starting with http).
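Pulling those items together, a custom 404 page can be as simple as the sketch below; the copy, links and layout are placeholders, and in practice you would wrap the body in the same header, navigation and footer as the rest of your site:

    <!-- 404.html: illustrative custom error page with placeholder links -->
    <!DOCTYPE html>
    <html>
    <head><title>Sorry, we can't find that page</title></head>
    <body>
      <h1>Sorry, that page doesn't seem to exist</h1>
      <p>The link may be old or mistyped. These pages might help:</p>
      <ul>
        <li><a href="https://www.example.com/">Home page</a></li>
        <li><a href="https://www.example.com/sitemap/">Sitemap</a></li>
      </ul>
      <!-- Your site-search box and normal navigation menus would also go here -->
    </body>
    </html>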

Next, tell your server. Once you’ve created a helpful, customized error page, the next step is to set up this pretty new page to work as your 404 error message. The setup instructions differ depending on what type of website server you use. For Apache servers, you modify the .htaccess file to specify the page’s location. If your site runs on a Microsoft IIS server, you set up your custom 404 page using the Internet Information Services (IIS) Manager. WordPress sites have yet another procedure.
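On Apache, for instance, that is usually a single line in the .htaccess file; the path below is illustrative and should point at wherever you saved your custom page:

    # Serve the custom page whenever the server would return a 404
    ErrorDocument 404 /errors/404.html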

We should note that some smaller website hosts do not permit custom error 404 pages. But if yours does, it’s worth the effort to create a page you’ve carefully worded and designed to serve your site visitors’ needs. You’ll minimize the number of misdirected travelers who go overboard, and help them remain happily on your site.

8) Robots are in charge

If you did not already know this, the first thing a search engine looks for upon arriving at your site is a text file kept in the root directory of a website, called a robots.txt file, that instructs spiders which directories can and cannot be crawled. With simple “disallow” commands, a robots.txt is where you can block spiders from crawling:

  • Private directories you don’t want the public to find
  • Temporary or auto-generated pages (such as search results pages)
  • Advertisements you may host (such as AdSense ads)
  • Under-construction sections of your site

Every site should put a robots.txt file in its root directory, even if it’s blank, since that’s the first thing on the spiders’ checklist. But handle your robots.txt with great care, like a small rudder capable of steering a huge ship. A single disallow command applied to the root directory can stop all crawling, which is very useful, for instance, for a staging site or a brand-new version of your site that isn’t ready to be launched. But remember: websites have been known to sink without a trace in the SERPs because the webmaster forgot to remove that disallow command when the site went live.
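To make this concrete, here is a minimal illustrative robots.txt; the directory names are placeholders, and the commented-out line shows the dangerous catch-all disallow that should never survive into a live site:

    # robots.txt, served from https://www.example.com/robots.txt
    User-agent: *
    Disallow: /private/    # keep a private directory out of the crawl
    Disallow: /search/     # skip auto-generated search results pages
    # Disallow: /          # blocks the whole site; staging only, remove before launch!
    Sitemap: https://www.example.com/sitemap.xml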

Google offers a robots.txt checker in Google Webmaster Tools that checks your robots.txt file to make sure it’s working as you need. At Expired Domains we also frequently use the Fetch as Google tool if there’s any question about how a particular URL may be indexed. This tool simulates how Google crawls URLs on your website, even rendering your pages to show you whether the spiders can correctly process the various types of code and elements on your page.

So those are 8 tips to get you started, or you could head over to our own expired domains search engine and pick up a few more domains to optimize: Expired Domains.
