Technical Search Engine Optimization

A lot of people overlook technical issues when they start their SEO plans, but these issues are important and can make or break your website. You don't need a PhD to improve this area of your site. Below are some easy ways to handle your technical issues with a simple notepad program and a few text files that even a non-techy can manage on their own.

.htaccess files can help you do some pretty amazing things. If you have trouble saving a .htaccess file (some systems hide or reject file names that begin with a dot), save it on your computer as something like x.htaccess, then rename it after you upload it via your FTP program.

Canonical Issues
Your canonical URL is the URL you want your website visitors to see. You want all links on your site to point to that one main URL for search engine popularity ranking, because a search engine reads the www and non-www versions of a URL as two separate pages, which splits your link popularity between them. The best way to solve this problem is to make sure your non-www URLs redirect to your www URLs. To redirect all of your links to the www version, copy the code below and simply change the domain to your own.

~~~~copy below this line~~~
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourwebsite\.com [NC]
RewriteRule ^(.*)$ http://www.yourwebsite.com/$1 [R=301,L]
~~~~stop copying~~~

301 & 302 Redirects
These redirects send your visitors where you want them. They are especially useful if you have deleted a page but still have a bunch of back links pointing to the now-absent page; you do not want to lose the popularity those back links provide. A 301 redirect tells your web server to automatically send visitors to the new location. A 302 redirect is only a temporary fix and will not pass that popularity along, so always be sure to use the permanent 301 redirect below.

~~~~copy below this line~~~
Redirect 301 /oldlocation http://www.yourwebsite.com/newlocation
Redirect 301 /oldlocation2 http://www.yourwebsite.com/newlocation2
Redirect 301 /oldlocation3 http://www.yourwebsite.com/newlocation3
~~~~stop copying~~~

Robots.txt Files

A robots.txt file is not required for search engines to crawl your content, but it is advisable to upload at least a blank robots.txt file so that crawlers requesting it do not trigger 404 file not found errors. Whenever you change your robots.txt file, and especially when you first upload a new one, it is best to check it with Google's free robots.txt validator, found in their webmaster central area. An invalid robots.txt can prevent search engines from crawling your site at all, so skipping this check can be a very costly mistake!
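If you would rather not upload a completely blank file, a minimal robots.txt that lets every crawler index the whole site looks like this (the empty Disallow line blocks nothing):

~~~~copy below this line~~~
User-agent: *
Disallow:
~~~~stop copying~~~

If you later want to keep crawlers out of a folder, list its path after Disallow, for example Disallow: /private/.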

Log Files
Depending on your hosting provider, you may not see a log file. If it is missing, call your provider and have them turn logging on for you. A log file keeps a record of the hints that will help you figure out a difficult SEO issue should one arise. Even if the problem is beyond your own understanding, sending that log file to an expert can save you a lot of money compared with making the expert hunt for the problem without those hints.

You will not fully understand most of what you see in the log file, and that is fine. The important things to look for are the redirects: make sure no 301 redirect points to another 301 redirect, and change any 302 redirects to 301 redirects, for the reasons listed above. If you find any 404 file not found errors, either upload a new file to replace the missing one or add a 301 redirect so you don't lose the popularity of the links pointing there.
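For example, in a typical Apache access log a 404 entry looks something like the line below (the IP address, date, and file name here are made up purely for illustration). The status code is the number right after the quoted request:

~~~~example log entry~~~
203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET /old-page.html HTTP/1.1" 404 196
~~~~end example~~~

Once you spot a 404 like this, one line in your .htaccess file fixes it (change both paths to your own):

~~~~copy below this line~~~
Redirect 301 /old-page.html http://www.yourwebsite.com/new-page.html
~~~~stop copying~~~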

These simple tasks can help fine-tune your SEO without calling in a professional.