Free SEO Tools for Busy Bees: Maximize Impact Without Maxing Out Your Budget
In today's crowded and fast-moving online landscape, free access to a solid set of SEO tools has become genuinely important for small businesses and entrepreneurs. These tools are essential for optimizing a website's performance and driving meaningful organic traffic to it. Used well, they give businesses a strategic advantage, greater online visibility, and a better chance of succeeding in the digital realm.
1. Introduction
Every search engine optimizer on the web is constantly hunting for a good free SEO tool for their website optimization. Casual readers absorb plenty of advice on building content strategies, earning strong links, and growing link popularity. Although all of it can be done for free with common sense and elbow grease, it takes a lot of time.
There is plenty of experimentation and trial and error before a site can climb the SERP ladder, and for many webmasters, money doesn't exactly grow on trees. This is where free SEO tools come into the picture. Tuning up your website and making changes here and there is sometimes a blind effort; we don't really know whether it's effective or futile, which can be risky and damaging to a site's success. With free tools, we don't stand to lose anything in the process: if a campaign fails, at least we learn from our errors without it costing a dime.
Cost efficiency is another huge factor, because we are only human and bound to make mistakes. The last thing we want is to spend a chunk of money on an SEO expert whose methods are not guaranteed, and damage our site further. With free tools, we can run the SEO/SERP campaign without a third party and put the cost savings to work elsewhere. And despite the price tag, the functionality of today's free tools is comparable to software that costs a hefty sum.
1.1. Importance of SEO tools
Keywords should be just that – words that are keys to reaching the top positions in the search engines. One of the most essential tasks in search engine optimization is therefore researching the keyword list. For each keyword, a common rule of thumb is to keep the density between roughly 3-7% of the page's text: if the density of a keyword is too low, it will not be very effective, and if it is too high, the search engine may see it as spam. This measure is known as keyword density (keyword prominence, by contrast, refers to how early and where on the page a keyword appears). A further simple step is making sure your page contains at least 100 words or so, giving search engines enough text to index. This is the easy part of search engine optimization and only needs to be done once.
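If you want to check this by hand, keyword density is simple enough to compute yourself. Below is a minimal Python sketch; the sample text is invented, and the density formula (phrase occurrences times phrase length over total words) is one common definition, since tools differ in exactly how they count:

```python
import re

def keyword_density(text, keyword):
    """Return the density of `keyword` in `text` as a percentage:
    (phrase occurrences * words in phrase) / total words * 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    if not words or not kw_words:
        return 0.0
    # Slide a window over the text and count exact phrase matches.
    hits = sum(
        words[i:i + len(kw_words)] == kw_words
        for i in range(len(words) - len(kw_words) + 1)
    )
    return hits * len(kw_words) / len(words) * 100

page_copy = "Blue widgets are great. Buy cheap blue widgets here. Blue widgets last."
print(f"{keyword_density(page_copy, 'blue widgets'):.1f}%")  # 50.0% - far too high!
```

A result like the one above would scream over-optimization; on a real page of a few hundred words you would be aiming much lower.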
1.2. Benefits of using free SEO tools
Feature-laden paid versions are no doubt more powerful, but there are still plenty of things you can do with free tools to be successful.
Learning. Free tools can be a great way to learn the basics of a new tool and decide if it’s worth paying to go for a more powerful paid version.
Small Jobs. For tiny one-time jobs, there’s no need to use every ounce of the monthly allotment for a more powerful paid version – just use the free tools.
Spread out resources. If you're targeting a search engine other than Google, free Google-specific tools can cover the Google side of your campaign, leaving your paid tools' quotas free for the other engines.
Use in conjunction with other tools. Sometimes a free tool complements a more powerful paid tool very well. A free SERPs tool, for example, tells you whether you're in the top 100 results for a given keyword in Google and Yahoo. You can use it alongside a paid keyword ranking tool to further monitor the results of the keywords you're tracking.
Money talks. Though it doesn’t always apply in this industry, there’s no argument that price can be a reason to use the free version of a tool. Low budget clients with low competition in their markets can often get by on a free tool or two, and sometimes it’s better than what they had been using.
2. Keyword Research
Once you've brainstormed, it's important to identify which keywords are most relevant to your business. Any list of keywords is going to be long, so it's worth using a keyword research tool. When browsing keyword research tools, there are a few factors to take into consideration.
First and foremost is the relevance of the keyword to your business. Most keyword tools work by showing how often a keyword is searched and then generating a list of related keywords. This is a very useful feature, since it's important to target keywords directly related to your business. Attempting to rank highly for a broad term such as "insurance" or "mortgage" will be an expensive and fruitless endeavor: these generic terms are used by searchers who are merely window shopping, not looking to conduct serious business, so the traffic will be next to impossible to monetize and unlikely to convert. It's far more profitable to target terms that suggest a higher level of intent on the part of the customer, such as "car insurance quote" or "best rated reverse mortgage". Keep in mind that the more specific the keyword, the easier it will be to rank for, as there will be less competition.
Second, determine whether there is adequate search volume for a given keyword. Niche terms often have lower volume, so weigh the traffic you would receive against the competitiveness of the keyword.
Finally, take into account the commercial intent of the keyword. Attempting to rank for a keyword with no clear commercial intent is a waste of time, as many of those users are looking for information with no intention to purchase.
2.1. Finding relevant keywords
A "keyword" is what search engine users type in the search box to find information. Keyword research helps you choose the right keywords, the ones your target audience uses to find the kinds of products or services you offer, so you can work them into your page copy and search engines will surface your page for them. Step one in locating the right keywords is to brainstorm. Sit down and make a list of words relevant to your webpage: words you think people would use to reach your type of content, and words that describe the content on your site. You never want to limit yourself, so have a friend look over your list and offer suggestions or feedback.
Once you have a decent list compiled, it is time to expand on it. The Google keyword tool can help you identify which keywords are most frequently used and how competitive they are in terms of ad spend. You won't be using the tool for its advertising purpose, but it works just the same. Take your list of keywords and plug them into the Google keyword tool. The tool will generate a list of keywords similar to the ones you entered, along with how often each is searched and how much competition it has. Keep the keywords that are relevant to your page and have a good ratio of search volume to competition. Although keywords with high search volume and low competition may seem like an obvious win, such keywords are often irrelevant and provide no real value to your website. A good tip: for keywords with high competition, run a quick Google search on them. These terms are usually overused, and finding similar ones with lower competition is often the better choice.
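To make "a good ratio of search volume to competition" concrete, here is a small sketch that scores an exported keyword list. The figures and the scoring formula are invented for illustration; they are not from any particular tool, and the score says nothing about relevance or intent, which you still judge by hand:

```python
# Each entry: (keyword, monthly searches, competition score from 0 to 1),
# as exported from a keyword tool. All figures below are made up.
candidates = [
    ("car insurance quote", 4400, 0.85),
    ("best rated reverse mortgage", 320, 0.40),
    ("insurance", 550000, 0.99),
]

def opportunity(volume, competition):
    # Reward volume, punish competition; +0.01 avoids division by zero.
    return volume / (competition + 0.01)

for kw, vol, comp in sorted(candidates, key=lambda c: -opportunity(c[1], c[2])):
    print(f"{kw:30s} volume={vol:>7,} competition={comp:.2f} "
          f"score={opportunity(vol, comp):,.0f}")
```

Note how the generic head term "insurance" still tops a naive score like this; the relevance and intent filters described above are what stop you chasing it.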
2.2. Analyzing keyword competition
The objective of understanding keyword competition is to find open playing fields. It involves determining who your competitors are for the chosen keywords; if the competition is too fierce, you may need to re-evaluate the keywords. When analyzing the keyword competition for a particular search term, you are assessing the sites already listed in the top results of the search engine. To determine the level of competition within the results, look at them and ask yourself the following questions: How many links are there? How many of the links are quality links? Is the content on the page relevant to the search term? How old are the content and the site?
The tools will use data from the search engines and provide an analysis of the keyword competition. A simple report then rates the competition, taking into account the usual factors that determine its level (number of indexed pages, number of links, quality of the links, and so on). Some tools take it a step further and recommend whether it is worthwhile competing for the keyword at all, usually based on a probability rating. For example, say the keyword is "blue widgets": the tool might conclude that "blue widgets" is a low-probability target and suggest a more specific term such as "cheap blue widgets". Make sure to take note of any recommendations.
2.3. Identifying long-tail keywords
Various free tools are available for finding long-tail keywords, but you might start with the Google AdWords Keyword Tool. Enter a potential keyword and the tool provides a list of related keywords, along with an assessment of the competition and the probable search volume. The information about competition tends to be unreliable, but the search volume is useful. To get the best information, you will need to run the tool with several different queries – for maximum search volume, for lower search volume, and for high and low competition. Make a note of search volumes for potential keywords.
Take keywords from the list of alternatives given by AdWords and enter them as queries in Ubersuggest. This tool aggregates Google's autocomplete suggestions for queries containing the keyword, and it usually works very well for long-tail terms. Save all the suggestions that seem useful and feed them back into the AdWords tool for an assessment of search volume. This should yield a sizable list of keywords with associated search volumes. Keep the ones whose volume is reasonable in the context of your site: high volume usually means stiff ranking competition, while very low volume may not be worth the effort at all.
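You can pull the same autocomplete data yourself. The sketch below queries Google's suggest endpoint, the same source Ubersuggest-style tools have historically drawn on; note this endpoint is unofficial and undocumented, so treat it as best-effort and assume it may change or be rate-limited:

```python
import json
import urllib.parse
import urllib.request

def google_suggestions(seed):
    """Fetch autocomplete suggestions for `seed` (unofficial endpoint)."""
    url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
           + urllib.parse.quote(seed))
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8", "replace"))[1]

# Expand a seed keyword letter by letter to surface long-tail variants.
seed = "blue widgets"
long_tail = set()
for letter in "abcdefghijklmnopqrstuvwxyz":
    long_tail.update(google_suggestions(f"{seed} {letter}"))

for phrase in sorted(long_tail):
    print(phrase)
```

Feed whatever this turns up back into the keyword tool for volume figures, exactly as described above.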
3. On-Page Optimization
A meta tag is hidden content that describes your website. Meta tags matter for search engine optimization, so they should be clear, concise, and related to the content of the page. Don't reuse the same meta tags across a large number of pages, and avoid irrelevant keywords. Make sure every keyword in your meta tags also appears in the body text, since search engines score relevancy against meta tags to some degree.
Website speed is a crucial factor for search engine ranking, and it also affects your conversion rate. Ideally, your website should load within 4 seconds. You can check your loading time with services like Pingdom and WebPageTest. A slow website can be down to a cheap hosting service or a heavyweight theme. Images make a site look better, but they can also slow it down, so use smaller files without sacrificing quality: resize images in an image editor, and compress them without visible loss at TinyPNG. You can also lazy-load images, which reduces initial page load time and the number of up-front HTTP requests.
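If you'd rather script the resizing and compression step than run every file through a web service, a few lines of Python with the Pillow imaging library do the same job; the file names, width cap, and quality setting below are illustrative assumptions:

```python
from pathlib import Path
from PIL import Image  # pip install Pillow

def compress_image(src, dest, max_width=1200, quality=80):
    """Scale an image down to `max_width` and re-save it with lossy
    compression - the same idea as 'Save for Web' or TinyPNG."""
    img = Image.open(src)
    if img.width > max_width:
        ratio = max_width / img.width
        img = img.resize((max_width, int(img.height * ratio)))
    img.save(dest, optimize=True, quality=quality)
    print(f"{src}: {Path(src).stat().st_size // 1024} KB -> "
          f"{Path(dest).stat().st_size // 1024} KB")

compress_image("hero.jpg", "hero-optimized.jpg")  # placeholder file names
```

A quality of around 80 is usually a safe trade-off for JPEG photographs; test a few values against your own images.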
For permalink optimization, use SEO-friendly URLs: clean, descriptive URLs really matter to search engines. In WordPress you can change the URL structure under the permalink settings; select the post name option and save. From then on, just make sure each new post has a good title, since a long title will also create a long URL. You can edit the URL slug by hand while you are editing the post.
3.1. Optimizing meta tags
Meta tags provide information about a given web page and are placed between the <head> tags of your HTML document. These hidden tags describe your page to the search engine, and because they can influence your site's placement in search results, getting the strategy right is vital. Meta elements can specify the page description, keywords, author, last-modified date, and other metadata. In practice, the description tag is the one that still matters; the keywords tag now carries little or no weight with the major engines. It is generally accepted that the title tag is the most important tag on the page for search engine positioning. A good title effectively sums up the page content. The title text is used as the clickable link on the search engine results page, so making it highly relevant to key search phrases is vital; failing to do so can reduce clicks from the results page and therefore visitors. A title that is too long may be truncated by the search engine and should be avoided. Although not strictly a meta tag, the title is a key candidate for on-page optimization and is certainly worth covering here.
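A quick way to audit these tags across pages is to pull them out programmatically. This sketch uses only the Python standard library; the URL is a placeholder, and the 50-60 character title guideline in the comment is a common rule of thumb rather than an official limit:

```python
from html.parser import HTMLParser
import urllib.request

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from one page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = urllib.request.urlopen("https://example.com/").read().decode("utf-8", "replace")
audit = MetaAudit()
audit.feed(html)
print(f"title ({len(audit.title)} chars): {audit.title!r}")  # ~50-60 chars avoids truncation
print(f"description: {audit.description!r}")
```

Run it over a list of URLs and you quickly spot pages sharing identical meta tags, the exact problem warned about above.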
3.2. Improving website speed
For any website, speed is one of the most important factors affecting both search engine rankings and the site's effectiveness. A website that takes more than 10 seconds to load will typically lose the user's attention: they will either click "back" and try another search result or simply close the browser. Reducing or optimizing images is the primary method for increasing website speed. The "Save for Web" feature in Adobe Photoshop strips much of the extraneous information from an image file and reduces its size, and you can also remove unnecessary images from your content.
Optimization of CSS is also a crucial part of website speed. Some people use external stylesheets, others embed CSS within the page. Generally, a single consolidated external stylesheet is faster, because the browser can cache it across pages, although finding all the CSS references can take some time. A tool such as vBulletin's automatic CSS optimization can consolidate all CSS references into one file for you. After completing CSS optimization, always validate the CSS to ensure there are no errors. Finally, one common way to improve website speed is database optimization. This is beyond the scope of SEO, but for those who have access, optimizing a site's tables and MySQL queries can substantially improve its speed.
3.3. Creating SEO-friendly URLs
Firstly, it is important to understand what a search engine friendly URL is. A search engine friendly (SEF) URL is usually 3-5 words that describe a page. It is a clean URL, free of parameters or ID numbers, and it is static, so an internet user can easily guess the content of the page from the URL alone.
Dynamic URL: [Link]
SEF URL: [Link]
Well-formed URLs can earn better click-through rates, since the URL is displayed in the search results, and SEF URLs are easier for the end user to read, use, and guess. Static URLs are also far more indexable. Although Google and the other search engines are getting better at handling dynamic URLs, they can still run into session ID issues and character limits, which can mean the content of the page never gets indexed.
Google has publicly stated that it considers the first 3-5 words in a URL to be the most important, with diminishing returns after that. Why not take full advantage of this and build URLs containing the key search terms you are looking to rank for?
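Most CMSs generate these slugs for you, but the transformation is simple enough to sketch. Here is a minimal, hypothetical slugify helper; real implementations also strip stop words and handle accented characters:

```python
import re

def slugify(title, max_words=5):
    """Turn a post title into a short, static, keyword-bearing URL slug,
    keeping roughly the first 3-5 words per the guideline above."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words[:max_words])

print(slugify("Supercharge Your Rankings with Free SEO Tools"))
# -> supercharge-your-rankings-with-free
```

In practice you would drop filler like "with" and "your" so the keywords that remain are the ones you want to rank for.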
4. Content Analysis
The cornerstone of SEO is the content that you create. A high-quality website with good content is what search engines look for, and good content is what drives success at both a monetary and a branding level. Content analysis is crucial to the success of a site because it targets the very thing you are trying to promote. If your content is copied, you will never achieve a high search ranking, as it will be filtered out as duplicate. If your content is not engaging, the bounce rate will be high and your site will be considered untrustworthy or unhelpful. If the content is not optimized around keywords, the search engine will struggle to determine what it is actually about, resulting in poor rankings. And if the structure and formatting are poor, readability suffers, and engagement along with it. Web users tend to scan pages rather than read them in depth; good formatting and structure make it easy for a user to determine, quickly and efficiently, whether the information they seek is on the page. The easier that is, the more likely they are to stay on your site.
4.1. Checking for duplicate content
Siteliner is a free service that checks an entire website for duplicate content, broken links, and other issues; while the free version is missing some features, it still offers more than enough for a smaller site, and it compares each page's content against the rest of your site. Copyscape is another great tool for finding duplicate content, but it checks a single page at a time against the wider web. That makes it great if you're worried about people stealing your content, though less useful for auditing your own site. If you're a WordPress user, there is also a free plugin called "Duplicate Content Cure", a simple plugin that checks for duplicate content and unapproved comments. All of these options make it easy to find duplicate content and fix it before it becomes a big problem.
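The underlying comparison these tools make is not magic. A common approach is "shingling": break each page into overlapping word sequences and measure how much the sets overlap. The sketch below is a toy version of that idea; the file names and the 50% alarm threshold are arbitrary placeholders:

```python
import re

def shingles(text, size=5):
    """Break text into overlapping `size`-word chunks ('shingles')."""
    words = re.findall(r"\w+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(text_a, text_b):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = open("page_a.txt").read()   # plain-text dumps of two pages
page_b = open("page_b.txt").read()
score = similarity(page_a, page_b)
print(f"similarity: {score:.0%}" + ("  <- likely duplicates" if score > 0.5 else ""))
```

Anything scoring high against another page on your own site is a candidate for rewriting or consolidation.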
4.2. Assessing keyword density
Keyword density is an old-school concept in SEO. Some sources claim you should aim for a certain percentage (3-5% is a common figure), while others say to just write naturally and let keyword density take care of itself. I've gone on record as saying that keyword density is not something you should focus on, and I stand by that. The only time I'll check it is when I am optimizing a page for a very specific term and want to make sure I don't overdo the optimization. For example, if my page title was 'cheap widgets' and I wanted to rank highly for that term, chances are I'd also have 'cheap widgets' in the anchor text of backlinks to that page. In that case it would be easy to over-optimize, and I'd want to be sure my on-page density for 'cheap widgets' wasn't far above what would occur naturally for that phrase. Pages that repeat a single phrase far beyond its natural rate are exactly what over-optimization looks like.
4.3. Analyzing readability and engagement
Readability and engagement are important factors that affect not only how users comprehend website content but also how search engines evaluate and rank it. It is not just whether the content reaches visitors, but whether it is compelling enough for them to stay and read more. Search engines use "crawlers", automated robots that follow a website's link structure and make copies of individual webpages; those copies are stored in the search engine's index, and that data feeds the algorithm that analyzes and ranks the site's content when a user searches for particular keywords. There are various ways to analyze website content for readability and engagement. The simplest is to have a third party reassess the content; a more involved option is readability analysis software, and free tools exist for this. Another effective method is to check readability and engagement using Google Analytics in conjunction with its Google Site Search function.
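One classic readability signal is the Flesch Reading Ease score, which is easy to compute yourself. The sketch below implements the standard formula; be aware the syllable counter is a rough heuristic (real readability software does better), and the sample sentence is just for demonstration:

```python
import re

def count_syllables(word):
    # Rough heuristic: count vowel groups, trim a trailing silent 'e'.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text):
    """206.835 - 1.015*(words/sentences) - 84.6*(syllables/words);
    higher scores mean easier reading (60-70 is plain English)."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

sample = "Structure and formatting make content easy to read. Short sentences help."
print(f"Flesch Reading Ease: {flesch_reading_ease(sample):.1f}")
```

Pair a score like this with engagement data (bounce rate, time on page) from Google Analytics to see whether hard-to-read pages are actually losing visitors.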
4.4. Evaluating content structure and formatting
Structure and formatting are essential aspects of great content, as they make information easy to understand and read. The more digestible content is, the more likely it is to be read, understood, and shared, which in turn leads to a better understanding of your subject and more traffic.
To analyze the structure of your page, first think about how the content should be broken down. Usually content can be separated by core topic or subtopic; exactly how you split it is down to personal preference. With clear groupings of information, the page is far more likely to be visually appealing and to make sense to the user.
An important aspect of good content structure is an effective introduction and conclusion, which respectively tell the user what the content will contain and summarize it. If a user lands on your page and the material between the introduction and conclusion doesn't relate to what they were looking for, at least they can quickly identify whether the page is right for them without reading the whole thing.
Conciseness is another issue that affects readability and user satisfaction. Users want to find the information they want as quickly as possible so they can get on with whatever they needed it for. By providing precise, useful information with minimal waffle, you leave visitors with a positive impression of your site, compared with a site that spreads the same information over a large amount of text, which users will perceive as inefficient and quietly stereotype as low quality.
If your content isn't concise, reducing content bloat is also a very effective way to increase keyword density on a page. It's difficult to convey a lot of information in a small amount of text, but if you can condense content while maintaining or even increasing its value, readers and search engines alike will see it in the quality of the page.
5. Backlink Analysis
Backlinks, also known as inbound or incoming links, are links that connect other websites to yours. Search engines treat backlinks as favorable "votes" from other sites: the more votes a site has, the better its organic search rankings. However, too many votes from the wrong parties can have a less than desirable effect on your website's search ranking.
The backlink analysis function in a toolkit of this kind lets you see which sites are linking to your website. Once you know which links are helping you and which are causing damage, you can send a link removal request to the webmaster of the domain carrying the toxic link. Once the bad links are gone, the link building opportunities function helps you replace them with new ones.
The monitoring feature should be used periodically to compare link profile changes over time. This matters because search engine rankings are ever-changing, and you may need to act if yours begin to slip. Such tools often integrate with Google Analytics and Webmaster Tools, pulling in data from those sources for a more comprehensive backlink analysis.
5.1. Monitoring backlink profile
Part of monitoring your backlink profile is seeing all of your links and how they affect your site. When you know every link pointing at your site, it is easier to manage the quality of those links. You can use Google Search Console to gather the inbound links for your site, though this can take a long time and the data is limited compared with some paid tools. Other free methods, such as Yahoo Site Explorer, are also available for identifying backlinks, with similar limitations. Open Site Explorer lets you check the backlinks to a specific page; while the data is limited to one page at a time, this can be useful for digging into individual pages. Google Webmaster Tools likewise shows backlinks to your website along with anchor text details, which is handy for a quick glance at your profile despite the limited data. All of these methods are useful to a degree, but the page and data limitations make it hard to manage and track links effectively, and tracking backlinks in a spreadsheet gets messy fast if you have a lot of them.
A free tool known as SEO Review Tools offers a backlink checker that provides a full list of backlinks to your site. It draws its data from Ahrefs, which also offers a free trial that can be used to gather a large amount of data at no cost. Running the backlink checker on SEO Review Tools and then running an Ahrefs trial lets you collect plenty of data without spending a penny. SEO Review Tools produces a clean list of links that can be downloaded as a .csv file, so the data transfers easily to a spreadsheet. This provides a simple way to gather a large amount of data and keep it well organized for further analysis. For a site with a large backlink profile, the Ahrefs trial can be repeated with a new email to build a full list of links. This method is quite effective, though it does take time to run the checker and compile the data.
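Once the .csv is downloaded, a few lines of Python summarize it faster than a spreadsheet. The column name "source_url" below is an assumption; check the header row of your own export, as every tool names its columns differently:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

domains = Counter()
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        # 'source_url' is a guess at the column name; adjust to your export.
        domains[urlparse(row["source_url"]).netloc] += 1

print("Top referring domains:")
for domain, count in domains.most_common(10):
    print(f"  {domain}: {count} links")
```

Seeing hundreds of links from one unfamiliar domain is often the first hint of the toxic-link problem covered next.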
5.2. Identifying toxic backlinks
Identifying the toxic links pointing at your website is step one to getting rid of them. Some take no effort to pick out: obvious links from spammy websites, often stuffed with unrelated keywords. A small business rarely gains much from links to or from sites like that, so if you have them, removing them is an easy call. Others can be doing more harm than good without you even knowing it. If, for example, you set up a free directory or a job board where any and all companies can post content, you may find some of them have been employing spammy SEO tactics to gain low-quality links.
These can be tough to spot, because the generated content isn't always terrible and the tactic may have come from an employee acting on behalf of their company rather than an outsourced SEO. In such cases, it is best to remove the links and contact the business, offering to replace the anchor text pointing at your page with natural text links.
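For a first pass over a large link list, simple heuristics can shortlist the obviously spammy entries before a human review. Everything in this sketch is illustrative; the spam word list and the example links are assumptions, not a substitute for judgment:

```python
from urllib.parse import urlparse

SPAM_SIGNALS = ("casino", "pills", "payday", "loans")  # illustrative list

def looks_toxic(source_url, anchor_text):
    """Flag links whose source domain or anchor text carries typical
    spam vocabulary. Output is a shortlist for human review only."""
    haystack = (urlparse(source_url).netloc + " " + anchor_text).lower()
    return any(signal in haystack for signal in SPAM_SIGNALS)

links = [
    ("https://respectable-blog.example/post", "useful widget guide"),
    ("https://best-casino-pills.example/dir", "cheap blue widgets casino"),
]
for url, anchor in links:
    if looks_toxic(url, anchor):
        print(f"review for removal: {url} (anchor: {anchor!r})")
```

Only the second link is flagged; the decision to actually request removal still rests with you.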
5.3. Finding opportunities for link building
What if I told you there is no definitive answer to that question? Frustrating, but true: link building can take many forms, and it is always best to go the most natural-looking route. One company's perfect link profile could be a dangerous thing for another, depending on the circumstances. Nonetheless, here are a few methods to point you in the right direction.
If you're the type who plans for the marathon, the Skyscraper Technique (popularized by Brian Dean of Backlinko) is a wise starting point. The idea is to find well-linked content in your niche and create something better, using it as a blueprint: better writing, better infographics, video embeds, anything that adds more value. Then reach out to the people linking to the original piece and sell them on why your content is the better resource they should link to instead.
For those less patient and in need of quick wins, resource page link building means finding resource pages in your niche that list links related to your content and asking to have your link added.
Finally, if your content leans viral or controversial, the Moving Man Method is an approach where you monitor a specific piece of content over time to see whether its URL changes or becomes a dead link. You then create a similar piece and reach out to everyone who linked to the original, pointing out that what they linked is no longer a valid resource and suggesting they link to your content instead. This method isn't for everyone, but in the right circumstances it can be highly effective.
6. Competitor Analysis
Fancy another great way to stay competitive? Take a leaf out of a competitor's book. Conducting competitor-based research is an excellent way to gain insight and minimize the risk of your optimization efforts. Why blaze a new trail when you can follow a well-worn track to success? With a competitor-focused strategy, you can upgrade the old saying "knowledge is power" to "information is money".
What you're really aiming to do here is build a repository of solid information to guide decisions on SEO strategy. Any information or data you cite should rest on specific evidence found on competitors' sites, and that evidence should argue for why a certain action should be taken. Aim to draw conclusions based on probability: "I think this action is the best way forward because the evidence suggests it has been effective for my competitors."
6.1. Analyzing competitor’s keywords
Most of the time, in-depth keyword research takes a lot of time and resources, and sometimes it still isn't accurate enough. Fortunately, we can skip the educated guesswork about which keywords are most profitable in our niche by spying on our competitors.
One thing to realize is that competitors ranking on the first page of Google for our main targeted keyword are most likely doing something better than we are. It is far more efficient to check which keywords they are targeting and try to rank for those too. Organic traffic from long-tail searches of three or more words converts more easily, and that kind of traffic is probably how our competitors are already winning.
The alternative is the manual route: brainstorm the long-tail keywords our potential customers are most likely to use, open the Google Keyword Tool, and enter those educated guesses. We can then sort the results by global monthly search volume, inspect similar keywords with at least 100-200 searches, and note the advertiser competition and CPC. This manual method is a little better than scratching around in the dark, but it still takes a lot of time and leaves plenty of room for error.
6.2. Assessing competitor’s backlink profile
Yahoo Site Explorer is the first tool to use for competitor backlink analysis. It's great because you can export backlink data to a .csv file, import it into a spreadsheet, and compare it with your own site's backlinks. That ability to compare is key to understanding which links the competition has that you do not: if your competitors hold the top ranks and you have fewer backlinks than they do, their backlinks are very likely the reason for their rankings. Data from YSE can also reveal the anchor text being used to link to a competitor's site, which tells you which keywords the competition is targeting and feeds straight back into the keyword analysis from section 6.1. Yahoo Site Explorer is free but does require a Yahoo ID. A word of warning: there have been reports of it being taken down, with many saying it could be closed by the end of 2011. For that reason, it may be wise to use the tool as much as possible while it's still active and gather all the data you can before it disappears.
Backlinks have always been the most important factor in understanding which websites are trusted or popular in the eyes of major search engines. While there are many ways to promote a site and increase its trust, the best way is and has always been through building quality backlinks. A pro SEO needs to know exactly what the competition is doing in terms of its backlinks. Luckily, backlink analysis is made easy with the following tools.
6.3. Identifying content gaps and opportunities
An effective on-site optimization strategy focuses on the gaps and opportunities for improving a site's overall keyword coverage. The aim of identifying content gaps and opportunities is to understand what already exists on a site and how it can be improved, which lets you prioritize the keywords to focus on and develop information around them.
The first stage is a brainstorming session that interprets the keyword data to uncover the information needs behind each keyword and the intent driving it. A content gauge tool is useful here, as it shows what types of content already exist for a particular keyword compared with the competition; it categorizes keywords by type and assesses the data in terms of content needs. This process identifies the depth and type of information that should be created for each keyword.
After understanding the type of content required, the next stage is to pinpoint the areas that need new content or improvement of existing content. This is a simple matter of comparing keyword lists to URLs and flagging every content need for which there is no related content or only a suboptimal URL. Organize this data into a logical list of content needs, cross-referenced against the keyword data to distinguish the priority areas. This can also help shape the site's URL structure and organize content creation around specific themes.
7. Rank Tracking
If you want to keep track of keyword rankings for your site, you can use the Firefox plugin SEO SERP. It's simple and does what it says on the tin; I don't generally use rank tracking plugins, so I can't offer much insight into this one, but it comes highly recommended. An alternative is Rank Checker, which lets you check the rankings of specific keywords on Google, Yahoo! and Bing in a matter of seconds. For those of you on a tight budget (and who isn't?), the good news is that this tool is free. Unfortunately, it won't monitor how your rankings change over time, but you can always revisit it again and again, and the results can be exported to a CSV file if you wish.
One of the most important things about online marketing software is a nice simple user interface – and EasyMacros ranks highly in this area. Using this software, you can check the rankings of your site on Google, Yahoo! and Bing and store this information over time so that you can view the changes in a line graph. The fact that it automatically updates data at the push of a button is a real bonus. The only problem is that it’s currently in the testing phase – but it is free for anyone who wishes to give it a go. And don’t let the fact that it’s been developed in Japan put you off! Finally, if you’re familiar with Microsoft Excel then you may be interested in trying out SEO tools for Excel. This powerful plugin allows you to pull in a whole load of SEO data from various sources, rank tracking being just one of them, to display it in a clear and concise manner all within Excel. This plugin is not free – but there is a free basic version available.
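Whichever checker you use, the habit that pays off is logging each check so changes show up over time. This small sketch only handles the record-keeping side; the keyword, the rank value, and the CSV layout are all placeholders, with the rank itself coming from whatever tool you run:

```python
import csv
import datetime

LOG = "rank_history.csv"

def record_rank(keyword, rank):
    """Append today's rank (taken from whichever checker you use)."""
    with open(LOG, "a", newline="") as f:
        csv.writer(f).writerow([datetime.date.today().isoformat(), keyword, rank])

def latest_change(keyword):
    """Most recent rank movement; positive means the page moved up."""
    rows = [r for r in csv.reader(open(LOG)) if r[1] == keyword]
    if len(rows) < 2:
        return None
    return int(rows[-2][2]) - int(rows[-1][2])

record_rank("cheap blue widgets", 14)  # rank supplied by your rank checker
change = latest_change("cheap blue widgets")
print(f"movement since last check: {change:+d}" if change is not None else "not enough data yet")
```

A log like this also feeds neatly into the Excel-based reporting mentioned above.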
7.1. Monitoring keyword rankings
There are good free tools available for checking your website's historical rankings. First, there is the Google Webmaster Tools Search Top Queries report, which shows data for up to 90 days, and the Google Webmaster Tools search query data by clicks, released in December, which offers a much deeper analysis. That data is useful, but the best free tool by far is the SEO Book Rank Checker, which lets you save a .csv of ranking data and compare it across different time periods. The SEO Book tool retrieves ranking data from a variety of sources and is a great indicator of change in search visibility. A feature to watch for in the future is the Rank Checker Super Monitor; currently in beta and subscription-only, it promises to track keyword rankings daily and show an on-screen visual representation of rank change.
Beyond the tools themselves, it is important to check not just your website's current position but its visibility over time. A website can rank highly for a short period before being penalized, and tracking ranking over time lets you identify which activities have had the most positive or negative impact on your search visibility. For example, a new link building campaign may improve visibility for the targeted keywords, suggesting it should continue unchanged. It is equally important to spot activities with no significant impact: an algorithm change may hinder visibility, a high-CPC PPC campaign may produce no change, and it may even turn out that SEO is simply not effective for certain keywords.
7.2. Tracking website visibility over time
A great deal of rank tracking software only displays the current visibility of your website. It can be more useful to follow your website’s progress over time and correlate any increases or decreases with events on your site or key changes to the search engine algorithms. Some rank tracking software will allow you to track your website’s visibility over time, but the easiest way to do this without spending more money is by using Google Analytics. You can use the data from the search engine traffic report to see how many visitors each search engine has referred to your site. This data is up to date and directly correlated with your website visibility from search engines. First, you will need to define the search engines that you want to monitor. This might be all available search engines in your target market, or it might just be one of the big three: Google, Yahoo, MSN.
To do this, use the "add/edit search engines" link just below the graph. Analytics will then show you the percentage increase or decrease in non-paid search traffic for each engine, as well as the overall change in traffic from the selected search engine. This gives a clear way to track whether your SEO efforts are improving your website's visibility, and you can even annotate important changes on the graph to see whether they had any effect.
7.3. Analyzing competitor’s rankings
When it comes to analyzing the strength of your website in organic search, it's often useful to compare it against the performance of your competitors. This helps identify exactly where you are losing out, what can be improved, and where you are performing well; in many cases, tracking your own progress alongside a competitor's is the motivation needed to make improvements.
The first thing to do is identify who your competitors actually are. This may be obvious if your website is already in a competitive niche; if your site is new to the online environment or caters to a unique product or service, it may be less apparent who your top search competitors are. To find out, search for the keywords and phrases you have already optimized for and note the websites with the highest rankings. Those are the companies to focus on competing with.
8. Technical SEO
Start with your sitemaps. Good practice includes referencing your sitemap by appending a "Sitemap:" line to your robots.txt file and keeping non-essential files out of the sitemaps themselves. You can review how your sitemaps are being read in Google Webmaster Tools under Explore > Sitemaps. It also pays to check the server response for every URL you submit: request each URL with the proper protocol and inspect the HTTP response header, which will tell you straight away whether a URL is working or not.
Lastly, redirect issues are very common yet not easy to find. A "page not found" error should return a 404 header; using a header check tool, visit the error page and make sure it really does. Then verify these errors in Google Webmaster Tools under Diagnostics > Crawl errors, where the offending URLs still in the index are listed, and 301 redirect them (where possible) to a relevant page. Always re-submit your sitemap to Google to get the fixed URLs out of the index. This process should begin to heal the damage that broken internal on-page linking has caused.
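The header check described above is easy to script. This sketch uses only the Python standard library and deliberately refuses to follow redirects, so a 301 is reported as a 301 rather than silently resolved; the URLs are placeholders:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # surface 301/302 as errors instead of following them

def status_of(url):
    """Issue a HEAD request and return the raw HTTP status code."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(urllib.request.Request(url, method="HEAD"), timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code

for url in ("https://example.com/", "https://example.com/old-page", "https://example.com/missing"):
    print(url, "->", status_of(url))
```

A deleted page should come back 404, and anything you've redirected should come back 301; any other combination is worth investigating.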
8.1. Checking website crawlability
Almost every SEO problem begins with the search engine's ability to crawl a website, and often ends with a bad user experience. If a search engine can't crawl your site, or has limited access to certain pages, it's difficult for those pages to be included in the index, and inclusion in the index is the price of entry to the search results. A page that is not in the index won't appear in the results for any term.
Crawlability problems can occur because your site is too complex for crawlers to follow all its links, or too isolated, with only a few paths for crawlers to take. In some cases, the search engine cannot access the site at all, typically because of a Disallow rule in a file called robots.txt. That makes everything else you do on your website pointless, since none of it will appear in the search results.
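You can test what robots.txt actually blocks with Python's built-in robots.txt parser; the domain, paths, and user agent below are placeholders:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for path in ("/", "/private/report.html", "/blog/post"):
    allowed = rp.can_fetch("Googlebot", "https://example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```

Running a check like this over your key landing pages quickly reveals whether an overzealous Disallow rule is the reason they are missing from the index.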
8.2. Optimizing XML sitemaps
Once you have devised your XML sitemaps, you need to ensure they are fully optimized. You can check this in Google Webmaster Tools with the content keywords feature, verifying that the main words used on the site match the words used in the sitemaps; if they don't, edit the sitemaps accordingly. Another thing to consider: webmasters often use sitemap pages as a way of pushing PageRank deep into their site. If you have sitemap pages with little or no content that exist purely to pass PageRank, it is better to drop them and remove the barrier by linking your deep pages directly from your normal pages. This gives the deep pages more PageRank and should get them indexed. Even if the old sitemap pages are no longer needed, it is still best to resubmit fresh sitemaps with updated URLs to ensure all your pages get visited.
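Generating a valid XML sitemap takes only a few lines. This sketch uses Python's standard XML library; the URL list is a placeholder, and including a lastmod date is optional but commonly recommended:

```python
import datetime
import xml.etree.ElementTree as ET

# Only the URLs you actually want indexed - skip thin or duplicate pages.
urls = ["https://example.com/", "https://example.com/blue-widgets/"]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "lastmod").text = datetime.date.today().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Reference the resulting file from robots.txt ("Sitemap: https://example.com/sitemap.xml") and resubmit it in Webmaster Tools whenever the URL set changes.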
8.3. Fixing broken links and redirects
Broken links are those hyperlinks that send a user to a “file not found” page. Click on an internal or external link and instead of going to a functioning page, you get a message, “404 Error – Page Not Found”. The best way to deal with 404 error pages is to 301 redirect the page that the link is pointing to. This will ensure the correct exchange of inbound and internal link equity, also known as “link juice”. This is a powerful method to get the most out of the traffic that is sent to your site from other sites or from their previous visits.
The link juice will still be passed, just to the functioning page instead. A 301 redirect is an instruction to browsers and crawlers that the page has moved permanently; it is the most efficient, search engine friendly method of webpage redirection, and it ensures you keep the search engine rankings of the original page.
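Finding the broken links in the first place can also be automated. The sketch below fetches one page, extracts its links with a simple regex (a real crawler would use a proper HTML parser), and reports anything that answers with an error; the starting URL is a placeholder:

```python
import re
import urllib.error
import urllib.parse
import urllib.request

def find_broken_links(page_url):
    """Report links on `page_url` that return an HTTP error such as 404."""
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
    for href in set(re.findall(r'href="([^"#]+)"', html)):
        link = urllib.parse.urljoin(page_url, href)
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, etc.
        try:
            urllib.request.urlopen(urllib.request.Request(link, method="HEAD"), timeout=10)
        except urllib.error.HTTPError as e:
            print(f"{e.code}: {link}  <- candidate for a 301 redirect")
        except urllib.error.URLError:
            print(f"unreachable: {link}")

find_broken_links("https://example.com/")
```

Each URL this flags is a place where link juice is currently leaking; set up a 301 to the most relevant live page, as described above.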