Search engines periodically crawl websites to find and index updated content. These “bots” or “spiders” are programs designed to traverse the web looking for new content such as pages, blog posts and new websites. Your job as a website designer and developer is to make it easy for these crawling programs to find your pages and add them to the global index.
Before you even begin developing your website, you need a thorough plan that takes into account the SEO factors affecting how these spiders crawl the site.
Indexable Content
For bots to access your content, it needs to be in HTML text format, because they cannot read content locked inside Java applets, Flash interactions or images. For example, here is what a page looks like live versus how it looks to search engine bots:
Figure 1: A responsive webpage as visitors see it
You can check how well your website is ranking on Google by using keyword tracking tools, which run on a SERP API. The benefit is that you can easily see which keywords need improvement, or whether one of your keywords has been hit by a penalty, making these tools an essential part of every SEO expert’s arsenal.
Figure 2: The HTML text version of the same page as it appears to search engine bots
You can see that search engines cannot “read” images or other visual elements on the page. So, when you design a website, it is important to remember this and add SEO-optimized text for any visual elements so search engines know what your page is about. When a page is primarily image based, the bots crawling it see essentially a blank page with no information. To avoid this, savvy search engine experts add alt descriptions to images so crawlers can tell what each image is about.
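For instance, an image tag with a descriptive alt attribute might look like the snippet below (the file name and text are purely illustrative):

    <!-- The alt text tells crawlers what the image shows -->
    <img src="red-sandals.jpg" alt="Red leather sandals for women">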
Another common mistake websites make is duplicating content. If your website contains the same content on more than one page, your chances of ranking in search engines will fall drastically. In addition, always check that your website’s “meta robots” tag or robots.txt file is not blocking crawlers from accessing pages you want indexed.
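To illustrate what to look for, either of the directives sketched below (shown with placeholder values) would keep crawlers away from content you actually want indexed:

    <!-- A meta robots tag like this tells crawlers not to index the page it appears on -->
    <meta name="robots" content="noindex, nofollow">

    # A robots.txt rule like this blocks all crawlers from an entire directory
    User-agent: *
    Disallow: /store/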
Site Architecture and Navigation
A website’s architecture, or site hierarchy, should be clear and well defined so crawlers know where they are going. When a website has many nested directories, it is difficult for crawlers, and even users, to reach the last directory and find exactly what they are searching for. For example, mywebsite.com/store/products/shoes/women/sandals/red is a complicated URL produced by the number of directories in the path. While you can have as many directories as you like, for SEO purposes it is better to keep the hierarchy as shallow as possible.
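As a purely illustrative comparison (both URLs are made up), a flatter structure carries the same information in far fewer levels:

    Deep:    mywebsite.com/store/products/shoes/women/sandals/red
    Flatter: mywebsite.com/shoes/womens-red-sandals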
For a large website, an organized sitemap will also do wonders in helping both visitors and crawlers find, identify and index your webpages. Add an HTML sitemap for visitors and an XML sitemap for search engines so all your webpages are indexed properly.
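A minimal XML sitemap, for instance, might look something like the sketch below (the URLs and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want search engines to index -->
      <url>
        <loc>https://mywebsite.com/shoes/womens-red-sandals</loc>
        <lastmod>2023-01-15</lastmod>
      </url>
      <url>
        <loc>https://mywebsite.com/birthday-greeting-cards</loc>
        <lastmod>2023-01-10</lastmod>
      </url>
    </urlset>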
A website should have clear indicators for navigating its contents that are readable by search engine bots. For example, if you are designing a greeting card website and decide to use images of “Happy Birthday” greeting cards as links to the page, bots see nothing. Make sure the hyperlinked text mentions the phrase “birthday greeting cards” so that when a bot follows that link, it knows what the linked page is about.
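For example (with made-up URLs and file names), the second link below tells crawlers far more than the first:

    <!-- Image-only link: crawlers get no text describing the target page -->
    <a href="/birthday-cards"><img src="card.jpg"></a>

    <!-- Descriptive anchor text tells crawlers what the linked page is about -->
    <a href="/birthday-cards">Birthday greeting cards</a>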
URL Structure
The structure of page URLs is also important when designing a website. When a crawler visits a page, it wants to be able to see how the rest of your website is connected from the links on that page. For example, suppose your homepage links only to mywebsite.com/page1 and mywebsite.com/page2, while page3 and page4 exist but are not linked from anywhere. The crawler follows the links it finds and stops there, because it has no way of knowing that pages 3 and 4 also exist. These might be highly SEO-optimized pages with excellent content, but search engines will not index them because they are not aware of their existence.
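One simple way to close that gap, sketched below with hypothetical page names, is a navigation block (or any crawlable internal links) that points to every page you want indexed:

    <!-- A site-wide nav gives crawlers a path to pages 3 and 4 as well -->
    <nav>
      <a href="/page1">Page 1</a>
      <a href="/page2">Page 2</a>
      <a href="/page3">Page 3</a>
      <a href="/page4">Page 4</a>
    </nav>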
Another issue with URL structures that we have noticed usually comes from ecommerce stores. Links will be something like mystore.com/product123.asp?id=12345. While search engines can read the URL, they cannot tell from the URL alone which product the page is displaying. Having the product name in the URL would be much more effective.
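As a purely illustrative before-and-after (the store name and product are invented), a descriptive URL might look like this:

    Before: mystore.com/product123.asp?id=12345
    After:  mystore.com/products/red-leather-sandals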
Keeping these SEO factors in mind when designing and developing a website will go a long way in ensuring you produce an effective and efficient website. Do you have any other factors that you think are important? Let us know!