Before we learn the tips for designing an SEO-friendly website, it is important to understand one thing.
The websites we create do not look the same to search engines as they do to us.
Content that seems great and useful to us might be useless, or not even comprehensible, to a search engine.
In our previous article, Best SEO Techniques in 2022: SEO Tips for Websites, we discussed spiders and indexing.
When spiders index a page, some content is easy for them to understand and some is not.
1. Content that is Indexable
Although web crawling has come a long way, it still cannot see the web through human eyes and needs our help.
So before we design an SEO-friendly website, we need to keep in mind what the internet (World Wide Web) looks like to search engines.
To create a search-engine-friendly website that stays optimized for the long term, we have to think like the software itself; never rely on tricks or keyword gimmicks, which will only help your SEO for a few months at most.
Keep in mind that even advanced search engine spiders mostly pick up content that is in HTML and XHTML, so you should keep your important website elements in HTML form.
Java applets, Flash files, and images are rarely, if ever, read by spiders.
The point is, you might have designed an amazing website with great content, but if that content is not in HTML, search engines will not pick it up, because spiders are looking for HTML text.
There are things you can do to make the images, videos, and Java applets on your website indexable.
If your website has images, whether in JPG, PNG, or GIF format, and you want spiders to notice and index them, you need to place them in the HTML with alt attributes and descriptive text.
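For instance, an image might be embedded like this (the file name and alt text here are just placeholders):

<img src="fish-food.jpg" alt="A packet of tropical fish food">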
You can also make content found in Flash or Java plug-ins count by adding text on the page that explains what that content is about.
For video or audio content that you want indexed, provide a transcript, which also helps users find that content.
One thing to note: if spiders cannot read your content, search engines certainly will not serve it to visitors.
If content is visible to a search engine, it is indexable, and only then can it appear in search results.
You can use tools like seo-browser.com or Google's cache to inspect how exactly a search engine and its spiders see your website.
Seeing what the search engine notices will help you understand what you are doing wrong, and what others might be doing wrong.
The best way to see your content as search engines do is to access the cached version of the website.
Type the following into the address bar of your browser.
Note: replace "example.com" with the URL of the website.
http://webcache.googleusercontent.com/search?q=cache:http://example.com/
For example, if your website is built mainly with Java, the search engine sees it in a very different way.
When you look at the text copy of the cached version, you may find the website looks quite different to the search engine.
In some cases the search engine cannot see anything apart from the title.
2. Title tags

A title tag is one of the most important things you give to a search engine spider and its algorithm.
It should be a concise and accurate description of the page and the content users wish to find.
Title tags are as important for search engines as they are beneficial for users.
A title tag generates value in three areas: relevancy, browsing, and the results that search engines display.
The first thing to be aware of is the length of your title, because search engines normally only take an interest in roughly the first 50 to 70 characters of your title tag.
Always remember that the title tag is an excerpt or summary of your page: it attracts users and tells search engines how relevant your page is to the keywords found in the queries users type into search engines.
It can be worth going a little longer if you are targeting multiple keywords that you need for better rankings.
Try to place your main keywords at the start of the title tag; this will help with rankings.
It will also raise the chances that users click through to your website when it appears in search results.
In a survey of SEO specialists, title tags were found to be the best place to insert keywords if you want better rankings on search engines.
That is the main reason a creative, descriptive, relevant, and keyword-rich title tag is so important.
The title tag is a key element of your website expressed in a single sentence, and often the only representation of your site that users will see.
Another thing you can do to increase the possibility of better rankings through your title tag is to leverage branding.
Place your brand name at the end of the title tag. Users who already know the brand, or relate to it, will click on it, which results in better click-through rates.
You can use title tags in the following tentative structure:
Primary Keyword – Secondary Keyword | Brand Name (at the end)
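As a rough sketch, with hypothetical keywords and a hypothetical brand name, that structure translates into HTML as:

<title>Fish Food Mumbai - Tropical Aquarium Supplies | ExampleBrand</title>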
It is also worth checking how your title tags appear on other platforms, for example when someone posts your URL on social media sites like Facebook or Twitter, or when your page shows up on different search engines.
Search engines will automatically display in bold the keywords that match those in the query.
So it is a good idea to use your keywords carefully. Do not depend on just one keyword; have two or three keywords that can earn higher rankings for you.
3. Meta tags

Meta tags provide descriptions, instructions, and information about a web page to search engines and a whole host of other clients.
They are part of a page's HTML or XHTML code and sit in the head section.
There are many types of meta tags, with uses ranging from controlling the activity of search engine spiders to telling a search engine that your website's design is mobile-friendly.
Not all search engines understand all meta tags; Google, for instance, ignores the meta tags it does not understand and only acts on the ones it does.
There is no point in stuffing keywords into your metadata and expecting better rankings. Below is a list of some helpful meta tags and what you can do with them.
4. Spider tags

The robots meta tag can be used to influence the activity of search engine spiders on a page-by-page basis.
Index/noindex tells the search engine whether or not to index a certain page.
Since you do not need to tell a search engine to index every page, noindex is the option you will most likely be using.
Follow/nofollow tells an engine whether or not to follow the links on a certain page when it crawls, and thus whether to count that page and its links for ranking.
For example, a robots meta tag combining these directives might look like this (choose the directives that fit your needs):
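<!-- hypothetical example: keep this page out of the index but follow its links -->
<meta name="robots" content="noindex, follow">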
Noarchive tells the engine that you do not want it to save a cached copy of that page.
Nosnippet tells the engine not to display a description alongside the page's URL and title when it appears on the results page.
Noodp/noydir tells the search engine not to pick up a description snippet for the page from the Open Directory Project or the Yahoo! Directory when the page appears in search results.
5. The meta tag description (or meta description)

This is a short description of your web page that lets a search engine know what the page is about; it sometimes appears as the snippet below your page title on a search engine's results page.
Keywords found in the metadata will usually not influence rankings.
As per the W3C guidelines, 160 characters are enough for your meta description; that is the limit, and anything longer is likely to be cut off in the display.
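A meta description of this kind might look like the following (the wording is just a placeholder):

<meta name="description" content="Buy tropical fish food in Mumbai. Same-day delivery on aquarium supplies from ExampleBrand.">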
It is worth finding out which meta tags the different search engines understand and which they ignore, so that you do not waste time optimizing tags that a search engine will simply disregard.
You can find a list of meta tags that Google recognizes online.
Yahoo! and Bing have a similar document for webmasters.
Meta tags that Google understands:
https://support.google.com/webmasters/answer/79812?hl=en
Meta tags that Yahoo! understands:
https://help.yahoo.com/kb/yahoo-web-hosting/create-meta-tags-sln21496.html
Bing meta tags in the SEO segment of the Webmaster Guidelines:
http://www.bing.com/webmaster/help/webmaster-guidelines-30fba23a
It is important for search engine spiders to be able to crawl your website smoothly and reach all the links your site has.
Your website needs a link structure that is search-engine friendly and designed with spiders in mind.
No matter how amazing the content on your pages, they might as well not exist if there are no direct, crawlable links pointing to them, because they will not be reachable by search engine spiders.
It is surprising how many websites with amazing content make the mistake of using navigation that makes it difficult, if not impossible, for parts of the site to appear on search engines.
There are many reasons why certain parts or pages of a website might not be reachable by search engines.
Let us look at some common ones.
First, if the content on a page is accessible only after filling in a form, that content will most likely not be accessible to spiders and thus not reachable by search engines.
Whether the form requires a user to log in with a password, fill in a few details, or answer some questions, spiders generally do not submit forms, so any content sitting behind them becomes invisible.
Second, spiders rarely crawl links implemented in JavaScript and pay little attention to links embedded that way within a page.
Try to use plain HTML links instead wherever possible. The same applies to links placed inside Flash or other plug-ins.
Even though such links sit on the main page, a spider may not follow them through the site's link structure, and they can remain invisible because they are not embedded as HTML links.
Third, the robots meta tag and the robots.txt file are both used to influence and restrict the activity of spiders.
Make sure that only the pages you actually want spiders to ignore carry directives telling them to do so.
In the collective experience of webmasters, unintentional tagging has killed many good pages.
Fourth, another common cause of a broken link structure for spiders is search forms.
You may have many pages of content hidden behind a search form on your website, and all of it will be invisible to a search engine, since spiders do not perform searches while crawling.
Link such content from an indexed page so that it can be found during crawling.
Fifth, avoid placing links in frames or iframes unless you have a sound technical understanding of how search engines index and follow links inside frames.
Although they are technically crawlable, they pose structural and organizational problems.
Sixth, links on pages that contain a great many of them can be ignored or given less weight, as spiders will only crawl and index a certain number of links per page to guard against spam and protect rankings.
Make sure your important links sit on clean, organized pages and are linked in a clear structure that is easy and simple for spiders to follow and index.
6. Nofollow links

Nofollow links can be a little confusing to understand at first.
Let us first look at what normal, followed links are. When someone links to your website or to one of its pages, that page receives link juice.
You can think of it as points awarded to the page being linked to. The more inbound links you get, the more SEO points you earn.
Search engines treat this as a good sign for a website: if many people are linking to your site or page, it must be valuable or have great content, and so that page is given more importance or preference in the results.
The term link juice suggests that, like real juice, it flows.
It is transmitted from site to site. For example, a link to your page from shoutmeloud.com passes far more link juice, or points, to your site than a small blog mentioning your URL with a hyperlink.
shoutmeloud.com, being a wildly popular blogging and SEO website, obviously has more link juice and PageRank to pass on.
This is where nofollow links come in, and why they get so much attention in the SEO world when it comes to rankings.
A nofollow link does not add link juice or points to your page, and does not count in that sense; in terms of SEO points, nofollow links are the losers of the link division.
A normal nofollow link is a tag that looks like this:
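<!-- the URL and anchor text here are placeholders -->
<a href="http://example.com/" rel="nofollow">Example anchor text</a>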

While search engines mostly ignore the attributes you apply to links, the rel="nofollow" attribute is the notable exception.
Nofollow links direct search engines not to follow certain links, although some engines still do.
The attribute also signals that the link should not be treated like a normal, followed link.
Nofollow began as a way of fighting automatically generated spam in blog comments and other forms of link injection, but over time it has become a tool for telling search engines not to transfer any link value and to ignore certain links.
When you put the nofollow attribute on a link, search engines read it differently and treat it differently from normal, followed links.
Having some nofollow links pointing to your website is not necessarily a bad thing; popular, well-ranked websites generally have more inbound nofollow links than less popular, lower-ranked sites.
It is all part of having a large number of links, which is what makes your website look like an authority site.
Google has confirmed that, in most cases, it does not follow nofollow links and does not let them pass PageRank or anchor text.
Nofollow links are ignored in the map of the web that the search engine creates for itself.
They are also meant to carry no weight and are thought of as just HTML text and nothing more.
That said, many webmasters believe that search engines treat nofollow links from high-authority websites as signs of trust and credibility.
While nofollow links pointing to your site might seem like a bad thing for link building, they can still build awareness for your website, since users will still see your site's name and may click on the link.
The only difference is that the nofollow link itself will not improve your rankings.
As a webmaster, nofollow links are mostly useful to you because there will be certain pages that you want a search engine to ignore and not follow.
They are most useful on paid links, comments, forums, and anywhere you can expect spam, as well as on rarely visited pages that might drag down the overall performance of your site.
It is good to have a diverse backlink profile made up of both follow and nofollow links.
Follow links will certainly do more for your SEO, but nofollow links have their uses as a tool for avoiding spam and for keeping unnecessary pages from being indexed by search engine spiders.
7. Keyword usage

Keywords are probably the most important factor in any search.
Keywords are the basic element of any query; without them, indexing and retrieving information from search engines would be impossible.
Imagine for a moment that there were 30 different words in use for a bike.
If you did not have the right keyword to match the words users actually type when searching for bikes, it would be almost impossible to get any search traffic, and you would miss most of the users interested in bikes.
Search engines build their indices around keywords when they crawl and index the web.
Each keyword we type may match thousands or even lakhs of pages, of which only a few are truly relevant, depending on the word.
If we imagine Google, Bing, or Yahoo as a country and websites as its people, then keywords are the different cities with signs at their entrances, and it is the spiders' job to place each person in the most relevant city.
Websites with more related keywords occupy more than one city, and the more popular a keyword, the wider its area and the bigger its audience.
Instead of one huge database containing everything, search engines use smaller keyword-based databases.
It is just like products arranged in a grocery store, where it is easy to find what you need and have a look around.
Imagine if products were placed without any sorting or order: given the size of the store and the number of categories, even buying toothpaste and apples would be difficult.
Once your content is indexable and has been indexed by the search engine, you need to decide which keywords you want your website to be identified with on the internet.
Use more specific keywords for a better chance of appearing higher on the results page when a visitor types them into a search query.
For example, 'cow' is a very common and popular keyword, and for that keyword you will face serious competition.
Even high-ranking authority websites with large user bases struggle to stay on top for common keywords.
The moral of the story: if you target common keywords with a lot of competition, your chances of getting attention are slim.
If you make your keyword more specific, for example 'fish food Mumbai', you move into a keyword search that is more relevant to your content and has less competition.
Here you actually have a chance of reaching the top of the page, or at least appearing high up in the search results.
You will also be more visible to visitors who are looking for exactly the content you are offering.
Always remember: it is better to aim for a higher position on a more specific, narrow keyword than to settle for lower rankings on a more popular one.
The title tag, anchor text, meta description, and the first 200 words of your page content matter most, because search engines pay more attention to them.
So for SEO, use your keywords in these areas as much as is sensible, but be careful not to overuse them.
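As a rough sketch, using the hypothetical keyword 'fish food Mumbai' from the earlier example, that placement in the body of a page might look like this:

<!-- hypothetical page body: the keyword appears in the heading, the opening text, and the anchor text -->
<h1>Fish Food in Mumbai</h1>
<p>Looking for fish food in Mumbai? Our store stocks flakes, pellets, and frozen food for tropical aquariums.</p>
<a href="/fish-food-mumbai/">Browse our fish food range in Mumbai</a>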
There is nothing stranger to read than a website stuffed with keywords unnecessarily and meaninglessly.
Overusing keywords kills the real work and defeats the purpose, which is to win the trust of visitors and search engines.
It makes your website look unprofessional, low-value, and spammy, and it gives both search engines and visitors the impression that you are trying to trick the system just to get attention for your page.
Put yourself in the visitor's place: do you like or trust websites that overuse or endlessly repeat keywords, or do you ignore them as poor-quality content?
You can keep an eye on keyword density with tools like Yoast and many others that report keyword volume across a page.
Always remember: if you choose your keywords properly and honestly, you do not have to stuff them into your content; they will appear naturally in the text at the necessary volume, because they are genuinely relevant to your content and not something you picked purely for ranking or because it was trending on the internet.
Find the most popular keyword that is closest and most relevant to your website content.
Use popular keywords wisely: only use one if it is truly relevant to your content and is used properly, with the visitor's benefit in mind.
8. URL structures

URLs can be used effectively in many ways, both for a website's rankings and for the way users experience and use a site, even though some argue they are no longer as important as they once were.
By looking at a URL's structure, users should be able to guess accurately what content they will find on your website.
1) A URL crammed with text or numbers does not look user-friendly.
It is best to avoid oddly worded URLs with few identifiable or relevant words.
Try to make your URL seem user-friendly.
2) Keep your URLs as short as possible, not just for your main page but for every page branching off it.
Short URLs are easier for visitors to remember, type, or copy and paste into social media pages, groups, blogs, and so on.
A short URL is also more likely to show up in full on the search results page.
3) Work relevant keywords from your content into your URLs; this may help with rankings and can also bring more traffic, trust, and popularity among visitors.
Avoid stuffing in too many keywords, though; that defeats the purpose and can work against you, so do not overdo it.
4) Prefer static URLs over dynamic ones, so that they are readable and somewhat self-explanatory for people.
That is most important.
URLs containing a lot of numbers, hyphens, and technical terms do not go down well with users, can affect your rankings, and can interfere with indexing by spiders.
5) It is best to use hyphens (-) to separate words in your URL rather than underscores (_), plus signs (+), or spaces (%20), since not all web applications interpret those characters correctly, unlike the classic hyphen or dash.
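For instance, with a hypothetical path, example.com/fish-food-mumbai is generally read more reliably than example.com/fish_food_mumbai or example.com/fish%20food%20mumbai.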
A neat, user-friendly URL is not just good for search engine indexing and crawling; it also improves the experience of users on your website.
Avoid using generic words in your URL like ‘page_3.html’, putting in unnecessary parameters or session IDs, and using capitalization where it is not needed.
Also avoid having multiple URLs for the same page, as this will distribute the reputation for the content found on that page amongst the different URLs.
You can fix this by using the good old 301 redirect, or the rel="canonical" link element where a redirect is not possible.
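As a minimal sketch (the URL is a placeholder), the canonical element goes in the head of each duplicate page and points at the preferred URL:

<link rel="canonical" href="https://example.com/fish-food-mumbai/">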
Always remember: clean URL structures tidy up the look and feel of your website and influence the way both users and search engines experience it.
Some URLs are more crawlable than others, and some look more user-friendly and suit users better than others.
