Most often, we build applications that automate a business process. In those cases, almost all of our effort goes into how effectively the application models the business domain. If your application targets online visitors, however, there are some additional technical considerations beyond the business logic. One of the main sources of visitors for such web applications is search engines like Google, Bing, Yahoo, etc. Hence, the final application should not only handle your business problems effectively but also follow some simple guidelines so that it performs well in search results. This article lists some of the simple recommendations you need to consider if your ASP.Net application is a public website.
1. Add descriptive and unique Page Title for every page
Every page in your website should have a unique and descriptive title that describes what the page offers. You can set the Page Title either declaratively or in the code behind file. Refer below,
<%@ Page Language="C#" AutoEventWireup="true" Title="My Home Page" CodeFile="Default.aspx.cs" Inherits="_Default" %>
In code behind,
Page.Title = "My Home Page";
4. Add Meta Keyword and Description tag for every page
Add Meta keyword and Meta description tag with relevant contents. Search engines will use these tags to understand what the page offers. You can dynamically set the meta tags from codebehind file using the below code,
HtmlHead head = (HtmlHead)Page.Header;
HtmlMeta metaDescription = new HtmlMeta();
HtmlMeta metaKeywords = new HtmlMeta();
metaDescription.Name = "description";
metaDescription.Content = "my personal site";
metaKeywords.Name = "keywords";
metaKeywords.Content = "ASP.Net,C#,SQL";
head.Controls.Add(metaDescription);
head.Controls.Add(metaKeywords);
The above code will add the below Meta tags to the output html.
<meta name="description" content="my personal site" />
<meta name="keywords" content="ASP.Net,C#,SQL" />
In ASP.Net 4.0, Microsoft added 2 new properties on the Page directive (Page object) that let you define the Meta keywords and description both declaratively and dynamically from the codebehind.
<%@ Page Language="C#" AutoEventWireup="true" MetaKeywords="asp.net,C#" MetaDescription="This is an asp.net site that hosts asp.net tutorials" CodeFile="Default.aspx.cs" Inherits="_Default" %>
protected void Page_Load(object sender, EventArgs e)
{
    Page.MetaKeywords = "asp.net,C#";
    Page.MetaDescription = "This is an asp.net site that hosts asp.net tutorials.";
}
The same thing can be achieved in previous versions of the .NET Framework by using a custom BasePage class. Read the below article to know more.
5. Make descriptive urls
Make your website URLs descriptive. URLs full of query string values and numeric ids are not descriptive and give no hint about what the page offers. For example, http://www.example.com/products.aspx?catid=C91E9918-BEC3-4DAA-A54B-0EC7E874245E is not as descriptive as http://www.example.com/Electronics
Apart from other parameters, search engines will also consider the website url when matching your page against searched keywords.
Read the below article in codedigest.com to make search engine friendly url’s in asp.net.
You can also use URL rewriting modules for this.
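As a sketch, ASP.Net 4.0's built-in routing (System.Web.Routing) can map a friendly URL to a physical page without a rewriting module; the route name, pattern and page names below are illustrative:

```csharp
// Global.asax.cs - register the friendly route at application startup.
using System;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    void Application_Start(object sender, EventArgs e)
    {
        // Maps http://www.example.com/Products/Electronics to products.aspx.
        RouteTable.Routes.MapPageRoute(
            "ProductsByCategory",    // route name
            "Products/{category}",   // friendly URL pattern
            "~/products.aspx");      // physical page that serves the request
    }
}
```

In products.aspx.cs you can then read the value with Page.RouteData.Values["category"] instead of a query string id.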
6. Add Alt for images, Title for Anchor
Add ALT text for images and Title for hyperlinks. The ALT text will be displayed when the browser cannot display the image for some reason. Search engines cannot read the image itself, so the ALT text gives them a hint about what the image contains.
<asp:Image ID="imLogo" runat="server" AlternateText="My company Logo" ImageUrl="logo.gif" />
<asp:HyperLink ID="hpHome" runat="server" ToolTip="My Website Home" Text="Home" NavigateUrl="Home.aspx"></asp:HyperLink>
The above ASP.Net markup will produce the below output,
<img id="imLogo" src="logo.gif" alt="My company Logo" style="border-width:0px;" />
<a id="hpHome" title="My Website Home" href="Home.aspx">Home</a>
7. Handle ViewState properly, don’t overload the ViewState
ViewState is an encoded string that ASP.Net populates to maintain the state of the controls across postbacks. This string is saved to a hidden field at the top of every page and is transported with the HTML output. Most of the time, the ViewState string will be long and heavy. Since ViewState has no search value, it is a real hindrance for search engines trying to find the real content in your page. Some search engines may also have restrictions on page size.
Hence, handle the ViewState in your pages efficiently. Turn off ViewState for controls that don't require it.
You can set EnableViewState="false" to turn off viewstate at control level, page level (@Page directive) and config level (<pages> section).
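For reference, the three levels look like below (the control name and page names are illustrative):

```aspx
<%-- Control level --%>
<asp:Label ID="lblTitle" runat="server" EnableViewState="false" />

<%-- Page level, in the @Page directive --%>
<%@ Page Language="C#" EnableViewState="false" CodeFile="Default.aspx.cs" Inherits="_Default" %>

<!-- Config level, in web.config -->
<configuration>
  <system.web>
    <pages enableViewState="false" />
  </system.web>
</configuration>
```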
8. Design your page lighter with less images, less flash and less Silverlight content
Try to design your page with as little media content as possible: images, flash objects, Silverlight objects, ActiveX objects, etc. Search engines can only read HTML content. A page that is built entirely in Flash or Silverlight is not search engine friendly, since the search engine robots cannot find any textual content in it.
9. Do Permanent Redirect with proper return codes to retain the Page Rank
If you have moved a page to a different URL or changed your domain to a new domain, then you should redirect to the new location by returning an http status code of 301 - Permanent Redirect. This is called permanent redirection. It makes sure the existing page rank is carried over to the new page.
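As a sketch (the target URL is illustrative), ASP.Net 4.0 added Response.RedirectPermanent which issues the 301 in one call; in earlier versions you can set the status code and Location header manually:

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // ASP.Net 4.0 and above: issues a 301 and ends the response.
    Response.RedirectPermanent("http://www.example.com/Electronics");

    // Earlier versions: set the 301 status and Location header yourself.
    // Response.StatusCode = 301;
    // Response.Status = "301 Moved Permanently";
    // Response.AddHeader("Location", "http://www.example.com/Electronics");
    // Response.End();
}
```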
Read the below codedigest article that discusses some of the scenario where we can use permanent redirect for search engine optimization.
10. Add rel="nofollow" to external links
Add rel="nofollow" to user contributed links that point outside the site. Sometimes, an external link posted by a user may pose a security threat (it may serve malware that infects your users) or point to a spam generating site. Adding rel="nofollow" protects your site from being penalized by search engines.
Sometime back, when you added rel="nofollow" to an anchor tag, search engines would not share the page rank with that link, which let the remaining links on the page take a larger share of the page rank. The implementation has since changed: the page rank is still divided across all links, but the nofollow links' share is simply dropped rather than redistributed to the other links on the page. Read here to know more about how nofollow affected the page rank previously and now.
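For server controls like HyperLink, which do not expose a rel property directly, you can add the attribute from the code behind (the control name and URL are illustrative):

```csharp
// Renders <a id="hpUserLink" rel="nofollow" href="http://external-site.example/">...</a>
hpUserLink.Attributes.Add("rel", "nofollow");
```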
11. Use Header tags
Use Header tags (H1, H2, H3, H4, H5 and H6) wherever appropriate instead of styling the text in SPAN tags. These Header tags are search engine friendly. You can use these tags to organize your page headings and sub headings efficiently.
For example, you can put your page's top most heading in H1, sub headings in H2, sub-sub headings in H3, etc., so that they represent a proper hierarchy of your page contents.
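For example, an article page might organize its headings like this (the titles are illustrative):

```html
<h1>ASP.Net SEO Tips</h1>
  <h2>Handle ViewState properly</h2>
    <h3>Turning ViewState off at control level</h3>
  <h2>Make descriptive urls</h2>
```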
13. Unique URL for a Page
Search engines like Google will treat a page with the url http://www.example.com/Default.aspx as different from http://example.com/Default.aspx even though they serve the same page of the website. This may lead the search engine to penalize your website for duplicate content. Hence, always have a single unique URL that identifies a page. You can handle this scenario by doing a permanent redirect to one url. Read the below article to handle this scenario in ASP.Net.
You can also use Google Webmaster Tools to configure this restriction.
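A minimal sketch of such a canonical redirect in Global.asax, assuming www.example.com is the preferred host:

```csharp
void Application_BeginRequest(object sender, EventArgs e)
{
    // Redirect the non-www host to the canonical www host with a 301.
    if (Request.Url.Host.Equals("example.com", StringComparison.OrdinalIgnoreCase))
    {
        UriBuilder canonical = new UriBuilder(Request.Url) { Host = "www.example.com" };
        Response.StatusCode = 301;
        Response.Status = "301 Moved Permanently";
        Response.AddHeader("Location", canonical.Uri.AbsoluteUri);
        Response.End();
    }
}
```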
14. Make SEO friendly pagers
Always construct search engine friendly pager links when displaying a list of items in a summary page, for example a product list or article list page. A link is search engine friendly if it is an anchor tag (<A>) whose href attribute holds a url reachable through a GET request.
Read the below article to build search engine friendly pager for GridView control in Asp.Net.
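To illustrate the difference (the control names and page url are illustrative): a LinkButton posts back through JavaScript, which crawlers cannot follow, while a plain HyperLink renders a crawlable anchor:

```aspx
<%-- Not search engine friendly: renders href="javascript:__doPostBack(...)" --%>
<asp:LinkButton ID="lbNext" runat="server" Text="Next" OnClick="lbNext_Click" />

<%-- Search engine friendly: renders a plain <a> with a reachable GET url --%>
<asp:HyperLink ID="hpNext" runat="server" Text="Next" NavigateUrl="~/Articles.aspx?page=2" />
```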
15. Limit the number of links per page
Previously, there was a limit to the number of links (100 links per page) the Google search engine would index on a page. This restriction no longer exists, but it is still advisable to keep the number of links on your pages limited to avoid any adverse effect on your site rank. This prevents link spamming and preserves the page rank.
16. Build SiteMap
Always have a sitemap file that can guide users and search engines to navigate your site pages easily. It is good practice to have 2 sitemaps for a site: an xml sitemap file used by the search engines and an html sitemap page for the website users. Refer here to know more about creating an xml sitemap for search engines. You can submit your xml sitemap or RSS feed to Google Webmaster Tools.
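A minimal xml sitemap following the sitemaps.org protocol looks like below (the url, date and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/Electronics</loc>
    <lastmod>2011-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```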
Things to be aware of
1. Design an efficient navigation system where every content page can be reached in a small number of clicks.
4. Use http://www.google.com/safebrowsing/diagnostic?site=aspalliance.com to detect malicious activity on your site.
5. Avoid duplicate content and thin content in your website. A page with very little, low-quality content that does not serve its purpose is called thin content.
6. Avoid too many ads in a page or a page created just for ads.