Below are the top 10 tips for SEO beginners:
- Keyword Research: Keyword research is arguably the most important part of SEO. You cannot plan a campaign until you know which phrases you are targeting, and you cannot estimate the costs and returns of SEO without first knowing who you are competing against. Keyword research is the practice of identifying which phrases people type into search engines when looking for information, and it usually includes finding both the search volume and the relative competitiveness of each term.
- Keyword Density: Keyword density refers to how often your keyword phrase appears on a page relative to its total word count. For example, if a web page contains exactly 100 words (including all headlines, captions, alt text, and advertising) and your keyword phrase appears on the page 3 times, your keyword density is 3%. Pushing density much higher (say, above 5%) risks being treated as keyword stuffing and penalized by Google. Remember that you are writing articles for your readers, not for search engines, so avoid over-stuffing keywords.
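The arithmetic above can be sketched as a small Python helper. This is a minimal sketch using a simple word tokenizer; real SEO tools normalize text more carefully:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the page's words that are occurrences of `phrase`,
    matching the article's example: 3 occurrences in 100 words -> 3%."""
    words = re.findall(r"[\w']+", text.lower())
    target = re.findall(r"[\w']+", phrase.lower())
    if not words or not target:
        return 0.0
    n = len(target)
    # Count every position where the phrase appears as a word sequence.
    count = sum(1 for i in range(len(words) - n + 1)
                if words[i:i + n] == target)
    return 100.0 * count / len(words)

# A 100-word page with the keyword appearing 3 times:
page = " ".join(["seo"] * 3 + ["filler"] * 97)
print(keyword_density(page, "seo"))  # -> 3.0
```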
- Optimize Page Titles & Meta Descriptions: Every page on your website should have a unique title tag of fewer than 70 characters. Similarly, the meta description of each post should be between 150 and 160 characters. This matters because search engines read the meta description to understand the content of a page. If you are using the WordPress CMS, the Yoast SEO plugin can help with on-page SEO optimization.
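As a sketch, a page's head section with a unique title under 70 characters and a description in the 150-160 character range might look like this (the title and description text here are illustrative):

```html
<head>
  <!-- Unique per page, under 70 characters -->
  <title>10 SEO Tips for Beginners | Example Blog</title>
  <!-- 150-160 characters; often shown as the snippet under your result -->
  <meta name="description" content="Learn the basics of SEO: keyword research, page titles, sitemaps, robots.txt and more. A practical starter guide for beginners building their first site.">
</head>
```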
- No Duplicate Content: Duplicate content damages brand credibility, creates a negative user experience, and hurts search engine rankings and domain authority. So if you want good traffic from search engines, publish only unique content on your website and say no to duplicate content.
- Create an XML Sitemap: A great way to make your website friendlier to search engine spiders is to add an XML sitemap. By creating an XML sitemap for your blog, you help search engines index it more thoroughly. If you are using WordPress, the Google XML Sitemaps plugin can generate one for you.
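A minimal sitemap following the sitemaps.org protocol has this shape (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```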
- No Broken Links: A broken (or dead) link is a link that directs visitors to a page or file that no longer exists; the gradual decay of links over time is sometimes called link rot. Google Webmaster Tools is a great way to find 404 errors and broken links. Search engines cannot index pages they cannot reach through your links, and your readers will not be able to find that content either.
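The idea can be sketched in Python: collect the links on a page with the standard-library HTML parser, then request each one and flag anything that returns a 404. The checking step is left as a comment because it needs network access:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

print(extract_links('<p><a href="/ok">one</a> <a href="/gone">two</a></p>'))
# -> ['/ok', '/gone']

# To actually check each link (requires network access):
# import urllib.request, urllib.error
# def is_broken(url: str) -> bool:
#     try:
#         with urllib.request.urlopen(url, timeout=10) as resp:
#             return resp.status == 404
#     except urllib.error.HTTPError as err:
#         return err.code == 404
```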
- Increase Backlinks: Backlinks are one of the most important factors affecting a site's ranking in search engines. The more quality backlinks you have, the more credible your blog appears. So if you want higher search engine rankings and more traffic, invest time in building backlinks through article submission, blog commenting, guest posting, directory submission, and social media promotion.
- Image Optimization: Image optimization is one of the most important parts of on-page SEO; a picture is worth a thousand words. If you want traffic from Google Images, optimize your images by using your keywords in the alt attribute. Make the file size as small as possible without losing quality, and give your images descriptions and captions.
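A sketch of an optimized image with a keyword in the alt attribute and a caption (the filename and text are illustrative):

```html
<figure>
  <!-- Descriptive filename, keyword in the alt text, explicit dimensions -->
  <img src="/images/seo-tips-for-beginners.jpg"
       alt="SEO tips for beginners checklist"
       width="640" height="360">
  <figcaption>A checklist of basic SEO tips.</figcaption>
</figure>
```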
- Use Google Webmaster Tools and Google Analytics: Every webmaster should use Google Webmaster Tools and Google Analytics, as they help you improve your site and its performance in the search engines. You can use Google Webmaster Tools to remove bad links, fix 404 errors, submit sitemaps, and create sitelinks. You can also check which other sites link to yours and which keywords your site ranks for.
Google Analytics can be used to find out how many visitors your site gets, which keywords visitors used in the search engines to reach your website, and which pages they visit. You can also see your traffic sources and the average time readers spend on your site.
- Create a Robots.txt: Robots.txt is a file placed in the root directory of a website. It gives search engines instructions about which pages they may crawl and which they may not. For example, if you specify in your robots.txt file that you do not want search engines to access your thank-you page, that page will not show up in search results and web users will not be able to find it. You can check whether your site has a robots.txt file by adding /robots.txt immediately after your domain name. Example: http://www.zybersys.com/robots.txt
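For example, a robots.txt that blocks the thank-you page described above while allowing everything else might look like this (the paths are illustrative):

```txt
# Applies to all crawlers
User-agent: *
Disallow: /thank-you/

# Optionally point crawlers at your sitemap
Sitemap: http://www.example.com/sitemap.xml
```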