
SEO Tips and Recommendations


Tweets and How Google Ranks Them

In December 2009 Google entered the real-time search market by displaying real-time results in its SERPs (search engine results pages). The idea, naturally, is to offer Google search users live results.

Twitter users create tweets, and these tweets are the real-time messages that Google displays. Which tweets get displayed is decided by a PageRank-style algorithm specific to tweets and Twitter. A big portion of this ranking is based on followers: the more followers a user has, the higher their reputation. Amit Singhal from Google said, "One user following another in social media is analogous to one page linking to another on the Web. Both are a form of recommendation. As high-quality pages link to another page on the Web, the quality of the linked-to page goes up. Likewise, in social media, as established users follow another user, the quality of the followed user goes up as well."

Hashtags are another part of the algorithm. When users post a tweet they can categorize it with hashtags, so if I wrote an SEO-related tweet I could categorize it with #seo. Hashtags are commonly used on Twitter, and I can even run live searches based solely on hashtags.
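
The follower analogy above maps neatly onto the classic PageRank recurrence. Here is a minimal toy sketch in Python of how rank could flow through a follower graph; Google's actual tweet-ranking algorithm is not public, so the graph, damping factor, and iteration count are all illustrative assumptions.

```python
# Toy sketch of the follower-based ranking idea described above: each
# "follow" is treated like a link, and rank flows to the followed user.
# This is only an illustration, not Google's real algorithm. Dangling
# users (who follow no one) simply leak rank in this simplified version.

def follower_rank(follows, damping=0.85, iterations=20):
    """follows maps each user to the set of users they follow."""
    users = set(follows) | {u for fs in follows.values() for u in fs}
    rank = {u: 1.0 / len(users) for u in users}
    for _ in range(iterations):
        new_rank = {u: (1 - damping) / len(users) for u in users}
        for user, followed in follows.items():
            for target in followed:
                # A follow is a recommendation: pass a share of rank along.
                new_rank[target] += damping * rank[user] / len(followed)
        rank = new_rank
    return rank

# "expert" is followed by everyone, so "expert" ends up ranked highest.
graph = {"alice": {"expert"}, "bob": {"expert", "alice"}, "expert": set()}
print(sorted(follower_rank(graph).items(), key=lambda kv: -kv[1]))
```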

Since Twitter and tweets are ever evolving, real-time results don't last the way natural results in the regular Google index do. So if you are looking to get tweets into the top 10 of Google's search results, your energy is better spent applying SEO skills to the natural, or organic, search results.

Posted by Web Design on Tuesday, January 19, 2010 at 16:12

Is Your Website Trusted?

Do the search engines consider your website to be trusted? Does your website have inbound links coming from seed sites? Seed sites are websites that the search engines mark as trustworthy: Apple.com, MSN.com, DMOZ, the Yahoo Directory, the Google Directory, and Wikipedia are examples. If any of these sites were to link to you, that would have a positive effect on your website's TrustRank. If any of them linked to another site that in turn linked to you, your website would still gain TrustRank, and the same holds for deeper chains of links, though the benefit shrinks with each hop away from the seed.
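
To make the seed-site idea concrete, here is a simplified Python sketch of trust flowing outward from hand-picked seeds and decaying at each hop. The decay rate, hop limit, and link graph are assumptions for illustration; the search engines' actual TrustRank formulas are not published.

```python
# Simplified sketch of trust flowing outward from seed sites, decaying
# at each hop. Real TrustRank details are not public; the decay factor
# and hop limit below are illustrative assumptions.

def propagate_trust(links, seeds, decay=0.5, max_hops=3):
    """links maps each site to the sites it links to;
    seeds are hand-picked trusted sites that start with full trust."""
    trust = {site: 1.0 for site in seeds}
    frontier = dict(trust)
    for _ in range(max_hops):
        next_frontier = {}
        for site, score in frontier.items():
            for target in links.get(site, ()):
                passed = score * decay  # trust shrinks with each hop
                if passed > trust.get(target, 0.0):
                    trust[target] = passed
                    next_frontier[target] = passed
        frontier = next_frontier
    return trust

links = {"dmoz.org": ["example.com"], "example.com": ["yoursite.com"]}
print(propagate_trust(links, seeds=["dmoz.org"]))
# {'dmoz.org': 1.0, 'example.com': 0.5, 'yoursite.com': 0.25}
```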

Here are other things that can help or hurt with your TrustRank:

  • Links from spammy websites will hurt you. If you find spammy links, contact the sites and ask for your link to be removed.
  • Good document structure and valuable content have always been a huge plus in making yours a trustworthy, high-quality website.
  • You want inbound links from related websites; each one is a vote for you.
  • The older your domain name, the better. Domain age with a good domain history is a plus for you.
  • Links from social networks are also good; they help establish you as a trustworthy, high-quality website.
Posted by Web Site Design and SEO on Friday, January 15, 2010 at 12:41

Targeting Long Tail Keywords

Long tail keywords are very specific phrases, typically three to five words, that describe exactly what you are discussing. When users search for exactly what they are looking for, they use multiple words to find it. Because these descriptive key-phrases face less competition, they tend to be easier to rank for. If your site is e-commerce or m-commerce based, long tail searches also tend to convert better since they are so specific.

In order to get results to your site from long tail searches, you need to code your pages for those long tail keywords. If you run an e-commerce or m-commerce website, each page should be product specific and use its own long tail key-phrase. Each page's title tag, meta description, H1, H2, and H3 tags, bold and emphasis tags, and body content should be coded around that phrase.
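
As an illustration of placing one key-phrase in all of those on-page elements, here is a hypothetical Python sketch that stamps a long tail key-phrase into the title, meta description, headings, and emphasized body text. The page layout, phrase, and description are invented for the example.

```python
# Hypothetical sketch: stamp one long tail key-phrase into the on-page
# elements named above (title, meta description, H1/H2, bold, emphasis).

def product_page(phrase, description):
    """Render a minimal product page built around one key-phrase."""
    return f"""<!DOCTYPE html>
<html>
<head>
  <title>{phrase}</title>
  <meta name="description" content="{description}">
</head>
<body>
  <h1>{phrase}</h1>
  <h2>Why choose our {phrase}?</h2>
  <p>Our <strong>{phrase}</strong> is {description}
     Order your <em>{phrase}</em> today.</p>
</body>
</html>"""

print(product_page(
    "red leather motorcycle jacket",
    "a hand-stitched red leather motorcycle jacket with armored padding."))
```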

Lastly, make sure to research your long tail keywords to see what types of searches are currently being performed.

Posted by Web Design and SEO on Thursday, January 14, 2010 at 09:09

The Age of a Domain Name and SEO

The age of a domain name plays into the algorithms of most popular search engines. Older domain names had fewer competitors when they launched, so earning high rankings was easier at the time. The theory is that an older domain is more established and trusted, while a new domain hasn't had the time to build that trust.

So if you have a newer domain name, can you compete with the older sites? There are steps you can take to help earn higher rankings. Optimize your web pages for as many variations of your keywords and key-phrases as possible, making sure each page targets its own variation. Stay away from highly competitive single words, since ranking for those is not realistic for a new domain. Last but not least, build link-worthy pages and start building inbound links to different pages of your website. Link-worthy pages might be how-to articles, blog posts, contest and give-away pages, and topic-based articles.
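
One way to cover many key-phrase variations systematically is to combine a core phrase with lists of modifiers. The Python sketch below is a toy example; the modifier lists are invented, not researched keyword data, so substitute the results of your own keyword research.

```python
# Toy sketch of enumerating key-phrase variations to target across pages.
# The prefixes and suffixes here are made-up examples, not researched data.

from itertools import product

def phrase_variations(core, prefixes, suffixes):
    """Yield every prefix + core + suffix combination, skipping blanks."""
    for prefix, suffix in product(prefixes, suffixes):
        yield " ".join(part for part in (prefix, core, suffix) if part)

variations = phrase_variations(
    "web design",
    prefixes=["", "affordable", "custom"],
    suffixes=["", "services", "Los Angeles"])
print(list(variations))
# ['web design', 'web design services', ... 'custom web design Los Angeles']
```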

Posted by Web Design and SEO on Tuesday, January 12, 2010 at 10:00

Optimized for M-Commerce – Current SEO Trends

Currently 58 million American consumers access the internet via their smartphones or mobile devices. M-commerce, while still in its infancy, is quickly gaining popularity as a way to shop. North American m-commerce sales will top 750 million dollars this year.

SEOs now have an additional revenue stream to optimize for. Optimizing for e-commerce or m-commerce is a good revenue stream since it is easy to track sales, measure results, and attain a solid ROI. As an e-commerce or m-commerce retailer, it behooves you to have an SEO strategy in place for those channels; it will increase your sales and improve your bottom line.

One of the biggest design issues with m-commerce is not overwhelming the user with too many choices or products. Since the screens are small, you will see a higher conversion rate if you keep your product mix small. Another popular means of placing m-commerce orders is text messaging, and some of the top m-commerce e-tailers already offer it as an ordering option.

One thing is for sure: keeping up with current trends in this new year may help your bottom line.
Posted by Web Site Design and SEO on Monday, January 11, 2010 at 15:38

A Free Keyword Suggestion Tool

To be successful in SEO you need to select and use the right keywords for both natural search results and PPC. Choosing the wrong keywords and key-phrases will bring irrelevant users and traffic to your website. KeywordIndex is a new free keyword research tool that doesn't share its database with anyone else. Spending the time to select the right keywords and key-phrases for your site will pay great dividends down the road and put money in your pocket.

KeywordIndex is a powerful new research tool that will help you find great keywords and key-phrases for your website. You can access the keyword suggestion tool at http://www.keywordindex.com/. It's free and you can use it for as long as you want.
Posted by Los Angeles Website Design and SEO on Saturday, January 9, 2010 at 11:16

Get High Rankings with Natural Links

It used to be that a website would rank well if it had inbound links from any website. Then the search engine algorithms became more intelligent and learned to track relevancy and linking patterns. Google can spot unnatural links because it knows who the link farms, linking schemes, automated linking systems, and paid-link sites are. Google considers these types of links spammy, and they will hurt your website rankings.

The objective when getting inbound links is to have relevant sites link into your website. You also want internet directories, social bookmarking services, and relevant blogs to link to you. To get these links, contact the relevant websites, point them to the useful information on your site, and ask them to link to you. For directories, you want links from the Google Directory, Yahoo Directory, and DMOZ, plus any directories specific to your industry, so let them know you exist. Make it easy for visitors to save your pages to social bookmarking services by providing a link they can click to bookmark your site. You also want deep links into your website: not only to your homepage but to all of your interior pages as well.
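
For the "make it easy to bookmark" step, a page can expose ready-made submission links. The Python sketch below generates such links; the service name and submission URL pattern are hypothetical, so substitute the real format documented by whichever bookmarking service you target.

```python
# Hedged sketch: generate "bookmark this" links for social bookmarking
# services. The service and URL pattern below are hypothetical examples;
# check each real service's documentation for its actual submit format.

from urllib.parse import urlencode

SERVICES = {
    "ExampleBookmarks": "http://bookmarks.example.com/submit?",  # hypothetical
}

def bookmark_links(page_url, title):
    """Return a submission link per service for the given page."""
    params = urlencode({"url": page_url, "title": title})
    return {name: base + params for name, base in SERVICES.items()}

for name, link in bookmark_links(
        "http://www.yoursite.com/seo-tips", "SEO Tips").items():
    print(f'<a href="{link}">Bookmark on {name}</a>')
```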

Of course, the best way to get inbound links is to have a website worthy of them because its content is highly sought-after information. As a webmaster you want to provide information that others want and seek.

Posted by Web Design and SEO on Friday, January 8, 2010 at 09:21

Can Flash and Silverlight be Search Engine Friendly?

Adobe Flash and Microsoft Silverlight can create some awe-inspiring animated websites. Awe-inspiring won't put money in the bank, though, and if you have a Flash- or Silverlight-based site, the search engines most likely have no idea what your website is about. Currently about 95% of browsers are Flash ready and 25% are Silverlight ready. You can view some plugin statistics here: http://www.statowl.com/plugin_overview.php

There are some new technologies coming out that help search engines read these types of files. Venda Inc. can take a Flash file and convert it to an XML file so it can be indexed by the search engines. Once a file is converted, changes can be made directly to the XML file, and the content becomes searchable in the major indexes.
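
Once Flash content lives in XML, extracting indexable text is straightforward. Below is a hedged Python sketch of what a crawler could do with such a file; the element names are invented for illustration, since Venda's actual export format is not public.

```python
# Hedged sketch: pull indexable text and links out of an XML export of
# a Flash movie. The <movie>/<scene>/<text>/<link> element names here
# are made up; Venda's real file format is not public.

import xml.etree.ElementTree as ET

sample = """<movie>
  <scene name="home">
    <text>Hand-made leather jackets</text>
    <link href="http://www.example.com/jackets">Shop now</link>
  </scene>
</movie>"""

root = ET.fromstring(sample)
words = [el.text for el in root.iter() if el.tag in ("text", "link") and el.text]
links = [el.get("href") for el in root.iter("link")]
print(words)   # ['Hand-made leather jackets', 'Shop now']
print(links)   # ['http://www.example.com/jackets']
```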

Back in mid-2008 Google also announced that it had begun crawling and indexing Flash-based websites. Through a Flash reader created by Adobe, Google's algorithm can read Flash files. While the technology isn't perfect, it is a start: Google has stated that Googlebot can now extract textual content and links, so it can better crawl, index, and rank such websites.

The technology for reading Flash files will improve over time, and hopefully at some point reading a Flash or Silverlight file will be no different from reading a text-based file.

Posted by Website Design and SEO on Thursday, January 7, 2010 at 08:20


