Search engine and social media optimization are the mantra of the day. Or should I say online marketing? The world of business is witnessing change every other day, and the transition is from competition to cut-throat competition. Positioning your site in the right segment and attracting potential customers, ahead of your competitors in the search index, is a tough task. This is my personal blog, and it offers my personal and honest opinion.
Today on the Google Webmaster Central blog, Google confirmed that for the last several months a large team of Googlers has been working on a secret project: a next-generation architecture for Google's web search.
The new infrastructure should improve Google search on indexing speed, accuracy, comprehensiveness and other dimensions. It sits "under the hood" of Google's search engine, which means most users won't notice a difference in search results. But web developers and power searchers might notice a few differences, so Google is collecting feedback from web developers.
A web developer preview of Google's new infrastructure is available at http://www2.sandbox.google.com/ . Feedback comparing the current search results with the new ones can be given through the link at the bottom of the search results page that says "Dissatisfied? Help us to improve."
I am yet to identify significant changes in the rankings, but a few other experts did notice changes such as:
Rankings related to keywords used in social media have been affected
Long-tail keywords will be affected
Keyword suggestions appear a bit faster
Not as many web pages are indexed as with the current infrastructure
Search results are served faster, maybe because the load on Caffeine is still low
SEO rankings and/or traffic may be impacted
PPC search strategy may be impacted
Link structures will no longer be the only factor deciding rankings
Social media, news sites and micro-blogging sites are indexed almost immediately
Accuracy of results is improved
Do comment if you have noticed anything important about this change.
I was searching for a post using Twitter search when it struck me: what algorithm might the search use to give quality results? As of now, Twitter matches keywords in the text of the most recent tweets. Often, the search results are repetitive or duplicated and not very relevant to the keywords. Jayaram, VP of Operations for Twitter (an ex-Googler), hopes to change all that by taking Twitter search beyond this and looking at the pages people are linking to within their tweets. A Twitter crawler will visit the links, index the pages and then correlate that third-party content with the tweet itself. This should make for more relevant Twitter search results.
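A minimal sketch of that correlation idea, purely my own illustration and not Twitter's actual code: score a tweet against its linked page by the term overlap between the two texts (here with simple cosine similarity over word counts; the function names are hypothetical).

```python
import math
import re
from collections import Counter

def term_counts(text):
    """Lowercase word counts, ignoring punctuation."""
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def tweet_page_relevance(tweet_text, page_text):
    """Correlate a tweet with the page it links to (hypothetical scorer)."""
    return cosine_similarity(term_counts(tweet_text), term_counts(page_text))

score = tweet_page_relevance(
    "Great read on search engine indexing speed",
    "How modern search engines improve indexing speed and accuracy",
)
```

A real crawler would of course fetch and parse the linked page first; this sketch assumes the page text is already extracted.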
This also made me think about more ways the algorithm could be designed. I started comparing organic search engine algorithms with Twitter search, and I have come up with these assumptions.
Follower counts: I guess follower counts do not really matter as an important factor in the search algorithm. Having thousands of followers does not prove that you contribute quality tweets, and having very few followers does not mean your tweets are poor.
Synergy of Tweets: I think the synergy between your tweets and those of your followers, and of your followers' followers, should be considered as a factor. (The organic SEO analogue: having links on websites with similar content, and the link-building strategy.)
Groupies: Say most of the employees of a company are on Twitter and follow one another. This forms a closed group of people who know each other and exchange good, informative, quality tweets.
Tweet and URL Synergy: The synergy between the tweet and the web page linked in the tweet should matter. Just as with PPC ads, the text ad and the landing page should say the same thing.
Frequency of Tweets: You already know that the more you tweet, the faster each tweet slides down your followers' timelines and the less visibility it gets. Each tweet also gets diluted among the sheer number of tweets you post.
Age of Tweet: The more recent the tweet, the more important the update. But how the tweet-bot will distinguish a spammy tweet from an original one is a big question.
Tweet Favorites: It will really add value to a tweet if it has been marked as a favorite by other folks. Bookmarking, in effect?
Subscribing via RSS: Yes, your tweets become more important if people subscribe to them through RSS. It shows that people really want to know your updates instantaneously.
Keyword Density: The density of a keyword across all your tweets over a period of time can be considered as a factor. You may know a lot about that keyword and be tweeting informative updates about it. Or should the algorithm consider the density of the keyword within that particular tweet?
Age of Tweet Account: When you created your Twitter account, and when you started tweeting, should not matter much on its own. There can be many accounts that were created long ago but have never tweeted, yet the handle may have followers; tweets from such handles can't be assumed to be quality ones.
Re-tweets: Just to spam and get more clicks, some people will re-tweet the same thing again and again. This has to be taken care of and should not count toward ranking; rather, it should be considered spam. Likewise, linking the same URL from different tweets should be treated as spamming.
IP Address: The algorithm should note the number of handles or IDs logging in from the same IP address. People may have multiple IDs and use them to promote their website. Such IDs need to be penalized.
Geo-targeting: Track the IPs users are tweeting or logging in from, and show their tweets to members searching from the same geographic area. This way the results are geo-targeted.
Demographic Profile: Twitter can have an advantage here over Google. It can target ads or websites according to the demographic profile: age, gender, place, income and so on.
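Pulling the assumptions above together, a toy ranking function might weight a few of these signals and penalize the spammy patterns. Everything here, the field names, the weights, the penalties, is made up for illustration:

```python
def tweet_score(tweet):
    """Toy ranking score combining the speculative factors above.
    All weights and field names are hypothetical."""
    score = 0.0
    # Positive signals: tweet/landing-page synergy, favorites, RSS subscribers.
    score += 3.0 * tweet.get("tweet_url_synergy", 0.0)   # 0.0 to 1.0
    score += 2.0 * tweet.get("favorites", 0) ** 0.5      # dampened count
    score += 1.5 * tweet.get("rss_subscribers", 0) ** 0.5
    # Recency: newer tweets score higher (age measured in hours).
    score += 2.0 / (1.0 + tweet.get("age_hours", 0.0))
    # Penalties: duplicate re-tweets and many handles on one IP look spammy.
    if tweet.get("is_duplicate_retweet"):
        score -= 5.0
    score -= 1.0 * max(0, tweet.get("handles_on_same_ip", 1) - 1)
    return score

fresh = {"tweet_url_synergy": 0.8, "favorites": 4, "age_hours": 1.0}
spam = {"is_duplicate_retweet": True, "handles_on_same_ip": 5, "age_hours": 1.0}
```

With these (arbitrary) weights, the informative fresh tweet scores well above the duplicate re-tweet pushed from five handles on one IP; tuning such weights is exactly the hard part a real search team would face.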
These are just my assumptions and thoughts. I would like you to add your points and opinions as well, to make this post more informative.
But the most important thing is this: Twitter can be an enormous source of traffic to a website if the site is handled well in social media, and in Twitter especially.
Rand, in his blog, has clearly represented Google's algorithm changes over the past few years in a graph. It shows the trends over the years, and it is a great way to communicate the changes: one small graph that says it all. As the saying goes, a picture is worth a thousand words.
According to Rand, Domain Trust/Authority plays a large role in SERPs. My question is: how does Google measure a website's domain authority? I guess an authority domain is generally one that has a large number of pages, is popular with many visitors, has lots of inbound links, and has content that is frequently updated and re-indexed. This means that to achieve domain authority, you need to apply the other SEO techniques mentioned in the graph as well; all the other techniques lead to domain authority.
And I guess no one knows exactly how, or at what stage, a domain or website acquires authority. If an article is published on such an authority website, it will generally start ranking for its keywords relatively fast compared to something similar published on a non-authority domain. So it boils down to this: what matters is where the content or link is published rather than what it is.
There is all sorts of speculation about the signals Google uses to decide that a site is a brand deserving a boost in SERPs. It seems that overnight many big brands came up to the first page on Google without any SEO tweaks to their websites. Last month Aaron Wall of SEOBook posted his opinion about the change in the algorithm. The examples he refers to are American Airlines, Delta and Northwest Airlines, which came to the first page of Google for the search keyword 'airlines tickets', which was not the case earlier. Even Hallmark.com recently started ranking on the first page for the keyword 'gifts'.
If Aaron is correct, then: is Google giving more credit to brands? Is it assuming that only big brands have original, quality content? Do people look only for brands when they search? If I wanted the website of a certain brand, I would simply type the brand name as the keyword in Google and go directly to the website. Instead of 'airlines tickets', I would type 'Delta airlines' or 'American airlines' and go to their website. I would not search for just 'airlines tickets' and then check for Delta.
At India SearchMasters 2009, which happened recently in Bangalore, Adam Lasnik mentioned in his Q&A session that it is always good to use the company's name, and the location where it offers its products and services, in the title tag. I wonder if he was pointing to this algorithm change. If we add the company or brand name to the title tag, are we not wasting the main real estate there? Adam Lasnik says no; Google considers stuffing of keywords in title tags to be spamming. I believe that if the brand is well known, it makes a lot of sense to put it in the page title or meta description tag. It might help to get clicks, and even when it does not, it at least aids awareness. On top of that, if it builds good CTR, it will help in personalized search.
Another piece of sad news, and a setback for small players, is that they have to build their brand image to beat these brand companies' websites in Google's result pages. The fact is that it is just plain hard to define the strength of a local brand using an algorithm or a robot's visit.
A few tips to minimize the effect of this change in the algorithm (these are just my guesses for now; I have never experimented with them, so follow them at your own risk):
Frame your title tag with your company's name, then briefly state the product or service you are offering, and if possible mention the local area as well. The pattern is: Company Name: your product/service description and location. Something like: Northwest Airlines - Airline Tickets, Plane Tickets & Airfare.
Have a short, meaningful description that clearly summarizes the page content, product or service on that page. If the description tag is missing and/or is not relevant to the content on that page, Google will take the description for your website from DMOZ.org (if your website is listed there at all).
As usual, never use the same title and description tags for all the pages of your website.
Focus on long-tail keywords, as this is more advantageous now. eMarketers.com research showed that more and more people are using long-tail queries, and this will increase in the coming months.
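The title pattern from the first tip can be sketched as a simple template. This is only a hypothetical helper of my own, not anything Google prescribes:

```python
def build_title(company, description, location=None):
    """Build a title tag string: 'Company - Description, Location'.
    Hypothetical helper illustrating the brand-first title pattern."""
    title = f"{company} - {description}"
    if location:
        title += f", {location}"
    return title

# Brand-first, then the offering, optionally the local area.
title = build_title("Northwest Airlines",
                    "Airline Tickets, Plane Tickets & Airfare")
local_title = build_title("Acme Bakery", "Fresh Bread & Pastries", "Bangalore")
```

The point of the template is simply to keep the brand at the front of the title while still fitting the offering and location into the remaining real estate.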
This is just my two cents, and it is too early to suggest any solutions for this change. I am only trying to guess at the relationship between what Adam Lasnik said during the session and this algorithm change.
Latest Important Update: Matt Cutts of Google made a video talking specifically about this change. Matt said they made a "change" but he wouldn't call it an "update"; rather, it was a "minor change". In fact, inside Google they call it the "Vince" change. In short, he said it impacts a relatively small number of queries, not the long-tail ones, and it is more about "trust", the "quality" of the page, and the page's "PageRank" and "value" than about brands.
One example he gives (at 2m12s) is that if you type 'eclipse' into Google, the first result is not from Mitsubishi. So he says it is not "brand" focused but more about trust.
Co-founder of Multiples Consulting, specializing in branding and digital marketing. Skilled in managing global investments, mentoring startups, and fostering relationships across diverse cultures. Adept at driving business growth and delivering strategic solutions for clients worldwide.