What is the Google RankBrain algorithm update all about, and how does it work? How does this machine-learning artificial intelligence (AI) affect SEO? In my previous article on facts and myths of artificial intelligence, I wrote about Strong AI and Super AI. I said they may take time to arrive, but that it's only a matter of time before someone cracks how to make a machine think like a human. I also said corporations would be glad to fund such projects if they promised better profits. For now, Google has a "brain" that works well, and it is called Google RankBrain. It may not be able to think yet, but who knows what the future holds? What surprised me was a comment from a Google executive saying they can't fully understand what the RankBrain AI is doing.

What is Google RankBrain AI?

AI stands for Artificial Intelligence, and I will use the acronym here to keep things simple. Before we get to the part about Google not being able to understand what its own creation is doing, this section introduces RankBrain AI search to readers who don't know about search engine algorithms.

Search engines like Google depend on hundreds of factors to bring the best possible results for anything you enter in the search box. Earlier, they were dumb and focused just on keywords. But keywords can be ambiguous. For example, a person can search for "explain top of the food chain". This can easily confuse a search engine into assuming that the person is asking about food chains like restaurants, and so serve up a list of top restaurants in the area.

But the person is actually searching for the name of the animal at the top of the food chain: the apex predator. The food chain starts with single-celled organisms, goes on to plants, then herbivores, carnivores, and humans, and ends with a predator at the top.

Google and other search engines store plenty of information on their servers so that they can provide you with the results you want. To rank results, they check many factors. So far, no artificial intelligence was involved. Among the hundreds of factors were things like 'items in bold', 'headings', 'subheadings', and 'repetition of a word or phrase'.

If a person typed irrelevant things into the search box, the results were always garbage. The first principle of machines is that if you feed garbage to a machine, it will give garbage back. You can look up GIGO (garbage in, garbage out) for examples of this principle.

To tackle such situations, Google kept making changes to its search algorithms, and quietly folded RankBrain into them sometime in 2015. It kept this a secret until recently. At an event held in March, Google acknowledged that its engineers do not fully know how the thing works. That admission does send out some worrying signals.

RankBrain is part of Google's Hummingbird search algorithm, and is said to be the third-most important ranking signal – the first probably being the quality of backlinks. It will soon change the way SEO works.

Here is what the Google RankBrain AI search algorithm does, as far as I could grasp from my research. Instead of focusing on each search in isolation, it focuses on the entire search session. Normally, to get proper results and narrow things down, searchers use synonyms and words related to what they are looking for. In the example above, one might follow up with "topmost consumer in the food chain" or "what's the highest level of the food chain called", using more keywords depending on what the person wants to know.

So as the searches progress in the session, from the first query to the nth, Google RankBrain AI will present more and more relevant pages to the searcher. These may include pages that do not even contain the keyword but provide more related information about the same topic.

How does Google RankBrain work?


Here comes the problem: the creators of the RankBrain AI themselves do not fully understand how it works. Since it is limited to search, this is not a scary situation. But imagine creating a similar thing in a domain related to weapons. What are the odds of a machine growing mature enough to take its own stand against its creators? What if we create AI-based robots for the army, mass-produce them, and something goes wrong that turns them against their own generals? It doesn't look good. The chances feel like 50:50 – a considerable amount of risk.

At an event called SMX, Google's Paul Haahr, who goes by the handle @haahr on Twitter, shared many interesting things about the algorithm and acknowledged that the Google engineers who work on RankBrain don't know how it works. Either Haahr was unwilling to share information, or the creators really don't know much about their creation.

If the latter is the case, it should ring some alarm bells. Many scholars have already raised fears about AI and the fast-growing research in the domain, and have petitioned governments to stop funding projects leading to Strong and Super AI.

Google RankBrain AI is just the beginning!

Source : http://www.thewindowsclub.com/google-rankbrain 

Categorized in Search Engine

Even before the first Apple iPhone was released in 2007, marketers were asking the question: “What should my mobile strategy be?” And it’s a question that many are still asking today. One thing is for sure, though: the mobile web is here to stay.

A recent report from BI Intelligence projected that by the year 2020, 3.5 billion smartphones will be shipped worldwide. And users are increasingly shifting to mobile as their primary device for accessing the internet. In fact, Google announced last year that now, "more Google searches take place on mobile devices than on computers in 10 countries, including the US and Japan."

This makes sense, because (as Benedict Evans recently wrote), “It’s actually the PC that has the limited, basic, cut-down version of the Internet…it only has the web.”

Our mobile devices have much more information to draw on than a desktop device:

  • photos
  • geolocation
  • friends
  • physical movement

As well as greater interactivity:

  • with the external world (through technology like beacons)
  • with you when you’re not using it (through notifications)
  • with your personal identity (because a phone is always signed-in and it is almost always an individual device rather than a shared one).

So how do search marketers ensure that this boom in mobile web usage won’t leave them behind? By staying on top of the basics of mobile search.

How Google Deals with Mobile Search

Google has many different crawlers for different use cases and different indexes, such as:

  • Googlebot
  • Googlebot News
  • Googlebot Images
  • Googlebot Video

For mobile search (specifically for smartphones), they use a version of Googlebot which uses a smartphone’s user-agent, so that the crawler can have the same user experience as actual mobile users (such as redirects to mobile versions, etc.).

This is the current Googlebot user-agent for smartphones:

Mozilla/5.0 (iPhone; CPU iPhone OS 8_3 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12F70 Safari/600.1.4 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

For comparison, this is the regular Googlebot user-agent:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
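To make the difference concrete, here is a minimal sketch (not Google's actual detection logic, and the function name is my own) of how a server might tell the two crawlers apart by inspecting the User-Agent string. The substrings checked are taken from the two UA strings above:

```python
# Classify a request as smartphone Googlebot, desktop Googlebot, or
# neither, by simple substring checks on the User-Agent header.
# Real-world UA detection libraries are far more robust than this.

SMARTPHONE_GOOGLEBOT = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 8_3 like Mac OS X) "
    "AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12F70 "
    "Safari/600.1.4 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)
DESKTOP_GOOGLEBOT = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

def classify_googlebot(user_agent):
    """Return 'smartphone', 'desktop', or 'not-googlebot'."""
    if "Googlebot" not in user_agent:
        return "not-googlebot"
    # The smartphone crawler carries a mobile browser signature
    # (an iPhone platform token plus "Mobile") alongside Googlebot/2.1.
    if "iPhone" in user_agent and "Mobile" in user_agent:
        return "smartphone"
    return "desktop"

print(classify_googlebot(SMARTPHONE_GOOGLEBOT))  # smartphone
print(classify_googlebot(DESKTOP_GOOGLEBOT))     # desktop
```

The key point is that the smartphone crawler announces itself both as Googlebot and as a mobile browser, which is why it receives mobile redirects and mobile-optimized HTML just as a human smartphone user would.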

It is worth noting that Google does not treat tablets the same as mobile. Although they consider tablets a separate class of device from both mobile and desktop, their view is that "Unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser."

The mobile-friendly pages discovered by the smartphone Googlebot crawler, as well as mobile page versions discovered by the desktop crawler, are indexed as usual. However, the SERP may include a different set of results for a mobile user than for a desktop user, and mobile-friendly pages are given a slight ranking boost, all else being equal.

Note that Google judges mobile-friendliness on a page-by-page basis rather than across an entire website, so if you have limited resources it is best to start by making your most valuable pages mobile-friendly first and branching out from there.

Google’s Mobile-Friendly Update (“Mobilegeddon”)

In 2015, Google announced a new rankings update for mobile search, which was nicknamed “Mobilegeddon”. On April 21, the rollout began, and the internet started to see the effects.

Some websites were directly impacted, losing up to 35% of mobile rankings within the first month. Others were impacted more indirectly, through the incentive to move towards mobile-friendliness before the update hit.

Google announced a 4.7% increase in the number of mobile-friendly sites in the time between announcing that the update was coming and when it actually rolled out.

Around the same time, Bing also announced they would be rolling out a similar update, although they haven’t provided as much information on the timeline of this. However, generally speaking, a site which is well-optimized for Google mobile search should perform well in Bing.

A new version of Google's mobile-friendly update was recently released (in May 2016), so we can expect to see a further impact from this ranking factor over time.

So how do you make sure your site is correctly designed and optimized for mobile search?

Making Your Website Mobile-Friendly

There are three main approaches to making a website mobile-friendly. These are:

  • Responsive Design: the page – URL, HTML, images, everything – remains the same, but the CSS rearranges the page layout depending on screen width.

TIP: Google has expressed that this is their preferred approach, although they support the other two as well. This is primarily because responsively designed sites don't require any additional technical implementation to optimize them for search.

  • RESS/Adaptive/Dynamic Serving: the URL remains the same, but the server sends a different version of the HTML (and CSS) based on what type of device is requesting the page.
  • Separate Mobile Site: as the name implies, this is when you simply create a second, “mobile-friendly” website for mobile users. Separate mobile sites usually sit on a subdomain (e.g. m.domain.com) or sometimes a subfolder (e.g. www.domain.com/mobile).

When creating a separate mobile site, the best approach is to keep all the same pages and content in the same structure (e.g. www.domain.com/first-page and m.domain.com/first-page). This makes it easy to redirect based on user agent/device, and also to indicate to Google what pages are equivalent on the mobile vs desktop version.

But since it’s a separate set of pages, you could choose to have a completely different site structure, in which case the mobile URLs might be different.


If you’re wondering how best to implement a mobile-friendly design for your site, check out this guide from my company, Distilled, on “Building Your Mobile-Friendly Website”.

How to Figure Out if Your Site is Mobile-Friendly

There are a few tools you can use to check whether your site is mobile-friendly; Google's own Mobile-Friendly Test is the most direct place to start.

SEO for Mobile Search

If you have a responsive site which is optimized for search, you won’t need to do anything different for the mobile crawler. Note that when you are auditing the site, it is worth crawling it using a mobile user agent in addition to your regular crawl. This will allow you to identify any crawl issues which only occur on mobile.

Site performance (around things like speed and page load time) may also impact results for mobile search. An effective responsive site serves appropriately sized assets for the user’s screen size, even if the underlying HTML/CSS is the same. It’s worth checking site speed separately for mobile and desktop (and easy to do with Google’s PageSpeed Insights tool!).

If you have a dynamically served site or a separate mobile site, you’ll also need to add a couple of things to your pages to make sure that Google understands that the two versions are connected.

Optimizing an Adaptive Website for Mobile Search

An adaptive (or dynamically served) site uses a single URL, but serves a different version of the HTML/CSS depending on the type of device requesting the page.

While basic SEO principles remain the same as for a responsive site, you also need to make sure you avoid the appearance of cloaking.

Cloaking is when you show one thing to a search engine and something different to a human user, and Google will devalue sites that do this for SEO gains. (Note that visually hiding or minimizing content for UI purposes, such as including a menu which is collapsed on load, does not count as cloaking as long as the content is accessible to users and crawlers alike.)

In the case of a dynamically served site, you want to signal to Google that you are showing different content based on user agent to provide the appropriate version of the page for the device accessing the page, rather than to trick the Googlebot user agent for nefarious SEO purposes.

To make it clear that this is what you're doing, you should use the Vary HTTP header (Vary: User-Agent).

Using this header has two additional benefits:

  • It will let the mobile crawler know there is separate mobile content on this URL, and therefore encourage it to crawl the site.
  • It will signal to caching servers that they should consider the user agent (e.g. the type of device) when deciding whether to serve a page from the cache.
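As a rough sketch of the idea (the handler function and the mobile-detection heuristic below are my own illustrations, not any particular framework's API), a dynamically served page returns different HTML for the same URL and declares that fact with the Vary header:

```python
# Hypothetical dynamic-serving handler: one URL, two HTML variants,
# selected by user agent, with "Vary: User-Agent" so that crawlers and
# caching servers know the response depends on the requesting device.

MOBILE_TOKENS = ("Mobile", "Android", "iPhone")  # deliberately simplified

def render_page(user_agent):
    """Return (headers, body) for a dynamically served page."""
    is_mobile = any(token in user_agent for token in MOBILE_TOKENS)
    body = ("<html>mobile version</html>" if is_mobile
            else "<html>desktop version</html>")
    headers = {
        "Content-Type": "text/html; charset=utf-8",
        # The key line: declare that the HTML varies by user agent.
        "Vary": "User-Agent",
    }
    return headers, body

headers, body = render_page("Mozilla/5.0 (iPhone) Mobile Safari")
print(headers["Vary"])  # User-Agent
```

Because the Vary header is attached to every response, not just the mobile one, intermediaries never serve a cached desktop page to a phone (or vice versa).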

Optimizing a Separate Mobile Website for Mobile Search

A mobile site on a separate URL is effectively a different site, so you’ll need to optimize these pages in addition to optimizing the desktop version. Again, the basic SEO principles remain the same, with a few extra guidelines:

  • Create a parallel URL structure

Unless you’ve built your mobile site with very different content than your desktop site, the URL structure for your mobile site should mirror the relevant pages on your desktop site as closely as possible. So www.example.com/funny-story should become m.example.com/funny-story, not m.example.com/different-page.

  • Add mobile switchboard tags

A separate mobile site with the same or similar content as the desktop version could potentially be seen as a case of duplicate content, which may be suppressed by search engines.

This is where the mobile switchboard tag comes in. This tag indicates to Google crawlers that this is an alternate version of the site intended for mobile devices. A version of this tag is placed on both the desktop and mobile versions of the page.

To set up switchboard tags correctly:

  1. On the desktop version, place a mobile-specific rel="alternate" tag indicating the relevant mobile page.
  2. On the mobile version, place a rel="canonical" tag indicating the relevant desktop page.

These annotations can be included either in the HTML of the pages themselves or in sitemaps (you don't have to do both).

As an example, where the desktop URL is http://www.example.com/page-1 and the equivalent mobile URL is http://m.example.com/page-1, the tagging for this example would be as follows.


On the desktop page (http://www.example.com/page-1), place the following:

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1">

and on the mobile page (http://m.example.com/page-1), include:

<link rel="canonical" href="http://www.example.com/page-1">

The rel=”canonical” tag on the mobile URL indicating the desktop URL is always required.

In Sitemaps:

You can place the rel="alternate" annotation for desktop pages in your sitemap, as follows:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/page-1</loc>
    <xhtml:link
        rel="alternate"
        media="only screen and (max-width: 640px)"
        href="http://m.example.com/page-1" />
  </url>
</urlset>

The required rel="canonical" tag on the mobile URL must still be included in the mobile page's HTML.

  • Consider user-agent redirects

Some visitors will arrive at the wrong version of your site for their device, and you may want to redirect them. Use server-side redirects rather than JavaScript redirects. You may use either 301 or 302 redirects.

  • Additional guidance for handling redirects to a mobile site:
  1. Don’t redirect all desktop pages to the mobile homepage; instead, point them to a mobile page which is relevant to the original.
  2. Include a link to ‘view desktop version’ on your mobile site (and vice versa). Use cookies to ensure that if a user clicks on this option the user agent detection will be overridden, and they will not be redirected again (unless they choose to switch back via the ‘view mobile version’ option).
  3. Send tablet users to the desktop site, rather than the mobile site (unless you have a tablet-specific version). Tablet browsing patterns typically resemble desktop browsing patterns more than mobile.
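The redirect guidance above can be condensed into a small decision function. This is a sketch under my own naming (and using the m.example.com placeholder), not a drop-in implementation; the opt-out flag would typically come from a cookie:

```python
# Redirect rules for a separate mobile site:
#  - mobile users go to the *equivalent* mobile page, never blindly
#    to the mobile homepage;
#  - tablet users stay on the desktop site;
#  - a "view desktop version" cookie overrides detection entirely.

def redirect_target(device, path, prefers_desktop):
    """Return a redirect URL, or None if no redirect should happen."""
    if prefers_desktop:
        return None  # user explicitly opted out via cookie
    if device in ("desktop", "tablet"):
        return None  # tablets get the desktop site by default
    # Parallel URL structure: same path on the mobile subdomain.
    return "http://m.example.com" + path

print(redirect_target("mobile", "/funny-story", False))
# http://m.example.com/funny-story
print(redirect_target("tablet", "/funny-story", False))  # None
```

In production this logic would live in the server or CDN layer (returning a 301 or 302), with the user-agent sniffing handled by a maintained device-detection library rather than hand-rolled checks.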
  • Keep both versions crawlable

Make sure you’re not blocking the Googlebot smartphone user-agent from your desktop version in robots.txt and don’t block regular Googlebot from the mobile version.

Schema.org, Rich Snippets, and Rich Cards

As Google shifts towards more of a card-based format in the SERPs, especially on mobile devices where screen height limits the available real estate, any steps we can take to obtain enhanced results like rich snippets and rich cards (through the use of structured data markup) become increasingly valuable.


If you aren’t sure what type of structured data might be right for your website, you can use my guide for performing a structured data audit.

Other Key Mobile Search Trends from Google

In addition to trends around mobile-friendliness as a ranking factor, Google has been working towards a few other key trends which relate directly to mobile technology and user behavior:

  • The Accelerated Mobile Pages Project to improve site speed and page load times for mobile content, and allow this content to be cached and served directly within a SERP (rather than sending the user to the original website)
  • Removing sidebar ads from the desktop SERP layout for a more streamlined, “mobile” look
  • Integrating app content with web search through support for app indexation and app streaming

We don’t have space to go in-depth on these topics here, but here are a few recommended resources you can check out if you’d like to learn more:


  • SERP Layout
  • App Indexation and Streaming


Mobile technology is changing rapidly, and this has created major shifts in user behavior and mobile web usage worldwide. More than ever, mobile search is becoming the future of SEO, and with that comes a host of new challenges. But the key principles remain the same: ensure that crawlers can access, read, and understand your content, ensure user experience is working well for all devices, and keep testing and iterating for better results.



It’s time for the first meeting with the customer. You may be a seasoned search marketer, but you’re still a little nervous. How do you achieve that perfect balance of getting the information you need while still exuding an aura of consummate professionalism, knowledge, and generally make yourself seem like the search Dalai Lama?

First, realize that the real Dalai Lama feels no need to prove himself; he just *is*. Project an aura of confidence, and internalize the most elusive concept in our industry: it is not about you, it is about the customer.

Sidebar: Even if you’re an in-house marketer, you can still use these techniques. Pretend your VP of Product Development or someone similarly entrenched in the product/service is your “customer”.

Similarly, that first meeting should be all about the customer. This is your best chance to get an outsider’s perspective of how your customer views their products and what language they use to describe them.

After this first meeting, you’ll be an insider, and asking some of these questions will make it seem like you don’t know what you’re doing. So let your customer do most of the talking.

As you listen to the answers, jot down key phrases, jargon, and abbreviations they use to inform your keyword research later. Don’t forget to ask them to clarify anything you don’t understand.

Note that this is by no means an exhaustive list of questions you should be asking; merely a sample of questions for keyword research purposes.

Question 1: I've reviewed your website and learned about your business. However, it always helps to hear you explain it in your own words. So, Mr. Customer, how would you describe what you do?

The answer to this is likely to be the same words you read on their website or see in a brochure. Point out any jargon that you don’t understand, as this will set the stage for later, when you tell them they need to change the way they describe their product.

Question 2: In your opinion, what is it that makes your product/service special? What differentiates you from your competitors?

These are their value propositions; the key elements that need to come across on their pages to compel a conversion. If one of them is that they offer the lowest cost, then you know to research keyword modifiers like [cheap], [low cost], [price]. Alternatively, if they’re not low cost, you know to avoid these keyword modifiers. More on this in my next article.

Question 3: What do you think are similar services/products that you do not consider competitors?

The keywords that come out in this answer will help you refine the research. Often, keywords that are very similar may have a completely different meaning in a particular client's industry.

For example, “phone lines” and “phone trunks” are very different and each appeal to a distinct target market. You’ll only want to explore the right one in your research.

Question 4: Which products/services are most profitable for you? Are there other reasons (inventory, seasonality, location) that you would want to push one product/service over another?

Again, the answer to this question will help focus your research. Spend the most time expanding and refining the products that the client indicates are most important. This can sometimes save you from exploring an entire product line, if the customer says something like, “Product A is a necessary evil. We have to carry it, but we also have to price it below cost.”

Obviously, that’s not an area you want to focus on. You’ll include some keywords to be thorough, but you’ll spend more of your time on the “money” keywords.

Question 5: What do you think are your top ten most important keywords?

Ask for ten keywords. The reason for this is that some customers think they need to rank for their entire keyword universe of thousands upon thousands of keywords.

On the flip side, there are clients that think they only need to rank for one keyword and it will solve all their problems. Chances are that’s a virtually unattainable keyword like “tablet”. This question will help you determine which type your customer is, as well as let you know what keywords absolutely must be included in your final research.

Asking these five questions will complete a formidable amount of your keyword research before you even sit down at your computer. It will also help you focus priorities and set realistic expectations with the very first client meeting.


Categorized in Online Research

As search engine optimization (SEO) professionals, we obsess with search data from a wide variety of resources. Which one is best for our clients? Which keyword research tool reveals the most accurate search behaviors when rebuilding a site’s information architecture? Does our web analytics data validate our keyword research?

And, more importantly, did these tools provide your most desired information? Some answers might surprise you.

Keyword research data

I love keyword research tools. I use all of them because I can discover core keyword phrases, which are commonly used across all of the commercial web search engines. And I can also tailor ads and landing pages to searchers who typically use a single, targeted search engine (and it isn’t always Google, as one might imagine).

However, keyword research tools are not a substitute for a knowledgeable and intuitive search engine marketer. All too often, website owners and even experienced search engine optimization professionals launch into a site’s information architecture without gauging user response. As good SEO professionals, we should understand when it is appropriate to implement keywords into a site’s information architecture: when keyword usage overwhelms users, and when keyword usage needs to be more apparent.

This situation occurred recently when I was performing some usability tests on a client site’s revised information architecture. This particular client website is being delivered in multiple languages. We were testing American English, British English, and French. Therefore, the test participants were American, British, and French.

All of the keyword research tools showed the word “student” or “students” (in French, “étudiant” or “étudiants”) as a possible target. The appearance of this word in both keyword research data and in the site’s web analytics data led my client to believe that we should make this area a main category.

If we had relied on the data from keyword research tools, we would have been wrong. If we had relied on the data from web analytics software, we would have been wrong.

The face-to-face user interaction gave us the right answer.

The facial expressions were enough to convince me. Almost every single time the word “student” or “étudiant” appeared during the usability test, I saw confusion. When I asked test participants why they seemed confused, they said that the particular keyword phrase was not appropriate for that type of website. They then placed the student-related information groupings in one of two piles:

  • Discard – Participants felt that the information label and/or grouping did not belong on the website at all.
  • Do not know – Participants were unsure whether the information label and/or grouping belonged on the website.

The discard pile won, with over 90% from all three language groups.

Now, imagine if this company did NOT have one-on-one interaction with searchers during the redesign process and only relied on keyword research tools. How much time and money might have been wasted?

Keyword research data is not the only type of data that can be easily misinterpreted.

Web analytics search data

One search metric that clients and prospects inevitably mention is “stickiness.” In other words, one of their search marketing goals is to increase the number of page views per visitor via search engine traffic, especially if the site is a publisher, blog, or news site. Increasing the number of page views per visitor provides more advertising opportunities as well as a positive branding impact. The average time on site (if it is longer than two minutes) is also commonly viewed as a positive search metric.

Or so it might seem. Here is an example.

Many SEO professionals, including me, provide blog optimization for a wide variety of companies (ecommerce, news, software, etc.). Not only do we provide keyword research for blogs, we must also monitor the effectiveness of keyword-driven traffic via web analytics data.

Upon initial viewing, the blog’s analytics data might indicate increased stickiness. Searchers are reading more blog entries. Searchers are engaged. Therefore, the blog content is great…that is a common conclusion.

For an exploratory usability test, I ask test participants to tell me about a blog post that they found very helpful. I ask them why they liked the blog's content, and I listen very closely for keyword phrases. Audio and/or video recording makes this job a little easier.

When I ask test participants to refind desired information on a blog on the lab's computer, I do not hear, "This blog content is great!" Comments I frequently hear include:

  • “I can’t find this [expletive] thing.”
  • “Now where could it be? I saw it here before….”
  • “I think this was posted in [month/day/year]….”
  • “Where the [expletive] is it?”

As you might imagine, the use of expletives became more and more frequent with the increased number of page views.

Sure, searchers who discover great blog content might bookmark the URL, or they might link to it from a “Links and Resources” section of their web site, or they might cite the URL in a follow-up post on another website. All of these actions and associated behaviors make it easier for searchers to refind important information.

However, when I review web analytics data, I often find that site visitors do not take these actions as frequently as people might think. Instead, with careful clickstream analysis combined with usability testing, I see that the average page view per visitor metric is heavily influenced by frustrated refinding behaviors.


I have always believed that search engine optimization is part art, part science. Certainly, keyword research data and web analytics data are very much part of the “science” part of SEO.

Nevertheless, the “art” part of SEO comes into play when interpreting this data. By listening to users and observing their search behaviors, having that one-on-one interaction, I can hear keywords that are not used in query formulation. I study facial expressions and corresponding mouse movements that are associated with keywords. I see how keywords are formatted in search engine results pages (SERPs) and corresponding landing pages, and how searchers react to that formatting and placement.

I cannot imagine my job as an SEO professional without keyword research tools and web analytics software. In addition, I cannot imagine my job as an SEO professional without one-on-one searcher interaction. What do you think? Have any of you learned something that keyword research tools and/or web analytics data did not reveal?



A few weeks ago, I wrote about how to use your first meeting with a client to understand their business and collect information that could later inform your keyword research. Now, you’re back at your desk and wondering what to do with all that information.

To begin with, you should have three lists of keyword-types (I call them seeds):

Types of Keyword Seeds or Categories
  1. Seeds most important to your clients (note that these may include jargon and industry-specific terms that need further research)
  2. Seeds that accurately describe the business (these would be your own layman’s terms for what this client does)
  3. Seeds that are not relevant or core to your client’s business

I like to refer to these as seeds because they are a seed of an idea that could grow into giant “trees” of information and possibilities.

There’s no need at this point to distinguish between “deck” and “decking” for example, and this is a mistake SEOs often make; trying to narrow the field too much too early.

Let’s dive into each of these a little more deeply using an example of a client I did work for: Artisan Contruction Services.

Note that all of these lists have far more than 2-3 keywords on them, but for purposes of example, I’ve simplified them. This client is a local (to Raleigh, NC) remodeling company that specializes in building decks and screened porches and remodeling kitchens and bathrooms. (Those are my own words for List Two).

The owner of the company, when asked to describe the product in his own words, said:

“We provide decking, siding and window replacement, and interior remodeling.”

Seeds most important to the client (based on the above description and the keywords he mentioned) are decking, siding, windows and interior remodeling. This would be List One above.

Seeds that aren’t relevant (List Three above) are things the client prefers not to do or sub-contracts out, such as roofing (says he can never do it as cheaply as professional roofers), plumbing (he hates it) and highly specialized design work like tile inlays. He’s also not a licensed electrician. So these are keyword seeds to avoid.

Initial Lists of Keyword Seeds

Example of Keyword Seed Lists

List One

List One is based on jargon, and requires further research. The first thing I do with keywords like this is to look at competitors’ websites. I’ve gotten a list of competitors from the client that I’ll research, and I’ll also put these terms into Google or Bing and look at the sites that come up in the results (I’ll localize to Raleigh, NC so that I’m getting the most accurate set of competitors).

Reviewing these sites will give me more seeds to research based on that jargon. In this case, I found specific types of decking, such as composite and pressure-treated, and I found that many competitors also refer to screened porches as sunrooms or patios (which are slightly different, but may cover more potential customers).

One additional thing the client told me is that customers often aren’t sure of what they want until they call him in for an estimate, so I’m keeping this in mind. Also during my research, I found another competitor in search that wasn’t mentioned as a major competitor. I’ll put this on a list of things to ask the client about in our next meeting.

List One Keyword Seeds

Example of List One Expansion based on Competitors research

Next, I’ll look at how customers are actually referring to the different products and services.

I’ll use the Google search bar, the “related searches” area at the bottom of Google’s SERPs, Google Insights to look at trends, and the “Discussions” search option (click “More” under “Search” on the left side of a Google SERP page).



Based on what I found here, I’ve learned that many people are asking what the differences are between screened porches and sunrooms, as well as that they’re sometimes referred to as lanais or three-season porches. I’ll add these seeds to my research.

I also learned that many people are interested in enclosing an existing deck into a screened porch, or “winterizing” a screened porch. More seeds for my research.

To review, I’ve taken the keyword seeds [screened porch], [patio], and [sunroom] and added:

  • enclosing deck
  • winterizing porch
  • lanai
  • three-season porch

These are all things that my client’s customers are looking for that his competitors aren’t servicing. They should be easy wins.

Keyword Seeds List Two

Example of List Two expansion based on Google "Discussions"

List Two

I can research List Two in much the same way I did List One. I’ll add these seeds to the research as well.

List Three

List Three is a little different from the others. I won’t add these as seeds to my research, but I will save them for the elimination and refinement process later.

This is where instinct and experience become particularly useful, as it’s likely that I can take any list of keywords to avoid and expand it on instinct.

For example, based on what I know of this client, he already wants to avoid roofing, plumbing, tile, and electrical. But here are a few more statements I jotted down at our meeting that give me more clues:

  • “I’m not the best priced contractor out there, because I don’t hire any undocumented workers and I pay my taxes. But I am very experienced and my clients are always happy with my work.”

Now I know I need to avoid [cheap], [free], [low-cost], [best priced], and other keywords like that. [Quality], [experience] and [ethical] are possible modifiers that are allowed.

  • “I prefer to work with composite materials rather than pressure-treated lumber for decks. It’s much higher quality and creates a nicer finished product.”

So it’s a good idea to focus on any searches asking for the differences between those materials. Also I’ll probably weight the research more heavily to different types and brands of composite materials.

Another note I’m jotting down from this statement is to suggest the client create a page that discusses the pros and cons of composite vs. pressure-treated materials.

  • “A lot of customers get a quote from a company like SEARS home improvement when they’re thinking about doing a remodeling project. This makes it tough for me because the materials that SEARS uses are limited to less-expensive ones. It helps me a lot if I can get a sense of a client’s budget beforehand; a single project can vary by thousands of dollars depending on the materials used. But of course, nicer materials create a nicer finished project.”

I’m not exactly sure what I could take from this, but there are likely to be a lot of keywords related to home improvement and/or SEARS.

I’ll be careful of those keywords and use something like Google Insights to determine if those trend higher at a certain time of year. I might even put them into a tool like ComScore to see if I can determine if people who search for [home improvement] related terms are in a lower income bracket. Of course, I also know I’ll have to avoid any keywords having to do with the television program of the same name.

Keyword Seeds List Three

Example of expansion of List Three based on notes from the client meeting

This is just the tip of the iceberg for keyword research. The proverbial “rabbit hole” can get very deep sometimes, so it’s important to make good decisions about which keywords to expand and which to keep at surface level.

I’m sure at this point you’re wondering why I haven’t mentioned Google’s Keyword Tool. Researching search frequency can be very useful, especially in determining how far to expand a certain keyword seed. For example, I found almost immediately that [lanai] has very low search frequency, so I didn’t spend a lot of time on it.

Conversely, I found that search volume for [enclosing deck] is actually quite large, especially when viewed through Google Insights in the spring and summer months, localized to North Carolina.

Ultimately, I’ll put all of these keyword seeds into the Google Keyword Tool to find the most highly searched combinations of keywords and an overall estimate of the search frequency of one service (decks) over another (window replacement). This will help me guide the client on what content should be created for the website.
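As a hedged sketch of that last step, here is how aggregating search volume per service might look in Python. The volume numbers and keyword-to-service mapping are invented for illustration, not real Keyword Tool data:

```python
# Invented example volumes for a handful of researched keywords.
volumes = {
    "composite decking": 2400,
    "deck builder raleigh": 880,
    "enclosing a deck": 720,
    "window replacement cost": 1300,
    "vinyl window replacement": 590,
}

# Map each keyword to the service it supports (my own grouping).
service_of = {
    "composite decking": "decks",
    "deck builder raleigh": "decks",
    "enclosing a deck": "decks",
    "window replacement cost": "windows",
    "vinyl window replacement": "windows",
}

# Total volume per service indicates which content to prioritize.
totals = {}
for kw, vol in volumes.items():
    service = service_of[kw]
    totals[service] = totals.get(service, 0) + vol

# Services sorted by total volume, highest first.
priority = sorted(totals, key=totals.get, reverse=True)
```

With these made-up numbers, decks (4,000 total) outranks window replacement (1,890), which would argue for building the deck content first.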

I prefer to do most of the research in the manner discussed above, and then use search frequency to refine, categorize and prioritize it. I have certain tools and formulas that I use to do that. Next time, I’ll give you these tools and explain how to refine what you’ve found and present it to your client.


Categorized in Online Research

Every internet-based business has a pool of keywords they focus on when targeting their website to get conversions. Researching keywords and picking out relevant keyword ideas from different sources is undoubtedly a time-consuming, resource-intensive task. But how can you be sure these keywords actually work for your business?

Building out a Keyword List

Every SEO expert or experienced webmaster knows that creating a good list of keywords that perfectly matches your website’s content is only half the battle. Merging a huge pool of keywords into small groups and spreading them across web pages in a clever way is a challenging task, even for savvy marketers.

After all, there are some questions to think about:

  • How can I group keywords without wasting tons of time?
  • How can I make sure I do it in the most efficient way possible?
  • Is there a chance I can do it all on auto-pilot?

This is where Topvisor comes in. With the Topvisor Keyword Clustering tool, you can get a complete keyword structure for every page of your website in just a few clicks.

How To Segment Keywords Into Relevant Groups Using Topvisor

Rely on Automated Algorithms

When it comes to working with large data sets, it’s always a good idea to rely on proven automated algorithms that leave little room for mistakes. The tricky thing about doing the whole job on your own is that what you think is the best page-by-page keyword structure for your website isn’t necessarily what Google thinks.

The Topvisor Clustering tool is fully automated and based on the TOP-10 of the search engine results page. This means it doesn’t think the way you do; it thinks the way Google does, and it does so in a completely automated way.

Here’s how it works:

The tool will take your keywords, generate and send automated queries to search engines, and then match web pages from the search results for each keyword.

If the search engine returns the same web pages for different keywords and there are enough matches, those keywords will be grouped together (clustered). The keyword with the highest search volume becomes the group name.


The tool will quickly show you a complete page-by-page keyword structure. The whole process won’t take more than 15 minutes, compared to at least one working day if you do it manually.


Pick a Correct Location

Nearly every business targets a particular location, which is why it’s important to enrich your website with keywords relevant to the target location. That’s where automated algorithms come in handy: pick a region, and the Clustering tool will consider only the local SERP.

Create Keyword-Rich Pages

You are free to adjust how crowded keyword groups should be by setting a clustering level: the minimum number of shared results in the TOP-10 of the SERP required to group keywords together. A higher clustering level produces more groups, with fewer keywords in each group.
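The grouping logic described above can be sketched as follows. This is my own minimal reconstruction of SERP-overlap clustering, not Topvisor’s actual implementation, and the keyword and URL data are invented:

```python
# SERP-overlap clustering sketch: keywords whose top-10 results share at
# least `level` URLs are grouped, and each group is named after its
# highest-volume member.

def cluster_keywords(serps, volumes, level=3):
    """serps:   {keyword: set of top-10 result URLs}
    volumes: {keyword: search volume}
    level:   minimum number of shared URLs needed to group two keywords."""
    # Highest-volume keywords are considered first, so they name the groups.
    remaining = sorted(serps, key=lambda k: volumes.get(k, 0), reverse=True)
    groups = []
    while remaining:
        name = remaining.pop(0)
        members = [name]
        for kw in remaining[:]:
            # Merge kw into this group if its SERP overlaps enough.
            if len(serps[name] & serps[kw]) >= level:
                members.append(kw)
                remaining.remove(kw)
        groups.append((name, members))
    return groups

# Invented data: the two deck keywords share 3 of their top URLs.
serps = {
    "deck builder": {"a.com", "b.com", "c.com", "d.com"},
    "deck contractor": {"a.com", "b.com", "c.com", "e.com"},
    "kitchen remodel": {"x.com", "y.com", "z.com"},
}
volumes = {"deck builder": 1000, "kitchen remodel": 800, "deck contractor": 400}
groups = cluster_keywords(serps, volumes, level=3)
```

With `level=3` the two deck keywords merge into a group named “deck builder” (the higher-volume term), while “kitchen remodel” stands alone; raising the level yields more, smaller groups, matching the behavior described above.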


In Summary

So there you have it: keyword grouping is not rocket science after all. Sign up for a free account here and get your top-notch keyword structure today.

Source:  https://www.searchenginejournal.com/segment-keywords-relevant-groups-using-topvisor/163385/

Categorized in Internet Technology

Has your business listing in Google been suspended? Not sure what happened? Columnist Joy Hawkins discusses the likely causes and how to address them.

I see threads over at the Google My Business forum all the time from panicked business owners or SEOs who have logged into Google My Business to see a big red “Suspended” banner at the top of the page. The Google My Business guidelines have a very long list of things you shouldn’t do, but some offenses are much more serious than others.

Before I get into which rule violations lead to suspensions, it’s important to know the facts around suspensions.

Google won’t tell you why you got suspended

A Google employee will rarely tell you why your account got suspended.

Business owners often want Google to spell out what rule caused their suspension, but Google isn’t about to help rule-breakers get better at doing it and avoid consequences.

There are two different types of suspensions

The first type of suspension is what I refer to as a soft suspension. This is when you log in to Google My Business and see the “suspended” label and no longer have the ability to manage your listing. However, your listing still shows up on Google and Google Maps/Map Maker.

In this case, the listing has really just become unverified. Since you broke Google’s guidelines in some way, they have removed your ability to manage the listing, but the listing’s ranking is rarely impacted. I once worked with a locksmith who ranked first in a major metro area; even after his account got suspended, his ranking didn’t decline.

To fix this type of suspension, all you need to do is create a new Google account, re-verify the listing and stop breaking the rules.

The second type of suspension is what I call a hard suspension. This is very serious and means your entire listing has been removed from Google, including all the reviews and photos. When you pull up the record in Google Map Maker, it will say “removed.”

In this case, your only solution is to get Google to reinstate it; however, the chances of that are slim because this generally only happens when Google has decided the business listing is not eligible to be on Google Maps.

Following are the top nine reasons that Google suspends local listings:

1. Your website field contains a forwarding URL

I dealt with a case last year where I couldn’t figure out why the listing got suspended. Google was able to publicly confirm that it was because the website URL the business was using in Google My Business was actually a vanity URL that forwarded to a different domain.

As per the guidelines, “Do not provide phone numbers or URLs that redirect or ‘refer’ users to landing pages.” This often results in a soft suspension.

2. You are adding extra keywords to your business name field

As per the guidelines:

Adding unnecessary information to your name (e.g., “Google Inc. – Mountain View Corporate Headquarters” instead of “Google”) by including marketing taglines, store codes, special characters, hours or closed/open status, phone numbers, website URLs, service/product information, location/address or directions, or containment information (e.g., “Chase ATM in Duane Reade”) is not permitted.

This often results in a soft suspension, since the business is still eligible to be on Google Maps but just has a different real name.

3. You are a service-area business that didn’t hide your address

According to Google’s guidelines on service-area businesses, you should only show your address if customers show up at your business address. Whenever I’ve seen this, it was a hard suspension, since the listing was not eligible to show up on Google Maps based on the Map Maker guidelines.

It’s extremely vital for a business owner of a service-area business to verify their listing, since Google My Business allows them, but Map Maker does not. This means any non-verified listing that appears on Google Maps for a service-area business can get removed, and the reviews and photos will disappear along with it.

4. You have multiple verified listings for the same business

According to the guidelines: “Do not create more than one page for each location of your business, either in a single account or multiple accounts.”

Google will often suspend both listings (the real one and the duplicate you created) but will un-verify the legit one (soft suspension) and remove the duplicate (hard suspension).

5. Your business type is sensitive or not allowed on Google Plus

This one is new to me, but Google recently suspended (soft suspension) a gun shop, claiming the business type is not allowed on Google Plus. Since every verified listing is automatically on G+, the only option for them is to have an unverified listing on Google Maps.

According to the Google Plus guidelines, regulated goods are allowed if they set a geographic and age restriction, so the jury is still out on whether Google will reinstate it or not.

6. You created a listing at a virtual office or mailbox

Google states:

If your business rents a temporary, “virtual” office at a different address from your primary business, do not create a page for that location unless it is staffed during your normal business hours.

I often see businesses creating multiple listings at virtual offices because they want to rank in multiple towns and not just the city their office is actually located in. If Google catches them or someone reports it, the listings will get removed (hard suspension).

7. You created a listing for an online business without a physical storefront

The first rule for eligible businesses is that they must make in-person contact with customers. Since online businesses don’t do this, Google specifies that they are supposed to create a G+ brand page instead of a local page, which means they won’t rank in the 3-pack or on Google Maps.

I was once helping out a store in Ottawa, via the Google My Business forum, that creates custom gift baskets you can order online. When I escalated something to Google to fix, they unexpectedly removed her listing completely (hard suspension) because she ran an online store.

8. You run a service or class that operates in a building that you don’t own

For example, my church has an AA group that meets there weekly. They would not be eligible for a listing on Google Maps. According to the guidelines, “Ineligible businesses include: an ongoing service, class, or meeting at a location that you don’t own or have the authority to represent.”

9. You didn’t do anything wrong, but the industry you are in is cluttered with spam, so the spam filters are tighter

I see this most often with locksmiths. I have run into several legitimate locksmiths who have had their listings suspended (hard suspensions, usually) because the spam filter accidentally took them down.

In this case, I would always suggest posting on the Google My Business forum so a Top Contributor can escalate the case to Google.


Has your listing been suspended for reasons I didn’t mention? Feel free to reach out to me or post on the forum and share your experience.

Source: http://searchengineland.com/top-9-reasons-google-suspends-local-listings-247394


Categorized in Search Engine

Warning: if you are going to argue a point about politics, medicine, animal care, or gun control, then you had better take the time to make your argument legitimate. Spending 10 seconds with Google and copy-pasting Wikipedia links doesn't cut it; the standard for an intelligent argument is much higher.

Legitimate research is called RE-search for a reason: patient repetition and careful filtering are what will win the day.

There are over 86 billion web pages published, and most of those pages are not worth quoting. To successfully sift it all, you must use consistent and reliable filtering methods. You will need patience to see the full breadth of writing on any single topic. And you will need your critical thinking skills to disbelieve anything until it is intelligently validated.

If you are a student, or if you are seeking serious medical, professional, or historical information, definitely heed these 8 suggested steps to researching online:


1.  Decide if the Topic Is 'Hard Research', 'Soft Research', or Both.

'Hard' and 'soft' research have different expectations of data and proof.  You should know the hard or soft nature of your topic to point your search strategy where it will yield the most reliable research results.

A) 'Hard research' describes scientific and objective research, where proven facts, figures, statistics, and measurable evidence are absolutely critical. In hard research, the credibility of every resource must be able to withstand intense scrutiny.

B) 'Soft research' describes topics that are more subjective, cultural, and opinion-based. Soft research sources will be less scrutinized by readers.

C) Combined soft and hard research requires the most work, because a hybrid topic broadens your search requirements. Not only do you need to find hard facts and figures, but you will also need to debate against very strong opinions to make your case. Politics and international economics are the biggest examples of hybrid research.

2.  Choose Which Online Authorities Are Suitable for Your Research Topic.

A) Hard research topics require hard facts and academically-respected evidence. An opinion blog will not cut it; you will need to find publications by scholars, experts, and professionals with credentials. The Invisible Web will often be important for hard research. Accordingly, here are possible content areas for your hard research topic:

  1. Academic journals (e.g. a list of academic search engines).
  2. Government publications (e.g. Google's 'Uncle Sam' search).
  3. Government authorities (e.g. the NHTSA).
  4. Scientific and medical content sanctioned by known authorities (e.g. Scirus.com).
  5. Non-government websites that are NOT influenced by advertising and obvious sponsorship (e.g. Consumer Watch).
  6. Archived news (e.g. the Internet Archive).

B) Soft research topics are often about collating the opinions of respected online writers. Many soft research authorities are not academics, but rather writers who have practical experience in their field. Soft research usually means the following sources:

  1. Blogs, including personal opinion blogs and amateur writer blogs (e.g. ConsumerReports, UK politics).
  2. Forums and discussion sites (e.g. a police discussion forum).
  3. Consumer product review sites (e.g. ZDnet, Epinions).
  4. Commercial sites that are advertising-driven (e.g. About.com).
  5. Tech and computer sites.

3.  Use Different Search Engines and Keywords

Now comes the primary legwork: using different search engines and 3-5 keyword combinations. Patient, constant adjustment of your keywords is key here.

  1. Firstly, start with broad initial research at the Internet Public Library, DuckDuckGo, Clusty/Yippy, Wikipedia, and Mahalo. This will give you a broad sense of what categories and related topics are out there, and possible directions to aim your research.
  2. Secondly, narrow and deepen your Visible Web searching with Google and other major search engines. Once you have experimented with combinations of 3 to 5 different keywords, these search engines will deepen the results pools for your keywords.
  3. Thirdly, go beyond Google for Invisible Web (Deep Web) searching. Because Invisible Web pages are not spidered by Google, you'll need to be patient and use slower, more specific search engines.

4.  Bookmark and Stockpile Possible Good Content.

While this step is simple, this is the second-slowest part of the whole process: this is where we gather all the possible ingredients into organized piles, which we sift through later. Here is the suggested routine for bookmarking pages:

  1. CTRL-Click the interesting search engine result links. This will spawn a new tab page each time you CTRL-Click.
  2. When you have 3 or 4 new tabs, quickly browse them and do an initial assessment of their credibility.
  3. Bookmark any tabs you consider credible on first glance.
  4. Close the tabs.
  5. Repeat with the next batch of links.

This method, after about 45 minutes, will have yielded dozens of bookmarks to sift through.

5.  Filter and Validate the Content.

This is the slowest step of all: vetting and filtering which content is legitimate, and which is drivel. If you are doing hard research, this is also the most important step of all, because your resources MUST withstand close examination later. Here is how to assess each bookmarked page:

  1. Carefully consider the author/source, and the date of publication. Is the author an authority with professional credentials, or someone peddling their wares and trying to sell you a book? Is the page undated, or unusually old? Does the page have its own domain name (e.g. honda.com, gov.co.uk), or is it some deep and obscure page buried at MySpace?
  2. Be suspicious of personal web pages, and of any commercial pages that have a shoddy, amateurish presentation. Spelling errors, grammar errors, poor formatting, cheesy advertising on the side, absurd fonts, too many blinking emoticons... these are all red flags that the author is not a serious resource, and does not care about the quality of their publishing.
  3. Be suspicious of scientific or medical pages that display scientific or medical advertising. For example: if you are researching veterinarian advice, be wary if the veterinarian web page displays blatant advertising for dog medicine or pet food. Advertising can indicate a conflict of interest or a hidden agenda behind the writer's content.
  4. Be suspicious of any ranting, overstating, overly-positive, or overly-negative commentary. If the author insists on ranting and crying foul, or conversely seems to shower excessive praise, that could be a red flag that there is dishonesty and fraudulent motivation behind the writing.
  5. Commercial consumer websites can be good resources, but be skeptical of every comment you read. Just because 7 people rave that Pet Food X is good for their dogs does not necessarily mean it is good for your dog. Similarly, if 5 people out of 600 complain about a particular vendor, that doesn't mean the vendor is necessarily bad. Be patient, be skeptical, and be slow to form an opinion.
  6. Use your intuition if something seems amiss with the web page. Perhaps the author is just a little too positive, or seems a little too closed to other opinions. Maybe the author uses profanity, name-calling, or insults to try to make his point. The formatting of the page might seem childlike and haphazard. Or you get the sense that the author is trying to sell you something. If you get any subconscious sense that there is something not quite right about the web page, then trust your intuition.
  7. Use Google's 'link:' feature to see the 'backlinks' for a page. This technique will list incoming hyperlinks from the major websites that recommend the web page of interest. These backlinks give you an indicator of how much respect the author has earned around the Internet. Simply go to Google and enter 'link:www.(the web page's address)' to see the backlinks listed.

6.  Make a Final Decision on Which Argument You Now Support.

After spending a few hours researching, your initial opinion may have changed. Maybe you are relieved, maybe you are more afraid, maybe you've just learned something and opened your mind that much more. Whichever it is, you will need an informed opinion if you are about to publish a report or thesis for your professor.

If you have a new opinion, you might have to redo your research (or re-sift your existing research bookmarks) in order to collate facts that support your new opinion and thesis statement.

7.  Quote and Cite the Content.

While there is no single universal standard for citing (acknowledging) quotes from the Internet, the Modern Language Association (MLA) and American Psychological Association (APA) formats are two very respected citing methods.

Here is an example MLA citation:

Aristotle. Poetics. Trans. S. H. Butcher. The Internet Classics Archive.
Web Atomic and Massachusetts Institute of Technology,
13 Sept. 2007. Web. 4 Nov. 2008. ‹http://classics.mit.edu/›.

Here is a sample APA citation:

Bernstein, M. (2002). 10 tips on writing the living Web. A
List Apart: For People Who Make Websites, 149.
Retrieved from http://www.alistapart.com/articles/writeliving

More details: the Purdue University OWL guide explains both of these citing methods, and how to cite Internet references generally, in detail.

Remember: DO NOT PLAGIARIZE. You must either directly quote the author, or rewrite and summarize the content (along with appropriate citing). To restate the author's words as your own is plagiarism, and will get you a failing mark on your thesis or paper.

8.  Choose a Research-Friendly Web Browser

Researching is repetitive and slow. You will want a tool that supports many open pages, and easily backtracks through previous pages. A good research-friendly Web browser offers:

  1. Multiple tab pages open simultaneously.
  2. Bookmarks/favorites that are fast and easy to manage.
  3. Page history that is easy to recall.
  4. Fast page loading within your computer's memory limits.

Of the many choices in 2014, the best research browsers are Chrome and Firefox, followed by Opera. IE10 is also a competent browser, but try the first three choices for their speed and memory economy.

9.  Good Luck with Your Internet Researching!

Yes, it's re-searching... the slow and repetitive method of sifting good information from the bad. It should feel slow, because it's about diligence and skeptical, hard questioning. But keep your attitude positive, and enjoy the discovery process. While you will discard 90% of what you read, take pleasure in how funny (and how idiotic) some internet content is, and put your CTRL-Click tabs and your bookmarks/favorites to good use.

Be patient, be skeptical, be curious, and be slow to form an opinion! 


Source : http://netforbeginners.about.com/od/navigatingthenet/tp/How-to-Properly-Research-Online.htm

Categorized in Online Research

