Jay Harris

Sunday, 20 November 2016 12:23

19 Confirmed Google Ranking Factors

For many websites and businesses, Google continues to provide more traffic than any other channel: more organic visitors than paid search, social, or even direct traffic.

With that in mind, it's certainly worth keeping up to date with what works, what doesn't, what Google has said, and how to avoid a dreaded penalty.

This article looks at the ranking factors Google has confirmed, in the hope of helping to increase your business's prominence on the web.

Positive Factors:

No one will be surprised to see links make the list. Link building has been big business for many a year now.

Earlier this year, in a Q&A, Google Search Quality Senior Strategist Andrey Lipattsev confirmed that two of the top three ranking factors were 'content' and 'links':

“I can tell you what they are. It is content. And it’s links pointing to your site.”

So what specifically is it about content and links that helps rank your website in Google's Search Engine Results Pages (SERPs)?

Links

Quality

The quality of your inbound links is a huge factor with Google. A company or website with a small backlink profile can see a huge boost from just one link from an authoritative page on an authoritative website.

Google's public PageRank score stopped being a useful measure some years ago. These days I use Majestic's Trust Flow score, as it gives a much better idea of which sites are genuinely authoritative, which are merely average, and which are low-quality or spammy.

Quantity

It's all well and good getting a link from one authoritative site, but the more quality links you earn, the better your site will rank.

This is not to suggest building links for the sake of numbers – I certainly wouldn't recommend directory links or junk comments just to increase the count. However, if you recently earned a link from a story on an authoritative site, such as a news site, that story may naturally be picked up by other sites, and it also presents an opportunity to get links from smaller news sites, publications and blogs covering the same story.

And just because you got a story in a certain publication once doesn’t mean you can’t or shouldn’t go back to them in the future.

Anchor Text

Anchor text used to be the be-all-and-end-all of ranking a website many years ago, as stated in the paper describing Google's original algorithm:

“First, anchors often provide more accurate descriptions of web pages than the pages themselves.”

Whilst anchor text isn’t as strong a factor as it was a few years ago, it does still play a big role in ranking websites and webpages. One expects this to lessen over time but for now it’s still important to get some anchor text links to your website.

Don't overdo it though – a natural backlink profile is made up mostly of brand-name and URL anchor text links.
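As a rough illustration, here is a minimal Python sketch that classifies a list of backlink anchors into brand, URL and keyword anchors so you can sanity-check that balance; the anchors, brand name and domain below are made-up examples:

```python
from collections import Counter

# Hypothetical data: anchor texts pulled from a backlink report.
anchors = [
    "Acme Widgets", "acmewidgets.com", "https://acmewidgets.com/",
    "buy blue widgets", "Acme Widgets", "click here", "blue widgets UK",
]

BRAND = "acme widgets"      # assumed brand name
DOMAIN = "acmewidgets.com"  # assumed domain

def classify(anchor: str) -> str:
    """Bucket an anchor text as url, brand or keyword/other."""
    a = anchor.lower().strip()
    if DOMAIN in a or a.startswith("http"):
        return "url"
    if BRAND in a:
        return "brand"
    return "keyword/other"

counts = Counter(classify(a) for a in anchors)
total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label}: {n} ({n / total:.0%})")
```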

Here’s an example of a keyword jumping from page 10 to page 2 off the back of one anchor text link in September 2016:

[Screenshot of Authority Labs data showing a keyword jumping from position 99 to 18 and maintaining that ranking]

Internal Links

Internal links also play a big role in your rankings. If you have lots of good links pointing to a specific page, or a number of pages, consider passing this authority on to your product pages if they need a boost in the SERPs. Do make the links user-friendly, though.
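For example, here is a minimal sketch that lists the internal links on one of your popular posts, so you can check whether it already points at the page you want to boost (placeholder URL; assumes the third-party requests and beautifulsoup4 packages are installed):

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Hypothetical popular blog post you might add internal links to.
page_url = "https://www.example.com/blog/popular-post/"
site_host = urlparse(page_url).netloc

html = requests.get(page_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

internal_links = set()
for a in soup.find_all("a", href=True):
    href = urljoin(page_url, a["href"])       # resolve relative URLs
    if urlparse(href).netloc == site_host:    # keep same-site links only
        internal_links.add(href)

for link in sorted(internal_links):
    print(link)
```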

Last month our own website shot up from page 4 to position 7 (page 1) for a keyword after we added a couple of internal links from popular, relevant blog posts, though it has since dropped down to page two:

[Screenshot of Authority Labs ranking data – looks like we need some external links pointing to this keyword!]

Content

Andrey Lipattsev also mentioned content as one of the top three factors when Google comes to ranking a site.

Getting the content right on your page can certainly play a big role in where you rank in Google’s SERPs so here are a few things to bear in mind when it comes to content as a ranking factor:

Title Tag

It's still a very important ranking factor to have your keyword(s) in the title tag – not only because it weighs heavily when Google is determining where to rank your site for specific searches, but also because it helps attract click-throughs. Someone who has searched your target keyword and sees it in your page title, particularly at the beginning, is more likely to visit your website than if it weren't there at all.
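As a quick sanity check, here is a minimal sketch of that idea (the URL and keyword are placeholders, and it assumes the requests and beautifulsoup4 packages are installed):

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical page and target keyword.
url = "https://www.example.com/services/ppc-management/"
keyword = "ppc management"

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
title = soup.title.string.strip() if soup.title and soup.title.string else ""

position = title.lower().find(keyword.lower())
if position == -1:
    print(f"'{keyword}' is missing from the title: {title!r}")
else:
    print(f"'{keyword}' starts at character {position} of the title: {title!r}")
```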

We switched focus on our target keywords last month and here is one of the results we achieved just by changing the page title:

[Screenshot of Authority Labs data showing a keyword jumping from position 54 to 5 after changing the page title]

Heading Tags

Heading tags also have some weight when it comes to ranking your website and it’s important to get your keyword(s) in the H1 where possible on your pages.

It is recommended that you only use one H1 per page though there is no harm in using H2s, H3s, etc. as well.
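A similar hedged sketch can audit headings: it counts the H1s on a page and checks whether the keyword appears in one of them (same assumed packages, placeholder URL and keyword).

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/services/ppc-management/"  # placeholder
keyword = "ppc management"                                 # placeholder

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]

print(f"H1 count: {len(h1s)} (ideally one per page)")
print("Keyword in an H1:", any(keyword.lower() in h.lower() for h in h1s))
```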

Content Length

Since the Google Panda algorithm update back in February 2011, it has become very noticeable how seriously the search engine takes the content on a given page.

Those of you who were working in SEO six or more years ago may remember how you could rank websites and webpages with thin content thanks to their backlinks. That's no longer the case these days.

It's more and more the case now that the top results in Google are in-depth articles. An interesting result I came across for 'fx trading' recently is that most of the top organic results are information pages, not the homepage or target page that FX companies would want first-time users to land on:

[Screenshot: Google's top organic results for 'fx trading']

This is a keyword that Google suggests bidding £38.98 ($65 CAD) in AdWords!

[Screenshot: AdWords suggested bid for 'fx trading']

URLs

Making sure to include your keyword in the URL slug of your page also helps with ranking. If your keyword is already in the page title, as advised, there's no reason not to include it in your page URL too.
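One way to keep keywords in your slugs is simply to generate them from the page title. A minimal sketch, using only the standard library:

```python
import re

def slugify(title: str) -> str:
    """Lower-case the title, turn non-alphanumerics into hyphens, trim the ends."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("19 Confirmed Google Ranking Factors"))
# 19-confirmed-google-ranking-factors
```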

Google continues to bold keywords within your URL that match search queries, which helps your listing stand out:

[Screenshot: a search result with the matching keyword bolded in the URL]

RankBrain

Completing the top three ranking factors in Google's SERPs is RankBrain. A news piece published by Bloomberg last October quoted Google senior research scientist Greg Corrado confirming its importance:

“RankBrain is one of the ’hundreds’ of signals that go into an algorithm that determines what results appear on a Google search page and where they are ranked,” Corrado said. “In the few months it has been deployed, RankBrain has become the third-most important signal contributing to the result of a search query.”

RankBrain is Google's AI system that helps it process search queries to provide more relevant results for its users. It works by guessing which words might have a similar meaning, so it can filter the results accordingly.
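RankBrain itself is proprietary, so the sketch below is only a toy illustration of the general idea of treating words with similar meanings as related; the word vectors are invented for the example and are not Google's:

```python
import math

# Toy, hand-made word vectors; a real system would learn these from data.
vectors = {
    "laptop":   [0.90, 0.10, 0.00],
    "notebook": [0.85, 0.15, 0.05],
    "banana":   [0.05, 0.10, 0.90],
}

def cosine(a, b):
    """Cosine similarity between two vectors: 1.0 means very similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(vectors["laptop"], vectors["notebook"]))  # high: similar meaning
print(cosine(vectors["laptop"], vectors["banana"]))    # low: unrelated
```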

Other Positive Ranking Factors

Google has confirmed the following ranking factors. Although they are not in the top three we certainly believe these have a big influence on your rankings and should be taken into consideration when optimising your website:

Page Loading Speed

All the way back in April 2010, Google confirmed that page loading speed – how quickly a website responds to web requests – was one of their search ranking factors.

This isn't just useful for helping your site rank better, but for the user experience too. Ever been frustrated by the time it takes a website or webpage to load? Imagine your website loading slowly for first-time users and how that could put them off making a purchase or enquiry.
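As a crude spot-check of response time from your own machine, here is a minimal sketch (placeholder URL; assumes the requests package; a full audit tool will tell you far more than raw response time):

```python
import time

import requests

url = "https://www.example.com/"  # placeholder

start = time.perf_counter()
response = requests.get(url, timeout=30)
elapsed = time.perf_counter() - start

# Response time is only one part of perceived page speed.
print(f"Status {response.status_code}, {len(response.content)} bytes in {elapsed:.2f}s")
```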

Secure Website

In August 2014, Google confirmed it was starting to use HTTPS as a ranking signal, and since then the web has filled up with https:// websites. How significant a ranking factor it is remains anyone's guess, but Google claimed two years ago that they were seeing positive results following a test.

A secure website is also a trust signal to users. They are more likely to place an order through your website if you have a secure site than with a non-secure site.
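A quick hedged check that the plain-HTTP version of a site redirects to HTTPS might look like this (placeholder domain; assumes the requests package):

```python
import requests

domain = "www.example.com"  # placeholder

# Follow redirects from the http:// version and inspect where we end up.
response = requests.get(f"http://{domain}/", timeout=10, allow_redirects=True)
final_url = response.url

if final_url.startswith("https://"):
    print(f"OK: http://{domain}/ redirects to {final_url}")
else:
    print(f"Warning: final URL is not HTTPS: {final_url}")
```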

Negative Ranking Factors

As well as confirming that some of our efforts help rank a website highly in Google, the search engine giant has also confirmed factors that may have a negative effect on your positions within the results:

Manual Penalty

Manual penalties are Google’s way of removing or demoting your website or individual webpages. Those who have suffered a manual penalty from Google are notified via a message in Search Console if they have it set up.

These penalties are not related to algorithm additions such as Penguin or Panda but are Google manually judging and punishing websites themselves. This is usually a result of underhand behaviour such as trying to manipulate Google’s SERPs.

In September, Search Engine Watch published a list of 12 well-known businesses that have been hit with a manual penalty from Google over the years. When the Washington Post, WordPress and BBC are being hit by penalties from Google, no website is safe!

[Screenshot: Search Engine Watch's list of well-known sites hit by Google manual penalties]

Penguin Penalty

Google first introduced the Penguin algorithm update in April, 2012 to devalue the impact of low-quality backlinks. This resulted in some websites and webpages being demoted in Google’s SERPs and some even being kicked out altogether.

For the first nearly four-and-a-half years of its existence, websites could only recover from Penguin after both cleaning up their backlink profile and waiting for Google to manually refresh the results. As of 23 September 2016, Penguin runs in real time, meaning you can be hit or recover within days.

It has been questioned whether sites ever fully recover from a Penguin penalty, so it's certainly worth avoiding any underhand activity that would put you at risk – no one wants their website or business kicked out of Google.

Panda Penalty

The Google Panda update was released in February, 2011 with the aim of hitting sites with low-quality or thin content. This was the start of a big shift with Google providing higher-quality results and not just those with a large number of links.

Shortly after the algorithm was rolled out Google received lots of questions on their support forum, which may have resulted in them releasing a 23-bullet point guide on building high-quality sites.

The quality of content and the length of content for sites ranking at the top of the SERPs for popular keywords has noticeably increased over the past couple of years following the original Panda rollout.

Buying Or Selling Links

Google lists “Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link” on its Link Schemes page.

I’ve certainly heard of more and more websites getting messages from Google about unnatural links within their content. Here’s an example of the message some webmasters have received in their Search Console accounts:

[Screenshot: 'unnatural links' warning message in Google Search Console]

Reciprocal Links

Google has stated that “Excessive link exchanges ("Link to me and I'll link to you") or partner pages exclusively for the sake of cross-linking” are to be avoided.

You're taking a big risk by having a 'links page' on your website these days; exchanging links with other sites via such pages was common more than five years ago.

Article Marketing

Also on the Google Link Schemes page is: “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links”.

At the start of 2014, then head of the Google web spam team Matt Cutts published a blog post on his website announcing the decline in the benefit of guest posting. In the article, Mr Cutts explained how guest posting had turned from something respectable into outright spam, done solely with the intention of increasing a website's rankings within Google.

Press Releases

Press releases aren’t a no-no, but Google has stated that you should avoid any optimised anchor text links within them.

Submitting a release to the wire with optimised anchor text links is straightforward for Google to pick up.

Directories And Bookmarks

The search engine giant disapproves of ’Low-quality directory and bookmark site links’.

These link building methods remain popular though, perhaps because they’re cheap and used to work. I certainly wouldn’t advise this approach in the year 2016 or beyond.

Widgets

Optimised anchor text links on a widget should also be avoided if you want to avoid the wrath of Google.

Footer Links

Google doesn’t react too kindly to “Widely distributed links in the footers or templates of various sites” either.

Forum Comments

Forum comments used to be a popular link building tactic, but Google now acts negatively on "Forum comments with optimised links in the post or signature." Presumably the same can be said of optimised anchor text left in comments on websites too.

Interstitials Or Distracting Ads

As of 10 January 2017, Google will start to demote pages that display intrusive interstitials and annoying ads on mobile devices. The theory is that they provide a poorer experience to users.

Whilst this is initially rolling out on mobile, you can expect it to negatively affect desktop sites as well later in the year.

DMCA Complaints

Since August 2012, Google has been lowering websites in their search results when they have received valid copyright removal notices for them. Their official word back then was "Sites with high numbers of removal notices may appear lower in our results", but Google now removes pages entirely, though it does still display a message to notify the search user:

[Screenshot: DMCA removal notice displayed in Google's search results]

Duplicate Content

Google states that: “In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved.” Basically, your site won’t appear in their results if they believe you were intentionally duplicating content to rank.

Google provides plenty of useful documents containing advice and support, giving you an idea of how to drive organic traffic to your website and helping you avoid making bad choices when attempting to rank within their search engine.

The Webmaster Support pages also have plenty of useful advice and a link to Google's popular support forum, where webmasters offer advice and help of their own.

Author:  Barrie Smith

Source:  http://www.searchenginepeople.com/

The Washington Post reports that a new search engine called Omnity is on the way, which is targeted at researchers and students. Not only is it being recognized for unique features that Google doesn’t offer, many publications are calling it “smarter than Google.”

Reports indicate that Omnity separates itself from the pack by serving up results which best match the search term entered in. There’s also the added capability of indicating how those results relate to one another.

If you're researching a subject you know little about, for example, you can type it in as a search term and immediately see which resources are cited the most. In addition, you can see who has conducted the most influential research on the subject, as well as which university is leading when it comes to research on that subject.

Omnity will pull information from a variety of sets of data including: SEC filings, publicly available news, organizational reports, scientific journals, financial reports, and legal histories.

Alternatively, you can input your own data sources. For example, you can upload a piece of your own research, or research papers found elsewhere, and the search engine will return links to other resources that are relevant but not directly cited in the sources you've uploaded. With this feature, you can easily find unique sources of information to add to your research.

The Washington Post argues that Omnity overcomes one of the problems of modern search engines, which is that they are based on keywords. With that being the case, today's search engines can only return results if the keywords in the title of the page match what's being searched for. Omnity improves on the current search model by scanning through the entirety of a document.

The Post concedes that Omnity is not likely to overtake Google at any point, but niche search engines still have a place in the market. As search continues to evolve, we may see Omnity used in ways we can't predict at this time. The Washington Post gives the example of niche search engine Wolfram Alpha: originally marketed as a computational search engine, it now helps power Apple's assistant, Siri.

It's worth keeping an eye on new search engines like this because they indicate where other search engines might be going. It also demonstrates how our search habits are changing over time.

Author:  Matt Southern

Source:  https://www.searchenginejournal.com

Wednesday, 16 November 2016 11:39

Do You Search in the Singular or the Plural?

A Hitwise post digs into the behavior of searchers and sees whether they prefer searching in the singular or plural. Using the term "laptop" versus "laptops," it is clear that "laptops" is the winner in search. When investigating nine other terms, the following was discovered:

[W]hile the results are not conclusive, it does seem that plural terms are better at sending traffic to retailers than singular terms. Two thirds of the products tested performed better as plurals, with technology products in particular skewing in favour of an added ‘s’.

Of course, as one member on Sphinn notices, this is specific to traffic, not necessarily conversions. However, it's a good first stop. Now can someone compile a report on the conversions? ;)

Forum discussion continues at Sphinn.

Source:  seroundtable.com

Experts from all over the world have been pointing out the dark side of the deep web for quite some time now. However, new research goes to show there are plenty of legal reasons to use the darknet, as the number of legitimate sites far outpaces the number of underground marketplaces. This is quite a surprising outcome, although it will not put governments' minds at ease by any means.

The Darknet Is About More Than Online Crime

The research unveiled by Terbium Labs is quite interesting to take note of. Most people only know the deep web for its criminal activity, and law enforcement is cracking down on these illegal trades. But every story has two sides to it, and it turns out the number of legitimate deep web sites is far bigger than most people give it credit for.

The deep web offers users an additional layer of anonymity and privacy, which is often associated with online crime. However, there are other reasons to demand more privacy when browsing the Internet. Although the research only pulled data from 400 different sites, it goes to show there are multiple legitimate use cases on the deep web.

As one would expect, the results show there are many different categories of content on the darknet. Drugs, fraud, counterfeits, and hacking are all prominent site categories, but they only represent a small portion of all onion-based platforms. In fact, 6.8% of all search results returned adult content, which is also deemed "legal."

Other legal content on the deep web ranges from Facebook's onion site – the social network is often accessed through the Tor browser – to graphic design firms, political parties, and regular forums discussing IT-related content. All of this content could easily exist outside Tor; it is hosted there because the software offers more privacy and anonymity.

The research also highlights some worrisome developments in the "illicit content" category, though. Even though most deep web discussions revolve around drugs and weapons, those only represent a fraction of what is going on among criminals. Exploitation is a serious offence, and it is becoming more prominent on the darknet than ever before; it ranges from pornographic and violent content to other types of illegal activity involving children.

Weapons of mass destruction are notably absent from this darknet findings report. Although exploring 400 websites may not be the best way to gauge discussions about WMDs, it goes to show biological agents have not found their way to "traditional" deep web platforms just yet. But that doesn't mean they are not there for those who know how to look for them.

The main thing to take away from this research is that one cannot classify the deep web as just a place for criminal activity. However, since no one can grasp the full complexity of the darknet, further research is warranted. For now, there is no reason to dismiss the positive side of the deep web, as there are more legitimate use cases than first assumed.

Source:  livebitcoinnews.com

First there were Panda and Penguin. Now, Google will release its mobile update on April 21. This update promises to be even wider-reaching than both of the earlier animal-inspired updates that valued high-quality content.


Understanding the Scope of Google’s “Mobilegeddon” Update

Google’s new update promises to be a game changer. The algorithm will rank mobile-friendly sites higher than non-mobile-friendly ones. Many webmasters from around the world are (rightfully) anxious about its release since it could significantly impact traffic.

From a writer’s perspective, the update gives us something to think about as well. Does this mean we need to learn a whole new way to create web content?

There is no getting around the fact that your website must be mobile-friendly.

Before Panda and Penguin made their debuts, it was fairly easy to rank a website at the top of the search results by indiscriminately stuffing a particular keyword. These updates crippled a number of websites because they depended on that tactic to gain traffic.

The Mobilegeddon promises to do the same for webmasters who have neglected optimization for mobile browsers. This could be potentially devastating to some reaches of the Internet. Google has already stated that there will be no middle ground. Your site will either be mobile friendly or not. This could mean an entire reworking of site architecture and the content contained therein. This is of utmost importance to us as webmasters, writers, and marketers.

Content Production for the Mobilegeddon

Get ahead of this potentially game-changing update. Although it isn’t in effect yet, estimate how writing for a mobile site differs from writing for PCs. There is going to be a series of changes that content producers should aim to heed if they intend to keep producing high-quality, compelling content after the update has rolled out. Read this Search Engine Land post that offers three actions to prepare your website for the impending update.

From what we know about the update, it’s likely that we will have to make changes to our content production habits. Here are a few tactics that will help:

1. Curtail Headline Length

User experience on a mobile device is different than a desktop browser. One of the most obvious differences is the change in screen size (and the amount of usable real estate). Currently, a headline can stretch across the full banner-length of a browser, but mobile screens change the game when it comes to headline width.

 

What this Means for Us: Create shorter headlines. For Twitter users, this just means practising within your 140-character limit more often. For those of us who don't use that particular social network, now is a good time to start. We need to learn how to condense page-width headlines into more bite-sized chunks without sacrificing their impact.
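As a rough helper, the sketch below flags headlines longer than a chosen character budget; the 60-character limit is an assumption to tune for your own templates, not a Google rule:

```python
MAX_CHARS = 60  # assumed budget; adjust for your own templates and devices

headlines = [
    "19 Confirmed Google Ranking Factors",
    "Everything You Could Possibly Ever Need to Know About Writing Headlines for Mobile",
]

for headline in headlines:
    status = "OK" if len(headline) <= MAX_CHARS else f"too long ({len(headline)} chars)"
    print(f"{status}: {headline}")
```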

2. Make Shorter Paragraphs

"Snackable content" is something content producers are all too aware of, but it is especially important for mobile optimization. Create content that the user can consume in one sitting. The format in which we present this content should be as bite-sized as the content itself: because of short attention spans and aversions to "walls of text", mobile users are likely to feel put upon by paragraphs that fill their entire screen.

What this Means for Us: Learn to summarize your ideas. Keep to the point and make your copy more targeted in nature. In some cases, such as home pages, reduce the amount of copy there altogether. Increased copy gives the user a hard time and makes for difficult reading, especially on a tiny display. Get your message across in short bursts.

3. Fewer Words, More Action

In 1984, Orwell invented a form of the English language called "newspeak", in which words were combined, unnecessary and frivolous ones removed, and others that didn't serve a purpose replaced. This mobile update is likely to make content producers do the same: paring content down to be less wordy while interspersing calls to action. Condensing content will require us to consider what we write and distill the message into as few words as possible.

What this Means for Us: Rethink the methodology for creating content. In addition to making content compelling and benefit-focused, we must now also look at the number of words we use and how often we call to action. It could mean a change in the basic tenets of web writing.

The exception is blog content – it will always rank and read better in long form – but for your home and main pages, less content means a better mobile experience and happier readers.

 

Preparing for the Mobilegeddon Now

Luckily, this change does not require us to find a fallout shelter to survive. Writing habits just need to be carefully considered.

You may need to review your web writing and revamp some marketing approaches accordingly, to align with what is expected from mobile-friendly sites.

Source : searchenginejournal

The fact that Google gathers personal data on its users using a variety of their services is in no way recent news. It has long been known that this internet giant has a database where various search patterns and habits of their users are stored.

These are even publicly accessible using said user's Google email and password. The explanation, or rather excuse, for doing this is that the data being collected is carefully guarded and that ultimately it cannot be used to compromise the user in any way.

They continue to explain that the data is used to better the services provided by Google and to further enhance Google’s advertising capabilities by offering relevant ads during internet browsing.

They strongly deny claims that said data is shared with third parties, mainly law enforcement agencies, without the user's knowledge or approval. All of this shows quite a considerable lack of interest in their users' online anonymity.

If all of the above is true, then why are our voice search queries recorded by their virtual assistant on both Android and Windows 10? Every time somebody uses a voice command to search for something on the web, it is recorded and later even transcribed and saved in that database.


Luckily, this database is also readily accessible by the user and can even be altered; deleting it completely is the recommended course of action in order to preserve our online anonymity. After that, there is a way to turn off the permission for recording any further voice commands.


Turning off Google Voice Command Recording

  • First off, we will have to follow this link in order to log into the recordings database.
  • After entering our email and password, we will be presented with the list of recordings that represent our voice commands to Google's virtual assistant, along with their transcriptions.
  • In the left sidebar, there is a “Delete Activity By” button that will allow us to delete entries from the database.
  • After clicking on it, we should choose “All Time” from the “Delete by Date” dropdown menu.
  • By clicking “Delete,” the process will be completed and all of the records, including our voice command recordings, will be deleted.

Turning off Further Recording

While this has deleted all of the records that Google has acquired about us since we started using their services, it still does not prevent them from continuing to collect new records. Luckily, this can be disabled as well, and here is how:

  • Again, we will have to visit this link.
  • Instead of clicking the “Delete Activity By” button, this time, go to “Activity Control.”
  • Here you will be presented with several switches that you can turn off; turn off all of them.

By following the above steps, you have successfully denied Google permission to track and log your online activity, and in turn increased your online anonymity.

Google and Entering the Dark Web

While we should already be aware that it is not possible to access the dark web using Google Chrome, there are still some ways that Google can record our dark web searches, which are mentioned on Tor's official webpage.

As it stands, while using Tor it is important that other browsers, like Google Chrome in this instance, are turned off.

It is also advisable not to use the Google search engine with Tor and to log out of any accounts connected to Google, like Gmail or YouTube, to make sure that our online anonymity is secure.

One last thing to note is that some search queries, like information regarding hidden services, will "mark" us to law enforcement and in extreme cases prompt them to actively monitor our internet usage and online activity.

While this is not tied specifically or solely to Google and its services, it is still advisable to consider using other browsers or at least a strong VPN.

Secure Alternatives to Google Chrome

Tor

Many people believe that tools for online anonymity like Tor are used only by a select few, usually assumed to be hackers.

For quite a long time now Tor has been the #1 browser for people looking to increase their online anonymity. It is also the only way to access hidden services located on the dark web.

Tor uses an ever-expanding network of nodes: computers mostly owned by volunteers who offer them up to help build the Tor network.

This network serves as an intermediary between the user and the internet, making it hard to connect their IP to their searches.

The downside of Tor actually lies in its popularity, which makes it the most heavily monitored secure browser.

Despite this, Tor’s security is hardly compromised, and it will continue its work uninterrupted as long as there is a community to back it up.

Epic

Epic is an awesome browser if you want to keep your web browsing as tightly secure as possible.

There are as many routes taken to ensure online anonymity as there are browsers specializing in it. Epic is one of the more popular choices, and its philosophy is offering security through minimalism.

It is based on Chromium and, if we had to compare it to anything, it would be a heavily stripped-down Google Chrome. An interesting feature of Epic is that it reroutes all its users' searches through the company's servers.

While this does increase online anonymity by making it harder to connect our IP to our specific searches, it also slows down our search speed, though not so much as to make it a poor trade-off.

Another downside is the lack of malware and phishing prevention systems, but these threats can be avoided anyway by using a bit of common sense.

Cocoon Browsing

Cocoon makes the Web a better place by protecting your online privacy.

While not technically a browser, but rather a browser plugin, Cocoon Browsing offers some of the best online anonymity features on the web.

Aside from offering online anonymity through secure browsing, it also has built-in features like anti-Facebook tracking and end-to-end encrypted connection.

The only downside to this service is that it requires a monthly or yearly subscription. There are two versions of the service, Cocoon and Cocoon+, costing $1.49 and $2.49 per month respectively.

Conclusion

With every passing day, big corporations are gathering more and more data on their clients, and while said information is kept secure and confidential, it is only a matter of time before it falls into the wrong hands.

In the end, many people will find that being monitored on the internet is intrusive to their online anonymity, despite the fact that the data logged is not accessible to the public.

While this data may provide some increase in comfort and utility when searching the web, it is still advisable to at least turn off all the tracking permissions in our browsers, if not switch to an anonymity-focused alternative.

Source : darkwebnews

What is the Deep Web? What is the Dark Web? These are questions that tend to arise when we hear the term in many popular spy movies and TV shows today. It is a term that is used to describe a collection of websites that are visible but are very masterfully hidden from the general public. Most of these websites hide their IP addresses, which makes it impossible to identify who is behind the screen. 

These websites can't be accessed via standard browsers like Chrome and Firefox, and they are not indexed on Google or Yahoo either. Some of these websites are useful and some I would not dare visit again. You can buy almost anything illegal from many of these websites: a gun, a kilo of marijuana from Mexico, the services of a hitman, or even a fake identity. The Deep Web is a murky place and I happened to explore it at length to find out what the fuck goes on in there.

How Does it Work?

Almost every website on the Dark Web uses an encryption tool called Tor. Tor also acts as a browser, and many of these websites use a .onion domain, something you cannot access through any regular browser. Tor, or The Onion Router, is free software that was developed by the United States Naval Research Lab for anonymous communication. It is known as onion routing because traffic is wrapped in many layers of encryption, like the layers of an onion.


The internet that we know of consists of 8 billion websites from all over the world, and what's shocking is that websites like Facebook, Google, Amazon and any other page that uses HTTPS or .com as a domain contribute only 4% of the web's content. These websites sit on the 'surface web'; the other 96% of the digital universe is present in the deep web, where websites are protected by passwords and cannot be traced to their owners.


Not all dark websites in the Deep Web use Tor encryption. Some websites, like the all-new Silk Road Reloaded, use a similar service called I2P. Silk Road was (and maybe still is) a website for purchasing and selling recreational drugs; it was the first modern black market website, it caught too much heat, and its owner was arrested and recently incarcerated. We will be talking about online marketplaces in the Deep Web a bit later.

Is it the Deep Web or The Dark Web? 

You might think the terms are interchangeable, but they differ in definition. They don't mean the same thing: Deep Web is the term used for ALL websites that are present in the network, including 'Dark Web' sites. The Deep Web consists of all sorts of data that is boring and present for mundane reasons.

However, it is exciting and scary to talk about Dark Websites in general. 

What Is The Dark Web And What The Hell Is Going On In There?

The Dark web is a part of the Deep Web and it requires specialised tools and equipment to access. These websites are deep underground and the owners of the websites have very good reason to keep their content hidden. 

Because of its nature, we cannot possibly fathom or determine how many websites actually exist with malicious content, but as I was researching the Deep Web, I came across some horrific websites that I would like to elaborate on. 

Online Black Market Marketplaces


The most visited websites on the Deep Web are mostly marketplaces that sell illegal drugs, pharmaceutical products and even pirated games. According to Trend Micro, an internet security firm, the user base that visits these websites typically likes to buy and sell a range of recreational and prescription drugs.


The deep web provides a platform for anonymity, and that is the best motivation for people to engage in illegal activities. There is a cybercriminal underground, and these activities can have drastic effects in the real world. Recently, the founder of Silk Road, Ross William Ulbricht, better known as Dread Pirate Roberts, was accused of money laundering, murder for hire, computer hacking, conspiracy to traffic fraudulent identities and conspiracy to traffic narcotics. You might be asking why? That's because his website, Silk Road, enabled people to sell all of these services. Think of Silk Road as the Amazon of illegal substances and services.


Illegal drugs are easily accessible on the Deep Web and the selection varies a lot. Many of these websites sell cocaine, cannabis and psychedelics, amongst others.


There’s even a search engine called ‘Grams’ that indexes and allows people to easily search Deep Web sites that sell illegal drugs. Hell, they even mimicked the Google logo to set themselves apart from other competing websites.  

Money Laundering and Counterfeiting 

On the Deep Web, you never use your regular credit card or debit card to buy stuff; you don't even use PayPal for these services. A virtual currency called Bitcoin is the dominant mode of payment, a currency designed with anonymity in mind. It is the ideal currency for illegal activities, as it sits outside the control of traditional banks and governments.


There are services available on the Deep Web that make it even harder for authorities to track your Bitcoin transactions. These services mix your Bitcoins through a tangled web of transactions and return them to you. Of course, they charge a small processing fee, but this way the coins become far harder to track.


Bitcoins can be exchanged for real cash; however, there is also a wide availability of fraudulent currency on the Deep Web. This counterfeit cash is available to buy in bulk or on a per-order basis. The notes are almost identical to the real thing and are made of 100% cotton linen paper, which is used in most paper money today.

These bills even have the appropriate watermark to make them look legitimate and can fool the basic infrared checkers you commonly see in banks today. Many of these bills can still be detected by a proper infrared scanner, but that does not prevent people from buying or selling them. These websites offer $20 bills for half the price, and other websites also offer euros and yen.

Guns 

According to research by Carnegie Mellon University (CMU), the most popular items sold on the dark web are illegal drugs; MDMA and marijuana top the list. However, the sale of guns and other weapons is catching up. There is a dedicated website called The Armory, where users can buy all kinds of illegal firearms and explosives. And get this: they ship all over the world!

These sites have made it hard for authorities to effectively monitor them and it seems like the situation is not going to get better anytime soon. 

Passports and Citizenships 


Owning an American or an EU passport is one of the most valuable assets when it comes to travel and citizenship benefits. Being an American or EU citizen certainly has its perks: a passport not only lets you cross borders, it also lets you open bank accounts, apply for loans, purchase property and even claim state benefits in the relevant country. Passports and other powerful documents are faked and sold on the dark web, and there are plenty of websites that claim to sell identical passports and driving licences. Prices vary from country to country.


In fact, the founder of Silk Road, Dread Pirate Roberts, was caught partly because he ordered dozens of fake IDs on the deep web in order to hide his identity from the FBI. These fake IDs were intercepted by the FBI and showed how extensive and accurate some of these documents can be.

Child Pornography 

I am not even going to dignify this topic with a full blown paragraph, since it is simply inhumane and disgusting even talking about this. Child pornography is present in stupendous quantity and it needs to stop right now. Just make sure you don’t click on that Twitter logo if you ever decide to explore the dark web. 

The Deep Web was invented with the sole purpose of fulfilling a genuine need for freedom and anonymity. It's used by governments to communicate with each other during a crisis, and journalists use it to leak documents that wouldn't normally be available on the surface web. Civilians used it during the Egyptian crisis, which shows that the Deep Web has far more use for good than for evil.


Cybercrime has nevertheless emerged as the dominant form of usage for users from across the globe. The dark web is the platform for obscurity and protection these cybercriminals need in order to operate. It gives them an edge over law and order, and they have been polluting a space that might otherwise be the future of online anonymity.

We here at Mensxp pay no heed to the Dark Web and we do not endorse or encourage you to start taking part in any illegal activity or immoral behaviour.   

Source : mensxp

We asked Google's Gary Illyes what SEOs and webmasters should think about as this year closes.

Every year we like to get a Googler who is close with the ranking and search quality team to give us future thinking points to relay to the search marketing community. In part two of our interview with Gary Illyes of Google, we asked him that question.


After a little bit of coercion, Illyes told us three things:

(1) Machine learning

(2) AMP

(3) Structured data

He said:

Well I guess you can guess that we are going to focus more and more on machine learning. Pretty much everywhere in search. But it will not take over the core algorithm. So that’s one thing.

The other thing is that there is a very strong push for AMP everywhere. And you can expect more launches around that.

Structured data, again this is getting picked up more and more by the leads. So that would be another thing.

It is interesting that Illyes said the core algorithm will not be taken over by machine learning. AMP is obvious, and structured data is as well.

Barry Schwartz: Gary, one of my favorite parts of Matt Cutts' presentations, I guess, at SMX Advanced towards the end of the year and some other conferences, was that one of the slides always gave webmasters and SEOs what's coming, like what the Google search quality team is looking into for the upcoming year. And we're kind of at the end of the year now and I was wondering if you have any of those to share.

Gary Illyes: Come on, it's early October. I understand that they started, like, pushing the pumpkin spice thing, but it's really not the end of the year.

Danny Sullivan: I mean, you guys take all of December off and work your way back from the end of the year. And might I add, like December 4th you'll be like, here's the end. It's not like in January; Google Trends goes, here's what happened this year, and those are pushed out in December. Well, one thing, surely you've got one thing. One forward-looking statement that you can give for us to make stock purchases.

Gary Illyes: Actually you are in luck because I have a presentation or keynote for Pubcon and I will have a slide for what’s coming.

Danny Sullivan: Well let’s have that slide. Because this isn’t going to air until after…

Barry Schwartz: Can you share anything, one thing,

Danny Sullivan: One thing, this isn’t, like this isn’t live.

Gary Illyes: Oh sure.

Well I guess you can guess that we are going to focus more and more on machine learning. Pretty much everywhere in search. But it will not take over the core algorithm. So that’s one thing. The other thing is that there is a very strong push for AMP everywhere. And you can expect more launches around that. Structure data, again this is getting picked up more and more by the leads. So that would be another thing.

Source : searchengineland

Wednesday, 19 October 2016 06:57

5-step guide to a killer marketing strategy

Benjamin Franklin once said, “If you fail to plan, you plan to fail.” Just as you plan for other aspects of your business, such as product, operations and inventory, marketing requires some extensive planning as well. Plot your marketing strategies ahead of time and it may drive your business from breaking even to breaking records!


If you can’t get your product out there to your customers, there’s really no point in continuing the line of work. So, no matter the size of your business, spend some time to think about your business and craft a robust, killer marketing strategy. There are many factors to consider, but let’s look at what you should focus on.

1. Identify your customer persona

Smart businesses always determine their niche market first. Surveys, focus groups, and website metrics are great ways to get to know your audience and help you develop a detailed customer profile. Stay tuned to your customers' needs, and make them feel valued with intelligent and relevant content – this helps set the stage for long-term relationships. If you're bootstrapping, this can be a powerful marketing technique!

2. Know your competition

Entering new markets involves a great deal of research about the market landscape, which includes your competition. Instead of trial and error—which is a rapid path to burnout—look at what your competitors are doing. Pinpoint both their strengths and weaknesses and determine what you can do to edge ahead of the rest. What makes you special to your customers? Why is your product different and better? Always remain differentiated and don’t get lost in advertising clutter and spam. Know who you’re up against, and outsmart them.

3. Set realistic goals

Imagine a desolate desert. You're wandering aimlessly from one mirage to another. If you don't know where you're going, how do you know when you get there? The same goes for marketing. Setting goals is the starting point of any plan. The exact marketing achievements should be realistic and attainable, and depending on the industry and business stage you're in, each goal should impact the bottom line.

4. It’s all about tactics

Game plans are important. They guide and keep you focused on the key aspects of your business. Spend some time figuring out the exact marketing tasks that can help you achieve your goals, and be specific. For instance, if your goal is to generate online leads, then video content marketing tasks may become part of your key activity list. Dive deeper into the details: YouTube videos? Webinars? Testimonial videos? Video EDMs? Get your tactics right and you'll be well on your way to a home run!

5. Stay ahead of the curve

Technology evolves. Markets change. Algorithms adjust. Users adapt. We live in a dynamic environment where changes are happening constantly. Take social media, for example. Facebook and Instagram, two of the major social media platforms that businesses use, constantly change their algorithms. This potentially alters the way marketers maintain their reach and interaction with their customers on those platforms. Whether we like it or not, change is the only constant. Regularly review your marketing strategy and revise it as necessary. It is really more of a process than a plan.

Learn from seasoned professionals

Marketing doesn’t have to be rocket science. With the right marketing strategy, it can be your best sales asset.

They say learning is a never-ending process, and you can always do with more knowledge. That’s exactly what Tech in Asia Jakarta’s Marketing Stage aims to do. Glean new insights from experts at Hubspot, LINE, Edelman, and more as they share actionable takeaways on Day 1 of our Jakarta conference on November 16 and 17. Check out our panel of speakers below:

With one pass, get access to all 6 content stages and so much more! Passes are currently going fast, at a 10 percent discount (code: tiajkt10). Promotion ends on October 28 – get your passes today before it’s too late!

Source : telegraph

Google employs some of the world’s smartest researchers in deep learning and artificial intelligence, so it’s not a bad idea to listen to what they have to say about the space. One of those researchers, senior research scientist Greg Corrado, spoke at RE:WORK’s Deep Learning Summit on Thursday in San Francisco and gave some advice on when, why and how to use deep learning.

Advertisment

become-an-internet-research-specialist

His talk was pragmatic and potentially very useful for folks who have heard about deep learning and how great it is (well, at computer vision, language understanding and speech recognition, at least) and are now wondering whether they should try using it for something. The TL;DR version is "maybe," but here's a little more nuanced advice from Corrado's talk.

(And, of course, if you want to learn even more about deep learning, you can attend Gigaom’s Structure Data conference in March and our inaugural Structure Intelligence conference in September. You can also watch the presentations from our Future of AI meetup, which was held in late 2014.)

1. It’s not always necessary, even if it would work

Probably the most-useful piece of advice Corrado gave is that deep learning isn’t necessarily the best approach to solving a problem, even if it would offer the best results. Presently, it’s computationally expensive (in all meanings of the word), it often requires a lot of data (more on that later) and probably requires some in-house expertise if you’re building systems yourself.

So while deep learning might ultimately work well on pattern-recognition tasks on structured data — fraud detection, stock-market prediction or analyzing sales pipelines, for example — Corrado said it’s easier to justify in the areas where it’s already widely used. “In machine perception, deep learning is so much better than the second-best approach that it’s hard to argue with,” he explained, while the gap between deep learning and other options is not so great in other applications.

That being said, I found myself in multiple conversations at the event centered around the opportunity to soup up existing enterprise software markets with deep learning and met a few startups trying to do it. In an on-stage interview I did with Baidu’s Andrew Ng (who worked alongside Corrado on the Google Brain project) earlier in the day, he noted how deep learning is currently powering some ad serving at Baidu and suggested that data center operations (something Google is actually exploring) might be a good fit.


2. You don’t have to be Google to do it

Even when companies do decide to take on deep learning work, they don’t need to aim for systems as big as those at Google or Facebook or Baidu, Corrado said. “The answer is definitely not,” he reiterated. “. . . You only need an engine big enough for the rocket fuel available.”

The rocket analogy is a reference to something Ng said in our interview, explaining the tight relationship between systems design and data volume in deep learning environments. Corrado explained that Google needs a huge system because it’s working with huge volumes of data and needs to be able to move quickly as its research evolves. But if you know what you want to do or don’t have major time constraints, he said, smaller systems could work just fine.

For getting started, he added later, a desktop computer could actually work provided it has a sufficiently capable GPU.

3. But you probably need a lot of data

However, Corrado cautioned, it's no joke that training deep learning models really does take a lot of data. Ideally, as much as you can get your hands on. If he's advising executives on when they should consider deep learning, it pretty much comes down to (a) whether they're trying to solve a machine perception problem and/or (b) whether they have "a mountain of data."

If they don’t have a mountain of data, he might suggest they get one. At least 100 trainable observations per feature you want to train is a good start, he said, adding that it’s conceivable to waste months of effort trying to optimize a model that would have been solved a lot quicker if you had just spent some time gathering training data early on.
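Taking that rule of thumb at face value, a back-of-the-envelope sketch might look like this; the 100-observations-per-feature figure comes from Corrado's comment, and the feature count is just an example:

```python
def min_observations(num_features: int, per_feature: int = 100) -> int:
    """Rough lower bound on training examples, per Corrado's rule of thumb."""
    return num_features * per_feature

# Example: a model with 250 input features.
print(min_observations(250))  # 25000 observations as a rough starting point
```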

Corrado said he views his job not as building intelligent computers (artificial intelligence) or building computers that can learn (machine learning), but as building computers that can learn to be intelligent. And, he said, "You have to have a lot of data in order for that to work."


4. It’s not really based on the brain

Corrado received his Ph.D. in neuroscience and worked on IBM’s SyNAPSE neurosynaptic chip before coming to Google, and says he feels confident in saying that deep learning is only loosely based on how the brain works. And that’s based on what little we know about the brain to begin with.

Earlier in the day, Ng said about the same thing. To drive the point home, he noted that while many researchers believe we learn in an unsupervised manner, most production deep learning models today are still trained in a supervised manner. That is, they analyze lots of labeled images, speech samples or whatever in order to learn what it is.

And comparisons to the brain, while easier than nuanced explanations, tend to lead to overinflated connotations about what deep learning is or might be capable of. “This analogy,” Corrado said, “is now officially overhyped.”

Source : gigaom

