
China's top Internet search company has been ordered to change the way it displays results after an investigation into the death of a student who relied on its services to look for cancer treatments.

China's cyberspace regulator ruled Monday that Baidu's search results are not clearly labeled, lack objectivity and heavily favor advertisers.

 

The Internet giant has until May 31 to change its search algorithm to ensure that results are based on credibility, and limit the size of advertisements to 30% of a results page. The company must also clean up its medical services ads and remove offers from unauthorized providers.

 


Baidu drew the attention of investigators after it was mentioned in a widely shared online post written in February by Wei Zexi, who was suffering from a rare form of cancer.
Wei recounted that in a desperate move to save him, his parents had borrowed money and sought an experimental treatment at a military hospital in Beijing based on Baidu search results.


"Baidu, I didn't know how evil it is and how it ranks medical information based on a bidding process," he wrote. "We thought: Baidu, a top-ranked hospital ... everything must be legitimate."
After a failed treatment, Wei died less than two months later at the age of 22.

 

 

As word of his story spread, scathing attacks on Baidu multiplied, first across Chinese social networks and then in traditional media. Fury was directed in particular at the company's alleged advertising practices for medical services.
Baidu said Monday that it would comply with the cyberspace regulator's ruling. Working under the supervision of investigators, the company said it has reviewed all medical service and product providers and removed some 126 million promotional messages from its site.


"The tragic death of Zexi has caused a huge social impact and also deeply touched Baidu," the company's top search executive, Xiang Hailong, said in a statement. "It has prompted all of us to re-examine our responsibility as a search engine company."


A separate investigation into the hospital conducted by health authorities and the military found it had misled the public by illegally outsourcing services and publishing fake medical ads.

 

 

But many Chinese Internet users have expressed anger that Baidu's search results allegedly blurred the line between a paid advertisement for the experimental treatment and more helpful information for Wei and his family.
Long frustrated by their government's blocking of Google services, Chinese Internet users have portrayed Baidu as an unscrupulous business that thrived only because of China's extensive censorship system, known as the Great Firewall. Wei made a subtle point about access to Google in his post.

 

"A Chinese student in the United States helped me Google relevant information and contacted many hospitals there," he wrote after his treatment failed. "Only then did we find out that American hospitals had long stopped using the technology (used in the treatment) due to poor results in clinical trials."


Some analysts say that Baidu -- an Internet giant worth more than $60 billion -- makes for an easier public target than a complex web of murky entities in the Chinese healthcare sector.
But they also note that Baidu has been linked to similar controversies in the past and that it has many advertisers from the medical industry.


"Ad sales from China's medical service industry account for a big part of Baidu's total ad revenues," said Hong Bo, a prominent independent IT industry commentator. "I think it would be very difficult for them to give up on this huge amount of income."


Baidu doesn't disclose ad revenues by sector, but has said that health care was one of its "top revenue verticals" last year. Shares in the company have lost more than 10% over the past five trading sessions.

 

Source:  http://money.cnn.com/2016/05/09/technology/baidu-china-student-death-investigation/index.html

 

 

 

Categorized in Search Engine

Yahoo CEO Search Heats Up:

 

All Things D says Yahoo has narrowed down its CEO search to four or five big names, including Google business lead Nikesh Arora, former Microsoft executive Brian McAndrews, and Hulu CEO Jason Kilar. While Kilar would be absolutely perfect (he's smart, likeable, dynamic, and, as we've seen from his memos to his network partners, he's not afraid to shake up old thinking), it's hard to imagine him leaving Hulu to run Yahoo. Hulu may be a tough place to work given its joint-venture nature, but it's far more functional than Yahoo. While we've argued for a while that outgoing CEO Carol Bartz should be replaced with a media executive (since Yahoo should operate as a media company), among the other, more realistic candidates, McAndrews makes the most sense. His background — he was the head of aQuantive and was an ABC exec for a decade — brings just the right blend of agency thinking, media-business savvy, and tech prowess that Yahoo needs.

 

McAndrews surely understands that Yahoo isn’t going to become a search technology or engineering hotbed anytime soon, and shouldn’t try.

 

The AOL Agonistes:

 

AOL is having one tough time. CEO Tim Armstrong's turnaround strategy has more than its fair share of doubters. Add in an activist fund that's taking aim at AOL's direction. According to Starboard, which owns about 4.5 percent of AOL, the company's crown-jewel media assets collectively lose $500 million a year. Starboard believes the market is effectively valuing AOL's media business at zero. That's stunning, if true. AOL isn't perfect, but it's still a large, respected player in the ad world. The idea that it's fallen that far is amazing.

 

The Problem with Social TV:

 

Marketers are talking up the potential of social TV, aided by the growth of apps such as IntoNow, Shazam, and GetGlue. But one glaring issue is being overlooked, and that's the emergence of time-shifted and on-demand viewing. Most of those apps are designed to be used as a show airs, but it's unclear what value they can add for both consumers and content creators as users continue to consume less live TV content.

 

Swedish Show and Tell:

 

It sounds like the fantasy of a geek who lives primarily in a pixelated universe: being placed in charge of the Twitter account of an entire country. Those super-hip Swedes at curatorsofsweden.com give one random Swedish citizen a chance to tweet in the name of all Swedes for a week through the @sweden Twitter address, in a social engagement project that has been called "an insane breach of branding principles." Breach, yes; insane, no. Sweden, like most smart brands, retailers and marketers, knows that you don't really need an 80-person big data firm to connect with your audience. You can connect, using common sense, right where they are: in social. Sweden is using the much-lauded Voluntaire agency, which states that "no one is waiting for an advertising campaign" and "companies no longer have the power to control their brands." For Voluntaire, and the @sweden campaign, branding has become about real-time interactions, not just target audiences derived from algorithms of varied quality. The big data industry, with all of its pomp and circumstance, ought to take a few cues from Sweden and focus on creating crowdsourced innovations, not just audience segments for banner ads.

 

Source:  http://digiday.com/publishers/yahoo-ceo-search-narrows/

Categorized in Search Engine

 

Getty Images on Wednesday filed a competition law complaint against Google with the European Commission.

The company last year filed an "interested third party" submission in support of the EC's investigation into Google's anticompetitive business practices.

 

Getty's complaint, in essence, is that Google Images creates galleries of high-res copyrighted content, and that providing easy access to them dissuades consumers from going to the source to view or license those images. That damages Getty's image licensing business.

 

"Google's behavior is adversely affecting the lives and livelihoods of visual artists around the world, impacting their ability to fund the creation of future works," said Getty General Counsel Yoko Miyashita.

Getty also complained about Google giving preference to its own image search over its rivals' search tools.

 

 

Encourages Piracy

 

Online search is "an essential tool for the discovery of images," and Google Images dominates the market, Miyashita told TechNewsWorld. The Google Images format "promotes right-click piracy by making high-res imagery easily available" without the need to get a license or permission from the source.

"By cutting off user traffic to competing websites like Getty Images and reserving that traffic exclusively for its own benefit, Google creates captive, image-rich environments, and is able to maintain and reinforce its dominance in both image search and its general search services," Miyashita argued.

 

Fair Use or Not?

 

"Google's rationale for image search, in general, is that displaying the image is necessary for the user to assess how well the image corresponds to their search," noted Matthew Sag, a professor at the Loyola University Chicago School of Law.

 

That practice "has been litigated at least twice in the United States in relation to thumbnail images and has easily passed the test of fair use," he told TechNewsWorld.

 

Getty's complaint "is directed more specifically to the creation of high-resolution galleries," and Google could make a similar argument that consumers need to see the image in high-res to properly evaluate it, but "this argument is not nearly so compelling," Sag said.

 

What Getty Wants

 

Getty Images wants to encourage the EC, the U.S. Federal Trade Commission and other government bodies "to implement competition law-based sanctions against Google," Miyashita said.

 

The goal is to ensure that Google "ceases its anticompetitive practices and that when displaying images, it does so in a format that simply directs users to the most relevant source website in a way that doesn't prejudice image owners for Google's benefit," she explained. That means one click to source.

 

Further, Getty Images "would like to see Google taking steps to discourage copyright infringement," added Miyashita.

However, Google "is under no obligation to design its information services in a way that drives traffic to a particular website, or external websites in general," Loyola's Sag pointed out.

 

Getty's Chances of Success

 

The type of claim Getty is making "failed in the United States in Perfect 10 v. Google," noted Ben Depoorter, Sunderland Chair at UC Hastings College of the Law.

The Ninth Circuit Court of Appeals held that Google's framing and hyperlinking as part of an image search engine constituted fair use because it was highly transformative.

 

However, in Getty's case, "the context is different," Depoorter told TechNewsWorld, "because of the antitrust angle and the fact that EU countries do not have a similar broad and flexible fair use exception to copyright law."

Google Images "is part of Google's continued effort to provide the most attractive search engine in the world," Depoorter remarked. Image thumbnails on Google "direct you to other websites," and the higher-res versions "seek to improve the display of the results."

 

Google "will claim that they're in the business of directing traffic to the sites, not replacing them," he suggested.

While the design of Google Image search "would make a poor antitrust case in the U.S.," said Sag, "it might go further in the EU, because they take a broader view of abuse of dominance."

 

Source:  http://www.technewsworld.com/story/83438.html

 

 

Categorized in Search Engine

Flash's death has been slow and painful, and now Google is planning to deal it another blow. Google has detailed plans to start blocking most Flash content with Chrome, with the change targeted toward the end of this year.

 

Under its current vision, nearly every website would have Flash content blocked by default. Visitors would still be able to enable Flash content on a site-by-site basis, but they would have to specifically choose to do so. Chrome would display a prompt offering to enable Flash; if chosen, Chrome would remember to run Flash on that site for all future visits.

 

 

Only 10 sites would have Flash enabled by default, the "top 10 domains using Flash," to avoid annoying people with too many prompts. Those include YouTube, Facebook, Yahoo, Twitch, and Amazon. But they'll only have a one-year exemption. After that, it sounds like they'll have Flash blocked by default, just like everyone else.

 

Of course, this change still doesn't fully remove Flash from Chrome. It's still in there and can still be run widely, so long as people keep giving it permission. Even so, disabling it by default still offers protection against unwanted and potentially malicious content. And it encourages web developers to make the switch to HTML5, so that visitors aren't turned away from their sites.

 

To further encourage that change, Chrome won't simply be blocking Flash — it'll be pretending like Flash isn't even installed. So if a website has a backup HTML5 player, people using Chrome will see that, rather than a prompt to enable Flash.
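The policy described above (blocked by default, per-site choices remembered, a temporary exemption for the top Flash domains) can be sketched as a small decision function. This is an illustrative model, not Chrome's actual implementation; the domain list and cutoff date below are assumptions based on the report.

```python
# Toy model of the proposed Flash policy: block by default, exempt a short
# allowlist of top domains for roughly a year, and remember each user's
# per-site answer to the enable-Flash prompt. Names and dates are invented.
from datetime import date

TOP_FLASH_DOMAINS = {"youtube.com", "facebook.com", "yahoo.com",
                     "twitch.tv", "amazon.com"}
EXEMPTION_ENDS = date(2017, 12, 31)  # assumed one-year window

class FlashPolicy:
    def __init__(self):
        self.user_choices = {}  # domain -> True (allow) / False (block)

    def remember(self, domain, allow):
        """Record the user's answer to the enable-Flash prompt."""
        self.user_choices[domain] = allow

    def flash_enabled(self, domain, today=None):
        today = today or date.today()
        if domain in self.user_choices:          # an explicit choice wins
            return self.user_choices[domain]
        if domain in TOP_FLASH_DOMAINS and today <= EXEMPTION_ENDS:
            return True                          # temporary exemption
        return False                             # blocked by default

policy = FlashPolicy()
policy.remember("example.com", True)  # user clicked "enable" on this site
```

Note that once the exemption window passes, even the allowlisted domains fall back to the default-block branch, matching the "just like everyone else" outcome described above.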

Specifics of Google's plan could still change. But the proposal notes that "the tone and spirit should remain fairly consistent," even if details are altered here and there.

 

 

 

Google began enabling Flash blocking on a very limited scale a year ago, when it started "intelligently" pausing unnecessary content as a way to preserve battery life. That's the default setting right now; this plan pushes things much further.

 

If you're interested, you can already enable the settings that Google is planning to switch over to. Buried inside Chrome's preferences page (under privacy and then content settings), you can find an option called "Let me choose when to run plugin content." It will block all Flash content until you right-click on it and choose to enable it.

 

Even Adobe doesn't think people should use Flash any longer, so there likely won't be a huge amount of pushback on Chrome's changes. Flash is a menace on battery life and is continually found to have serious security flaws, so its eventual disappearance will be celebrated at every step.

 

Source:  http://www.theverge.com/2016/5/15/11679394/chrome-to-block-flash-later-2016

 

 

 

Categorized in Search Engine

Spotted on mobile, the carousel features individual cards that link to separate landing pages.

 

Just when things were starting to feel a little stale in mobile search after the big desktop shakeup with the removal of right rail ads, Google is testing a new look for sitelinks on mobile.

 

Meh, a sitelinks test, you say? But this one is actually an interesting new use for cards in paid search ads. Conrad O'Connell posted screenshots of the new test on Serptests.com for a search on "ocean isle beach rentals." I was able to replicate it once, but then lost it; I looked for other examples and couldn't find any. I also wasn't able to replicate it on desktop, so it's possible this is a mobile-only test. The sitelink cards appear in a swipeable carousel below the ad copy in this Airbnb ad.

 

 

 

[Screenshot: sitelink cards test in a Google AdWords ad, via Serptests.com]

 

 

 

We’re calling these sitelinks because they link to pages on the Airbnb site, but they may end up being called something else. The structure shown in the Airbnb ad above looks like the card info might be pulling from a feed. You’ll notice the larger font pricing in these cards makes them particularly prominent. We’ve reached out to Google and will update here if we learn more.

 

Source:  http://searchengineland.com/google-testing-carousel-sitelink-cards-search-ads-249471

 

Categorized in Search Engine

Google has made a significant change to its search results pages by extending the length of titles and descriptions. This was first spotted by Ross Hudgens on Twitter, and later reported on by Jennifer Slegg at The SEM Post.

 


Hudgens tweeted: "Long title tags being tested again in the SERPs. Seeing 69 and 70 character results today."

 

Here’s What Has Changed


Title tags have been increased to 70–71 characters, up from the previous 50–60 characters. That's at least enough to fit in another word or two.

 

Meta descriptions have been increased by 100 characters per line, and extended from two to three lines. That’s a significant increase, and presents far more of an opportunity to tell searchers what the page is about.

Slegg reports that Google is still truncating descriptions to two lines for many search results, so you may still see them coming in at around 160 characters at times. When a three-line snippet is displayed, they come in at around 278 characters in total.

It's important to note that this may be a test which Google could reverse at any time. By the company's own admission, it is always A/B testing. So it's a good idea not to base your SEO efforts on these numbers until it's known for sure whether the test will become a permanent change.

 

 

What Do SEOs Think?


For the most part, the change has gone unnoticed. According to this Reddit thread, the change in organic search happened on May 4th, and it's only now that anyone is really talking about it.

“The analysis that I’ve done has shown that Google updated the search results layout to give organic listings an increased title tag size – as well as increased width of the Featured Snippets/Answer Boxes, but have decreased the height of the Featured Snippets/Answer Boxes.”

 

 

Commenters in the Reddit thread for the most part agree the change hasn't been noticed. Redditor Jonathan Jones also wrote on his own blog that the change has impacted CTR in a positive way, which could be a sign it's here to stay if it continues to produce positive results.

Jones recommends measuring your CTR from prior to May 4th to now to see if you’ve noticed any positive or negative change as a result of the longer search snippets.

Source:  https://www.searchenginejournal.com/google-titles-and-descriptions-2016/163812/

Categorized in Search Engine

Google has announced they have more deeply integrated the Google Search Console metrics into the Google Analytics reports. Under the Acquisition tab, you may now see a new section named “Search Console,” and this has replaced the “Search Engine Optimization” tab.

The new Search Console tab combines the data from both sources, i.e., Search Console and Google Analytics, into one report. Google will show you acquisition, behavior and conversion metrics for your organic search traffic directly in these reports. Previously, Google only showed the acquisition data in the old search engine optimization reports.

The Search Console tab contains four reports: Landing Pages, Countries, Devices and Queries. The first three give you the migrated experience, while the Queries report gives you only acquisition metrics, like the old report.

Here are the metrics one can get in the Landing Pages, Countries and Devices sections:

[Screenshot: table of Search Console and Google Analytics metrics]

Each of these new reports will display how your organic search traffic performs when measured by any of these dimensions, Google told us. As data is joined at the landing page level, Landing Pages, Countries and Devices will show both Search Console and Google Analytics data, while the Queries report will only show Search Console data for individual queries. The same search queries will display in Google Analytics as you see in Search Console today, Google added.
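The landing-page-level join Google describes can be pictured as a simple merge of two keyed datasets: Search Console supplies acquisition metrics, Analytics supplies behavior metrics. The field names and figures below are invented for illustration.

```python
# Sketch of joining Search Console and Google Analytics data at the
# landing-page level. Field names and numbers are made up for the example.
search_console = {
    "/pricing": {"clicks": 120, "impressions": 4300},
    "/blog":    {"clicks": 310, "impressions": 9800},
}
analytics = {
    "/pricing": {"sessions": 115, "bounce_rate": 0.38},
    "/blog":    {"sessions": 290, "bounce_rate": 0.61},
}

def join_on_landing_page(sc, ga):
    """Merge the two sources for landing pages present in both."""
    return {page: {**sc[page], **ga[page]} for page in sc.keys() & ga.keys()}

report = join_on_landing_page(search_console, analytics)
```

Because the join key is the landing page, per-query rows cannot be merged the same way, which is consistent with the Queries report showing only Search Console data.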

This feature is rolling out to users over the next few weeks, so keep checking for it in Google Analytics.

Note that there is also the standard three-day data delay in Google Analytics that you would see in the Google Search Console.

Source:  http://searchengineland.com/google-search-console-metrics-now-deeply-integrated-google-analytics-249334

Categorized in Search Engine

It was only last year that Google turned its links from red to blue, but during that time searchers have grown quite accustomed to the blue links. So much so that when Google recently turned them black, outrage ensued.

 

In an A/B test, Google has changed its blue link titles to black. According to the reactions so far, it’s unlikely this change will become permanent any time soon. A Google spokesperson has confirmed the test, while stating they’re not quite sure if the black links are here to stay.

 

 

 

 

Here’s a quick screenshot of comments from Reddit regarding this news:

[Screenshot of Reddit comments]

 

 

If you empathize with the commenter at the bottom and are looking for a way to “fix” this, you’re in luck! There are a few ways you can go about changing the links back to blue. Keep in mind your mileage may vary, but here are some of the fixes people have reported so far.

 

How to Turn Black Links Back to Blue

 


A surefire way to get blue links back is the Chrome extension called Stylist, which lets you manipulate the style sheet of any website, even Google.

Some searchers have reported that logging in and out of their Google account returns the links to blue.

Other users have reported seeing the blue links again after changing certain settings in Chrome. To do this, navigate to the Google home page and click on the grid icon in the top left corner. Select "My Account". Under "Personal info & privacy", turn off "Your searches and browsing history", which allegedly can turn the links back to blue.

Alternatively, you can live with the black links until the test inevitably runs its course. Or, on the other hand, maybe black is the new blue?
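If you go the Stylist route, a minimal user style might look like the following. The selector is a guess for illustration only; Google's result markup is undocumented and changes often, so you may need to inspect the page and adjust it.

```css
/* Hypothetical user style: restore blue result titles on google.com. */
h3.r a,
h3.r a:visited {
    color: #1a0dab !important; /* Google's classic link blue */
}
```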

 

Source:  https://www.searchenginejournal.com/google-turns-links-black-get-back-blue/163495/

 

 

 

Categorized in Search Engine

London: Scientists have developed a new technology that could help connect every household to very high speed internet at lower costs.

The innovative technology will help address the challenges of providing households with high bandwidths while future-proofing infrastructure against the exponentially growing demand for data, researchers said.


While major advances have been made in core optical fibre networks, they often terminate in cabinets far from the end consumers.

The so-called 'last mile', which connects households to the global internet via the cabinet, is still almost exclusively built with copper cables, as the optical receiver needed to read fibre-optic signals is too expensive to put in every home.

"We have designed a simplified optical receiver that could be mass-produced cheaply while maintaining the quality of the optical signal," said lead researcher Sezer Erkilinc, from University College London (UCL).

 

"The average data transmission rates of copper cables connecting homes today are about 300 megabits per second and will soon become a major bottleneck in keeping up with data demands, which will likely reach about 5-10 gigabits per second by 2025," said Erkilinc.

"Our technology can support speeds up to 10 gigabits per second, making it truly future-proof," he said.
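To put those line rates in perspective, a quick back-of-the-envelope calculation compares download times, treating the quoted figures as bit rates (the usual convention for link speeds) and using an arbitrary 5 GB file size.

```python
# Rough download-time comparison for the line rates quoted above, treated
# as bit rates. The 5 GB file size is an arbitrary example.
def download_seconds(file_gigabytes, rate_megabits_per_s):
    bits = file_gigabytes * 8 * 1000**3          # decimal gigabytes -> bits
    return bits / (rate_megabits_per_s * 1000**2)

copper_s = download_seconds(5, 300)      # ~133 s on a 300 Mb/s copper line
fibre_s = download_seconds(5, 10_000)    # ~4 s at 10 Gb/s over fibre
```

The two-orders-of-magnitude gap is the "major bottleneck" Erkilinc is describing.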

Scientists developed a new way to solve the 'last mile problem' of delivering fibre connections direct to households with true fibre-to-the-home (FTTH) broadband technology.

They simplified the design of the optical receiver, improving sensitivity and network reach compared to existing technology.

 

Once commercialized, it will lower the cost of installing and maintaining active components between the central cabinet and homes.
A major factor limiting the uptake of FTTH is the overall cost associated with laying optical fibre cables to each household and providing affordable optical receivers to connect them to the network.

 

The novel optical receiver retains many of the advantages of the conventional optical receivers typically used in core networks, but is smaller and contains around 75-80% fewer components, lowering the cost.

 

"Our receiver is much simpler, containing just a quarter of the detectors used in a conventional coherent optical receiver," said Seb Savory, previously at UCL and now at the University of Cambridge.

 

The study was published in the Journal of Lightwave Technology.

 

Source:  http://timesofindia.indiatimes.com

Categorized in Internet Technology

Online research is not only easy for the researcher but also cost-effective: you can survey a large group online at very little cost. However, a researcher has several options to choose from when deciding on a research method, and to pick the best one for the task there are some things to consider. He can start by narrowing down the research topic, which is important even for internet research.

The first step to conducting any research is to be very clear about what you seek. A researcher should first make up his mind about what he is searching for. He may start by narrowing down his research area, i.e. the discipline. Then he can narrow it down further to a research topic or focus, to gain direction, and finally decide on and finalize his research question. This will give direction to what exactly needs to be searched.

After the research question is finalized, it will be easier for the researcher to decide what kind of sample he needs. A sample is drawn from the target population, so the target population needs to be defined: the researcher needs to know what group, age range and gender he is looking for, and the sample will be derived from that. There are many sampling techniques that can be used here, including random sampling, stratified sampling and quota sampling. Each has its pros and cons.
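Two of the sampling techniques mentioned above are easy to sketch with the standard library. The population here is made up for the example, and quota sampling is omitted for brevity.

```python
# Minimal illustrations of simple random and stratified sampling, using
# only the standard library. The population is invented for the example.
import random

population = [{"id": i, "gender": "F" if i % 2 else "M"} for i in range(100)]

def simple_random_sample(pop, n, seed=0):
    """Every member has an equal chance of selection."""
    return random.Random(seed).sample(pop, n)

def stratified_sample(pop, n_per_stratum, key, seed=0):
    """Sample separately within each stratum (here, gender)."""
    rng = random.Random(seed)
    strata = {}
    for person in pop:
        strata.setdefault(person[key], []).append(person)
    return {k: rng.sample(members, n_per_stratum)
            for k, members in strata.items()}

srs = simple_random_sample(population, 10)
strat = stratified_sample(population, 5, key="gender")
```

Stratified sampling guarantees each subgroup is represented in fixed numbers, which a simple random sample does not; that trade-off is one of the pros and cons referred to above.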

After the target population and the sample are finalized, there are other considerations to review. Is your sample small or large? Do you need qualitative or quantitative data? A researcher may sometimes require a little of both. Since every research method suits different requirements, once you have decided on the size of the sample and the type of information you seek, you can choose a research method.

Surveys and questionnaires are often used when you have a large sample. They are easy to distribute, fill in and collate. They are mostly used for quantitative information but can in some instances collect qualitative data too. If you are an online researcher, your task is made even easier: online tools like SurveyMonkey and Google Forms let you not only build a customized form, but also collate the data and present it in visually appealing graphs.
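The collation step those tools automate is simple to reproduce for closed-ended questions: count each answer and express it as a share of the total. The responses below are invented for illustration.

```python
# Collating closed-ended survey responses into counts and percentages,
# the kind of summary Google Forms produces automatically. Data invented.
from collections import Counter

responses = ["Yes", "No", "Yes", "Yes", "Undecided", "No", "Yes"]

def collate(answers):
    """Return {choice: (count, percent)} for a list of responses."""
    counts = Counter(answers)
    total = len(answers)
    return {choice: (n, round(100 * n / total, 1))
            for choice, n in counts.items()}

summary = collate(responses)
```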

Interviews are traditionally conducted via face-to-face conversation, however with the technology available, interviews now also encompass questions asked over Skype, and other video calling software. They are often used when short answers are not preferred in one or more of the questions, and follow-up questions might need to be asked. They can be used for either quantitative or qualitative information, but they are preferred for small samples over large samples. Since conducting interviews is time-consuming and labor intensive, using interviews for large samples might prove to be costly.

For even more qualitative, detail-based research, a case study is also an option. When a single person, program or project has to be evaluated, you examine its various dimensions and aspects and compile them. If a person is being observed, the researcher will observe them over a period of time and record their actions and reactions to stimuli. Case studies are used strictly for qualitative data and small samples, because they require effort and are time-intensive.

While most online research methods are similar to those used for traditional offline research, more caution should be observed when employing them. Researchers often face problems due to the anonymity maintained over the internet, as it becomes harder to establish real identity. It is also essential to obtain informed consent, since covert research is often frowned upon. If a researcher uses online research with care and chooses the right medium and method, internet research will be of immense help.

Categorized in Research Methods

AOFIRS

World's leading professional association of Internet Research Specialists - We deliver Knowledge, Education, Training, and Certification in the field of Professional Online Research. The AOFIRS is considered a major contributor in improving Web Search Skills and recognizes Online Research work as a full-time occupation for those that use the Internet as their primary source of information.
