
[Source: This article was published in theverge.com By Adi Robertson - Uploaded by the Association Member: Jay Harris]

Last weekend, in the hours after a deadly Texas church shooting, Google search promoted false reports about the suspect, suggesting that he was a radical communist affiliated with the antifa movement. The claims popped up in Google’s “Popular on Twitter” module, which made them prominently visible — although not the top results — in a search for the alleged killer’s name. Of course, this was just the latest instance of a long-standing problem; Google has made multiple similar missteps before. As usual, Google promised to improve its search results, while the offending tweets disappeared. But telling Google to retrain its algorithms, as appropriate as that demand is, doesn’t solve the bigger issue: the search engine’s monopoly on truth.

Surveys suggest that, at least in theory, very few people unconditionally believe news from social media. But faith in search engines — a field long dominated by Google — appears consistently high. A 2017 Edelman survey found that 64 percent of respondents trusted search engines for news and information, a slight increase from the 61 percent who did in 2012, and notably more than the 57 percent who trusted traditional media. (Another 2012 survey, from Pew Research Center, found that 66 percent of people believed search engines were “fair and unbiased,” almost the same proportion that did in 2005.) Researcher danah boyd has suggested that media literacy training conflated critical thinking with doing independent research using search engines. Instead of learning to evaluate sources, “[students] heard that Google was trustworthy and Wikipedia was not.”

GOOGLE SEARCH IS A TOOL, NOT AN EXPERT

Google encourages this perception, as do competitors like Amazon and Apple — especially as their products depend more and more on virtual assistants. Though Google’s text-based search page is plainly a flawed system, at least it presents Google search for what it is: a directory for the larger internet — and, at a more basic level, a useful tool for humans to master.

Google Assistant turns search into a trusted companion dispensing expert advice. The service has emphasized the idea that people shouldn’t have to learn special commands to “talk” to a computer, and demos of products like Google Home show off Assistant’s prowess at analyzing the context of simple spoken questions, then guessing exactly what users want. When bad information inevitably slips through, hearing it authoritatively spoken aloud is even more jarring than seeing it on a page.

Even if search results are overwhelmingly accurate, highlighting just a few bad results around topics like mass shootings is a major problem — especially if people are primed to believe that anything Google says is true. And for every advance Google makes to improve its results, there’s a host of people waiting to game the new system, forcing it to adapt again.

NOT ALL FEATURES ARE WORTH SAVING

Simply shaming Google over bad search results might actually play into its mythos, even if the goal is to hold the company accountable. It reinforces a framing where Google search’s ideal final state is a godlike, omniscient benefactor, not just a well-designed product. Yes, Google search should get better at avoiding obvious fakery, and at not creating a faux-neutral system that presents conspiracy theories next to hard reporting. But we should be wary of overemphasizing its ability, or that of any other technological system, to act as an arbiter of what’s real.

Alongside pushing Google to stop “fake news,” we should be looking for ways to limit trust in, and reliance on, search algorithms themselves. That might mean seeking handpicked video playlists instead of searching YouTube Kids, which recently drew criticism for surfacing inappropriate videos. It could mean focusing on reestablishing trust in human-led news curation, which has produced its own share of dangerous misinformation. It could mean pushing Google to kill, not improve, features that fail in predictable and damaging ways. At the very least, I’ve proposed that Google rename or abolish the Top Stories carousel, which offers legitimacy to certain pages without vetting their accuracy. Reducing the prominence of “Popular on Twitter” might make sense, too, unless Google clearly commits to strong human-led quality control.

The past year has made web platforms’ tremendous influence clearer than ever. Congress recently grilled Google, Facebook, and other tech companies over their role in spreading Russian propaganda during the presidential election. A report from The Verge revealed that unscrupulous rehab centers used Google to target people seeking addiction treatment. Simple design decisions can strip out the warning signs of a spammy news source. We have to hold these systems to a high standard. But when something like search screws up, we can’t just tell Google to offer the right answers. We have to operate on the assumption that it won’t ever have them.

Categorized in Search Engine

[Source: This article was published in heartland.org By Chris Talgo and Emma Kaden - Uploaded by the Association Member: Grace Irwin]

The U.S. Department of Justice announced it will launch a wide-ranging probe into possible antitrust behavior by social media and technology giants. Although no companies were specifically named, it’s not hard to guess which corporations will be in the limelight: Amazon, Facebook, and, of course, the mother of all technology titans, Google.

There is certainly a case to be made that these companies have been shady with private user data, stifled competition, and manipulated the flow of information to their benefit. But it’s worth considering whether or not a federal government investigation and possible destruction of these influential companies are really necessary.

As perhaps one of the most powerful companies in the world, Google has the most to lose if the federal government intervenes. According to research by Visual Capitalist, 90.8 percent of all internet searches are conducted via Google and its subsidiaries. For comparison’s sake, Google’s two main competitors — Bing and Yahoo! — account for less than 3 percent of total searches.

Due to its overwhelming dominance of the search engine industry, Google has nearly complete control over the global flow of information. In other words, Google determines the results of almost all web-based inquiries.

Of course, this is a potentially dangerous situation. With this amount of control over the dispersal of information, Google has the unique ability to sway public opinion, impact economic outcomes, and influence any and all matters of public information. For instance, by altering search results, Google can bury content that it deems unworthy of the public’s view.

The truth is, not only can Google do these things, it already has done them. The tech giant has a long history of manipulating search results and promoting information based on political bias.

On its face, one can easily see how supporting the regulation and breakup of Google could serve the public good. If executed properly (unlike most government interventions), Google web searches would be free of bias and manipulation. The possible unintended consequences of such an intrusion, however, could dwarf any benefits it might bring.

The internet is the most highly innovative and adaptive medium ever developed. In less than two decades, it has brought about a revolution in most aspects of our daily lives, from how we conduct commerce and communicate to how we travel, learn, and access information. The primary reason for this breathtaking evolution is the complete lack of government regulation, intervention, and intrusion into the infrastructure of the internet.

Right now, Google serves as one of the primary pillars of the internet framework. Yes, Google is far from perfect — after all, it is run by humans — but it is an essential component to a thriving internet ecosystem. But this does not mean Google will forever serve as the foundation of the internet — 20 years ago, it didn’t even exist, and 20 years from now, something new will most likely take its place.

As tempting as it is to tinker with the internet and the companies that are currently fostering the dynamic growth and innovation that make the internet so unique, regulating such a complex and intricate system could lead to its downfall at worst and its petrification at best.

Wanting to keep Google from manipulating consumers is a noble notion, but this should happen from the bottom up, not the top down. Consumers should be the ultimate arbiters of which internet-based companies thrive.

Remember (for those who are old enough) that when the internet became mainstream, navigating it through search engines was extremely primitive and challenging. Early search engines such as AltaVista, Ask Jeeves, and Infoseek barely met consumer expectations.

Fast forward to 2019, and ponder how much more convenient Google has made everyday life. From optimized search capability to email to video sharing to navigation, Google provides an all-inclusive package of services that billions of people find useful — at this point in time. Someday, though, a company will surely produce a product superior to Google that protects user data, takes bias out of the equation, and allows for robust competition, all while maintaining and elevating the quality of service. No doubt customers will flock to it.

The awesome, rapid technology innovations of the past 20 years are due in large part to a lack of government regulation. Imagine what progress could be made in the years to come if the government refrains from overregulating and destroying internet companies. That’s not to say that the government shouldn’t take action against illegal activities, but overregulating this dynamic industry to solve trivial matters would do much more harm than good.

The government should take a laissez-faire approach to regulation, especially when it comes to the internet. Consumers should be able to shape industries according to their needs, wants, and desires without the heavy hand of government intervention.

[Originally Published at American Spectator]

Categorized in Search Engine

[Source: This article was published in techcrunch.com By Catherine Shu - Uploaded by the Association Member: Clara Johnson]

When Facebook Graph Search launched six years ago, it was meant to help users discover content across public posts on the platform. Since then, the feature stayed relatively low-profile for many users (its last major announcement was in 2014 when a mobile version was rolled out) but became a valuable tool for many online investigators who used it to collect evidence of human rights abuses, war crimes and human trafficking. Last week, however, many of them discovered that Graph Search features had suddenly been turned off, reports Vice.

Graph Search let users search in plain language (i.e. sentences written the way people talk, not just keywords), but more importantly, it also let them filter search results by very specific criteria. For example, users could find who had liked a page or photo, when someone had visited a city, or whether they had been in the same place at the same time as another person. Despite the obvious potential for privacy issues, Graph Search was also an important resource for organizations like Bellingcat, an investigative journalism website that used it to document Saudi-led airstrikes in Yemen for its Yemen Project.

Other investigators also used Graph Search to build tools like StalkScan, but the removal of Graph Search means they have had to suspend their services or offer them in a very limited capacity. For example, StalkScan’s website now has a notice that says:

“As of June 6th, you can scan only your own profile with this tool. After two years and 28M+ StalkScan sessions, Facebook decided to make the Graph Search less transparent. As usual, they did this without any communication or dialogue with activists and journalists that used it for legitimate purposes. The creepy graph search itself still exists, but is now less accessible and more difficult to use. Make sure to check yourself with this tool, since your data is still out there!”

Facebook may be trying to take a more cautious stance because it is still dealing with the fallout from several major security lapses, including the Cambridge Analytica data scandal, as well as the revelation earlier this year that it had stored hundreds of millions of passwords in plain text.

In a statement to Vice, a Facebook spokesperson said “The vast majority of people on Facebook search using keywords, a factor which led us to pause some aspects of graph search and focus more on improving keyword search. We are working closely with researchers to make sure they have the tools they need to use our platform.” But one of Vice’s sources, a current employee at Facebook, said within the company there is “lots of internal and external struggle between giving access to info so people can find friends or research things (like Bellingcat), and protecting it.”

TechCrunch has contacted Facebook for more information.

Categorized in Internet Search

[Source: This article was published in thegroundtruthproject.org By Josh Coe - Uploaded by the Association Member: James Gill] 

Last week ProPublica uncovered a secret Facebook group for Border Patrol agents in which a culture of xenophobia and sexism seems to have thrived. The story was supported by several screenshots of offensive posts by “apparently legitimate Facebook profiles belonging to Border Patrol agents, including a supervisor based in El Paso, Texas, and an agent in Eagle Pass, Texas,” the report’s author A.C. Thompson wrote.

This is only the most recent example of the stories that can be found by digging into Facebook. Although Instagram is the new social media darling, Facebook, which also owns Instagram, still has twice the number of users and remains a popular place for conversations and interactions around specific topics. 

Although many groups are private and you might need an invitation or a source inside them to gain access, the world’s largest social network is a trove of publicly accessible information for reporters; you just need to know where to look.

I reached out to Brooke Williams, an award-winning investigative reporter and Associate Professor of the Practice of Computational Journalism at Boston University, and Henk van Ess, lead investigator for Bellingcat.com, whose fact-checking work using social media has earned a large online following, to talk about how they use Facebook to dig for sources and information for their investigations. Their answers were edited for length and clarity.

1. Use visible community groups like a phonebook

While it remains unclear how Thompson gained access to the Border Patrol group, Williams says you can start by looking at groups that are public and the people who care about the issues you’re reporting on.

“I have quite a bit of success with finding sources in the community,” says Williams, “people on the ground who care about local issues, in particular, tend to form Facebook groups.” 

Williams uses Facebook groups as a phonebook of sorts when looking for sources.  For example, if a helicopter crashes in a neighborhood, she suggests searching for that specific neighborhood on Facebook, using specific keywords like the name of local streets or the particular district to find eyewitnesses. Her neighborhood in the Boston area, she recalls, has its own community page.

 Williams also recommends searching through Google Groups, where organizations often leave “breadcrumbs” in their message boards.

 “It’s not all of them,” she notes about these groups, “but it’s the ones that have their privacy settings that way.”

 After speaking with Williams, I spent a few hours poking around in Google Groups and discovered a surprising amount of local and regional organizations neglected their privacy settings. When looking through these group messages, I had a lot of success using keyword searches like “meeting minutes” or “schedule,” through which documents and contact information of “potential sources” were available. While you can’t necessarily see who the messages are being sent to, the sender’s email is often visible.

This is just one example of a group with available contacts.

[Screenshots: Google Groups searches for “Southwest Baltimore” and “Meeting Minutes,” with results and contact details redacted]

2. Filter Facebook with free search tools created by journalists for journalists

Despite privacy settings, there’s plenty of low-hanging fruit on social media sites from Facebook to Twitter to Instagram, as a 2013 investigation by the New Orleans-based journalism organization The Lens shows. The nonprofit’s reporters used the “family members” section of a charter school CEO’s Facebook page to expose her nepotistic hiring of six relatives.

 “But if you know how to filter stuff… you can do way more,” van Ess says.

In 2017, van Ess helped dispel a hoax story about a rocket launcher found on the side of a road in Egypt using a combination of social media sleuthing, Google Earth and a free tool that creates panoramas from video to determine when the video clip of the launcher was originally shot. More recently, he used similar methods, as well as Google Chrome plug-ins for Instagram and the help of more than 60 Twitter users, to track down the Netherlands’ most wanted criminal.

He says journalists often overlook Facebook as a resource because “99 percent” of the stuff on Facebook is “photos of cats, dogs, food or memes,” but there’s useful information if you can sort through the deluge of personal posts. That’s why he created online search tools graph.tips and whopostedwhat.com so that investigators like himself had a “quick method to filter social media.”

Early-career journalists who’ve turned around quick breaking news blips or crime blotters for a paper’s city desk might be familiar with the Twitter tool TweetDeck (if not, get on it!). Who posted what? offers reporters a similar way to search keywords and find the accounts posting about a topic or event.

Here’s how it works: you can type in a keyword and choose a date range in the “Timerange” section to find all recent postings about that keyword. Clicking on a Facebook profile of interest, you can then copy and paste that Facebook account’s URL address into a search box on whopostedwhat.com to generate a “UID” number. This UID can then be used in the “Posts directly from/Posts associated with” section to search for instances when that profile mentioned that keyword. 
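For those who prefer to see the moving parts, here is a minimal Python sketch of that same workflow: a keyword, a date range, and (optionally) a profile UID. Everything here is illustrative. The SEARCH_TEMPLATE below is a hypothetical placeholder rather than Facebook's real endpoint, since constructing the actual query URLs is precisely the job that graph.tips and whopostedwhat.com do for you.

```python
from datetime import date, datetime, timezone

# Hypothetical URL pattern, used only to illustrate the keyword +
# date range + UID workflow described above. It is NOT Facebook's
# real search endpoint; whopostedwhat.com builds the real queries.
SEARCH_TEMPLATE = ("https://www.facebook.com/search/posts/"
                   "?q={keyword}&since={since}&until={until}&author={uid}")

def to_epoch(d: date) -> int:
    """Convert a calendar date to a Unix timestamp at midnight UTC."""
    return int(datetime(d.year, d.month, d.day, tzinfo=timezone.utc).timestamp())

def build_search_url(keyword: str, start: date, end: date, uid: str = "") -> str:
    """Assemble an illustrative search URL from the three inputs the tool asks for."""
    return SEARCH_TEMPLATE.format(
        keyword=keyword.replace(" ", "+"),
        since=to_epoch(start),
        until=to_epoch(end),
        uid=uid,
    )

# Example: posts mentioning "meeting minutes" during the first week of July 2019,
# optionally restricted to a profile whose UID you looked up with the tool.
print(build_search_url("meeting minutes", date(2019, 7, 1), date(2019, 7, 7), uid="123456789"))
```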

These tools are not foolproof. Oftentimes, searches will yield nothing. Currently, some of the site’s functions (like searching a specific day, month or year) don’t seem to work (more on that in the next section), but the payoff can be big.

“It enables you essentially to directly search somebody’s timeline rather than scrolling and scrolling and scrolling and having it load,” says Williams of graph.tips, which she employs in her own investigations. “You can’t control the timeline, but you can see connections between people which is applicable, I found, in countries other than the States.”

 While she declined to provide specific examples of how she uses graph.tips—she is using van Ess’s tools in a current investigation—she offered generalized scenarios in which it could come in handy. 

For instance, journalists can search “restaurants visited by” and type in the name of two politicians. “Or you could, like, put in ‘photos tagged with’ and name a politician or a lobbyist,” she says. She says location tagging is especially popular with people outside the US. 

Facebook’s taken a lot of heat recently about privacy issues, so many OSINT (open-source intelligence) tools have ceased to work, or, like graph.tips, have had to adapt.

 3. Keep abreast of the changes to the platforms 

The trouble with these tools is their dependence on the social platform’s whim, or “ukase,” as van Ess likes to call it.

For example, on June 7, Facebook reduced the functionality of Graph Search, rendering van Ess’s graph.tips more difficult to use.

According to van Ess, Facebook blocked his attempts to fix graph.tips five times and it took another five days before he figured out a method to get around the new restrictions by problem-solving with the help of Twitter fans. The result is graph.tips/facebook.html, which he says takes longer than the original graph.tips, but allows you to search Facebook much in the same way as the original. 

Even though the site maintains the guarantee that it’s “completely free and open, as knowledge should be,” van Ess now requires first-time users to ask him directly to use the tool, in order to filter through the flood of requests he claims he has received. 

I have not yet been given access to the new graph.tips and can’t confirm his claims, but van Ess welcomes investigators interested in helping to improve its functionality. Much like his investigations, he crowdsources improvements to his search tools. Graph.tips users constantly iron out issues with the reworked tool on GitHub, which can be used like a subreddit for software developers.  

Ongoing user feedback, as well as instructions on how to use van Ess’s new Facebook tool, can be found here. A similar tool updated as recently as July 3 and created by the Czech OSINT company Intel X is available here, though information regarding this newer company is sparse. By contrast, all van Ess’s tools are supported by donations. 

The OSINT community has its own subreddit, where members share the latest tools of their trade. 

4. Use other social media tools to corroborate your findings

When it comes to social media investigations, van Ess says you need to combine tools with “strategy.” In other words, learn the language of the search tool; he shared this helpful blog post listing all of the advanced search operator codes a journalist would need while using Twitter’s advanced search feature.
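To make “learning the language of the search tool” concrete, here is a short Python sketch that assembles a Twitter advanced-search URL from a handful of widely documented operators (quoted phrases, from:, since:, until:). The handle and dates in the example are made up for illustration; van Ess’s recommended blog post covers many more operators.

```python
from urllib.parse import quote

def twitter_search_url(phrase: str = "", from_user: str = "",
                       since: str = "", until: str = "") -> str:
    """Compose a Twitter advanced-search query from a few common operators."""
    parts = []
    if phrase:
        parts.append(f'"{phrase}"')        # exact-phrase match
    if from_user:
        parts.append(f"from:{from_user}")  # tweets sent by this account
    if since:
        parts.append(f"since:{since}")     # on or after YYYY-MM-DD
    if until:
        parts.append(f"until:{until}")     # before YYYY-MM-DD
    return "https://twitter.com/search?q=" + quote(" ".join(parts))

# Example (hypothetical handle and dates): what did this account tweet
# about "border patrol" during the first week of July 2019?
print(twitter_search_url("border patrol", from_user="example_reporter",
                         since="2019-07-01", until="2019-07-08"))
```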

Williams also had a Twitter recommendation: TwXplorer. Created by the Knight Lab, this helpful tool allows reporters to filter Twitter for the 500 most recent uses of a word or phrase in 12 languages. The application will then list all the handles tweeting about that phrase as well as the most popular related hashtags.

Bonus: More search tools 

If you want even more open-source tools honed for journalistic purposes, Mike Reilley of The Society of Professional Journalists published this exhaustive and continuously updated list of online applications last month. Be warned though: not all of them are free to use.

Categorized in Investigative Research

[Source: This article was published in techdirt.com By Julia Angwin, ProPublica - Uploaded by the Association Member: Dana W. Jimenez]

The East German secret police, known as the Stasi, were infamously intrusive. They amassed dossiers on about a quarter of the country’s population during the Communist regime.

But their spycraft — while incredibly invasive — was also technologically primitive by today's standards. While researching my book Dragnet Nation, I obtained a hand-drawn social network graph and other files from the Stasi Archive in Berlin, where German citizens can see files kept about them and media can access some files, with the names of the people who were monitored removed.

The graphic shows forty-six connections, linking a target to various people (an "aunt," "Operational Case Jentzsch," presumably Bernd Jentzsch, an East German poet who defected to the West in 1976), places ("church"), and meetings ("by post, by phone, meeting in Hungary").

Gary Bruce, an associate professor of history at the University of Waterloo and the author of "The Firm: The Inside Story of the Stasi," helped me decode the graphic and other files. I was surprised at how crude the surveillance was. "Their main surveillance technology was mail, telephone, and informants," Bruce said.

Another file revealed a low-level surveillance operation called an IM-Vorgang aimed at recruiting an unnamed target to become an informant. (The names of the targets were redacted; the names of the Stasi agents and informants were not.) In this case, the Stasi watched a rather boring high school student who lived with his mother and sister in a run-of-the-mill apartment. The Stasi obtained a report on him from the principal of his school and from a club where he was a member. But they didn't have much on him — I've seen Facebook profiles with far more information.

A third file documented a surveillance operation known as an OPK, for Operative Personenkontrolle, of a man who was writing oppositional poetry. The Stasi deployed three informants against him but did not steam open his mail or listen to his phone calls. The regime collapsed before the Stasi could do anything further.

I also obtained a file that contained an "observation report," in which Stasi agents recorded the movements of a forty-year-old man for two days — September 28 and 29, 1979. They watched him as he dropped off his laundry, loaded up his car with rolls of wallpaper, and drove a child in a car "obeying the speed limit," stopping for gas and delivering the wallpaper to an apartment building. The Stasi continued to follow the car as a woman drove the child back to Berlin.

The Stasi agent appears to have started following the target at 4:15 p.m. on a Friday evening. At 9:38 p.m., the target went into his apartment and turned out the lights. The agent stayed all night and handed over surveillance to another agent at 7:00 a.m. Saturday morning. That agent appears to have followed the target until 10:00 p.m. From today's perspective, this seems like a lot of work for very little information.

And yet, the Stasi files are an important reminder of what a repressive regime can do with so little information.

Categorized in Search Engine

[Source: This article was Published in tamebay.com By Sasha Fedorenko - Uploaded by the Association Member: Issac Avila]

What social media cannot do is deliver the next day or same day, and that is where Amazon excels. All young consumers want everything now, so fast shipping becomes a major factor in where a person shops. This is why Amazon will continue to dominate.

It is true. Not myself, but my partner and her friends (not even Gen Z, but in their 20s and early 30s) have pretty much been brought up by social media. She gets so many of her ideas from Insta, Pinterest, and Facebook.

Wedding cake, singer, flowers, invitations, bridesmaid dresses, even my groomsmen’s gifts have all come from social media for a wedding next month. We even picked our seats on our flight for the honeymoon because of a video someone put on YouTube, which saved us a few hundred quid on pointless upgrades.
You can add links to websites or marketplaces on most social sites.

Direct checkout is going to be the big game changer. Amazon will still dominate on price and be the place for big box shifters and penny chasing, but social will become so much more important for smaller traders.

Categorized in Social

[Source: This article was Published in techcrunch.com By Anthony Ha - Uploaded by the Association Member: Grace Irwin]

1. Facebook announces Libra cryptocurrency: All you need to know

Facebook has finally revealed the details of its cryptocurrency Libra, which will let you buy things or send money to people with nearly zero fees.

The company won’t fully control Libra, but will instead get just a single vote in its governance, like other founding members of the Libra Association, including Visa, Uber, and Andreessen Horowitz, which have invested at least $10 million each into the project’s operations.

2. Amazon’s Twitch acquired social networking platform Bebo for under $25M to bolster its esports efforts

Bebo lives!

3. The future of diversity and inclusion in tech

Silicon Valley is entering a new phase in its quest for diversity and inclusion in the technology industry. Some advocates call this part “the end of the beginning,” as Code2040 CEO Karla Monterroso put it.

Social

4. Palm’s tiny phone is available unlocked at $350

The phone is available at “only” $350. That’s cheap compared to many full-sized, mid-tier handsets, but cheapness is a relative concept — this still seems like a high price for a second phone.

5. Carmen Sandiego returns to Google Earth with a new caper

Google Earth first made use of its rich global 3D visualization as a backdrop for a Carmen Sandiego tie-in back in March, but today there’s a new adventure to explore.

6. Google Calendar is down, it’s not just you

Maybe it’ll be back up by the time you read this newsletter.

7. How to negotiate term sheets with strategic investors

Negotiating a term sheet with a strategic investor necessitates a different set of considerations. (Extra Crunch membership required.)

Categorized in Social

[This article is originally published in zdnet.com written by Steven J. Vaughan-Nichols - Uploaded by AIRS Member: Eric Beaudoin]

For less than $100, you can have an open-source powered, easy-to-use server, which enables you -- and not Apple, Facebook, Google, or Microsoft -- to control your view of the internet.

On today's internet, most of us find ourselves locked into one service provider or the other. We find ourselves tied down to Apple, Facebook, Google, or Microsoft for our e-mail, social networking, calendaring -- you name it. It doesn't have to be that way. The FreedomBox Foundation has just released its first commercially available FreedomBox: The Pioneer Edition FreedomBox Home Server Kit. With it, you -- not some company -- control your internet-based services.

The Olimex Pioneer FreedomBox costs less than $100 and is powered by a single-board computer (SBC), the open source hardware-based Olimex A20-OLinuXino-LIME2 board. This SBC is powered by a 1GHz A20/T2 dual-core Cortex-A7 processor and dual-core Mali 400 GPU. It also comes with a gigabyte of RAM, a high-speed 32GB microSD card for storage with the FreedomBox software pre-installed, two USB ports, SATA-drive support, a Gigabit Ethernet port, and a backup battery.

Doesn't sound like much, does it? But here's the thing: You don't need much to run a personal server.

Sure, some of us have been running our own servers at home, the office, or at a hosting site for ages. I'm one of those people. But, it's hard to do. What the FreedomBox brings to the table is the power to let almost anyone run their own server without being a Linux expert.

The supplied FreedomBox software is based on Debian Linux. It's designed from the ground up to make it as hard as possible for anyone to exploit your data. It does this by putting you in control of your own corner of the internet at home. Its simple user interface lets you host your own internet services with little expertise.

You can also just download the FreedomBox software and run it on your own SBC. The Foundation recommends using the Cubietruck, Cubieboard2, BeagleBone Black, A20 OLinuXino Lime2, A20 OLinuXino MICRO, and PC Engines APU. It will also run on most newer Raspberry Pi models.

Want an encrypted chat server to replace WhatsApp? It's got that. A VoIP server? Sure. A personal website? Of course! Web-based file sharing à la Dropbox? You bet. A Virtual Private Network (VPN) server of your own? Yes, that's essential for its mission.

The software stack isn't perfect. This is still a work in progress. So, for example, it still doesn't have a personal email server or federated social networking, such as GNU Social and Diaspora, to provide a privacy-respecting alternative to Facebook. That's not because they won't run on a FreedomBox; they will. What the developers haven't been able to do yet is make them easy enough to set up for anyone, not just someone with Linux sysadmin chops. That will come in time.

As the Foundation stated, "The word 'Pioneer' was included in the name of these kits in order to emphasize the leadership required to run a FreedomBox in 2019. Users will be pioneers both because they have the initiative to define this new frontier and because their feedback will make FreedomBox better for its next generation of users."

To help you get up to speed, the FreedomBox community will be offering free technical support for owners of the Pioneer Edition FreedomBox servers on its support forum. The Foundation also welcomes new developers to help it perfect the FreedomBox platform.

Why do this?  Eben Moglen, Professor of Law at Columbia Law School, saw the mess we were heading toward almost 10 years ago: "Mr. Zuckerberg has attained an unenviable record: he has done more harm to the human race than anybody else his age." That was before Facebook proved itself to be totally incompetent with security and sold off your data to Cambridge Analytica to scam 50 million US Facebook users with personalized anti-Clinton and pro-Trump propaganda in the 2016 election.

It didn't have to be that way. In an interview, Moglen told me this: "Concentration of technology is a surprising outcome of cheap hardware and free software. We could have had a world of peers. Instead, the net we built is the net we didn't want. We're in an age of surveillance with centralized control. We're in a world, which encourages swiping, clicking, and flame throwing."

With FreedomBox, "We can undo this. We can make it possible for ordinary people to provide internet services. You can have your own private messaging, services without a man in the middle watching your every move." 

We can, in short, rebuild the internet so that we, and not multi-billion dollar companies, are in charge.

I like this plan.

Categorized in Internet Privacy

[This article is originally published in searchengineland.com written by Greg Sterling - Uploaded by AIRS Member: Eric Beaudoin]

Facebook users turned to Google search or went directly to publishers.

An interesting thing happened on August 3. Facebook was down for nearly an hour in Europe and North America. During that time, many users who were shut out of their Facebook News Feeds went directly to news sites or searched for news.

Direct traffic spikes during a Facebook outage. According to data presented by Chartbeat at the recent Online News Association conference, direct traffic to news publisher sites increased 11 percent (in large part from app-driven traffic), and search traffic to news sites increased 8 percent during the outage, which occurred a little after 4:00 p.m.

According to late 2017 data from the Pew Research Center:

Just under half (45 percent) of U.S. adults use Facebook for news. Half of Facebook’s news users get news from that social media site alone, with just one-in-five relying on three or more sites for news.

Algorithm change sent people back to search. From that perspective, it makes sense that when Facebook is unavailable, people will turn to direct sources to get news. Earlier this year, however, Facebook began to “fix” the News Feed by minimizing third-party “commercial content.” This impacted multiple entities, but most news publishers saw their referral traffic from Facebook decline, a pattern that predated the algorithm change.

Starting in 2017, there’s evidence that as Facebook referrals have declined, more people have turned to Google to obtain their news fix. Users no longer able to get news as easily from Facebook are going to Google or directly to news sources to get it.

Why it matters to marketers. These traffic trends underscore opportunities for content creators to capitalize on well-optimized pages (and possibly ads) to reach news-seeking audiences in search. They also highlight programmatic and direct-buying ad opportunities for marketers to reach these audiences on publisher sites.

Categorized in News & Politics

[Source: This article was published in wired.com By Issie Lapowsky - Contributed by Member: Bridget Miller]

IN LATE JULY, a group of high-ranking Facebook executives organized an emergency conference call with reporters across the country. That morning, Facebook’s chief operating officer, Sheryl Sandberg, explained, they had shut down 32 fake pages and accounts that appeared to be coordinating disinformation campaigns on Facebook and Instagram. They couldn’t pinpoint who was behind the activity just yet, but said the accounts and pages had loose ties to Russia’s Internet Research Agency, which had spread divisive propaganda like a flesh-eating virus throughout the 2016 US election cycle.

Facebook was only two weeks into its investigation of this new network, and the executives said they expected to have more answers in the days to come. Specifically, they said some of those answers would come from the Atlantic Council's Digital Forensics Research Lab. The group, whose mission is to spot, dissect, and explain the origins of online disinformation, was one of Facebook’s newest partners in the fight against digital assaults on elections around the world. “When they do that analysis, people will be able to understand better what’s at play here,” Facebook’s head of cybersecurity policy, Nathaniel Gleicher, said.

Back in Washington, DC, meanwhile, DFRLab was still scrambling to understand just what was going on. Facebook had alerted them to the eight suspicious pages the day before the press call. The lab had no access to the accounts connected to those pages, nor to any information on Facebook’s backend that would have revealed strange patterns of behavior. They could only see the parts of the pages that would have been visible to any other Facebook user before the pages were shut down—and they had less than 24 hours to do it.

“We screenshotted as much as possible,” says Graham Brookie, the group’s 28-year-old director. “But as soon as those accounts are taken down, we don’t have access to them... We had a good head start, but not a full understanding.” DFRLab is preparing to release a longer report on its findings this week.

As a company, Facebook has rarely been one to throw open its doors to outsiders. That started to change after the 2016 election, when it became clear that Facebook and other tech giants missed an active, and arguably incredibly successful, foreign influence campaign going on right under their noses. Faced with a backlash from lawmakers, the media, and their users, the company publicly committed to being more transparent and to work with outside researchers, including at the Atlantic Council.

'[Facebook] is trying to figure out what the rules of the road are, frankly, as are research organizations like ours.'

GRAHAM BROOKIE, DIGITAL FORENSICS RESEARCH LAB

DFRLab is a scrappier, substantially smaller offshoot of the 57-year-old bipartisan think tank based in DC, and its team of 14 is spread around the globe. Using open source tools like Google Earth and public social media data, they analyze suspicious political activity on Facebook, offer guidance to the company, and publish their findings in regular reports on Medium. Sometimes, as with the recent batch of fake accounts and pages, Facebook feeds tips to the DFRLab for further digging. It's an evolving, somewhat delicate relationship between a corporate behemoth that wants to appear transparent without ceding too much control or violating users' privacy, and a young research group that’s ravenous for intel and eager to establish its reputation.

“This kind of new world of information sharing is just that, it’s new,” Brookie says. “[Facebook] is trying to figure out what the rules of the road are, frankly, as are research organizations like ours.”

The lab got its start almost by accident. In 2014, Brookie was working for the National Security Council under President Obama when the military conflict broke out in eastern Ukraine. At the time, he says, the US intelligence community knew that Russian troops had invaded the region, but given the classified nature of their intel they had no way to prove it to the public. That allowed the Russian government to continue denying their involvement.

What the Russians didn’t know was that proof of their military surge was sitting right out in the open online. A working group within the Atlantic Council was among the groups busy sifting through the selfies and videos that Russian soldiers were uploading to sites like Instagram and YouTube. By comparing the geolocation data on those posts to Google Earth street view images that could reveal precisely where the photos were taken, the researchers were able to track the soldiers as they made their way through Ukraine.

“It was old-school Facebook stalking, but for classified national security interests,” says Brookie.
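The general idea can be sketched in a few lines of Python. This is only a simplified illustration of reading location metadata from an original photo file with the Pillow library; it is not the researchers’ actual tooling, and most social platforms strip EXIF data on upload, which is why visual matching against Google Earth imagery mattered so much. The filename below is hypothetical.

```python
from PIL import Image                      # pip install Pillow
from PIL.ExifTags import TAGS, GPSTAGS

def _to_decimal(dms, ref):
    """Convert EXIF degrees/minutes/seconds into a signed decimal coordinate."""
    degrees, minutes, seconds = (float(v) for v in dms)
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

def gps_coordinates(path):
    """Return (latitude, longitude) from a JPEG's EXIF data, or None if absent."""
    exif = Image.open(path)._getexif() or {}          # classic JPEG EXIF accessor
    gps_raw = next((v for k, v in exif.items() if TAGS.get(k) == "GPSInfo"), None)
    if not gps_raw:
        return None
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}
    try:
        lat = _to_decimal(gps["GPSLatitude"], gps["GPSLatitudeRef"])
        lon = _to_decimal(gps["GPSLongitude"], gps["GPSLongitudeRef"])
    except KeyError:
        return None
    return lat, lon

print(gps_coordinates("soldier_selfie.jpg"))  # hypothetical local file
```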

This experiment formed the basis of DFRLab, which has continued using open source tools to investigate national security issues ever since. After the initial report on eastern Ukraine, for instance, DFRLab followed up with a piece that used satellite images to prove that the Russian government had misled the world about its air strikes on Syria; instead of hitting ISIS territory and oil reserves, as it claimed, it had in fact targeted civilian populations, hospitals, and schools.

But Brookie, who joined DFRLab in 2017, says the 2016 election radically changed the way the team worked. Unlike Syria or Ukraine, where researchers needed to extract the truth in a low-information environment, the election was plagued by another scourge: information overload. Suddenly, there was a flood of myths to be debunked. DFRLab shifted from writing lengthy policy papers to quick hits on Medium. To expand its reach even further, the group also launched a series of live events to train other academics, journalists, and government officials in their research tactics, creating even more so-called “digital Sherlocks.”

'Sometimes a fresh pair of eyes can see something we may have missed.'

KATIE HARBATH, FACEBOOK

This work caught Facebook’s attention in 2017. After it became clear that bad actors, including Russian trolls, had used Facebook to prey on users' political views during the 2016 race, Facebook pledged to better safeguard election integrity around the world. The company has since begun staffing up its security team, developing artificial intelligence to spot fake accounts and coordinated activity, and enacting measures to verify the identities of political advertisers and administrators for large pages on Facebook.

According to Katie Harbath, Facebook’s director of politics, DFRLab's skill at tracking disinformation not just on Facebook but across platforms felt like a valuable addition to this effort. The fact that the Atlantic Council’s board is stacked with foreign policy experts including former secretary of state Madeleine Albright and Stephen Hadley, former national security adviser to President George W. Bush, was an added bonus.

“They bring that unique, global view set of both established foreign policy people, who have had a lot of experience, combined with innovation and looking at problems in new ways, using open source material,” Harbath says.

That combination has helped the Atlantic Council attract as much as $24 million a year in contributions, including from government and corporate sponsors. As the think tank's profile has grown, however, it has also been accused of peddling influence for major corporate donors like FedEx. Now, after committing roughly $1 million in funding to the Atlantic Council, the bulk of which supports the DFRLab’s work, Facebook is among the organization's biggest sponsors.

But for Facebook, giving money away is the easy part. The challenge now is figuring out how best to leverage this new partnership. Facebook is a $500 billion tech juggernaut with 30,000 employees in offices around the world; it's hard to imagine what a 14-person team at a non-profit could tell them that they don't already know. But Facebook's security team and DFRLab staff swap tips daily through a shared Slack channel, and Harbath says that Brookie’s team has already made some valuable discoveries.

During the recent elections in Mexico, for example, DFRLab dissected the behavior of a political consulting group called Victory Lab that was spamming the election with fake news, driven by Twitter bots and Facebook likes that appeared to have been purchased in bulk. The team found that a substantial number of those phony likes came from the same set of Brazilian Facebook users. What's more, they all listed the same company, Frases & Versos, as their employer.
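The underlying check is simple enough to sketch in Python. Assuming you already have a list of the accounts that liked the suspect pages (the likers records below are invented for illustration, not DFRLab’s data), grouping them by a shared profile field such as the listed employer makes a cluster like this one stand out immediately:

```python
from collections import Counter

# Hypothetical export of accounts that liked the pages under investigation.
# In practice this would come from scraping or a platform data export.
likers = [
    {"name": "Account A", "country": "BR", "employer": "Frases & Versos"},
    {"name": "Account B", "country": "BR", "employer": "Frases & Versos"},
    {"name": "Account C", "country": "MX", "employer": ""},
    {"name": "Account D", "country": "BR", "employer": "Frases & Versos"},
    # ... thousands more in a real dataset
]

def suspicious_clusters(accounts, field="employer", threshold=3):
    """Flag values of a profile field shared by an unusually large number of likers."""
    counts = Counter(a[field] for a in accounts if a.get(field))
    return [(value, n) for value, n in counts.most_common() if n >= threshold]

print(suspicious_clusters(likers))   # -> [('Frases & Versos', 3)]
```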

The team dug deeper, looking into the managers of Frases & Versos, and found that they were connected with an entity called PCSD, which maintained a number of pages where Facebook users could buy and sell likes, shares, and even entire pages. With the Brazilian elections on the horizon in October, Brookie says, it was critical to get the information in front of Facebook immediately.

"We flagged it for Facebook, like, 'Holy cow this is interesting,'" Brookie remembers. The Facebook team took on the investigation from there. On Wednesday, the DFRLab published its report on the topic, and Facebook confirmed to WIRED that it had removed a network of 72 groups, 46 accounts, and five pages associated with PCSD.

"We’re in this all day, every day, looking at these things," Harbath says. "Sometimes a fresh pair of eyes can see something we may have missed."

Of course, Facebook has missed a lot in the past few years, and the partnership with the DFRLab is no guarantee it won't miss more. Even as it stumbles toward transparency, the company remains highly selective about which sets of eyes get to search for what they've missed, and what they get to see. After all, Brookie's team can only examine clues that are already publicly accessible. Whatever signals Facebook is studying behind the scenes remain a mystery.

Categorized in Internet Privacy
