Pete recommends – weekly highlights on cyber security issues – February 24, 2018

Subject: ‘Get back in here!’ Sandy Hook lessons spared lives in Florida shooting

Source: Reuters via Yahoo

(Reuters) – As soon as she heard “Code Red Lockdown” on her radio in a Florida high school library, Diana Haneski remembered how a fellow librarian saved lives by locking 22 people in a supply closet during the massacre at Sandy Hook Elementary School.

“She was there that day in Sandy Hook and because of her I knew what to do,” said Haneski, 57, a library media specialist at Marjory Stoneman Douglas High School in Parkland, Florida, where a former student is charged with shooting dead 17 people on Wednesday.

Subject: Facebook faces big challenge to prevent future U.S. election meddling

Source: Reuters via Yahoo

To know the identities of ad buyers, internet companies might need to duplicate the “know your customer” practices of banks and regularly share information with authorities, Ravel said. Facebook has said it will start requiring thorough documentation from election-related advertisers to verify their identity and location, beginning with U.S. elections this year. How extensive that vetting will be is unclear. “If you want to put up a theme page for a group, in the ordinary course you wouldn’t expect that a vendor like Facebook would require that sort of vetting,” said Dan Petalas, a former U.S. federal prosecutor. “The indictment really details an elaborate scheme that would be difficult to identify,” he said. Facebook said on Friday it was making “significant investments” to guard against future attacks and was working with the Federal Bureau of Investigation to deter election interference.

Subject: A look at FBI call center that failed to flag tip on shooter

Source: AP via Yahoo

It was one of hundreds of thousands of tips received annually by the FBI’s public tip line — 1-800-CALL-FBI. Here’s a look at how the agency analyzes tips:


Representatives at the public access line — the bureau’s tip center in West Virginia — are responsible for taking information from the public.

If a tip appears credible and warrants further investigation, it is supposed to be forwarded to an FBI field office.

Other tips sometimes involve non-criminal incidents and aren’t passed along, while threats against politicians, including the president, could be referred to the Secret Service.

Before the call center opened in 2012, field offices handled their own tips, which could inundate agents.


Attendants are trained to gather information that might aid investigations. They are taught listening and communications skills, undergo classroom training and receive on-the-job training with other representatives to learn how to write reports.


Subject: Google is Making it Easier For 911 To Find You in an Emergency – Slashdot

Source: Slashdot

When you call 911 from a cellphone, your location is typically sent to the call taker by a wireless carrier, but that information isn’t always accurate. Google may have a better approach: it tested its own location system across a few states in December and January, the Wall Street Journal reports. In the states where the tests took place, Google sent location data from a random selection of 911 callers using Android phones straight to the people taking those calls. The test included 50 call centers covering around 2.4 million people in Texas, Tennessee and Florida, and early reports of the results suggest the system is promising.

Subject: Lawsuits threaten infosec research — just when we need it most | ZDNet

Source: ZDNet

Steve Ragan, senior staff writer at tech news site CSO, and Dan Goodin, security editor at Ars Technica, were last year named defendants in two separate lawsuits. The cases are different, but they have a common theme: they are being sued by the companies covered in articles they wrote.

Although lawsuits targeting reporters, particularly on the security beat, are rare, legal threats are an occupational hazard reporters are all too aware of, ranging from a company threatening to call an editor and demand a correction “or else” to a full-blown lawsuit.

But the inevitable aftermath is a “chilling effect.” White-hat hackers and security researchers hesitate to report vulnerabilities and weaknesses to technology firms for fear of facing legal retribution.

With nation state attackers targeting elections and critical national security infrastructure on a near-daily basis, security research is needed more than ever.

NB other Security articles:


Subject: “Molly” Can Compile Your Social Media Profiles Into A Bot

Source: AndroidHeadlines

A new solution called “Molly” wants to gather information from various social media platforms and aggregate it under a unified profile meant to power a personalized bot capable of answering questions about someone. Founded by Chris Messina, who pioneered the now-ubiquitous hashtag convention that groups posts on Twitter, Instagram and other platforms, Molly aims to make it easier for people to learn more about one another. Currently, if someone wants to know more about a person they just met, they would likely check that person’s various social media accounts, scavenging for bits of information that would help them get a better idea of that person’s life, friends, habits, achievements, views, and other such data.

Subject: Government Science Site Becomes Prime Real Estate for Fake Movie Pirates

Source: Gizmodo

The National Center for Biotechnology Information website is an invaluable resource for finding scientific studies and papers. Recently, it also became a promotional vector for a potential phishing site offering pirated movie streams.

As Gizmodo discovered Monday evening, the science database was coming up in the first page of results for searches that included the word “watch” and the title of most movies currently playing in theaters, like Black Panther, Fifty Shades Freed, and Oceans 8. (I was trying to rent Thor: Ragnarok, okay?) A site-search of NCBI through Google for “full movie” returned nearly 40 pages of results, none more than a day old.

Given NCBI’s trustworthy reputation and .gov top-level domain, it tends to appear high in Google search results. After all, the overwhelming majority of content submitted to NCBI requires peer review by the scientific community. This high-ranking search score made it a valuable target for bad actors peddling pirated movies, who exploited a personal resume profile tool on NCBI called sciENcv.

Subject: Is it time to crack down on facial recognition?

Source: The Washington Examiner

At least half of American adults have their photo in a facial recognition network that authorities can search without a court order or meaningful privacy protections.

But that finding by Georgetown University researchers, who dubbed the networks a “perpetual line-up” in late 2016, generated little legislative activity.

Alvaro Bedoya, a Georgetown law professor who helped write the report, says he knows of a single lawmaker, Maryland state Del. Charles Sydnor, who took up his call for regulating official use of facial recognition tools.

Sydnor, a Baltimore County Democrat, later gave up on that bill, which would have required police to get a court order to search databases of driver’s license photos, after it died in committee last year.

Some privacy advocates say, however, that the issue’s time has come and that regulations are needed as states write rules for other emerging technologies, such as drones and cell-site simulators.

Illinois’ 2008 Biometric Information Privacy Act requires user consent and disclosure from companies about how they use biometric data such as fingerprints, iris scans, and face scans.

The National Telecommunications and Information Administration has twice begun developing industry best practices for facial recognition. A second attempt ended in 2016 after many privacy groups quit in protest. The NTIA reviews resulted only in urging transparency and consideration of voluntary restraint.

“Generally, machine learning will replicate and amplify any bias in the data,” Asaro said. “For many consumer applications, this bias simply means the systems don’t work well if you do not look like the people who designed the system. But if that system is going to be used to make a significant decision about you, track you, notify police to stop you, prevent you from boarding a plane, etc., then those error rates really do matter.”…

NB see section PRIVACY:

Posted in: Cybersecurity, Legal Research, Privacy, Social Media