Showing posts with label ICTEC.

25 June 2017

Hard Questions: How We Counter Terrorism

By Monika Bickert

In the wake of recent terror attacks, people have questioned the role of tech companies in fighting terrorism online. We want to answer those questions head on. We agree with those who say that social media should not be a place where terrorists have a voice. We want to be very clear how seriously we take this — keeping our community safe on Facebook is critical to our mission.

In this post, we’ll walk through some of our behind-the-scenes work, including how we use artificial intelligence to keep terrorist content off Facebook, something we have not talked about publicly before. We will also discuss the people who work on counterterrorism, some of whom have spent their entire careers combating terrorism, and the ways we collaborate with partners outside our company.

Our stance is simple: There’s no place on Facebook for terrorism. We remove terrorists and posts that support terrorism whenever we become aware of them. When we receive reports of potential terrorism posts, we review those reports urgently and with scrutiny. And in the rare cases when we uncover evidence of imminent harm, we promptly inform authorities. Although academic research finds that the radicalization of members of groups like ISIS and Al Qaeda primarily occurs offline, we know that the internet does play a role — and we don’t want Facebook to be used for any terrorist activity whatsoever.

We believe technology, and Facebook, can be part of the solution.

We’ve been cautious, in part because we don’t want to suggest there is any easy technical fix. It is an enormous challenge to keep people safe on a platform used by nearly 2 billion people every month, posting and commenting in more than 80 languages in every corner of the globe. And there is much more for us to do. But we do want to share what we are working on and hear your feedback so we can do better.

Captain America and Information Operations (IO)

By Jon Herrmann

We’ve all (hopefully) seen Captain America, whether in a movie, a comic book, or any of a dozen other venues. Even those who have heard only a little know the basics: Captain America is the super-soldier, one man who takes on thousands… and wins. Alone or with the support of some more average people (Howling Commando soldiers or secret agents of SHIELD), the key is the Captain. Compare that to the more “realistic” versions of combat we see in Band of Brothers, Saving Private Ryan, or even Call of Duty. If one person takes on a thousand, the one dies. So can we learn anything from Captain America for national security in reality? Maybe so… in information operations (IO).

Normal combat takes place in a physical realm, where normal (Gaussian) distributions and bell curves make sense. For millennia, we have learned the rules and principles of war (or battle, if you prefer). We know that mass matters, and no single soldier is going to overwhelm even five adversaries in any but the most bizarre circumstances. The rarity of those circumstances means that the effects of any single soldier are generally lost in a battle; the entire effort “averages out.” That is a key reason that having many soldiers is crucial in traditional combat.

Informational combat is atypical. Normal distributions are not normal at all, and bell curves are an illusion that poor commanders use to console themselves, seeking the comfort of a simple model instead of the frightening truth. Information warfare is the world of Captain America. In information warfare, communications professionals hone and craft hundreds of stories, developing articles like basic training develops recruits. Editors mold these stories with the sharp criticisms we once heard from drill sergeants like R. Lee Ermey or Heinlein’s Sergeant Zim. They know these stories are useful. Stories can overwhelm mental defenses just as massed traditional combat can overwhelm physical defenses. Psychologists have shown that repetition creates the perception of truth, regardless of factual basis, so sending story after story against the minds of opponents can be very effective over time. But the human wave is not the key to modern combat, and the story wave is not the key to modern information conflict.

It's Surprisingly Simple to Hack a Satellite

J.M. PORUP

Hacker conferences are famous for using quirky, hackable badges. DefCon's 2015 badge was a working vinyl LP containing a spoken-word ciphertext copy of the Hacker Manifesto.

But at the Chaos Communication Camp, held in Zehdenick, Germany last week, the organizers did something different: they gave out 4500 rad1o badges. These software-defined radios are sensitive enough to intercept satellite traffic from the Iridium communications network.

During a Camp presentation entitled "Iridium Hacking: please don't sue us," hackers Sec and schneider demonstrated how to eavesdrop on Iridium pager traffic using the Camp badge.
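The presenters relied on purpose-built tooling, but the first step in any such eavesdropping chain is conceptually simple: scan the digitized radio samples for short, high-power bursts and hand only those to a demodulator. The Python sketch below is a minimal, hypothetical illustration of that burst-detection step; the function name, window size, and threshold are assumptions, not the presenters' actual code.

```python
# Minimal, illustrative sketch: energy-based burst detection over complex
# baseband samples from a software-defined radio. This is the generic first
# step before demodulating pager bursts; it is not the tool from the talk.
import numpy as np

def detect_bursts(samples: np.ndarray, sample_rate: float,
                  window: int = 1024, threshold_db: float = 12.0):
    """Return (start_time, duration) pairs for windows whose power rises
    well above the noise floor. `threshold_db` is an assumed tuning value."""
    power = np.abs(samples) ** 2
    n = len(power) // window
    win_power = power[: n * window].reshape(n, window).mean(axis=1)
    noise_floor = np.median(win_power) + 1e-12
    hot_windows = 10 * np.log10(win_power / noise_floor) > threshold_db

    bursts, start = [], None
    for i, hot in enumerate(hot_windows):
        if hot and start is None:
            start = i                      # burst begins
        elif not hot and start is not None:
            t0 = start * window / sample_rate
            duration = (i - start) * window / sample_rate
            bursts.append((t0, duration))  # burst ends; record it
            start = None
    return bursts
```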

The Iridium satellite network consists of 66 active satellites in low Earth orbit. Developed by Motorola for the Iridium company, the network offers voice and data communications for satellite phones, pagers, and integrated transceivers around the world. (Iridium went bankrupt in 1999, but was later purchased from Motorola in 2001 by private investors, who have revived the company.) The largest user of the Iridium network is the Pentagon.

"The problem," Sec explained, "isn't that Iridium has poor security. It's that it has no security."

The Army Can Now Stop Enemy Tanks In Their Tracks Without Firing A Shot

By JARED KELLER

U.S. Army personnel have successfully used advanced electronic warfare technology to completely disable enemy armor during a simulated tank assault at the Army National Training Center, Defense Systems reports.

Developed by the Army Rapid Capabilities Office (RCO), the combination of wireless communications-jamming and hacker exploits of vehicle systems forces enemy tanks to “stop, dismount, get out of their protection, [and] reduce their mobility,” as one Army observer described the ANTC training exercise at Fort Irwin, California.

This is only the second major Army test of tactical electronic warfare in recent history. In April, the RCO outfitted nearly 20 soldiers from the 2nd Cavalry Regiment at U.S. Army Garrison Bavaria in Vilseck, Germany, with advanced electronic warfare equipment for field-testing, the first time an Army electronic warfare system had been deployed in a tactical environment.

Barely the size of “a lightweight backpack,” the vehicle- and infantry-portable kits come with two primary capabilities: VROD (Versatile Radio Observation & Direction) to “detect and understand” enemy electromagnetic signals, and the so-called VMAX to “search and attack” with “electronic attack effects” that the Army RCO described as “more effective than the existing jammers used by anti-missile systems in aircraft.”

The Pentagon’s New Algorithmic Warfare Cell Gets Its First Mission: Hunt ISIS

BY MARCUS WEISGERBER

By year’s end, the Pentagon wants computers to be leading the hunt for Islamic State militants in Iraq and Syria, by turning countless hours of aerial surveillance video into actionable intelligence.

It’s part of Project Maven, a fast-moving effort launched last month by Deputy Defense Secretary Bob Work to accelerate and improve the military’s use of machine learning and put it to wider use.

“We have to tackle the problem a different way,” said Air Force Lt. Gen. John N.T. “Jack” Shanahan, director for defense intelligence for warfighter support, and the man tasked with finding the new technology. “We’re not going to solve it by throwing more people at the problem… That’s the last thing that we actually want to do. We want to be smarter about what we’re doing.”

Thousands of military and civilian intelligence analysts are “overwhelmed” by the amount of video being recorded over the battlefield. These analysts watch the video, looking for abnormal activities. Right now, about 95 percent of the video shot by drone aircraft is from the campaign against ISIS in Iraq and Syria.

The Pentagon has raced to buy and deploy drones that carry high-resolution cameras over the past decade and a half of war in Afghanistan and Iraq. But on the back end, stateside analysts are overwhelmed. Pentagon leaders hope technology can ease the burden on the workforce while producing better results on the battlefield.
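To make the idea concrete, here is a minimal, hypothetical Python sketch of what "letting computers lead the hunt" can look like in practice: sample frames from a video file and queue for human review only the segments where a detector fires. The `detect_objects` function is a placeholder, and nothing here reflects Project Maven's actual pipeline.

```python
# Hypothetical triage loop: sample frames from surveillance video and flag
# only the timestamps where an object detector fires, so analysts review
# minutes instead of hours. `detect_objects` is a stand-in for a real model.
import cv2  # OpenCV, used here only for video decoding

def detect_objects(frame):
    """Placeholder for a trained detector; returns a list of detections."""
    return []  # plug in a real model here

def triage_video(path, seconds_between_samples=2.0):
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(1, int(fps * seconds_between_samples))
    flagged, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % step == 0 and detect_objects(frame):
            flagged.append(frame_idx / fps)  # timestamp in seconds
        frame_idx += 1
    cap.release()
    return flagged  # moments worth a human analyst's attention
```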

The Future of Military IT: Gait Biometrics, Software Nets, and Photon Communicators

BY PATRICK TUCKER

DISA director Lt. Gen. Alan Lynn talks about the tech he’s eyeing, some of which is barely out of the theoretical realm. 

Tomorrow’s soldiers will wield encrypted devices that unlock to their voices, or even their particular way of walking, and communicate via ad-hoc, software-defined networks that use not radio waves but light, according to Lt. Gen. Alan Lynn, who leads the Defense Information Systems Agency, the U.S. military’s IT provider. On Tuesday, Lynn talked about next-generation technologies that DISA is looking into, some of which are barely experimental today.

Biometric access

Forget thumbprint unlock screens for phones and communications equipment. Tomorrow’s next-generation biometric identifiers are related to the data that soldiers create through their activity. That could include everything from the way that a soldier walks, to the way she holds her phone, to places that she’s been.

“In the future, we see that the systems you carry on you, developing information on you and taking information from you,” said Lynn. “Your walk is as individual as your thumbprint. Why is that important? Well, if you are in warfighting, oftentimes you wear gloves, oftentimes you wear masks…you can’t use a lot of the biometrics you would normally use. But your gait, your walk, that’s going to be there. We think that’s an important part of our future for identity.” 
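As a rough illustration of how a walk can act as an identifier, the sketch below (hypothetical, not DISA's system) turns a window of accelerometer readings from a device into a small feature vector and compares it against an enrolled template; a real gait-biometric system would use far richer features and learned models.

```python
# Toy gait check: compare simple statistics of an accelerometer window
# against an enrolled template. Purely illustrative.
import numpy as np

def gait_features(accel: np.ndarray) -> np.ndarray:
    """accel: (n_samples, 3) accelerometer readings from a few seconds of walking."""
    magnitude = np.linalg.norm(accel, axis=1)
    return np.array([
        magnitude.mean(),                   # overall intensity of the stride
        magnitude.std(),                    # how bouncy the stride is
        np.abs(np.diff(magnitude)).mean(),  # jerkiness between samples
    ])

def matches_enrolled(accel: np.ndarray, template: np.ndarray,
                     tolerance: float = 0.15) -> bool:
    """True if the live walk is within `tolerance` (relative) of the template."""
    live = gait_features(accel)
    return bool(np.all(np.abs(live - template) <= tolerance * np.abs(template) + 1e-9))

# Enrollment: record the soldier walking and store the template on the device.
enrolled_walk = np.random.randn(500, 3)      # stand-in for recorded data
template = gait_features(enrolled_walk)
# Later, unlock only if the current walk matches the stored template.
print(matches_enrolled(enrolled_walk, template))  # True for the same walk
```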

24 June 2017

New playground for non-state actors

M. K. Narayanan

‘Internet-enabled’ terrorism has introduced greater complexity in an already difficult scenario

Hidden terror was, till now, believed to be confined mainly to the less developed regions of the world — the 9/11 attack in the U.S. was seen as an aberration, or exception, rather than the rule in this respect. Since 2015, however, with the attack in January of that year on the Charlie Hebdo offices in Paris, followed by a series of major terrorist incidents in Brussels, Paris, Nice, Berlin and Istanbul during the past two years, it is evident that the developed world is no longer immune from terror strikes.

The Islamic State (IS) has claimed responsibility for the vast majority of these attacks, though this may not be true in all cases. What is not disputed any longer is that the West now has a sizeable number of radicalised Islamist elements who are willing to perpetrate acts of terror — either on their own, or under instructions from elsewhere.

Timeline of the new phase

Why I’m Directing The Air Force to Focus on Space

BY HEATHER WILSON 

In the coming months, the US Air Force will grow the space force in numbers and capabilities. 

For the service that I once served and now lead, one of the most important tasks ahead is getting space operations right.

In many respects, the Air Force and the nation are at a critical crossroads. We realize, as do our potential adversaries, that space is interconnected to American life and to U.S. military success. The time is now to integrate, elevate, and normalize space in the Air Force and thus assure continued American dominance in this most critical domain. 

We will do this systematically and doggedly, drawing lessons from earlier periods in which airmen created the resources, tools, and tradecraft to assure freedom of access and freedom of operation for the U.S. military writ large. Today, we begin the process of standing up a new organization at the Pentagon that will be responsible for recruiting, training and equipping airmen involved in the space mission. The establishment of the deputy chief of staff for space operations is the next step toward ensuring that we maintain space superiority.

This move will allow us to focus our attention on many critical areas as we make the policy and budget decisions necessary to train and equip airmen for the challenges in space, an essential but sometimes overlooked area of military operations. In the months ahead, you will hear much more about how we are transforming a mission that for some time has been designed around a relatively benign environment to one that has grown crowded and contested.

Empowering DOD with critical cyber training

By Jonathan Sholtis

U.S. federal agencies have increasing concerns about cybersecurity -- and rightly so. Recently, the Department of Defense faced criticism about its preparedness for a cyber-attack. A December 2016 report from the Office of the Director, Operational Test and Evaluation stated: “DOD personnel too often treat network defense as an administrative function, not a war fighting capability. Until this paradigm changes…the Department will continue to struggle to adequately defend its systems and networks from advanced cyber-attacks.”

While critical feedback can sometimes be warranted and even beneficial to drive improvement, this characterization does not reflect current efforts.

Both former Defense Secretary Ash Carter and current Secretary James Mattis have been explicit in their view that cyber is a key part of our national defense and should be classified as part of the war-fighting domain. The likely elevation of U.S. Cyber Command to combatant command status will put it on equal footing with commands like U.S. Central Command and U.S. Special Operations Command, clearly advancing cybersecurity as a priority for DOD leadership.

Ending The Endless Crypto Debate: Three Things We Should Be Arguing About Instead of Encryption Backdoors

By Kevin Bankston

Recently I participated in a fascinating conference at Georgia Tech entitled “Surveillance, Privacy, and Data Across Borders: Trans-Atlantic Perspectives.” A range of experts grappled with the international aspects of an increasingly pressing question: how can we ensure that law enforcement is able to obtain enough information to do its job in the twenty-first century, while also ensuring that digital security and human rights are protected? How can or should law and policy adapt to a world of digital evidence, much of which is easily obtainable—but much of which is not?

The primary focus of that conference was on how best to regulate the sharing of needed user data between internet companies in one country and law enforcement in another country. However, in this post—part of an online symposium at Lawfare following up on that conference—I’ll be mostly focusing on another, particularly controversial part of the broader conversation regarding modern policing: the debate over encryption, and how law enforcement should respond to it.

First, to very briefly summarize a long-running debate: until he was dismissed in May, FBI Director Comey had been arguing since 2014 that the growing prevalence of encryption—in particular, default encryption on some smartphones, and end-to-end encrypted messages that neither messaging service providers nor the government can decode—is depriving government investigators of needed evidence. This is what the FBI calls the “Going Dark” problem. For the past several years, and most recently in two speeches in March and in testimony to Congress in early May, Comey called for a solution to that problem. Privacy and security advocates fear that the FBI’s preferred solution may end up being a wrong-headed legislative mandate requiring providers to ensure some sort of exceptional technical access to encrypted data for government—what opponents (like me) would call a “backdoor”—or otherwise ensure that they do not deploy any encryption that they themselves cannot decrypt. I won’t bother repeating here the many arguments why such a mandate would be bad for America’s cybersecurity and economic security, as well as the civil and human rights of people around the world, nor why it would be mostly useless at preventing bad guys from using encryption if they want to; see here for arguments that I and my organization Open Technology Institute have previously made.
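For readers unfamiliar with why a provider "cannot decode" end-to-end encrypted messages, the short Python sketch below (using the PyNaCl library) shows the basic shape: ciphertext is produced and consumed with keys that only the two endpoints hold, so a relay server in the middle has nothing useful to hand over. It is an illustration of the concept, not any particular messaging product.

```python
# End-to-end encryption in miniature with PyNaCl: only Alice and Bob hold
# the private keys, so a server relaying `ciphertext` cannot read it; that
# property is the crux of the "Going Dark" debate.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob's public key; the provider only ever sees ciphertext.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# Bob decrypts with his private key, which never leaves his device.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

The policy fight described above is precisely over whether providers should be required to keep some way around this property.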

The Dark Secret at the Heart of AI

Will Knight

Last year, a strange self-driving car was released onto the quiet roads of Monmouth County, New Jersey. The experimental vehicle, developed by researchers at the chip maker Nvidia, didn’t look different from other autonomous cars, but it was unlike anything demonstrated by Google, Tesla, or General Motors, and it showed the rising power of artificial intelligence. The car didn’t follow a single instruction provided by an engineer or programmer. Instead, it relied entirely on an algorithm that had taught itself to drive by watching a human do it.

Getting a car to drive this way was an impressive feat. But it’s also a bit unsettling, since it isn’t completely clear how the car makes its decisions. Information from the vehicle’s sensors goes straight into a huge network of artificial neurons that process the data and then deliver the commands required to operate the steering wheel, the brakes, and other systems. The result seems to match the responses you’d expect from a human driver. But what if one day it did something unexpected—crashed into a tree, or sat at a green light? As things stand now, it might be difficult to find out why. The system is so complicated that even the engineers who designed it may struggle to isolate the reason for any single action. And you can’t ask it: there is no obvious way to design such a system so that it could always explain why it did what it did.
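To sketch what "taught itself to drive by watching a human" means in code, here is a minimal, hypothetical PyTorch example of behavioral cloning: a small convolutional network regresses the human driver's recorded steering angle from camera frames. The layer sizes and training details are illustrative assumptions, not Nvidia's actual system.

```python
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    """Toy end-to-end network: camera frame in, steering angle out."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(48 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 1),  # predicted steering angle
        )

    def forward(self, frames):               # frames: (N, 3, H, W)
        return self.head(self.features(frames))

# Behavioral cloning: regress the human driver's recorded steering angle.
model = SteeringNet()
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

frames = torch.rand(8, 3, 66, 200)            # dummy batch of camera frames
human_angles = torch.rand(8, 1)               # angles logged from the human driver
optimizer.zero_grad()
loss = loss_fn(model(frames), human_angles)
loss.backward()
optimizer.step()
```

Once trained this way, the network's "reasoning" lives in millions of learned weights, which is exactly why its decisions are hard to explain after the fact.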

The mysterious mind of this vehicle points to a looming issue with artificial intelligence. The car’s underlying AI technology, known as deep learning, has proved very powerful at solving problems in recent years, and it has been widely deployed for tasks like image captioning, voice recognition, and language translation. There is now hope that the same techniques will be able to diagnose deadly diseases, make million-dollar trading decisions, and do countless other things to transform whole industries.

Ransomware attack reveals breakdown in US intelligence protocols, expert says

Edward Helmore 

The attack that temporarily crippled the NHS in Britain and dozens of other institutions across Europe and Russia reveals the failure of the US government’s protocols for warning software developers and the private sector about system vulnerabilities, a cyber-security expert told the Guardian.

Under the vulnerability equities process (VEP) established by the US government, US intelligence agencies are supposed to collectively determine whether to disclose a vulnerability they have obtained or discovered – so the software developer has a chance to fix the problem – or withhold the information to use the flaw for offensive or defensive purposes.

“The NSA is supposed to lead the vulnerability equities process with all the other government agencies gathered round to discuss their interests in the vulnerability, and to weigh the offensive capabilities against defensive concerns for the private sector and US interests,” said Adam Segal, the director of the digital and cyberspace policy program at the Council on Foreign Relations. The EternalBlue-WannaCry attack showed that the NSA did not reveal the vulnerability it had discovered before it was stolen and apparently auctioned off, Segal said.

The US government has consistently indicated it is predisposed to releasing vulnerabilities and leaning toward taking a defensive position. In testimony, NSA director Mike Rogers has said the intelligence agencies reveal close to 90% of the vulnerabilities they discover.

According to Segal, the Shadow Brokers case and Wikileaks’ recent ‘Vault 7’ release of CIA hacking tools have led to increasing suspicion that this may only be true under a narrow definition of vulnerability.

23 June 2017

Pentagon Cyberwarriors Find Fertile Ground in Silicon Valley

By Sandra Erwin

It is virtually unheard of in government contracting for the Defense Department to be brief and straightforward in stating requirements.

So it was a surprise when a Pentagon solicitation this month for cybersecurity software was summed up in a single sentence: “The Department of Defense is interested in systems to automatically find previously unreported vulnerabilities in software without source code and automatically generate patches to remediate vulnerabilities with minimal false positives.”

The time window to bid on this opportunity also is unusually short. Responses will be accepted only from June 12 to June 20.

This is how business is done at the Defense Innovation Unit Experimental, known as DIUx. The Pentagon’s two-year-old enclave in Silicon Valley has moved rather quickly to shake up the contracting culture — and to prove that it is more interested in getting results than in forcing vendors to deal with red tape.

Necessity has forced the Pentagon to make innovation a top priority, especially in the cybersecurity field as the U.S. government and military information networks face unprecedented threats from hackers and malware. DIUx is being challenged to find solutions, and fast.

In technology-rich Silicon Valley, it falls on DIUx to spot relevant products, test them and select the ones that best solve problems for the Defense Department. DIUx has 40 people based in Mountain View, Calif., and smaller offices in Boston and Austin, Texas.

Information Warfare: THAAD The Hack Attack Magnet


June 11, 2017: In May 2017 the United States revealed that it had sent one of its few cyber protection teams to defend the THAAD (Terminal High Altitude Area Defense) battery sent to South Korea earlier and declared operational in April. This anti-missile unit is considered a major target for hackers. Each THAAD battery consists of two or more launcher vehicles (each with eight missiles stored in the canisters they are fired from), a fire control and communications system and a TPY-2 X-Band radar (or equivalent radar or radars). THAAD missiles weigh 836 kg (about 1,840 pounds) and are about the same size as the Patriot anti-aircraft missile, with a range of 200 kilometers and a maximum altitude of 150 kilometers. THAAD is intended to stop short-range (like SCUD) or medium-range (up to 2,000 kilometers) ballistic missiles. To work properly the battery depends heavily on networks for quickly transmitting target and other data. Since China, Russia and North Korea all have excellent network hacking capabilities and have been hostile to the stationing of a THAAD battery in South Korea, it was expected that the THAAD networks would be subject to penetration and disruption attempts by foreign hackers.

Neither the THAAD nor the cyber protection teams have had any real combat experience. THAAD has been successful in tests but the army is still seeking a realistic way to test the effectiveness of the cyber protection teams. The tense situation in South Korea could be the first real test of both new systems. At the moment THAAD seems more likely to succeed, but only if the untried cyber protection teams can keep numerous and determined hackers out.

The army knows it has a major problem with cyber protection, as do the other services (air force, navy and marines). This was made clear after the U.S. Army established its first Cyber Protection Brigade in late 2014. There were plans to create two more brigades by 2016. That did not happen because the army in particular, and the military in general, could not create or recruit enough qualified personnel. There were other problems, but the key difficulty was a shortage of qualified people to staff the key units: the cyber protection teams.

Google ramps up efforts to combat online terrorism, recruitment efforts


by Paige Williams

Google announced Sunday that it is taking additional steps to prevent online terrorism using Google-owned platforms, according to a company blog post.

The company denounced the use of its websites, most prominently YouTube, to disseminate terrorist recruitment materials, and called on the industry to step up and participate in prevention.

“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done,” said Kent Walker, senior vice president and general counsel of Google.

Google, and similar companies that own social media websites, must walk a fine line between freedom of speech and responsibility for security. Google stated that its increased efforts strike a good balance between the two.

Its four-pronged mission intensifies the technological efforts it has already made against extremist content.

Obstacles to Information Sharing in the Virtual Battlefield

JAMES CLAPPER

Statecraft and business have always been closely linked, but the advent of digital technology has blurred the roles more than ever. Systems crucial to the economic well-being and national security of the United States rest in the hands of private companies. The two sectors must cooperate by sharing information at an immense pace and scale to keep up with the threat of cyber attacks. The Cipher Brief’s Levi Maxey spoke with James Clapper, the former U.S. Director of National Intelligence, about how the U.S. approaches cybersecurity information sharing and why there continue to be obstacles for both government and the private sector when sharing data on virtual risks with real-world consequences.

The Cipher Brief: How would you characterize cooperation and information sharing between the intelligence community and major tech companies?

James Clapper: I guess I would call it a little strange. The problem with all of this information sharing business between intelligence and companies more broadly is that there are restraints and inhibitions on both as far as fulsome sharing. There is supposed to be an equities process that, at least while I was there, worked. Where you make adjustments based on the equities involved – if the intelligence community detects a vulnerability, do they share it or not? There is a process for deciding what’s best and probably no one is ecstatic about that process all the time.

TCB: Could you talk about how the equities process works? My understanding is that it is through the National Security Council but reports vulnerabilities to companies through the DHS.

Intelligence community must embrace the digital era: DIA director

By: Rachael Kalinyak

“What kind of future will we embrace?” 

This question echoed throughout Marine Corps Lt. Gen. Vincent Stewart’s speech at the 2017 GEOINT Conference in San Antonio, Texas. The director of the Defense Intelligence Agency spoke about the risks the intelligence community faces, including becoming irrelevant in a technology-driven era. To make his point, he mentioned the challenges faced by the Kodak film company as digital photography first entered the picture. Embracing the digital age is critical for the intelligence community, he said.

The desire to stay in the past, live in the success of the Cuban Missile Crisis, and stick to the techniques that have proven to be successful is strong, Stewart said. But he noted that failing to embrace the digital world will only lead the intelligence community to extinction. 

“We are not indispensable unless we are relevant to our customers, all of them,” Stewart said, explaining that the notion that “our success in the past is good enough for our success in the future” is wrong. This idea stifles innovation, stopping those who wish to help shape the future. To remain relevant, the intelligence community must learn to nurture innovation and take risks. 

After concluding his speech, Stewart noted a military downfall that has occurred in recent years — wargaming. 

Locked Shields: The world's largest cyber-war game



Tallinn, Estonia - Things are bad on the small island nation of Berylia after a diplomatic row with Crimsonia, its bigger neighbour and rival. There are street protests by the Crimsonian minority in Berylia, which then suffers a wave of cyber-attacks that make it lose control of its drones and its only international airbase.

Crimsonia is blamed for the cyber offensive even though there's no hard proof. Crippled by the attacks, Berylia, a new member of the North Atlantic Treaty Organisation (NATO), weighs its options. One of them is to invoke Article 5 and take the military alliance to war against Crimsonia.

Berylia and Crimsonia are fictional and so is this scenario, which is part of Locked Shields, a cyberwar game. But the fact that the situation doesn't sound that far-fetched is one of the reasons why Locked Shields is so relevant today.

Locked Shields is "the world's largest and most advanced international technical live-fire cyber defence exercise", as described by the NATO-affiliated Cooperative Cyber Defence Centre of Excellence (CCDCOE), which has been organising it since 2010 in Tallinn, Estonia.

Cyberwar blurs lines between military/civilian, public/private sectors [CyCon Tallinn]


by Gerard O'Dwyer

The danger that the cyberwarfare threat spectrum could pose a sustained future risk to both military sites and critical civilian infrastructure – such as power grids, hospitals and telecom networks – is certain to stimulate deeper levels of collaboration between defense and law enforcement agencies and industry cyber experts.

The fundamental dynamic driving closer cooperation between state agencies and industry is the common objective of not only defending against threats in cyberspace, but also devising the tools to respond with precise surgical strikes against aggressors in the cyber battlespace.

A higher degree of cooperation between state cyber defense agencies and industry will focus on developing more robust vertical data traffic analysis tools to improve the general understanding and interpretation of high malware activity and trends, nationally and globally, as used by hostile parties in the cyberspace domain.

The opening up of new channels of collaboration between state and industry actors in the cybersecurity area featured prominently at CyCon 2017 in Tallinn, Estonia.

Notwithstanding collaboration with industry, the state will retain a primary responsibility to defend its country’s systems, said James Lewis, a Senior Vice President at the Washington-based Center for Strategic and International Studies.