Wednesday, January 3, 2018
From Science Fiction to Reality: U.S. Navy Technology and Innovation
By David Smalley, Office of Naval Research
From the 1930s on, science fiction comics, books and movies had plenty of futuristic portrayals of “ray guns” shooting some kind of mysterious energy.
Decades later, Star Wars and Star Trek helped captivate millions more with the idea. But all along, the Office of Naval Research (ONR) has been steadfastly developing the real thing. Starting in the 1950s, ONR sponsored research that ultimately led to the first “lasers” (light amplification by stimulated emission of radiation). Today, ONR is working on high-energy, solid-state laser weapons. The Laser Weapon System (LaWS), a prototype, was fitted aboard a ship in the Arabian Gulf in 2014 and proved the test platform could shoot down UAVs in the air and hit surface targets on the waves. Laser capabilities and power are growing every year. Coming soon to a theater (of operations) near you!
Like ray guns, robots have dominated popular imagination for decades.
And as with the development of lasers, yesterday’s science fiction has really become today’s science fact. Today robots perform complex duties on factory floors, clean the floors in our houses and even deliver meals to our hotel rooms. But robots can also save lives. Think shipboard fires. Take lots of Sailors or Marines, add gunpowder and tight quarters where maneuverability is limited, and shipboard fires are a deadly threat that is extraordinarily dangerous to combat. As Sailors learn in firefighting training: In a fire at sea, there is no place to run. What if, ONR scientists thought, we could lessen the dangers of firefighting aboard ships? Drawing on decades of investment in robotics (in 1963, ONR sponsored Shakey the robot, the first to reason through what actions it should take to fulfill a command), ONR researchers are developing SAFFiR – the Shipboard Autonomous Firefighting Robot. This human-sized robot can find and suppress even extreme shipboard fires, keeping Sailors out of harm’s way. Here’s hoping SAFFiR never has to do his (its?) job – but it’s nice to know it will be on watch.
Additional examples include augmented reality systems and advanced wireless networks that were among the technologies shown during the Ship-to-Shore Maneuver Exploration and Experimentation Advanced Naval Technology Exercise (S2ME2 ANTX) 2017, a set of amphibious exercises at Marine Corps Base Camp Pendleton in California last spring.
S2ME2 ANTX focused on five capability areas of amphibious operations: ship-to-shore maneuver; weapons fire support and effects; clearing assault lanes; command and control; and information warfare. Demonstrated technologies included unmanned and autonomous vehicles equipped with sensors to gather intelligence in the air, on land and underwater.
During each amphibious beach demonstration, unmanned surface and underwater vehicles approached the shore first, collecting intelligence about battlespace conditions, including threats and obstacles, providing an accurate picture of what warfighters would face when leaving their vessels and vehicles.
Hardly a day goes by without a news story that shows driverless cars, UAVs delivering holiday packages, or other uses of autonomy in modern life.
Many of these capabilities are possible because of ONR investments in autonomy. The ability of unmanned systems to take on dull, dirty or dangerous tasks has been a priority for ONR engineers. ONR recently developed a hardware and software suite and put it on swarms of unmanned small boats. Once equipped with the autonomy package – called the Control Architecture for Robotic Agent Command and Sensing, or CARACaS – the boats could collaboratively communicate; detect an intruding vessel in their area of responsibility; approach it; determine whether the intruder was a threat; and convey the information to the Sailors on a manned vessel outside the patrol zone. Think safer mine clearance, delivery of supplies in hot zones, ship escort and more.
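The detect-approach-classify-report cycle described above can be sketched as a simple decision loop. This is an illustrative toy, not the actual CARACaS software; every name, threshold and heuristic here is a made-up assumption:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Verdict(Enum):
    FRIENDLY = auto()
    THREAT = auto()

@dataclass
class Contact:
    track_id: int
    speed_kts: float
    closing: bool  # is the contact heading toward the protected asset?

def classify(contact: Contact) -> Verdict:
    # Toy heuristic standing in for real onboard sensor fusion:
    # fast, closing contacts get flagged as threats.
    if contact.closing and contact.speed_kts > 20:
        return Verdict.THREAT
    return Verdict.FRIENDLY

def patrol_step(contacts, report):
    """One decision cycle: classify each detected contact and report
    the result back to the Sailors on the manned vessel."""
    for c in contacts:
        if classify(c) is Verdict.THREAT:
            report(f"Track {c.track_id}: THREAT, moving to intercept")
        else:
            report(f"Track {c.track_id}: friendly, resuming patrol")

messages = []
patrol_step([Contact(1, 28.0, True), Contact(2, 6.0, False)], messages.append)
```

The real system distributes this logic across a swarm with shared state; the sketch only shows the per-cycle decision shape.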
When you think “virtual reality,” you may imagine Tony Stark from the “Iron Man” movies, hands raised and moving virtual displays in the air.
Well, Navy engineers are working hard to bring that to life. The Battlespace Exploitation of Mixed Reality (BEMR) Lab features a host of advanced virtual reality capabilities that will help warfighters train and operate in the future. ONR’s Jim Blesse recalls the moment of inspiration. “Someone came in and told us his two-year-old had a tantrum that morning because the television screen didn’t function the way his tablet did. A two-year-old! We looked at each other and said ‘Wow. What is that kid going to expect from technology when he’s 18? We need to envision that, now.'” ONR sponsored the BEMR Lab at SPAWAR Pacific to develop virtual reality technologies that are already impacting how the future force trains. (You can even get real-life dizzy looking down from the virtual crow’s nest.)
Wednesday, December 20, 2017
This excerpt from FinTech: The Technology Driving Disruption in the Financial Services Industry by Parag Y. Arjunwadkar explains the basics of blockchain and then reviews how fintechs are adopting blockchain for security and transparency.
Tuesday, December 12, 2017
Paul Myer, CEO of Veracity Industrial Networks, makes these predictions about the industrial IoT for 2018:
There will be a nation-state cyber-attack on our critical infrastructure in 2018
There has been an increase in the number of attacks on our nation’s infrastructure, such as our power grid. For now, these systems remain secure and we have not seen a successful widespread attack, but as the old saying goes, the bad guys only have to be “right” once; those defending these institutions have to be “right” every time.
In October, the DHS and FBI warned that the nuclear, energy, aviation, water and critical manufacturing industries have been targeted along with government entities in attacks dating back to at least May. They also reported that some of those hackers were successful in compromising the networks. All this evidence points to an increasing risk that an attacker will be successful in the near future.
The “industrial cybersecurity” space will see record investment in 2018
Cybersecurity has been attracting significant investment for years now; however, the bulk of that investment has gone to the traditional “IT security” world, where most of the action is. During 2018, we expect to see a shift, with more investment going toward companies addressing industrial cybersecurity needs that are becoming critical.
The industrial side of cybersecurity has lagged behind the “IT security” world in the development of tools and procedures. Reliance on “air-gapping” as a security measure has run its course. This will bring a new group of industrial cybersecurity solutions seeking investment. We predict that investment will be large and immediate.
The lack of trained cybersecurity personnel will become acute in 2018
There has been considerable discussion about the lack of trained cybersecurity professionals and the problems this causes. The growth of our cybersecurity needs has far outpaced the development of training programs and the number of new experts we are creating. Things will get worse before they get better, and 2018 will likely shine a light on this intellectual shortfall.
This “shortage” will be more acute in industrial networks, as there are fewer training options for those professionals. Also, the lack of viable network security tools on the industrial side makes the actions of trained network security professionals all the more important.
The “ransomware” business model will be applied to more hacks
In a “ransomware” attack, your data is held hostage by a hacker who has compromised your computer until you pay a ransom. This new revenue source for today’s cyber criminals has broadened the targets for hackers geometrically. Suddenly, companies and individuals that house no data suitable for sale on the dark web have become targets. Locking up a grandmother’s photos can now produce revenue.
Part of the ransom phenomenon is made possible by the anonymous nature of bitcoin as a currency. We predict that hackers will spread the ransomware business model to the industrial space in 2018 by holding parts of OT/ICS networks hostage.
Monday, December 11, 2017
The Convergence of Data Management Technologies, Growth of Metadata Management and the Increased Focus on AI Make Up the Most Impactful Trends for 2018
NAPERVILLE, Ill. (PRWEB) November 14, 2017 -- Infogix today identified pivotal data trends that will impact businesses in 2018 and beyond.
“Metadata management and ensuring data privacy for regulations such as GDPR join earlier trends like AI and IoT, but the unexpected trend of 2018 will be the convergence of data management technologies,” said Emily Washington, senior vice president of product management at Infogix. “Big data has been the next big technology phenomenon for a long time, but businesses are increasingly evaluating ways to streamline their overall technology stack if they want to successfully leverage big data and analytics to create a better customer experience, achieve business objectives, gain a competitive advantage and ultimately, become market leaders.”
The top data trends for 2018 were assembled by business leaders at Infogix who have decades of experience in information technology. The major trends include:
2018: The Year of Converging Data Management Technologies
- Use cases have proven that leveraging data requires a multitude of separate tools for tasks like data quality, analytics, governance, data integration, metadata management and more.
- To extract meaningful insights and increase operational efficacy, businesses will increasingly demand flexible, integrated tools to enable users to quickly ingest, prepare, analyze, act on, and govern data—while easily communicating insights derived.
Increased Importance of Data Governance
- The deluge of data is growing, government regulations are increasing and teams have much greater access to data within an organization. Add to this the increasing need to leverage advanced analytics, and data governance has become more critical than ever.
- Data governance capabilities have evolved in a way that provides complete transparency into a business’s data landscape, allowing businesses to combat increasingly complex regulatory and compliance demands and the shifting tides of business policies and alignment.
The Continued Rise of the Chief Data Officer (CDO)
- In today’s data-intensive environment, a CDO is more important than ever to navigate regulatory demands, successfully leverage data and manage enterprise-wide governance.
- A CDO helps businesses manage unstructured and unpredictable data, while successfully leveraging advanced analytics and maximizing the value of data assets across the business enterprise.
Ensuring Data Privacy for Regulations such as the General Data Protection Regulation (GDPR)
- When GDPR goes into effect in May 2018, it will strengthen and unify data protection rules for all organizations processing personal data for European Union (EU) residents.
- Through analytics-enabled data governance, a business can not only locate personal data enterprise-wide, but monitor compliance, usage, approvals, and accountability across the organization.
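As a rough illustration of what “locating personal data enterprise-wide” involves, here is a minimal pattern-based scanner. The patterns and record layout are toy assumptions for the sketch, not part of any governance product discussed here; real discovery tools use far richer rule sets:

```python
import re

# Toy patterns standing in for a real data-discovery rule set.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+\d{2}[\d\s]{8,}"),
}

def scan_record(record: dict) -> dict:
    """Return {field: [pii_types]} for fields that look like personal data."""
    hits = {}
    for field, value in record.items():
        if not isinstance(value, str):
            continue
        types = [name for name, pat in PII_PATTERNS.items() if pat.search(value)]
        if types:
            hits[field] = types
    return hits

found = scan_record({"name": "A. User", "contact": "a.user@example.eu", "note": "renewal due"})
```

Running such a scan across every data store, and logging which fields matched, is one building block of the compliance monitoring and accountability the bullet describes.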
The Proliferation of Metadata Management
- Metadata management is a growing trend for 2018. This “data about data” contains the information necessary to understand and effectively use data, such as business definitions, valid values, lineage, and more.
- Organized into ontologies, this metadata lets organizations understand the relationships between data sets and enhances discoverability. Metadata management is critical in enterprise data environments to support data governance, regulatory compliance and data management demands.
The Monetization of Data Assets
- Organizations recognize that data is either a liability or an asset. Metadata can be used to enable a deeper understanding of the most valuable information.
- We are seeing more organizations use a combination of logical, physical, and conceptual metadata to classify data sets based on their importance; businesses can then apply a numerical value to each data classification, effectively monetizing it.
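One simple way to turn classifications into a numerical value is a weighted score per data set. The weights below are invented for illustration, not Infogix’s actual model:

```python
# Illustrative weights per classification; real weights would be set by the business.
CLASS_VALUE = {"conceptual": 1, "logical": 2, "physical": 3}

def asset_score(classifications: list) -> int:
    """Sum the weights of a data set's classifications to rank assets by value."""
    return sum(CLASS_VALUE.get(c, 0) for c in classifications)

# A data set tagged with both physical and logical metadata scores higher
# than one described only conceptually.
score = asset_score(["physical", "logical"])
```

Ranking data sets by such a score gives a crude but workable way to decide where governance and monetization effort should go first.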
The Future of Prediction: Predictive Analytics to Improve Data Quality
- With the continued concerns with data quality, and the volumes of data increasing, businesses are enhancing data quality anomaly detection with the use of machine-learning algorithms.
- By using historical patterns to predict future data quality outcomes, businesses can dynamically detect anomalies in data that might otherwise have gone unnoticed or only found much later through manual intervention.
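A minimal version of this idea: learn the historical distribution of a simple quality metric (here, daily record counts, with invented numbers) and flag new values that deviate sharply. Real systems would use richer machine-learning models than this z-score sketch:

```python
import statistics

def detect_anomalies(history, new_values, threshold=3.0):
    """Flag values whose z-score against the historical data exceeds the threshold."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return [v for v in new_values if stdev and abs(v - mean) / stdev > threshold]

# Daily record counts for a data feed; a sudden drop suggests a broken pipeline
# that manual checks might only catch much later.
history = [1000, 1020, 980, 1010, 990, 1005, 995]
flagged = detect_anomalies(history, [1008, 120])
```

The count of 1,008 sits inside the historical pattern and passes, while the drop to 120 is flagged immediately rather than surfacing weeks later in a manual review.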
IoT Becoming More Real
- Each passing year brings an increase in the number of connected devices generating data, and there is a steep rise in focus on extracting insights from this data.
- We are starting to see more defined IoT use cases that leverage data from newer connected devices, such as sensors and drones, for analytics initiatives. With this comes a growing demand for streaming data ingestion and analysis.
“As more data is generated through technologies like IoT, it becomes increasingly difficult to manage and leverage. Integrated self-service tools deliver an all-inclusive view of a business’s data landscape to draw meaningful, timely conclusions,” said Washington. “Full transparency into a business’s data assets will be crucial for successful analytics initiatives, addressing data governance and privacy needs, monetizing data assets and more as we move into 2018.”
The last US presidential election revealed the dangers and the difficulties of prognostication. But that doesn't deter those determined to look ahead at what we may face in 2018. We reached out to several security mavens to learn what worries them about the coming year. It's interesting how broad their concerns are, and how little they overlap. Yes, 2018 will be an interesting year. You can read the predictions here.
Monday, November 27, 2017
On December 7, 2017, at 10 AM EST, James Bone, author of Cognitive Hack: The New Battleground in Cybersecurity ... the Human Mind, is conducting a webinar on "Is Cognitive Computing the Next Step to Help Fight Cybercrime?"
Wednesday, November 15, 2017
2017 has officially become the worst year on record, with 16,006 vulnerabilities disclosed through the third quarter.
RICHMOND, VA, November 14, 2017 -- Risk Based Security today announced the release of its Q3 2017 VulnDB QuickView report that shows there have been 16,006 vulnerabilities disclosed through September 30th this year. This is the highest number of disclosed vulnerabilities at the end of the third quarter on record and represents a 38% increase over the same period in 2016. In addition, cataloged vulnerabilities in the first nine months of 2017 have exceeded the total vulnerabilities for all of 2016 (15,832). The 16,006 vulnerabilities cataloged by Risk Based Security’s VulnDB research team eclipsed the total covered by the CVE and National Vulnerability Database (NVD) by 6,295.
“When hearing that so many vulnerabilities are missing from CVE/NVD, most security professionals want to justify the gap by convincing themselves that the missed vulnerabilities can’t possibly impact their organization, and that if they do, they must be low risk. However, just as our previous reports have indicated, this isn’t the case. Of the vulnerabilities not published by NVD/CVE, 44.1% (over 2,700) have a CVSSv2 score between 7.0 and 10, including flaws in widely deployed software used by many organizations. Any security product or tool that relies on CVE/NVD is putting your organization at serious risk,” said Jake Kouns, Chief Information Security Officer for Risk Based Security.
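The gap figures in the report can be sanity-checked with simple arithmetic, using only the totals quoted above:

```python
total_vulndb = 16006         # vulnerabilities cataloged by VulnDB through Q3 2017
gap = 6295                   # entries VulnDB covers beyond CVE/NVD
high_severity_share = 0.441  # share of the gap with a CVSSv2 score of 7.0 to 10

# Implied CVE/NVD coverage, and the count of severe vulnerabilities it misses.
cve_nvd_total = total_vulndb - gap
high_severity_missing = round(gap * high_severity_share)
# high_severity_missing lands just above 2,700, matching the "over 2,700" quoted.
```

In other words, CVE/NVD covered under 10,000 of the cataloged issues, and roughly 2,800 of the missing entries were high severity.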
“As Equifax dominated the data breach headlines, it was revealed that due to a series of delays they were unable to patch the exploited flaw, now commonly known as Struts-Shock, in a timely fashion. What the media missed is that there have been a total of 75 vulnerabilities in Apache Struts, and 5 new vulnerabilities since Struts-Shock was disclosed. It makes you wonder if there were any other delays in correcting those issues as well, and if Equifax has additional unpatched vulnerabilities,” added Kouns.
The newly released Q3 2017 report from Risk Based Security shows that 39.9% of total reported vulnerabilities received CVSSv2 scores above 7.0. This means that not only is the number of vulnerabilities on the rise, but the severity of the vulnerabilities disclosed remains high. What is more concerning for organizations is that 31.6% of the vulnerabilities disclosed have public exploits available and 47.9% can be exploited remotely.
The VulnDB QuickView report also highlights the relationships between researchers and vendors, showing that they are continuing to work together. The share of vulnerabilities disclosed in a coordinated fashion continues to hover around 43%, on par with the mid-year report. In addition, 6.1% of the vulnerabilities disclosed in software products were coordinated through vendor and third-party bug bounty programs.
“While our proprietary Vulnerability, Timeline, and Exposure Metrics (VTEM) show that not all vendors are prioritizing and fixing vulnerabilities as quickly as we would prefer, the good news is that 75.8% of 2017 vulnerabilities through September do have a documented solution,” says Kouns.