Homomorphic encryption will keep data safer
One big risk of data stored in the cloud is that if you need to use it, you have to decrypt it, opening it up to possible attack. But homomorphic encryption should fix that.
This technique encrypts data in such a way that applications can perform calculations on it without ever seeing the underlying plaintext; they work only with the encrypted representation. The results of those calculations are themselves encrypted, and decrypt to the same answers the application would have produced from the raw data.
So, for example, a patient record could be stored in encrypted form, and an application that predicts the patient's outcome could be run against it. But since the data is never decrypted, it is never at risk. Such a scheme would help medical providers meet the requirements of medical-confidentiality laws and regulations.
This example comes from a Microsoft Research paper published this year on achieving high-throughput, accurate and private manipulation of encrypted data.
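The no-decryption property can be illustrated with a toy additively homomorphic scheme. The sketch below is a minimal Paillier cryptosystem with deliberately tiny, insecure parameters; it is not the scheme from the Microsoft paper, just a demonstration that arithmetic on ciphertexts carries through to the plaintexts:

```python
import math
import random

# Toy Paillier cryptosystem. The primes are deliberately tiny and the
# code is textbook-only: real systems use ~2048-bit moduli and
# hardened libraries.
p, q = 17, 19
n = p * q                        # public modulus
n2 = n * n
g = n + 1                        # conventional generator choice
lam = math.lcm(p - 1, q - 1)     # private key component

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # private key component

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:       # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# The homomorphic property: multiplying two ciphertexts yields a
# ciphertext of the *sum* of the plaintexts, with no decryption step.
a, b = 42, 17
c_sum = (encrypt(a) * encrypt(b)) % n2
assert decrypt(c_sum) == a + b       # 59, recovered only by the key holder
```

An untrusted party holding only `c_sum` and the public values could keep computing on the data; only the key holder ever sees the result in the clear.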
(By Senior Editor Tim Greene)
Open source will be, if possible, even more commonplace
Much has been made of the flourishing of free and open-source software in recent years – collaborative, fluid teams working on open code bases are responsible for a large and growing proportion of the software used in the enterprise world.
And there’s no reason to think that the trend will change anytime soon. Neela Jacques, the executive director of the OpenDaylight SDN project, says that an over-reliance on closed, proprietary systems has hurt business users in the past.
“Companies have realized that it’s inefficient to try to build proprietary platforms that have a low chance of success,” he says. “What we’re seeing is an emerging model where organizations spend a moderate amount of resources with others to establish a collaborative, standard platform.”
Moreover, according to Jacques, the open-source community will become increasingly professionalized over the next decade – removing one of the barriers to entry for conservative companies worried about the fiery wars of words that periodically roil some open-source projects.
“The conversation will move to how we build and maintain the greatest shared technologies of our time, from licensing to certification to training talent to support these resources,” he says.
IoT will finally hit its stride
The Internet of Things is a phenomenon that we’re constantly told is on the verge of taking off and really changing the world around us, but it never quite seems to happen. Within the next 10 years, however, the technology will start to realize its potential, according to IDC vice president of network infrastructure Rohit Mehra.
Most of what’s held IoT back can be sorted into two categories – security issues and interoperability problems. The relative novelty of the technology means that there are few generally accepted communications standards, which in turn means that it can be difficult to have multiple IoT devices working together unless they’re all from the same manufacturer. That limits their utility, since the entire point is to have everything working as a seamless whole. Additionally, IoT devices represent potential ways into a network for attackers, and their security isn’t always assured.
But Mehra says that these problems will be solved – for a given degree of “solved” – within the coming decade.
“I think all the pieces are slowly falling into place,” he told Network World. “Today’s network can adapt, it can really scale, and all on the fly as application needs change – and what that means is, now, if I’m an IT guy, I can really think of leveraging the cloud, leveraging my big data and analytics applications, to do what’s best for all my IoT endpoints.”
Big Data will play a Big Role on the network
Big data – if you haven’t heard a lot about it over the past several years, you haven’t been paying much attention. Shoved as a buzzword into the conversation about every highly scalable technology product, “big data” has seen its definition balloon so much that the term is beginning to lose meaning.
The impact of the increasingly quantified tech world, however, has been meaningful indeed, and it will continue to have a notable impact on the network into the foreseeable future. Judith Hurwitz, who runs the consulting firm Hurwitz and Associates, says that big data’s role in analytics will be substantial.
“One of the most important trends that is just beginning to impact the network world is machine learning and cognitive computing,” she writes via email. “The ability to analyze massive amounts of data to look for patterns and anomalies [is] changing the way new tools are able to anticipate problems before they can cause outages or intrusions.”
Watson-esque tools that perform complicated analysis require commensurately humongous data sources, so in a very real way, big data is powering what could be a renaissance in network analysis and management.
Nor is that influence limited to the network itself. Matt Roberts, marketing director of telecom management firm Amdocs, says that data from other parts of the infrastructure can be built into an analytics solution.
“We’re moving to these new environments [where] … a lot of the decision-making or human intervention can be done through the use of intelligent analytics,” he says.
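The kind of pattern-and-anomaly analysis Hurwitz describes can be sketched in miniature. The snippet below is a hypothetical illustration, not any vendor's product: a rolling z-score detector that flags a metric sample (say, per-second link latency) when it strays far from its recent history:

```python
from statistics import mean, stdev

def flag_anomalies(samples, window=20, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Hypothetical per-second latency readings (ms) with one spike injected.
latency = [10.0 + 0.5 * (i % 3) for i in range(60)]
latency[45] = 80.0
print(flag_anomalies(latency))  # → [45]
```

Production systems learn baselines with far richer models, but the design choice is the same: compare each new observation against recent history so the tool can raise an alert before a trend becomes an outage.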