Archive for security

Docked!

I admit, I sometimes have weird vacations. I’ve had a few weeks off from work while awaiting the start of my new job. There was a trip to New Orleans (in the summer!) but also time spent watching the livestreams of two tech conferences. A little while back I watched and commented on Apple’s WWDC and, before heading off to NOLA, I tuned into DockerCon. I’m truly a geek. DockerCon is the conference for Docker users. In case you are unaware, Docker is arguably the most-used (or at least best-known) container technology. Containers are a type of virtualization. There are plenty of places to look up containers, so go do that now if you are ill-informed about them.

DockerCon, unlike most conferences I have attended or viewed, is entirely oriented toward technology professionals. Even Microsoft Build and WWDC have more business influence than DockerCon. That’s not unexpected given that Docker’s whole business is centered around developers and sysadmins. It does, however, add a certain flavor to the proceedings. For instance, the speakers seemed to spend an inordinate amount of time talking about why one would use a container. I would have thought that anyone who was at DockerCon was there to understand the “how” and had already figured out the “why”. It was whipped cream on ice cream: generally unnecessary and in the way of the good stuff.

The most interesting part of DockerCon was seeing how far the technology has come in such a short period of time. It’s not just the growth numbers – though there has been phenomenal uptake in Docker container usage – but the rate of evolution of the product itself that is so startling. In two years, Docker has gone from having only the basic container engine to networking and security upgrades along with the addition of plugins and orchestration. The platform choices have also expanded, though much of that is still in beta. Whereas Docker, like most container technologies, has been based on LXC and limited to 64-bit Linux, it is now expanding into Windows and macOS as well as various cloud platforms such as Amazon AWS and Microsoft Azure.

The upshot is that Docker is making itself more attractive for large-scale production environments. Docker 1.12 adds features that are important to deploying containers in production environments, as opposed to developer ones. For example, orchestration will be part of the 1.12 release. Called Swarm, this feature allows large numbers of containers to be instantiated easily and then managed effectively; manual tools are fine for individual developers but not for production environments. Swarm is similar in spirit to Google’s Kubernetes. The upgrades to security are also important to expanding the use of containers into more robust environments. The addition of key management, while mundane, is very important to maintaining secure environments, and Docker 1.12 has it.
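
To make Swarm a little more concrete, here is a minimal sketch of what orchestration looks like from code, using the Docker SDK for Python (docker-py). The image name, service name, and replica count are my own placeholders for illustration, not anything Docker prescribes.

```python
# Minimal sketch: a replicated service on a single-node swarm using
# the Docker SDK for Python (docker-py). Image, service name, and
# replica count are illustrative assumptions.
import docker
from docker.types import ServiceMode

client = docker.from_env()

# Turn this engine into a single-node swarm manager
# (roughly equivalent to `docker swarm init` on the CLI).
client.swarm.init(advertise_addr="127.0.0.1")

# Ask the swarm for 10 replicas of a web container; the swarm
# schedules and supervises them -- no manual `docker run` loop.
service = client.services.create(
    image="nginx:latest",
    name="web",
    mode=ServiceMode("replicated", replicas=10),
)

print(service.name, "created")
```

The point is the division of labor: you declare how many containers you want, and the orchestrator, not the sysadmin, keeps that many running.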

Docker is also introducing a new container format. Typically, containers have encapsulated a single piece of processing. The Distributed Application Bundle, or (terribly nicknamed) DAB, packages many containers together so that a sysadmin can deploy an entire application at once. Not only does this make it easier to deploy a new application, it also makes it much easier to migrate whole applications. Coupled with Swarm, this is a big time saver for the ops crowd. DAB is still experimental, so it isn’t certain whether it will become a permanent feature, but it shows that Docker is thinking the right way.
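
DAB is experimental and I won’t guess at its final format, but here is roughly what a bundle automates, done by hand with the Docker SDK for Python: one private network plus all of the containers that make up an application, deployed in a single step. The bundle contents, network name, and container names are hypothetical.

```python
# Sketch of what a DAB-style bundle automates: deploying an entire
# multi-container application in one step. Network, images, and
# container names here are hypothetical stand-ins.
import docker

client = docker.from_env()

# A tiny "bundle": the set of containers that make up one application.
app_bundle = {
    "db":  {"image": "postgres:9.5"},
    "web": {"image": "nginx:latest", "ports": {"80/tcp": 8080}},
}

# One private network so the application's containers can find each other.
net = client.networks.create("demo-app-net", driver="bridge")

# Deploy every piece of the application at once.
for name, spec in app_bundle.items():
    client.containers.run(
        spec["image"],
        name=f"demo-app-{name}",
        network=net.name,
        ports=spec.get("ports"),
        detach=True,
    )
```

Doing this by hand for two containers is easy; doing it for a fifty-container application, repeatedly, across environments, is exactly the tedium DAB aims to remove.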

The big takeaway from DockerCon is that Docker containers are now ready for the big time. The ecosystem is growing and the product itself has evolved into something that is useful to production environments. Our little container tech has grown up and put on its big-boy pants.

Tackling Complexity and Security – The InformaticaWorld 2015 Big Picture

The message from IT professionals at InformaticaWorld 2015 this past week was pretty clear: complexity is making data management tough to do these days. Cloud and mobile were, in their minds, a great boon to business; both gave access to applications that used to be frozen on desktops. They also meant that data security was more complicated than ever and that the amount and variety of data were rapidly expanding. New IT architectures, such as microservices and containers, were leading to more flexible and easier-to-deploy applications. The unfortunate side effect was data silos of structured, unstructured, and semi-structured data. Add to this mix machine data, a.k.a. dark data (data generated by and for devices and computer systems themselves), and the data landscape has become a complicated mass of different types of data spread throughout thousands of sites, systems, and devices. It almost makes one long for the days when all of a company’s data was in a handful of SQL databases that powered a few applications.

Teasing value from all this data had become a headache, to say the least. If just finding the data an organization needs to analyze is hard, making it useful sometimes seems impossible. Data is dispersed throughout the organization and is often quite dirty, with errors or no clear way to connect one piece of data to another. Thankfully, technology has advanced beyond the data warehouse, where we stuffed aggregate data from a few systems. We can now build data lakes: data repositories with cleansed, prepackaged data and user-friendly query capabilities that can tie together information from many disparate systems. This has had the unfortunate effect of creating a needle-in-the-haystack problem. Business analysts now have access to so much data that it’s easy to drown in the data lake.
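
To see what “tying together disparate systems” means in miniature, here is a toy sketch in Python using pandas. The two “silos,” their schemas, and the cleansing step are all invented for illustration; a real data lake does this at vastly larger scale.

```python
# Toy illustration of the data-lake idea: pull records from two
# disparate "silos", cleanse them into one schema, and query across
# them. Column names and values are made up for the example.
import pandas as pd

# Silo 1: a structured CRM export with one naming convention.
crm = pd.DataFrame({
    "cust_id": [101, 102],
    "name": ["Acme Corp", "Globex"],
})

# Silo 2: semi-structured web-log records with another convention.
weblogs = pd.DataFrame({
    "customer": ["101", "103"],
    "page_views": [42, 7],
})

# Cleansing step: normalize the join key to a single type and name.
weblogs["cust_id"] = weblogs["customer"].astype(int)

# The payoff: one user-friendly query across formerly separate silos.
combined = crm.merge(weblogs[["cust_id", "page_views"]],
                     on="cust_id", how="left")
print(combined)
```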

The same was true of data security. Mobile devices, cloud systems, and containers have made data much more portable and, hence, harder to protect. It used to be that a company could secure its network and critical databases and the data was mostly safe. The sophistication of threats, however, has increased dramatically. More important (and somewhat perverse), making data available to many more business users, in order to get more value out of it, has made managing data security more difficult. Between complexity and security, using an organization’s data to its advantage is, in some ways, harder than it used to be.

And that, of course, was the point of many of the announcements at InformaticaWorld 2015. Project Atlantic is a great example of a forward-thinking product strategy. It looks to harness dark data by converting it into something useful to a human analyst. In an ironic twist, Informatica is using machine learning to transform machine data into something people can understand. Another announcement, Project Sonoma, looks to simplify the management and use of Hadoop-based data lakes. Products like this, along with user-facing tools such as Rev, will make data lakes more accessible, allowing business users to gain value from huge amounts of corporate data. Informatica expects to add streaming data to Project Sonoma in 2016, which should greatly enhance the ability to use Internet of Things and other machine data, as well as streaming social media data, in data lakes. Remember, getting data into a data lake is one thing; making use of it is quite another. Project Sonoma looks to let companies spend more time getting value from data instead of managing it.
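
For a flavor of what “machine learning on machine data” can mean, here is a generic sketch in Python using scikit-learn: clustering raw log lines so an analyst sees a handful of themes instead of millions of rows. To be clear, this is a stand-in technique of my own choosing, not Informatica’s actual method, and the log lines are made up.

```python
# Generic stand-in for applying machine learning to machine data:
# cluster similar log lines so a human analyst reviews a few themes
# rather than raw rows. Illustrative only; not Informatica's
# implementation. The log lines are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

logs = [
    "ERROR disk full on /dev/sda1",
    "ERROR disk full on /dev/sdb2",
    "INFO user alice logged in",
    "INFO user bob logged in",
]

# Turn unstructured log text into numeric features...
features = TfidfVectorizer().fit_transform(logs)

# ...and group similar lines together (2 clusters for this toy data).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for label, line in zip(labels, logs):
    print(label, line)
```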

Finally, Informatica demonstrated a variety of technologies for securing data. Informatica has had data security products, including data masking, for a while but now has a full management layer called Secure@Source. This product provides a dashboard that shows where there are data security flaws and when policies are being violated. It’s a tool for both the DBA and the security administrator, sitting squarely in both the data governance and security fields of IT.

A picture emerges from this conference of a company that is very different from what it was even five years ago. While Master Data Management is still the core business, Informatica has made it clear that it is really the data value company. The mission is to help customers do more with data by making accessing, securing, and integrating data across the enterprise a much easier process. And that is something IT and business users can agree they need.