Archive for July, 2012

The Department of Defense Cloud Computing Strategy

25 July 2012 – The U.S. Department of Defense released its cloud computing strategy, which will move the department’s current network applications from a duplicative, cumbersome, and costly set of application silos to an end state designed to create a more agile, secure, and cost-effective service environment that can rapidly respond to changing mission needs. In addition, the Defense Information Systems Agency (DISA) has been named the enterprise cloud service broker to help maintain mission assurance and information interoperability within this new strategy.

“We are moving to an enterprise cloud environment that provides tangible benefits across the department by supporting the delivery of the joint information environment, from the continental United States to the warfighter at the tactical edge. This strategy lays the groundwork, as part of the Joint Information Environment framework, for achieving cloud adoption within the department,” said Teri Takai, Defense Department chief information officer. “It focuses on the creation of department core data centers, enterprise cloud infrastructure and sustainment of cloud services.”

We have read it and it is quite a remarkable document.  For an … Read more

Webinar on July 24th: Big Data and Hadoop – an introduction

Webinar date: Tuesday, July 24, 2012
Time: 11am Eastern; 8am Pacific
Duration: 1 Hour

Carahsoft is hosting a webinar that will open with an introduction to and context on the topic of Big Data and Hadoop, then dive into details from Omer Trajman, VP of Technology Solutions at Cloudera. Omer will provide an update on CDH and Cloudera Enterprise, two of the most popular capabilities for fielding Hadoop in production environments.

Apache Hadoop is a software framework that supports data-intensive distributed applications under a free license (Apache License 2.0). Hadoop has been the driving force behind the Big Data industry. It enables applications to work over massive quantities of data with built-in support for reliability, scalability, and functionality beyond what legacy data tools offer.

Hadoop’s MapReduce programming model may be exactly what you need if you have to make sense of large quantities of data. The Hadoop ecosystem’s HBase and Hive also make it easier to store data and to interact with it using familiar SQL-style queries and traditional tools.
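For readers new to the MapReduce model mentioned above, a minimal word-count sketch in plain Python may help. This only simulates the map and reduce phases locally in one process; a real Hadoop job would distribute the same two phases across a cluster.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map step: emit a (word, 1) pair for every word in one input document.
    for word in document.lower().split():
        yield (word, 1)

def reduce_phase(pairs):
    # Reduce step: sum the counts emitted for each distinct word.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

# Two tiny "documents" stand in for files spread across a cluster.
documents = ["big data needs big tools", "hadoop handles big data"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
word_counts = reduce_phase(pairs)
```

In Hadoop the framework handles the shuffle between the two phases, grouping all pairs with the same key onto the same reducer, which is what lets this simple pattern scale to massive data sets.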

Please join publisher Bob Gourley as he provides context on the … Read more

Well played, EMC

17 July 2012 – As expected, EMC shuffled the deck today. It named COO Pat Gelsinger as the new VMware CEO, replacing Paul Maritz, who has moved into a new role as EMC’s chief strategist.  It had been reported that Maritz was out as VMware’s CEO and was being considered as a replacement for EMC CEO Joe Tucci, who is expected to retire by the end of next year.  Maritz was a candidate to lead Cloud Foundry, the successful open-source platform as a service (PaaS) that has the potential to become a central part of EMC’s cloud offering.  It has been reported that Cloud Foundry will be spun out as a subsidiary of EMC; VMware is already a subsidiary of EMC.

For an analysis from GigaOM click here.… Read more

Data Display vs. Data Visualization

15 July 2012 – Gregor Aisch recently wrote a posting about gauges, and how he finds them inspiring and beautiful in their simplicity, even though they are generally disliked in visualization. His posting highlights a common misconception about visualization, and a conflation of different uses of data display, that is worth exploring. Gregor takes issue with the notion that visualization requires a certain number of data points to be displayed, and considers “breaking those rules” by showing just a single data point in a chart. You can split hairs over how many data points you need, but the difference is a qualitative one: visualization shows much more data, usually including a lot of history (if there is a time axis), and sometimes even the future (i.e., forecasts). The tasks a visualization serves are also very different, because they are typically far more complex than simple comparisons.

For more click here.

Cloud Computing: New Article 29 Working Party Opinion

3 July 2012 – The Article 29 Working Party has adopted a new Opinion which ‘analyses all relevant issues for cloud computing service providers operating in the EEA and their clients’.

The Article 29 Working Party, which is made up of the various European data protection authorities and acts as an independent European advisory body on data protection and privacy, adopted the Opinion on cloud computing on 1 July.

The Opinion may be read by clicking here.

The Executive Summary is as follows:

In this Opinion the Article 29 Working Party analyses all relevant issues for cloud computing service providers operating in the European Economic Area (EEA) and their clients specifying all applicable principles from the EU Data Protection Directive (95/46/EC) and the e-privacy Directive 2002/58/EC (as revised by 2009/136/EC) where relevant.

Despite the acknowledged benefits of cloud computing in both economic and societal terms, this Opinion outlines how the wide scale deployment of cloud computing services can trigger a number of data protection risks, mainly a lack of control over personal data as well as insufficient information with Read more