Archive for November, 2012

EMC prepares for shift in cloud revenue model

22 November 2012 – EMC has been busy befriending cloud-service providers, as they gradually become the storage vendor’s main source of revenue in the cloud-computing segment, according to EMC Australia’s managing director Alister Dias. Through VCE, a joint venture with Cisco, EMC storage equipment forms part of a converged infrastructure offering for cloud computing. As organisations bring the cloud into their business, they are increasingly acquiring equipment and services from cloud-service providers rather than sourcing them directly from a vendor, Dias said.

“When we look at our own business, if you’re a technology vendor, cloud is an opportunity and a threat,” he said. “For example, there is a shift from customers buying technology from us in the way they traditionally did, and just turning to cloud providers.

“The cloud providers therefore become our customers.”

Read more

[VIDEO] Bernard Ourghanlian, Chief Technology and Security Officer, Microsoft France – Big Data Special

21 November 2012 – In this Big Data special, Bernard Ourghanlian, Chief Technology and Security Officer of Microsoft France, explains his vision of Big Data: the three “V”s, the objectives, and the question of privacy:

Read more

How Big Data, cloud computing, Amazon and poll quants won the U.S. election

By: Gregory P. Bufithis, Esq.   Founder/CEO, The Cloud and E-Discovery

15 November 2012 – As Daniel Honan of Big Think pointed out, just like in baseball and politics, there are winners and losers in a data-driven world. The losers in baseball, for instance, are the overrated prospects who will never be drafted because data analysis has a way of finding them out early in their careers. In politics, the biggest loser will be the horse-race pundit, the guy who spins the polls to reinforce one side’s belief that it is winning when it’s actually losing. Sometimes this is done for partisan reasons, in the hope of creating “momentum,” and sometimes it is done to create a more compelling media narrative.

This was indeed a choice election, and the choice was between following entertainment journalism or data-based journalism. As Andrew Beaujon has pointed out, entertainment is fun, and math is hard. Well, math won.
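The data-based approach that won can be illustrated with a minimal poll-aggregation sketch in Python. All poll figures below are hypothetical, and the sample-size weighting is deliberately simple; real forecasting models of the kind used in 2012 also adjust for poll recency and pollster house effects:

```python
# Minimal poll-aggregation sketch: weight each poll by its sample size.
# The poll figures below are hypothetical, not real 2012 data.
polls = [
    {"obama": 50.0, "romney": 47.0, "n": 1200},
    {"obama": 48.5, "romney": 48.0, "n": 800},
    {"obama": 49.2, "romney": 47.5, "n": 1500},
]

def aggregate(polls, candidate):
    """Sample-size-weighted average of a candidate's share across polls."""
    total_n = sum(p["n"] for p in polls)
    return sum(p[candidate] * p["n"] for p in polls) / total_n

margin = aggregate(polls, "obama") - aggregate(polls, "romney")
print(f"weighted margin: {margin:+.2f} points")
```

Averaging across many polls damps the noise of any single survey, which is exactly why the aggregators outperformed pundits who cherry-picked individual polls.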

Data analysis at its best

It is a fascinating area of data analysis.  As part of my neuroinformatics degree program, I recently had the chance … Read more

The Visualizing Global Marathon: student data-visualization enthusiasts at their best

14 November 2012 – As we have pointed out in previous posts, we are at the point of an “industrial revolution of data” with vast amounts of digital information being created, stored and analyzed.  The rise of “big data” has led in turn to an increased demand for tools to both analyze and visualize the information.   We are always looking for great examples.

One such is the Visualizing Global Marathon.  Over 1,000 student data-visualization enthusiasts took part in last weekend’s Visualizing Global Marathon, working in teams to produce graphic illustrations of international disease outbreaks, the global flights network and the US presidential election. More than 100 visualizations were submitted – some static, others interactive – in the hope of winning a portion of the $15,000 prize fund provided by General Electric for the six winning entries across three categories.

The Guardian has a nice review of the Marathon with some examples of the winners.

Also, Alberto Cairo gave a nice presentation (below) entitled “Data Viz 101” for the Marathon.  Alberto is the
Read more

Speeding algorithms by shrinking data: a new approach to processing “big data”

13 November 2012 – Most computer scientists try to make better sense of big data by developing ever-more-efficient algorithms. The proliferation of cheap, Internet-connected sensors — such as the GPS receivers, accelerometers and cameras in smartphones — has meant an explosion of information whose potential uses have barely begun to be explored. In large part, that’s because processing all that data can be prohibitively time-consuming.

But in a paper presented this month at the Association for Computing Machinery’s International Conference on Advances in Geographic Information Systems, MIT researchers take the opposite approach, describing a novel way to represent data so that it takes up much less space in memory but can still be processed in conventional ways. While promising significant computational speedups, the approach could be more generally applicable than other big-data techniques, since it can work with existing algorithms.

In the new paper, the researchers apply their technique to two-dimensional location data generated by GPS receivers, a very natural application that also demonstrates clearly how the technique works. As Daniela Rus, a professor of computer science and engineering and … Read more
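The coreset construction in the MIT paper is not reproduced in the article, but a classic related idea, approximating a GPS trace with a small number of line segments, gives the flavor of how location data can be shrunk while remaining usable. Below is a minimal Douglas-Peucker line-simplification sketch (the track coordinates are hypothetical, and this is not the researchers’ own algorithm):

```python
import math

def perp_dist(p, a, b):
    # Distance from point p to the line through a and b.
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    if length == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dx * (ay - py) - dy * (ax - px)) / length

def simplify(points, eps):
    # Douglas-Peucker: keep the endpoints; if every interior point lies
    # within eps of the chord, drop them all; otherwise split at the
    # farthest point and recurse on both halves.
    if len(points) < 3:
        return points
    dists = [perp_dist(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= eps:
        return [points[0], points[-1]]
    return simplify(points[:i + 1], eps)[:-1] + simplify(points[i:], eps)

# A hypothetical noisy 8-point GPS track, simplified with tolerance 1.0.
track = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(simplify(track, 1.0))
```

The simplified track keeps far fewer points yet can still be fed to any algorithm that expects an ordinary sequence of coordinates, which is the same spirit as the MIT approach: shrink the representation, keep the processing pipeline unchanged.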

Amazon cloud entry in Australia poses legal concerns for business; an attempt to avoid the U.S. Patriot Act?

13 November 2012 – E-commerce giant Amazon’s plans to offer data and computer hosting services through Australian data centres from this week will not indemnify customers against legal action in the United States, legal experts have warned.  Amazon’s hosting division will today announce plans to offer public cloud services – computers and hard drives that companies can lease for a fraction of the cost of purchasing a similarly capable machine – for the first time within Australian borders.

The move has been touted as a “game changer” for high-risk sectors like finance and government, which are traditionally kept from storing critical data outside of Australia. The introduction of Amazon-hosted services in Australia is thought to have been spurred by those concerns, providing local companies with the ability to store data in local facilities rather than data centres in the US, Singapore or Europe.

But lawyers told The Australian Financial Review the move will not immunise local companies from subpoenas issued by US courts or regulators. “The fact that Amazon holds data in Australia makes no difference to its obligation to … Read more

Deloitte launches a student competition on big data

Until January, 70 students from Télécom SudParis, supervised by the audit and consulting firm Deloitte, will be working on the risks and benefits of deploying these new solutions.

12 November 2012 – While big data experts warn of a looming shortage of specialised talent, Deloitte has decided to take the bull by the horns, launching a challenge on the topic for students at Télécom SudParis. “The objective is to train them on a problem of the future – the risks, constraints and benefits of deploying a big data solution – and to spot talent,” explains Nicolas Barbier, Technology Advisory Consulting Manager at the firm.

For the 70 students in the “Information Systems Engineering” major taking part in the challenge, the programme promises to be a full one. “What we are asking them, after giving them some initial leads of course, is to survey the big data landscape, develop implementation methodologies and find new case studies.” Read more

How The Feds Drive Cloud Innovation

12 November 2012 – The coolest cloud computing application in the world — and in our solar system — comes from NASA. The space agency is using commercial cloud services to process the digital images being transmitted to Earth from the Curiosity rover as it searches for signs of life on Mars.

Those images, taken by 17 cameras mounted on the six-wheel, SUV-like rover, are an incredible scientific trove, stored and managed by Amazon Web Services. The most recent images show the rover’s robotic arm taking the first scoops of Martian soil for analysis. NASA’s Jet Propulsion Lab is using a variety of Amazon services – EC2, S3, SimpleDB, Route 53, CloudFront, Relational Database Service, Simple Workflow, CloudFormation, Elastic Load Balancing – to make this happen. And the images are available not just to NASA scientists, but to you and me as well. “The public gets access as soon as we have access,” says Khawaja Shams, manager of data services at JPL.

For more, see InformationWeek.

Read more

Big Data, a new tool in the US presidential campaign

6 November 2012 – What better setting than a presidential election to put Big Data to work? As the United States elects its next president, MicroStrategy and SAP, two leaders in the market for analysing high-volume data, notably data gathered from social networks, decided to showcase their respective technologies by publishing indicators on the electorate of each of the two candidates, Barack Obama and Mitt Romney. Big Data has thus joined the election campaign alongside the opinion polls, which suddenly look rather dated.

“Observers, like the teams of both candidates, still rely on slow, conventional methods when it comes to gauging the candidates’ popularity: opinion polls, generally conducted by telephone, which overlook people whose numbers are unlisted,” notes our German colleague Süddeutsche Zeitung. A conventional method? Too conventional. Analysis of social networks, and of the Big Data these new media generate, offers indicators, of varying precision, based on … Read more
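A toy sketch of the kind of social-media signal these vendors mine, counting candidate mentions across posts. The posts are invented, and production systems from MicroStrategy or SAP do far more (sentiment scoring, deduplication, bot filtering); this shows only the basic counting step:

```python
# Count how often each candidate is mentioned in a stream of posts.
# The posts below are invented for illustration.
posts = [
    "Obama's debate performance was strong tonight",
    "Romney has momentum in Ohio",
    "Polls tighten nationally but Obama still leads",
    "Still undecided after that debate",
]

def mention_counts(posts, candidates):
    counts = {c: 0 for c in candidates}
    for post in posts:
        text = post.lower()
        for c in candidates:
            if c.lower() in text:
                counts[c] += 1
    return counts

print(mention_counts(posts, ["Obama", "Romney"]))
```

Raw mention volume is a crude proxy for attention rather than support, which is why such indicators are described above as being “of varying precision.”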

Cloudonomics: The Economics of Cloud Computing

1 November 2012 – We receive scores of white papers on the cloud.  It is hard to plow through them all.  But here is one from Rackspace, from its CloudU™, a comprehensive Cloud Computing training and education curriculum developed by industry analyst Ben Kepes. We have been fortunate enough to go through some of the units.

In this PDF Rackspace addresses the many reasons for organizations to move from traditional IT infrastructure to Cloud Computing. One of the most cited benefits is the economics of the Cloud. Yet while many people point out the cost savings that Cloud Computing brings to an organization, Rackspace believes attention should be drawn to four distinct mechanisms through which these cost savings are generated:
• By lowering the opportunity cost of running technology
• By allowing for a shift from capital expenditure to operating expenditure
• By lowering the total cost of ownership (TCO) of technology
• By giving organizations the ability to add business value through a renewed focus on core activities
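The shift from capital to operating expenditure in the list above can be made concrete with a toy three-year total-cost comparison; every figure below is hypothetical and chosen purely for illustration:

```python
# Toy 3-year TCO comparison: buying servers (CapEx) versus leasing
# cloud capacity (OpEx). All figures are hypothetical.
YEARS = 3

# On-premises: upfront hardware purchase plus annual power, space
# and administration costs.
capex_hardware = 60_000
annual_opex_onprem = 15_000
tco_onprem = capex_hardware + annual_opex_onprem * YEARS

# Cloud: a recurring monthly fee with no upfront purchase.
monthly_cloud_fee = 2_000
tco_cloud = monthly_cloud_fee * 12 * YEARS

print(f"on-premises 3-year TCO: ${tco_onprem:,}")
print(f"cloud 3-year TCO:       ${tco_cloud:,}")
```

Which side wins depends entirely on the inputs; the point Rackspace makes is that the cloud also removes the opportunity cost of capital tied up in hardware.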

They detail these four mechanisms and introduce several case studies and examples to show

Read more