Archive for February, 2012

For Big Data and analytics, it’s initial public offering time

23 February 2012 – Despite all its grandeur and prominence in “the cloud”, the financial markets have not been kind to Salesforce.com. Earnings were off and its stock suffered mightily for it. But, ah, what a difference a quarter makes. Let’s put it simply: Salesforce killed it this quarter, and the shares are soaring after hours. As of 5:40 pm ET, Salesforce shares are trading at $146 even, up $14.23 or nearly 11 percent. That is what we call a bullish pattern.

So what happened? Results beat the consensus, for one thing. Earnings per share were 43 cents on a non-GAAP basis, beating the consensus of 40 cents. Sales were $632 million, knocking the consensus of $624 million on its hindquarters.

Then came the guidance, which was in truth a mixed bag. The revenue forecast stomped on the consensus: Salesforce said it expects revenue of $673 million to $678 million, well ahead of the $663 million consensus. But? The EPS forecast is a bit off at 33 to 34 cents, below the 36-cent consensus.

So what’s got investors so giddy? … Read more
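For the numerically inclined, the math above checks out. A quick back-of-the-envelope in Python, using only the figures quoted in the post (nothing else assumed):

# Sanity-checking the Salesforce after-hours numbers quoted above.

after_hours_price = 146.00    # $/share as of 5:40 pm ET
gain = 14.23                  # after-hours change, $

prior_close = after_hours_price - gain
print(f"Prior close: ${prior_close:.2f}")              # $131.77
print(f"After-hours gain: {gain / prior_close:.1%}")   # 10.8%, i.e. "nearly 11 percent"

# The beats versus consensus
eps_actual, eps_consensus = 0.43, 0.40    # non-GAAP EPS, $
rev_actual, rev_consensus = 632, 624      # revenue, $ millions
print(f"EPS beat: {round((eps_actual - eps_consensus) * 100)} cents")
print(f"Revenue beat: ${rev_actual - rev_consensus} million")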

IBM, Watson, Big Data and exponential technologies

22 February 2012 – No matter how much we gush and carry on about the amount of machine data in the world, as we recently did at LegalTech 2012 (click here), it is all just a “junior partner” in the data world. The new “Big Think” is not machine data. The volume of data we’re generating now from machines pales in comparison to the volume of data we’ll soon generate (indeed, are already generating) from our own bodies.

My brain waves, my temperature, my pulse, my heart rate variability, my galvanic skin resistance, the number of steps I take, what I eat, what I breathe, who I talked to, my hormone levels, how happy I was, my brain’s efficiency at any time, and anything else I can think of, stored in a very large, very secure, very friendly cloud analytics application. Shared anonymously with any researcher who is doing something cool. Or my physician. That’s the big vision. The new consumer-grade medtech offerings and other technologies already out there, and in development, are astounding. The human body is a blank … Read more

Free webcast: Introduction to Virtualization — for you e-discovery mavens

21 February 2012 – If you attended LegalTech 2012 a few weeks ago, you had the opportunity to discuss with several vendors the e-discovery issues involving virtual servers, desktops and storage, or to see some of the brilliant off-the-floor presentations (as always at LegalTech, there is more happening away from the event than at the event).

We first learned about the issues and became involved in learning the technology waaaaaay back in the old days … 2008 … when Autonomy’s ZANTAZ division incorporated into its e-discovery software the functionality of finding information stored in virtual environments.

The functionality came in the form of a module that could search multiple virtual file servers running a number of operating systems, freeing IT shops from doing the work manually. Its development was a no-brainer: given the “extreme” type of rulings that were coming out of the e-discovery realm … court orders that a company preserve its RAM data … virtualized information had to be factored into a company’s efforts to comply with the Federal Rules of Civil Procedure concerning electronically stored information. Through … Read more
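To make that concrete, here is a deliberately naive sketch in Python of what “searching a virtual file server” reduces to once a guest’s virtual disk has been mounted read-only on a host. To be clear, this is our illustration, not Autonomy’s actual module: a production e-discovery tool also handles file metadata, encodings, deleted files and chain-of-custody logging, and the mount points below are hypothetical.

import os

def search_mounted_vm(mount_point, term):
    """Walk a read-only mount of a virtual disk and return paths of
    files whose contents contain the search term (case-insensitive)."""
    hits = []
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "r", errors="ignore") as f:
                    if term.lower() in f.read().lower():
                        hits.append(path)
            except OSError:
                continue  # unreadable file; a real tool would log this, not skip it silently
    return hits

# Hypothetical usage, one mount per virtual file server:
# for mnt in ("/mnt/vmfs01", "/mnt/vmfs02"):
#     print(search_mounted_vm(mnt, "document retention"))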

Graphic proof of big demand for “Big Data” talent

18 February 2012 – Two back-to-back articles from Gigaom on the demand for “Big Data” skills:

1. The graphic evidence that “Big Data” skills are very much in demand: postings for big data jobs have skyrocketed since January 2010. Just take a look at this hockey stick of big data job listings, courtesy of Indeed.com (click here).

2. If you can claim to be a data scientist and have the chops to back that up, you can pretty much write your own ticket even in this tough job market. There is huge demand for data scientists, or for anyone who can demonstrate other “Big Data” skills. The reason? Companies in all industries now understand that they need to make better sense of the massive data sets at their disposal — data sets that can include computer log files, social networking feeds, digital video or audio, you name it. That’s led to a spike in demand for data scientists — professionals who understand math and statistics but also have a flair for “art.” They understand … Read more

The Zettabyte: the coin of the realm in cloud computing

15 February 2012 –  Roger Strukhoff is a former Publisher at IDG and Guest Lecturer at MIT.  We met him at a cloud conference last year.  He has become one of our favorite “follows” on Twitter.  He splits most of his time between Silicon Valley and Southeast Asia.  In a recent post he writes:

“I’m more of a words-and-numbers person than a graphically oriented one, but do find occasional insight in a number of these wild cloud infographics that are going around. One of them, from Cisco’s Global Cloud Index, recently caught my attention – because of some of the incredible numbers it contained. According to Cisco, global datacenter traffic will more than quadruple in size between 2010 and 2015, from 1.1 to 4.8 zettabytes. About a third of that will be cloud-based traffic by 2015, compared to 11 percent in 2010, and will represent a growth of 12X over the five-year period. (Note: These numbers differ a bit from a Cisco report I cited earlier this month, but the awe-inspiring scale is the same.) We can imagine we’re already … Read more
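Cisco’s numbers hold up to a quick check. Using nothing beyond the figures in the quote above:

# Sanity check of the Cisco Global Cloud Index figures quoted above.

total_2010, total_2015 = 1.1, 4.8          # zettabytes of global datacenter traffic
cloud_share_2010, cloud_share_2015 = 0.11, 1 / 3

print(f"Total traffic growth: {total_2015 / total_2010:.1f}x")   # 4.4x -- "more than quadruple"

cloud_2010 = total_2010 * cloud_share_2010
cloud_2015 = total_2015 * cloud_share_2015
print(f"Cloud traffic: {cloud_2010:.2f} ZB -> {cloud_2015:.2f} ZB "
      f"({cloud_2015 / cloud_2010:.0f}x)")
# ~13x from these rounded inputs; Cisco's own base figures (which, as
# Roger notes, differ a bit between reports) give the 12X he cites.

cagr = (total_2015 / total_2010) ** (1 / 5) - 1
print(f"Implied compound annual growth, 2010-2015: {cagr:.0%}")   # ~34% per year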

How big data can tackle commercial building energy

15 February 2012 – There’s a tremendous opportunity to reduce energy consumption in commercial buildings, which currently account for more than 40 percent of the nation’s total energy use. According to Pike Research, energy efficiency projects could eventually save $40 billion in commercial building energy spending each year. However, at present investment rates, only a small part of that value will be captured.

So why are projects with positive returns going unfunded? Beyond limited access to financing, a large reason is that market participants – energy service providers, utilities, building owners and investors – don’t know which projects to fund. The standard process of manually identifying energy conservation measures is just too slow and expensive, not to mention unsystematic.

But a new breed of data analytics software solutions is aiming to break this gridlock. Using sophisticated algorithms to analyze large building data sets, these tools can rapidly evaluate energy savings opportunities at low cost, prioritize buildings by how much energy they can save, and deliver efficiency recommendations – all with minimal human involvement.
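To show what “prioritized by how much energy they can save” might look like, here is a toy sketch. Everything in it (the buildings, the peer benchmark, the tariff and the simple excess-over-benchmark savings model) is hypothetical; real analytics platforms use far richer models, with weather normalization, interval meter data and building-system simulation.

# Toy illustration of ranking a building portfolio by estimated savings.
# Buildings, benchmark, tariff and the 'excess over benchmark' savings
# model are all hypothetical.

buildings = [
    # (name, annual kWh, floor area in sq ft)
    ("HQ Tower",     4_200_000, 180_000),
    ("Plant B",      2_900_000,  90_000),
    ("Retail Annex",   600_000,  40_000),
]

PEER_BENCHMARK_KWH_PER_SQFT = 18.0   # assumed peer-group energy intensity
TARIFF_USD_PER_KWH = 0.11            # assumed blended electricity rate

def estimated_savings(kwh, sqft):
    """Dollars/year recoverable if the building were brought down to
    the peer benchmark intensity (zero if already below it)."""
    excess_kwh = max(0.0, kwh - PEER_BENCHMARK_KWH_PER_SQFT * sqft)
    return excess_kwh * TARIFF_USD_PER_KWH

ranked = sorted(buildings, key=lambda b: estimated_savings(b[1], b[2]), reverse=True)
for name, kwh, sqft in ranked:
    print(f"{name}: ~${estimated_savings(kwh, sqft):,.0f}/yr potential savings")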

For more … Read more

EMC to open Cloud and “Big Data” R&D Center in Russia

9 February 2012 – EMC announced yesterday its plans to establish an R&D center in the Skolkovo Foundation’s Innovation Hub in Russia that will focus on development of cloud infrastructure solutions and Big Data analytics technologies for bioinformatics and energy efficiency. EMC also plans to collaborate on research projects in these and other areas with Russian universities, government agencies, and local and multinational companies in the Skolkovo community. The new EMC facility will be located at the Skolkovo Innovation Center and will work in close cooperation with EMC’s existing R&D center in St. Petersburg.

The Skolkovo Foundation is a non-profit organization focused on establishing an innovation hub to stimulate innovative entrepreneurship that will benefit Russia and the global economy. Skolkovo’s goal is to leverage Russia’s resources in the field of contemporary applied research and create a favorable environment for undertaking scientific developments in five priority areas of technological development: power engineering and energy efficiency, space, bioinformatics, nuclear and computer technologies.

Skolkovo is governed by a special law, which gives its resident companies special economic conditions for running their businesses. More … Read more

One on One: Paul Maritz, VMware Chief Executive

7 February 2012 – Paul Maritz has been chief executive of VMware since July 2008. VMware’s server virtualization has made possible a vast consolidation of the computer business toward commodity chips and open-source software. It was also critical in the shift to “cloud” computing. A native of Rhodesia (now Zimbabwe), Mr. Maritz joined Intel in 1981. From 1986 to 2000 he was at Microsoft, serving on its executive committee and overseeing Microsoft’s expansion into client-server corporate computing. He then founded a computing company that was acquired in 2008 by EMC, which also owns VMware.

For an interview with Mr. Maritz conducted by the New York Times, click here. … Read more

“Big Data” designated “Enabler” in Deloitte’s 3rd Annual Tech Trends Report

3 February 2012 – Big Data … not surprisingly … features in Deloitte’s just-released annual “Tech Trends” report identifying the 10 trends most likely to have an impact on CIOs in the coming year and beyond. “It’s an uncommon time to have so many emerging forces – all rapidly evolving, technology-centric and each already impacting business so strongly,” said Mark White, principal and chief technology officer, Deloitte Consulting LLP, as the report was announced. He continued:

“The convergence of five forces offers a new set of tools, opening the door to a new set of rules for operations, performance and competition. Our report outlines the opportunity for IT to truly help elevate business performance.”

The five emerging technology forces he was referring to are analytics, mobility, social, cloud and cyber security. Together, said White, they give businesses the opportunity to accelerate performance in 2012.

Deloitte’s 3rd annual Tech Trends report, Elevate IT for Digital Business, identifies the top 10 technology trends with the most potential to impact businesses over the next 18-24 months, grouping the … Read more

Investors and users beware: Facebook is all about IT (and that 30-petabyte Hadoop cluster)

By: Gregory P. Bufithis, Esq.

2 February 2012 – The nerve of Facebook. Filing its S-1 during LegalTech, when many of us were deeply ensconced in the issues surrounding the cloud and e-discovery.

As Gigaom points out, by now every statement in Facebook’s S-1 filing has already been pored over to death (I downloaded the S-1 to my iPad and enjoyed it over dinner last night) … published, blogged and analyzed. Karsten Weide, a technology analyst with IDC, posted an analysis last night that reflected what many of the number crunchers have been saying:

“This filing implies Facebook is valued at $100 billion, which I think is too high. That’s about 27 times their 2011 revenue. But even assuming they can double revenue this year, I think it’s too high. It’s reminiscent of the valuations for stocks in the Internet 1.0 days. Even if it were valued at just $80 billion, I think it would be too high. There are a number of challenges and risks Facebook faces, and one is the growth of Google Plus … Read more
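Weide’s multiples are easy to verify from the figures in his own quote (a quick check, using nothing beyond the numbers he cites):

# Checking the valuation multiples in Weide's quote.

valuation = 100e9    # implied valuation, $
multiple = 27        # "about 27 times their 2011 revenue"

implied_2011_revenue = valuation / multiple
print(f"Implied 2011 revenue: ${implied_2011_revenue / 1e9:.1f} billion")
# -> ~$3.7 billion; the S-1 itself reports $3.71 billion of 2011 revenue,
#    so the multiple checks out.

# Even if revenue doubles this year, the multiple is still steep:
print(f"Multiple on doubled revenue: {valuation / (implied_2011_revenue * 2):.1f}x")  # 13.5x

# And at the $80 billion Weide still calls too high:
print(f"$80B on 2011 revenue: {80e9 / implied_2011_revenue:.1f}x")                    # 21.6x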