Archive for the ‘Data Visualization’ Category

Data Visualization: the 3 questions for passing the “eye candy” test

14 February 2014 – Everyone is talking about data visualization. Business users, analysts, lawyers. Data analytics and visualization are not new. For decades, businesses have collected data, analyzed it using a variety of BI tools, and generated reports. The process may take weeks or months, but eventually a few highly trained data analysts are able to pull the necessary figures from their dashboards and issue static, rearview reports to executives and other employees.

But businesses are finding that this traditional reporting process does not work nearly as well for big data, and certainly is not sufficient to capture the potential value that big data represents.

So everyone is increasingly turning to visualization-based data discovery tools that enable a multitude of users to easily integrate or “mash up” data from a wide range of sources — databases, clickstreams, social media, log files, videos, and more. With the aid of high-powered desktops and mobile computing devices, users can perform real-time, predictive analyses and showcase the results, communicating information immediately and visually.

But it is hard to … Read more

How in HELL do you visualize a yottabyte?! Well, by use of a brilliant infographic.


26 November 2013 – Nowadays, data size measurements such as kilobyte and megabyte are commonplace in tech parlance, but a new infographic puts those measurements into context, and takes a look at drive technologies, too. One byte? One character. 10 bytes? One word. A megabyte? A short novel. 10 terabytes? The entire printed collection in the U.S. Library of Congress. Etc., etc., etc.

Its source is our good friends at datascience@berkeley, part of the University of California, Berkeley, who offer a professional Master of Information and Data Science (MIDS) delivered online, with a multidisciplinary curriculum designed to prepare data science professionals to solve real-world problems using complex and unstructured data. For more information just click the link above. Two of our staffers are currently enrolled.

The infographic begins with the humble bit and works its way up in file size all the way to the zettabyte and yottabyte. A yottabyte is equal to 1,000 zettabytes and, according to the infographic, is the size of the entire world wide web.
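The ladder the infographic climbs is easy to reproduce. Here is a minimal Python sketch (the unit names and 1,000-fold decimal steps follow the infographic's convention) that expresses a byte count in the largest sensible unit:

```python
# Decimal (SI) data-size units, each step a factor of 1,000, as in the infographic.
UNITS = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def humanize(num_bytes: int) -> str:
    """Express a byte count in the largest unit that keeps the value >= 1."""
    value = num_bytes
    for unit in UNITS:
        if value < 1000 or unit == UNITS[-1]:
            return f"{value:g} {unit}"
        # Keep exact integer division when possible to avoid float drift.
        value = value // 1000 if value % 1000 == 0 else value / 1000

print(humanize(1))        # -> "1 B"  (one character)
print(humanize(10**15))   # -> "1 PB"
print(humanize(10**24))   # -> "1 YB" (1,000 zettabytes)
```
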


Read more

What Wi-Fi would look like if we could see it

23 July 2013 – Artist and blogger Nickolay Lamm decided to shed some light on the subject of Wi-Fi. He created visualizations that imagine the size, shape, and color of Wi-Fi signals were they visible to the human eye. “I feel that by showing what wi-fi would look like if we could see it, we’d appreciate the technology that we use everyday. A lot of us use technology without appreciating the complexity behind making it work.”

To create them, Lamm worked with M. Browning Vogel, Ph.D., an astrobiologist and former NASA Ames employee. Vogel described the science behind wireless technology, and Lamm used the information to create the visualizations. Vogel then provided captions for each illustration explaining the science of Wi-Fi: the size of a Wi-Fi energy field, and how a signal is transmitted.
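For a rough sense of why a Wi-Fi energy field has a finite size at all, the standard free-space path loss formula shows how quickly a 2.4 GHz signal fades with distance. A back-of-envelope sketch (not drawn from Vogel's captions):

```python
import math

# Free-space path loss (FSPL) in dB for a signal of frequency f over distance d:
#   FSPL(dB) = 20*log10(d_m) + 20*log10(f_Hz) + 20*log10(4*pi / c)
def fspl_db(distance_m: float, freq_hz: float = 2.4e9) -> float:
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

print(f"{fspl_db(1):.1f} dB at 1 m")    # ~40 dB
print(f"{fspl_db(30):.1f} dB at 30 m")  # ~70 dB: the signal fades fast
```

Every tenfold increase in distance costs another 20 dB, which is why a router's field effectively ends a few rooms away.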

Some very cool work here. For the full post plus Lamm’s visualizations, click here. … Read more

The importance of making Big Data accessible to non-data scientists

4 May 2013 – Gartner analyst Doug Laney first coined the term “big data” over 12 years ago, although one suspects — at least in its current form — people have been complaining about “information overload” since Roman times. But the term’s meaning is still far from clear, and it wins continuous nominations in the “Tech Buzzword That Everyone Uses But No One Quite Understands” competition, followed closely by “the cloud”.

When using the term, Gartner usually keeps the quote marks in place (i.e. it’s “big data”, not big data). And as we learned at the Gartner Business Intelligence and Analytics Summit in Barcelona two months ago, Gartner has spent a tremendous amount of time on it. As Gartner analyst Donald Feinberg warned people at the conference, “talking only about big data can lead to self-delusion”, and he urged people not to “surrender to the hype-ocracy.”

NOTE: next month we’ll have a chance to talk more about “big data” with Gartner analyst Debra Logan, along with Jason R. Baron, when our video crew travels to Rome to interview … Read more

Visualizing the Paris metro system

16 April 2013 – Data visualization group Dataveyes takes a closer look at the Paris metro system from a time and crowd point of view:

This visualization offers to challenge the way we traditionally view our 2D metro maps. Métropolitain takes on an unexpected gamble: using cold, abstract figures to take the pulse of a hectic and feverish metropolis. The metro map is no longer arbitrarily dictated by the spatial distance between two points. By playing around with two extra variables — time and crowds — users can transform the map, view it in 3D and unveil the true reality behind their daily commute.

No doubt inspired by the Travel Time Tube Map of the London Underground by Tom Carden, Métropolitain lets you select a station and the lines morph to represent how long it takes to get to other stations. A layer underneath is a heatmap that shows annual incoming traffic per station.
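The underlying computation in a travel-time map of this sort is just shortest paths from the selected station. A toy sketch in Python, with invented station names and times (not Dataveyes’ actual data or code):

```python
import heapq

# Edge weights are travel times in minutes between adjacent stations.
# Station names and times below are invented for illustration.
metro = {
    "Chatelet": {"Louvre": 2, "Bastille": 4},
    "Louvre":   {"Chatelet": 2, "Concorde": 3},
    "Concorde": {"Louvre": 3},
    "Bastille": {"Chatelet": 4},
}

def travel_times(origin: str) -> dict:
    """Minutes from `origin` to every reachable station (plain Dijkstra)."""
    times = {origin: 0}
    queue = [(0, origin)]
    while queue:
        t, station = heapq.heappop(queue)
        if t > times.get(station, float("inf")):
            continue  # stale queue entry
        for nxt, minutes in metro[station].items():
            if t + minutes < times.get(nxt, float("inf")):
                times[nxt] = t + minutes
                heapq.heappush(queue, (t + minutes, nxt))
    return times

print(travel_times("Chatelet"))
# -> {'Chatelet': 0, 'Louvre': 2, 'Bastille': 4, 'Concorde': 5}
```

Selecting a station and morphing the map amounts to re-running this from a new origin and scaling the drawn distances by the resulting times.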

Finally, you can switch between 2-D and 3-D. I’m not sure if the extra dimension adds much from an understanding point of view, but it is … Read more

The power of data visualization’s “Aha!” moments: an interview with Amanda Cox of The New York Times



19 March 2013 – Amanda Cox has been a graphics editor at the New York Times for eight years. Trained as a statistician, Cox develops visualizations across platforms, from simple print infographics to highly complex online interactive data tools. The Times is a visualization leader, but Cox believes the best is yet to come from this discipline, which she calls “both young and not young.” In an interview with the Harvard Business Review Blog Network Cox spoke about the Times’ approach to visualization and the power of “Aha!” moments:

Do you think data visualization is entering a time when it’s becoming a core communication tool?

I wish there were more examples in the high-end data viz world to back that up. I wish there were more examples where data viz actually mattered. The case studies for us to lean on are sparser than they should be. On the other hand, you can argue it’s a young field and people are doing all kinds of crazy interesting things, and that’s a good thing. There’s that classic idea that it’s useful … Read more

Want to analyze the President’s State of the Union speech? Then use some data visualization software

13 February 2013 – The State of the Union address by the U.S. President is a particularly apt data set to explore for clues about the change in political language: it is a remarkably consistent form available annually over the entire history of the United States. Article II Section 3 of the Constitution inaugurates the practice:

[The President] shall from time to time give to the Congress Information of the State of the Union, and recommend to their Consideration such Measures as he shall judge necessary and expedient …

Very quickly, the address acquired a conventional form as a yearly message delivered by the office of the president, at the beginning of the year, to the Congress, the representatives of the people. This consistent structure, which endures over the course of U.S. history, is what allows for a useful comparison between actual instances of the address. Such comparisons would be difficult to make otherwise between more random fragments of political discourse not regulated by a uniform temporal frame and an archetypal structure of address.
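That uniform structure is precisely what makes simple computational comparisons meaningful. As a toy illustration (the two snippets below are invented stand-ins for full transcripts), one can tally word frequencies and see which terms shifted between two addresses:

```python
from collections import Counter
import re

def word_counts(text: str) -> Counter:
    """Lowercase word-frequency tally of a speech text."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

# Invented stand-ins for two full State of the Union transcripts.
speech_1900 = "the union is strong and the nation prospers"
speech_2000 = "the union is strong and the economy grows"

c1, c2 = word_counts(speech_1900), word_counts(speech_2000)
# Words whose frequency changed between the two addresses.
shifted = {w: c2[w] - c1[w] for w in (c1.keys() | c2.keys()) if c2[w] != c1[w]}
print(shifted)  # positive = more common in the later address
```
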

In the ceremonious address in congress, … Read more

Mapping translations of “Othello”: another great use of data analysis/data visualization

8 February 2013 – I am always on the lookout for examples of data analysis, algorithms and data visualization used in unique ways. Last week at a data analysis/data visualization workshop in London, I learned about a collaborative, multi-disciplinary project called “Version Variation Visualisation” (VVV), delivered through a venture called TransVis, which takes a digital humanities approach to analyzing the multiplicity of Shakespeare re-translations. Its data is nowhere near complete, but it is a starting point for the creation of a global map of Shakespeare’s influence in the world.

TransVis collects, digitizes, analyses and compares translations and variations of literary works. In an initial prototype, VVV uses analysis methods, interfaces and visualization tools to explore 37 translations of Shakespeare’s Othello into German, with works translated into other languages to come. It is a joint effort by Tom Cheesman of Swansea University, along with Kevin Flanagan and Studio NAND.

The map is more of a browser to see where specific publications were written, rewritten and published. All culturally important works are translated over and over again. The differences are … Read more
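One basic building block for comparing re-translations is a plain similarity score between two renderings of the same line. A toy illustration in Python; the two German renderings below are invented examples, not drawn from the VVV corpus:

```python
import difflib

# Two hypothetical German renderings of the same Othello line.
line_a = "Loescht das Licht, und dann loescht ihr Licht"
line_b = "Loescht erst das Licht und loescht dann ihr Licht"

# SequenceMatcher.ratio() returns 1.0 for identical strings, 0.0 for disjoint ones.
ratio = difflib.SequenceMatcher(None, line_a, line_b).ratio()
print(f"similarity: {ratio:.2f}")
```

Scoring every pair of translations this way, line by line, yields exactly the kind of variation data a project like VVV can then map and visualize.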

How Big Data, cloud computing, Amazon and poll quants won the U.S. election

By: Gregory P. Bufithis, Esq.   Founder/CEO, The Cloud and E-Discovery

15 November 2012 –   As Daniel Honan of Big Think pointed out, just like in baseball and politics, there are winners and losers in a data-driven world. The losers in baseball, for instance, are the over-rated prospects who will never be drafted because data analysis has a way of finding them out early on in their careers. In politics, the biggest loser will be the horse race pundit, the guy who spins the polls to reinforce one side’s belief that it is winning when it’s actually losing. Sometimes this is done for partisan reasons, in the hope of creating “momentum,” and sometimes it is done to create a more compelling media narrative.

This was indeed a choice election, and the choice was between following entertainment journalism or data-based journalism. As Andrew Beaujon has pointed out, entertainment is fun, and math is hard. Well, math won.

Data analysis at its best

It is a fascinating area of data analysis.  As part of my neuroinformatics degree program, I recently had the chance … Read more

The Visualizing Global Marathon: student data-visualization enthusiasts at their best

14 November 2012 – As we have pointed out in previous posts, we are at the point of an “industrial revolution of data” with vast amounts of digital information being created, stored and analyzed.  The rise of “big data” has led in turn to an increased demand for tools to both analyze and visualize the information.   We are always looking for great examples.

One such example is the Visualizing Global Marathon. Over 1,000 student data-visualization enthusiasts took part in last weekend’s Visualizing Global Marathon (click here), working in teams to produce graphic illustrations of international disease outbreaks, the global flights network and the US presidential election. More than 100 visualizations were submitted – some static, others interactive – in the hope of winning a portion of the $15,000 prize fund provided by General Electric for the six winning entries across three categories.

The Guardian has a nice review of the Marathon with some examples of the winners (click here).

Also, Alberto Cairo gave a nice presentation (below) entitled “Data Viz 101” for the Marathon.  Alberto is the
Read more