
Yogi Schulz

With its vast volumes, big data is useless without the analytic and presentation functionality of visual analytics tools.

Think how difficult it is to spot anomalies or trends in endless rows of spreadsheet data. Visual analytic tools solve that overload problem.
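A tiny sketch makes the point. The daily sales figures below are invented for illustration: one anomalous day is nearly invisible when you scan the raw rows, but it jumps out the instant the same series is charted or screened.

```python
# Hypothetical daily sales figures for one month; day 17 hides an anomaly.
sales = [100 + (i % 7) for i in range(30)]
sales[17] = 480  # the spike, buried in "endless rows" of similar numbers

# Scanning the rows by eye is slow and error-prone. A line chart of this
# series makes the spike obvious at a glance; the equivalent robust check
# flags the day that deviates most from the median.
median = sorted(sales)[len(sales) // 2]
deviations = [abs(v - median) for v in sales]
anomaly_day = deviations.index(max(deviations))
print(anomaly_day)  # -> 17
```

A visual analytics tool performs this kind of screening automatically, across millions of rows, and presents the result as a picture rather than a number.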

Big data is an in-vogue topic in the information technology world. Many executives see that it opens up transformative possibilities for products, services and markets.

This understanding has sparked massive investment in software for visual analytics and business intelligence, and in related cloud-based services. Sales of this software are projected to grow significantly: Wikibon, a community of business technology practitioners, projects the big data market will top US$84 billion in 2026, a 17 percent compound annual growth rate over the 2011-to-2026 forecast period.

Big data refers to the vast data volumes being produced by:

  • the Internet of Things (IoT), a term for the rapidly growing number of sensors in electronic devices such as smartphones, cars, appliances, industrial machinery, satellites and airplanes;
  • everyone’s social media conversations, web browsing and search history;
  • business transactions as captured by point-of-sale terminals, loyalty programs and debit/credit cards;
  • organizations of all types and sizes adding content to websites;
  • business applications and documents;
  • data, reports and presentations created by governments, businesses, and non-governmental organizations.

Because of its large volume, big data is difficult to analyze meaningfully for business value. Let’s assess the tools many organizations routinely use to analyze their data and see how successful each approach really is.

Visual analytics is just right for big data

Visual analytics is a component of business intelligence software that emphasizes:

  • visualizations or charts as output;
  • a point-and-click graphical user interface for remarkably easy development.

Visualizations are valuable because they display a lot of data in an easy-to-understand format that works well for our visually oriented minds.

Business intelligence software is a set of tools for acquiring and transforming raw data into meaningful and valuable information for business analysis and improvement.


Overall, business intelligence software packages are growing suites of software that:

  • can process ginormous data volumes;
  • integrate data from multiple, disparate data sources;
  • offer a wide variety of visualization types that provide considerable formatting control and the ability to customize default values;
  • include ad hoc queries, charts, dashboards, proactive intelligence and alerts;
  • model, forecast and project data;
  • offer rich reporting services to build and publish interactive reports;
  • are self-serve, meaning end-users can develop their own reports and visualizations;
  • include data governance, security and role-based access to data.

However, business intelligence is undermined by a lack of:

  • adequate data quality, evident in wrong or missing values;
  • secondary or supporting data;
  • analysis staffing;
  • organizational openness to new results.

Data mining is too complicated for most uses of big data

Data mining is the automatic or semi-automatic analysis of big data to extract previously unknown patterns that may be useful for business improvement. Data mining techniques include artificial intelligence and machine learning.
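As a deliberately toy illustration of pattern extraction, counting product pairs across hypothetical point-of-sale baskets surfaces items frequently bought together, a classic data mining association. All names and numbers below are invented:

```python
from collections import Counter
from itertools import combinations

# Hypothetical shopping baskets from point-of-sale data.
baskets = [
    {"bread", "butter", "jam"},
    {"bread", "butter"},
    {"beer", "chips"},
    {"bread", "butter", "milk"},
    {"beer", "chips", "salsa"},
]

# Count how often each pair of products appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pair is a previously unknown association worth a look.
top_pair, count = pair_counts.most_common(1)[0]
print(top_pair, count)  # -> ('bread', 'butter') 3
```

Real data mining applies far more sophisticated statistics and machine learning to far larger data sets, which is exactly why it demands specialist skills.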

Overall, data mining:

  • focuses on the sophisticated exploration of data;
  • offers unexpected insights.

However, data mining:

  • requires expensive specialists such as data scientists for effective operation;
  • depends on traditional statistical methods that are unsuitable for the vast volumes of big data;
  • creates observations and recommendations that may be difficult for the typical business management audience to relate to;
  • produces unsupportable results when guided by preconceived notions.

Classical reporting is too ponderous for big data

Some organizations have made significant investments in software tools and development to produce a rich library of reusable data analysis reports where the end-user can dynamically vary the data selection criteria. Leading examples are SAP Crystal Reports and Oracle Reports.

Overall, reusable reports:

  • produce great-looking output for routine queries;
  • deliver reliable, consistent results because the information systems department carefully developed and tested the reports;
  • can be efficiently produced when the data volume is modest and the number of data sources is low.

However, reusable reports:

  • aren’t self-serve, even though constantly changing requirements would make that helpful;
  • require software developer skills to develop and enhance, so their success is entirely dependent on the information systems department’s capacity and responsiveness;
  • require software maintenance when the versions of the underlying applications are upgraded;
  • don’t support the exploration of the data;
  • are often missing basic graphing functionality;
  • will choke when there are large volumes of data or the number of data sources grows;
  • tend to proliferate over time as many versions with minor differences are created.

Excel is too restrictive for big data

We’ve all heard that Excel is the leading tool for data analytics. It’s widely and successfully used by organizations of all sizes for primitive applications, simple tools and modest data volumes. Excel is also widely used as a powerful personal productivity tool within many larger organizations where information systems department responsiveness is a problem.

Overall, Excel is:

  • cheap;
  • self-serve;
  • fast for developing quick analysis of modest amounts of data;
  • a marvellously flexible tool for manipulating data;
  • incredibly easy to use to graph modest amounts of data.

However, Excel:

  • severely limits the data volume it can successfully query;
  • restricts the number of data sources it can access;
  • uses syntax that makes programming and debugging difficult;
  • too often delivers misleading or inconsistent results due to software defects;
  • handles missing values inconsistently;
  • produces primitive output;
  • isn’t scalable to multiple end-users;
  • is devoid of enterprise-level management features.
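The volume restriction is concrete: an Excel worksheet holds at most 1,048,576 rows (2 to the power of 20), so even a modest big-data extract spills across multiple sheets or must be processed outside Excel entirely. A quick sketch, with hypothetical row counts:

```python
# Excel worksheets cap out at 1,048,576 rows, so larger data sets must be
# split across sheets or handled by other tools.
EXCEL_MAX_ROWS = 2 ** 20  # 1,048,576

def sheets_needed(row_count: int) -> int:
    """How many worksheets a data set of row_count rows would occupy."""
    return -(-row_count // EXCEL_MAX_ROWS)  # ceiling division

print(sheets_needed(5_000_000))  # -> 5
```

Five million rows is small by big data standards, yet it already overwhelms a single worksheet; business intelligence platforms sidestep this by querying the data where it lives.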

Ultimately, then, visual analytics is dramatically superior to the alternatives for achieving business value from big data.

Yogi Schulz has over 40 years of information technology experience in various industries. Yogi works extensively in the petroleum industry. He manages projects that arise from changes in business requirements, the need to leverage technology opportunities, and mergers. His specialties include IT strategy, web strategy and project management.

Yogi is a Troy Media contributor.

The opinions expressed by our columnists and contributors are theirs alone and do not inherently or expressly reflect the views of our publication.

© Troy Media
Troy Media is an editorial content provider to media outlets and its own hosted community news outlets across Canada.