Understanding Data Dynamics Analysis Scalability

How to best use GrapeCity's Data Dynamics Analysis -- The World's First Free-Form Data Visualization and Business Intelligence Component for .NET

GrapeCity’s Data Dynamics Analysis is the first free-form, multi-dimensional .NET data visualization and business intelligence (BI) component for Visual Studio. It offers rich data analysis and interactive data visualization options, including pivot tables and charts, all conveniently packaged in a single .NET component, complete with an out-of-the-box end user analysis interface.

A Lot of Power for a Component
That is a lot of power in one component, and not surprisingly, customers and evaluation users are blown away by the features. Comments such as “this product has limitless potential” are typical of what we hear when people see the component in action. As a result, we have seen a steady increase in sales as developers begin to see true interactive BI as an integral feature of the applications they are building today.

With Power Comes Responsibility
With the kind of features this component has, it is tempting to think of it as a potential replacement for stand-alone client-server analysis systems. As a result, we occasionally get questions from evaluation users or existing customers about how best to use the product when analyzing an inappropriately large volume of data, because they expect scalability characteristics similar to those of stand-alone analysis systems.

The Key Differentiator
The key difference between Data Dynamics Analysis and client-server-style, stand-alone analysis systems is that Data Dynamics Analysis is an in-process component that lets you embed an entire data visualization/BI system inside your .NET application. The trade-off is that it runs inside the memory space of the containing application, which means it shares that application's system resources wherever the instance resides (mostly on client desktops).

In other words, all data querying, aggregation, calculation and rendering happen inside the memory space that the component shares with the rest of the containing application. This is unlike stand-alone systems, which use fully dedicated memory and can optionally use physical storage for caching and similar techniques to improve scalability.

Common Usage Scenarios
Data Dynamics Analysis works great for data analysis scenarios with “typical” data volumes, and for a wide range of data formats such as relational data, XML, Analysis Cubes, etc. Another way to use the component is in combination with an existing cube or a server-based analysis system that provides some of the back-end heavy lifting while Data Dynamics Analysis provides a dramatically superior user experience.
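For the first scenario, a minimal sketch along these lines (C#; the analysisView control and its DataSource property are placeholder names, since the component's actual binding API is not covered here) shows the kind of data loading involved when working with relational or XML sources:

```csharp
using System.Data;
using System.Data.SqlClient;

string connectionString = "<your connection string>";

// Relational data via ADO.NET: the full result set is materialized
// in the application's own memory before analysis begins.
var table = new DataTable();
using (var connection = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter(
    "SELECT Region, Product, Quarter, Sales FROM SalesSummary", connection))
{
    adapter.Fill(table);
}

// XML data loaded into a DataSet works the same way.
var dataSet = new DataSet();
dataSet.ReadXml("SalesSummary.xml");

// Hand the data to the analysis control; "analysisView" and "DataSource"
// are hypothetical names standing in for the component's real API.
analysisView.DataSource = table;
```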

Exception: Inappropriately Large Volumes of Data
Occasionally, someone will throw inappropriately large volumes of data at the component: a large number of rows combined with many measures and dimensions. In that case the total count of discrete data elements grows combinatorially, while the component still has to work within the constraints of the containing application. That is when users will inevitably notice the scalability impact, especially if they are comparing it with stand-alone client-server analysis systems.
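To get a feel for how quickly the element count grows, here is a rough, purely illustrative back-of-the-envelope calculation (the figures are made up, not measurements):

```csharp
using System;

// Illustrative only: the number of potential cells in a pivot is roughly the
// product of the dimension cardinalities times the number of measures.
int[] dimensionCardinalities = { 50, 12, 200, 30 };  // e.g. Product, Month, Customer, Region
int measureCount = 3;

long potentialCells = measureCount;
foreach (int cardinality in dimensionCardinalities)
    potentialCells *= cardinality;

// 3 * 50 * 12 * 200 * 30 = 10,800,000 potential cells, and every additional
// dimension or measure multiplies that total -- all of it competing for the
// host application's memory.
Console.WriteLine(potentialCells);
```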

Comparing Scalability Characteristics (or Not)
Comparing the scalability characteristics of Data Dynamics Analysis with those of full-blown Analysis systems is a bit like comparing apples and oranges. That is because the fundamental architecture in the two cases is different. One is an in-process memory-constrained .NET component with a rich, interactive, data visualization experience as the primary benefit, while the other is more focused on using client-server architecture, dedicated memory and local storage for large-scale data handling. On the other hand, the user interface in these stand-alone systems is more basic, and you cannot embed them inside a .NET application like you can a component.

Recommended Best Practice
While using a .NET analysis component, the recommended best practice is to expose a volume of data that is reasonable for the end user to analyze, and to exercise some oversight over the size of the data that is fetched and passed on to the analysis component. There is generally no need to dump all of the data on the user at once. The ideal solution is to design your analysis approach thoughtfully, connecting to relational data, XML or other sources, or using the Analysis component as a front end to Microsoft Analysis Services or an equivalent cube service.
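As a sketch of that approach (again in C#, with analysisView and DataSource as placeholder names rather than the component's documented API), the idea is to push filtering and aggregation to the database so that only a bounded summary ever reaches the in-process component:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

string connectionString = "<your connection string>";

// Filter and aggregate on the server; hand the component only the summary.
const string query = @"
    SELECT Region, Product, YEAR(OrderDate) AS OrderYear, SUM(Amount) AS Sales
    FROM Orders
    WHERE OrderDate >= @from
    GROUP BY Region, Product, YEAR(OrderDate)";

var summary = new DataTable();
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(query, connection))
{
    command.Parameters.AddWithValue("@from", new DateTime(2008, 1, 1));
    using (var adapter = new SqlDataAdapter(command))
    {
        adapter.Fill(summary);
    }
}

// "analysisView" and "DataSource" remain hypothetical stand-ins for the real API.
analysisView.DataSource = summary;
```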

What’s Next
With that distinction clear, our goal is to continue enhancing our analysis technology in a number of ways over the coming months. For example, the upcoming major update to Data Dynamics Analysis features a Microsoft Silverlight™-based rich data visualization experience inside the Web browser, allowing for a Web analysis experience on par with the Windows Forms experience. Other enhancements are also in the pipeline.
