Limit on query rows?

Posted by: cynthia-dd on 3 August 2017, 3:52 pm EST

  • Posted 3 August 2017, 3:52 pm EST

    Hi,



    I am using a query that returns over 1000 rows (actually around 80,000 rows). When I drag an attribute (field) to the ROWS section, it displays a time-elapsed window, and after about 28 seconds a message appears: "By request, more than 1000 records have been received. The processing may take a considerable amount of time. Would you like to continue?" I clicked YES, and the app hung and stopped responding.



    I am wondering why I get this, or whether there is a limit on the number of rows. I need to be able to display large sets of data, up to 1 million rows sometimes.



    Thanks
  • Replied 3 August 2017, 3:52 pm EST

    Thank you for the response. I use the following code to connect to the datasource:

        GrapeCity.ActiveAnalysis.DataSources.IDataSource dataSource = DataSourceFactory.ReadDataSource(
            new XmlNodeReader(node.SelectSingleNode("DataSource")), "DataSource");

        if (pivotView.DataSource != null)
            pivotView.DataSource.Disconnect();
        pivotView.DataSource = dataSource;
        pivotView.DataSource.Connect();

    The connectionType in the XML is sql, and the query itself executes in under a second.

    I tried version 2.0.307.18 as well as the one in your link (2.0.396.0). In each case I ran the test by adding the fourth row attribute to the layout, timing the operation and recording the memory usage. Memory usage was determined by looking at the process in Task Manager.

    2.0.307.18:
    39.2s - 1,050 MB
    38.6s - 1,020 MB
    38.9s - 1,023 MB

    2.0.396.0:
    37.8s - 1,024 MB
    39.6s - 1,025 MB
    38.9s - 1,022 MB

    I would be happy to look into this further if there is anything I can do.
  • Replied 3 August 2017, 3:52 pm EST

    Hello jrg0839,

    Could you please let me know which data source you are using? Could you also download the interim build from this link, as we made some changes in this build to minimize memory usage? Thank you for investing your time in testing this issue with various builds.

    Regards,
    Aashish
  • Replied 3 August 2017, 3:52 pm EST

    I am having a similar problem, but I am only working with 1084 records. Everything works fine (3 rows, 3 columns selected) until I add a 4th attribute to the rows. The IIS worker process pins the CPU at 100% for 5-10 minutes, and the memory usage jumps to over 800 MB. Is this expected when there are only about 1000 rows of data?

    The attached file shows the output when it does finally finish.

    Tested on 1.0.634, 1.0.912 & 2.0.307.18

    2011/01/Data.xml-1.xlsx
  • Replied 3 August 2017, 3:52 pm EST

    Hello,



    Regarding your question I would like to highlight some points related to the issue:



    (1) Please note that DDA has a limit on the number of display points. This limit can be changed using the PivotView.WarningThreshold property. For details, you can go through the following online help document:

    http://www.datadynamics.com/Help/ddAnalysis/DataDynamics.Analysis.Windows.Forms~DataDynamics.Analysis.Windows.Forms.PivotView~WarningThreshold.html





    The number of data points is the number of possible positions in a multidimensional cube with 5 dimensions: Rows, Columns, Pages, Data, and Encodings. If only one dimension is used, the number of points equals the number of positions on that axis. For example, if there is only one attribute on the Rows shelf, the number of points equals the number of result values of the attribute expression.
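    The two points above can be sketched in C#. This is a minimal sketch, not code from this thread: the WarningThreshold property name comes from the help link above, but the pivotView instance name and the threshold value are assumptions.

        // Sketch only: raise the display-point warning threshold on an existing
        // PivotView instance (assumed here to be named pivotView). Pick a limit
        // that fits your data size and memory budget.
        pivotView.WarningThreshold = 100000;

        // The data-point count is the product of the member counts on each axis;
        // four row attributes with 50, 20, 10, and 5 distinct values already give
        // 50 * 20 * 10 * 5 = 50,000 row positions before columns and pages.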





    (2) You can define hierarchies and attributes using query fields and VB expressions, for example string concatenation.

    For this, you can go through the following forum post, which demonstrates the above points for measures:

    http://www.datadynamics.com/forums/121606/ShowPost.aspx



    A query string to the database represents a list of query fields. You can define additional query fields using VB and the existing query fields.



    (3) It is necessary to define attributes and hierarchies in a way that keeps the number of their result values minimal. DDA tries to process all values of the resulting attribute or hierarchy expression. If an expression yields many values (for example, every day in a century), it may be necessary to define that schema item as a measure, or to aggregate the requested data (for example, return only months).
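    For instance, point (3) can be applied at the query level: instead of exposing every individual date as an attribute, aggregate in the SQL that feeds the data source. This is a hypothetical query; the Orders table and its columns are assumptions, and DATEPART is SQL Server syntax.

        -- Hypothetical: collapse day-level dates to months before DDA sees them,
        -- so the date attribute yields at most 12 values per year.
        SELECT DATEPART(year, OrderDate)  AS OrderYear,
               DATEPART(month, OrderDate) AS OrderMonth,
               SUM(Amount)                AS TotalAmount
        FROM   Orders
        GROUP  BY DATEPART(year, OrderDate), DATEPART(month, OrderDate)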



    If you would like a limit on the amount of processed data, I can go ahead and enter a feature request to implement this.



    Let me know if you need further assistance/clarification.



    Regards,

    Gaurav



  • Replied 3 August 2017, 3:52 pm EST

    example.zip uploaded. It uses an XLS file with the data. Extract c_data.zip to your C:\ drive.

    The modified sample loads the XLS datasource and one of the two analysis files. It is currently set to the fast/low-memory example; change it to slow.analysis to see the other.

    Interestingly, you can load these analysis files in the Windows viewer and they both perform well.
  • Replied 3 August 2017, 3:52 pm EST

    Here is something interesting; maybe it will help. When I use items from a hierarchy as individual items, the process consumes about a GB of memory and takes about 40 seconds, but when I pull the entire hierarchy into the layout, it runs fast and only consumes about 250 MB.

    See the screenshots.

    2011/01/slow.PNG
  • Replied 3 August 2017, 3:52 pm EST

    Screenshot of faster execution with less memory consumption.

    2011/01/fast.PNG
  • Replied 3 August 2017, 3:52 pm EST

    I have responded to the other post. You may upload the data or samples at the below location:
    ftp://ftp.fpoint.com/ActiveReports/upload/

    Regards,
    Aashish
  • Replied 3 August 2017, 3:52 pm EST

    I am working on this, but I am having problems with local cube datasource (see http://www.datadynamics.com/forums/ShowPost.aspx?PostID=138172).

    Can I send you the data another way?
  • Replied 3 August 2017, 3:52 pm EST

    Hello,

    Thank you for your post. Could you please modify the DatasourceConnection sample and send it back to us so that we can replicate this issue at our end? The sample establishes a connection to an XML datasource.

    Regards,
    Aashish
  • Replied 3 August 2017, 3:52 pm EST

    jrg0839,

    Thank you for setting up the sample. Please download the latest interim build from this location. This build takes around 10 seconds to display the slow.analysis view and uses around 75 MB of memory. Please share the results with me once you test this.

    Regards,
    Aashish
  • Replied 3 August 2017, 3:52 pm EST

    kahkent,

    The build which Aashish referred to in his last post was for ActiveAnalysis. I would like to tell you that Data Dynamics Analysis is a discontinued product and is no longer in active development. You may want to try the latest build 2.1.706.0 of ActiveAnalysis by downloading it from here and see if you still face any issues.

    Regards,
    Sankalp
  • Replied 3 August 2017, 3:52 pm EST

    Is this for Data Dynamics Analysis or ActiveAnalysis? I have the same problem; can I have the build?
    Thanks
  • Replied 3 August 2017, 3:52 pm EST

    Thank you, I have tested the new build and it looks good. Much better performance and lower memory usage.