Posted 4 August 2017, 2:43 pm EST

I have several questions relating to performance with recordsets and ActiveReports. Currently I have a landscape report, of which I have attached a PDF copy with dummy data as an example and visual reference. The report consists of a main report that sets up and fills in the header information (name, dates, graphic, and column headings) and then calls for the detail data. Each record is identified by a page association number, which allows for 1 to 6 records per page. The problem is that a record has so many data points (60+ that are displayed) that they cannot all fit on a single line, so it has been divided into four subsections, each with its own subreport.
Right now I am running one DataControl per report and feeding each an SQL string to generate a recordset. The main recordset is basically a list of all the pages that need to be printed: association numbers plus header data. Each subreport then grabs an association-limited recordset for each page (15-20 data points), for four mini-recordsets per page, so as you can imagine I take a performance hit any time the page count runs up.
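To make the cost concrete, here is a rough sketch of the pattern described above (the event, control, and field names are my assumptions, not taken from the actual report):

```vb
' Sketch only: each subreport opens its own association-limited
' recordset for every page.
Private Sub SubReport1_DataInitialize()
    DataControl1.Source = "SELECT Col1, Col2 FROM Detail " & _
                          "WHERE AssocNum = " & m_lngAssoc & _
                          " AND Section = 1"
End Sub

' For P pages this costs 1 + 4 * P recordset opens, e.g.
' 100 pages -> 1 + 4 * 100 = 401 round trips to the database.
```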
Currently this report can get its data from two different sources:
A) Internal company network via a software application that connects to a central SQL2000 server for all data and recordsets, which contains all data for all customers.
B) External via another software application that connects to a local Access Jet 4.0 database, which would be limited to one or two customers' data.
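For reference, the two sources would typically be opened with the standard OLE DB providers (server name, catalog, and file path below are placeholders, not the real values):

```vb
Dim cnn As ADODB.Connection
Set cnn = New ADODB.Connection

' A) Central SQL Server 2000 over the company network (SQLOLEDB provider)
cnn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
         "Initial Catalog=MyDb;Integrated Security=SSPI;"

' B) Local Access Jet 4.0 file
' cnn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
'          "Data Source=C:\Data\Customer.mdb;"
```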
1) Would I gain performance by calling one sub-recordset per page with the full 60+ data point count, cloning it, and then passing the clones to the subreports?
2) Would I gain performance by calling recordsets that were not page/association limited at the start of report generation, then filtering for each page?
3) Should I be doing both filtering and cloning?
4) Is there another angle entirely that I have missed so far?
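For what it's worth, questions 1 and 2 combine naturally in ADO, something like the sketch below (variable names and the WHERE clause are my assumptions; `Clone` and `Filter` both require a bookmarkable cursor, which a client-side static cursor provides):

```vb
' Sketch: one wide recordset per page, cloned for each subreport
' so no additional queries are issued.
Dim rsPage As ADODB.Recordset
Set rsPage = New ADODB.Recordset
rsPage.CursorLocation = adUseClient   ' client-side cursor supports Clone/Filter
rsPage.Open "SELECT * FROM Detail WHERE AssocNum = " & lngAssoc, _
            cnn, adOpenStatic, adLockReadOnly

Dim i As Integer
For i = 1 To 4
    Dim rsClone As ADODB.Recordset
    Set rsClone = rsPage.Clone(adLockReadOnly)
    ' Each clone shares the underlying data but keeps an independent
    ' cursor position and Filter, so the subreports don't collide.
    ' (Assigning the clone to the subreport's data source here.)
Next i
```

Going one step further for question 2, you could open one association-unlimited recordset at report start and set `rsClone.Filter = "AssocNum = " & lngAssoc` per page, trading many small round trips for one large fetch plus in-memory filtering.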