Someone doing some reports for me using Reporting Services is running into problems.
This is how he described the problem: "There is an ASP.NET service which eats up all of the memory until the report system crashes."
By removing as many of the views as possible and using tables directly, the report ran successfully (in a bit over two minutes, processing 300,000 records). However, when the amount of data was doubled, the report would no longer run.
Has anybody come across a similar problem and have any advice?
Thanks,
Frank
Rather than sending 600,000 rows across the network, I'd suggest filtering/aggregating on the database side and using Reporting Services mainly to format/display. Take advantage of the strengths of each product.
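For illustration, here is a minimal sketch of that approach as a report dataset query; the Transactions table, its columns, and the @StartDate/@EndDate report parameters are hypothetical, but the idea is to let SQL Server do the GROUP BY so the report only receives the summary rows it actually displays.

-- Aggregate on the database side instead of pulling every detail row.
-- Table and column names are assumptions for the sake of the example.
SELECT
    t.AccountId,
    YEAR(t.TransactionDate)  AS TransactionYear,
    MONTH(t.TransactionDate) AS TransactionMonth,
    COUNT(*)                 AS TransactionCount,
    SUM(t.Amount)            AS TotalAmount
FROM dbo.Transactions AS t
WHERE t.TransactionDate >= @StartDate
  AND t.TransactionDate <  @EndDate
GROUP BY
    t.AccountId,
    YEAR(t.TransactionDate),
    MONTH(t.TransactionDate);

A query like this returns one row per account per month rather than one row per transaction, which keeps the dataset Reporting Services has to hold in memory small.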
Otherwise, I'd suggest increasing the amount of memory available on the machine, and if necessary upgrading to 64-bit, which supports much larger amounts of RAM.
Thanks, Donovan.
|||Thanks, I believe the developer has tried to do as much processing as possible on the server, but I will pass on your suggestion. Upgrading the RAM has already been suggested, but the system is for some financially strapped countries, so it may not be possible.
Thanks again.|||Hi Donovan,
The developer has redesigned the reports as you suggested; instead of sending 600,000 records to Reporting Services he is now sending 6,300, and the report works!
Thanks a bunch!