I have tried to export a Reporting Services report with 60,000 rows in it
(not more than Excel's 65k limit!) to Excel, but an "Internal Server Error"
was returned. When I checked the web server's event log, it stated that
ASPNET_WP.EXE had terminated the process because it had consumed more than
60% of the available RAM (1 GB total). Exporting the same report to PDF or
CSV worked fine.
A) Why has the process consumed more than 600 MB of RAM?
B) Is this a known issue, and is it going to be resolved?
--
Cheers
Nick

Do you have the service pack for Reporting Services installed? SP 0 exports
to Excel as XML; with the service pack it exports as a binary Excel file
(though 60,000 rows is still a HUGE report).
Though I have to ask: what are you going to do with a 60,000-row report?
That must be over 1,000 pages! Surely no one is actually going to read all
of that. If you just need a data export, use DTS in SQL Server or a bit of
custom code. SQL Reporting seems to have been designed to create reports
that will be read by people.
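If you go the custom-code route, a few lines are usually enough. Here is a
quick Python sketch, not anything Reporting Services gives you out of the
box; the connection string, query, and output file are placeholders for
your own environment:

# Minimal sketch of a "custom code" export: query SQL Server and stream the
# result set straight into a CSV file that Excel can open.
# Assumes the pyodbc package is installed; the connection string, query,
# and output path below are placeholders.
import csv
import pyodbc

CONN_STR = "DRIVER={SQL Server};SERVER=myserver;DATABASE=HR;Trusted_Connection=yes"
QUERY = "SELECT EmployeeID, LastName, FirstName, Department FROM dbo.Employees"

conn = pyodbc.connect(CONN_STR)
try:
    cursor = conn.cursor()
    cursor.execute(QUERY)
    with open("employees.csv", "w", newline="") as f:
        writer = csv.writer(f)
        # Column headers come from the cursor metadata.
        writer.writerow([column[0] for column in cursor.description])
        # Write rows one at a time instead of loading 60,000 rows into memory.
        for row in cursor:
            writer.writerow(row)
finally:
    conn.close()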
Mike G.
"Nick Prudhoe" <NickPrudhoe@.discussions.microsoft.com> wrote in message
news:4CF1934E-937C-40AB-B901-C1014D4D3615@.microsoft.com...
> <snip>

Fair point. I'm aware it's a huge report, but I'm trying to prove Reporting
Services out before my company spends any money or time on it (I know money
shouldn't be an issue ;-)).
To be honest, I'd still expect it to work. We may well have customers who
want to produce Excel reports with a large number of rows (e.g. we have a
customer with > 100,000 employees) and then play with the data in Excel.
--
Cheers
Nick
"Mike G." wrote:
> <snip>

For this sort of thing, what you are really doing is a data export into
Excel. You are not really concerned with the fidelity of how it looks, and
trying to get that fidelity is what makes it take so long. What I do when I
have users who want to export the data to play with is this: I add a link
(I call it Export Data) which either calls (jumps to the URL of) the same
report rendered in CSV ASCII format, or calls another, hidden report that
is pared down so the data exports well. Note a couple of critical issues.
You want to render the CSV as ASCII; it defaults to Unicode, which Excel
does not handle well (it puts everything in a single column). The CSV ASCII
format is lightning fast, and when I choose Open when prompted it goes
right into Excel. Note that you will want to click on each field in the
detail row and set its name, since that is where the CSV rendering gets the
column names for Excel. With the next version you will be able to specify
that a CSV export done from Report Manager should be in ASCII format, but
for right now you have to use my workaround (a sketch of the URL is below).
To give you an idea of the speed improvement: 5 minutes (and bringing the
server to its knees) went down to 10 seconds.
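To make that concrete, here is roughly what the Export Data jump-to URL
looks like when you build it up piece by piece. This is only a sketch in
Python: the server name, report path, and the Department parameter are
placeholders for your own environment.

# Rough sketch of the "Export Data" jump-to URL for a CSV (ASCII) render.
# The server name, report path, and Department parameter are placeholders;
# rs:Format=CSV selects the CSV rendering extension and rc:Encoding=ASCII
# overrides its default Unicode output so Excel splits the columns properly.

REPORT_SERVER = "http://myserver/ReportServer"  # placeholder server
REPORT_PATH = "/HR/EmployeeList"                # placeholder report path

settings = [
    "rs:Command=Render",   # ask the report server to render the report
    "rs:Format=CSV",       # use the CSV rendering extension
    "rc:Encoding=ASCII",   # device-info setting: ASCII instead of Unicode
    "Department=Sales",    # report parameters ride along (hypothetical name)
]

export_url = REPORT_SERVER + "?" + REPORT_PATH + "&" + "&".join(settings)
print(export_url)
# http://myserver/ReportServer?/HR/EmployeeList&rs:Command=Render&rs:Format=CSV&rc:Encoding=ASCII&Department=Sales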
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Nick Prudhoe" <NickPrudhoe@.discussions.microsoft.com> wrote in message
news:1BAB2A6F-0221-44E6-A16B-FDD62642CE1D@.microsoft.com...
> <snip>