While importing data from Excel to SQL Server I receive the error below. Memory usage is at the maximum. The file is an .xlsx, 170 MB (178,587,611 bytes) in size, and I get a "not enough storage" error.
I would appreciate it if anyone could help me.
Data flow execution failed. Not enough storage is available to process this command. (Exception from HRESULT: 0x80070008) (Microsoft.SqlServer.DTSRuntimeWrap)
That error is coming from the SSIS runtime, not SQL Server.
Running out of space in SQL Server produces errors like:
Msg 9002, Level 17, State 4, Line 20
The transaction log for database 'XXX' is full due to 'ACTIVE_TRANSACTION'.
or
Msg 1105, Level 17, State 2, Line 20
Could not allocate space for object 'YYY' in database 'XXX' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
Despite the wording, the error is unrelated to disk storage and usually indicates a memory problem (HRESULT 0x80070008 wraps the Win32 error ERROR_NOT_ENOUGH_MEMORY). I would first try reducing the buffer sizes in your Data Flow, and ensure that the data flow doesn't have any components that load large amounts of data into memory, such as fully-cached Lookups.
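For reference, the buffer settings are properties of the Data Flow Task itself. A hedged example of tightening them (the defaults shown are the documented ones; the reduced values are only illustrative starting points, not tuned recommendations):

```
DefaultBufferMaxRows  default: 10000     -> try e.g. 1000
DefaultBufferSize     default: 10485760  -> try e.g. 1048576  (bytes)
```

Smaller buffers mean more round trips but a lower peak memory footprint per buffer, which is usually the right trade-off when the runtime is failing to allocate.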
This error mainly occurs when handling big Excel files with the OLE DB adapters (the OLE DB connection manager or the Excel connection manager), since these adapters have many limitations. I suggest reading the Excel file in chunks. In similar situations I mainly use a C# script to do that, or you can implement a For Loop container to loop over the used range of the worksheet.
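Outside SSIS, the chunking idea can be sketched as below (Python, using a generic batching helper; the file name, the 10,000-row chunk size, and the `bulk_insert` loader are illustrative assumptions, not part of the original answer). Inside an SSIS package the same pattern would live in a C# Script Component.

```python
from itertools import islice

def iter_chunks(rows, size):
    """Yield successive lists of at most `size` rows from any row iterator,
    so only one chunk is held in memory at a time."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical usage with openpyxl's read-only streaming mode, which
# iterates rows without loading the whole workbook into memory:
#
#   from openpyxl import load_workbook
#   wb = load_workbook("big_file.xlsx", read_only=True)
#   ws = wb.active
#   for batch in iter_chunks(ws.iter_rows(values_only=True), 10_000):
#       bulk_insert(batch)  # hypothetical loader into SQL Server

if __name__ == "__main__":
    # Demonstrate the helper on a stand-in row iterator.
    print(list(iter_chunks(range(10), 4)))
```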
One additional suggestion is to use a 64-bit version of Microsoft Excel. This may increase the amount of data that can be manipulated.
For additional information, you can refer to the following answers: