0x8007000E Description: "Out of memory." error


I am trying to load data from DB2 to DB2 using SSIS. There is only a source table, which has 2.4 million records, and there is no transformation between the source and destination tables, but the load stops after 1.6 million records. The error I am getting is:

Error: 0xC0202009 at LOAD TO SATGE_GLMXPF_COMPRESSED, OLE DB Destination [227]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8007000E. An OLE DB record is available. Source: "Microsoft Cursor Engine" Hresult: 0x8007000E Description: "Out of memory.".

Error: 0xC0047022 at LOAD TO SATGE_GLMXPF_COMPRESSED, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (227) failed with error code 0xC0202009 while processing input "OLE DB Destination Input" (240). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

Error: 0xC02090F5 at LOAD TO SATGE_GLMXPF_COMPRESSED, DataReader Source [2]: The DataReader Source was unable to process the data. Exception from HRESULT: 0xC0047020

Error: 0xC0047038 at LOAD TO SATGE_GLMXPF_COMPRESSED, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on DataReader Source returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

sql
sql-server
ssis
db2
asked on Stack Overflow Mar 9, 2017 by deepanshu nagpal • edited Mar 11, 2017 by James Z

2 Answers


You need to configure the OLE DB Destination's "Rows per batch" and "Maximum insert commit size" settings carefully. If "Maximum insert commit size" is left at its default, the destination can attempt to commit all of the rows in a single transaction, which is exactly the kind of situation that exhausts memory mid-load; setting it to a bounded value (e.g. 10,000) forces periodic commits and keeps memory use flat.
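For reference, these settings surface in the .dtsx package XML as custom properties of the OLE DB Destination component. The excerpt below is a hypothetical sketch, not copied from a real package — the property names follow the `FastLoad*` convention used by the OLE DB Destination, but verify them against your own package before editing by hand:

```xml
<!-- Hypothetical excerpt of an OLE DB Destination's custom properties
     in a .dtsx file. Values shown force a commit every 10,000 rows
     instead of one transaction for the whole load. -->
<property name="FastLoadMaxInsertCommitSize">10000</property>
<property name="FastLoadOptions">TABLOCK,ROWS_PER_BATCH = 10000</property>
```

In the designer, the same values are entered directly on the OLE DB Destination editor's "Rows per batch" and "Maximum insert commit size" fields.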

answered on Stack Overflow Mar 9, 2017 by Bilal Ayub

The "Out of Memory" error occurs when the amount of data you are trying to process exceeds the memory available to your process. Here are a few considerations:

  • Do you actually have enough memory installed on the machine you are running this on? I would raise an eyebrow at a 32 GB server. If you are testing locally, use a smaller dataset and install more memory on your development machine; it is not uncommon to have a 16 GB laptop these days.
  • 32-bit or 64-bit? A 32-bit process can only address about 2 GB of memory by default, which is quickly exhausted. Switch to 64-bit if possible.
  • Reduce the number of columns in the data flow. If possible, remove any columns that are not being used; this improves utilization of the memory buffers.
  • Use data providers from IBM, or buy a third-party one, e.g. http://www.cozyroc.com/ssis/db2-destination. The error you got when trying to fast load:

Failed to open a fastload rowset for...

is most likely caused by the provider not supporting bulk loading. Bulk loading moves data much faster and in bounded batches, which matters on a memory-constrained system.
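The effect of bounded, periodic commits can be sketched outside SSIS. This is an illustration of the idea, not SSIS internals — the `batches` helper is invented here, and the commented-out driver calls (e.g. pyodbc's `executemany`/`commit`) are assumptions about how you would wire it to a real database:

```python
from itertools import islice

def batches(rows, batch_size):
    """Yield successive lists of at most batch_size rows.

    Committing after each batch bounds how many rows the destination
    must hold in a single transaction -- the same idea as SSIS's
    "Maximum insert commit size" setting.
    """
    it = iter(rows)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Simulated load: 2.4 million rows committed every 10,000 rows.
# With a real driver you would call cursor.executemany(sql, batch)
# and connection.commit() where the counters are updated below.
total = 0
commits = 0
for batch in batches(range(2_400_000), 10_000):
    total += len(batch)   # stand-in for cursor.executemany(...)
    commits += 1          # stand-in for connection.commit()
print(total, commits)     # 2400000 240
```

The point is that memory per transaction stays proportional to the batch size (10,000 rows here) rather than to the full 2.4 million-row load.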

answered on Stack Overflow Mar 9, 2017 by Mark Wojciechowicz

User contributions licensed under CC BY-SA 3.0