I am trying to load data from DB2 to DB2 using SSIS. There is only a source table, which has 2.4 million records, and there is no transformation between the source and destination tables, but the load stops after 1.6 million records. The error I am getting is:
Error: 0xC0202009 at LOAD TO SATGE_GLMXPF_COMPRESSED, OLE DB Destination [227]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8007000E. An OLE DB record is available. Source: "Microsoft Cursor Engine" Hresult: 0x8007000E Description: "Out of memory.".

Error: 0xC0047022 at LOAD TO SATGE_GLMXPF_COMPRESSED, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (227) failed with error code 0xC0202009 while processing input "OLE DB Destination Input" (240). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

Error: 0xC02090F5 at LOAD TO SATGE_GLMXPF_COMPRESSED, DataReader Source [2]: The DataReader Source was unable to process the data. Exception from HRESULT: 0xC0047020

Error: 0xC0047038 at LOAD TO SATGE_GLMXPF_COMPRESSED, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on DataReader Source returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
You need to configure the "Rows per batch" and "Maximum insert commit size" properties of the OLE DB Destination carefully.
The "Out of memory" error occurs when the memory available to your process is overwhelmed by the amount of data you are trying to push through in a single commit. If "Maximum insert commit size" is left at 0, the destination tries to commit the entire data set as one batch; setting it to a bounded value (for example, 100,000) lets the destination commit in chunks and release buffers as it goes. A few more considerations:
A related error, "Failed to open a fastload rowset for...", is most likely caused by the provider not supporting bulk loading. Bulk (fast) loading moves rows much faster and avoids buffering the whole set, which is important on a memory-constrained system.
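To illustrate what a bounded commit size buys you, here is a minimal Python sketch of the same idea outside SSIS: rows are inserted and committed in fixed-size batches instead of one giant transaction, so memory use stays flat regardless of row count. The names `chunked`, `load_in_batches`, and `insert_batch` are illustrative only, not any real SSIS or DB2 API.

```python
def chunked(rows, batch_size):
    """Yield successive lists of at most batch_size rows from any iterable."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

def load_in_batches(rows, batch_size, insert_batch):
    """Insert rows in bounded batches; returns the total row count.

    insert_batch is a callable standing in for the destination's
    bulk insert + commit. Conceptually this is what setting
    "Maximum insert commit size" to batch_size (instead of 0)
    does in the OLE DB Destination: each chunk is committed and
    its buffers released before the next one is read.
    """
    total = 0
    for batch in chunked(rows, batch_size):
        insert_batch(batch)   # commit this chunk; memory stays bounded
        total += len(batch)
    return total
```

For example, loading 2.4 million rows with a batch size of 100,000 would issue 24 commits rather than one commit holding every row in memory at once.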
User contributions licensed under CC BY-SA 3.0