SSIS DTS_E_OLEDBERROR and Out of Memory Error transferring SQL Server records to Oracle Database


I'm trying to import around 5M records from a SQL Server database into an Oracle database and am getting an Out of Memory error (details below). I've already tried a number of recommendations from researching, but haven't found a solution.

Things I've tried/Notes:

  1. The source is SQL Server, using an "OLE DB Source" with the "SQL Server Native Client 11.0" provider.
  2. The destination is an Oracle database, using an "OLE DB Destination" with the "Oracle Provider for OLE DB" provider.
  3. I tried the "Microsoft OLE DB Provider for Oracle" as the destination provider instead.
  4. I tried increasing the Data Flow task's "DefaultBufferSize" to 100 MB.
  5. I tried increasing "DefaultBufferMaxRows" to 10 million rows (a dtexec sketch for overriding both buffer properties at run time follows this list).
  6. The computer is 64-bit with 16GB of RAM and has Windows 7 Professional.
  7. I have both the 32-bit and 64-bit Oracle Client installed
  8. The SSIS project's "Run64bitRuntime" property is set to True.
  9. I'm using Visual Studio 2015.
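
For anyone who wants to experiment with the two buffer properties without reopening the designer, they can also be overridden per run with dtexec. This is a minimal sketch, not a fix: the package file name is hypothetical, the task name is taken from the log messages below, and the values are examples to tune rather than copy.

    REM Override the Data Flow buffer properties for a single run.
    REM Example values: 100 MB per buffer, 10,000 rows per buffer.
    dtexec /FILE "Package.dtsx" ^
      /SET "\Package\Data Flow Task Load PFILSA OneLink Table 2012.Properties[DefaultBufferSize];104857600" ^
      /SET "\Package\Data Flow Task Load PFILSA OneLink Table 2012.Properties[DefaultBufferMaxRows];10000"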

Any ideas?

Here are the pertinent messages.

Warning: 0x80047076 at Data Flow Task Load PFILSA OneLink Table 2012, SSIS.Pipeline: The output column "DateTime" (137) on output "OLE DB Source Output" (105) and component "OLE DB Source" (94) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.

Error: 0xC0202009 at Data Flow Task Load PFILSA OneLink Table 2012, OLE DB Destination [12]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8007000E.

An OLE DB record is available. Source: "Microsoft Cursor Engine"
Hresult: 0x8007000E Description: "Out of memory.".

sql-server
oracle
visual-studio-2015
ssis
asked on Stack Overflow Apr 18, 2017 by ptownbro • edited Apr 18, 2017 by Hadi

1 Answer


Your package is trying to read more data into memory than your machine can handle before inserting into the destination. Open the destination, set its data access mode to "Table or view - fast load", set "Rows per batch" to 10000, and set "Maximum insert commit size" to 100000.
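
For reference, here is roughly where those settings live in the package XML. This is an illustrative sketch, assuming the standard OLE DB Destination custom properties (AccessMode 3 corresponds to "OpenRowset Using FastLoad", the designer's "Rows per batch" becomes a ROWS_PER_BATCH hint in FastLoadOptions, and the target table name is hypothetical); fast load also depends on the OLE DB provider supporting bulk loading, so verify it behaves as expected with the Oracle Provider for OLE DB.

    <!-- OLE DB Destination custom properties (.dtsx excerpt, illustrative only) -->
    <property name="AccessMode">3</property>                            <!-- OpenRowset Using FastLoad -->
    <property name="OpenRowset">"MYSCHEMA"."MYTABLE"</property>         <!-- hypothetical Oracle target -->
    <property name="FastLoadMaxInsertCommitSize">100000</property>      <!-- Maximum insert commit size -->
    <property name="FastLoadOptions">ROWS_PER_BATCH = 10000</property>  <!-- Rows per batch -->

The idea is that with fast load and a bounded commit size, the destination bulk-inserts and commits every 100,000 rows instead of accumulating the whole load, so the memory the provider needs stays roughly constant across the 5M-row transfer.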

answered on Stack Overflow Apr 19, 2017 by Ntiyiso Mayile

User contributions licensed under CC BY-SA 3.0