Scenario: we are fetching rows from SQL Server into a C#.NET console application, performing an action on the data retrieved (via a stored procedure), and then storing the resulting data in MongoDB using the C# MongoDB driver.
Issue: There are billions of rows. My stored procedure contains a query like this:
select * from table_name
There is no identity column, date column, or anything similar that could be used to work out batch-wise logic.
Information: As of now the application fetches around 3500 to 5000 records, stores them in MongoDB, and then throws the following error:
System.Runtime.InteropServices.SEHException (0x80004005): External component has thrown an exception.
Question: Can anyone suggest some logic for batch-wise reads/fetches from SQL Server?
If you are using SQL Server 2012, try the OFFSET-FETCH clause. It is the best solution!
SELECT … ORDER BY orderid OFFSET 25 ROWS FETCH NEXT 25 ROWS ONLY
skips the first 25 rows and fetches only the next 25, i.e. rows 26 through 50. The
ORDER BY clause is mandatory, so if you don't care about the ordering, use
ORDER BY (SELECT NULL)
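To show how the whole batch loop fits together, here is a minimal runnable sketch. It uses Python with an in-memory SQLite database as a stand-in for SQL Server (SQLite's LIMIT/OFFSET plays the role of OFFSET-FETCH), and the table and column names are made up for the example:

```python
import sqlite3

# Throwaway table standing in for the SQL Server source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (orderid INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO orders (orderid, payload) VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(1, 101)],
)

BATCH_SIZE = 25
offset = 0
batches = []
while True:
    # In T-SQL this query would be:
    #   SELECT orderid, payload FROM orders ORDER BY orderid
    #   OFFSET @offset ROWS FETCH NEXT @batchSize ROWS ONLY
    rows = conn.execute(
        "SELECT orderid, payload FROM orders ORDER BY orderid LIMIT ? OFFSET ?",
        (BATCH_SIZE, offset),
    ).fetchall()
    if not rows:
        break  # no more data
    batches.append(rows)  # here you would act on the batch and write to MongoDB
    offset += len(rows)

print(len(batches))        # 4 batches of 25 rows
print(batches[1][0][0])    # first orderid of the second batch: 26
```

The key point is that each round trip pulls only one page, so memory stays bounded no matter how large the table is.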
If you can't use
OFFSET-FETCH (it requires SQL Server 2012), and assuming the table has a primary key or column(s) that uniquely identify a row, let's call it
UniqueKey, then from SQL Server 2005 upwards you can use
ROW_NUMBER like this:
SELECT UniqueKey, col2, col3
FROM (
    SELECT UniqueKey, col2, col3,
           ROW_NUMBER() OVER (ORDER BY UniqueKey) AS RowNum
    FROM YourTable
) sub
WHERE sub.RowNum BETWEEN @startRow AND @endRow
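The same batch loop works with the ROW_NUMBER form. A runnable sketch, again using Python with in-memory SQLite (3.25+ for window functions) as a stand-in for SQL Server, reusing the hypothetical YourTable/UniqueKey names from the query above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE YourTable (UniqueKey INTEGER PRIMARY KEY, col2 TEXT)")
conn.executemany(
    "INSERT INTO YourTable (UniqueKey, col2) VALUES (?, ?)",
    [(i, f"val-{i}") for i in range(1, 101)],
)

BATCH_SIZE = 30
start_row = 1
total = 0
while True:
    end_row = start_row + BATCH_SIZE - 1
    # Same shape as the T-SQL above: number the rows once, then slice a window.
    rows = conn.execute(
        """
        SELECT UniqueKey, col2 FROM (
            SELECT UniqueKey, col2,
                   ROW_NUMBER() OVER (ORDER BY UniqueKey) AS RowNum
            FROM YourTable
        ) sub
        WHERE sub.RowNum BETWEEN ? AND ?
        """,
        (start_row, end_row),
    ).fetchall()
    if not rows:
        break
    total += len(rows)  # process the batch / write to MongoDB here
    start_row = end_row + 1

print(total)  # 100 rows processed in windows of 30
```

Note that ROW_NUMBER re-scans the table for every window, so on billions of rows this is noticeably slower than OFFSET-FETCH or keyset paging (WHERE UniqueKey > @lastKey), but it works on SQL Server 2005 and up.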
User contributions licensed under CC BY-SA 3.0