Can I make an IDataReader ignore column length definitions?

1

I'm working with a ProgressDB table that defines columns as strings of max length x, but in no way, shape, or form actually enforces this limitation; I have discovered several cases where the column data is significantly larger than the column length as defined. (It was an unpleasant shock to discover that was even possible.) It is conceivable that there may be a way to enforce these constraints on the database side, but that is not practical at this time for a multitude of reasons.

My application bombs in a rather spectacular way upon encountering a row with such illegal data. I can catch the exception at the row read level, but what I'd really like to do is read the row and disregard the exception, as I am manipulating the string data and do not care about the length at all.

Is there a way to do this using the IDataReader interface? These are large datasets, and manipulating them in a DataTable is not a realistic solution.

UPDATE: Here's an interesting (frustrating) thing: I tried to capture a bad IDataReader.Read() like so:

var reading = true;
do
{
    try
    {
        reading = r.Read();
        if (reading)
        {
            var columns = new string[columnCount];
            try
            {
                for (int i = 0; i < columnCount; i++)
                {
                    columns[i] = Convert.ToString(r.GetValue(i));
                }
                writer.Write(string.Join(this.ColumnDelimiter, columns) + this.RowDelimiter);
            }
            catch (Exception ex)
            {
                Log.Error(string.Format("Exception dumping table {0}.  At the point of the error, the columns resembled:\r\n{1}\r\n\r\nThe exception was: {2}",
                    this.TableName, string.Join(this.ColumnDelimiter, columns.Select(c => c ?? string.Empty).ToArray()), ex.ToString()), ex);
            }
        }
    }
    catch(Exception ex)
    {
        Log.Error("Error reading row.", ex);
    }

} while (reading);

...but when I ran this, the application puked with an uncaught exception:

2012-04-17 16:14:22,863 [1] ERROR GHM.ODBCSqlDump.Parser [(null)] - Error reading row.
System.Data.Odbc.OdbcException (0x80131937): ERROR [HY000] [DataDirect][ODBC Progress OpenEdge Wire Protocol driver][OPENEDGE]Column [ColumnName] in table [TableName] has value exceeding its max length or precision.
   at System.Data.Odbc.OdbcConnection.HandleError(OdbcHandle hrHandle, RetCode retcode)
   at System.Data.Odbc.OdbcDataReader.Read()
   at GHM.ODBCSqlDump.Parser.Execute(Stream streamOut)

Unhandled Exception: System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
   at System.Data.Common.UnsafeNativeMethods.SQLFetch(OdbcStatementHandle StatementHandle)
   at System.Data.Odbc.OdbcStatementHandle.Fetch()
   at System.Data.Odbc.OdbcDataReader.Read()
   at GHM.ODBCSqlDump.Parser.Execute(Stream streamOut)
   at GHM.DatabaseDump.Executor.ExecuteGroup(TransferGroup group)
   at GHM.DatabaseDump.Executor.Execute()
   at GHM.DatabaseDump.Program.Main(String[] args)

... my troubles increase at a faster pace than my solutions... sigh...

c#
odbc
asked on Stack Overflow Apr 17, 2012 by Jeremy Holovacs • edited Dec 26, 2015 by Brian Tompsett - 汤莱恩

2 Answers

1

Wow, this is messy. Can you modify your SQL SELECT statement to perform a SUBSTRING(column, 1, n), where n is the defined length of the field, on each of the text fields that might be causing the overflow? You won't be able to get back all of the data in each field (which is bad data to begin with), but at least it might stop the exception.

Sorry, I'm not a ProgressDB guy, so I'm framing the solution in terms of T-SQL.
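As a rough sketch of what that might look like from the C# side (the DSN, table, column names, and the length 30 are all hypothetical placeholders; it also assumes the Progress ODBC driver accepts SUBSTRING in the select list):

```csharp
// Sketch only: wrap each suspect character column in SUBSTRING so the
// driver never fetches a value longer than the declared column width.
// "custname", "address", "pub.customer", and 30 are placeholders.
using System;
using System.Data.Odbc;

class TruncatingDumpSketch
{
    static void Main()
    {
        var sql =
            "SELECT SUBSTRING(custname, 1, 30) AS custname, " +
            "SUBSTRING(address, 1, 30) AS address " +
            "FROM pub.customer";

        using (var conn = new OdbcConnection("DSN=MyProgressDsn"))
        using (var cmd = new OdbcCommand(sql, conn))
        {
            conn.Open();
            using (var r = cmd.ExecuteReader())
            {
                while (r.Read())
                {
                    // Values are already truncated server-side, so Read()
                    // should no longer trip the driver's max-length check.
                    Console.WriteLine(r.GetString(0));
                }
            }
        }
    }
}
```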

answered on Stack Overflow Apr 17, 2012 by mgnoonan
0

In Progress, the table definition includes a field called width, which defines the column length exposed to SQL and ODBC clients. That width has to be large enough to hold all of the data in the column; otherwise you will end up getting that error.
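Before widening anything, it can help to find out which columns actually overflow their declared width. A minimal sketch, assuming the OpenEdge SQL LENGTH() function is available through ODBC (the DSN, table, column name, and width 30 are hypothetical):

```csharp
// Sketch: count rows whose data exceeds the declared SQL width, so you
// know which columns need a larger width. All names are placeholders.
using System;
using System.Data.Odbc;

class WidthScanSketch
{
    static void Main()
    {
        var sql =
            "SELECT COUNT(*) FROM pub.customer WHERE LENGTH(custname) > 30";

        using (var conn = new OdbcConnection("DSN=MyProgressDsn"))
        using (var cmd = new OdbcCommand(sql, conn))
        {
            conn.Open();
            var overflowing = Convert.ToInt32(cmd.ExecuteScalar());
            Console.WriteLine("{0} rows exceed the declared width.", overflowing);
        }
    }
}
```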

answered on Stack Overflow Oct 3, 2013 by apascualini

User contributions licensed under CC BY-SA 3.0