Error processing SQL statements against SQLite database

I am using the ChipworkX to download string data from a web service, convert that data into SQL INSERT statements, and insert the data into a SQLite database on an SD card. I download a single string, process it, run the resulting SQL statement against the SQLite database, and repeat until all of my data is downloaded. In my particular example I have about 1300 lines of data, so 1300 SQL statements to process, all of them inserts. The code that does the insert is wrapped in a try/catch. I can usually process 700-800 rows before it fails. The catch clause tries to save the error and stack trace to a text file on the SD card for troubleshooting, but that fails too. Below is the code snippet, and below that are the errors that show up in Visual Studio (I was running from the VS IDE). The SQLite error doesn't tell me much, but the error from writing to the text file says a bit more. I'm wondering if anyone has seen anything like this.


// Runs once per downloaded line (roughly 1300 times per sync)
SQLiteConnection connection = new SQLiteConnection(_connectionstring);
connection.Open();
SQLiteCommand command = new SQLiteCommand(sql, connection);
int result = command.ExecuteNonQuery();
connection.Close();
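
In outline, the surrounding loop looks something like this (simplified; the helper names are placeholders, but ExecuteSQL and LogToFile are the real methods you will see in the stack traces below):

// Simplified outline of the sync loop; DownloadLine and BuildInsertStatement
// stand in for the real web-service code, and LogToFile's signature is simplified.
while (moreData)
{
    string line = DownloadLine();              // one string from the web service
    string sql = BuildInsertStatement(line);   // turn it into an INSERT statement
    try
    {
        ExecuteSQL(sql);                       // the snippet above
    }
    catch (Exception ex)
    {
        Utilities.LogToFile(ex.Message);       // this is the call that also fails
    }
}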

Below are the errors from Visual Studio.

Exception GHIElectronics.NETMF.SQLite.SQLiteException - 0x00000000 (6)

#### Message: Could not open database file: \SD\DATA\SPS.DB
#### GHIElectronics.NETMF.SQLite.Database::Open [IP: 0033] ####
#### SPSMF.SQLiteConnection::Open [IP: 0058] ####
#### SPSMF.SPSSQLLiteData::ExecuteSQL [IP: 000d] ####
#### SPSMicro.Sync::GetData [IP: 010c] ####
#### SPSMicro.scSettings::SyncStart [IP: 01df] ####

A first chance exception of type ‘GHIElectronics.NETMF.SQLite.SQLiteException’ occurred in GHIElectronics.NETMF.SQLite.dll
#### Exception System.IO.IOException - CLR_E_TOO_MANY_OPEN_HANDLES (6) ####
#### Message:
#### Microsoft.SPOT.IO.NativeFileStream::.ctor [IP: 0000] ####
#### System.IO.FileStream::.ctor [IP: 014a] ####
#### System.IO.FileStream::.ctor [IP: 000a] ####
#### System.IO.StreamWriter::.ctor [IP: 000e] ####
#### SPSMF.Utilities::LogToFile [IP: 003f] ####
#### SPSMF.Utilities::LogToFile [IP: 0007] ####
#### SPSMicro.Sync::GetData [IP: 012a] ####
#### SPSMicro.scSettings::SyncStart [IP: 01df] ####

From the error messages it looks like file handles are being left open. Are you creating a new connection each time you process one of your lines of data? If so, try adding connection.Dispose(); after connection.Close(), or, better, use something like:

using (SQLiteConnection connection = new SQLiteConnection(_connectionstring))
{
    // ... do processing here ...
}

… which will call Dispose() for you when the block exits, even if an exception is thrown inside it.
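
Applied to your snippet, the whole method would look something like this (assuming your SQLiteCommand wrapper is also IDisposable; if it is not, the using block around the connection is the important part):

using (SQLiteConnection connection = new SQLiteConnection(_connectionstring))
{
    connection.Open();
    using (SQLiteCommand command = new SQLiteCommand(sql, connection))
    {
        int result = command.ExecuteNonQuery();
    }
    // connection is closed and disposed here, even if ExecuteNonQuery throws
}

That way the native database handle is released on every pass through your loop, including the passes where an exception is thrown.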

It also looks like your logger is leaking file handles: the second trace (CLR_E_TOO_MANY_OPEN_HANDLES) goes through StreamWriter and FileStream.
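
The fix is the same pattern: make sure the writer is disposed even when a write throws partway through. A minimal sketch, assuming your LogToFile is shaped roughly like this (the path is a placeholder and you will need System.IO):

public static void LogToFile(string message)
{
    // The using blocks release the file handle even if WriteLine throws;
    // a Close() call at the end of the method is skipped whenever an
    // earlier line throws, and that leaks the handle.
    using (FileStream stream = new FileStream(@"\SD\DATA\LOG.TXT", FileMode.Append))
    using (StreamWriter writer = new StreamWriter(stream))
    {
        writer.WriteLine(message);
        writer.Flush();
    }
}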

Thanks JonG. I am creating the connection and then closing it each time; I figured that since I was closing it each time, it wouldn't be a problem. Previously I did it in batch mode, where I saved all of the SQL statements and then processed them against one connection, but after about 400 or 500 statements it would just give me an error saying “Unable to execute SQL Statement”. Reading the data from the stream and processing one SQL statement at a time got a bit farther. I will experiment with the Dispose option.

As for the logger, I am definitely closing the file after writing a line to it.