I have a database in SQL Server 2008 R2 that has millions of files stored in it as varbinary blobs. I set up a process last week to do the following:
- Use some Entity Framework code to get the row (i.e., the entity) that has the blob.
- Copy the blob stream to an object store.
- Update the entity in the database with the new Object ID from the store (a rough sketch of this loop is below).
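Roughly, each worker thread does something like the sketch below. The entity and context names (`FileEntity`, `FilesContext`) and the `uploadToStore` delegate are placeholders, not my real schema or object-store client, and I'm assuming an EF 4.1-style `DbContext` here:

```csharp
using System;
using System.Data.Entity;   // EF 4.1+ (EntityFramework NuGet package)
using System.Linq;

// Placeholder entity: the real table holds the varbinary blob plus a
// column for the object-store ID that gets written back after the copy.
public class FileEntity
{
    public int Id { get; set; }
    public byte[] BlobData { get; set; }   // varbinary(max) column
    public string ObjectId { get; set; }   // null until migrated
}

public class FilesContext : DbContext
{
    public DbSet<FileEntity> Files { get; set; }
}

public static class BlobMigrator
{
    // uploadToStore stands in for whatever object-store client is in use;
    // it takes the blob bytes and returns the new object ID.
    public static void MigrateOne(int id, Func<byte[], string> uploadToStore)
    {
        using (var db = new FilesContext())
        {
            var entity = db.Files.Single(f => f.Id == id); // 1. fetch the row with the blob
            var objectId = uploadToStore(entity.BlobData); // 2. copy the blob to the object store
            entity.ObjectId = objectId;                    // 3. write the new object ID back
            db.SaveChanges();                              // each UPDATE runs in its own implicit transaction
        }
    }
}
```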
I have 30 threads doing this constantly, and the process will still take several days. A couple of days ago the database log file filled up, and I'm sure this process had to be the cause. I decided point-in-time backups are not critical for this database and set it to the Simple recovery model. Then yesterday my process threw errors again saying the log file was full! How is this possible in simple mode?!

Any idea what I can do to stop this from happening? When I monitor the log file, SQL Server lets it get to 7% full and then truncates it back down to zero, so I'm super confused. It seems as though the log file starts to grow unchecked when the full backup begins...
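For reference, this is roughly how I'm checking what's holding the log open from the app side (the database name `MyBlobDb` and the connection string are placeholders). `sys.databases.log_reuse_wait_desc` is supposed to report what's currently preventing log truncation; if it shows `ACTIVE_BACKUP_OR_RESTORE` while the log is growing, that would line up with the full backup being the culprit:

```csharp
using System;
using System.Data.SqlClient;

// Quick check of the recovery model and whatever is currently
// preventing log truncation for the database.
class LogCheck
{
    static void Main()
    {
        const string connStr = "Server=.;Database=master;Integrated Security=true;";
        const string sql =
            @"SELECT name, recovery_model_desc, log_reuse_wait_desc
              FROM sys.databases
              WHERE name = @db;";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@db", "MyBlobDb");
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: recovery={1}, log_reuse_wait={2}",
                        reader.GetString(0), reader.GetString(1), reader.GetString(2));
                }
            }
        }
    }
}
```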