SQL SERVER – Time Out Due to Executing DELETE on Large RecordSet

Just a day ago, I received the following question:

“I have a large table with more than 1M rows. I want to delete every row in the table. Every time I run a DELETE statement, it times out and does not do its job. The data in the table is useless and I will never need it. Your suggestion, please.”

I decided to write an article about this question because I receive similar questions very often, and I think many readers will find the answer useful. My answer to his question is as follows:

“If DELETE is timing out, use TRUNCATE instead. It will be much faster because it is minimally logged: SQL Server records only the page deallocations in the log file, not each individual deleted row.”


However, I recommended a few additional points to keep in mind while using TRUNCATE.

The TRUNCATE command resets the identity column back to its seed value. If you want to keep the identity value the same, capture the maximum identity before you truncate and reseed to that maximum afterwards (see SQL SERVER – DBCC RESEED Table Identity Value – Reset Table Identity). If you are not using an identity column, or have no use for its value, you do not have to worry about this.

Unless TRUNCATE is executed inside an explicit BEGIN TRANSACTION … COMMIT/ROLLBACK block, you will not be able to roll it back using the log files once the session is closed. Make sure you absolutely do not need the truncated data (see SQL SERVER – TRUNCATE Can’t be Rolled Back Using Log Files After Transaction Session Is Closed).
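Both points can be combined in one script. This is a minimal sketch, assuming a hypothetical table dbo.MyLargeTable with an identity column named id:

```sql
-- Sketch: truncate dbo.MyLargeTable (hypothetical name) while preserving
-- the current identity value. TRUNCATE can still be rolled back as long as
-- the explicit transaction below is open.
DECLARE @maxid BIGINT;
SELECT @maxid = ISNULL(MAX(id), 0) FROM dbo.MyLargeTable;

BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.MyLargeTable;
    -- Reseed so the next inserted row continues from @maxid + 1
    DBCC CHECKIDENT ('dbo.MyLargeTable', RESEED, @maxid);
COMMIT TRANSACTION;
```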

Reference : Pinal Dave (https://blog.sqlauthority.com)


5 Comments

  • Also, TRUNCATE is not so useful when you are deleting “conditionally”,
    e.g. deleting 1M rows from a table with 3M rows.

    In this case:
    – watch for index buildup (maybe drop and re-create the indexes after the delete)
    – watch for log buildup (maybe switch to SIMPLE recovery, then back to FULL)
    – do it in smaller batches
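    A minimal sketch of the batched approach, assuming a hypothetical table dbo.MyTable and an illustrative filter condition:

```sql
-- Sketch: delete matching rows in batches of 5000 so each batch commits
-- quickly and the transaction log space can be reused between batches
-- (table name and filter are hypothetical).
WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.MyTable
    WHERE SomeDate < '2008-01-01';

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to delete
END
```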

  • R Chella Ganesh
    January 17, 2008 6:13 pm

    Hi Pinal,

    When trying to delete 1M records in a single stretch, SQL Server has to write log entries for all 1 million deleted rows before the transaction can complete, which makes the operation very slow. Deleting the records in smaller batches of, say, 500 records at a time may speed up the process.

    This is my opinion. Please correct me if i am wrong.

    Regards
    Chella Ganesh
    C/o Excel Soft Technologies P Ltd.
    Mysore
    India

  • Hi Pinal,

    When I am trying to use DBCC:

    DBCC CHECKIDENT (Schools.tblCaste,reseed,1)

    Msg 102, Level 15, State 1, Line 1
    Incorrect syntax near ‘.’.

  • >> Also TRUNCATE is not so useful when you are deleting “conditionally”

    I do it as a while loop — something like this:

    while (select count(1) from mytable where indexnum < 100000) > 0
    begin
        delete top (5000) from mytable where indexnum < 100000
    end

    That loops through and takes the rows out in chunks. It will still grow your transaction log, but it shouldn’t be as bad, and you don’t face a time-out issue.

  • DBCC CHECKIDENT ('Schools.tblCaste', RESEED, 1)

