
Thread: Large Updates on SQL 6.5 - suggestions???

  1. #1
    Join Date
    Dec 2002
    Posts
    5

    Question Large Updates on SQL 6.5 - suggestions???

    Hi,
    I am new here and am hoping someone might have performance tuning suggestions for a large update I have to do.

    I am updating 350,000 records on a 6.5 database, incrementing through them 5,000 at a time. It took about 17 hours to complete the update!!!

    Does this sound really off? Is there anything I can do to speed this up? I am doing a straight update. I am not specifying any cursor and am not using any temp tables... but I am doing it in a stored procedure. Any suggestions about what I can do better would be very much appreciated.

    Thanks again.

  2. #2
    Join Date
    Nov 2002
    Location
    New Jersey, USA
    Posts
    3,932
    What do you mean by incrementing? Are you committing every 5,000 rows?

    There can be many reasons for the performance problem. The first thing you should do is run Performance Monitor to see where the bottleneck is.

    Also check the showplan to see how the update is executed.

    Plus
    1. Are you updating the clustered index key? If so, every time a row is modified it may have to be relocated.

    2. Are there any foreign key constraints in this table? Or are there other tables referencing this table?

    3. Are your tempdb and log files on different drives?
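
    In 6.5 you can check the first two from isql. For example (tblCusInfo and names here are stand-ins -- substitute your actual table and column):

    --list indexes on the table; this shows whether there is a clustered index
    exec sp_helpindex tblCusInfo
    go

    --show the plan for the update without actually executing it
    set showplan on
    set noexec on
    go
    update tblCusInfo set names = names where system_id between 128 and 5000
    go
    set noexec off
    set showplan off
    go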

  3. #3
    Join Date
    Dec 2002
    Posts
    5

    Red face Large Number of Updates in 6.5 - Reply

    Hi skhanal,
    Thanks so much for your reply!
    The database is a test database that I am working with in preparation for updating the real db next month. When I run the stored procedure I am the only one in the database, so there are no other connections that could cause additional locks. I am incrementing 5,000 at a time because when I tried to run it on the whole table I got errors on syslogs, tempdb, and the transaction log, even though truncate log on checkpoint was turned on.

    The procedure that I run checks the system_id (the PK) and uses 3 variables: high, low, and current. I update the "Name" field 5,000 system_ids at a time, and that seems to have solved my problem with filling up the logs and causing errors.

    1 - I'm not sure if this db uses clustered indexes. My query is based on the primary key, but I am not updating a primary or foreign key field, just an ordinary field in the table.
    If I were, what could I do differently to help?

    2 - Yes, there are foreign keys in the table, and there are several tables that reference this table.

    3 - Tempdb and the log files are located on the same partition.

    4 - I have not run Performance Monitor to see what the bottleneck could be. I'm not sure I would know what to do if I found one, because it's just one table and one field I have to update. Maybe a temporary table???

    Would a dynamic cursor be good? It's my understanding that cursors are not recommended in stored procedures.

    Maybe I should do an explicit transaction? Would that make any difference?
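
    For example, something like this around each batch? (names and the new value are placeholders here):

    begin tran
    update tblCusInfo
    set names = <new value>
    where system_id between @lowest and @current
    commit tran

    --then force a checkpoint so the log can truncate
    checkpoint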

    I hope this helps some. I just don't know where to go with this. The timing is important, and I have to find a way to make this go as fast as it possibly can!

    Thanks again so much for taking the time.

    TedBud

  4. #4
    Join Date
    Dec 2002
    Location
    London, ON - Canada
    Posts
    6
    Do you have an example of the update? Are you using a HAVING clause or WHERE? Are you using CASE?

    Thanks,

    Steve

  5. #5
    Join Date
    Dec 2002
    Posts
    5

    Unhappy SQL Updates 6.5 - update script

    Hi,
    Thanks for the post!!! Here is what
    I'm putting in a stored procedure.
    Any thoughts are most appreciated.


    declare @high int, @lowest int, @current int

    --initialize the loop boundaries
    select @high = 350000, @lowest = 128, @current = 5000

    --loop until every system_id up to @high has been covered
    while (@lowest <= @high) begin

    --update one batch of rows
    update tblCusInfo
    set names = <new value>
    where system_id between @lowest and @current

    --advance to the next batch, starting just past the last batch's end
    select @lowest = @current + 1, @current = @current + 5000
    end

    Thanks

  6. #6
    Join Date
    Dec 2002
    Location
    London, ON - Canada
    Posts
    6
    --During the loop:
    update tblCusInfo
    set names = <new value>
    where system_id between @lowest and @current

    ...what are you setting the names field to? Is the value coming from another table? Are you setting it to null? The SET clause seems to be missing something.

    ***

    Plus, as skhanal mentioned, check the indexing. If the table has nonclustered indexes, drop them before your update. It's usually faster to drop and recreate them than to update data while they're in place.

    If there's a clustered index, removing it will cause other nonclustered indexes to be rebuilt.

    It also may be more efficient to run sequentially, and not by increments of 5000. Is there a reason for this?
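
    For example (ix_names is a made-up index name here -- run sp_helpindex against your table to get the real ones):

    --drop nonclustered indexes before the mass update
    drop index tblCusInfo.ix_names
    go

    --run the batched update, then rebuild the index afterwards
    create index ix_names on tblCusInfo (names)
    go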

  7. #7
    Join Date
    Dec 2002
    Posts
    5

    Large Updates 6.5

    Hi,
    I will check for the indexes to see.

    I guess I'm not sure what you mean by sequentially. Should I list a series of ranges? Do you have a quick example of what you mean?

    I'm doing it this way on advice; it's also what I'm used to doing from other programming that I've done. If there is a quicker and easier way, I'm all for it!!!

    Thanks again so much.

  8. #8
    Join Date
    Dec 2002
    Location
    London, ON - Canada
    Posts
    6
    I was wondering why you're jumping 5000 at a time, instead of going one-by-one?

  9. #9
    Join Date
    Dec 2002
    Posts
    5

    Large Updates 6.5 -

    Hmmmmm, interesting! I originally started with that and was able to update 35,000 records in about 1.5 hours. I chose batches because I spoke to someone who works on the same system and just went through this conversion. They had about the same number of records and were on 6.5, and they indicated that 5,000 at a time seemed to work well for them! I have never updated so many records at one time... let alone on 6.5!

    Scaling up from my first test, 35k in 1.5 hours means 350k would be about 15 hours...
    I think I'll give it a shot again and see if it works out!

    Thanks,

  10. #10
    Join Date
    Dec 2002
    Location
    London, ON - Canada
    Posts
    6
    Also, I noted that you're running this on a test database and are the only user. You should test the effect of concurrent activity while updating those records. That will help you decide when to do it in production (after hours or on a weekend?), and tell you whether you have to kill active spids and put the database in dbo-use-only mode. It goes without saying, but do a full backup before the conversion.

    I'm guessing that your update will lock the table, so read-only queries may be fine, but other writes to the table will be locked out. You should test that theory to see what works and what freezes up. Monitor for blocked processes.

    Once one process is blocked, it will block others, and the application will gridlock sooner or later. I can email you a screenshot of that happening in my test database running a web app, if you'd like.

    The only way around the table lock is using cursors, which would undoubtedly slow your script. You may want to try CASE, but it will probably also slow down your process.
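
    sp_who and sp_lock are the quickest way to watch for blocking while you test:

    --a nonzero blk column means that spid is blocked by the spid shown
    exec sp_who
    go

    --shows which locks each spid is currently holding
    exec sp_lock
    go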

    Good luck,

    Steve

  11. #11
    Join Date
    Nov 2002
    Location
    New Jersey, USA
    Posts
    3,932
    One more thing: are there any triggers on the table?
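
    You can check quickly from isql; triggers show up in sysobjects with type 'TR':

    --any rows returned means the database has triggers defined
    select name from sysobjects where type = 'TR'
    go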
