
Thread: Help SQL SERVER 7.0 Memory ISSUES

  1. #1
    elsie reed Guest

    Help SQL SERVER 7.0 Memory ISSUES

    Colleagues,

    I am currently working on a data warehousing team using SQL Server 7.0. We are loading several large (60 to 80 MB) ASCII files into the database and are having speed and performance problems. One thing we see is that memory usage on our server is completely maxed out while our loads are running.

    Our server has four processors and 1.5 GB of memory. What else can I provide? These DTS packages are extensive and complex; many transformations and lookups are used to aggregate the information. This is a financial website that aggregates several billing systems into one unified view. As you can see, timeliness is critical, especially at the end of each month, and some of these loads take 12 to 15 hours. Have you encountered similar problems? Any advice or suggestions would be greatly appreciated.

  2. #2
    MAK Guest

    Help SQL SERVER 7.0 Memory ISSUES (reply)

    1. Increase the transaction log size considerably.
    2. Increase the network packet size if you are uploading from a different server.
    3. Split any UPDATE statements that use CONVERT, LEFT, RIGHT, or SUBSTRING into plain updates, and execute those after the regular updates.
    4. Change the SQL Server memory setting to dynamic, with a maximum of 1.2 GB and a minimum of 700 MB.
    5. Increase the pagefile.sys size considerably.
    6. Stop unnecessary NT services.
    7. Increase the SQL Server priority.
    8. Make sure no other jobs run on that server and drag down performance.
    9. Put the transaction log and the data files on different drives.
    10. Put tempdb on a different drive.
    11. Drop all clustered indexes before updating and recreate them after the update.

    **** That's what I can think of ****
    -MAK
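
    Point 4 above can also be set from a query window instead of Enterprise Manager. A minimal sketch, assuming the SQL Server 7.0 `sp_configure` option names and the limits from the list above (1.2 GB expressed in MB):

    ```sql
    -- Expose the advanced options, which include the memory settings.
    EXEC sp_configure 'show advanced options', 1
    RECONFIGURE WITH OVERRIDE
    GO

    -- Let SQL Server manage memory dynamically between a floor and a ceiling.
    EXEC sp_configure 'min server memory', 700    -- floor, in MB
    EXEC sp_configure 'max server memory', 1228   -- ceiling, roughly 1.2 GB
    RECONFIGURE WITH OVERRIDE
    GO
    ```

    Capping the maximum below physical RAM leaves headroom for the OS and the DTS packages themselves, which run outside the SQL Server memory pool.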


    ------------
    elsie reed at 2/6/01 1:33:55 PM

  3. #3
    Elsie Reed Guest

    Help SQL SERVER 7.0 Memory ISSUES (reply)

    Thanks for your response. Where do I go to increase the pagefile.sys?


    ------------
    MAK at 2/6/01 1:51:09 PM

  4. #4
    Dan Tan Guest

    Pagefile.sys (Help SQL SERVER 7.0 Memory ISSUES (reply))


    START/SETTINGS/CONTROL PANEL/SYSTEM

    Once there, click the "PERFORMANCE" tab and you'll see a "Virtual Memory" section with a "Change" button.

    Click the "Change" button. In the next dialog box, you can specify one or more paging files. Each hard drive partition can contain only one paging file, and the system partition must contain a page file of at LEAST 2 megabytes (we usually keep a minimal paging file on the system partition and a large one on a different partition).

    Click "Set" when you are done, then close the dialog boxes.

    You'll probably have to reboot.

    Good luck!

    BCP is much faster. We have the same issue with imports from text files that require lots of conversion and filtering. If the text files can be exported in a format that needs no converting or filtering, BCP (the Bulk Insert task in DTS) is several times faster than a standard DTS transformation.
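
    A minimal sketch of the Bulk Insert approach, assuming a hypothetical staging table and file path and a tab-delimited export that needs no conversion:

    ```sql
    -- Load a clean, pre-formatted text file straight into a staging table.
    -- The database, table, and file names here are hypothetical.
    BULK INSERT StagingDB.dbo.BillingStage
    FROM 'D:\loads\billing_200101.txt'
    WITH (
        FIELDTERMINATOR = '\t',   -- tab-delimited export
        ROWTERMINATOR   = '\n',
        TABLOCK                   -- table lock helps minimize logging on the load
    )
    ```

    Any conversions and lookups can then be done in set-based SQL against the staging table, rather than row by row inside a DTS transformation.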

    Dan Tan
    dan.tan@tekelec.com
    www.tekelec.com



    ------------
    Elsie Reed at 2/6/01 4:55:03 PM
