I am trying to set up a DTS to transfer logging data from one server to another.
Some of the records may already exist at the destination, which causes a primary key violation. I do not want that error to cause the entire DTS to fail.
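
To show what I mean, here is a simplified, made-up version of one of the destination tables and the kind of row-by-row insert the DTS data pump effectively performs. Only the table name "event" and the constraint name PK_event come from my actual setup; the columns are invented just for illustration:

    -- Simplified stand-in for the destination table (the real event table
    -- has different columns); PK_event matches the constraint in the errors below.
    CREATE TABLE event (
        eventId   int          NOT NULL CONSTRAINT PK_event PRIMARY KEY,
        eventTime datetime     NOT NULL,
        message   varchar(255) NULL
    )

    -- The DTS data pump inserts one row at a time, roughly like this:
    INSERT INTO event (eventId, eventTime, message)
    VALUES (1, GETDATE(), 'example row')

    -- Sending the same row again (same eventId) fails with the
    -- "Violation of PRIMARY KEY constraint 'PK_event'" error quoted below.
    INSERT INTO event (eventId, eventTime, message)
    VALUES (1, GETDATE(), 'example row')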

When I execute the DTS I created by right-clicking and selecting "Execute Package", it shows me 2 errors. Despite the errors, the rows that do not have a primary key violation are successfully transferred to the destination database.
Here are the 2 errors I see:

Error 1:
Error at Destination for Row number 97. Errors encountered so far in this task: 97.
The statement has been terminated.
Violation of PRIMARY KEY constraint 'PK_event'. Cannot insert duplicate key in object 'event'.

Error 2:
Error at Destination for Row number 198. Errors encountered so far in this task: 198.
The statement has been terminated.
Violation of PRIMARY KEY constraint 'PK_eventDetail'. Cannot insert duplicate key in object 'eventDetail'.


These errors make sense: there were 97 duplicate rows in the event table and 198 duplicates in the eventDetail table.

This is the behavior I want: the new rows are copied to the destination database and the duplicates are rejected without stopping the transfer.
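
In other words, the end result I am after is the equivalent of the statement below. This is only to illustrate the goal; sourceServer is a made-up linked server name and the columns are the simplified ones from the sketch above:

    -- Desired end result: copy only the rows whose key is not already present.
    INSERT INTO event (eventId, eventTime, message)
    SELECT s.eventId, s.eventTime, s.message
    FROM sourceServer.loggingDb.dbo.event AS s
    WHERE NOT EXISTS (SELECT 1 FROM event AS d WHERE d.eventId = s.eventId)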

When I schedule the DTS as a job in Enterprise Manager, things change. When the DTS is executed as a job (as opposed to me right-clicking and selecting "Execute Package"), the job reports a failure and none of the new rows are transferred to the destination database.
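
For what it is worth, the job Enterprise Manager created has a single operating system command (CmdExec) step that runs the package through dtsrun. Enterprise Manager actually stores an encrypted /~Z command string; a readable equivalent, with placeholder server and package names, would be something like:

    REM Readable stand-in for the scheduled job step (the real step uses an encrypted /~Z string)
    dtsrun /S MYSERVER /E /N "TransferLoggingData"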

Why does the DTS transfer the rows that do not violate the primary key constraint when I execute it manually, but not when it is executed as a job?

How can I get the DTS to function as desired?

Thanks,

Andy