I have developed an Access 97 database that processes furniture parts to be manufactured, so it holds order numbers, part numbers, module numbers, sizes, etc. I'll call it database D. In addition, each customer has its own database that grows over the years; these customer databases can reach 50 MB or more. The purpose of database D is to temporarily pull in one day's worth of data from the customer databases and then create reports to assist manufacture. We are limited in the speed and server space available, which is why we're using this approach in the first place rather than keeping all customers in one database. So, admittedly, the broad approach I'm taking is less than ideal.
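For reference, the daily pull is roughly the following shape. This is only a sketch; the table and field names (tblParts, tblPartsTemp, JobDate) are stand-ins for the real ones:

    ' Rough shape of the daily pull into database D (DAO 3.5).
    ' tblParts, tblPartsTemp and JobDate are placeholder names.
    Sub PullCustomerDay(strCustMdb As String, dtmDay As Date)
        Dim db As Database
        Dim strSQL As String

        Set db = CurrentDb

        ' Jet's IN clause reads straight from the external customer .mdb,
        ' so no permanent link is needed in database D. SELECT * assumes
        ' the temp table's schema matches the customer table's.
        strSQL = "INSERT INTO tblPartsTemp " & _
                 "SELECT * FROM tblParts IN '" & strCustMdb & "' " & _
                 "WHERE JobDate = #" & Format(dtmDay, "mm\/dd\/yyyy") & "#;"

        ' dbFailOnError raises a trappable error instead of silently
        ' discarding rows the engine could not insert.
        db.Execute strSQL, dbFailOnError
        Debug.Print db.RecordsAffected & " rows pulled from " & strCustMdb
    End Sub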

I use quite a lot of VBA code for this. Temp tables are loaded and appended to other tables, there are delete queries, summations; a lot happens in there. I have fought with a lack of reliability: it seems that data gets lost in the shuffle. For the most part I have overcome the problems, so in practice we are finding that it serves OK. But to assure this I used some techniques that seem unorthodox to me. I cite four in particular:
1. I have a series of processes divided into two or more sections, each section triggered by clicking a button. This gives reliable processing that does not happen when the whole sequence is triggered by a single command.
2. Completely closing out of the database and reopening it between processing segments.
3. Stopping and asking for a manual "continue" between records.
4. Building the append query in discrete VBA code rather than just running a saved Access append query (rough sketch below).
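To show what I mean in point 4, here is the rough shape of one of those hand-built appends, including the cleanup delete mentioned above. Again, the table and field names are placeholders:

    ' Point 4 in outline; tblPartsTemp and tblPartsWork are placeholders.
    Sub AppendTempToWork()
        Dim db As Database
        Set db = CurrentDb

        ' Building the SQL in code and running it with Execute/dbFailOnError
        ' gives a trappable error and a RecordsAffected count, neither of
        ' which I get from just running the saved append query.
        db.Execute "INSERT INTO tblPartsWork (OrderNum, PartNum, ModuleNum, PartSize) " & _
                   "SELECT OrderNum, PartNum, ModuleNum, PartSize FROM tblPartsTemp;", _
                   dbFailOnError
        Debug.Print db.RecordsAffected & " rows appended"

        ' The cleanup delete, same pattern.
        db.Execute "DELETE FROM tblPartsTemp;", dbFailOnError
    End Sub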

It seems to me that I should rightfully be able to expect consistently reliable data to come out without resorting to these band-aids and crutches.
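To be concrete about what "reliable" means to me: each segment should either complete in full or leave nothing half-done, the way a DAO transaction would. A minimal sketch of that expectation, again with placeholder table names (this is not what my code currently does):

    ' All-or-nothing processing of one segment (DAO 3.5 sketch).
    Sub RunSegmentAllOrNothing()
        Dim ws As Workspace
        Dim db As Database

        Set ws = DBEngine.Workspaces(0)
        Set db = ws.Databases(0)        ' the database open in Access

        On Error GoTo Bail
        ws.BeginTrans

        db.Execute "INSERT INTO tblPartsWork SELECT * FROM tblPartsTemp;", dbFailOnError
        db.Execute "DELETE FROM tblPartsTemp;", dbFailOnError

        ws.CommitTrans
        Exit Sub

    Bail:
        ws.Rollback                     ' nothing half-done is left behind
        MsgBox "Segment failed and was rolled back: " & Err.Description
    End Sub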

OK, garbage in, garbage out; but I expect to get ALL of the garbage, neatly packaged. Any suggestions?