Making a copy of a database for query\reporting

B. J. Scherer

I agree with your statement that you need to assess the value of the data
being up to date and the amount of DASD / tape that you need to accommodate
a replication / propagation tool. The additional log activity due to Data
Capture is an initial fear, but those who have implemented it have not
complained about the additional logging. Depending on the activity, the
additional logging may be minimal. If there is a lot of activity, that is
all the more reason to give the users their data in a timely manner. Either
way, the ability to keep your database up to date at the cost of some
additional log space and a few extra cartridges should be payback enough.
The other benefit is the ability to propagate the changes to other platforms
or build summaries without making multiple passes through the data.
Depending on the user needs and the perceived benefit of the application /
tool, their budget may buy some additional cheap DASD and tape so they can
get daily / hourly data rather than waiting a month.

-----Original Message-----
From: Massimo Scarpa [mailto:[login to unmask email]
Sent: Tuesday, January 04, 2000 3:41 AM
To: [login to unmask email]
Subject: Re: [DB2-L] Making a copy of a database for query\reporting


We had three types of databases to replicate. We didn't use any 3rd-party tools.

- A small one, which we replicate with UNLOAD/LOAD via shared tape and shared
GDGs (STORAGETEK NEARLINE shared between prod and test) during the night. We
generated the JCL once; it is then scheduled and its parameters are edited via
CONTROL-M (with the auto-edit feature). It is a production db even though it
was created in a test LPAR. (A sketch of an unload step is shown after this
list.)

- Two big dbs. One was refreshed every month (more or less), because of the
huge size of its tables, with UNLOAD/LOAD; the other was duplicated via
DSN1COPY with OBIDXLAT DD cards, followed by RECOVER INDEX(es) (a sketch of
the DSN1COPY step also follows below). All dbs were updated during the night,
and everything was auto-edited and scheduled via CONTROL-M.
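
For the UNLOAD/LOAD jobs, a rough sketch of an unload step using the DSNTIAUL
sample program is below. The subsystem name (DSNP), library, GDG bases and
table name are made-up placeholders, not our real setup:

//UNLOAD   EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSTSIN  DD *
 DSN SYSTEM(DSNP)
 RUN PROGRAM(DSNTIAUL) PLAN(DSNTIAUL) -
     LIB('DSNP.RUNLIB.LOAD')
 END
/*
//* Table to unload: rows go to SYSREC00, and DSNTIAUL punches
//* matching LOAD statements to SYSPUNCH for the load job
//SYSIN    DD *
 MYSCHEMA.MYTABLE
/*
//SYSREC00 DD DSN=PROD.UNLOAD.MYTABLE(+1),UNIT=TAPE,
//            DISP=(NEW,CATLG,DELETE)
//SYSPUNCH DD DSN=PROD.LOADCTL.MYTABLE(+1),UNIT=TAPE,
//            DISP=(NEW,CATLG,DELETE)

The load job on the other LPAR then runs the LOAD utility against the same
GDG generations, which is why the shared NEARLINE tape is handy here.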

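For the DSN1COPY path, the step looks roughly like the sketch below. The VSAM
dataset names and the numbers in SYSXLAT are made-up examples: the real
DBID/PSID/OBID values come from SYSIBM.SYSTABLESPACE and SYSIBM.SYSTABLES on
the source and target subsystems, and the target objects must already exist
with compatible definitions:

//COPY     EXEC PGM=DSN1COPY,PARM='OBIDXLAT,RESET'
//STEPLIB  DD DSN=DSNP.SDSNLOAD,DISP=SHR
//SYSPRINT DD SYSOUT=*
//* SYSXLAT records are source,target pairs:
//* first the DBIDs, then the tablespace PSIDs, then the table OBIDs
//SYSXLAT  DD *
260,280
2,2
3,5
/*
//SYSUT1   DD DSN=DSNP.DSNDBC.PRODDB.TS00001.I0001.A001,DISP=SHR
//SYSUT2   DD DSN=DSNT.DSNDBC.TESTDB.TS00001.I0001.A001,DISP=OLD

RESET clears the log RBAs in the copied pages so the target subsystem does
not treat them as down-level; afterwards the indexes are rebuilt with the
RECOVER INDEX step mentioned above.
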
We would like to use Platinum's LOG ANALYZER, but we would have to enable
DATA CAPTURE on the tables with an ALTER TABLE (sketched below), and you must
consider that your log size will increase.
All the tools we checked had disadvantages, but the most important one is the
huge quantity of data to replicate (and the log generated by heavy
insert/update/delete activity, i.e. running out of space on the log DASD).
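
The ALTER in question is just the following (the table name is a
placeholder); it must be run for each table the log-based tool is to track,
and it is what makes DB2 write full row images to the log for updates, hence
the extra log volume. DATA CAPTURE NONE turns it back off:

  ALTER TABLE MYSCHEMA.MYTABLE
    DATA CAPTURE CHANGES;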

I found that one of the most important parameters in deciding which solution
is best is how long the duplicate database may remain unusable (and out of
sync) for the applications. If the window is narrow, the best solution is to
use replication tools; if the window is large enough, the best (and cheapest)
solution is to use DSN1COPY or UNLOAD/RELOAD in parallel for each tablespace.

I hear that DATA REPLICATOR is a good tool, but I have not tested it.
I know someone who used RVA SNAPSHOT COPY, but I don't know how.

Regards
Max Scarpa
Data & system Admin
CESVE SpA

Standard disclaimers apply