Sunday, February 12, 2012

Can I pick your brains for a second? A large-database transfer problem.

Okay, I have a problem that I need some suggestions with.
I have a 30 GB database that I need to sync across a T1 every night. This
database is a data dump of our billing system; the tables are dropped
and re-created every night.
I can do a very long and laborious process on the server: create tables
with the primary keys from each table and a binary checksum of each row,
then compare them to a copy of the database from the day before, and then
generate SQL statements for the updates (actually deletes and inserts). All
of this happens on the source server.
I'd like to compact this down using a DTS package on the destination side;
however, I can't use a dynamic query in the OPENROWSET command to get only
the rows that I want.
Here is what I'm doing (just for one table; I'll have data-driven steps for
each table in my DB):
insert into @tempChecker
select BSum, chgNo
from openquery(EMBAN2, 'select BINARY_CHECKSUM(*) BSum, chgno from charge_t')

declare Charge_t_Cursor cursor for
select a.chgno
from (select BSum, chgno from @tempChecker) a
where not exists
      (select 'x'
       from charge_t b
       where a.chgno = b.chgno and a.BSum <> BINARY_CHECKSUM(*))
Once I get all the chgno's (the key in the table), I want to pull back only
the data that has changed. So I have this (which I know I can't do, but you
get the idea):
open Charge_t_Cursor
fetch next from Charge_t_Cursor into @ChgNo
while @@FETCH_STATUS = 0
begin
    insert into @ChargeT
    select *
    from OPENROWSET(EMBAN2, 'SELECT * FROM charge_t WHERE chgno = '
         + cast(isnull(@ChgNo, 0) as varchar))
    fetch next from Charge_t_Cursor into @ChgNo
end
close Charge_t_Cursor
deallocate Charge_t_Cursor

select * from @ChargeT
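Since OPENQUERY and OPENROWSET only accept a constant query string, the usual workaround is to build the entire INSERT ... OPENQUERY statement as a string and run it with EXEC. This is only a sketch against the setup described here: it assumes the EMBAN2 linked server from the post, and it swaps the @ChargeT table variable for a #ChargeT temp table, because a table variable is not visible inside EXEC's scope.

```sql
-- Sketch of the dynamic-SQL workaround (assumes linked server EMBAN2
-- and a #ChargeT temp table with the same columns as charge_t; a
-- table variable would not be visible inside EXEC's scope).
declare @sql nvarchar(4000)
set @sql = N'insert into #ChargeT select * from openquery(EMBAN2, '
         + N'''select * from charge_t where chgno = '
         + cast(isnull(@ChgNo, 0) as varchar(20)) + N''')'
exec (@sql)
```

An alternative worth considering is to avoid the cursor entirely and push the whole list of changed keys to the remote server in a single OPENQUERY (for example, with an IN list or a join against a staging table), so the wire sees one round trip instead of one per row.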
This way I'm only pulling over the wire the data that has been updated, and
it will be in a nice, compact binary format. ANY help would be VERY much
appreciated!
Thanks much!
Scott

Have you tried exporting your tables into flat files, compressing those
files (using WinZip or anything else), copying the files, and loading them
on the destination server?
You could reduce the transfer from 30 GB to 1 GB.
It's more complex than a DTS pump, but it could be a good solution.
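As a hedged sketch of that flat-file route (the database name, server name, and paths below are placeholders, and xp_cmdshell must be enabled): bcp can export each table in compact native format, and the file can then be compressed before it crosses the T1.

```sql
-- Hypothetical example: export charge_t in native format with bcp,
-- then compress it with makecab before copying it to the destination.
-- Database name, server name, and paths are placeholders.
exec master..xp_cmdshell
    'bcp Billing..charge_t out D:\sync\charge_t.dat -n -T -S SOURCESRV'
exec master..xp_cmdshell
    'makecab D:\sync\charge_t.dat D:\sync\charge_t.cab'
```

On the destination side, the expanded file can be loaded back with `bcp ... in` or BULK INSERT after the nightly table re-creation.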
Another way: you could use replication.
Or you could create your own "replication" model, that is, add triggers on
your tables to log changes into another table in your source system, then
load only those changes every night.
Do you have a "last update date" column in your source tables?
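A minimal sketch of that trigger-based change log for one table (the chgno key comes from the post; the log table and trigger names are invented):

```sql
-- Hypothetical change log for charge_t: the trigger records which keys
-- changed and how, so the nightly job can ship only those rows.
create table charge_t_log (
    chgno  int      not null,
    action char(1)  not null,          -- 'I'nsert, 'U'pdate, or 'D'elete
    logged datetime not null default (getdate())
)
go
create trigger trg_charge_t_log on charge_t
after insert, update, delete
as
begin
    -- inserted rows (covers INSERT, and the new side of UPDATE)
    insert into charge_t_log (chgno, action)
    select i.chgno,
           case when exists (select * from deleted) then 'U' else 'I' end
    from inserted i
    -- pure deletes
    insert into charge_t_log (chgno, action)
    select d.chgno, 'D'
    from deleted d
    where not exists (select * from inserted)
end
```

Note that this only helps on the real source billing system; it can't go on the dump database itself, since those tables are dropped and re-created every night.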
"Scott M" <scott@.nospam.com> wrote in message
news:ezl3f2saFHA.3120@.TK2MSFTNGP12.phx.gbl...
