-=-=- Sorry for the dup post, I'm posting it under my MSDN universal acct now

Okay, I have a problem that I need some suggestions on.
I have a 30 GB database that I need to sync across a T1 every night. This
database is a data dump of our billing system data; the tables are dropped
and re-created every night.
I can do a very long and laborious process on the server: create tables
with the primary keys from each table and a binary checksum of each row, then
compare that to a copy of the database from the day before. Then I
generate SQL statements for the updates (actually deletes and inserts). All
of this on the source server.
I'd like to compact this down using a DTS package on the destination side;
however, I can't use a dynamic query in the OPENROWSET command to get only
the rows that I want.
Here is what I'm doing (just for one table; I'll have data-driven steps for
each table in my DB):
insert into @tempChecker
select BSum, chgno
from openquery(EMBAN2, 'select BINARY_CHECKSUM(*) BSum, chgno from charge_t')

declare Charge_t_Cursor cursor
for
select a.chgno
from
(
    select BSum, chgno
    from @tempChecker
) a
where not exists
(
    select 'x'
    from charge_t b
    where a.chgno = b.chgno and a.BSum <> BINARY_CHECKSUM(*)
)
Once I get all the chgno's (the key in the table) I want to pull back only
the data that has changed. So I have this (which I know I can't do, but you
get the idea):
open Charge_t_Cursor
fetch next from Charge_t_Cursor into @ChgNo
while @@FETCH_STATUS = 0
begin
    insert into @ChargeT
    select *
    from OPENROWSET(EMBAN2, 'SELECT * FROM charge_t WHERE chgno = ' +
        cast(isnull(@ChgNo, 0) as varchar))
    fetch next from Charge_t_Cursor into @ChgNo
end
close Charge_t_Cursor
deallocate Charge_t_Cursor
select * from @ChargeT
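(For what it's worth, since OPENROWSET/OPENQUERY won't accept a concatenated string as the query argument, one common workaround is to build the entire statement as a string and run it with EXEC(). This is only a rough sketch, untested here; ChargeT_Staging is an assumed permanent staging table, because a table variable like @ChargeT is not visible inside EXEC():)

```sql
-- Sketch only: builds the remote statement as a string so the key can vary,
-- then runs it with EXEC(). ChargeT_Staging is an assumed staging table;
-- a table variable such as @ChargeT cannot be referenced inside EXEC().
declare @sql nvarchar(4000)
set @sql = 'insert into ChargeT_Staging ' +
           'select * from openquery(EMBAN2, ' +
           '''select * from charge_t where chgno = ' +
           cast(isnull(@ChgNo, 0) as varchar(20)) + ''')'
exec (@sql)
```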
This way I'm only pulling over the wire the data that has been updated, and it
will be in a nice, compact binary format. ANY help would be VERY appreciated!!!
Thanks much!
Scott

Hello Scott,
It is not easy to use a DTS package for this job. I think database
replication is more appropriate for this situation; you might consider
transactional or merge replication, depending on your requirements.
If you still want to use DTS, the lookup query feature might be helpful:
http://msdn.microsoft.com/library/d...-us/dtssql/dts_addf_misc_2dix.asp
Regards,
Peter Yang
MCSE2000/2003, MCSA, MCDBA
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
Would replication work if the source tables are dropped every time they're
built?
"Peter Yang [MSFT]" <petery@online.microsoft.com> wrote in message
news:s73X27waFHA.3928@TK2MSFTNGXA01.phx.gbl...
Have you tried creating a local copy, zipping it, sending the file by FTP,
then unzipping and loading the data?
For example, you could export the new data from the source server to CSV files.
A CSV file can be loaded into a database very quickly using bulk insert
commands.
Or you could detach the database, zip it, copy it, and reattach it on the
target server. Or do the same with a backup file.
I have a demo database with 5 GB of data; when I compress it, the size is only
150 MB, and the transfer time is excellent (10 min to compress, 10 min to
transfer, 5 to expand, instead of 2 hours of transfers).
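(To illustrate the bulk-load step — a minimal sketch only; the file path, field
terminator, and row terminator are placeholders to adjust to the actual export
format:)

```sql
-- Sketch: load an exported CSV into the destination table.
-- The path and terminators are assumptions; change them to match the export.
bulk insert charge_t
from 'D:\transfer\charge_t.csv'
with (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK  -- allows a minimally logged load into an empty table
)
```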
"Scott M" <scott_M@nospam.nospam> wrote in message
news:ui2eBfuaFHA.584@TK2MSFTNGP15.phx.gbl...
Yes, the database files are too large to zip in a decent amount of time;
however, we are zipping and transferring the flat files that the UNIX box is
creating.
"Jj" <willgart@BBBhotmailAAA.com> wrote in message
news:Omk62fNdFHA.3620@TK2MSFTNGP09.phx.gbl...