How can I design a script to push or pull results from commands run on one server to another?

For example, is there a way I can push the results from a script run on a local server into a script on a remote server that does something with the results, without having to land the results on disk?

Or is there a way I can pull the results from a script/command run on a remote server into a script on a local server that does something with the results without having to land the results on disk?

Can I use a pipe or something else to accomplish this? If so, could someone provide me with an example?

The problem I am trying to solve is a lack of disk space. I have data in a database on one server that I want to unload and load into a database on another server. I know how to run the unload command on the one server and the load command on the other server, and I know how to put these in UNIX scripts. What I do not know is how to run the unload in a script so that the table output is pushed straight through to the load program on the remote server, without landing the unloaded data in an interim file on disk along the way. I am fine with either pushing or pulling the data, whichever is easiest; if I can do it either way, I can decide which server has the most resources available and use the push or pull approach based on that.
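
Something along these lines is what I have in mind, if it is even possible; the command names here are only placeholders, not anything I have actually tried:

Code:
# push: run the unload locally and stream its output to the load on the remote server
unload_command | ssh remote_host "load_command"

# pull: run the unload on the remote server and feed its output to the local load
ssh remote_host "unload_command" | load_command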

Thanks in advance.
 
Can't you access the database remotely?

Hope This Helps, PH.
FAQ219-2884
FAQ181-2886
 
You should look in the direction PHV is suggesting. In Oracle you can create a database link between two separate databases. Then your data transfer can be as simple as this...

Code:
-- REMOTE_DB_LINK is the database link created with CREATE DATABASE LINK
INSERT INTO LOCAL_TABLE
SELECT * FROM REMOTE_TABLE@REMOTE_DB_LINK;

I'm sure other databases can do similar things.

 
Netezza does not support access to remote databases the way Oracle and other databases do; it is not as sophisticated as many other relational databases in this respect. I am trying to find a way to migrate data to it without having to land the data in files, because we do not have the storage for that.
 
Also take a look at the Netezza Data Loading Guide PDF provided by IBM. It looks like the nzload utility can pull data from an ODBC data source, which pretty much opens up most relational databases.
 
The other database is DB2 LUW, so the Oracle information is not relevant at this time. We do have Oracle databases, though, and at some point in the future we may need to load information from those sources, so this is very helpful information that I may well need later. Thank you so much.

Yes, thanks, ODBC is an option, and I have been reading and working with the Netezza Data Loading Guide. Some of the tables involved in this project are quite large, with billions and billions of rows. ODBC can be an option for small tables whose data types match, but I am running into some issues getting columns with date or timestamp data types to work correctly. It is not a simple matter of INSERT INTO ... SELECT FROM, because I cannot connect to both databases at the same time and because the date and timestamp formats in these two databases are different. I have, however, figured out that there are options I can use with the DB2 LUW EXPORT or DB2HPU (High Performance Unload) utility and the NZLOAD utility to work around the date and timestamp conversion issue.
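
As a rough sketch of the kind of options I mean (the modifier and option names are from my notes, so verify them against the DB2 Data Movement and Netezza Data Loading guides; all object names, hosts, and credentials are made up):

Code:
# DB2 LUW side: unload a small delimited sample with an explicit timestamp format
db2 "EXPORT TO /tmp/sample.del OF DEL
     MODIFIED BY COLDEL| TIMESTAMPFORMAT=\"YYYY-MM-DD HH:MM:SS\"
     SELECT * FROM myschema.mytable FETCH FIRST 1000 ROWS ONLY"

# Netezza side: tell nzload how the incoming dates and timestamps are formatted
nzload -host nzhost -db mydb -u admin -pw password -t mytable \
       -df /tmp/sample.del -delim '|' -dateStyle YMD -dateDelim '-'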

Testing the DB2 LUW unload utilities and the Netezza NZLOAD utility has shown them to be much faster than the ODBC interface. In addition, I think I may be able to use a pipe between the two utilities to avoid the huge storage requirement of landing billions and billions of unloaded rows before loading them. Finally, if I use scripts to accomplish this, I can schedule them to run against our current production environment and have them stop and restart using date ranges by year when migrating huge amounts of information that span many years. That way I should be able to manage this without a huge impact on our current production operations.
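
The pipe idea I am picturing is a named pipe on whichever server runs both commands, something like this (all names and paths are made up, and I still need to confirm that DB2 EXPORT will write to a named pipe):

Code:
mkfifo /tmp/unload.pipe

# start the loader first, reading from the named pipe
nzload -host nzhost -db mydb -u admin -pw password -t mytable \
       -df /tmp/unload.pipe -delim '|' &

# then run the unload into the pipe; the rows stream through without landing in a real file
db2 "EXPORT TO /tmp/unload.pipe OF DEL MODIFIED BY COLDEL|
     SELECT * FROM myschema.mytable"

wait
rm /tmp/unload.pipe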

The DB2HPU utility does not employ the database engine to unload the data, so the combination of it and the NZLOAD utility looks like the best solution. I plan to search this web site and the web for a simple script example that uses a pipe from one server to another. The utilities let me connect to a local DB2 LUW database and unload data from a table, then use the NZLOAD utility and the Netezza ODBC interface you mention to load the rows into tables in a remote Netezza database. I hope a pipe will work for all of this so we can avoid acquiring additional storage to support the migration. Batch scripts can also be scheduled, and unload SQL parameters can break the huge logical unit of work into smaller pieces by year, which gives me the flexibility to run all of this in a way that plays nicely with current production operational requirements and service level agreements.
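
If the unload has to run on the DB2 server and the load on a host closer to Netezza, I am hoping something along these lines will work over ssh, with the year passed in so each run is a smaller chunk (everything here is a placeholder, and DB2HPU would eventually replace the EXPORT step once I have its control file worked out):

Code:
#!/bin/sh
# usage: migrate_year.sh 2005   (run on the DB2 server, one year per invocation)
YEAR=$1
PIPE=/tmp/migrate_${YEAR}.pipe
mkfifo "$PIPE"

# reader: stream the pipe across ssh into nzload on the remote host (nzload reads stdin)
ssh nzhost "nzload -db mydb -u admin -pw password -t mytable -delim '|'" < "$PIPE" &

# writer: unload one year's worth of rows into the pipe (tx_date is a made-up column)
db2 "EXPORT TO $PIPE OF DEL MODIFIED BY COLDEL|
     SELECT * FROM myschema.mytable WHERE YEAR(tx_date) = $YEAR"

wait
rm -f "$PIPE"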

Thanks for everyone's helpful information and ideas.
 