Monday, February 9, 2009

How to take a Data Pump export so the dump file is created on another machine

In this example I will show how to take a Data Pump export so that the dump file is created on a different machine, rather than on the server hosting the database you are exporting from.

I will perform all operations in this example from my own client machine.
I will take a Data Pump export of database ORCL, and the dump file will be written on the machine hosting database TIGER.

Below are the steps.
Step 01: Connect to the TIGER database and create a database link.
C:\>sqlplus arju/a@tiger

SQL*Plus: Release 10.2.0.1.0 - Production on Tue Feb 10 01:11:21 2009

Copyright (c) 1982, 2005, Oracle. All rights reserved.


Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

SQL> create database link orcl.net using 'ORCL';


Database link created.

Note that the TNS alias entry for ORCL must reside on the TIGER machine. You can't use the TNS alias entry on your own client machine here. After creating the database link, make sure you can query the ORCL database through it.

SQL> select count(*) from user_tables@orcl.net;

COUNT(*)
----------
2

Without a TNS entry you can still create the database link by embedding the full connect descriptor, as below.

SQL> create database link orcl2.net connect to arju identified by a
using '(DESCRIPTION=(ADDRESS = (PROTOCOL = TCP)(HOST =10.4.12.233) (PORT=1521))
(connect_data=(service_name=orcl)))';


Database link created.

Test it:
SQL> select count(*) from user_tables@orcl2.net;

COUNT(*)
----------
2
Step 02: After the database link is created successfully, run expdp on the TIGER machine with the NETWORK_LINK parameter. The dump file will be created on the TIGER machine, while the metadata and data are pulled from ORCL through the database link.

In this example the optional parameter directory=d is supplied. You can create the directory object as:
SQL> create directory d as 'your expected location';

If no directory parameter is provided, the dump file is written to the default directory DATA_PUMP_DIR.
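If you are unsure where DATA_PUMP_DIR points on the TIGER server, you can look it up in the data dictionary (assuming your user has access to DBA_DIRECTORIES; otherwise ALL_DIRECTORIES shows the directories you have privileges on):

```sql
SQL> select directory_name, directory_path
     from dba_directories
     where directory_name = 'DATA_PUMP_DIR';
```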
SQL> $expdp arju/a@tiger directory=d schemas=arju dumpfile=arju_dump_from_orcl.dmp network_link=orcl2.net

Export: Release 10.2.0.1.0 - Production on Tuesday, 10 February, 2009 1:53:34

Copyright (c) 2003, 2005, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "ARJU"."SYS_EXPORT_SCHEMA_01": arju/********@tiger directory=d schemas=arju dumpfile=arju_dump_from_orcl.dmp network_link=orcl2.net
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 128 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/DB_LINK
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "ARJU"."EMP" 4.921 KB 2 rows
. . exported "ARJU"."TEST" 4.914 KB 1 rows
Master table "ARJU"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ARJU.SYS_EXPORT_SCHEMA_01 is:
D:\ARJU_DUMP_FROM_ORCL.DMP
Job "ARJU"."SYS_EXPORT_SCHEMA_01" successfully completed at 13:54:59
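As a side note, if your real goal is to move the schema into the TIGER database rather than to keep a dump file, impdp can use the same NETWORK_LINK parameter to pull the data directly over the database link, with no dump file at all. A sketch, assuming the same link and credentials as above (on a repeat run you would also need TABLE_EXISTS_ACTION to handle existing objects):

C:\>impdp arju/a@tiger schemas=arju network_link=orcl2.net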
Related Documents
http://arjudba.blogspot.com/2009/02/important-guideline-about-data-pump.html
http://arjudba.blogspot.com/2009/02/how-to-do-data-pump-import-from-higher.html
http://arjudba.blogspot.com/2008/04/data-pump-exportimport-how-we-can-do-it.html
http://arjudba.blogspot.com/2009/02/ude-00018-data-pump-client-is.html
http://arjudba.blogspot.com/2008/04/exportimport-datapump-parameter-query.html
http://arjudba.blogspot.com/2008/04/improve-data-pump-performance.html
http://arjudba.blogspot.com/2009/02/compatibility-table-of-oracle-data-pump.html
http://arjudba.blogspot.com/2008/04/datapump-parameters-include-and-exclude.html

2 comments:

  1. Have you tried this in a situation where the source database is 9i and the target database is 10g? In other words, can this method be used to overcome the inability to use Data Pump on a 9i database?

  2. We have an Oracle 11g server running 2 database instances (dev and prod). One of the schemas in prod, say MySchema, is live. We have a copy of MySchema in the dev instance as well. What I would like to do is update MySchema in dev from the prod MySchema. I want some process that will run a script to update MySchema every night.
