Wednesday, July 8, 2009

Default location of Dump, Log and SQL files

As Data Pump is server based, all dump files, log files, and SQL files are generated on, and accessed from, server-based directory paths.

In simple terms, a Data Pump directory object is an alias for an operating system directory. The OS directory must physically exist for the corresponding Oracle directory object to be usable. Changing permissions on the Oracle directory object does not change the OS directory's permissions, and changing OS-level permissions does not change the Oracle directory object's grants. To read from or write to an Oracle directory object, a database user must be explicitly granted READ/WRITE on the object, and the Oracle software owner must have the corresponding read/write permission on the underlying OS directory.
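For example, the two-step setup might look like this in SQL*Plus (a minimal sketch; the path C:\dpump and the directory object name dpump_dir are illustrative assumptions, and the grantee arju is the user from the examples below):

SQL> -- The OS directory C:\dpump must already exist; Oracle does not create it.
SQL> create directory dpump_dir as 'C:\dpump';

Directory created.

SQL> -- Give a database user read/write on the directory object.
SQL> grant read, write on directory dpump_dir to arju;

Grant succeeded.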

As this post is about the location of dump, log, and SQL files, the following sections discuss the order of precedence Data Pump Export and Import use to determine a file's location.

1) If we specify a directory object name with the DUMPFILE, LOGFILE, or SQLFILE parameter, the location named by that directory object is used. Note that the directory object must be separated from the filename by a colon (:).
An example is given below.

SQL> create table test(a number);

Table created.

SQL> create directory c_drive as 'c:';

Directory created.

SQL> create directory d_drive as 'd:';

Directory created.

SQL> host expdp dumpfile=c_drive:test.dmp logfile=d_drive:test.log tables=test

Export: Release 11.1.0.6.0 - Production on Wednesday, 08 July, 2009 19:11:06

Copyright (c) 2003, 2007, Oracle. All rights reserved.

Username: arju/a

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "ARJU"."SYS_EXPORT_TABLE_01": arju/******** dumpfile=c_drive:test.dmp logfile=d_drive:test.log tables=test
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 192 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "ARJU"."TEST":"P1" 5.007 KB 1 rows
. . exported "ARJU"."TEST":"P2" 5.007 KB 1 rows
. . exported "ARJU"."TEST":"P3" 5.007 KB 1 rows
Master table "ARJU"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ARJU.SYS_EXPORT_TABLE_01 is:
C:\TEST.DMP
Job "ARJU"."SYS_EXPORT_TABLE_01" successfully completed at 19:11:39

2) If no directory object is specified for a file, that is, none is included in the DUMPFILE/LOGFILE/SQLFILE value, then the directory object named by the DIRECTORY parameter is used.

So with the command below, the dump file is written to the c_drive directory object, which points to the C: drive.

SQL> host expdp directory=c_drive tables=test

Export: Release 11.1.0.6.0 - Production on Wednesday, 08 July, 2009 22:04:09

Copyright (c) 2003, 2007, Oracle. All rights reserved.

Username: arju/a

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "ARJU"."SYS_EXPORT_TABLE_01": arju/******** directory=c_drive tables=test
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 192 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "ARJU"."TEST":"P1" 5.007 KB 1 rows
. . exported "ARJU"."TEST":"P2" 5.007 KB 1 rows
. . exported "ARJU"."TEST":"P3" 5.007 KB 1 rows
Master table "ARJU"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ARJU.SYS_EXPORT_TABLE_01 is:
C:\EXPDAT.DMP
Job "ARJU"."SYS_EXPORT_TABLE_01" successfully completed at 22:04:50

3) If you don't provide a directory object with the filename, and no directory object was named by the DIRECTORY parameter, then the value of the environment variable DATA_PUMP_DIR is used.

Always remember that this environment variable is defined using operating system
commands on the client system where the Data Pump Export and Import utilities
are run. However, the value assigned to this client-based environment variable must be the name of a server-based directory object. So you first need to create the directory object in the server as a DBA user, and then set the client environment variable.
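On a Linux/Unix client, the equivalent of the Windows set command shown in the example below would be a shell export (assuming a Bourne-style shell such as bash):

$ export DATA_PUMP_DIR=TEST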

Below is an exact example of this scenario.
Although in this example the DATA_PUMP_DIR directory object points to the OS directory D:\app\Arju\admin\arju\dpdump\, setting the client environment variable DATA_PUMP_DIR to TEST (a directory object that points to the E: drive) makes Data Pump create the dump file on the E:\ drive.
Don't be confused by the name DATA_PUMP_DIR here: the environment variable can name any directory object that exists in the database.

E:\>sqlplus / as sysdba

SQL*Plus: Release 11.1.0.6.0 - Production on Wed Jul 8 15:54:04 2009

Copyright (c) 1982, 2007, Oracle. All rights reserved.


Connected to:
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL> select directory_path from dba_directories where directory_name='DATA_PUMP_DIR';

DIRECTORY_PATH
--------------------------------------------------------------------------------
D:\app\Arju\admin\arju\dpdump\

SQL> create directory test as 'E:';

Directory created.

SQL> Disconnected from Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

E:\>set DATA_PUMP_DIR=TEST


E:\>expdp schemas=arju
Export: Release 11.1.0.6.0 - Production on Wednesday, 08 July, 2009 15:54:54

Copyright (c) 2003, 2007, Oracle. All rights reserved.

Username: / as sysdba

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_SCHEMA_01": /******** AS SYSDBA schemas=arju
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 576 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/POST_TABLE_ACTION
Processing object type SCHEMA_EXPORT/MATERIALIZED_VIEW
Processing object type SCHEMA_EXPORT/TABLE/MATERIALIZED_VIEW_LOG
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
. . exported "ARJU"."C1" 5.007 KB 1 rows
. . exported "ARJU"."P3" 5.007 KB 1 rows
. . exported "ARJU"."T" 5.023 KB 2 rows
. . exported "ARJU"."TEST":"P1" 5.007 KB 1 rows
. . exported "ARJU"."TEST":"P2" 5.007 KB 1 rows
. . exported "ARJU"."TEST":"P3" 5.007 KB 1 rows
. . exported "ARJU"."TEST2" 5.406 KB 1 rows
. . exported "ARJU"."TEST3" 5.414 KB 2 rows
. . exported "ARJU"."T_LARGEST" 5.046 KB 5 rows
. . exported "ARJU"."MASTER_TEMP" 0 KB 0 rows
. . exported "ARJU"."MASTER_TEMP2" 0 KB 0 rows
. . exported "ARJU"."MLOG$_MASTER_TEMP" 0 KB 0 rows
. . exported "ARJU"."MV_MASTER" 0 KB 0 rows
. . exported "ARJU"."T1" 0 KB 0 rows
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
E:\EXPDAT.DMP
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at 15:56:37


4) If none of the above three conditions yields a directory object, but you invoke Data Pump and have been granted access to the default server-based directory object DATA_PUMP_DIR, then Data Pump attempts to use it.

This DATA_PUMP_DIR directory is automatically created at database creation or when the database dictionary is upgraded.

If you don't have access to the DATA_PUMP_DIR directory object, you will get an insufficient-privileges error.
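For example, a DBA could grant that access as follows (a minimal sketch; the grantee arju is taken from the examples above):

SQL> grant read, write on directory data_pump_dir to arju;

Grant succeeded.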
