Friday, February 6, 2009

New features of the Data Pump utility in Oracle Database 11g

1) With the 11g release of the Data Pump client, the COMPRESSION parameter has been enhanced: it can now compress table data as well as metadata. The parameter accepts the values ALL, METADATA_ONLY, DATA_ONLY, and NONE, which compress both data and metadata, only metadata, only data, or nothing during a Data Pump export. METADATA_ONLY is the default.

Prior to 11g, only METADATA_ONLY and NONE could be specified. With COMPRESSION=ALL the dumpfile size can be greatly reduced, so there is no need to run a separate compression utility on the dumpfile afterwards.
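As a sketch, a schema export with compressed data and metadata might look like this (the username, password, directory object, and file name here are placeholders for illustration, not values from the post):

```shell
# Export the HR schema, compressing both metadata and row data.
# DATA_PUMP_DIR is the directory object created by default in 11g.
expdp hr/hr_password DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr_comp.dmp \
  SCHEMAS=hr COMPRESSION=ALL
```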

2) The parameters ENCRYPTION, ENCRYPTION_ALGORITHM, and ENCRYPTION_MODE are introduced.
With the ENCRYPTION parameter you can choose to encrypt both data and metadata, only data, only metadata, only encrypted columns, or nothing during an export. Valid keywords for this parameter are ALL, DATA_ONLY, METADATA_ONLY, ENCRYPTED_COLUMNS_ONLY, and NONE.

With the ENCRYPTION_ALGORITHM parameter you can specify which algorithm should be used for encryption. Valid keywords for this parameter are AES128, AES192, and AES256. AES128 is the default.

With ENCRYPTION_MODE you can specify the type of security to use when performing encryption and decryption during an export. Valid keywords for this parameter are DUAL, PASSWORD, and TRANSPARENT. TRANSPARENT is the default.
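Putting the three encryption parameters together, a password-protected export might be invoked as below (the connect string, file name, and password are made up for illustration):

```shell
# Encrypt both data and metadata with AES256, secured by a password
# rather than the wallet (ENCRYPTION_MODE=PASSWORD).
expdp hr/hr_password DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr_enc.dmp \
  SCHEMAS=hr ENCRYPTION=ALL ENCRYPTION_ALGORITHM=AES256 \
  ENCRYPTION_MODE=PASSWORD ENCRYPTION_PASSWORD=MySecret1
```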

3) The TRANSPORTABLE parameter lets you perform a table-mode export and import using the transportable method. It is similar to the TRANSPORT_TABLESPACES parameter, but it operates at the level of individual tables rather than whole tablespaces.
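A table-mode transportable export could be sketched like this (the table and file names are illustrative assumptions):

```shell
# Export only the metadata for HR.EMPLOYEES; the table's data stays
# in the datafiles, which are copied to the target system separately.
expdp system/manager DIRECTORY=DATA_PUMP_DIR DUMPFILE=emp_tts.dmp \
  TABLES=hr.employees TRANSPORTABLE=ALWAYS
```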

4) The PARTITION_OPTIONS parameter specifies how partitioned tables should be handled during import operations. In a Data Pump import you can set it to DEPARTITION, MERGE, or NONE. The default value is NONE.
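For example, to combine all partitions of a table into a single unpartitioned table on import (object and file names are illustrative):

```shell
# MERGE combines every partition and subpartition of SH.SALES
# into one unpartitioned table in the target database.
impdp system/manager DIRECTORY=DATA_PUMP_DIR DUMPFILE=sales.dmp \
  TABLES=sh.sales PARTITION_OPTIONS=MERGE
```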

5) Prior to 11g, if a dumpfile with the same name as the one to be written already existed in the directory, an error was returned and the export terminated. As of 11g, the REUSE_DUMPFILES parameter lets an existing dumpfile be overwritten during an export operation. When this parameter is set to "Y", any existing dumpfiles are overwritten; with the default value of "N", an error is issued if the dumpfile already exists.
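A minimal sketch (credentials and file name assumed for illustration):

```shell
# Overwrite hr.dmp if it already exists in the directory.
expdp hr/hr_password DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp \
  SCHEMAS=hr REUSE_DUMPFILES=Y
```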

6) Prior to 11g, if a table with the same name as the one being imported already existed in the target schema, one workaround was to use REMAP_SCHEMA and import the table into another schema; another was to rename the original table and then import. As of 11g you can rename a table while importing, using the REMAP_TABLE parameter.
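For example, the following sketch imports a table under a new name (the table and file names are illustrative):

```shell
# Import HR.EMPLOYEES under the new name EMPLOYEES_COPY,
# leaving the existing EMPLOYEES table untouched.
impdp hr/hr_password DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp \
  REMAP_TABLE=hr.employees:employees_copy
```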

7) With the REMAP_DATA parameter you can specify a conversion function that is automatically applied to a table's column during export or import. This technique can be used to mask sensitive data by replacing the original values with generated alternatives.
The syntax is:
REMAP_DATA=[schema.]tablename.column_name:[schema.]pkg.function
where the conversion function must be defined inside a PL/SQL package.
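A masking export might then look like this (the packaged function HR.MASK_PKG.MASK_SALARY is a hypothetical example, as are the other names):

```shell
# Mask the SALARY column during export by passing each value through
# the (hypothetical) packaged function HR.MASK_PKG.MASK_SALARY.
expdp hr/hr_password DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr_masked.dmp \
  TABLES=hr.employees \
  REMAP_DATA=hr.employees.salary:hr.mask_pkg.mask_salary
```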

8) During import operations that use the external table method, setting the DATA_OPTIONS parameter to SKIP_CONSTRAINT_ERRORS allows the load to continue past non-deferred constraint violations, with any violating rows logged for future reference. Without it, a constraint violation causes the load of that table to be rolled back.

Also, setting the DATA_OPTIONS parameter to XML_CLOBS on export specifies that all XMLType columns should be exported as uncompressed CLOBs.
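A sketch of the constraint-skipping import (names are illustrative; TABLE_EXISTS_ACTION=APPEND is assumed so rows are loaded into the existing table):

```shell
# Keep loading HR.EMPLOYEES even if some rows violate a
# non-deferred constraint; the rejected rows are logged.
impdp hr/hr_password DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp \
  TABLES=hr.employees TABLE_EXISTS_ACTION=APPEND \
  DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
```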

9) Worker processes that stop because of certain errors are now automatically restarted one time.
