CHAPTER 13 Data Pump
Data Pump: a tool that replaces the legacy exp/imp utilities
- A scalable, feature-rich utility that allows you to extract objects and data from a database
- You can use the export file to:
  - Take a point-in-time backup of your database
  - Efficiently copy large amounts of objects and data from one database environment to another
  - Extract SQL from the export file
  - Partially extract specific objects or data from the export file
- Operations can easily be stopped, suspended, and restarted
Data Pump Architectural Components
- expdp (Data Pump export utility)
- impdp (Data Pump import utility)
- DBMS_DATAPUMP PL/SQL package (Data Pump API)
- DBMS_METADATA PL/SQL package (Data Pump Metadata API)
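Because the expdp and impdp clients sit on top of the DBMS_DATAPUMP API, you can also drive a job directly from PL/SQL. A minimal sketch of a schema-mode export follows; the DP_DIR directory object, the HR schema, and the file names are assumptions for illustration only.

DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  -- Open a schema-mode export job
  l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');

  -- Name the dump file and log file in the DP_DIR directory object
  DBMS_DATAPUMP.ADD_FILE(l_handle, 'hr_exp.dmp', 'DP_DIR');
  DBMS_DATAPUMP.ADD_FILE(l_handle, 'hr_exp.log', 'DP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Restrict the job to the HR schema
  DBMS_DATAPUMP.METADATA_FILTER(l_handle, 'SCHEMA_EXPR', q'[IN ('HR')]');

  -- Start the job and wait for it to finish
  DBMS_DATAPUMP.START_JOB(l_handle);
  DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || l_state);
END;
/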
Data Pump Architecture
Steps Required to Export Data
1. Create a database-directory object that points to an operating-system directory that you want to write/read Data Pump files to/from.
2. Grant read, write on the directory to the database user running the export.
3. From the operating-system prompt, run the expdp utility.
Creating a Database Directory
Use the CREATE DIRECTORY statement to create a directory:
SQL> create directory dp_dir as '/oradump';
View directory details:
SQL> select owner, directory_name, directory_path from dba_directories;
Granting Access to the Directory
As a privileged user, grant access to the directory to the user who will be invoking expdp:
SQL> grant read, write on directory dp_dir to darl;
Taking an Export
From the operating system, invoke expdp:
$ expdp darl/foo directory=dp_dir tables=inv dumpfile=exp.dmp logfile=exp.log
Importing Using Data Pump
Similar to export, but using the import utility; you still need to set up the directory and grant access to it.
1. Create a database-directory object that points to an operating-system directory that you want to write/read Data Pump files to/from.
2. Grant read, write on the directory to the database user running the export or import.
3. From the operating-system prompt, run the impdp command.
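For example, a minimal sketch that imports the dump file taken on the export slide, reusing the directory and file names shown there:

$ impdp darl/foo directory=dp_dir dumpfile=exp.dmp logfile=imp.log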
Using Data Pump Interactive Command Mode
Allows you to interact with a Data Pump job from a command-line interface.
There are two ways to get to the interactive command-mode prompt:
- Press Ctrl+C in a Data Pump job that you started via expdp or impdp.
- Use the ATTACH parameter to attach to a currently running job.
Most useful commands: status, start_job, kill_job, parallel, stop_job.
Attaching to a Running Job
You can attach to a currently running Data Pump job to view its status, stop/start it, and so on.
Determine the job name:
SQL> select owner_name, operation, job_name, state from dba_datapump_jobs;
Use the ATTACH command-line switch:
$ impdp darl/engdev attach=sys_import_schema_02
Stopping and Restarting a Job
You can stop and restart a Data Pump job.
First attach to the job, then issue interactive commands:
$ impdp darl/foo attach=sys_import_table_01
Import> stop_job
Import> start_job
Terminating a Data Pump Job
You may have a long-running Data Pump job that appears to be hung and want to kill it as gracefully as possible.
First attach to the job, then kill it:
Import> kill_job
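For instance, reusing the job name shown on the previous slide, the full sequence looks like this:

$ impdp darl/foo attach=sys_import_table_01
Import> kill_job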
Using a Parameter File
Parameter files are useful for repeatability and consistency.
Parameter files also make it easier to handle parameters that use single and double quotes; these are sometimes misinterpreted by the OS when entered on the command line.
Example parameter file:
userid=darl/foo
directory=dp
dumpfile=invp.dmp
logfile=invp.log
tables=f_sales
Invoke the utility with the PARFILE parameter:
$ impdp parfile=pfile.ctl
Estimating the Size of Export Jobs
Sometimes it's useful to estimate the size of a job before starting it; this helps you get an idea of how much disk space will be required.
$ expdp dbauser/foo estimate_only=y full=y nologfile=y
Listing the Contents of Dump Files
Allows you to translate the contents of a dump file created by expdp into SQL statements.
Very handy for generating a copy of the SQL required to re-create an environment.
Use the SQLFILE parameter:
$ impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp \
  SQLFILE=dpump_dir2:expfull.sql
In this mode, impdp doesn't import anything; it only creates a file that contains the SQL required to re-create the objects within the dump file.
Exporting and Importing Directly Across the Network
A powerful technique for copying environments and an efficient way to copy users and data from one database to another.
1. Create the users in the target environment.
2. Create a database link.
3. Create a directory object.
4. Run the impdp command using the NETWORK_LINK parameter, pointing it at the database link you created in the prior step (see the sketch below).
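A minimal sketch of steps 2 and 4; the link name, connection details, and schema name are assumptions for illustration. Note that with NETWORK_LINK no dump file is needed, because the data travels across the link.

SQL> create database link old_db connect to darl identified by foo using 'old_db';

$ impdp darl/foo directory=dp_dir network_link=old_db schemas=heera logfile=net_imp.log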
Transportable Tablespaces
A powerful feature for copying large amounts of data from one environment to another (see the example commands below).
1. Ensure the tablespace is self-contained.
2. Make the tablespaces being transported read-only.
3. Use Data Pump to export the metadata for the tablespaces being transported.
4. Copy the Data Pump export dump file to the destination server.
5. Copy the datafile(s) to the destination database.
6. Import the metadata into the destination database.
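A sketch of the main commands, assuming a single tablespace named TBSP1 and an existing directory object; all names and paths are illustrative.

SQL> exec dbms_tts.transport_set_check('TBSP1', TRUE);
SQL> select * from transport_set_violations;
SQL> alter tablespace tbsp1 read only;

$ expdp darl/foo directory=dp_dir dumpfile=tbsp1.dmp transport_tablespaces=TBSP1

Then, on the destination, after copying the dump file and the datafile(s):

$ impdp darl/foo directory=dp_dir dumpfile=tbsp1.dmp \
  transport_datafiles='/ora02/dbfile/tbsp101.dbf'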
Exporting Tablespace Metadata
Sometimes it's useful to capture tablespace information and transfer it to a new database environment.
Use a combination of FULL and INCLUDE=TABLESPACE:
$ expdp darl/foo directory=dp dumpfile=phredstg.dmp content=metadata_only full=y \
  include=tablespace
Specifying Different Datafile Paths and Names
When you're copying database environments, the source and target servers often have different mount-point names.
You can rename the datafiles on the import using REMAP_DATAFILE:
userid=darl/foo
directory=dp
dumpfile=phredstg.dmp
full=y
include=tablespace:"like 'TBSP%'"
remap_datafile="'/ora01/dbfile/O11R2/tbsp101.dbf':'/ora02/O11R2/tb1.dbf'"
remap_datafile="'/ora01/dbfile/O11R2/tbsp201.dbf':'/ora02/O11R2/tb2.dbf'"
remap_datafile="'/ora01/dbfile/O11R2/tbsp301.dbf':'/ora02/O11R2/tb3.dbf'"
Changing Segment and Storage Attributes
You may have very different segment and storage requirements when copying database environments.
Use the TRANSFORM parameter to prevent the segment and storage attributes from being specified on import:
$ impdp darl/foo directory=dp dumpfile=inv.dmp transform=segment_attributes:n
$ impdp darl/foo directory=dp dumpfile=inv.dmp transform=storage:n
Importing into a Different Tablespace from the Original
You may want to export a user and, when importing, map the old user to a new username and map the old tablespaces associated with the user's objects to new tablespace names.
Use REMAP_SCHEMA and REMAP_TABLESPACE:
$ impdp darl/foo directory=dp dumpfile=rm.dmp remap_schema=HEERA:CHAYA \
  remap_tablespace=TBSP1:V_DATA tables=heera.ticket
Sophisticated Filtering Mechanisms
- Use the QUERY parameter to export or import subsets of data.
- Use the SAMPLE parameter to export a percentage of the rows in a table.
- Use the CONTENT parameter to exclude or include data and metadata.
- Use the EXCLUDE parameter to specifically name items to be excluded.
- Use the INCLUDE parameter to name the items to be included (thereby excluding other non-dependent items not in the list).
- Use parameters such as SCHEMAS to specify that you only want a subset of the database's objects (those that belong to the specified user or users).
Filtering via a Query
Use this syntax to filter via a query:
QUERY = [schema.][table_name:] query_clause
Example using a parameter file:
userid=darl/foo
directory=dp
dumpfile=inv.dmp
tables=inv,reg
query=inv:"WHERE inv_desc='Book'"
query=reg:"WHERE reg_id <= 20"
Exporting a Percentage of the Data
Doesn't work well when exporting combinations of parent/child tables, because the percentage doesn't enforce that parent rows match corresponding child rows.
General syntax for exporting a percentage of data in a table:
SAMPLE=[[schema_name.]table_name:]sample_percent
$ expdp darl/foo directory=dp tables=inv sample=10 dumpfile=inv.dmp
Excluding Objects from the Export File
Use this general syntax to exclude objects from being exported:
EXCLUDE=object_type[:name_clause] [, ...]
Example parameter file:
userid=darl/foo
directory=dp
dumpfile=inv.dmp
tables=inv
exclude=index:"LIKE 'INV%'"
Including Only Specific Objects in an Export File
Use INCLUDE to specify the objects to be included:
directory=datapump
dumpfile=pl.dmp
include=function:"='ISDATE'",procedure:"='DEPTREE_FILL'"
Excluding Objects from Import
Use EXCLUDE to exclude objects on import:
$ impdp darl/foo dumpfile=d.dmp directory=dp exclude=TRIGGER,PROCEDURE
Including Objects in Import
To only include certain objects on import, use INCLUDE:
userid=darl/foo
directory=dp
dumpfile=h.dmp
schemas=HEERA
include=table:"like 'A%'"
Creating a Consistent Export
There is no consistent=y switch (as in the old exp utility).
Use FLASHBACK_SCN or FLASHBACK_TIME instead:
SQL> select current_scn from v$database;
$ expdp darl/foo directory=dp flashback_scn=8400741902387
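A sketch of the FLASHBACK_TIME alternative; the timestamp value is illustrative, and depending on your shell the quoting is often easier to manage in a parameter file (as described on the parameter-file slide):

$ expdp darl/foo directory=dp dumpfile=inv.dmp \
  flashback_time="to_timestamp('15-06-2012 08:00:00','dd-mm-yyyy hh24:mi:ss')"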
Importing When Objects Already Exist
Use TABLE_EXISTS_ACTION to specify the behavior when the object already exists.
The TABLE_EXISTS_ACTION parameter takes the following options:
- SKIP (default if not combined with CONTENT=DATA_ONLY)
- APPEND (default if combined with CONTENT=DATA_ONLY)
- REPLACE
- TRUNCATE
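For example, a minimal sketch that appends rows to tables that already exist, reusing the directory and dump-file names from earlier slides:

$ impdp darl/foo directory=dp dumpfile=inv.dmp table_exists_action=append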
Renaming a Table
You can rename a table on import.
General syntax:
REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename
or
REMAP_TABLE=[schema.]old_tablename[:partition]:new_tablename
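A minimal sketch, assuming a table INV owned by HEERA should be imported under the name INV_BAK (names are illustrative):

$ impdp darl/foo directory=dp dumpfile=inv.dmp tables=heera.inv remap_table=heera.inv:inv_bak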
Remapping Data
Allows you to obfuscate data on import; some production data contains values you may not want imported in easily readable form.
Create a PL/SQL function to obfuscate the data, then reference it with REMAP_DATA:
$ impdp darl/foo directory=dp dumpfile=cust.dmp tables=customers remap_data=customers.last_name:obfus.obf
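A sketch of the kind of package the command above assumes; the OBFUS package and OBF function names are simply taken from that command, and the masking logic itself is only illustrative.

-- Hypothetical obfuscation package referenced as obfus.obf in the impdp command
CREATE OR REPLACE PACKAGE obfus AS
  FUNCTION obf(p_value IN VARCHAR2) RETURN VARCHAR2;
END obfus;
/
CREATE OR REPLACE PACKAGE BODY obfus AS
  FUNCTION obf(p_value IN VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    -- Keep the first letter and mask the rest so the value is no longer readable
    RETURN substr(p_value, 1, 1) ||
           rpad('X', greatest(length(p_value) - 1, 0), 'X');
  END obf;
END obfus;
/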
Cloning a User
Use REMAP_SCHEMA to rename a user during import:
$ impdp darl/foo directory=dp remap_schema=inv:inv_dw dumpfile=inv.dmp
Miscellaneous Features
- Suppressing a log file
- Using parallelism
- Specifying additional dump files
- Reusing output file names
- Creating a daily DDL file
- Compressing output
- Encrypting data
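As a sketch of two of these features, parallelism and compression, assuming Enterprise Edition (and, for compression=all, the Advanced Compression option); the schema and file names are illustrative, and %U lets Data Pump generate one dump file per parallel worker:

$ expdp darl/foo directory=dp schemas=heera parallel=4 dumpfile=heera%U.dmp compression=all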
Monitoring Data Pump Jobs
- Screen output
- Data Pump log file
- Database alert log
- Querying the status table
- Querying data-dictionary views
- Interactive command-mode status
- Using operating-system utilities such as ps (process status)
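For example, two data-dictionary queries you could use: the first repeats the DBA_DATAPUMP_JOBS query shown on the attach slide, and the second is a general V$SESSION_LONGOPS sketch that lists long-running operations still in progress (it is not specific to Data Pump, so treat it as an assumption to adapt):

SQL> select owner_name, job_name, operation, state from dba_datapump_jobs;

SQL> select opname, sofar, totalwork, units
     from v$session_longops
     where totalwork > 0 and sofar <> totalwork;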
Data Pump Legacy Mode
- Enter old exp or imp parameters on the command line when invoking Data Pump.
- Data Pump automatically translates them into parameters it understands.
- Available starting with 11g Release 2.
- A very handy feature if you're familiar with the old exp/imp utilities.
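A minimal sketch of legacy mode in action; the old-style FILE and LOG parameters are translated by expdp into DUMPFILE and LOGFILE. The table name is illustrative, and this assumes the invoking user has access to a default directory object such as DATA_PUMP_DIR:

$ expdp darl/foo file=inv.dmp log=inv.log tables=inv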
Summary
- Data Pump is a flexible and powerful utility for taking point-in-time backups and copying database environments.
- As a DBA, you should be proficient with this utility.
- It offers sophisticated object and data filtering parameters for transfers.
- Use Data Pump in legacy mode to view how the old exp/imp parameters are translated into Data Pump parameters.