CHAPTER 13 Data Pump

Tool that replaces the legacy exp/imp utilities. Data Pump is a scalable, feature-rich utility that allows you to extract objects and data from a database. You can use Data Pump to:
Take a point-in-time backup of your database
Efficiently copy large amounts of objects/data from one database environment to another
Extract SQL from the export file
Partially extract specific objects or data from the export file
Easily stop/suspend/restart operations

Data Pump Architectural Components
expdp (Data Pump export utility)
impdp (Data Pump import utility)
DBMS_DATAPUMP PL/SQL package (Data Pump API)
DBMS_METADATA PL/SQL package (Data Pump Metadata API)
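The expdp and impdp utilities are front ends over the DBMS_DATAPUMP package, which you can also call directly from PL/SQL. A minimal sketch of a schema-mode export through the API (the job name, file names, and HEERA schema are illustrative; the dp_dir directory object is created a few slides later):
declare
  h  number;
  st varchar2(30);
begin
  -- open and name a schema-mode export job
  h := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA',
                          job_name  => 'DEMO_EXP');
  -- write the dump file and log file to the dp_dir directory object
  dbms_datapump.add_file(h, 'demo.dmp', 'DP_DIR',
    filetype => dbms_datapump.ku$_file_type_dump_file);
  dbms_datapump.add_file(h, 'demo.log', 'DP_DIR',
    filetype => dbms_datapump.ku$_file_type_log_file);
  -- restrict the job to a single schema
  dbms_datapump.metadata_filter(h, 'SCHEMA_EXPR', q'[IN ('HEERA')]');
  dbms_datapump.start_job(h);
  dbms_datapump.wait_for_job(h, st);  -- block until the job completes
end;
/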

Data Pump Architecture

Steps Required to Export Data
1. Create a database-directory object that points to an operating-system directory that you want to write/read Data Pump files to/from.
2. Grant read, write on the directory to the database user running the export.
3. From the operating-system prompt, run the expdp utility.

Creating a Database Directory
Use the CREATE DIRECTORY statement to create a directory:
SQL> create directory dp_dir as '/oradump';
View directory details:
SQL> select owner, directory_name, directory_path from dba_directories;

Granting Access to the Directory
As a privileged user, grant access to the directory to the user who will be invoking expdp:
SQL> grant read, write on directory dp_dir to <username>;

Taking an Export
From the operating system, invoke expdp:
$ expdp darl/foo directory=dp_dir tables=inv dumpfile=exp.dmp logfile=exp.log

Importing Using Data Pump
Similar to export, but using the import utility. You need to set up the directory and access first:
1. Create a database-directory object that points to an operating-system directory that you want to write/read Data Pump files to/from.
2. Grant read, write on the directory to the database user running the export or import.
3. From the operating-system prompt, run the impdp command.
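A minimal sketch of an import that mirrors the earlier export example (the dump file exp.dmp and table inv come from that example; the log file name is an assumption):
$ impdp darl/foo directory=dp_dir tables=inv dumpfile=exp.dmp logfile=imp.log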

Using Data Pump Interactive Command Mode
Allows you to interact with a Data Pump job from a command-line prompt. There are two ways to get to the interactive command-mode prompt:
Press Ctrl+C in a Data Pump job that you started via expdp or impdp.
Use the ATTACH parameter to attach to a currently running job.
Most useful commands: status, start_job, kill_job, parallel, stop_job.

Attaching to a Running Job
You can attach to a currently running Data Pump job to view its status, stop/start it, and so on.
Determine the job name:
SQL> select owner_name, operation, job_name, state from dba_datapump_jobs;
Use the ATTACH command-line parameter:
$ impdp darl/engdev attach=sys_import_schema_02

Stopping and Restarting a Job
You can stop and restart a Data Pump job. First attach to the job, then issue interactive commands:
$ impdp darl/foo attach=sys_import_table_01
Import> stop_job
Import> start_job

Terminating a Data Pump Job
You might find yourself in a scenario where a long-running Data Pump job appears to be hung and you want to kill it as gracefully as possible. First attach to the job, then kill it:
Import> kill_job
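The attach step is the same as when stopping a job; for example (the job name here is hypothetical, query dba_datapump_jobs for the real one, as shown earlier):
$ impdp darl/foo attach=sys_import_table_01
Import> kill_job
Unlike stop_job, kill_job removes the job's master table, so a killed job cannot be restarted.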

Using a Parameter File
Parameter files are useful for repeatability and consistency. They also make it easier to handle parameters that contain single and double quotes, which are sometimes misinterpreted by the OS when entered on the command line.
userid=darl/foo
directory=dp
dumpfile=invp.dmp
logfile=invp.log
tables=f_sales
$ impdp parfile=pfile.ctl

Estimating the Size of Export Jobs
Sometimes it's useful to estimate the size of a job before starting it. This helps you get an idea of how much disk space will be required.
$ expdp dbauser/foo estimate_only=y full=y nologfile=y

Listing the Contents of Dump Files
Allows you to translate the contents of a dump file created by expdp into SQL statements. Very handy for generating a copy of the SQL required to re-create an environment. Use the SQLFILE parameter:
$ impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp \
SQLFILE=dpump_dir2:expfull.sql
In this mode, impdp doesn't import anything; it only creates a file that contains the SQL required to re-create the objects within the dump file.

Exporting and Importing Directly Across the Network
Powerful technique for copying environments. Efficient way to copy users and data from one database to another.
1. Create users in the target environment.
2. Create a database link.
3. Create a directory object.
4. Run the impdp command using the NETWORK_LINK parameter, pointing at the database link you created in the prior step (steps 2 and 4 are sketched below).
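A minimal sketch of steps 2 and 4, assuming the source database is reachable via the TNS alias srcdb and that user darl exists on both sides (the link and alias names are illustrative).
On the target database, create a link that points at the source:
SQL> create database link src_link connect to darl identified by foo using 'srcdb';
Then run the import over the link; no dump file is written:
$ impdp darl/foo directory=dp_dir network_link=src_link schemas=darl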

Transportable Tablespaces
Powerful feature for copying large amounts of data from one environment to another (sketched below):
Ensure the tablespace is self-contained.
Make the tablespaces being transported read-only.
Use Data Pump to export the metadata for the tablespaces being transported.
Copy the Data Pump export dump file to the destination server.
Copy the datafile(s) to the destination database.
Import the metadata into the destination database.
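A minimal sketch of the sequence for a single tablespace TBSP1 (the tablespace, file, and directory names are illustrative, and the expdp/impdp steps assume a user with the EXP_FULL_DATABASE/IMP_FULL_DATABASE roles):
SQL> exec dbms_tts.transport_set_check('TBSP1', true);
SQL> select * from transport_set_violations;
SQL> alter tablespace tbsp1 read only;
$ expdp darl/foo directory=dp_dir dumpfile=tts.dmp transport_tablespaces=tbsp1
After copying tts.dmp and the datafile(s) to the destination:
$ impdp darl/foo directory=dp_dir dumpfile=tts.dmp transport_datafiles='/ora02/dbfile/tbsp101.dbf'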

Exporting Tablespace Metadata
Sometimes it's useful to capture tablespace information and transfer it to a new database environment. Use a combination of FULL and INCLUDE=TABLESPACE:
$ expdp darl/foo directory=dp dumpfile=phredstg.dmp content=metadata_only full=y \
include=tablespace

Specifying Different Datafile Paths and Names
When you're copying database environments, oftentimes the source and target servers have different mount-point names. You can rename the datafiles on import using REMAP_DATAFILE:
userid=darl/foo
directory=dp
dumpfile=phredstg.dmp
full=y
include=tablespace:"like 'TBSP%'"
remap_datafile="'/ora01/dbfile/O11R2/tbsp101.dbf':'/ora02/O11R2/tb1.dbf'"
remap_datafile="'/ora01/dbfile/O11R2/tbsp201.dbf':'/ora02/O11R2/tb2.dbf'"
remap_datafile="'/ora01/dbfile/O11R2/tbsp301.dbf':'/ora02/O11R2/tb3.dbf'"

Changing Segment and Storage Attributes
You may have very different segment and storage requirements when copying database environments. Use the TRANSFORM parameter to prevent segment and storage attributes from being specified on import:
$ impdp darl/foo directory=dp dumpfile=inv.dmp transform=segment_attributes:n
$ impdp darl/foo directory=dp dumpfile=inv.dmp transform=storage:n

Importing into a Different Tablespace from the Original
You may want to export a user and, when importing, map the old user to a new username and map the tablespaces associated with the user's objects to new tablespace names. Use REMAP_SCHEMA and REMAP_TABLESPACE:
$ impdp darl/foo directory=dp dumpfile=rm.dmp remap_schema=HEERA:CHAYA \
remap_tablespace=TBSP1:V_DATA tables=heera.ticket

Sophisticated Filtering Mechanisms
Use the QUERY parameter to export or import subsets of data.
Use the SAMPLE parameter to export a percentage of the rows in a table.
Use the CONTENT parameter to exclude or include data and metadata.
Use the EXCLUDE parameter to specifically name items to be excluded.
Use the INCLUDE parameter to name the items to be included (thereby excluding other non-dependent items not in the list).
Use parameters like SCHEMAS to specify that you want only a subset of the database's objects (those that belong to the specified user or users).

Filtering via a Query
Use this syntax to filter via a query:
QUERY = [schema.][table_name:] query_clause
Example using a parameter file:
userid=darl/foo
directory=dp
dumpfile=inv.dmp
tables=inv,reg
query=inv:"WHERE inv_desc='Book'"
query=reg:"WHERE reg_id <= 20"

Exporting a Percentage of the Data
Doesn't work well when exporting combinations of parent/child tables, because the percentage doesn't enforce that parent rows match corresponding child rows.
General syntax for exporting a percentage of the data in a table:
SAMPLE=[[schema_name.]table_name:]sample_percent
$ expdp darl/foo directory=dp tables=inv sample=10 dumpfile=inv.dmp

Excluding Objects from the Export File
Use this general syntax to exclude objects from being exported:
EXCLUDE=object_type[:name_clause] [, ...]
userid=darl/foo
directory=dp
dumpfile=inv.dmp
tables=inv
exclude=index:"LIKE 'INV%'"

Including Only Specific Objects in an Export File
Use INCLUDE to specify objects to be included:
directory=datapump
dumpfile=pl.dmp
include=function:"='ISDATE'",procedure:"='DEPTREE_FILL'"

Excluding Objects from Import
Use EXCLUDE to exclude objects on import:
$ impdp darl/foo dumpfile=d.dmp directory=dp exclude=TRIGGER,PROCEDURE

Including Objects in Import
To include only specific objects on import, use INCLUDE:
userid=darl/foo
directory=dp
dumpfile=h.dmp
schemas=HEERA
include=table:"like 'A%'"

Creating a Consistent Export
There is no consistent=y switch (like the old exp utility had). Use flashback_scn or flashback_time instead:
SQL> select current_scn from v$database;
$ expdp darl/foo directory=dp flashback_scn=<scn>

Importing When Objects Already Exist
Use TABLE_EXISTS_ACTION to specify the behavior when an object already exists. The parameter takes the following options:
SKIP (default if not combined with CONTENT=DATA_ONLY)
APPEND (default if combined with CONTENT=DATA_ONLY)
REPLACE
TRUNCATE
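For example, to add rows to tables that already exist in the target (a sketch reusing earlier file names):
$ impdp darl/foo directory=dp dumpfile=inv.dmp table_exists_action=append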

Renaming a Table
Rename a table on import. General syntax:
REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename
or
REMAP_TABLE=[schema.]old_tablename[:partition]:new_tablename
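A minimal example, assuming the dump file contains the inv table (the file and table names are illustrative):
$ impdp darl/foo directory=dp dumpfile=inv.dmp remap_table=inv:inv_orig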

Remapping Data
Allows you to obfuscate data on import. Some production data you may not want imported with easily readable values. Create a PL/SQL function to obfuscate the data (a sketch follows):
$ impdp darl/foo directory=dp dumpfile=cust.dmp tables=customers remap_data=customers.last_name:obfus.obf
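The example above assumes a package obfus with a function obf; a minimal sketch of what that package might look like (the scrambling logic here is purely illustrative):
create or replace package obfus as
  function obf(p_str in varchar2) return varchar2;
end obfus;
/
create or replace package body obfus as
  -- reverse the alphabet position of each letter; any deterministic
  -- function that accepts and returns the column's datatype works
  function obf(p_str in varchar2) return varchar2 is
  begin
    return translate(p_str,
      'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ',
      'zyxwvutsrqponmlkjihgfedcbaZYXWVUTSRQPONMLKJIHGFEDCBA');
  end obf;
end obfus;
/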

Cloning a User
Use REMAP_SCHEMA to rename a user during import:
$ impdp darl/foo directory=dp remap_schema=inv:inv_dw dumpfile=inv.dmp

Miscellaneous Features
Suppressing a Log File
Using Parallelism
Specifying Additional Dump Files
Reusing Output File Names
Creating a Daily DDL File
Compressing Output
Encrypting Data
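A sketch combining two of these features, parallelism and compressed output (compression=all requires the Advanced Compression option; the %U substitution generates one dump file per parallel process):
$ expdp darl/foo directory=dp dumpfile=full%U.dmp parallel=4 compression=all full=y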

Monitoring Data Pump Jobs
Screen output
Data Pump log file
Database alert log
Querying the status table
Querying data-dictionary views
Interactive command-mode status
Operating-system utilities such as ps (process status)
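A sketch of the data-dictionary approach, estimating progress from v$session_longops (the opname filter is an assumption; the value varies by job type):
SQL> select opname, sofar, totalwork,
       round(sofar/totalwork*100, 1) pct_done
     from v$session_longops
     where totalwork > 0
     and opname like '%EXPORT%';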

Data Pump Legacy Mode
Enter old exp or imp parameters at the command line when invoking Data Pump
Data Pump automatically translates them into parameters it understands
Available starting with 11g R2
Very handy feature if you're familiar with the old exp/imp utilities
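For example, supplying the old exp FILE parameter activates legacy mode, and Data Pump translates it into its directory/dumpfile parameters (the file name here is illustrative):
$ expdp darl/foo file=inv.dmp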

Summary
Data Pump is a flexible and powerful utility for taking point-in-time backups and copying database environments. As a DBA, you should be proficient with this utility. It offers sophisticated object- and data-filtering parameters. Use Data Pump in legacy mode to view how the old exp/imp parameters are translated into Data Pump parameters.