UTILITY TO EXPORT/IMPORT DATA Gary M. Noble LDS Church


1 DATAPUMP UTILITY TO EXPORT/IMPORT DATA
Gary M. Noble, LDS Church

2 Background
Experience with Oracle databases
Familiar with the Export/Import utility
RMAN backups
Data Pump to replace the old Export/Import

3 Introduction
More features than the standard Export/Import
Use in addition to RMAN backups
Another means to upgrade to a higher version of Oracle

4 Why Use Datapump
Data Pump handles Oracle 10g data types
Data Pump features
Data Pump speed

5 Datapump Components
Command line client expdp
Command line client impdp
DBMS_DATAPUMP (Data Pump API)
DBMS_METADATA (Metadata API)
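The command-line clients are thin wrappers over the DBMS_DATAPUMP package, so a job can also be driven directly from PL/SQL. A minimal sketch of a schema export through the API follows; the schema (SCOTT), directory object (DATAPUMP_DIR), and file names are placeholders, not from the presentation.

```sql
-- Minimal sketch: schema export through the DBMS_DATAPUMP API.
-- Run as a user with EXP_FULL_DATABASE and access to DATAPUMP_DIR.
SET SERVEROUTPUT ON
DECLARE
  h         NUMBER;          -- job handle
  job_state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(h, 'expdp_api_demo.dmp', 'DATAPUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'expdp_api_demo.log', 'DATAPUMP_DIR',
      filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || job_state);
END;
/
```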

6 Outline
Datapump is faster than the standard Export/Import
Setup for Data Pump export
Setup for Data Pump import
Data Pump features
Experiences
Kill the job

7 Datapump Speed
The standard Export and Import utilities ran as clients
Data Pump runs as part of the database instance, on the database server
Data Pump can work in parallel:
Create multiple worker processes
Create multiple data files (dump file sets)
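For the parallel workers to help, the dump file specification needs the %U substitution variable so each worker can write its own file. A hedged one-line sketch (user, schema, and names are placeholders):

```shell
# Sketch: parallel export writing a dump file set; %U expands to a
# two-digit file number so each of the 4 workers gets its own file.
expdp dp_admin/secret directory=datapump_dir schemas=hr \
      dumpfile=expdp_hr_%U.dmp parallel=4 job_name=job_expdp_hr
```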

8 Datapump Control
Master control process
Master table
Worker processes

9 Major Features
PARALLEL – maximum number of worker threads
START_JOB – ability to restart a job
ATTACH – detach from and reattach to a job
NETWORK_LINK – export and import over the network
REMAP_DATAFILE – import to a different datafile
REMAP_TABLESPACE – map objects to a new tablespace

10 Additional Datapump Features
Filter objects by using EXCLUDE and INCLUDE
VERSION – specify the version of objects to export; values are COMPATIBLE, LATEST, or a version number

11 DataPump Export Setup
Make a server file system directory
Create a database directory object that references the file system directory
Grant read and write privileges on the directory
Grant privileges for full export
Create an export parameter file

12 Default Datapump Directory
Oracle default Data Pump directory: DATA_PUMP_DIR
Default location: $ORACLE_HOME/rdbms/log/
Directory information is found in the view DBA_DIRECTORIES
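To confirm which directory objects exist (and where DATA_PUMP_DIR actually points on your server), a quick query as a privileged user:

```sql
-- List all directory objects; DATA_PUMP_DIR should appear with its
-- default path unless it has been redefined.
SELECT directory_name, directory_path
  FROM dba_directories
 ORDER BY directory_name;
```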

13 Export Preliminary Setup
mkdir /backup/<database>/datapump
Set up your environment for ORACLE_HOME and ORACLE_SID, then:
sqlplus "/ as sysdba"
create directory datapump_dir as '/backup/<database>/datapump' ;
grant read, write on directory datapump_dir to <dp_schema> ;
grant exp_full_database to <dp_schema> ;

14 Start Datapump Export Job
expdp parfile=/backup/<database>/datapump/expdp_<database>_<db_schema>.par
expdp <db_schema>/<password> directory=datapump_dir schemas=<schema> dumpfile=expdp_<database>_<schema>.dmp parallel=4 job_name=job_<database>_<schema>

15 Example Export Parameter File
userid=<dp_schema>/<password>
dumpfile=expdp_<database>_<schema>.dmp
logfile=expdp_<database>_<schema>.log
directory=datapump_dir
schemas=<schema>
job_name=job_expdp_<database>_<schema>
status=240

16 Datapump Import Setup
Make a server file system directory
Create a database directory object
Grant read privileges on the directory
Grant privileges for full import
Create an import parameter file

17 Import Preliminary Setup
mkdir /backup/<database>/datapump
Set up your environment for ORACLE_HOME and ORACLE_SID, then:
sqlplus "/ as sysdba"
create directory datapump_dir as '/backup/<database>/datapump' ;
grant read, write on directory datapump_dir to <dp_schema> ;
grant imp_full_database to <dp_schema> ;

18 Start Datapump Import Job
impdp parfile=/backup/<database>/datapump/impdp_<database>_<schema>.par
impdp <dp_schema>/<password> directory=datapump_dir table_exists_action=truncate dumpfile=expdp_<database>_<schema>.dmp parallel=4 job_name=job_impdp_<database>_<schema>
(Note the dump file read by the import is the one the export wrote.)

19 Example Import Parfile
userid=<dp_schema>/<password>
schemas=<schema>
exclude=grant
directory=datapump_dir
dumpfile=expdp_<database>_<schema>.dmp
table_exists_action=replace

20 Some Basic Parameters
directory=datapump_dir – the Data Pump directory object defined in the database
schemas=User1,User2,User3
dumpfile=datapump_job_file%U.dmp
tables=Table1,Table2
estimate=statistics – estimates the size of the export; the default is blocks

21 Important Features
EXCLUDE – exclude schemas or object types
REMAP_SCHEMA – user1 to user2
REMAP_TABLESPACE – user1_data to user2_data
SQLFILE – write a script of SQL (DDL) statements
STATUS – list job status every few seconds
JOB_NAME – run as a named instance job

22 Exclude
EXCLUDE=USER – exclude user definitions, but not the objects owned by those users
EXCLUDE=GRANT – exclude grants of privileges
EXCLUDE=VIEW, EXCLUDE=PACKAGE, EXCLUDE=FUNCTION – exclude a specific type of object
EXCLUDE=INDEX:"LIKE 'EMP%'" – exclude indexes whose names start with EMP

23 Include
INCLUDE=PROCEDURE – include just the procedure objects
INCLUDE=TABLE:"IN ('MANAGERS','FACILITIES')"
INCLUDE=INDEX:"LIKE 'JOB%'"
Note: the INCLUDE and EXCLUDE parameters are mutually exclusive

24 Network_Link
NETWORK_LINK=database_link
On export, data is retrieved from the referenced database and written to a dump file on the local server.
On import, data retrieved from the referenced database is imported directly into the current database.
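Both directions can be sketched as one-liners; "remote_db" is a placeholder database link that must already exist in the local database, and the user and schema names are illustrative:

```shell
# Sketch: export over a database link -- data comes from the remote
# database but the dump file is written on the local server.
expdp dp_admin/secret directory=datapump_dir network_link=remote_db \
      schemas=hr dumpfile=expdp_remote_hr.dmp logfile=expdp_remote_hr.log

# Sketch: import straight from the remote database -- no dump file
# is written at all, only a log file.
impdp dp_admin/secret directory=datapump_dir network_link=remote_db \
      schemas=hr logfile=impdp_remote_hr.log
```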

25 Filters
QUERY=employees:"WHERE department_id > 10 AND salary > 10000"
QUERY=salary:"WHERE manager_id <> 13"
What can I filter with EXCLUDE and INCLUDE? See the views DATABASE_EXPORT_OBJECTS, SCHEMA_EXPORT_OBJECTS, and TABLE_EXPORT_OBJECTS
For example: select object_path, comments from schema_export_objects where object_path not like '%/%' ;

26 Table_Exists_Action
Skip
Append
Truncate
Replace

27 Import Parameters
remap_schema=User1:User2
remap_tablespace=User1_tblspace:User2_tblspace
transform=oid:n – do not reuse the original object identifiers (OIDs)
transform=segment_attributes:n – useful when you do not want to keep the original storage definition
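A typical schema refresh combines these remap parameters in one command. A hedged sketch, with all user, file, and tablespace names as placeholders:

```shell
# Sketch: refresh user1's exported data into user2, moving it to
# user2's tablespace and dropping the stored segment attributes so
# objects pick up the target tablespace defaults.
impdp dp_admin/secret directory=datapump_dir \
      dumpfile=expdp_orcl_user1.dmp \
      remap_schema=user1:user2 \
      remap_tablespace=user1_data:user2_data \
      transform=segment_attributes:n \
      table_exists_action=replace
```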

28 Interactively Work With The Job
Check on the status
Stop the job
Restart the job
Kill the job

29 Check Status of Datapump Job
select job_name, operation, job_mode, state from user_datapump_jobs ;
expdp <dp_schema>/<password> attach=<job_name>
STATUS – show the current job status
EXIT_CLIENT – exit the client but leave the job running
CONTINUE_CLIENT – resume logging

30 Kill Datapump Job
select job_name, operation, job_mode, state from user_datapump_jobs ;
expdp <dp_schema>/<password> attach=<job_name>
KILL_JOB

31 Interactive Commands
ADD_FILE – add a dump file to the dump file set
CONTINUE_CLIENT – restart logging; resumes the job if idle
EXIT_CLIENT – exit the interactive session
FILESIZE – default file size for new files (ADD_FILE)
HELP – list the interactive session commands
KILL_JOB – delete the attached job and exit

32 More Interactive Commands
PARALLEL – specify the maximum number of active workers
Set it to no more than twice the number of CPUs
Worker processes are created as needed
REUSE_DUMPFILES – overwrite the dump file if it exists
STOP_JOB – stop job execution and exit the client
START_JOB – start or resume the current job

33 Restrictions
Data Pump is an Oracle utility, so a Data Pump dump file can only be imported by Data Pump.
You can still get the error "snapshot too old".
If the job was started using "/ as sysdba", you need to know the Oracle database SYSTEM password to check status, kill the job, etc.

34 Oracle 11g Features
COMPRESSION – besides NONE and METADATA_ONLY, the new options are ALL and DATA_ONLY
ENCRYPTION – Oracle 10g exported already-encrypted columns; Oracle 11g can encrypt all of the metadata and/or data
DATA_OPTIONS=XML_CLOBS – exports XML columns in uncompressed CLOB format

35 More Oracle 11g Features
PARTITION – on import only; used to merge partitions into one table. Partition options are departition and merge (all partitions)
TRANSPORTABLE – permits exporting metadata for specific tables
REMAP_DATA – enables data to be modified to obscure sensitive information

36 Oracle 11g OEM
Data Movement tab, Move Row Data section:
Export to Export Files (expdp)
Import from Export Files (impdp)
Import from Database (NETWORK_LINK)
Monitor Export and Import Jobs

37 Problems Encountered
A previous run left a Data Pump dump file with the same name as the current job's
Space on the tablespaces – the job suspends; make the file extensible or add another datafile
NFS mounted soft – set the event and bounce the database, or migrate to Oracle 11g

38 Job Will Not Die
If the Data Pump processes have been killed:
Drop the master table that has the same name as the job
Delete the dump file
If the dump file has been deleted or moved, you can no longer attach to the job

39 Review
Use Data Pump as another tool for the DBA
Take the time to set it up properly
Learn the basic and rich features
Create scripts for backups and refreshes
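The "scripts for backups" point can be sketched as a small shell wrapper that writes a dated parameter file and then hands it to expdp. The schema, database name, path, and credentials are placeholders, and the expdp call itself is left commented out so the sketch only builds the parfile:

```shell
#!/bin/sh
# Sketch of a nightly backup script: generate a dated export parfile.
# Schema (hr), database (orcl), and credentials are placeholders.
STAMP=$(date +%Y%m%d)
PARFILE=/tmp/expdp_orcl_hr_${STAMP}.par

cat > "$PARFILE" <<EOF
userid=dp_admin/secret
schemas=hr
directory=datapump_dir
dumpfile=expdp_orcl_hr_${STAMP}_%U.dmp
logfile=expdp_orcl_hr_${STAMP}.log
parallel=4
job_name=job_expdp_orcl_hr_${STAMP}
EOF

# expdp parfile="$PARFILE"   # uncomment on a host with Oracle installed
echo "wrote $PARFILE"
```

Driving expdp from a parfile like this keeps the credentials and options out of the shell history and makes the job name and file names consistent between runs.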

40 Comments
Comments or questions
Thank you for coming
References:
Oracle Database Utilities 10g Release 2 (10.2), Part #B
Oracle Database Utilities 11g Release 1 (11.1), Part #B
Oracle is a registered trademark of Oracle Corp.

41 The End
Speaker: Gary M. Noble
Session name: Data Pump
Contact information for further questions: Be sure to include UTOUG Training Days in the title.
Thank You

