Sunday, 30 June 2013

Export and Import in Oracle

These tools are used to transfer data from one Oracle database to another. You use the Export tool to export data from the source database and the Import tool to load data into the target database. When you export tables from the source database, the Export tool extracts the tables and puts them into a dump file. This dump file is then transferred to the target database, where the Import tool copies the data from the dump file into the target database.
From version 10g onwards, Oracle recommends using the Data Pump Export and Import tools, which are enhanced versions of the original Export and Import tools.
The export dump file contains objects in the following order:
  1. Type definitions
  2. Table definitions
  3. Table data
  4. Table indexes
  5. Integrity constraints, views, procedures, and triggers
  6. Bitmap, function-based, and domain indexes

When you import the tables, the Import tool performs its actions in the following order:
  1. New tables are created
  2. Data is imported and indexes are built
  3. Triggers are imported
  4. Integrity constraints are enabled on the new tables
  5. Any bitmap, function-based, and/or domain indexes are built
This sequence prevents data from being rejected due to the order in which tables are imported. It also prevents redundant triggers from firing twice on the same data.
You can run the Export and Import tools in two modes:
            Command Line Mode
            Interactive Mode
When you just type exp or imp at the o/s prompt, the tool runs in interactive mode, i.e. it prompts you for all the necessary input. If you supply command line arguments when calling exp or imp, it runs in command line mode.
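For example, typing just
$exp
starts an interactive session that prompts you for the username, password, dump file name, tables to export and so on, whereas a command such as the following (the file and table names here are only illustrative) runs entirely in command line mode
$exp scott/tiger FILE=emp.dmp TABLES=(emp)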
You can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:
     Format:  EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
     Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
               or TABLES=(T1:P1,T1:P2), if T1 is partitioned table


Keyword                 Description (Default)
--------------------------------------------------------------
USERID                  username/password
BUFFER                  size of data buffer
FILE                    output files (EXPDAT.DMP)
COMPRESS                import into one extent (Y)
GRANTS                  export grants (Y)
INDEXES                 export indexes (Y)
DIRECT                  direct path (N)
LOG                     log file of screen output
ROWS                    export data rows (Y)
CONSISTENT              cross-table consistency (N)
FULL                    export entire file (N)
OWNER                   list of owner usernames
TABLES                  list of table names
RECORDLENGTH            length of IO record
INCTYPE                 incremental export type
RECORD                  track incr. export (Y)
TRIGGERS                export triggers (Y)
STATISTICS              analyze objects (ESTIMATE)
PARFILE                 parameter filename
CONSTRAINTS             export constraints (Y)
OBJECT_CONSISTENT       transaction set to read only during object export (N)
FEEDBACK                display progress every x rows (0)
FILESIZE                maximum size of each dump file
FLASHBACK_SCN           SCN used to set session snapshot back to
FLASHBACK_TIME          time used to get the SCN closest to the specified time
QUERY                   select clause used to export a subset of a table
RESUMABLE               suspend when a space related error is encountered (N)
RESUMABLE_NAME          text string used to identify resumable statement
RESUMABLE_TIMEOUT       wait time for RESUMABLE
TTS_FULL_CHECK          perform full or partial dependency check for TTS
TABLESPACES             list of tablespaces to export
TRANSPORT_TABLESPACE    export transportable tablespace metadata (N)
TEMPLATE                template name which invokes iAS mode export

The Export and Import tools support four modes of operation:
            FULL             : Exports all the objects in all schemas
            OWNER            : Exports objects only belonging to the given OWNER
            TABLES           : Exports individual tables
            TABLESPACE       : Exports all objects located in a given TABLESPACE
Example of Exporting Full Database
The following example shows how to export the full database
$exp USERID=scott/tiger FULL=y FILE=myfull.dmp
In the above command, the FILE option specifies the name of the dump file, the FULL option specifies that you want to export the full database, and the USERID option specifies the user account to connect to the database. Note that to perform a full export the user should have the DBA or EXP_FULL_DATABASE role.
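If the user does not already have it, a DBA could grant the EXP_FULL_DATABASE role first, for example:
SQL> GRANT EXP_FULL_DATABASE TO scott;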
To export objects stored in particular schemas you can run the Export utility with the following arguments
$exp USERID=scott/tiger OWNER=(SCOTT,ALI) FILE=exp_own.dmp
The above command will export all the objects stored in the SCOTT and ALI schemas.
To export individual tables give the following command
$exp USERID=scott/tiger TABLES=(scott.emp,scott.sales) FILE=exp_tab.dmp
This will export scott’s emp and sales tables.
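The TABLESPACE mode mentioned earlier works the same way; a rough sketch is shown below (the tablespace and file names are illustrative, and this mode normally requires the EXP_FULL_DATABASE role):
$exp USERID=system/manager TABLESPACES=(users) FILE=exp_ts.dmp
This would export the tables (and their dependent objects) stored in the users tablespace.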
If you include the CONSISTENT=Y option in the export command, the Export utility will export a consistent image of the table, i.e. changes made to the table during the export operation will not be exported.
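For example, a sketch of a consistent single-table export might look like this (the dump file name is illustrative):
$exp USERID=scott/tiger TABLES=(scott.emp) FILE=exp_emp.dmp CONSISTENT=y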
Objects exported by the Export utility can only be imported by the Import utility. The Import utility can run in interactive mode or command line mode.
You can let Import prompt you for parameters by entering the IMP command followed by your username/password:
     Example: IMP SCOTT/TIGER
Or, you can control how Import runs by entering the IMP command followed
by various arguments. To specify parameters, you use keywords:
     Format:  IMP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
     Example: IMP SCOTT/TIGER IGNORE=Y TABLES=(EMP,DEPT) FULL=N
               or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword                 Description (Default)
--------------------------------------------------------------
USERID                  username/password
BUFFER                  size of data buffer
FILE                    input files (EXPDAT.DMP)
SHOW                    just list file contents (N)
IGNORE                  ignore create errors (N)
GRANTS                  import grants (Y)
INDEXES                 import indexes (Y)
ROWS                    import data rows (Y)
LOG                     log file of screen output
FULL                    import entire file (N)
FROMUSER                list of owner usernames
TOUSER                  list of usernames
TABLES                  list of table names
RECORDLENGTH            length of IO record
INCTYPE                 incremental import type
COMMIT                  commit array insert (N)
PARFILE                 parameter filename
CONSTRAINTS             import constraints (Y)
DESTROY                 overwrite tablespace data file (N)
INDEXFILE               write table/index info to specified file
SKIP_UNUSABLE_INDEXES   skip maintenance of unusable indexes (N)
FEEDBACK                display progress every x rows (0)
TOID_NOVALIDATE         skip validation of specified type ids
FILESIZE                maximum size of each dump file
STATISTICS              import precomputed statistics (always)
RESUMABLE               suspend when a space related error is encountered (N)
RESUMABLE_NAME          text string used to identify resumable statement
RESUMABLE_TIMEOUT       wait time for RESUMABLE
COMPILE                 compile procedures, packages, and functions (Y)
STREAMS_CONFIGURATION   import streams general metadata (Y)
STREAMS_INSTANTIATION   import streams instantiation metadata (N)
To import individual tables from a full database export dump file, give the following command
$imp scott/tiger FILE=myfullexp.dmp FROMUSER=scott TABLES=(emp,dept)
This command will import only the emp and dept tables into the scott user's schema, and you will get output similar to that shown below
Export file created by EXPORT:V10.00.00 via conventional path
import done in WE8DEC character set and AL16UTF16 NCHAR character set
importing SCOTT's objects into SCOTT
. . importing table                         "DEPT"          4 rows imported
. . importing table                          "EMP"         14 rows imported
Import terminated successfully without warnings.
For example, suppose Ali has exported tables into a dump file mytables.dmp. Now Scott wants to import these tables. To achieve this, Scott will give the following import command
$imp scott/tiger  FILE=mytables.dmp FROMUSER=ali TOUSER=scott
The Import utility will then give a warning that the tables in the dump file were exported by user Ali and not by you, and then proceed.
Suppose you want to import all tables from a dump file whose names match a particular pattern. To do so, use the "%" wildcard character in the TABLES option. For example, the following command will import all tables whose names start with the letter "e" and all tables whose names contain the letter "d"
$imp scott/tiger FILE=myfullexp.dmp FROMUSER=scott TABLES=(a%,%d%)
Migrating a Database across platforms.
The Export and Import utilities are the only method that Oracle supports for moving an existing Oracle database from one hardware platform to another. This includes moving between UNIX and NT systems and also moving between two NT systems running on different platforms.
The following steps present a general overview of how to move a database between platforms.
  1. As a DBA user, issue the following SQL query to get the exact names of all tablespaces. You will need this information later in the process.
SQL> SELECT tablespace_name FROM dba_tablespaces;
  2. As a DBA user, perform a full export from the source database, for example:
exp system/manager FULL=y FILE=myfullexp.dmp
  3. Move the dump file to the target database server. If you use FTP, be sure to copy it in binary format (by entering binary at the FTP prompt) to avoid file corruption.
  4. Create a database on the target server.
  5. Before importing the dump file, you must first create your tablespaces, using the information obtained in Step 1. Otherwise, the import will create the corresponding datafiles in the same file structure as at the source database, which may not be compatible with the file structure on the target system.
  6. As a DBA user, perform a full import with the IGNORE parameter enabled:
> imp system/manager FULL=y IGNORE=y FILE=myfullexp.dmp
Using IGNORE=y instructs Oracle to ignore any creation errors during the import and permit the import to complete.

  7. Perform a full backup of your new database.


Starting with Oracle 10g, Oracle introduced an enhanced version of the Export and Import utilities known as Data Pump. Data Pump is similar to the Export and Import utilities but it has many advantages. Some of the advantages are:
  • Most Data Pump export and import operations occur on the Oracle database server, i.e. all the dump files are created on the server even if you run the Data Pump utility from a client machine. This results in increased performance because data is not transferred over the network.

  • You can stop and restart export and import jobs. This is particularly useful if you have started an export or import job and, after some time, want to do some other urgent work.

  • The ability to detach from and reattach to long-running jobs without affecting the job itself. This allows DBAs and other operations personnel to monitor jobs from multiple locations.

  • The ability to estimate how much space an export job would consume, without actually performing the export (see the example after this list)

  • Support for an interactive-command mode that allows monitoring of and interaction with ongoing jobs
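As a sketch of the space-estimate feature mentioned above, Data Pump Export provides an ESTIMATE_ONLY parameter; the following illustrative command reports the estimated size of a schema export without writing a dump file:
$expdp scott/tiger ESTIMATE_ONLY=y SCHEMAS=SCOTT NOLOGFILE=y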

To use Data Pump, the DBA has to create a directory on the server machine and create a directory object in the database mapping to the directory created in the file system.
The following example creates a directory in the filesystem, creates a directory object in the database, and grants privileges on the directory object to the SCOTT user.
$mkdir my_dump_dir
$sqlplus
Enter user-name: / as sysdba
SQL> create directory data_pump_dir as '/u01/oracle/my_dump_dir';
Now grant access on this directory object to the SCOTT user
SQL> grant read,write on directory data_pump_dir to scott;
To export the full database, give the following command
$expdp  scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp
             LOGFILE=myfullexp.log JOB_NAME=myfullJob
The above command will export the full database and create the dump file full.dmp in the directory /u01/oracle/my_dump_dir on the server.
In some cases, where the database is in terabytes, the above command will not be feasible since the dump file size will be larger than the operating system file size limit, and hence the export will fail. In this situation you can create multiple dump files by typing the following command
$expdp  scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp
       FILESIZE=5G  LOGFILE=myfullexp.log JOB_NAME=myfullJob
This will create multiple dump files named full01.dmp, full02.dmp, full03.dmp and so on. The FILESIZE parameter specifies the maximum size of each dump file.

To export all the objects of the SCOTT schema you can run the following Data Pump export command.
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT
You can omit the SCHEMAS parameter, since the default mode of Data Pump Export is schema mode, which exports the current user's schema.
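For instance, the shorter command below relies on that default and should export the same SCOTT schema objects:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp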
If you want to export objects of multiple schemas you can specify the following command
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT,HR,ALI
You can use the Data Pump Export utility to export individual tables. The following example shows the syntax to export tables
$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tables.dmp  
               TABLES=employees,jobs,departments

Exporting Tables located in a Tablespace
If you want to export tables located in a particular tablespace you can type the following command

$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tbs.dmp
TABLESPACES=tbs_4, tbs_5, tbs_6

The above command will export all the objects located in the tbs_4, tbs_5 and tbs_6 tablespaces.

You can exclude objects while performing an export by using the EXCLUDE option of the Data Pump utility. For example, if you are exporting a schema and don't want to export tables whose names start with "A", you can type the following command
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT EXCLUDE=TABLE:"like 'A%'"
Then all tables in Scott's schema whose names start with "A" will not be exported.
Similarly, you can use the INCLUDE option to export only certain objects, like this
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT INCLUDE=TABLE:"like 'A%'"
This is the opposite of the EXCLUDE option, i.e. it will export only those tables of Scott's schema whose names start with "A".
Similarly, you can also exclude other object types such as indexes, constraints, grants, users and schemas.
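For example, a sketch of a schema export that skips indexes, constraints and grants entirely might look like this (the dump file name is illustrative):
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_noidx.dmp
         SCHEMAS=SCOTT EXCLUDE=INDEX EXCLUDE=CONSTRAINT EXCLUDE=GRANT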
You can use the QUERY option to export only the required rows. For example, the following will export only those rows of the emp table whose dept_id is greater than 10 and whose sal is above 10000.
 expdp hr/hr QUERY=emp:'"WHERE dept_id > 10 AND sal > 10000"'
        NOLOGFILE=y DIRECTORY=dpump_dir1 DUMPFILE=exp1.dmp

You can suspend running export jobs and later resume or kill them using Data Pump Export. You can start a job on one client machine and then, if some other work comes up, suspend it. Afterwards, when your other work is finished, you can continue the job from the same client where you stopped it, or you can restart the job from another client machine.
For example, suppose a DBA starts a full database export by typing the following command on one client machine, CLNT1:
$expdp scott/tiger@mydb FULL=y DIRECTORY=data_pump_dir  
     DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob
After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. He will then get the Export> prompt, where he can type interactive commands.
Now he wants to stop this export job, so he types the following command:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
The job is placed in a stopped state and the client exits.
After finishing his other work, the DBA wants to resume the export job, but the client machine from which he originally started the job is locked (the user has locked his/her cabin). So the DBA goes to another client machine and reattaches to the job by typing the following command:
$expdp hr/hr@mydb ATTACH=myfulljob
After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging mode and restart the myfulljob job.
Export> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.
Note: After reattaching to the job, a DBA can also kill it by typing KILL_JOB if he doesn't want to continue with the export job.



Objects exported by the Data Pump Export utility can be imported into a database using the Data Pump Import utility. The following describes how to use the Data Pump Import utility to import objects.
If you want to import all the objects in a dump file, you can type the following command.
$impdp hr/hr DUMPFILE=dpump_dir1:expfull.dmp FULL=y
LOGFILE=dpump_dir2:full_imp.log

This example imports everything from the expfull.dmp dump file. In this example, a DIRECTORY parameter is not provided. Therefore, a directory object must be provided as part of both the DUMPFILE parameter and the LOGFILE parameter.
The following example loads all tables belonging to the hr schema into the scott schema

$impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
REMAP_SCHEMA=hr:scott

If the SCOTT account exists in the database, the hr objects will be loaded into the scott schema. If the scott account does not exist, the Import utility will create the SCOTT account with an unusable password, because the dump file was exported by the user SYSTEM and is being imported by the user SYSTEM, who has DBA privileges.
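In that case the newly created account cannot be used to log in until a DBA assigns it a password, for example (the password shown is only illustrative):
SQL> ALTER USER scott IDENTIFIED BY new_password;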
You can use the REMAP_TABLESPACE option to import objects of one tablespace into another tablespace by giving the following command
$impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
REMAP_TABLESPACE=users:sales

The above example loads the tables stored in the users tablespace into the sales tablespace.
Generating SQL File containing DDL commands using Data Pump Import

You can generate a SQL file which contains all the DDL commands that Import would have executed if you had actually run the Import utility.
The following is an example of using the SQLFILE parameter.
impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp
SQLFILE=dpump_dir2:expfull.sql
A SQL file named expfull.sql is written to dpump_dir2.
If you have the IMP_FULL_DATABASE role, you can use the SCHEMAS parameter to perform a schema-mode import by specifying a single schema other than your own or a list of schemas to import. First, the schemas themselves are created (if they do not already exist), including system and role grants, password history, and so on. Then all objects contained within the schemas are imported. Nonprivileged users can specify only their own schemas. In that case, no information about the schema definition is imported, only the objects contained within it.
The following is an example of using the SCHEMAS parameter. You can create the expdat.dmp file used in this example by running the example provided for the Export SCHEMAS parameter.
$impdp hr/hr SCHEMAS=hr,oe DIRECTORY=dpump_dir1 LOGFILE=schemas.log
DUMPFILE=expdat.dmp

The hr and oe schemas are imported from the expdat.dmp file. The log file, schemas.log, is written to dpump_dir1
The following example shows a simple use of the TABLES parameter to import only the employees and jobs tables from the expfull.dmp file. You can create the expfull.dmp dump file used in this example by running the example provided for the Full Database Export in the previous topic.
$impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp TABLES=employees,jobs
This will import only employees and jobs tables from the DUMPFILE.
Similar to Data Pump Export jobs, Data Pump Import jobs can also be suspended, resumed or killed. And you can attach to an already running import job from any client machine.
For example, suppose a DBA starts a full database import by typing the following command on one client machine, CLNT1:
$impdp scott/tiger@mydb FULL=y DIRECTORY=data_pump_dir  
     DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob
After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. He will then get the Import> prompt, where he can type interactive commands.
Now he wants to stop this import job, so he types the following command:
Import> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
The job is placed in a stopped state and the client exits.
After finishing his other work, the DBA wants to resume the import job, but the client machine from which he originally started the job is locked (the user has locked his/her cabin). So the DBA goes to another client machine and reattaches to the job by typing the following command:
$impdp hr/hr@mydb ATTACH=myfulljob
After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging mode and restart the myfulljob job.
Import> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.
Note: After reattaching to the job, a DBA can also kill it by typing KILL_JOB if he doesn't want to continue with the import job.
