Dear Readers,
My name is Franz Devantier, creator of this blog. I am an Oracle Certified
Professional (OCP DBA 11g) Security DBA.
I will be sharing with you the basic duties of an Oracle DBA, and also some of the undocumented and not-so-well-known tasks.
Datapump Exports:
The first thing you need to do to set up datapump exports is to verify that you have a suitable directory object.
Get to a shell in Unix, or a command prompt in Windows:
export ORACLE_SID=<SidName>    (Unix)
set ORACLE_SID=<SidName>       (Windows)
sqlplus / as sysdba
Alternatively, connect as your DBA user.
SQL> set linesize 120
SQL> select directory_name||' : '||directory_path from dba_directories;
This should give you a list of directory names and directory paths. Determine whether one of these directory paths is suitable for your purposes: go to the directory path and check whether it is already being used for something else. It may not be suitable for several reasons; for example, there may not be sufficient space on the file system (or, on Windows, the drive), or it may already be used for other purposes and you want to keep things separate. In that case you would first create a new directory object.
For example:
SQL> create directory <Directory_name> as '<Directory_path>';
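For instance, assuming a hypothetical directory object named DP_EXPORT_DIR pointing at /u01/app/oracle/dpdump (substitute your own name and path):
SQL> create directory DP_EXPORT_DIR as '/u01/app/oracle/dpdump';
Note that the operating system path must already exist on the database server; Oracle does not create it for you.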
Then you need to grant rights to the user, who will be running datapump. This may be the system user, or your DBA_user, or another designated user.
SQL> grant execute, read, write on directory <Directory_name> to <user_name>;
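Continuing the hypothetical example, to allow a SCOTT user to run datapump against that directory object (read and write are the privileges datapump actually needs):
SQL> grant read, write on directory DP_EXPORT_DIR to scott;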
Now you are ready to start creating your datapump export script. Have a look at the syntax by invoking datapump with the help option, which displays only the available parameters:
> expdp help=y
The best approach here is to create a parameter file, named for example:
expdp_<dbname>.par
There are three important parameters to have in this file:
DIRECTORY=<directory_name>
DUMPFILE=<dumpfile_name>.dmp
LOGFILE=<logfile_name>.log
Note that these names are not fully qualified, because the Directory parameter will point to the path. So the parameters DUMPFILE and LOGFILE should contain only the file names.
You also need to specify what you need to export. If you are not sure, then you can export the entire database:
FULL=Y
If you know which schemas you need, then you can list the schema names instead:
SCHEMAS=scott
Or if you only need a few tables then:
TABLES=(schema.table_name,schema.table_name:partition_name)
Exporting at the table level enables you to drill down to specific partitions, which can save space and time if you have large partitioned tables.
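As an illustration, here is a hypothetical parameter file that exports one whole table and a single partition of another (all names are invented for the example):
DIRECTORY=DP_EXPORT_DIR
DUMPFILE=scott_tables.dmp
LOGFILE=scott_tables.log
TABLES=(scott.emp,scott.sales:sales_q1)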
Now that you have set up your parameter file, you are ready to run the export. You may decide to run everything from the command line and simply keep the parameter file as a form of documentation, or you may run the export using the parameter file itself.
From the command line:
> expdp user@connect_string directory=<directory_name> dumpfile=<dumpfile_name> logfile=<logfile_name> full=y
You will then be prompted for the password. This is best practice, because it avoids the password being visible while the process is running.
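As a concrete illustration with hypothetical values (SYSTEM user, ORCL connect string, and the directory object from earlier):
> expdp system@ORCL directory=DP_EXPORT_DIR dumpfile=full_ORCL.dmp logfile=full_ORCL.log full=y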
Using a parameter file:
> expdp user@connect_string parfile=expdp_<dbname>.par
You may find that you need to specify the fully qualified path for the parameter file.
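For example, with a hypothetical parameter file kept under /home/oracle/scripts:
> expdp system@ORCL parfile=/home/oracle/scripts/expdp_ORCL.par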
Here is one last parameter that you may like to add to your parameter file. With datapump, if the dumpfile already exists, the job stops with an error before it starts. The old exp utility would simply have overwritten it.
REUSE_DUMPFILES=Y
This parameter will cause your old dumpfile of the same name to be automatically overwritten. Make sure that this is what you want before you add this parameter in.
So your final parameter file could look something like this:
DIRECTORY=<directory_name>
DUMPFILE=<dumpfile_name>.dmp
LOGFILE=<logfile_name>.log
TABLES=(schema.table_name1,schema.table_name2)
REUSE_DUMPFILES=Y
If you run this directly from the command line on UNIX or Windows, you cannot close that command window or shell until the export has finished. If you connected to a Windows server with "Remote Desktop Connection", you can click the 'X' in the top right-hand corner and the server will continue to process your script until you log in again. The problem with this is that, by default, only two concurrent Remote Desktop connections are allowed, so somebody may bump your session off if they need to connect to the server a little later on. To overcome this issue on Windows you can run the export from the scheduler.
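One possible sketch, using the Windows schtasks command with a hypothetical task name, start time, and parameter file path (the Task Scheduler GUI works just as well):
> schtasks /create /tn "DatapumpExport" /sc once /st 22:00 /tr "cmd /c expdp system@ORCL parfile=C:\scripts\expdp_ORCL.par"
Keep in mind that a scheduled task cannot respond to a password prompt, so the connection details would need to be handled differently in that case.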
On UNIX, you would otherwise have to sit around until it is finished before you can close the shell. Running it from cron, the UNIX scheduler, would also give you more options. However, here is one common way to work around this problem on UNIX:
Put the process in the background, or start it in the background. Furthermore, if you run it with nohup, you will afterwards have a log file of everything that was spooled to the screen. For example:
nohup expdp user@connect_string parfile=expdp_<dbname>.par &
Now you can watch in real time what is happening, like this: tail -50f nohup.out
If you need to exit the UNIX shell you can, because the export is running in the background. When you log in again, just run tail -50f nohup.out again to see where the process is. The contents of nohup.out should be close to what is in the log file we defined in this exercise, so it is a good idea to check them both. You can also follow the log file in the same way: "tail -50f <logfile_name>.log"
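If you want to confirm that the export is still running in the background, standard commands such as these will show it (output will vary by system):
> jobs
> ps -ef | grep expdp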
Now comes the important part: validating whether the export was successful or not.
You start off by going right to the end of the log file to see whether there were any errors. If you see a message saying it ended successfully without errors, you are good to go. If you see a message saying it finished with errors or warnings, you need to go back through the log file to determine what went wrong.
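For example, from the directory path you could check the tail of the log and scan it for Oracle errors (using the log file name from your parameter file):
> tail -5 <logfile_name>.log
> grep ORA- <logfile_name>.log
A clean run typically ends with a "successfully completed" message; any ORA- lines that grep finds are worth investigating.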
Always keep the log file with the dump file. The log file will tell you in detail what was exported and whether it was successful. Without the information in the log file, the dumpfile cannot be trusted as a valid source to recover from, and therefore has very limited value. So always keep your log files.
Well that should be enough to get you going on datapump exports. Keep looking at this blog for more database administration insights.
Franz Devantier,
Need a database health check?