Impdp Unable To Open Dump File For Read
ORA-31640: unable to open dump file "string" for read
Oracle Database Tips by Donald Burleson, November 18, 2015

Question: I am getting the ORA-31640 error during an import.
Well, I found the problem. I had a typo: I wrote "vahe.DMP" instead of "vahe.dmp" (lower case). I think the error message is not a good one, because it should clearly say that the file does not exist instead of saying "unable to open dump file '' for read" (IMHO).
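On Linux the file name you pass in DUMPFILE has to match the on-disk name exactly, including case. A minimal sketch of how to double-check this (the directory path, connect string and DIRECTORY object are placeholders; only vahe.dmp comes from the example above):

# confirm the exact case of the file on disk (path is an assumption)
ls -l /u01/app/oracle/admin/orcl/dpdump/vahe.dmp

# DUMPFILE must match that name exactly -- vahe.DMP would fail with ORA-31640
impdp system@orcl directory=DATA_PUMP_DIR dumpfile=vahe.dmp logfile=vahe_imp.log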
Description:

[From ELUNA SysAdmin list:]
We started our migration from 18.01 to 20.01 and, for each library we are migrating, we noticed this error in the Data Pump import error log:

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31640: unable to open dump file "/exlibris/aleph/upgrade_express_1901_2001/data/a18_1/pto01/files/dpdir/pto0101.dmp" for read
ORA-27037: unable to obtain file status
Linux-x86_64 Error: 13: Permission denied
Additional information: 3
Import (dp) pto01 ended

I tried a manual execution of the impdp command for the import and it ends the same way. Exporting without Data Pump seems to work.

Resolution:

[From Alexandra Major:]
Did you check the OS permissions on the dmp files (user, group and file permissions) and on the dpdir? The error messages 'unable to open for read' and 'permission denied' look like permission problems.

[From site:]
I checked the permissions for the files and they were right. But I never checked the permissions on /exlibris/aleph/upgrade_express_1901_2001, and they were 700 (for the aleph user). Probably impdp executes as the "oracle" user and not as the "aleph" user. Now it's working.
It looks like Oracle can't read the .dmp file because it's under another user's home directory. The oracle OS user has to be able to read it, not just the Linux user you're running the impdp command as. (Contrary to my earlier misleading comments!)
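One way to confirm this kind of problem is to check, with the path from the report above, that the oracle OS user can traverse every parent directory and read the file itself. A minimal sketch (the chmod is only an example; adjust it to your own security policy):

DUMP=/exlibris/aleph/upgrade_express_1901_2001/data/a18_1/pto01/files/dpdir/pto0101.dmp

# show the owner and permissions of every component of the path
namei -l "$DUMP"

# test whether the oracle user can actually read the file
sudo -u oracle head -c 1 "$DUMP" > /dev/null && echo "oracle can read it"

# open up the parent directory that was mode 700 in the report above
chmod o+rx /exlibris/aleph/upgrade_express_1901_2001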
The above ORA-39142: incompatible version number 4.2 in dump file error does not happen when you export from Oracle 12.1.0.2 with either the April or July 2018 Bundle Patch and then try to import into another Oracle 12.1.0.2 database that has a lower (or no) bundle patch applied.
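The text above doesn't spell out a workaround, but a general way to sidestep dump-file format mismatches is the Data Pump VERSION parameter, which makes expdp write the dump file in a format an older or lower-patched target can read. A minimal sketch (connect strings, schema and file names are placeholders):

# on the patched source: write a 12.1-compatible dump file
expdp system/manager@source_db schemas=hr version=12.1 \
      directory=DATA_PUMP_DIR dumpfile=hr_v121.dmp logfile=hr_exp.log

# on the lower-patched 12.1.0.2 target: import it as usual
impdp system/manager@target_db schemas=hr \
      directory=DATA_PUMP_DIR dumpfile=hr_v121.dmp logfile=hr_imp.log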
Oracle Data Pump is a utility that allows you to export Oracle data to a dump file and import it into another Oracle database. It is a long-term replacement for the Oracle Export/Import utilities. Oracle Data Pump is the recommended way to move large amounts of data from an Oracle database to an Amazon RDS DB instance.
If database components are invalidated, you can delete the DB instance and re-create it from the DB snapshot. The restored DB instance includes any dump files staged on the DB instance when you took the DB snapshot.
Don't import dump files that were created using the Oracle Data Pump export parameters TRANSPORT_TABLESPACES, TRANSPORTABLE, or TRANSPORT_FULL_CHECK. RDS for Oracle DB instances don't support importing these dump files.
If your dump file exceeds 5 TB, you can run the Oracle Data Pump export with the parallel option. This operation spreads the data into multiple dump files so that you do not exceed the 5 TB limit for individual files.
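A minimal sketch of such an export (connect string, schema and directory name are placeholders): PARALLEL starts several worker processes, the %U substitution variable gives each worker its own file, and the optional FILESIZE parameter additionally caps the size of every piece.

expdp admin/mypassword@sourcedb schemas=myschema \
      directory=DATA_PUMP_DIR \
      dumpfile=export_%U.dmp \
      filesize=500G \
      parallel=4 \
      logfile=export.log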
This process imports a dump file into the DATA_PUMP_DIR directory, a preconfigured directory on all Oracle DB instances. This directory is located on the same storage volume as your data files. When you import the dump file, the existing Oracle data files use more space. Thus, you should make sure that your DB instance can accommodate that additional use of space. The imported dump file is not automatically deleted or purged from the DATA_PUMP_DIR directory. To remove the imported dump file, use UTL_FILE.FREMOVE, found on the Oracle website.
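For example, you can list what is currently sitting in DATA_PUMP_DIR and then remove a dump file you no longer need. A sketch run through SQL*Plus (connect string and file name are placeholders):

sqlplus -s admin/mypassword@myrdsdb <<'SQL'
-- list the files currently in DATA_PUMP_DIR
SELECT filename, filesize
  FROM TABLE(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'))
 ORDER BY filename;

-- remove a dump file that is no longer needed
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'sample.dmp');
SQL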
Use the Amazon RDS procedure rdsadmin.rdsadmin_s3_tasks.upload_to_s3 to copy the dump file to the Amazon S3 bucket. The following example uploads all of the files from the DATA_PUMP_DIR directory to an Amazon S3 bucket named myS3bucket.
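A sketch of that call, run from a SQL client connected to the RDS instance (the connect string is a placeholder; the bucket name comes from the text above). The function returns a task ID you can use to check progress:

sqlplus -s admin/mypassword@myrdsdb <<'SQL'
-- upload everything in DATA_PUMP_DIR to the myS3bucket bucket
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
         p_bucket_name    => 'myS3bucket',
         p_directory_name => 'DATA_PUMP_DIR') AS task_id
  FROM dual;
SQL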
Perform this step using the Amazon RDS procedure rdsadmin.rdsadmin_s3_tasks.download_from_s3. When you download a file to a directory, the procedure download_from_s3 skips the download if an identically named file already exists in the directory. To remove a file from the download directory, use UTL_FILE.FREMOVE, found on the Oracle website.
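A sketch of the download call (connect string and bucket name are placeholders); like the upload, it returns a task ID:

sqlplus -s admin/mypassword@myrdsdb <<'SQL'
-- copy the objects from the bucket into DATA_PUMP_DIR on the target DB instance
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'myS3bucket',
         p_directory_name => 'DATA_PUMP_DIR') AS task_id
  FROM dual;
SQL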
Data Pump jobs are started asynchronously. For information about monitoring a Data Pump job, see Monitoring job status in the Oracle documentation. You can view the contents of the import log by using the rdsadmin.rds_file_util.read_text_file procedure. For more information, see Reading files in a DB instance directory.
Data Pump jobs are started asynchronously. For information about monitoring a Data Pump job, see Monitoring job status in the Oracle documentation. You can view the contents of the export log by using the rdsadmin.rds_file_util.read_text_file procedure. For more information, see Reading files in a DB instance directory.
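A sketch of reading such a log with that procedure (connect string and log file name are placeholders; the same call works for import and export logs):

sqlplus -s admin/mypassword@myrdsdb <<'SQL'
-- print the Data Pump log file line by line
SELECT text
  FROM TABLE(rdsadmin.rds_file_util.read_text_file(
               p_directory => 'DATA_PUMP_DIR',
               p_filename  => 'import.log'));
SQL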
Create a database link between your source DB instance and your target DB instance. Your local Oracle instance must have network connectivity to the DB instance in order to create a database link and to transfer your export dump file.
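A sketch of such a link, created on the source instance (user, password and endpoint are placeholders; the link name to_rds matches the next step):

sqlplus -s admin/mypassword@source_db <<'SQL'
-- database link from the source instance to the target RDS DB instance
CREATE DATABASE LINK to_rds
  CONNECT TO admin IDENTIFIED BY "target_password"
  USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=target.xxxxxxxx.us-east-1.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=ORCL)))';
SQL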
Use DBMS_FILE_TRANSFER to copy the dump file from the source database instance to the target DB instance. The following script copies a dump file named sample.dmp from the source instance to a target database link named to_rds (created in the previous step).
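A sketch of such a copy with DBMS_FILE_TRANSFER (the connect string and destination file name are placeholders; sample.dmp and the to_rds link come from the text above):

sqlplus -s admin/mypassword@source_db <<'SQL'
BEGIN
  -- push sample.dmp across the to_rds database link into the target's DATA_PUMP_DIR
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'sample.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'sample_copied.dmp',
    destination_database         => 'to_rds');
END;
/
SQL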
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud. In this post, we will install and use SQLcl and Data Pump from the Compute Instance (VM), connecting to our Oracle Database in the Autonomous Transaction Processing (ATP) Cloud. Although I use SQL Developer most of the time to connect to the database, I find it important to be able to use command-line tools too, as this is what you can automate and it's really fast.

In the previous post, we installed the command-line tools of the Oracle Cloud on our own machine, but for the Oracle tools, I prefer to install them on our Compute Instance in the Cloud. Especially when we want to automate something, it's easier to do this from another machine in the cloud. It also makes it easier to follow, as we only have to focus on how to install the Oracle tools on Linux.

Oracle Instant Client

In order to connect to an Oracle database from a machine, we will use the Oracle Instant Client software. You can download the software for the different operating systems, but as our VM is running Oracle Linux we can install it with just a few commands.

First, update yum so it knows where to find Oracle software:

yum install oracle-release-el7

Next, we can search for the Oracle Instant Client version we need:

yum search oracle-instant

We want to install the Oracle Instant Client version that matches the system we want to connect to. For the free Oracle Database on ATP it's Oracle Database Release 18.4, so we will pick Oracle Instant Client 18.5. To be honest, I typically take the latest version of the software, but when I tried that, the Oracle Instant Client complained that the libraries were not compatible with the version we wanted to connect to. I always thought you could use newer versions of the Oracle tools against older databases, but apparently that is no longer the case (at least not during my tests). Anyway, it's good to use the tool version that matches the version you connect to.

Install the Instant Client basic and tools packages:

yum install oracle-instantclient18.5-basic.x86_64
yum install oracle-instantclient18.5-tools.x86_64

As a last step, we set some environment variables:

export PATH=/usr/lib/oracle/18.5/client64/bin:$PATH
export LD_LIBRARY_PATH=/usr/lib/oracle/18.5/client64/lib
export TNS_ADMIN=/usr/lib/oracle/18.5/client64/lib/network/admin

That's it! We can now use the Oracle tools. Note there's also a SQL*Plus package, which allows you to connect from the command line to the database, but I prefer to use SQLcl as it has some cool features for Oracle APEX (e.g. exporting your app). Download SQLcl now.

Before we move on to installing SQLcl, make sure you still have the credentials (wallet) file we used when connecting with SQL Developer to our database. Just like with SQL Developer, we also need it with SQLcl to connect to our database.
As a reminder, here's the screenshot I'm talking about.

Upload both the SQLcl and credentials zip files to the Compute Instance (VM):

scp -i .ssh/oraclecloud /Users/dgielis/Downloads/wallet_DBDIMI.zip opc@132.145.215.55:/tmp
scp -i .ssh/oraclecloud /Users/dgielis/Downloads/sqlcl-19.2.1.206.1649.zip opc@132.145.215.55:/tmp

Connect to your VM and unzip the files:

ssh -i .ssh/oraclecloud opc@132.145.215.55
unzip /tmp/wallet_DBDIMI.zip -d /usr/lib/oracle/18.5/client64/lib/network/admin
unzip /tmp/sqlcl-19.2.1.206.1649.zip -d /opt

Before we can run SQLcl, we also need to make sure we have Java installed, as SQLcl depends on it:

yum install java

To make it easier to run SQLcl from anywhere, we create a symbolic link:

ln -s /opt/sqlcl/bin/sql /usr/lib/oracle/18.5/client64/bin/sql

Now we are ready to connect to our database on ATP:

sql admin@dbdimi_high

There we go... we can connect from our VM to our ATP database.

The next thing we want to do is export the data from our ATP database. We will use Data Pump, which came with the installation of the tools. Run the command to export the schema CLOUD:

expdp admin@dbdimi_high \
  exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link \
  data_options=group_partition_table_data \
  parallel=1 \
  schemas=cloud \
  dumpfile=export%u.dmp

So where did this export go? To the default DATA_PUMP_DIR directory, which we don't have direct access to... but to list the files in that directory we can do:

SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');

Remember my previous blog post about the Object Storage, in which we set up a Backups bucket? Oracle allows you to connect your Object Storage to your ATP database, and that is exactly what we will do further on :)

We will use the same user we created earlier for the CLI. In order to connect to ATP we need to set up an Auth Token. Go to the User Details of cliUser and click Auth Tokens, then click the Generate Token button. There's the token... you only see it once, so make sure to copy it.

Next, connect to your ATP database and run the script to add the credentials to the ATP database:

begin
  dbms_cloud.create_credential(
    credential_name => 'DEF_CRED_NAME',
    username        => 'cliUser',
    password        => 'Frx}R9lD0O}dIgZRGs{:'
  );
end;
/

Now that the DBMS_CLOUD package has credentials, we can do other calls with this package. To add the Data Pump export files to the Object Storage, we can use the PUT_OBJECT procedure. I created a small script to take all the files from DATA_PUMP_DIR and put them in the backups bucket in the Object Storage:

begin
  for r in (select object_name, bytes from dbms_cloud.list_files('DATA_PUMP_DIR'))
  loop
    dbms_cloud.put_object(
      credential_name => 'DEF_CRED_NAME',
      object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/id9u4qbhnjxj/b/backups/o/'||r.object_name,
      directory_name  => 'DATA_PUMP_DIR',
      file_name       => r.object_name);
  end loop;
end;
/

And when we check our bucket, we see the Data Pump export files! Yay!

We also want to export our Oracle APEX apps. In some projects I use the APEXExport utility, but now we will use SQLcl to export our APEX app 101:

apex export 101

In real life I typically create a few scripts which I can run one by one or combine into a general backup script.
The script will export the Oracle schemas and the APEX apps, and save the files to another location, in our case the Object Storage. Create the script:

vi make_backup.sh

Here are the details of the scripts which are called in the main backup script; a rough sketch of such a script and its crontab entry is included at the end of this post. You can schedule this script with crontab, for example, every day at 2AM.

The above is just an example of what you can do to automate your backups. You have to decide how frequently you want to take those backups.

If you want to move your existing Oracle database and APEX apps to the Oracle Cloud, the steps are similar to the above. You upload your Data Pump export file to your Object Storage. Next, run the Data Pump import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage and the credential parameter set to the name of the credential you created earlier. For example:

impdp admin/password@dbdimi_high \
  directory=data_pump_dir \
  credential=def_cred_name \
  dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/adwc/b/adwc_user/o/export%u.dmp \
  parallel=1 \
  partition_options=merge \
  transform=segment_attributes:n \
  transform=dwcs_cvt_iots:y \
  transform=constraint_use_default_index:y \
  exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link

Next, you would create your APEX workspace and import the APEX apps. In the next post, we dive into Oracle APEX again and look at how to send emails from your APEX app.
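The make_backup.sh script and crontab entry mentioned above could, for instance, look roughly like this; every path, script name and connect string below is an assumption, and password handling is reduced to an environment variable (a wallet would be a better choice):

#!/bin/bash
# make_backup.sh - rough sketch: export the schema, export the APEX app,
# and push the dump files to the Object Storage bucket
export PATH=/usr/lib/oracle/18.5/client64/bin:$PATH
export LD_LIBRARY_PATH=/usr/lib/oracle/18.5/client64/lib
export TNS_ADMIN=/usr/lib/oracle/18.5/client64/lib/network/admin

# Data Pump export of the CLOUD schema (same parameters as above);
# DB_PASSWORD is assumed to be provided by the environment
expdp admin/"$DB_PASSWORD"@dbdimi_high schemas=cloud dumpfile=export%u.dmp \
      exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link \
      data_options=group_partition_table_data parallel=1

# run the PUT_OBJECT loop from above and export APEX app 101 with SQLcl
sql admin/"$DB_PASSWORD"@dbdimi_high <<'EOF'
@/home/opc/scripts/put_dumps_in_bucket.sql
apex export 101
exit
EOF

# crontab entry to run the backup every day at 2AM:
# 0 2 * * * /home/opc/make_backup.sh >> /home/opc/make_backup.log 2>&1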