OracleDBA's

4,025 members

Selvakumar N.

Data Pump - some tips (oraclemamukutti.blogspot.in)

Data Pump is a utility for unloading/loading data and metadata into a set of operating system files called a dump file set. The dump file set can be imported only by the Data Pump Import utility. The dump file set can be...

  • Comment (8)
  • May 21, 2012

Comments

  • Henk V.

    DBA/Sapbeheerder at Gemeente Zutphen

Be careful: expdp cannot handle tables with LONG columns.

  • Selvakumar N.

    Senior Software Engineer at CGI

    Hi Henk, expdp can handle LONG tables, but the performance is slow.
    impdp cannot handle LONG tables only when done as a network import (NETWORK_LINK), which doesn't involve any dump files.
    So when LONG tables are involved, do a file-based export and import. It is possible.
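    To make the distinction concrete, here is a minimal sketch. The schema, table, directory object and database link names (scott, legacy_tab, dpump_dir, remote_db) are placeholders, not from this thread:

    ```shell
    # File-based export handles LONG columns (it may fall back to a
    # slower access method for them, as noted above):
    expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=legacy_tab.dmp \
        TABLES=legacy_tab

    # ...and the matching file-based import:
    impdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=legacy_tab.dmp \
        TABLES=legacy_tab

    # A network-mode import (no dump file) reads the data over a
    # database link, and LONG columns cannot be fetched over a db link,
    # so this variant fails for the LONG table (typically ORA-31679):
    impdp scott/tiger NETWORK_LINK=remote_db TABLES=legacy_tab
    ```

    Only the file-based pair moves the LONG data; the network-mode command is shown to illustrate the failing case.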

  • Henk V.

    DBA/Sapbeheerder at Gemeente Zutphen

    Selvakumar,
    I did an export with expdp (Oracle 10.2.4). After importing the table with a LONG column using impdp, the text (for example "hello") looked like ".h.e.l.l.o." and the application could not read the data. The table has 8443 rows with text fields, which I have now exported with SQL Developer, removed the LF, CR and NULL characters from, and re-imported.

  • Selvakumar N.

    Senior Software Engineer at CGI

    Henk,

    Please check the link below:
    http://aprakash.wordpress.com/2011/01/06/how-to-move-table-containing-long-or-lob-columns/

    This shows that Data Pump can deal with LONG data, subject to some limitations and restrictions; you can find those in the Oracle documentation:
    http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_overview.htm

  • -

    - -.

    This is not a DataPump issue, but a DB-Link restriction. Long columns can't be accessed remotely by db-link. But, neveretheless, Selvakumar is right, this data type can be exported/imported directly to/from dump file.

  • Karen P.

    Senior Oracle DBA at INC Research (formerly Kendle International)

    Hi,

    We are in the process of implementing Data Pump on a variety of different-sized databases. I just wondered if anyone had any advice or guidelines on setting the PARALLEL parameter. I know about 2 × the number of CPUs as a starting point, but that's all I can find.

    Thanks, Karen

  • Sandra H.

    Solution Architect at Liberty Health

    Parallelism is more effective when you split the dump files across filesystems, using the DUMPFILE parameter. Create the directory objects on different filesystems; if using HDP, try to get LUNs from different pools.

    expdp / dumpfile=dir1:test_1.dmp,dir1:test_2.dmp,dir2:test_3.dmp,dir3:test_4.dmp logfile=dir1:test.log full=y parallel=4

    where dir1, dir2 and dir3 are directory objects created in the database.
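    A related variation, assuming the same dir1/dir2 directory objects and placeholder credentials: Data Pump's %U substitution variable generates numbered dump files automatically, so each parallel worker gets its own file without listing every name by hand (note that PARALLEL greater than 1 requires Enterprise Edition):

    ```shell
    # %U expands to a two-digit sequence (test_01.dmp, test_02.dmp, ...)
    # and new files are spread across the listed directories as the
    # parallel workers need them:
    expdp system/password \
        DUMPFILE=dir1:test_%U.dmp,dir2:test_%U.dmp \
        LOGFILE=dir1:test.log FULL=Y PARALLEL=4

    # On import, the same wildcard picks the whole file set back up:
    impdp system/password \
        DUMPFILE=dir1:test_%U.dmp,dir2:test_%U.dmp \
        FULL=Y PARALLEL=4
    ```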
