As you are on 10g, you can do this with the Data Pump API. You will need read and write privileges on a directory object which maps to the destination OS directory.
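If a suitable directory object does not already exist, a DBA can create one and grant access along these lines (the OS path and the grantee here are placeholders, not values from your system):

create or replace directory data_pump_dir as '/u01/app/oracle/dpdump';  -- placeholder path on the server
grant read, write on directory data_pump_dir to scott;                  -- placeholder grantee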
In the following example I export two tables, EMP and DEPT, to a file called emp.dmp in the directory identified by the DATA_PUMP_DIR directory object.
declare
  dp_handle number;
begin
  -- open a table-mode export job
  dp_handle := dbms_datapump.open(
    operation => 'EXPORT',
    job_mode  => 'TABLE');

  -- the dump file, written to the DATA_PUMP_DIR directory object
  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp.dmp',
    directory => 'DATA_PUMP_DIR');

  -- a log file, so we can check progress and errors afterwards
  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- restrict the export to the EMP and DEPT tables
  dbms_datapump.metadata_filter(
    handle => dp_handle,
    name   => 'NAME_LIST',
    value  => '''EMP'',''DEPT''');

  -- start the job and detach; it carries on running in the background
  dbms_datapump.start_job(dp_handle);

  dbms_datapump.detach(dp_handle);
end;
/

PL/SQL procedure successfully completed.
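Because the block detaches straight away, it returns before the export has actually finished; the job carries on in the background. As a rough sketch (not part of the original example), you can keep an eye on it from another session through the data dictionary:

-- running Data Pump jobs owned by the current user
select job_name, state
from   user_datapump_jobs;

If you would rather have the PL/SQL block wait for the export to finish, dbms_datapump.wait_for_job(dp_handle, job_state) can be called in place of the immediate detach, where job_state is a VARCHAR2 OUT variable reporting the final state of the job.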
@DerekMahar asks:

"Is there a similar data pump tool or API available for execution from the client side"
Data Pump, both the PL/SQL API and the OS utility, writes to Oracle directory objects. An Oracle directory must map to an OS directory which is visible to the database server. Usually that is a directory on the server itself, although I suppose it is theoretically possible to share a PC drive across the network so the server can see it. You'd have to persuade your network admin that this is a good idea, and it's a tough sell, because it isn't one...
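To see which directory objects you can already use and which server paths they map to, you can query the data dictionary:

-- directories visible to the current user and the OS paths behind them
select directory_name, directory_path
from   all_directories;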
The older IMP and EXP utilities read from and wrote to client-side directories, so it is theoretically possible to IMP a local dump file into a remote database. But I don't think this is a practical approach: by their nature dump files tend to be big, so importing across a network is slow and prone to failure. It is a much better solution to zip the dump file, copy it to the server, and import it from there.
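Once the dump file is sitting in the OS directory behind DATA_PUMP_DIR, the import is the mirror image of the export above. The following is only a sketch, assuming you want the whole contents of the file; the log file name (emp_imp.log) is a placeholder:

declare
  dp_handle number;
begin
  -- open a table-mode import job
  dp_handle := dbms_datapump.open(
    operation => 'IMPORT',
    job_mode  => 'TABLE');

  -- the dump file previously copied onto the server
  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp.dmp',
    directory => 'DATA_PUMP_DIR');

  -- a log file for the import (name is a placeholder)
  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp_imp.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  dbms_datapump.start_job(dp_handle);
  dbms_datapump.detach(dp_handle);
end;
/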