I'm getting back to doing some PL/SQL coding after a break of several years, so I'm trying to start with something simple and build on that.
I've got this code, which should read values in from one file and write them out to another. What's happening is that I don't get all the input fields in the output file.
More precisely: if the input file is exactly 2048 bytes, I do get all the input fields in the output file. If there are fewer than 2048 bytes on input I get nothing at all, and if there are more than 2048 I begin to lose some fields.
There must be some environment setting affecting this, but I don't know what. (I added "set serveroutput" hoping for the best, but it made no difference.)
(It's Oracle9i on Solaris 9.)
Can anyone tell me where I'm going wrong? Thanks, Chris
Code:
create or replace procedure change_details(
    dirname     in varchar2,
    id_filename in varchar2)
is
    ids_file   utl_file.file_type;
    spool_file utl_file.file_type;
    id_var     varchar2(400);
begin
    ids_file   := utl_file.fopen(dirname, id_filename, 'R');
    spool_file := utl_file.fopen(dirname, 'update_names.lst', 'W');
    loop
        begin
            -- copy the input to the output one line at a time,
            -- stopping when get_line runs off the end of the file
            utl_file.get_line(ids_file, id_var);
            utl_file.put_line(spool_file, id_var);
        exception
            when no_data_found then
                exit;
        end;
    end loop;
    utl_file.fclose(ids_file);
end;
/
spool person_ids
set serveroutput on size 1000000;
begin
    change_details('/export/home/ora817/scripts/anonymise/anon_cf', 'ids.lst');
end;
/
spool off
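One thing I spotted while re-reading my own code: I fclose the input file but never the output file. I don't know whether UTL_FILE buffers writes until the handle is flushed or closed, but in case it's relevant, this is the sort of minimal test I was planning to try next (the filename test_close.lst is just a throwaway name for the test):
Code:
declare
    out_file utl_file.file_type;
begin
    out_file := utl_file.fopen('/export/home/ora817/scripts/anonymise/anon_cf',
                               'test_close.lst', 'W');
    utl_file.put_line(out_file, 'first line');
    utl_file.put_line(out_file, 'second line');
    utl_file.fflush(out_file);   -- push any buffered output to disk
    utl_file.fclose(out_file);   -- fclose should also flush before closing
end;
/
If that test file comes out complete, I suppose it would point at the missing close/flush of spool_file rather than an environment setting, but I haven't confirmed that yet.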