
How to Upload .csv file to a table through APEX?

Discussion in 'Oracle Application Express (APEX)' started by Vikram S, May 5, 2014.

  1. Vikram S (Active Member)

    Hi guys,

    I have to load data from a csv file into a table through Oracle APEX.

    I created a file browse page item to upload the .csv file and then an insert process with the code shown below.



    DECLARE
    v_blob_data BLOB;
    v_blob_len NUMBER;
    v_position NUMBER;
    v_raw_chunk RAW(10000);
    v_char CHAR(1);
    c_chunk_len number := 1;
    v_line VARCHAR2 (32767) := NULL;
    v_data_array wwv_flow_global.vc_arr2;
    v_rows number;
    v_sr_no number := 1;

    BEGIN

    delete from up_csv_v;

    -- Read data from wwv_flow_files
    select blob_content into v_blob_data
    from wwv_flow_files
    where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER)
    and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER);

    v_blob_len := dbms_lob.getlength(v_blob_data);
    v_position := 1;

    -- Read and convert binary to char
    WHILE ( v_position <= v_blob_len ) LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    -- Convert comma to : to use wwv_flow_utilities
    v_line := REPLACE (v_line, ',', ':');
    -- Convert each column separated by : into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line);
    -- Insert data into target table
    EXECUTE IMMEDIATE 'insert into up_csv_v (sku, productname, type, scheme, forsale, points, promvaliditystart, promvalidityend, eurvmwarelistprice,eurvmwarenetprice, gbpvmwarelistprice
    ,gbpvmwarenetprice,hassupportchildoptions,advsupport,validmediaproductcode,minqty,maxqty,mustmatchparentqty,requirechildproduct,requiresskudiscountswap,corediscount
    ,USERLEVEL,ispromotion,promotiontype,popupmessagetext,popupevidencetext,popupevidencefieldname,combinationoptions,manualcheckreason
    ,informationmessageeng,informationmessagefre,informationmessageger,informationmessageita,informationmessagespa,advadditional)
    values (:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11,:12,:13,:14,:15,:16,:16,:17,:18,:19,:20,:21,:22,:23,:24,:25,:26,:27,:28,:29,:30,:31,:32,:33,:34)'
    USING
    v_sr_no,
    v_data_array(1),
    to_number(v_data_array(2)),
    v_data_array(3),
    v_data_array(4),
    v_data_array(5),
    v_data_array(6),
    to_number(v_data_array(7)),
    to_number(v_data_array(8)),
    to_number(v_data_array(9)),
    v_data_array(10);
    -- Clear out
    v_line := NULL;
    v_sr_no := v_sr_no + 1;
    END IF;

    END LOOP;
    END;


    But I am getting an error:

    1 error has occurred
    ORA-06502: PL/SQL: numeric or value error: character to number conversion error

    Please help me solve this issue.


    Thanks & Regards,
    Vikram S
     
  2. krasnoslobodtsev_si (Forum Genius, Russian Federation)

    Hi.

    Where do you initially assign a value to the variable V_LINE? Pay attention to that part of your code.

    Handle the exception and print the line number at which the error occurs.

    Check which character is used as the decimal separator in the numeric values of your file.

    Note: please provide an example of your data file.
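
    For example (a minimal standalone sketch, not your procedure; '12,50' is a made-up field value that uses a comma as the decimal separator), this is how you could trap the conversion error and report the offending line:

    Code (SQL):

    DECLARE
        v_num     NUMBER;
        v_line_no NUMBER := 7;   -- hypothetical counter of the CSV line being processed
    BEGIN
        -- implicit character-to-number conversion fails here and raises ORA-06502
        v_num := '12,50';
    EXCEPTION
        WHEN VALUE_ERROR THEN
            DBMS_OUTPUT.PUT_LINE('Conversion failed at CSV line ' || v_line_no || ': ' || SQLERRM);
    END;
    /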
     
  3. Vikram S (Active Member)

    Hi,

    I did it in a different way and it's working.

    The requirement was to upload a .csv file and insert its data into a table in the same step through APEX.

    The procedure I wrote to insert the data from the uploaded .csv file into the table:

    CREATE OR REPLACE PROCEDURE LOAD_LEAD_LIST(P_FILENAME VARCHAR2)
    AS
        LC_VAR VARCHAR2(4000);
        CURSOR CSV_CONTENT IS
            SELECT * FROM TABLE(FILE_LOAD_PKG.F_CSV_LOAD_DATA(P_FILENAME)) WHERE ROWNAME = 'DATAROW';
    BEGIN
        FOR C_REC IN CSV_CONTENT
        LOOP
            INSERT INTO LEADLIST_V (SKU, PRODUCTNAME, TYPE, SCHEME, FORSALE, POINTS, PROMVALIDITYSTART, PROMVALIDITYEND,
                EURVMWARELISTPRICE, EURVMWARENETPRICE, GBPVMWARELISTPRICE, GBPVMWARENETPRICE, HASSUPPORTCHILDOPTIONS,
                ADVSUPPORT, VALIDMEDIAPRODUCTCODE, MINQTY, MAXQTY, MUSTMATCHPARENTQTY, REQUIRECHILDPRODUCT,
                REQUIRESSKUDISCOUNTSWAP, COREDISCOUNT, USERLEVEL, ISPROMOTION, PROMOTIONTYPE, POPUPMESSAGETEXT,
                POPUPEVIDENCETEXT, POPUPEVIDENCEFIELDNAME, COMBINATIONOPTIONS, MANUALCHECKREASON, INFORMATIONMESSAGEENG,
                INFORMATIONMESSAGEFRE, INFORMATIONMESSAGEGER, INFORMATIONMESSAGEITA, INFORMATIONMESSAGESPA, ADVADDITIONAL,
                UPTIME, FINISHEDTIME)
            VALUES (C_REC.F1, C_REC.F2, C_REC.F3, C_REC.F4, C_REC.F5, C_REC.F6, C_REC.F7, C_REC.F8, C_REC.F9, C_REC.F10,
                C_REC.F11, C_REC.F12, C_REC.F13, C_REC.F14, C_REC.F15, C_REC.F16, C_REC.F17, C_REC.F18, C_REC.F19, C_REC.F20,
                C_REC.F21, C_REC.F22, C_REC.F23, C_REC.F24, C_REC.F25, C_REC.F26, C_REC.F27, C_REC.F28, C_REC.F29, C_REC.F30,
                C_REC.F31, C_REC.F32, C_REC.F33, C_REC.F34, C_REC.F35, SYSTIMESTAMP, SYSTIMESTAMP);
        END LOOP;
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            LC_VAR := SQLERRM;
            INSERT INTO TEST_V(NAME) VALUES(LC_VAR);
    END LOAD_LEAD_LIST;
    /


    Since I used a row-by-row cursor here, the insert takes longer and longer as the number of rows in the csv file grows. So I need to tune it for performance, which is where I want to use BULK COLLECT and FORALL.
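
    Something along these lines is what I have in mind (just an untested sketch with a hypothetical LOAD_LEAD_LIST_BULK; only the first two columns are shown, the remaining F3..F35 would follow the same pattern):

    CREATE OR REPLACE PROCEDURE LOAD_LEAD_LIST_BULK (P_FILENAME VARCHAR2)
    AS
        CURSOR csv_content IS
            SELECT F1, F2
            FROM   TABLE(FILE_LOAD_PKG.F_CSV_LOAD_DATA(P_FILENAME))
            WHERE  ROWNAME = 'DATAROW';

        TYPE t_vc IS TABLE OF VARCHAR2(4000);
        l_sku  t_vc;
        l_name t_vc;
    BEGIN
        OPEN csv_content;
        LOOP
            -- fetch the parsed rows in batches instead of one row per loop iteration
            FETCH csv_content BULK COLLECT INTO l_sku, l_name LIMIT 1000;
            EXIT WHEN l_sku.COUNT = 0;

            -- one bulk-bound INSERT per batch instead of one INSERT per row
            FORALL i IN 1 .. l_sku.COUNT
                INSERT INTO LEADLIST_V (SKU, PRODUCTNAME, UPTIME, FINISHEDTIME)
                VALUES (l_sku(i), l_name(i), SYSTIMESTAMP, SYSTIMESTAMP);
        END LOOP;
        CLOSE csv_content;
        COMMIT;
    END LOAD_LEAD_LIST_BULK;
    /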

    Please guide me regarding this.


    Thanks & Regards,
    Vikram S
     
  4. krasnoslobodtsev_si (Forum Genius, Russian Federation)

    Hi.
    It is more correct to control the transaction on the calling side...

    A simple candidate solution:

    Code (SQL):


    CREATE OR REPLACE PROCEDURE LOAD_LEAD_LIST(P_FILENAME VARCHAR2, P_ERR_MSG OUT VARCHAR2)
    AS
    BEGIN
        INSERT INTO LEADLIST_V (SKU, PRODUCTNAME, TYPE, SCHEME, FORSALE, POINTS, PROMVALIDITYSTART, PROMVALIDITYEND,
            EURVMWARELISTPRICE, EURVMWARENETPRICE, GBPVMWARELISTPRICE, GBPVMWARENETPRICE, HASSUPPORTCHILDOPTIONS,
            ADVSUPPORT, VALIDMEDIAPRODUCTCODE, MINQTY, MAXQTY, MUSTMATCHPARENTQTY, REQUIRECHILDPRODUCT,
            REQUIRESSKUDISCOUNTSWAP, COREDISCOUNT, USERLEVEL, ISPROMOTION, PROMOTIONTYPE, POPUPMESSAGETEXT,
            POPUPEVIDENCETEXT, POPUPEVIDENCEFIELDNAME, COMBINATIONOPTIONS, MANUALCHECKREASON, INFORMATIONMESSAGEENG,
            INFORMATIONMESSAGEFRE, INFORMATIONMESSAGEGER, INFORMATIONMESSAGEITA, INFORMATIONMESSAGESPA, ADVADDITIONAL,
            UPTIME, FINISHEDTIME)
        SELECT
            C_REC.F1, C_REC.F2, C_REC.F3, C_REC.F4, C_REC.F5, C_REC.F6, C_REC.F7, C_REC.F8, C_REC.F9, C_REC.F10,
            C_REC.F11, C_REC.F12, C_REC.F13, C_REC.F14, C_REC.F15, C_REC.F16, C_REC.F17, C_REC.F18, C_REC.F19, C_REC.F20,
            C_REC.F21, C_REC.F22, C_REC.F23, C_REC.F24, C_REC.F25, C_REC.F26, C_REC.F27, C_REC.F28, C_REC.F29, C_REC.F30,
            C_REC.F31, C_REC.F32, C_REC.F33, C_REC.F34, C_REC.F35, SYSTIMESTAMP, SYSTIMESTAMP
        FROM
            TABLE(FILE_LOAD_PKG.F_CSV_LOAD_DATA(P_FILENAME)) C_REC
        WHERE
            ROWNAME = 'DATAROW';
    EXCEPTION
        WHEN OTHERS THEN P_ERR_MSG := SQLERRM;
    END LOAD_LEAD_LIST;
    /
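
    The calling side (for example, your APEX process) then decides whether to commit, roughly like this (a sketch; 'leads.csv' is only a placeholder file name):

    Code (SQL):

    DECLARE
        l_err VARCHAR2(4000);
    BEGIN
        LOAD_LEAD_LIST('leads.csv', l_err);
        IF l_err IS NULL THEN
            COMMIT;
        ELSE
            ROLLBACK;
            DBMS_OUTPUT.PUT_LINE('Load failed: ' || l_err);
        END IF;
    END;
    /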

     

    or

    Use DML error logging.
    Simply check whether any errors were logged for the processed file and then COMMIT or ROLLBACK accordingly.

    Code (SQL):


    BEGIN
      DBMS_ERRLOG.create_error_log (dml_table_name => 'LEADLIST_V');
    END;
    /
    CREATE OR REPLACE PROCEDURE LOAD_LEAD_LIST(P_FILENAME VARCHAR2)
    AS
    BEGIN
        INSERT INTO LEADLIST_V (SKU, PRODUCTNAME, TYPE, SCHEME, FORSALE, POINTS, PROMVALIDITYSTART, PROMVALIDITYEND,
            EURVMWARELISTPRICE, EURVMWARENETPRICE, GBPVMWARELISTPRICE, GBPVMWARENETPRICE, HASSUPPORTCHILDOPTIONS,
            ADVSUPPORT, VALIDMEDIAPRODUCTCODE, MINQTY, MAXQTY, MUSTMATCHPARENTQTY, REQUIRECHILDPRODUCT,
            REQUIRESSKUDISCOUNTSWAP, COREDISCOUNT, USERLEVEL, ISPROMOTION, PROMOTIONTYPE, POPUPMESSAGETEXT,
            POPUPEVIDENCETEXT, POPUPEVIDENCEFIELDNAME, COMBINATIONOPTIONS, MANUALCHECKREASON, INFORMATIONMESSAGEENG,
            INFORMATIONMESSAGEFRE, INFORMATIONMESSAGEGER, INFORMATIONMESSAGEITA, INFORMATIONMESSAGESPA, ADVADDITIONAL,
            UPTIME, FINISHEDTIME)
        SELECT
            C_REC.F1, C_REC.F2, C_REC.F3, C_REC.F4, C_REC.F5, C_REC.F6, C_REC.F7, C_REC.F8, C_REC.F9, C_REC.F10,
            C_REC.F11, C_REC.F12, C_REC.F13, C_REC.F14, C_REC.F15, C_REC.F16, C_REC.F17, C_REC.F18, C_REC.F19, C_REC.F20,
            C_REC.F21, C_REC.F22, C_REC.F23, C_REC.F24, C_REC.F25, C_REC.F26, C_REC.F27, C_REC.F28, C_REC.F29, C_REC.F30,
            C_REC.F31, C_REC.F32, C_REC.F33, C_REC.F34, C_REC.F35, SYSTIMESTAMP, SYSTIMESTAMP
        FROM
            TABLE(FILE_LOAD_PKG.F_CSV_LOAD_DATA(P_FILENAME)) C_REC
        WHERE
            ROWNAME = 'DATAROW'
        LOG ERRORS INTO err$_LEADLIST_V (P_FILENAME) REJECT LIMIT UNLIMITED;

    END LOAD_LEAD_LIST;
    /
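
    After the load you would, for example, check whether any rows were rejected for the file and then commit or roll back (a sketch; 'leads.csv' stands for the value passed in P_FILENAME, which LOG ERRORS stores in the ORA_ERR_TAG$ column of the error table):

    Code (SQL):

    DECLARE
        l_errors NUMBER;
    BEGIN
        LOAD_LEAD_LIST('leads.csv');
        -- rows rejected by the INSERT are logged with the file name as the tag
        SELECT COUNT(*) INTO l_errors
        FROM   err$_LEADLIST_V
        WHERE  ORA_ERR_TAG$ = 'leads.csv';
        IF l_errors = 0 THEN
            COMMIT;
        ELSE
            ROLLBACK;
            DBMS_OUTPUT.PUT_LINE(l_errors || ' rows were rejected - see err$_LEADLIST_V');
        END IF;
    END;
    /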

     
     
  5. Vikram S (Active Member)

    Hi,

    Thanks for your reply.

    I need to tune the procedure I sent earlier for performance.

    E.g. 1,000 records load in about 10 seconds, 10,000 in a minute, 20,000 in 6 minutes, and 50,000 take some 40 minutes.

    The code you sent also takes about the same time.

    Thanks & Regards,
    Vikram S
     
  6. krasnoslobodtsev_si (Forum Genius, Russian Federation)

    Hi.
    Where did these response-time requirements come from?
    Can you provide the definition of the table LEADLIST_V and of the FILE_LOAD_PKG.F_CSV_LOAD_DATA function in the package that does the csv file processing?
    Can you use DBMS_PROFILER to profile your function FILE_LOAD_PKG.F_CSV_LOAD_DATA?
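
    For example (a sketch; it assumes the profiler tables have already been created with proftab.sql, and 'leads.csv' is only a placeholder file name):

    Code (SQL):

    DECLARE
        l_run BINARY_INTEGER;
        l_cnt NUMBER;
    BEGIN
        DBMS_PROFILER.START_PROFILER(run_comment => 'csv parse test', run_number => l_run);
        -- exercise only the csv parsing function so its time is isolated
        SELECT COUNT(*) INTO l_cnt
        FROM   TABLE(FILE_LOAD_PKG.F_CSV_LOAD_DATA('leads.csv'));
        DBMS_PROFILER.STOP_PROFILER;
        DBMS_OUTPUT.PUT_LINE('Profiler run ' || l_run || ', rows parsed: ' || l_cnt);
    END;
    /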

    Can you also answer the following questions:
    1) How long does it take to process a file containing 10 rows?
    2) How long does it take to process a file containing 100 rows?
    3) How long does it take to process a file containing 1000 rows?
     