
SQL tuning

Discussion in 'SQL PL/SQL' started by 13478, Apr 22, 2016.

  1. 13478

    13478 Active Member

    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    80
    Location:
    Louisiana
    Hello,

    I have a DML statement:

    Code (SQL):

    UPDATE /*+ PARALLEL */ MF_HELLO
    SET TYP_OF_AGCY_TRFR = (SELECT /*+ PARALLEL */ TYP_OF_AGCY_TRFR FROM MF_WORLD LN WHERE LN.UIDY = MF_HELLO.LINE_ID)
    WHERE EXISTS (SELECT /*+ PARALLEL */ 1 FROM MF_WORLD LN WHERE LN.UIDY = MF_HELLO.LINE_ID);

    25276916 rows updated.

    Elapsed: 04:28:56.48

    If I use the approach of dropping the indexes/constraints on table MF_HELLO, running the update, then re-adding the indexes/constraints, it is faster.

    Is there any other way to make it even faster and simpler?

    Thank you very much in advance!!!
     
  2. krasnoslobodtsev_si

    krasnoslobodtsev_si Forum Genius

    Messages:
    626
    Likes Received:
    129
    Trophy Points:
    810
    Location:
    Russian Federation
    Hi.
    It is practically always possible to either refactor or optimize.

    It is better to run your query serially, as a single statement.
    For example, with MERGE:
    Code (SQL):

    MERGE INTO mf_hello n
    USING mf_world o
    ON (o.uidy = n.line_id)
    WHEN MATCHED THEN UPDATE SET n.TYP_OF_AGCY_TRFR = o.TYP_OF_AGCY_TRFR;
    If there are constraints or indexes, it can be better to use DBMS_PARALLEL_EXECUTE.

    Additional link : DBMS_PARALLEL_EXECUTE
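    One point worth knowing about the PARALLEL hint here: in Oracle, the hint alone parallelizes only the query side of a DML statement; the modification side stays serial unless parallel DML is enabled for the session. A hedged sketch (the degree of 8 is an arbitrary choice, tune it for your system):

    ```sql
    -- Parallel DML is disabled by default per session; enable it first,
    -- otherwise the MERGE's update portion runs serially even with a hint.
    ALTER SESSION ENABLE PARALLEL DML;

    MERGE /*+ PARALLEL(n 8) PARALLEL(o 8) */ INTO mf_hello n
    USING mf_world o
    ON (o.uidy = n.line_id)
    WHEN MATCHED THEN UPDATE SET n.typ_of_agcy_trfr = o.typ_of_agcy_trfr;

    -- A session that ran parallel DML must commit (or roll back) before
    -- it can query the modified table again.
    COMMIT;
    ```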
     
    Last edited: Apr 23, 2016
    13478 likes this.
  3. 13478

    13478 Active Member

    Hello Krasnoslobodtsev:

    Your "merge" idea works, it is great, thank you very much!!!
    I am running it to see how long it will take.

    Also, I am exploring the use of DBMS_PARALLEL_EXECUTE:


    DECLARE
    l_chunk_sql VARCHAR2(1000);
    l_sql_stmt VARCHAR2(1000);
    l_try NUMBER;
    l_status NUMBER;

    BEGIN

    -- Create the TASK
    DBMS_PARALLEL_EXECUTE.CREATE_TASK ('mytask');

    -- Chunk the table
    l_chunk_sql := 'SELECT 1 FROM MF_WORLD LN, MF_HELLO WHERE LN.UIDY = MF_HELLO.LINE_ID';
    DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_SQL('mytask', l_chunk_sql, false);

    -- Execute the DML in parallel.
    -- The WHERE clause contains a condition on the chunk column.
    l_sql_stmt := 'update /*+ ROWID (MF_HELLO) */ MF_HELLO
    SET TYP_OF_AGCY_TRFR = (SELECT TYP_OF_AGCY_TRFR FROM MF_WORLD LN WHERE LN.UIDY = MF_HELLO.LINE_ID)
    WHERE EXISTS between :start_id and :end_id';
    DBMS_PARALLEL_EXECUTE.RUN_TASK('mytask', l_sql_stmt, DBMS_SQL.NATIVE,
    parallel_level => 10);

    -- If there is error, RESUME it for at most 2 times.
    L_try := 0;
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    WHILE(l_try < 2 and L_status != DBMS_PARALLEL_EXECUTE.FINISHED)
    LOOP
    L_try := l_try + 1;
    DBMS_PARALLEL_EXECUTE.RESUME_TASK('mytask');
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    END LOOP;

    -- Done with processing; drop the task
    DBMS_PARALLEL_EXECUTE.DROP_TASK('mytask');

    end;
    /

    I get this error:

    ERROR at line 1:
    ORA-01007: variable not in select list
    ORA-06512: at "SYS.DBMS_PARALLEL_EXECUTE_INTERNAL", line 634
    ORA-06512: at "SYS.DBMS_PARALLEL_EXECUTE", line 163
    ORA-06512: at line 14

    It looks like the two parts below have problems:

    1. l_chunk_sql := 'SELECT 1 FROM

    2. WHERE EXISTS between :start_id and :end_id';


    What do you think?

    Thank you so much! have a good weekend!!
     
  4. krasnoslobodtsev_si

    krasnoslobodtsev_si Forum Genius

    Yes, the failure is caused by the two parts you identified, points 1 and 2.

    1)
    l_chunk_sql := 'SELECT MH.LINE_ID, MH.LINE_ID FROM MF_WORLD MW, MF_HELLO MH WHERE MW.UIDY = MH.LINE_ID';

    2) Replace it with "WHERE MF_HELLO.LINE_ID between :start_id and :end_id".

    3) You are generating chunks with a query of type "Chunk by User-Provided SQL", so the /*+ ROWID (MF_HELLO) */ hint must be removed from l_sql_stmt.


    For studying and testing:

    Example One:
    Code (SQL):
    DECLARE
    l_chunk_sql VARCHAR2(1000);
    l_sql_stmt VARCHAR2(1000);
    l_try NUMBER;
    l_status NUMBER;

    BEGIN

    -- Create the TASK
    DBMS_PARALLEL_EXECUTE.CREATE_TASK ('mytask');

    -- Chunk the table
    l_chunk_sql := 'SELECT MH.LINE_ID,MH.LINE_ID FROM MF_WORLD MW, MF_HELLO MH WHERE MW.UIDY = MH.LINE_ID';
    DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_SQL('mytask', l_chunk_sql, FALSE);

    -- Execute the DML in parallel.
    -- The WHERE clause contains a condition on LINE_ID, which is the
    -- chunk column; rows are grouped by LINE_ID.
    l_sql_stmt := 'update MF_HELLO MH
    SET MH.TYP_OF_AGCY_TRFR = (SELECT MW.TYP_OF_AGCY_TRFR FROM MF_WORLD MW WHERE MW.UIDY = MH.LINE_ID)
    WHERE MH.LINE_ID between :start_id and :end_id'
    ;
    DBMS_PARALLEL_EXECUTE.RUN_TASK('mytask', l_sql_stmt, DBMS_SQL.NATIVE,
    parallel_level => 10);

    -- If there is error, RESUME it for at most 2 times.
    L_try := 0;
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    WHILE(l_try < 2 AND L_status != DBMS_PARALLEL_EXECUTE.FINISHED)
    LOOP
    L_try := l_try + 1;
    DBMS_PARALLEL_EXECUTE.RESUME_TASK('mytask');
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    END LOOP;

    -- Done with processing; drop the task
    DBMS_PARALLEL_EXECUTE.DROP_TASK('mytask');

    END;
    /
    I believe it is better to generate the chunk ranges from a numeric column.
    Example two :
    Code (SQL):
    DECLARE
    l_chunk_sql VARCHAR2(1000);
    l_sql_stmt VARCHAR2(1000);
    l_try NUMBER;
    l_status NUMBER;

    BEGIN

    -- Create the TASK
    DBMS_PARALLEL_EXECUTE.CREATE_TASK ('mytask');

    -- Chunk the table on COL
    DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_NUMBER_COL
    (
       task_name    => 'mytask',
       table_owner  => USER,
       TABLE_NAME   => 'MF_HELLO',
       table_column  => 'LINE_ID',
       chunk_size    => 10000
    );
    -- Execute the DML in parallel.
    -- The :start_id/:end_id binds are the chunk ranges generated on the
    -- LINE_ID column.
    l_sql_stmt := '
    update (select n.line_id ,n.TYP_OF_AGCY_TRFR NEW_VAL, o.TYP_OF_AGCY_TRFR OLD_VAL
    from mf_hello n,mf_world o where
    o.uidy between :start_id and :end_id and n.line_id = o.uidy
    )  set new_val = old_val'
    ;
    DBMS_PARALLEL_EXECUTE.RUN_TASK('mytask', l_sql_stmt, DBMS_SQL.NATIVE,
    parallel_level => 10);

    -- If there is error, RESUME it for at most 2 times.
    L_try := 0;
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    WHILE(l_try < 2 AND L_status != DBMS_PARALLEL_EXECUTE.FINISHED)
    LOOP
    L_try := l_try + 1;
    DBMS_PARALLEL_EXECUTE.RESUME_TASK('mytask');
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    END LOOP;

    -- Done with processing; drop the task
    DBMS_PARALLEL_EXECUTE.DROP_TASK('mytask');

    END;
    /
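    One caveat on Example two, stated as an assumption to verify: the inline-view UPDATE form only works if mf_hello is key-preserved in the join, i.e. MF_WORLD.UIDY must be covered by a primary key or unique constraint; otherwise Oracle raises ORA-01779 ("cannot modify a column which maps to a non key-preserved table"). A quick dictionary check:

    ```sql
    -- Check whether UIDY is protected by a primary key or unique
    -- constraint; without one, the updatable-join-view UPDATE in
    -- Example two fails with ORA-01779.
    SELECT c.constraint_name, c.constraint_type
    FROM   user_constraints   c
    JOIN   user_cons_columns  cc
           ON cc.constraint_name = c.constraint_name
    WHERE  c.table_name       = 'MF_WORLD'
    AND    cc.column_name     = 'UIDY'
    AND    c.constraint_type IN ('P', 'U');
    ```

    If this returns no rows, a unique index or constraint on MF_WORLD.UIDY is needed before the inline-view form can be used.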
     
    Last edited: Apr 24, 2016
  5. 13478

    13478 Active Member

    Hello Krasnoslobodtsev_si,

    Thank you very much.

    I tried this,

    DECLARE
    l_chunk_sql VARCHAR2(1000);
    l_sql_stmt VARCHAR2(1000);
    l_try NUMBER;
    l_status NUMBER;

    BEGIN

    -- Create the TASK
    DBMS_PARALLEL_EXECUTE.CREATE_TASK ('mytask');

    -- Chunk the table on COL
    DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_NUMBER_COL
    (
    task_name => 'mytask',
    table_owner => USER,
    TABLE_NAME => 'MF_HELLO',
    table_column => 'LINE_ID',
    chunk_size => 10000
    );
    -- Execute the DML in parallel.
    -- The WHERE clause contains a condition on the chunk column.
    l_sql_stmt := '
    update (select n.line_id ,n.TYP_OF_AGCY_TRFR NEW_VAL, o.TYP_OF_AGCY_TRFR OLD_VAL
    from MF_HELLO n,MF_WORLD o where
    o.uidy between :start_id and :end_id and n.line_id = o.uidy
    ) set new_val = old_val';
    DBMS_PARALLEL_EXECUTE.RUN_TASK('mytask', l_sql_stmt, DBMS_SQL.NATIVE,
    parallel_level => 10);

    -- If there is error, RESUME it for at most 2 times.
    L_try := 0;
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    WHILE(l_try < 2 AND L_status != DBMS_PARALLEL_EXECUTE.FINISHED)
    LOOP
    L_try := l_try + 1;
    DBMS_PARALLEL_EXECUTE.RESUME_TASK('mytask');
    L_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('mytask');
    END LOOP;

    -- Done with processing; drop the task
    DBMS_PARALLEL_EXECUTE.DROP_TASK('mytask');

    END;
    /



    Error report:
    ORA-01722: invalid number
    ORA-06512: at "SYS.DBMS_PARALLEL_EXECUTE_INTERNAL", line 534
    ORA-06512: at "SYS.DBMS_PARALLEL_EXECUTE", line 148
    ORA-06512: at line 13
    01722. 00000 - "invalid number"
    *Cause:
    *Action:


    If I run the chunk-by-SQL version, I get the same error...

    thinking....

    Thank you very much.
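    One possible cause of the ORA-01722 during chunk creation, offered as a guess to verify: CREATE_CHUNKS_BY_NUMBER_COL (and numeric :start_id/:end_id ranges) assume the chunk column is a NUMBER, so if LINE_ID is actually a VARCHAR2 or RAW surrogate key, the implicit conversion fails. A diagnostic sketch (assuming the table is in your own schema):

    ```sql
    -- Check the datatype of the chunk column;
    -- CREATE_CHUNKS_BY_NUMBER_COL requires a NUMBER column.
    SELECT column_name, data_type
    FROM   user_tab_columns
    WHERE  table_name  = 'MF_HELLO'
    AND    column_name = 'LINE_ID';

    -- If LINE_ID is not numeric, chunking by ROWID avoids the problem
    -- entirely; the task then binds :start_id/:end_id as ROWIDs, and the
    -- DML must filter on "rowid BETWEEN :start_id AND :end_id".
    -- (chunk_size is rows per chunk here; 10000 is an arbitrary choice.)
    -- DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_ROWID(
    --   task_name   => 'mytask',
    --   table_owner => USER,
    --   table_name  => 'MF_HELLO',
    --   by_row      => TRUE,
    --   chunk_size  => 10000);
    ```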
     
  6. krasnoslobodtsev_si

    krasnoslobodtsev_si Forum Genius

    Does the error also occur if you generate the chunks with CREATE_CHUNKS_BY_SQL?
    Can you provide the DDL for tables MF_HELLO and MF_WORLD?
     
  7. 13478

    13478 Active Member


    Hello Krasnoslobodtsev_si:

    I was working on something else today, sorry for the late reply.

    Meanwhile, using the MERGE approach, one DML's running time dropped from 31 minutes to 1 minute, which is great.
    But the other DML (a totally similar statement, just a different tablexxx in "MERGE INTO MF_WORLD USING tablexxx") has now been running for more than 2 hours.

    I may get back to this later (to explore using DBMS_PARALLEL_EXECUTE).

    Thank you so much!!!