  1. #1
    Join Date
    Jan 2004
    Posts
    145

    Unanswered: Transactions and Locking

    Hello,
    I'm fairly new to transactions in MySQL, though I'm fairly experienced with MySQL itself. The version I'm using is 5.0.45. My current project involves managing live inventory. I've done this successfully, and easily, in MSSQL. So my first attempt was as follows:

    Start Transaction
    Look up existing quantity
    If enough remains, decrement appropriate amount
    commit transaction

    It didn't work: when running simultaneous updates I over-allocate the available quantity if the timing is right.

    In MSSQL it is easy to test: start the transaction and don't commit it. Then run another query accessing the same record and it hangs until you issue a commit in the other session. Trying the same thing in MySQL results in the first query updating the record and the second query being able to access the record fine. What is puzzling is that I never commit the transaction in the first session. I read about setting autocommit=0, and that doesn't seem to work either.

    I'm guessing I have to do something with the transaction isolation level. I thought I had it with READ COMMITTED, but I did not. So my goal, using the example above, is to have two sessions running the same queries. The first session should run fine. The second session should wait on the select statement until the first session is done. It will then execute the select statement, see there is not enough quantity, and not go forward with the update.

    Hopefully this makes sense. Can anyone tell me the steps or commands I'm supposed to use to get this behavior?
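
    For concreteness, the outline above in literal SQL (table and column names here are made up, not the poster's actual schema) would be something like:
    Code:
    ```sql
    -- Naive version: this is the pattern that over-allocates, because a
    -- plain SELECT inside the transaction does not block other sessions.
    START TRANSACTION;

    SELECT qty_avail FROM inventory WHERE product_id = 15;

    -- application checks qty_avail is enough, then:
    UPDATE inventory SET qty_avail = qty_avail - 1 WHERE product_id = 15;

    COMMIT;
    ```
    Between the SELECT and the UPDATE, a second session can run the same SELECT and see the old quantity, which is the race being described.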

    Thanks
    GG
    Code:
    On Error Goto Hell
    
    Hell:
        Msgbox("An error occurred, but was sent to Hell. You may continue.")

  2. #2
    Join Date
    Jun 2007
    Location
    London
    Posts
    2,527
    Why do you need transactions to do this? Couldn't you just do something like:
    Code:
    $sql = "update your_table
            set qty = qty - $amt
            where item = $item
                  and qty >= $amt";
    
    mysql_query( $sql );
    
    if ( mysql_affected_rows() != 1 ) {
        echo "not enough";
    } else {
        echo "done";
    }
    EDIT : fixed bug I just noticed
    Last edited by mike_bike_kite; 09-24-09 at 20:22.

  3. #3
    Join Date
    Jan 2004
    Posts
    145
    I guess my first thought is that the situation I presented is pretty much why transactions exist. Your solution is interesting. Let me provide some more info though. I'm using Ruby on Rails (RoR) and a MySQL stored procedure that returns whether or not the update was successful. Also, the update can consist of multiple items. Here is a rundown of my stored procedure:

    SELECT COUNT(*) FROM cart WHERE qty_avail - qty_in_cart <= 0

    If the count > 0 then one or more of the products does not have enough quantity. If that is the case do not update inventory and return the value of 'FALSE' to RoR.

    If the count is 0 then decrement the appropriate inventory and return 'TRUE' to RoR.

    It is important to return the status back to RoR so the user gets a message.
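
    A minimal sketch of such a procedure, with assumed table, column, and procedure names (the poster's real schema and return convention may differ):
    Code:
    ```sql
    -- Hypothetical sketch: check the whole cart, then either decrement
    -- inventory or bail out, returning a status flag to the caller.
    DELIMITER //
    CREATE PROCEDURE checkout_cart(OUT ok BOOLEAN)
    BEGIN
      DECLARE short_items INT;

      START TRANSACTION;

      -- count cart lines whose product lacks sufficient inventory
      SELECT COUNT(*) INTO short_items
        FROM cart c
        JOIN inventory i ON i.product_id = c.product_id
       WHERE i.qty_avail - c.qty_in_cart < 0;

      IF short_items > 0 THEN
        SET ok = FALSE;
        ROLLBACK;
      ELSE
        UPDATE inventory i
          JOIN cart c ON c.product_id = i.product_id
           SET i.qty_avail = i.qty_avail - c.qty_in_cart;
        SET ok = TRUE;
        COMMIT;
      END IF;
    END //
    DELIMITER ;
    ```
    Note that as written the counting SELECT takes no locks, so this sketch still has the race discussed in the rest of the thread.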
    GG

  4. #4
    Join Date
    Jun 2007
    Location
    London
    Posts
    2,527
    Quote Originally Posted by gwgeller
    Also the update can consist of multiple items.
    I think that is the vital bit of info you missed out.

  5. #5
    Join Date
    Jan 2004
    Posts
    145
    OK, I think I still need to add to my original post. The problem I was having was that while session 1 was updating records, session 2 was still able to do a select and get the old values. What I want is for the select to be blocked until session 1 is done updating. I believe I have found my answer.

    Make sure you are using a transactional storage engine like InnoDB. A no-brainer, but easy to forget, especially if you mix engines like we do (InnoDB and MyISAM). I was getting frustrated until I finally realized the table I was testing on was one of the two MyISAM tables in our db.

    I read about autocommit and transaction isolation levels, but from my testing all you have to do is this:

    Session 1:

    Start transaction;

    update record 15;

    ... hold off on commit

    Session 2:

    select record 15 LOCK IN SHARE MODE;

    So session 2 hangs until session 1 issues a commit. Another thing to remember is that you need to open separate connections for this to work. I had to open two separate SQLyog instances; or you could use SQLyog and the command prompt. Anyway, this appears to be working now. Hope this helps someone.
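
    In concrete SQL (table and id are illustrative), the two sessions look like this:
    Code:
    ```sql
    -- Session 1: update inside an open transaction; hold off on COMMIT.
    START TRANSACTION;
    UPDATE inventory SET qty_avail = qty_avail - 1 WHERE product_id = 15;

    -- Session 2 (a separate connection): a locking read on the same row
    -- blocks until session 1 commits or rolls back.
    SELECT qty_avail FROM inventory
     WHERE product_id = 15
     LOCK IN SHARE MODE;
    ```
    A plain SELECT in session 2 would not block; it is the LOCK IN SHARE MODE clause that makes the read wait on session 1's row lock.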
    GG

  6. #6
    Join Date
    Jan 2004
    Posts
    145
    Back again. My test environment wasn't quite like my production environment, so my last post did not fully work. To start, MySQL takes a little digging to understand transactions. Another gotcha I encountered was that some statements cause implicit COMMITs. I was using a temporary table, and when I dropped it there was an implicit COMMIT, which was messing up my testing. To remedy this I used DROP TEMPORARY TABLE, which does not cause an implicit COMMIT. Next is the bigger issue: I actually needed to use SELECT ... FOR UPDATE for my select to wait on any uncommitted transactions. So here is my SQL outline:

    Start transaction

    create temp table

    select rows into temp table using SELECT...FOR UPDATE

    determine if adequate inventory

    if so update quantity else do not update

    drop temp table using TEMPORARY keyword

    commit transaction

    So the above will block other locking reads (SELECT ... FOR UPDATE) on inventory until the transaction is committed. Note only selects on the same product(s) are blocked.
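
    Spelled out as SQL, with illustrative table and column names (the real version lives inside the stored procedure), the outline might look like:
    Code:
    ```sql
    START TRANSACTION;

    -- FOR UPDATE row-locks the inventory rows we read, so any other
    -- session doing SELECT ... FOR UPDATE on them waits for our COMMIT.
    SELECT i.product_id
      FROM inventory i
      JOIN cart c ON c.product_id = i.product_id
       FOR UPDATE;

    -- snapshot quantities into a temp table for the adequacy check
    CREATE TEMPORARY TABLE cart_check
    SELECT i.product_id, i.qty_avail, c.qty_in_cart
      FROM inventory i
      JOIN cart c ON c.product_id = i.product_id;

    -- if every row has qty_avail >= qty_in_cart, decrement:
    UPDATE inventory i
      JOIN cart c ON c.product_id = i.product_id
       SET i.qty_avail = i.qty_avail - c.qty_in_cart;

    -- TEMPORARY keyword avoids the implicit COMMIT of a plain DROP TABLE
    DROP TEMPORARY TABLE cart_check;

    COMMIT;
    ```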
    GG

  7. #7
    Join Date
    Jun 2007
    Location
    London
    Posts
    2,527
    You'll get more responses to this if you say exactly what it is you're doing within your transaction. I'm not even convinced you need a transaction, but until we see some SQL or get a full description we can't really provide much help. Obviously, using transactions where they are not needed will just lead to performance issues.

  8. #8
    Join Date
    Jan 2004
    Posts
    145
    Mike,
    I have everything working. My last post was for reference for anyone who comes upon this in the future. Here is why transactions are needed.

    First I need to see how much inventory a product has, then determine if it is adequate, and lastly decide whether to update the inventory or not. That is three separate steps. If I don't wrap all three in a transaction, there is a possibility of inconsistent data. Session two might be retrieving inventory (step 1) while session one is deciding if the inventory is adequate (step 2). In this case session two is viewing the inventory before session one has had a chance to update it, if it needs to. So if session one does update the inventory, session two now has old inventory data.

    I guess the transaction isn't really used in the traditional way. I'm not really concerned about rolling back. The goal is for all three steps to execute for one session before the next session can begin. The only way I know to do that is to use a transaction to lock the individual records. Hopefully this clears things up.
    GG

  9. #9
    Join Date
    Sep 2009
    Posts
    64
    gwgeller ... I think you're using transactions incorrectly. When you said

    Quote Originally Posted by gwgeller
    Start Transaction
    Look up existing quantity
    If enough remains, decrement appropriate amount
    commit transaction
    a transaction alone won't do you any good. Consider this scenario:

    Code:
    CREATE TABLE `item` (
      `id` int(10) unsigned NOT NULL AUTO_INCREMENT,
      `item_name` varchar(255) NOT NULL,
      `item_quantity` int(11) NOT NULL,
      PRIMARY KEY (`id`)
    ) ENGINE=InnoDB;
    INSERT INTO item VALUES (NULL, 'Oranges', 100);
    INSERT INTO item VALUES (NULL, 'Grapes',   50);
    And the following SQL statement:
    Code:
    START TRANSACTION;
    SELECT * FROM item;
    UPDATE item SET item_quantity = 25 WHERE id = 1;
    COMMIT;
    Now have 2 connections going to MySQL. Run the first 3 statements on one connection (everything except the COMMIT). Then on the 2nd connection, run the SELECT statement. You will see that Oranges still has quantity 100. Once you issue the COMMIT on the first connection, you'll see Oranges with quantity 25 on the 2nd connection.

    Similarly, suppose you start a transaction, look up the quantity, then issue an update statement. If, right before you commit, another connection looks up the quantity, that connection will see the old data. This is obviously not what you want.

    What you need in this case is locking. Either lock the table for read and write in one thread, use an advisory lock like GET_LOCK(), or rely on warnings if you don't want to use transactions.

    1. Lock table:
    Code:
    LOCK TABLES item WRITE;
    SELECT @qty := item_quantity - 1 FROM item WHERE id = 1;
    UPDATE item SET item_quantity = @qty WHERE id = 1 AND @qty >= 0;
    UNLOCK TABLES;
    You can detect for success or not by checking @qty variable to be greater than or equal to 0.

    2. GET_LOCK()
    Code:
    SELECT @my_lock := GET_LOCK('my_lock', 10);
    SELECT @qty := item_quantity - 1 FROM item WHERE id = 1;
    UPDATE item SET item_quantity = @qty WHERE id = 1 AND @my_lock = 1;
    SELECT RELEASE_LOCK('my_lock');
    You can detect whether the lock was acquired by checking @my_lock (1 means acquired); if it is 0 or NULL, the @my_lock = 1 condition prevented the UPDATE from doing anything.

    3. Rely on warnings
    This method relies on warnings. The idea is to generate a warning if item_quantity would go negative. So alter the item_quantity column to be UNSIGNED. Then, when you have 2 connections, one brings item_quantity down to zero and the other tries to push it negative, and you'll get warning code 1264: Out of range value for column 'item_quantity' at row 1.
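
    A sketch of this approach, assuming the item table from above and a non-strict SQL mode (in strict mode the out-of-range value raises an error instead of a warning):
    Code:
    ```sql
    -- Make the quantity column unsigned so it cannot store negatives
    ALTER TABLE item MODIFY item_quantity INT UNSIGNED NOT NULL;

    -- Compute the new quantity as a signed value, then try to store it.
    -- A negative result is clamped to 0 with warning 1264, which you
    -- can inspect after the update.
    SELECT @qty := CAST(item_quantity AS SIGNED) - 1 FROM item WHERE id = 1;
    UPDATE item SET item_quantity = @qty WHERE id = 1;
    SHOW WARNINGS;
    ```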

    There may be better solutions than the 3 above (and it's quite possible the 3 above need improvements as well), but I hope this gives you a starting point for solving your issue.

  10. #10
    Join Date
    Jan 2004
    Posts
    145
    mn, thanks for the reply, but I've already solved my problem. In my previous posts there are two critical steps to note. The first is to use a transaction to lock the needed records; a table lock is not required. The second is to use SELECT ... FOR UPDATE when getting the current inventory, which will wait for any conflicting transactions to be committed before it selects any data. This takes care of your initial concern. If you do a regular SELECT, it will not wait for the transaction and may get old data. Another important element is that this has to account for multiple items.

    So to summarize, everything is working how I want it to. The transaction is used as a locking mechanism, but you have to use SELECT ... FOR UPDATE; otherwise a plain SELECT simply reads around the row locks the transaction holds.

    This and my last two posts are just for anyone who stumbles upon this thread in the future. It sucks when you find a thread with your problem and the author's last post is "I fixed the problem" and that is it.
    GG
