  1. #1
    Join Date
    Oct 2013
    Posts
    163

    Unanswered: Fundamentals of Database Design

    Dear Seniors,

    I created a document control database with the help of this forum for my project, and it works fine. However, when I execute one particular query it takes too much time; admittedly the result contains more than 10,000 rows. I searched the Internet for this and found that it can be solved when the database is properly designed. Accordingly I reorganized the data and split it into multiple tables. In this regard I would like to ask the following questions.

    1) I have a table with 60 fields and 10,000 rows of data, and I use it for all my queries. If I keep all the data in a single table, is that why the query runs slowly, or is it due to some other reason? Is it advisable to keep all the data in a single table?

    2) Following the basics of database design, I split the one table into multiple tables with a common unique key, and created a form to update all the tables at one time. In my database, Document ID is the unique key and it is present in all the tables. So when I create a form from multiple tables and update it, I get the error message that the unique key can't be left blank. How do I rectify this problem?

    3) I also understand that when we use formulas in a query it takes a lot of time to produce the results. So I tried to replace the formulas with the help of a table: when the value is 0 it should return blank, and when the value is blank it should return "REISSUED". I was using the IIf formula to do this, but I would like to do the same with tables, and I get a problem when the value is blank. Please help me sort out these issues.
    Thanks and Regards
    R. Vadivelan

  2. #2
    Join Date
    Apr 2014
    Location
    Kentucky
    Posts
    495
    Provided Answers: 24
    10,000 records should take about 3 seconds to query; it is nothing in terms of quantity. So either you have no indexes, or you are running across a slow network (or both).
    I can't see why it would otherwise be that slow.
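
    For instance, adding an index is a one-statement job. A minimal sketch, assuming a table named TransmittalRegister with a DocumentID column you join or filter on; substitute your own names:

    CREATE INDEX idxDocumentID ON TransmittalRegister (DocumentID);

    You can run that from a query in SQL view, or just set the field's Indexed property in table design view.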

  3. #3
    Join Date
    Oct 2013
    Posts
    163
    Hi Ranman,

    Thanks for your quick response. Actually my problem is that only one query takes too much time; the remaining queries work fine. I have already indexed my table. I also analyzed it by removing the formulas, and then it works fine; when I put some IIf conditions in the query it takes quite a long time.

    So I would very much appreciate it if you could explain how to eliminate the formulas. In my case I have a table with the following fields:

    DocumentID & ReissueCount

    If the ReissueCount is 0, then my query should return blank; if not, it should return "REISSUED".

    I tried to do the same with linked tables, but I could not get the exact results.
    Thanks and Regards
    R. Vadivelan

  4. #4
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    my expectation is that you have bludgeoned Access into working like a spreadsheet, using a spreadsheet methodology

    you are probably using a significant number of domain functions (they are always slow; using a few is not a problem, using lots, especially in a query or a report, is always dangerous)


    but it's all talk

    saying "my query is slow" without providing details of the query, the table design an so on is pointless. no one here can give any useful advice as there is nothing concrete to go on

    at the very least supply the tables and query, and ideally some sample data
    ideally consider finding out how the Access database engine is approaching your task
    https://www.google.co.uk/webhp?sourc...explain%20plan

    So I would very much appreciate it if you could explain how to eliminate the formulas
    ...what formulas?
    I'd rather be riding on the Tiger 800 or the Norton

  5. #5
    Join Date
    Oct 2013
    Posts
    163
    Hi Healdem,

    Actually in my query I am using 6 domain functions, so I think that may be the problem. I am trying to eliminate the use of domain functions.

    I have a transmittal database, in which documents are issued to COMPANY many times under different revision numbers. My objective is that when a document is issued again it should be marked as "REISSUED".

    I identified the first issue in a separate table named T21_DuplicateCheck, with the following columns:

    CTRUniqueNo_01
    CTR_Revision_Date
    ReissueCount

    ReissueCount will always be 0 when the document is issued for the first time.

    Now I have LEFT JOINed TransmittalRegister and T21_DuplicateCheck in a query. Up to this point it works fine.

    To identify the reissued documents I am using the following formula:

    ReissueStatus: IIf([ReissueCount]=0,"","REISSUED")

    Now I want to eliminate the formula by LEFT JOINing a new table named T22_Reissue, in which I have the following fields:

    Field1    Status
    0         (blank)
    (blank)   Reissued

    That is, I want the result displayed as "Reissued" when the value is blank, but I am not getting the results I need.
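
    A minimal sketch of one way around the blank problem, keeping a single IIf rather than the lookup table (a plain IIf over a join is cheap; it is domain functions such as DLookUp that are slow). Nz() turns the Null produced by the unmatched LEFT JOIN rows into a sentinel value, so the comparison always has something to work with:

    ReissueStatus: IIf(Nz([ReissueCount], -1) = 0, "", "REISSUED")

    If you do want the T22_Reissue lookup table instead, note that a blank (Null) in Field1 will never match anything in a join; store a real sentinel such as -1 in that row and join on Nz([ReissueCount], -1) = [Field1] (Access only accepts an expression join like that in SQL view, not in the design grid).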

    Herewith I have enclosed my sample database for your review and advice; please note that I deleted 75% of the data because of the file size.

    Also, regarding my earlier question: I have a table with 60 fields and 10,000 rows of data, and I use it for all my queries. If I keep all the data in a single table, is that why queries run slowly, or is it due to some other reason? Is it advisable to keep all the data in a single table?
    Attached Files
    Thanks and Regards
    R. Vadivelan

  6. #6
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    so which query/queries are running slowly
    I'd rather be riding on the Tiger 800 or the Norton

  7. #7
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    running query "Q00_TransmittalRegister" with 13,985 rows takes around 1..2 seconds here
    just compacting and repairing the db reduced that time to sub 1 second

    as expected this is a spreadsheet coerced, roughed over and bullied into a database. it's not normalised and there are what look to be a few dodgy calls in the design.

    for instance, a document is a document
    its title doesn't, or at least shouldn't, change through each iteration
    so I'd expect to have a table for documents
    and a table off that which stores the different revisions and the relevant dates for each revision
    logically you may well have to create an initial release in this revision table just for logical niceties
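
    a minimal sketch of that shape in Access DDL, with illustrative names (yours will differ):

    CREATE TABLE T_Document (
        DocumentID AUTOINCREMENT CONSTRAINT pkDocument PRIMARY KEY,
        Title TEXT(100)
    );

    CREATE TABLE T_Revision (
        RevisionID AUTOINCREMENT CONSTRAINT pkRevision PRIMARY KEY,
        DocumentID LONG CONSTRAINT fkDocument REFERENCES T_Document (DocumentID),
        RevisionNo INTEGER,
        IssueDate DATETIME
    );

    the title is stored once in T_Document; every issue and reissue becomes one row in T_Revision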

    managing department is not normalised (ie it should be pushed into a table of its own, with just the code used here)
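
    again just a sketch with made-up names: a one-statement department table, with only the code kept on the document row:

    CREATE TABLE T_Department (
        DeptCode TEXT(10) CONSTRAINT pkDepartment PRIMARY KEY,
        DeptName TEXT(50)
    );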

    whenever you see columns with the same name but a different numeric suffix or prefix you know it's trouble. mind you, in this case we also have tables with odd prefixes, which makes me think you have lost track of what you are doing, sticking to a spreadsheet approach as opposed to a database approach (one piece of information is only ever stored once; it is never (well, virtually never) repeated)

    spreadsheets are not good at being the original source of data. use a database for that (the clue is in the name: DATAbase)
    Last edited by healdem; 09-05-14 at 08:13.
    I'd rather be riding on the Tiger 800 or the Norton

  8. #8
    Join Date
    Oct 2013
    Posts
    163
    Thanks Healdem,

    In the same database I have a few more tables and queries. If I delete them then my query runs fast; however, I can't delete them since they are needed for other tracking.

    I will create a table for managing department and use the code. I will also try to eliminate the titles.

    However, I need the suffix in order to create a unique ID for each document. Is there another method that avoids the suffix? If so, kindly give me some ideas.
    Thanks and Regards
    R. Vadivelan

  9. #9
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    Quote Originally Posted by velu130486
    Thanks Healdem,

    In the same database I have a few more tables and queries. If I delete them then my query runs fast; however, I can't delete them since they are needed for other tracking.

    if you have a slow running query and you are asking for help here, and in doing so you post a copy of your db, what is the point of deleting the very tables and queries so that it runs fast? the whole point of posting a db here is so that people can look at what the problem IS.


    the clue is that it's those very tables that are probably the source of the problem

    if you want help on resolving slow running queries then at the very least POST the SQL of ALL the queries involved in the slow running process, AND ideally the DDL (the table design) of the columns referred to in those queries.

    effectively, by not doing so you have wasted the time of everybody who looked at this thread

    slow running queries are usually caused by:
    1) badly formed joins (joining on the wrong element(s))
    2) using poor SQL techniques (eg using DLookups instead of a join; see the sketch after this list)
    3) using inappropriate techniques (sometimes people try to do everything in SQL when, say, the output is going into a report; sometimes it's easier and quicker to do part of that in the report itself)
    4) not using indexes on joined and ordered columns
    5) dodgy table design, using columns that are of the wrong type or unnecessarily big. as this is effectively a spreadsheet imported into Access, most of your columns are text 255... which is way, way too big
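
    to illustrate point 2, a sketch reusing table names from earlier in the thread (treat them as placeholders). the first version fires a separate DLookUp for every row in the resultset; the second lets the engine satisfy everything with one indexed join:

    slow, one lookup per row:

    SELECT DocumentID,
           DLookUp("Status", "T22_Reissue", "Field1 = " & [ReissueCount]) AS ReissueStatus
    FROM TransmittalRegister;

    faster, a single join:

    SELECT TR.DocumentID, R.Status AS ReissueStatus
    FROM TransmittalRegister AS TR
    LEFT JOIN T22_Reissue AS R ON TR.ReissueCount = R.Field1;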


    changing one or two bits of the design into a more normalised form isn't going to significantly reduce the run time of the process, but without identifying where your bottleneck(s) is/are it is just a waste of time. FWIW many years ago I looked at a colleague's report that was running slow (IIRC taking over 6 minutes); by removing all the domain functions (largely DAvg) and doing the calculation inside the report, the running time dropped to under 15 seconds

    so first off:

    make certain you are using the appropriate data type and size. if the columns are the wrong type and/or the wrong size that can force Access to do wasted operations and waste memory. in a big resultset that can kill performance for both the local machine and the network. if Access has to do a table scan it has to suck all the data down the network pipe, process what is required, then perform whatever joins

    make certain your joins are on the correct data type and on INDEXED columns

    if you frequently use where clauses or order bys on the same columns then make certain those columns are indexed

    make certain you are only pulling across the columns you ACTUALLY NEED. yes, SELECT * FROM mytable is nice and easy for the developer, but if you are hitting performance problems, pulling columns you don't need in a query is a waste of resources
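
    pulling the last three points together in one sketch (illustrative names again):

    CREATE INDEX idxRevisionDate ON TransmittalRegister (CTR_Revision_Date);

    SELECT DocumentID, ReissueCount
    FROM TransmittalRegister
    WHERE CTR_Revision_Date >= #01/01/2014#
    ORDER BY CTR_Revision_Date;

    only the two columns actually needed cross the network, and the one index serves both the WHERE and the ORDER BY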

    trying to unpick this mess of a spreadsheet, sorry, database, and then redesign it into normalised form is probably too late. it was probably too late back in October last year. you could try telling your boss that the muppet who designed this needs to seriously think about their future....

    https://www.google.co.uk/webhp?sourc...unnign%20query
    I'd rather be riding on the Tiger 800 or the Norton

  10. #10
    Join Date
    Oct 2013
    Posts
    163
    Thanks Healdem,

    For your detailed reply and Tips. I will try to redesign my database by incorporating your comments.
    Thanks and Regards
    R. Vadivelan

  11. #11
    Join Date
    Oct 2013
    Posts
    163
    Hi Healdem,

    I have made some changes to my database as per your tips, and now I get better results when executing the query. But when I export the query to Excel it takes a lot of time. I face this problem when I export a query that produces more than 10,000 rows. Is it possible to avoid this, and if so, how?
    Thanks and Regards
    R. Vadivelan

  12. #12
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    how are you exporting this, what method?
    is it any quicker importing into Excel?
    have you identified what "it takes a lot of time" actually means?
    is the time taken unreasonable?
    have you taken any steps to try and identify why it is slow?
    have you taken any steps to work out where the time is going/being consumed?
    is it the process (the export itself) or is it something on your network or your computer? is it memory issues, processor, network...?

    have you used explain/showplan to find out if there are query problems

    let's look at some metrics
    opening your query Q00_TransmittalRegister or T00_MDRMaster and exporting either to Excel takes under 1 second here, dumping to a local disk; a disk on the same lightly loaded network segment, or a remote disk on another segment, takes considerably longer. but it doesn't matter: just save locally, then copy the file to the remote disk(s) as required

    what counts as running slow?
    slower than you'd like?
    slower than users expect?
    painfully slow... minutes?
    without hard and fast numbers it's impossible to work out what's happening, when and why.
    I'd rather be riding on the Tiger 800 or the Norton

  13. #13
    Join Date
    Oct 2013
    Posts
    163
    Hi Healdem,

    I am just exporting via External Data > Excel. If there is a better method to output to Excel, please let me know.

    When I export Q00_TransmittalRegister to Excel it takes 3 minutes (180 seconds) to complete the export, whereas I can export the remaining queries within 10 to 15 seconds. So I thought it takes that long because it contains more than 13,000 rows.

    I used Database Tools > Analyze Performance to identify problems with the queries. It didn't report any problems/suggestions, since I had already fixed them.
    Thanks and Regards
    R. Vadivelan

  14. #14
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    running the query here takes under 2 seconds
    exporting the query to an Excel file on a local disk takes around 4 seconds
    ..which suggests the problems are more to do with your network and/or computer
    so either your computer has insufficient processor or memory
    or you have network issues
    if this process is taking too long, and by too long I mean long enough to cause problems, then you need to look at this.

    the easiest workaround will be to bring the Access database onto a local disk, save the resultant Excel file to a local disk, then copy the file back over the network to wherever it's needed.

    if it's network related then it doesn't matter what you do, it will still take time to move the file about.

    I doubt your dodgy design is holding back the speed that much.

    if it were me and I knew the db isn't going to grow that much, I'd put up with those sorts of time issues; three minutes isn't a huge amount of time.
    I'd rather be riding on the Tiger 800 or the Norton

  15. #15
    Join Date
    Oct 2013
    Posts
    163
    Thanks Healdem,

    I am working on the redesign of my database. If there is any improvement after the redesign I will keep you posted.

    Anyhow, thanks for your tips explaining the root causes of the problem.
    Thanks and Regards
    R. Vadivelan
