I am trying to run a large database. Currently it's in Access, in one large table containing half a million records and 100 fields, plus several small tables. As you can imagine, running a query takes ages. I've converted it to Microsoft SQL Server 2000 but it's still slow; the exact queries I ran seemed to be quicker in Access than in MS SQL. Please help. What can/should I do to be able to run this database online?
If you are using a large database I would be concerned about your indexes, as achorozy pointed out. Also, check when your statistics were last updated, using STATS_DATE; you can use sp_updatestats or UPDATE STATISTICS to refresh them. Since your database is large this is especially important: sequential scans on a large table are brutal. You can use sp_autostats to view and change the current settings. Also, have you been using clustered indexes? These can be highly beneficial.
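For example, you could check and refresh the statistics like this (Holidays is just an example table name here; substitute your own):

```sql
-- When were the statistics on this table last updated?
SELECT name AS stats_name,
       STATS_DATE(OBJECT_ID('Holidays'), indid) AS last_updated
FROM sysindexes
WHERE id = OBJECT_ID('Holidays')

-- Refresh statistics for one table...
UPDATE STATISTICS Holidays

-- ...or for every table in the database
EXEC sp_updatestats
```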
[list=1][*]You may look at creating an index on Holiday_Startdate.[*]Remove the LIKE argument and use '=' since you are not using the pattern-matching wildcards '%' or '_'.[*]Is Holiday_Startdate a DATETIME field? If so, I don't think the LIKE comparison would work as you expect, since DATETIME also stores a time portion (hours, minutes, seconds).[/list=1]
The reason I say you *may* create an index on Holiday_Startdate is that I don't know all of the queries being used. The index could be created on (Holiday_Startdate, Owner_Code) or (Owner_Code, Holiday_Startdate), depending on which column you filter on most often. But you need an index.
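Something like this (the table name Holidays and the query are examples; adjust to your schema):

```sql
-- Composite index to support queries filtering on start date and owner
CREATE INDEX IX_Holidays_StartDate_Owner
    ON Holidays (Holiday_Startdate, Owner_Code)

-- Then replace LIKE on the DATETIME column with a range test,
-- which the optimizer can satisfy with the index above:
SELECT *
FROM Holidays
WHERE Holiday_Startdate >= '2002-06-01'
  AND Holiday_Startdate <  '2002-06-02'   -- whole day, regardless of time portion
  AND Owner_Code = 'ABC'
```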
As Achorozy already suggested, run this query in Query Analyzer with Show Execution Plan turned on and you will see that a table scan is being done, which is where your cost is greatest. Once you have created the suggested indexes, run it in Query Analyzer again and notice the difference.
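If you prefer a text version of the plan instead of the graphical one, you can do this in Query Analyzer (the query shown is just a placeholder):

```sql
-- Show the estimated plan as text instead of executing the query
SET SHOWPLAN_TEXT ON
GO
SELECT * FROM Holidays WHERE Holiday_Startdate >= '2002-06-01'
GO
SET SHOWPLAN_TEXT OFF
GO
```

Look for "Table Scan" in the output before indexing and "Index Seek" afterwards.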
The problem I have with setting up indexes etc. is that the database is supplied daily as an Access database, so the tables will need to be updated daily. Does that mean redoing all the steps, or is there a way the process could be automated, i.e. press one button and SQL Server would import the table and set everything up?
Now, I don't know whether the Access database is used to UPDATE data or whether you clean out SQL Server and reload from Access. Either way, you could create a Linked Server to the Access database, then create a stored procedure that loads the data from it.
set nocount on
declare @server sysname
declare @userid varchar(10)
declare @pswd varchar(10)
set @server = 'Client'
set @userid = 'admin'
set @pswd = ''   -- the default Access "admin" user typically has a blank password
-- drop any existing definition first so the script can be rerun
exec sp_dropserver @server = @server, @droplogins = 'droplogins'
exec sp_addlinkedserver @server = @server,
    @srvproduct = 'OLE DB Provider for Jet',
    @provider = 'Microsoft.Jet.OLEDB.4.0',
    @datasrc = 'C:\Data\Client.mdb'   -- example path: point this at the daily .mdb file
exec sp_addlinkedsrvlogin @server, 'FALSE', NULL, @userid, @pswd
This is a one-time definition.
Then create a stored procedure to load from Access and update SQL Server. Either truncate the SQL Server table and do an INSERT...SELECT, or do an UPDATE where the keys match followed by an INSERT...SELECT...WHERE NOT EXISTS on matching keys. After the update, run UPDATE STATISTICS, or DBCC DBREINDEX if you have time. If there is no time, schedule a weekly job (off hours, weekends) to do the reindexing.
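A sketch of the truncate-and-reload variant, assuming the linked server is named Client as above and an example table named Holidays (substitute your real table and column names):

```sql
CREATE PROCEDURE dbo.usp_ReloadFromAccess
AS
BEGIN
    SET NOCOUNT ON

    -- Full reload: empty the SQL Server table, then pull everything from Access
    TRUNCATE TABLE dbo.Holidays

    INSERT INTO dbo.Holidays (Holiday_Startdate, Owner_Code /* , ... */)
    SELECT Holiday_Startdate, Owner_Code /* , ... */
    FROM Client...Holidays   -- four-part name for a Jet linked server

    -- Refresh statistics so the optimizer sees the new data
    UPDATE STATISTICS dbo.Holidays
END
```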
If the Access database arrives automatically by a specific time (via file transfer or whatever), create a scheduled job to do the update.
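For example, with SQL Server Agent you could schedule it like this (run against msdb; the job name, database name, and stored procedure name are hypothetical placeholders):

```sql
USE msdb
EXEC sp_add_job @job_name = 'Nightly Access reload'
EXEC sp_add_jobstep @job_name = 'Nightly Access reload',
    @step_name = 'Run reload proc',
    @subsystem = 'TSQL',
    @database_name = 'YourDatabase',
    @command = 'EXEC dbo.usp_ReloadFromAccess'
EXEC sp_add_jobschedule @job_name = 'Nightly Access reload',
    @name = 'Every night at 2am',
    @freq_type = 4,            -- daily
    @freq_interval = 1,
    @active_start_time = 020000
EXEC sp_add_jobserver @job_name = 'Nightly Access reload',
    @server_name = '(local)'
```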
I don't usually buy books, but I think a book or a resource on the web might be useful to me. Do you know of any?
I've worked extensively with Access, but SQL Server is new to me. Although I can run the same scripts on SQL Server databases, it's become apparent that they could run a lot quicker if SQL Server is set up in certain ways, especially for large databases.
Can you please recommend a book or a resource?
Is "Inside SQL Server 2000 by Kalen Delaney" any good?