  1. #1
    Join Date
    Oct 2006
    Posts
    3

    Unanswered: Export SQL Server 2000 to MySQL

    Does anyone have any tutorials for exporting SQL Server 2000 to MySQL? I have a SQL Server 2000 database and want to use it for a website (it has products etc.). This database has to be kept in sync with the 'web' database (MySQL). I've been reading about ODBC and downloaded the MySQL connector from mysql.com. Basically, when updates are made to the SQL Server 2000 database, I want to be able to export the changes to the live website so that only one database needs maintaining. Please note that simply using the SQL Server 2000 database for the website is not an option (so please don't suggest that).

    I'm just looking for more information; could someone give me some tutorials or advice?

    Thank you. Appreciated.

  2. #2
    Join Date
    Apr 2006
    Location
    Denver, Co. USA
    Posts
    242
    I have not used it, but the MySQL Migration Toolkit promises to do what you need - http://www.mysql.com/products/tools/migration-toolkit/

  3. #3
    Join Date
    Oct 2006
    Posts
    3
    Thank you, I will look into that.

  4. #4
    Join Date
    Oct 2006
    Location
    CA
    Posts
    210
    Sounds like you want an ongoing replication, not a migration.

    What is your budget? How many Edits? How often? Is perpetual up-to-the-second updating required? Will a nightly refresh suffice? Weekly? How big? Must it be automated or do you just want it once? How many tables?

    dbmab: Nice link. Thanks!
    Last edited by vich; 10-11-06 at 04:54.

  5. #5
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    An alternative strategy could be
    to get the user interface that is creating/manipulating the data to commit changes to the MySQL db at the same time,
    OR
    to write the relevant data into an audit table in SQL Server 2000 and then periodically run a job that strips out that data and inserts it into your MySQL db. The audit table could hold native SQL statements, e.g. "insert into <mytable>.....", or it could hold the raw data in the form the MySQL db is expecting (the latter would be preferable for performance reasons).
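    A minimal sketch of the statement-queue form of that audit table (the ProductAudit table, the Products table, and all column names here are invented for the example, not taken from the poster's schema):

    -- Hypothetical audit table in SQL Server 2000
    CREATE TABLE ProductAudit (
        AuditID   INT IDENTITY(1,1) PRIMARY KEY,
        CreatedAt DATETIME DEFAULT GETDATE(),
        Exported  BIT DEFAULT 0,
        MySQLStmt VARCHAR(4000)  -- statement to replay against the MySQL db
    );

    -- The application (or a trigger) queues a statement whenever a product changes, e.g.
    -- INSERT INTO ProductAudit (MySQLStmt)
    -- VALUES ('UPDATE products SET price = 9.99 WHERE product_id = 42');

    -- The periodic job strips out the unexported rows, replays them against MySQL,
    -- then marks them as done
    SELECT AuditID, MySQLStmt FROM ProductAudit WHERE Exported = 0 ORDER BY AuditID;
    UPDATE ProductAudit SET Exported = 1 WHERE Exported = 0;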
    I'd rather be riding on the Tiger 800 or the Norton

  6. #6
    Join Date
    Oct 2006
    Posts
    3
    Quote Originally Posted by vich
    Sounds like you want an ongoing replication, not a migration.

    What is your budget? How many Edits? How often? Is perpetual up-to-the-second updating required? Will a nightly refresh suffice? Weekly? How big? Must it be automated or do you just want it once? How many tables?

    dbmab: Nice link. Thanks!
    Not a 'demanding' amount of times, perhaps a few times per month. And no, it's not a crucial export. Basically, the SQL Server DB holds lots of product information and is used internally for various purposes. All the information about the products is used on the website in a product listing. So when a person makes a change internally, every now and then I would like an easy way of sending the changes to the web database. Initially an entire database migration would be needed, but after that, if there were a way to send a query to the web server every time a certain row is edited in the SQL Server DB, that would suit me best.

    I'm no database expert, but I thought ODBC could be used for this. Like http://dev.mysql.com/downloads/connector/odbc/3.51.html I'm more of a designer/scripter than a database guru by any means. MySQL is the only one I'm familiar with; I've never worked with SQL Server before.
    Last edited by lionheart008; 10-11-06 at 05:33.

  7. #7
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    If it's a periodic export (say monthly, and the volume of changes isn't large), then I would seriously consider writing the data to insert into the MySQL db out to an audit file (either as a table or a flat file), and then doing a LOAD DATA INFILE as a cron/timed job. Then you know precisely what is going on, where and when. You can time the cron job to run when your web server is getting low usage, you have minimum downtime, and your web server may not need to go down at all, except for periodic housekeeping.
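    As a rough illustration of the flat-file variant on the MySQL side (the file path and the products_staging table are invented for the example; the export itself could come from a DTS job or bcp on the SQL Server side):

    -- Run from the cron / timed job against the web database
    TRUNCATE TABLE products_staging;

    LOAD DATA INFILE '/var/imports/products.txt'
    INTO TABLE products_staging
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n';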
    I'd rather be riding on the Tiger 800 or the Norton

  8. #8
    Join Date
    Oct 2006
    Location
    CA
    Posts
    210
    Quote Originally Posted by lionheart008
    Not a 'demanding' amount of times, perhaps a few times per month. And no, it's not a crucial export. Basically, the SQL Server DB holds lots of product information and is used internally for various purposes. All the information about the products is used on the website in a product listing. So when a person makes a change internally, every now and then I would like an easy way of sending the changes to the web database. Initially an entire database migration would be needed, but after that, if there were a way to send a query to the web server every time a certain row is edited in the SQL Server DB, that would suit me best.

    I'm no database expert, but I thought ODBC could be used for this. Like http://dev.mysql.com/downloads/connector/odbc/3.51.html I'm more of a designer/scripter than a database guru by any means. MySQL is the only one I'm familiar with; I've never worked with SQL Server before.
    As I'm sure you know, ODBC is just a connection protocol. It doesn't "do things" on an application level - like replication. I've never used MySQL but I assume an ODBC connection can be established for it, same as SQL Server. I've used ODBC to connect a Database Management tool (Enterprise Manager) so I know it will support all levels of database update.
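    For what it's worth, one way to actually drive that ODBC driver from the SQL Server side is a linked server; this is only a sketch, and the linked-server name, connection string, table names, and columns are placeholders:

    -- Register the MySQL database as a linked server over the MySQL ODBC driver
    EXEC sp_addlinkedserver
        @server     = 'MYSQL_WEB',
        @srvproduct = 'MySQL',
        @provider   = 'MSDASQL',
        @provstr    = 'DRIVER={MySQL ODBC 3.51 Driver};SERVER=webhost;DATABASE=webdb;UID=webuser;PWD=secret;';

    -- Push rows across with an ordinary INSERT ... SELECT
    INSERT OPENQUERY(MYSQL_WEB, 'SELECT product_id, name, price FROM products')
    SELECT ProductID, Name, Price FROM Products;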

    SQL Server (as with all enterprise-level databases) supports triggers. They require programming, but they are a good way to capture all changes. However, that's a lot of programming, and it has a performance hit.
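    Just to make the trigger idea concrete, here is a minimal sketch; the Products table, its columns, and the ProductChanges capture table are all invented for the example:

    -- Hypothetical change-capture table
    CREATE TABLE ProductChanges (
        ChangeID  INT IDENTITY(1,1) PRIMARY KEY,
        ProductID INT,
        ChangedAt DATETIME DEFAULT GETDATE(),
        Exported  BIT DEFAULT 0
    );
    GO

    -- Record the key of any product row that gets updated,
    -- so an export job can pick up just the changed rows later
    CREATE TRIGGER trg_Products_Capture ON Products
    FOR UPDATE
    AS
    INSERT INTO ProductChanges (ProductID)
    SELECT ProductID FROM inserted;
    GO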

    I presume you need to capture things like product description, price, whether it's still active, etc. It doesn't sound like you need dynamic info such as stock levels.

    SQL Server uses "Enterprise Manager" (comes free) for things like creating/editing database objects, exporting data, setting up replication to another SQL Server (also built in). Check out the Export function.

    If the data is pretty static and small (like maybe 10,000 rows in 5 or 6 tables), here's something similar to what I did for a customer who needs this information regularly (they use MySQL for their web server). We didn't want to spend a lot of programming effort, and the exported tables weren't considered so confidential that we were concerned about a customer (who has signed a non-disclosure) getting a full copy. It took a total of about 30 minutes for my part.

    1. Create the same table structure in MySQL for the tables you want exported. (my customer did this following my first export).

    2. Create a scheduled DTS job in SQL Server to export those tables from SQL Server to these MySQL tables. Be warned: it's very dumb about it, to the point of trying to repeat even unique keys, so you have to clear out your "to" tables first. There's a wizard in SQL Server for creating an export (actually, I create Excel spreadsheets - one of the options). You can export query results if you want to apply a filter or join some tables first - and guarantee the process won't break if you make changes in your SQL Server database.

    3. Create MySQL tables that your reports will actually use. (my customer's task)

    4. Write a MySQL job, also scheduled to run after hours and after the SQL export completes. It could even be a day later. This job will compare the freshly imported tables with your report tables; that way, if structures change, your existing reports don't break. Also, you may not want to keep a full set of data in the tables you expose to the web programs. (I do the export when they request an update - about every 6 months.) There's a rough sketch of this kind of refresh just after this list.
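    A rough sketch of that after-hours refresh on the MySQL side (product_import and product_report, and their columns, are invented names; it assumes the tables use a storage engine that supports transactions):

    -- Rebuild the report table from the freshly imported staging table
    -- so the web pages never see a half-loaded import
    START TRANSACTION;
    DELETE FROM product_report;
    INSERT INTO product_report (product_id, name, price, active)
    SELECT product_id, name, price, active
    FROM product_import;
    COMMIT;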

    Or, buy some replication software that's capable of doing single-direction replication from SQL Server to MySQL. You could also develop a local application in your language of choice that simply opens both databases and does an intelligent re-sync.

    If your web site were running on SQL Server, you could use the built-in replication feature. Of course, you then have all sorts of licensing expenses; replication software is likely cheaper.


    Surely there are better solutions that involve an XML interface or something, but that's beyond me. SQL Server has the "OPENDATASOURCE" keyword, implying that you can create your own data transfers.
    Last edited by vich; 10-12-06 at 01:41.
