  1. #1
    Join Date
    Feb 2003

    Unanswered: MySQL Table Design Question


    I run a lyrics website, I am storing all the lyrics in a MySQL Database, with this table structure:
    CREATE TABLE `lyrics` (
      `id` int(128) NOT NULL auto_increment,
      `artist` varchar(255) NOT NULL default 'Unknown',
      `album` varchar(255) NOT NULL default 'Unknown',
      `title` varchar(255) NOT NULL default 'Unknown',
      `text` text NOT NULL,
      `hits` int(255) NOT NULL default '0',
      `vote_count` int(255) NOT NULL default '0',
      `vote_total` int(255) NOT NULL default '0',
      PRIMARY KEY  (`id`),
      UNIQUE KEY `id` (`id`),
      FULLTEXT KEY `artist` (`artist`,`album`,`title`)
    ) TYPE=MyISAM;
    Just looking at it quickly, do you think this table is my best bet, or should I split it up into 28 different tables, one for each starting letter of the artist's name, plus a number table for artists whose names start with numbers?

    Right now with only 2000 records the table is 3.3MB large.

    Thanks for any help or suggestions.

    Jeremy Ross

  2. #2
    Join Date
    Oct 2002
    Baghdad, Iraq

    Re: MySQL Table Design Question

    Splitting the table isn't going to make much of a difference, except for making your life really complicated, and it can do odd and unpredictable things to full text searches.

    Why don't you index the `text` field too? That column stores the actual lyrics, and wouldn't you want people to be able to do a natural-language search for lyrics?
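    As a sketch (the search phrase is just an example; this assumes the table definition you posted), adding that index and querying it could look like:

    ```sql
    -- Add a full-text index on the lyrics text itself (MyISAM supports this)
    ALTER TABLE lyrics ADD FULLTEXT KEY `text` (`text`);

    -- Natural-language search over the lyrics
    SELECT id, artist, title
    FROM lyrics
    WHERE MATCH (`text`) AGAINST ('words you remember from the song');
    ```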

    After all that indexing, 3.3MB is really not too bad for 2000 records: that's about 1.7KB per record, and lyrics are text-heavy.

    The UNIQUE KEY on `id` is redundant: the PRIMARY KEY already enforces uniqueness, so you're maintaining two identical indexes. (Of course, this is a bit gratuitous since MySQL doesn't really distinguish between a key and an index. Strictly speaking, a key is logical and an index is physical.)
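    Dropping the duplicate index is a one-liner (this assumes the index name `id` from your CREATE TABLE):

    ```sql
    -- The PRIMARY KEY stays; only the extra UNIQUE index named `id` goes away
    ALTER TABLE lyrics DROP INDEX `id`;
    ```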

    You're not going to be able to scrape off more than a few bytes per record by tweaking field sizes. Read section 6.2 of The Fine Manual for info on column types. Using MEDIUMINT instead of INT saves a single byte per column (and note that MySQL has no SMALLTEXT; the text types are TINYTEXT, TEXT, MEDIUMTEXT, and LONGTEXT), so it's really not worth the extra typing.

    Also, trying to slim your tables this way is like being on a diet: just as your body pumps 10 pounds of water through it each day giving your scale a false reading, your DBMS keeps a certain amount of slack space in its physical tables so it's very hard to gauge how much space you're saving with these timewasting tweaks.

    Now, if you wanted to get fancy (okay, this is something the DBMS ought to handle for you automatically so it's more drudgery than fancy) you could periodically compress your table with myisampack. This technique only works if the data is "write-once" (a reasonable assumption, given the material), and I don't even know if it works with full-text indexing. Read sections 7.1 and 4.7.4 of The Fine Manual. Experiment.
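    A rough sketch of that maintenance step (the data-directory path below is an assumption; it varies per install, and the server should not be writing to the table while you do this):

    ```shell
    # Compress the MyISAM table in place; the packed table becomes read-only
    myisampack /var/lib/mysql/lyricsdb/lyrics.MYI

    # Rebuild the indexes so MySQL can use the packed table
    myisamchk -rq /var/lib/mysql/lyricsdb/lyrics.MYI
    ```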

    Ah, wait, bonus: MySQL will do *some* work for you. Check up on MERGE tables. This keeps your logical design (somewhat) sane in that all the split tables are automatically lumped together. This might even work with your idea of splitting by artist name, though that wouldn't allow you to use myisampack. Read section 7.2 of The Fine Manual.
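    As a sketch (the per-letter tables here are hypothetical, and every underlying table must have an identical structure to the MERGE definition; note that MERGE tables don't support FULLTEXT indexes, which matters for your search feature):

    ```sql
    -- Assumes lyrics_a, lyrics_b, ..., lyrics_num already exist
    -- with the same columns as the MERGE table below
    CREATE TABLE lyrics_all (
      `id` int NOT NULL,
      `artist` varchar(255) NOT NULL default 'Unknown',
      `album` varchar(255) NOT NULL default 'Unknown',
      `title` varchar(255) NOT NULL default 'Unknown',
      `text` text NOT NULL,
      `hits` int NOT NULL default '0',
      `vote_count` int NOT NULL default '0',
      `vote_total` int NOT NULL default '0',
      KEY (`id`)
    ) TYPE=MERGE UNION=(lyrics_a, lyrics_b, lyrics_num) INSERT_METHOD=LAST;
    ```

    Queries against `lyrics_all` then span all the underlying tables at once, so the split stays an implementation detail.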

