  1. #1
    Join Date
    Jan 2010
    Posts
    6

    Database layout for a web counter

    Hi all, I'm a new user here.
    So here's my issue, which I've been thinking about for a while. I'm developing a web counter (something like Google Analytics) and I can't work out the best database model. I have limited disk space and memory (10GB/256MB) on the server machine, so I need something compact. The queries aren't realtime, so speed isn't the top priority (though it is a priority).
    I have to log all daily unique visits (one per IP per site) and impressions. I could log every single request, but I think that would be very bad in terms of space usage. I have separated the IPs into their own table, where I keep some generic geopositioning data, and I use the IDs from that table.

    So, what is the best way to log visits for lots of sites into a table, along with several fields (e.g. browser name, color depth and so on), so that the table is compact and not too slow?
    I was thinking of only adding one row per user per day per site and then updating a field with the impressions (I used a similar model in a previous version, but it was all screwed up and wrong, a total fail), but I can't really work out the details.

    Thank you. I'm sorry if my post is unclear, or if I have any typos.
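
    In case it helps make the idea concrete, here is a minimal sketch of that "one row per visitor per day per site, bump an impressions counter" model, using SQLite via Python. All table and column names here are my own invention, not anything from an existing schema:

    ```python
    import sqlite3

    # Sketch only -- table/column names are hypothetical.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE ips (
        ip_id   INTEGER PRIMARY KEY,
        ip      TEXT NOT NULL UNIQUE,
        country TEXT                   -- generic geopositioning data
    );
    CREATE TABLE daily_visits (
        site_id     INTEGER NOT NULL,
        visit_date  TEXT    NOT NULL,  -- YYYY-MM-DD
        ip_id       INTEGER NOT NULL REFERENCES ips(ip_id),
        browser     TEXT,
        color_depth INTEGER,
        impressions INTEGER NOT NULL DEFAULT 1,
        -- one row per visitor per day per site:
        PRIMARY KEY (site_id, visit_date, ip_id)
    );
    """)

    def log_hit(site_id, day, ip, browser=None, depth=None):
        """First hit of the day inserts a row; later hits just bump the counter."""
        con.execute("INSERT OR IGNORE INTO ips (ip) VALUES (?)", (ip,))
        ip_id = con.execute("SELECT ip_id FROM ips WHERE ip = ?", (ip,)).fetchone()[0]
        cur = con.execute(
            "UPDATE daily_visits SET impressions = impressions + 1 "
            "WHERE site_id = ? AND visit_date = ? AND ip_id = ?",
            (site_id, day, ip_id))
        if cur.rowcount == 0:  # no row yet for this visitor/day/site
            con.execute(
                "INSERT INTO daily_visits "
                "(site_id, visit_date, ip_id, browser, color_depth) "
                "VALUES (?, ?, ?, ?, ?)",
                (site_id, day, ip_id, browser, depth))
        con.commit()
    ```

    With this layout, a day's uniques for a site is just `COUNT(*)` and its impressions `SUM(impressions)` over that date, and storage grows with unique visitors per day rather than with raw requests.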

  2. #2
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002
    what's wrong with just using your web server logs? they already store data for each web request, including IP, browser, page visited, etc.
    rudy.ca | @rudydotca
    Buy my SitePoint book: Simply SQL

  3. #3
    Join Date
    Jan 2010
    Posts
    6
    Because I need more than that, like geopositioning and so on. And the logs take too much space; it would be hard for the machine to run through several GBs of logs to get the data for different sites.
    This is a public web counter (like StatCounter, Clicky, Google Analytics and so on), not just for my site. That's why I think I need SQL.

    And by the way, is SQLite slower than MySQL on big tables? And do you have any impressions of MariaDB?

  4. #4
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002
    Quote Originally Posted by alex95 View Post
    And by the way, is SQLite slower than MySQL on big tables? And do you have any impressions of MariaDB?
    no idea, sorry
    rudy.ca | @rudydotca
    Buy my SitePoint book: Simply SQL

  5. #5
    Join Date
    Dec 2007
    Location
    London, UK
    Posts
    741
    Quote Originally Posted by alex95 View Post
    I'm developing a web counter (something like Google Analytics) and I can't work out the best database model. I have limited disk space and memory (10GB/256MB) on the server machine
    Are you serious? You can forget it unless you can invest in some decent hardware. In my judgement that's not sufficient to run a dev machine, never mind a web analytics service!

  6. #6
    Join Date
    Mar 2003
    Location
    The Bottom of The Barrel
    Posts
    6,102
    Quote Originally Posted by dportas View Post
    Are you serious? You can forget it unless you can invest in some decent hardware. In my judgement that's not sufficient to run a dev machine, never mind a web analytics service!
    That's a pretty standard shared hosting arrangement...

    Speaking of which, what is your hosting situation? This type of analytics tool is often included with hosting packages. Often it will source raw data from your server logs and then reference geocoding data to produce more detailed reports.

    I get this for free on all my webhosts. I'm sure there are FLOSS solutions for this too.

    When you mention this isn't just for your site, what exactly do you mean?
    oh yeah... documentation... I have heard of that.

    *** What Do You Want In The MS Access Forum? ***

  7. #7
    Join Date
    Jan 2010
    Posts
    6
    I'm on a VPS with lots of room to grow; this is the lowest plan. I was running on a home nginx/SQLite machine and it held up quite nicely, but my internet connection didn't, so I moved to a VPS.
    I had the web counter running for half a year, and the database grew to 7GB and caching the data got a bit slow, so I started all over, because I realised that my database sucked.
    Servers aren't my problem; I just can't come up with a suitable database model to record all these things nicely.
    The counter is for people who want a public counter or don't get something like AWStats/Webalizer with their hosting. It's also a local alternative to another (and quite buggy) Bulgarian counter.
    So it's hosted with me, but it isn't for my sites, it's for the users' sites, and I don't have access to their server logs.

  8. #8
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002
    so you're basically building something similar to http://www.w3counter.com/ ??

    i know the guy who developed this, he's a really decent fellow, why don't you contact him, tell him r937 sent you, and ask him if he'll share the table layout with you
    rudy.ca | @rudydotca
    Buy my SitePoint book: Simply SQL

  9. #9
    Join Date
    Jan 2010
    Posts
    6
    Thanks, I will send him a mail; it would be cool to hear about a successful model.
    Yes, I'm building something like W3Counter (and still using it along with StatCounter), but localized.

  10. #10
    Join Date
    Jan 2010
    Posts
    6
    He really is a nice fellow.
    He answered a couple of my questions, and based on that I decided I'll use several small tables, record every page visit, and aggregate with a batch script.
    Thank you all for your help.

  11. #11
    Join Date
    Feb 2010
    Posts
    1
    It looks like you've already got your answer, but I just thought I'd throw in my 2 cents.

    I have done this before for an undergraduate project. I was only maintaining dates, IPs, and in & out pages. That worked well enough for me, and it certainly didn't take up much space. I even set up a page which would take all of the results from a date range so I could download them to a text file. You'd be surprised what statistical data you can get using only four columns in a table, but maybe you're looking for a lot more than that. Either way, keeping it simple is one of the best ways to not confuse yourself or get your tables and code out of order. Good luck!

  12. #12
    Join Date
    Jan 2010
    Posts
    6
    Hey, thanks for your answer.
    I worked it out that way, writing every record to the database.
    It seems pretty speedy right now, with about 11,000 records over three days of closed beta on two master-master replicated MySQL servers.
    Hope this thread helps someone else too. Thank you all for the help!
