  #1
    Join Date: Aug 2012

    Tick data management & backtesting


    To add some color to this request: I'm new to this, so I would really appreciate comments that include references, so that I can do this all on my own. I don't have the resources to test different methods by trial and error. I'm asking this question on this forum because I want the most optimized, cost-effective structure. (BTW, if anyone is interested in taking this on as a project, PM me; the budget is $500-$1000.)

    Has anyone ever used stock trading software that also does historical backtesting? What I'm trying to do is create a script that can access a huge amount of data (terabytes), perform calculations on it, and return results based on pre-defined criteria.

    Example of the data:

    date, time, price

    I'm not sure what kind of structure I should use for the data so that backtesting is fast.

    So, for example, if I were going to backtest my historical data and the backtesting criteria were:

    if price > x and indicator 1 < y: enter
    if price > y and indicator 1 < x: exit

    Do you guys think I should have all the historical calculations for the indicators saved in my database? Or should my script be doing the calculation every single time I run a query? Keep in mind that many indicators can be added to a backtesting query.
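
    To make this concrete, here is a rough sketch of the kind of loop I have in mind (Python, with placeholder thresholds, a placeholder window size, and a simple moving average standing in for indicator 1; the column names match my data example):

    Code:
    # Rough backtest sketch: computes a moving average on the fly and
    # applies the example enter/exit rules. Thresholds, window size and
    # the choice of indicator are all placeholders.
    import csv
    from collections import deque

    X = 100.0    # "x" in the rules above (placeholder threshold)
    Y = 105.0    # "y" in the rules above (placeholder threshold)
    WINDOW = 20  # moving-average lookback, in 1-min bars

    def backtest(path):
        window = deque(maxlen=WINDOW)
        in_position = False
        trades = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):         # expects date,time,price headers
                price = float(row["price"])
                window.append(price)
                if len(window) < WINDOW:          # not enough history yet
                    continue
                indicator = sum(window) / WINDOW  # indicator 1 = simple moving avg
                if not in_position and price > X and indicator < Y:
                    in_position = True
                    trades.append(("enter", row["date"], row["time"], price))
                elif in_position and price > Y and indicator < X:
                    in_position = False
                    trades.append(("exit", row["date"], row["time"], price))
        return trades

    Computing the indicator on the fly like this keeps storage down; my question is whether precomputing and storing it would be worth the disk space when the same indicator is reused across many runs.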

    I need your help with:

    1) Picking the type of database (I have the data, but I'm not sure what the fastest option is for storing it so that backtesting is fast).

    2) Recompiling the 1-min data to different intervals (5 min, etc.). The data that I have is 1-min data; I want the program to be able to recompile it to 5 min, 10 min, etc. if the user chooses to do so (see the sketch after this list).

    3) Data storage hardware: what kind of hardware do I need?

    And how do I do all this?
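
    For point 2, my guess is the recompiling could happen at query time with a library like pandas rather than storing every interval; a rough sketch, assuming a DataFrame with a datetime index and a "price" column (all names here are placeholders):

    Code:
    # Resample 1-min bars to 5-min bars with pandas.
    # Assumes df has a DatetimeIndex and a "price" column (placeholder names).
    import pandas as pd

    def resample_bars(df, freq="5min"):
        bars = df["price"].resample(freq).ohlc()  # open/high/low/close per bucket
        return bars.dropna()                      # drop buckets with no bars

    # Usage (file and column names are assumptions):
    # df = pd.read_csv("ticks.csv")
    # df.index = pd.to_datetime(df["date"] + " " + df["time"])
    # five_min = resample_bars(df, "5min")
    # ten_min = resample_bars(df, "10min")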

    Thank you

  #3
    Join Date: Nov 2004
    Location: out on a limb
    Terabytes of data require a database server. Acquiring stock prices is going to be time-consuming and/or prohibitively expensive.

    If you are budget-constrained then a database server such as MySQL, Postgres or similar may be a good solution, although there are also 'sample' editions of the major database server systems such as SQL Server, Oracle, DB2 and god knows what else.

    Whether you have 10, 10,000 or 10,000,000 rows shouldn't matter, providing you design the tables correctly, and that's going to involve careful design of the indexes.
    Offhand I'd guess you need a minimum of two tables:
    - one containing details of the equities being traded
    - one containing the relevant price data at specific dates and times
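
    Something along these lines, as a rough sketch only (SQLite via Python purely for illustration; every table and column name here is invented, and the DDL translates to MySQL/Postgres with minor changes):

    Code:
    # Hypothetical two-table layout: one table for equities, one for
    # 1-min price bars.
    import sqlite3

    conn = sqlite3.connect("ticks.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS equity (
        equity_id INTEGER PRIMARY KEY,
        symbol    TEXT NOT NULL UNIQUE,
        name      TEXT
    );
    CREATE TABLE IF NOT EXISTS price_bar (
        equity_id INTEGER NOT NULL REFERENCES equity(equity_id),
        bar_time  INTEGER NOT NULL,       -- epoch seconds of the 1-min bar
        price     REAL    NOT NULL,
        PRIMARY KEY (equity_id, bar_time) -- doubles as the main lookup index
    );
    """)
    conn.commit()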

    Retrieving the data in the correct time slices depends on making good use of your chosen server's date and time functions, plus a bit of integer mathematics. You shouldn't need to 'recompile' the data, unless you are looking for a moving average over that 5- or 10-minute time slice.
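
    For instance, if you store epoch seconds per bar, integer division buckets the 1-minute rows into 5-minute slices at query time, with nothing extra stored (a sketch against the made-up schema above):

    Code:
    # Bucket 1-min bars into 5-min slices via integer division on the
    # epoch timestamp; no resampled copy of the data is stored.
    import sqlite3

    conn = sqlite3.connect("ticks.db")
    rows = conn.execute("""
        SELECT (bar_time / 300) * 300 AS bucket_start, -- 300 s = 5 min
               MIN(price) AS low,
               MAX(price) AS high,
               AVG(price) AS mean_price
        FROM price_bar
        WHERE equity_id = ?
        GROUP BY bucket_start
        ORDER BY bucket_start
    """, (1,)).fetchall()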

    As to the 'right' hardware: dunno, it depends on your budget, your requirements and so on.
    If it's a single user then the processor doesn't need to be that big, but the more memory you can sling at the problem, the better.
    I'd rather be riding on the Tiger 800 or the Norton
