Results 1 to 8 of 8
  1. #1
    Join Date
    Dec 2002
    Posts
    4

    Unanswered: Uptime from heartbeat logging..

I am generating a report of kiosk uptime based on a log table whose entries are "heartbeat" logs. I need a report that shows how long the kiosk was "up" and "down", grouped by day. The application that runs on the kiosk logs an entry in this table every 15 minutes, so the definition of "down" is a gap greater than 15 minutes between entries. My question is: can I construct a SQL query that will return a recordset representing this, or will I have to use some Java or Perl code to process the recordset? Thanks for the help.
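For reference, the processing I would otherwise do in code looks roughly like this Python sketch (the function name and the 15-minute threshold are just illustrative):

```python
from datetime import datetime, timedelta

def up_down_minutes(timestamps, threshold_min=15):
    """Total up/down minutes from gaps between consecutive heartbeats.

    A gap over `threshold_min` minutes counts as down time;
    anything else counts as up time.
    """
    fmt = "%Y-%m-%d %H:%M:%S"
    times = sorted(datetime.strptime(t, fmt) for t in timestamps)
    up = down = timedelta()
    for prev, cur in zip(times, times[1:]):
        gap = cur - prev
        if gap > timedelta(minutes=threshold_min):
            down += gap  # gap over the threshold counts as down time
        else:
            up += gap
    return up.total_seconds() / 60, down.total_seconds() / 60

print(up_down_minutes([
    "2002-12-05 14:00:00",
    "2002-12-05 14:15:00",
    "2002-12-05 15:00:00",
]))  # -> (15.0, 45.0)
```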

  2. #2
    Join Date
    Feb 2002
    Posts
    2,232
What is the structure of the table? Post a sample of the data. Also, when you say you want to show "how long" - what does "how long" mean? Basically, post an example of the output you want returned.

  3. #3
    Join Date
    Dec 2002
    Posts
    4
Thanks for the reply, rnealejr. When I say "how long" I mean how many minutes out of the day. When there is no more than 15 minutes between log entries, I know the kiosk is "up". When there is more than 15 minutes between log entries, I know the kiosk was down for the amount of time in that difference. I need to total these two amounts (up and down) per day.

    The table is structured as follows:

    CREATE TABLE `log_entries` (
    `KIOSK` varchar(10) NOT NULL default '',
    `DATE_TIME` datetime NOT NULL default '0000-00-00 00:00:00',
    `TYPE` char(2) NOT NULL default '',
    `DATA` varchar(50) NOT NULL default ''
    )

    some example data (comma delimited export):

    BJ047,2002-12-05 14:15:35,HB,Active
    BJ047,2002-12-05 14:16:52,HB,Active
    BJ047,2002-12-05 14:31:54,HB,Active
    BJ047,2002-12-05 14:38:53,HB,Active
    BJ047,2002-12-05 14:53:54,HB,Active
    BJ047,2002-12-05 15:08:55,HB,Active
    BJ047,2002-12-05 15:24:07,HB,Active
    BJ047,2002-12-05 15:39:07,HB,Active
    BJ047,2002-12-05 15:53:58,HB,Active
    BJ047,2002-12-05 16:09:00,HB,Active
    BJ047,2002-12-05 16:24:00,HB,Active
    BJ047,2002-12-05 16:39:14,HB,Active
    BJ047,2002-12-05 16:54:02,HB,Active
    BJ047,2002-12-05 17:09:03,HB,Active
    BJ047,2002-12-05 17:11:45,HB,Active
    BJ047,2002-12-05 17:16:35,HB,Active
    BJ047,2002-12-05 17:31:35,HB,Active
    BJ047,2002-12-05 17:46:36,HB,Active
    BJ047,2002-12-05 18:01:49,HB,Active
    BJ047,2002-12-05 18:16:49,HB,Active

  4. #4
    Join Date
    Oct 2002
    Posts
    369


RE: (quoting post #3 above)
    There are many ways to implement logging tables, functions, procedures, etc., (some work out much better than others).

While I do not fully understand your requirements, in your case you might wish to consider implementing a "PulseInterval" column to store interval data (in minutes, computed at the time the row is inserted into log_entries). Doing so, you should be able to easily and efficiently sum and group result sets where "PulseInterval" < X minutes (or, for that matter, >= Y minutes) for any datetime interval required. When I get the opportunity I'll post an example (DDL/DML).

General considerations for various logging-table scenarios:
* It is often efficient to compute and store interval data at the time of entry (this is especially the case if the logging tables will never be purged / archived to keep the quantity of on-line historical data manageable).
* Processing large logging-table result sets (i.e., interval data) by means of cursors can be very resource-intensive at run time (again, especially if logging tables are never purged / archived out).
* When logging-table rows are frequently updated (e.g., by multiple processing steps), consider implementing computed columns instead of storing interval data at the time of entry (especially if log-table historical data IS periodically purged / archived out).

You may also wish to avoid giving objects (tables, columns, etc.) names that happen to be reserved words, keywords, or system names, or that include spaces (columns named TYPE and DATA may at some point cause unintended / undesirable effects). It is generally best, or at least less painful, to prefix or suffix such names to make them unique.
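In the meantime, the idea of computing the interval at insertion time can be sketched like this (using Python's sqlite3 as a stand-in rather than SQL Server, with a hypothetical log_heartbeat helper - assumptions, not the eventual DDL/DML):

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE log_entries (
    KIOSK TEXT, DATE_TIME TEXT, TYPE TEXT, DATA TEXT,
    PulseInterval INTEGER)""")

def log_heartbeat(conn, kiosk, ts):
    """Insert a heartbeat, computing PulseInterval (minutes since the
    previous heartbeat for this kiosk) at insertion time."""
    fmt = "%Y-%m-%d %H:%M:%S"
    (last,) = conn.execute(
        "SELECT MAX(DATE_TIME) FROM log_entries WHERE KIOSK = ?",
        (kiosk,)).fetchone()
    interval = 0
    if last is not None:
        delta = datetime.strptime(ts, fmt) - datetime.strptime(last, fmt)
        interval = int(delta.total_seconds() // 60)
    conn.execute(
        "INSERT INTO log_entries VALUES (?, ?, 'HB', 'Active', ?)",
        (kiosk, ts, interval))

log_heartbeat(conn, "BJ047", "2002-12-05 14:15:35")
log_heartbeat(conn, "BJ047", "2002-12-05 14:31:54")
print(conn.execute(
    "SELECT PulseInterval FROM log_entries ORDER BY DATE_TIME").fetchall())
# -> [(0,), (16,)]
```

With the interval stored per row, summing and grouping never needs to look at two rows at once.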

  5. #5
    Join Date
    Feb 2002
    Posts
    2,232
    I don't have much time but try the following:

select sum(cast(interval as float)) * 24 * 60 as minutes, data
from (select a.date_time - (select max(date_time) from log_entries as b
                            where b.kiosk = a.kiosk and b.date_time < a.date_time) as interval,
             data
      from log_entries as a) as B
group by B.data

You can optimize it further - I am sure - by changing it to an inner join, but I am walking out the door. Also, this will only handle same-day intervals, but it should give you a good start. If you could attach a larger file that crosses several days, along with how you need to treat that scenario, please post it.

If you do decide to add an interval column, the above SQL will help with the conversion. Especially with datetimes, any time you have rows that depend on each other, creating SQL statements can be difficult (though not impossible), so creating a column that eliminates this dependency would be beneficial.

    Good luck.
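For anyone who wants to try the same correlated-subquery idea end to end, here is a quick check in SQLite via Python (the per-day up/down split and the sample timestamps below are my assumptions, not the poster's exact query):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log_entries (KIOSK TEXT, DATE_TIME TEXT)")
conn.executemany("INSERT INTO log_entries VALUES (?, ?)", [
    ("BJ047", "2002-12-05 14:00:00"),
    ("BJ047", "2002-12-05 14:10:07"),  # 10-minute gap -> up
    ("BJ047", "2002-12-05 15:00:20"),  # 50-minute gap -> down
    ("BJ047", "2002-12-05 15:10:30"),  # 10-minute gap -> up
])

# Same shape as the query above: pair each heartbeat with the latest
# earlier heartbeat from the same kiosk, then bucket the gaps per day.
sql = """
SELECT day,
       SUM(CASE WHEN gap_min > 15 THEN gap_min ELSE 0 END) AS down_min,
       SUM(CASE WHEN gap_min <= 15 THEN gap_min ELSE 0 END) AS up_min
FROM (SELECT date(a.DATE_TIME) AS day,
             CAST((julianday(a.DATE_TIME)
                   - julianday((SELECT MAX(b.DATE_TIME) FROM log_entries b
                               WHERE b.KIOSK = a.KIOSK
                                 AND b.DATE_TIME < a.DATE_TIME)))
                  * 24 * 60 AS INTEGER) AS gap_min
      FROM log_entries a)
WHERE gap_min IS NOT NULL
GROUP BY day
"""
print(conn.execute(sql).fetchall())  # -> [('2002-12-05', 50, 20)]
```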

  6. #6
    Join Date
    Oct 2002
    Posts
    369


RE: When I get the opportunity I'll post an example (DDL/DML).
Here is the example (DDL/DML).

In the attached dba_Log_Entry.sql example, procedure dba_Log_Entry works with a supporting function and a log_entries table similar to yours (the function may be implemented as a stored procedure for SQL Server versions earlier than SQL Server 2000).

Using the dba_Log_Entry.sql sample, one should be able to get 24-hour up/down-time result sets via the following (not tested or checked, but it should work, at least against the example):

    -- Display result sets (detail)
    SELECT Kiosk, PulseInterval As 'DownTimeInterval', EntryPulseDateTime, LastPulseDateTime
    FROM log_entries
    WHERE (PulseInterval >= 15) AND (EntryPulseDateTime >= GETDATE() - 1)

    SELECT Kiosk, PulseInterval As 'UpTimeInterval', EntryPulseDateTime, LastPulseDateTime
    FROM log_entries
    WHERE (PulseInterval < 15) AND (EntryPulseDateTime >= GETDATE() - 1)

    -- Display result set summaries for last 24 Hour period
    SELECT Kiosk, SUM(PulseInterval) AS 'UpTime (min. in last 24 Hour Period)'
    FROM log_entries
    WHERE (PulseInterval < 15) AND (EntryPulseDateTime >= GETDATE() - 1)
    GROUP BY Kiosk

    SELECT Kiosk, SUM(PulseInterval) AS 'DownTime (min. in last 24 Hour Period)'
    FROM log_entries
    WHERE (PulseInterval >= 15) AND (EntryPulseDateTime >= GETDATE() - 1)
    GROUP BY Kiosk
Attached Files: dba_Log_Entry.sql

  7. #7
    Join Date
    Oct 2002
    Posts
    369


I guess what you actually wanted was result-set summaries grouped by 'ResultDate' (and not just the last 24 hours); the following should do that: (not tested)

-- Display result set summaries grouped by ResultDate
SELECT
Kiosk,
SUM(PulseInterval) AS 'UpTime (min. per day)',
CAST(MONTH(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(DAY(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(YEAR(EntryPulseDateTime) AS Char(4)) AS ResultDate
FROM log_entries
WHERE (PulseInterval < 15)
GROUP BY
Kiosk,
CAST(MONTH(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(DAY(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(YEAR(EntryPulseDateTime) AS Char(4))

SELECT
Kiosk,
SUM(PulseInterval) AS 'DownTime (min. per day)',
CAST(MONTH(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(DAY(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(YEAR(EntryPulseDateTime) AS Char(4)) AS ResultDate
FROM log_entries
WHERE (PulseInterval >= 15)
GROUP BY
Kiosk,
CAST(MONTH(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(DAY(EntryPulseDateTime) AS VarChar(2)) + ' / ' + CAST(YEAR(EntryPulseDateTime) AS Char(4))
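If it helps, the same per-day grouping can be smoke-tested in SQLite via Python; a single CASE-based query (my variation, not the T-SQL above) returns up and down minutes side by side:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE log_entries (
    Kiosk TEXT, EntryPulseDateTime TEXT, PulseInterval INTEGER)""")
conn.executemany("INSERT INTO log_entries VALUES (?, ?, ?)", [
    ("BJ047", "2002-12-05 14:31:54", 15),  # >= 15 -> down time
    ("BJ047", "2002-12-05 15:30:00", 58),  # >= 15 -> down time
    ("BJ047", "2002-12-06 09:00:00", 12),  # < 15  -> up time
])

# date() plays the role of the MONTH/DAY/YEAR string concatenation:
# one calendar-date group key, with up and down summed in one pass.
sql = """
SELECT Kiosk, date(EntryPulseDateTime) AS ResultDate,
       SUM(CASE WHEN PulseInterval < 15 THEN PulseInterval ELSE 0 END) AS UpMin,
       SUM(CASE WHEN PulseInterval >= 15 THEN PulseInterval ELSE 0 END) AS DownMin
FROM log_entries
GROUP BY Kiosk, ResultDate
ORDER BY ResultDate
"""
print(conn.execute(sql).fetchall())
# -> [('BJ047', '2002-12-05', 0, 73), ('BJ047', '2002-12-06', 12, 0)]
```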

  8. #8
    Join Date
    Dec 2002
    Location
    Czech Republic
    Posts
    249


/*
I assume that you log only HB entries as pings (Active state); modify to meet your needs.

Problem - detecting gaps from activity pings
0. I generated random data
1. I tried to solve your problem with an inner join against a date-range subquery - it took hours
2. I applied cursors - time was reduced to several minutes
3. I used ordered streams - time was reduced to tens of seconds
4. Some improvements
*/

--generating 1 million random rows (50 s on my old computer) -> skip this, you have your own data
    if object_id('dbo.log_entries') is not null drop table log_entries
    GO
    CREATE TABLE dbo.log_entries (
    KIOSK varchar(10) NOT NULL default ''
    ,DATE_TIME datetime NOT NULL default '0000-00-00 00:00:00'
    ,TYPE char(2) NOT NULL default ''
    ,DATA varchar(50) NOT NULL default ''
    )
    GO
    insert dbo.log_entries(KIOSK,DATE_TIME,TYPE,DATA)
    select
    KIOSK='KIOSK'+convert(varchar(10),convert(tinyint, (checksum(newid())+2147483648)/400000000))
,DATE_TIME=convert(datetime,(checksum(newid())+2147483648)/40000000+103*365)
    ,TYPE='HB'
    ,DATA='Active'
    from ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX1
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX2
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX3
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX4
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX5
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX6
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX7
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX8
    cross join ( select 1 as X union all select 1 union all select 1 union all select 1 ) XXX9
    GO

    --Extracting pings from log in ordered streams
    if object_id('tempdb..#Pings') is not null drop table #Pings
    GO
    create table #Pings(
    IdPing int identity(1,1) primary key
    ,IdKiosk varchar(10) not null
    ,PingDate datetime not null
    )
    GO
    insert #Pings(IdKiosk,PingDate)
    select KIOSK,DATE_TIME
    from log_entries
    where TYPE='HB' and DATA='Active'
    order by KIOSK,DATE_TIME

/*
Detecting gaps - using a sequential join on ordered streams to eliminate cursors,
and to replace joins on date ranges, which are really time-consuming
*/
/* Assuming the gap started 15 min after the last known-good state - an optimistic approach, easy to modify */
    if object_id('tempdb..#Gaps') is not null drop table #Gaps
    GO
    create table #Gaps(
    IdKiosk varchar(10) not null
    ,StartDate datetime not null
    ,EndDate datetime not null
    )
    GO
    declare @LastDate datetime
    set @LastDate=(select max(PingDate) from #Pings)
    insert #Gaps(IdKiosk,StartDate,EndDate)
select p.IdKiosk,StartDate=p.PingDate,EndDate=isnull(p1.PingDate,@LastDate)
    from #Pings p
    left join #Pings p1
    on p.IdPing=p1.IdPing-1 and p.IdKiosk=p1.IdKiosk
    where datediff( minute, p.PingDate, isnull(p1.PingDate,@LastDate) )>=15

    --Distribution of intervals between days
    if object_id('tempdb..#Dates') is not null drop table #Dates
    GO
    create table #Dates(
    DayBeginningDate datetime not null
    )
    GO
    set nocount on
    declare @MinDate datetime
    declare @MaxDate datetime
    set @MinDate=(select min(StartDate) from #Gaps)
    set @MaxDate=(select max(EndDate) from #Gaps)
    declare @Counter int
    declare @StopCounter int
    set @Counter=floor(convert(numeric(36,18),@MinDate))
set @StopCounter=floor(convert(numeric(36,18),@MaxDate))
    while @Counter<=@StopCounter begin
    insert #Dates(DayBeginningDate) values (convert(datetime,@Counter))
    set @Counter=@Counter+1
    end
    set nocount off
    GO
    if object_id('tempdb..#OverGaps') is not null drop table #OverGaps
    GO
    create table #OverGaps(
    IdKiosk varchar(10) not null
    ,StartDate datetime not null
    ,EndDate datetime not null
    )
    GO
    insert #OverGaps(IdKiosk,StartDate,EndDate)
    select IdKiosk,StartDate,EndDate
    from #Gaps g
    where exists(
    select * from #Dates d
    where g.StartDate<d.DayBeginningDate and d.DayBeginningDate<g.EndDate
    )
    delete g
    from #Gaps g
    join #Dates d
    on g.StartDate<d.DayBeginningDate and d.DayBeginningDate<g.EndDate
    --first day
    insert #Gaps(IdKiosk,StartDate,EndDate)
select g.IdKiosk,g.StartDate,EndDate=convert(datetime,floor(convert(numeric(36,18),g.StartDate))+1)
    from #OverGaps g
    --days in between
    insert #Gaps(IdKiosk,StartDate,EndDate)
select g.IdKiosk,StartDate=d.DayBeginningDate,EndDate=convert(datetime,floor(convert(numeric(36,18),d.DayBeginningDate))+1)
from #OverGaps g
join #Dates d
on g.StartDate<d.DayBeginningDate and d.DayBeginningDate<convert(datetime,floor(convert(numeric(36,18),g.EndDate)))
--last day
insert #Gaps(IdKiosk,StartDate,EndDate)
select g.IdKiosk,StartDate=convert(datetime,floor(convert(numeric(36,18),g.EndDate))),EndDate=g.EndDate
from #OverGaps g
where convert(datetime,floor(convert(numeric(36,18),g.EndDate)))<g.EndDate
    drop table #OverGaps
    GO

    --Final analysis by day, with corrected starting and final days
    declare @MinDate datetime
    declare @MaxDate datetime
    set @MinDate=(select min(StartDate) from #Gaps)
    set @MaxDate=(select max(EndDate) from #Gaps)
select
IdKiosk
,"Date"=convert(datetime,floor(convert(numeric(36,18),StartDate)))
,GapsPerCent=100*sum(
(
convert(numeric(36,18),EndDate)
-convert(numeric(36,18),StartDate)
)*case convert(datetime,floor(convert(numeric(36,18),StartDate)))
when @MinDate then 1-convert(numeric(36,18),StartDate)+floor(convert(numeric(36,18),StartDate))
when @MaxDate then convert(numeric(36,18),EndDate)-floor(convert(numeric(36,18),EndDate))
else 1 end
)
,ReadyPerCent=100
-100*sum(
(
convert(numeric(36,18),EndDate)
-convert(numeric(36,18),StartDate)
)*case convert(datetime,floor(convert(numeric(36,18),StartDate)))
when @MinDate then 1-convert(numeric(36,18),StartDate)+floor(convert(numeric(36,18),StartDate))
when @MaxDate then convert(numeric(36,18),EndDate)-floor(convert(numeric(36,18),EndDate))
else 1 end
)
from #Gaps
group by IdKiosk,convert(datetime,floor(convert(numeric(36,18),StartDate)))
order by IdKiosk,convert(datetime,floor(convert(numeric(36,18),StartDate)))
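The "sequential join on ordered streams" above predates window functions; on anything modern (SQLite 3.25+ via Python is shown here purely as an illustrative stand-in), LEAD() does the same next-row pairing directly:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # requires SQLite 3.25+ for window functions
conn.execute("CREATE TABLE Pings (IdKiosk TEXT, PingDate TEXT)")
conn.executemany("INSERT INTO Pings VALUES (?, ?)", [
    ("BJ047", "2002-12-05 14:00:00"),
    ("BJ047", "2002-12-05 14:10:00"),
    ("BJ047", "2002-12-05 15:00:00"),  # 50-minute gap before this ping
    ("BJ047", "2002-12-05 15:10:00"),
])

# LEAD() pairs each ping with the next one from the same kiosk,
# replacing the IdPing = IdPing - 1 self-join on the #Pings table.
sql = """
SELECT IdKiosk, PingDate AS StartDate, NextDate AS EndDate
FROM (SELECT IdKiosk, PingDate,
             LEAD(PingDate) OVER
                 (PARTITION BY IdKiosk ORDER BY PingDate) AS NextDate
      FROM Pings)
WHERE (julianday(NextDate) - julianday(PingDate)) * 24 * 60 >= 15
"""
print(conn.execute(sql).fetchall())
# -> [('BJ047', '2002-12-05 14:10:00', '2002-12-05 15:00:00')]
```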
