Objective: write a query to find out how many new connections or sessions were established in the database, using dba_hist_active_sess_history.
There is already a way to mine the listener log for this detail, but let us try this approach for fun.
Step 1) Write a query to get the distinct sample times.
select distinct(sample_time)
from dba_hist_active_sess_history
order by 1;
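On a busy system dba_hist_active_sess_history can be large, so a small variation like the sketch below restricts the scan to a recent window (the one-hour window is just an assumed example); the same predicate can be carried into the later steps.
-- sketch: same step 1 query, assuming we only care about the last hour of history
select distinct sample_time
from dba_hist_active_sess_history
where sample_time >= systimestamp - interval '1' hour
order by 1;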
Step 2) Write a wrapper on top of step 1 to fetch each sample time and its lag (the previous sample time).
select samp_time,lag(samp_time,1) over (order by samp_time) lag_time
from
(
select distinct(sample_time) samp_time
from dba_hist_active_sess_history
order by 1);
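As a quick sanity check (not part of the original steps), the gap between a sample time and its lag can be computed as well; in dba_hist_active_sess_history the gap is normally around 10 seconds, since AWR retains roughly one ASH sample in ten.
-- sanity-check sketch: interval between consecutive samples
select samp_time
,lag(samp_time,1) over (order by samp_time) lag_time
,samp_time - lag(samp_time,1) over (order by samp_time) gap
from
(
select distinct(sample_time) samp_time
from dba_hist_active_sess_history
);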
Step 3) Write a wrapper on top of step 2 to fetch session_id and session_serial# from dba_hist_active_sess_history for both the lagged and the current sample time, and count the sessions that appear in the current sample but not in the previous one.
select a.sample_time
,count(1)
from dba_hist_active_sess_history a
,(select samp_time,lag(samp_time,1) over (order by samp_time) lag_time
from
(
select distinct(sample_time) samp_time
from dba_hist_active_sess_history
order by 1)) c
where a.sample_time=c.samp_time
and (a.session_id,a.session_serial#) not in (select session_id,session_serial# from dba_hist_active_sess_history where sample_time=c.lag_time)
group by a.sample_time
order by 1;
==> works well.
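A related sketch, under the assumption that we only care about the first time each (session_id, session_serial#) pair ever shows up in the history; unlike step 3, it will not re-count a session that goes idle and later becomes active again.
-- sketch: count each session only at its first recorded sample
select first_seen sample_time, count(*) new_sessions
from
(
select session_id, session_serial#, min(sample_time) first_seen
from dba_hist_active_sess_history
group by session_id, session_serial#
)
group by first_seen
order by 1;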
To test the effectiveness of the step 3 query, replace dba_hist_active_sess_history with v$active_session_history to query the current information, where samples are taken every second.
select a.sample_time
,count(1)
from v$active_session_history a
,(select samp_time,lag(samp_time,1) over (order by samp_time) lag_time
from
(
select distinct(sample_time) samp_time
from v$active_session_history
order by 1)) c
where a.sample_time=c.samp_time
and (a.session_id,a.session_serial#) not in (select session_id,session_serial# from v$active_session_history where sample_time=c.lag_time)
group by a.sample_time
order by 1;
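The same filter can also be written with NOT EXISTS instead of NOT IN; a sketch of that variant is below. It is functionally equivalent here (session_id and session_serial# are never NULL in ASH), but NOT EXISTS avoids the NULL pitfalls of NOT IN.
-- variant sketch: same logic as above, expressed with not exists
select a.sample_time
,count(1)
from v$active_session_history a
,(select samp_time,lag(samp_time,1) over (order by samp_time) lag_time
from
(
select distinct(sample_time) samp_time
from v$active_session_history
)) c
where a.sample_time=c.samp_time
and not exists (select 1
                from v$active_session_history b
                where b.sample_time=c.lag_time
                and b.session_id=a.session_id
                and b.session_serial#=a.session_serial#)
group by a.sample_time
order by 1;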
Sample records from v$active_session_history (sample_time, session_id, session_serial#)...
21-DEC-18 11.31.39.436 AM 136 188
21-DEC-18 11.31.40.436 AM 136 188 << repeating all the time
21-DEC-18 11.31.41.436 AM 12 3 << new 1
21-DEC-18 11.31.41.436 AM 21 5 << new 2
21-DEC-18 11.31.41.436 AM 128 1 << new 3
21-DEC-18 11.31.41.436 AM 136 188 << repeating
21-DEC-18 11.31.42.446 AM 136 188
21-DEC-18 11.31.43.446 AM 136 188
21-DEC-18 11.31.44.446 AM 136 188
21-DEC-18 11.31.45.446 AM 136 188
21-DEC-18 11.31.46.446 AM 136 188
21-DEC-18 11.31.47.446 AM 136 188
21-DEC-18 11.31.48.446 AM 136 188
21-DEC-18 11.31.49.446 AM 136 188
21-DEC-18 11.31.50.446 AM 136 188
21-DEC-18 11.31.51.446 AM 136 188
21-DEC-18 11.31.52.446 AM 136 188
21-DEC-18 11.31.53.456 AM 136 188
21-DEC-18 11.31.54.456 AM 136 188
21-DEC-18 11.31.55.456 AM 136 188
21-DEC-18 11.31.56.456 AM 136 188
21-DEC-18 11.31.57.456 AM 136 188
21-DEC-18 11.31.58.456 AM 136 188
21-DEC-18 11.31.59.456 AM 136 188
21-DEC-18 11.32.00.456 AM 136 188
21-DEC-18 11.32.01.466 AM 136 188
21-DEC-18 11.32.02.476 AM 136 188
21-DEC-18 11.32.03.476 AM 136 188
21-DEC-18 11.32.04.486 AM 136 188
21-DEC-18 11.32.05.486 AM 136 188
21-DEC-18 11.32.06.486 AM 136 188
21-DEC-18 11.32.07.486 AM 136 188
21-DEC-18 11.32.08.486 AM 136 188
21-DEC-18 11.32.09.486 AM 136 188
21-DEC-18 11.32.10.486 AM 136 188
21-DEC-18 11.32.11.496 AM 136 188
21-DEC-18 11.32.41.556 AM 12 3 << new 1
21-DEC-18 11.35.00.897 AM 14 20 << new 1
So ideally we should see a count of 3 for 21-DEC-18 11.31.41.436 AM, and a count of 1 each for 21-DEC-18 11.32.41.556 AM and 21-DEC-18 11.35.00.897 AM. The other samples contained only the repeating session, so they should be skipped.
SAMPLE_TIME COUNT(1)
--------------------------------------------------------------------------- ----------
21-DEC-18 10.47.14.345 AM 2
21-DEC-18 10.47.17.365 AM 2
..
21-DEC-18 11.30.27.236 AM 1
21-DEC-18 11.30.31.256 AM 1
21-DEC-18 11.30.39.266 AM 1
21-DEC-18 11.30.41.276 AM 1
21-DEC-18 11.30.42.276 AM 1
21-DEC-18 11.31.21.396 AM 1
21-DEC-18 11.31.41.436 AM 3 << 3 as expected
21-DEC-18 11.32.41.556 AM 1 << 1 as expected
21-DEC-18 11.35.00.897 AM 1 << 1 as expected
So this works as expected.
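One caveat worth remembering: ASH only samples active sessions, so a connection that logs in and stays idle may never appear in the samples. For sessions that are still connected, a simple cross-check against v$session.logon_time (sketch below, using an assumed one-hour window) is an easy comparison point.
-- cross-check sketch: sessions that logged on in the last hour, per v$session
select logon_time, sid, serial#, username
from v$session
where logon_time > sysdate - 1/24
order by logon_time;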
Thanks for paying a visit.
All the best.