Hi,
I have a stats table where I am getting 1-minute data from hundreds of servers, each of which runs several processes. The stats themselves are various counters that give me a rate of updates per minute.
The table I created used the following SQL:
CREATE TABLE STAT_RATE
( "SYSTEM"  VARCHAR2(30 BYTE),
  "PROCESS" VARCHAR2(30 BYTE),
  "DAY"     DATE,
  "TIME"    VARCHAR2(8 BYTE),
  "STAT"    VARCHAR2(50 BYTE),
  "VALUE"   NUMBER
);
However, I am getting around 3 million records per day and I am trying to keep 30 days of data.
The table is now huge and queries are slow despite the following indexes:
1. unique index on system, process, day, time, stat
2. non-unique index on day
3. non-unique index on system and process
4. non-unique index on stat
I want to improve the performance of this table and was thinking of using partitioning, which is new to me.
Can I just partition by day and use the same tablespace?
Or is there a better way to partition?
Or is there a better way to improve performance?
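For reference, here is roughly what I had in mind for partitioning by day (Oracle interval partitioning; the boundary date is just an illustration and I have not tested this):

```sql
-- Recreate STAT_RATE as a range-partitioned table on DAY.
-- INTERVAL (NUMTODSINTERVAL(1,'DAY')) should make Oracle create one
-- partition per day automatically as data arrives; old days could
-- then be dropped with ALTER TABLE ... DROP PARTITION instead of DELETE.
CREATE TABLE STAT_RATE
( "SYSTEM"  VARCHAR2(30 BYTE),
  "PROCESS" VARCHAR2(30 BYTE),
  "DAY"     DATE,
  "TIME"    VARCHAR2(8 BYTE),
  "STAT"    VARCHAR2(50 BYTE),
  "VALUE"   NUMBER
)
PARTITION BY RANGE ("DAY")
INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
( PARTITION p_initial VALUES LESS THAN (DATE '2024-01-01') );

-- Purging a day's data would then be a metadata operation, e.g.:
-- ALTER TABLE STAT_RATE DROP PARTITION FOR (DATE '2024-01-15');
```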
Thanks for your help,
Tom