
CMS Data Archiving

Status
Not open for further replies.

travis13 (MIS)
Oct 3, 2002
Greetings... I want to be able to archive my historical CMS data for long-term reporting, etc. I am talking about interval data for splits and VDNs, etc. Has anyone set up a solution to do this?

Thanks for your input!

t
 
Has the CMS been installed?

If you access CMS through the CMS terminal, select Data Storage Allocation from the System Setup menu.

From CentreVu Supervisor, choose Tools --> System Setup --> Data Storage Allocation. To be honest, if you're running a Sun SPARC server it should already have been set up.

jay
 
It is set up to maintain data for a certain number of days - but 31 days isn't good enough for me. I need to maintain all hourly, daily, weekly, and monthly skill, agent, and VDN data indefinitely.
 
You can export your data to a text or Excel file from CentreVu. If you know Microsoft Access, you can store all your data in a database.
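As a minimal sketch of that export-then-import approach (using Python and a local SQLite database in place of Access; the column headings here are hypothetical and depend on how you lay out the CentreVu report export):

```python
import csv
import sqlite3

def load_export(csv_path, db_path="cms_history.db"):
    """Load a CentreVu report export (saved as CSV) into a local
    SQLite table. Column names assume a simple split-interval report
    exported with Date/Time/Split/ACD Calls headings."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS split_intervals (
               row_date TEXT, starttime TEXT, split TEXT, acdcalls INTEGER)"""
    )
    with open(csv_path, newline="") as f:
        rows = [(r["Date"], r["Time"], r["Split"], int(r["ACD Calls"]))
                for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO split_intervals VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    conn.close()
    return len(rows)
```

The same loop works against Access via ODBC if you prefer to stay with that; SQLite just keeps the sketch self-contained.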
 
The CMS can store interval data for 62 days. You can manage this via the data storage allocation table. There is a new product called Operational Analyst that will store the CMS summary interval data from all tables for several years. It is open-standards compliant and will also store the ECH data feed. It will support up to 240 ACDs on a single machine and allow you to use an Oracle or SQL database.
 
There are a variety of solutions available.

1. ExplorerII (Centerpoint) uses an SQL server to retain information.

2. A less expensive solution is the ODBC driver for CMS. This allows connectivity to historical data. This would require you to develop a database to migrate the data into.

3. Flat file, ftp, import. Not clean but functional.

4. Depending upon the platform, ACD data collected, and hard drive space, the data allocation tables can be modified to retain data for a longer period. Most installs usually get the defaults even if you have available disk space. The new Sun Blade 100 platform comes complete with a 13 GB hard drive. Of course, if you are on the older Sun SPARC or Ultra you may have very limited space.

James Middleton
ACSCI/ACSCD/MCSE
Xeta Technologies
jim.middleton@xeta.com
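Option 2 above (the CMS ODBC driver) can be sketched like this. The query builder below is a plain function; the actual pull over ODBC is shown in comments using pyodbc. The DSN name "cms" is an assumption, and hsplit/row_date/starttime follow the CMS historical-table naming in the database documentation:

```python
def interval_query(table: str, row_date: str) -> str:
    """Build a one-day pull from a CMS historical table (e.g. hsplit,
    the split interval table). row_date/starttime are the interval
    key columns in the CMS schema."""
    return (f"SELECT * FROM {table} "
            f"WHERE row_date = '{row_date}' "
            f"ORDER BY starttime")

# With the CMS ODBC driver installed and a DSN configured (the name
# "cms" here is an assumption), the pull itself would look like:
#
#   import pyodbc
#   conn = pyodbc.connect(dsn="cms")
#   rows = conn.cursor().execute(
#       interval_query("hsplit", "2002-10-03")).fetchall()
```

You would then insert the fetched rows into whatever database you built on the receiving side.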
 
We chose to do a simple ETL each half hour to an Oracle DB via the ODBC link. Split and VDN data doesn't grow too fast, but you may find that HAGENT (interval agent data) may grow quickly as it creates a record for each individual skill that an agent is logged into in the interval.
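A half-hourly ETL like the one described above needs to know which intervals it has already copied. One common way to do that (a sketch under assumed names, not the poster's actual code) is a high-water-mark table on the receiving side:

```python
import sqlite3

def next_window(conn, table="hsplit", interval_minutes=30):
    """Return the start (in minutes) of the next interval to extract,
    based on a persisted high-water mark per source table. Starts at 0
    if the table has never been loaded."""
    conn.execute("""CREATE TABLE IF NOT EXISTS etl_watermark
                    (src TEXT PRIMARY KEY, last_start INTEGER)""")
    row = conn.execute("SELECT last_start FROM etl_watermark WHERE src = ?",
                       (table,)).fetchone()
    return (row[0] + interval_minutes) if row else 0

def advance(conn, table, start):
    """Record that the interval beginning at `start` loaded successfully."""
    conn.execute("INSERT OR REPLACE INTO etl_watermark (src, last_start) "
                 "VALUES (?, ?)", (table, start))
    conn.commit()
```

Each run pulls the window returned by next_window() over the ODBC link and calls advance() only after the insert commits, so a failed run simply retries the same interval.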

We found the HAGLOG table (agent login and logout) data to be useful, especially if you have much fluidity in your agent skill profiles.

You may also want to export the SYNONYM table and timestamp the changes. This is helpful for backing out recent Dictionary changes and reconstructing what the CMS names, etc. looked like at any point in the past.
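Once you have timestamped SYNONYM exports, comparing two snapshots tells you exactly what was renamed between them. A small sketch (the {entity_id: name} shape is an assumption about how you store the export):

```python
def diff_synonyms(old: dict, new: dict) -> dict:
    """Compare two synonym snapshots, each a mapping of entity id
    (split/VDN/agent id) to its Dictionary name, and report what
    was added, removed, or renamed between the two exports."""
    added = {k: new[k] for k in new.keys() - old.keys()}
    removed = {k: old[k] for k in old.keys() - new.keys()}
    renamed = {k: (old[k], new[k])
               for k in old.keys() & new.keys()
               if old[k] != new[k]}
    return {"added": added, "removed": removed, "renamed": renamed}
```

Running this between consecutive snapshots gives you a change log, which is what makes backing out a Dictionary change practical.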

Our biggest challenge was keeping the individual call segment history. This grows really fast and there was a great deal of complexity in reconciling and validating the ECH records to the interval data rollups.
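The reconciliation described above boils down to bucketing each ECH call segment's timestamp into the same half-hour interval the CMS rollups use and comparing counts. A toy sketch of that idea (not CMS's actual rollup logic, which handles many more edge cases):

```python
from collections import Counter

def bucket(epoch_seconds: int, interval: int = 1800) -> int:
    """Floor a segment timestamp to the start of its half-hour interval."""
    return epoch_seconds - (epoch_seconds % interval)

def reconcile(segments, interval_counts):
    """Count ECH segments per interval and compare against the CMS
    interval rollup counts. Returns {interval_start: (ech, rollup)}
    for every interval where the two disagree."""
    ech = Counter(bucket(t) for t in segments)
    return {i: (ech.get(i, 0), n)
            for i, n in interval_counts.items()
            if ech.get(i, 0) != n}
```

In practice the disagreements usually come from segments that span an interval boundary, which is exactly where the validation complexity mentioned above lives.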
 