Greetings... I want to be able to archive my CMS data for historical reporting, etc. I am talking about interval data for splits and VDNs, etc. Has anyone set up a solution to do this?
If you access CMS through the CMS terminal, select Data Storage Allocation from the System Setup menu.
From CentreVu Supervisor, choose Tools --> System Setup -->
then Data Storage Allocation. To be honest, if you're running a Sun SPARC server it should already have been set up.
It is set up to maintain data for a certain number of days, but 31 days isn't good enough for me. I need to maintain all hourly, daily, weekly, and monthly skill, agent, and VDN data indefinitely.
The CMS can store interval data for up to 62 days. You can manage this via the Data Storage Allocation table. There is a newer product called Operational Analyst that will store the CMS summary interval data from all tables for several years. It is open-standards compliant and will also store the ECH data feed. It supports up to 240 ACDs on a single machine and lets you use Oracle or a SQL database.
1. ExplorerII (Centerpoint) uses an SQL server to retain information.
2. A less expensive solution is the ODBC driver for CMS. This allows connectivity to the historical data. It would require you to develop a database to migrate the data into.
3. Flat file, ftp, import. Not clean but functional.
4. Depending upon the platform, the ACD data collected, and hard drive space, the data allocation tables can be modified to retain data for a longer period. Most installs get the defaults even when disk space is available. The new Sun Blade 100 platform comes complete with a 13 GB hard drive. Of course, if you are on an older SPARCstation or Ultra you may have very limited space.
James Middleton
ACSCI/ACSCD/MCSE
Xeta Technologies
jim.middleton@xeta.com
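For option 3 above, the flat-file route is mostly plumbing on the receiving end. A minimal sketch of the import side, in Python: the delimiter and the column list here are assumptions (match them to whatever your CMS export actually produces), and the FTP transfer itself is left out.

```python
import csv
import io

# Assumed column layout for a pipe-delimited CMS split-interval export.
# Adjust COLUMNS and the delimiter to match your actual export format.
COLUMNS = ["row_date", "starttime", "split", "calls_offered", "acd_calls"]

def parse_export(text, delimiter="|"):
    """Parse a delimited CMS flat-file export into a list of dicts
    keyed by COLUMNS, skipping blank lines."""
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    return [dict(zip(COLUMNS, row)) for row in reader if row]
```

From there the rows can be bulk-inserted into whatever database you keep on the far side of the FTP drop.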
We chose to do a simple ETL each half hour to an Oracle DB via the ODBC link. Split and VDN data doesn't grow too fast, but you may find that HAGENT (interval agent data) may grow quickly as it creates a record for each individual skill that an agent is logged into in the interval.
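The scheduling logic for a pull like that is simple but easy to get wrong at the midnight boundary. A sketch of the testable part, assuming starttime is stored as an HHMM integer (930, 1000, ...) as in the CMS interval tables; the hsplit table and column names follow the CMS historical schema, but verify them against your release, and the qmark placeholders assume a pyodbc-style driver:

```python
from datetime import datetime, timedelta

def interval_window(now):
    """Return (row_date, starttime) for the most recently completed
    half-hour interval, e.g. at 10:05 pull the 09:30 interval.
    starttime is an HHMM integer, matching the CMS convention."""
    prev = now - timedelta(minutes=30)
    start = prev.replace(minute=0 if prev.minute < 30 else 30,
                         second=0, microsecond=0)
    return start.date().isoformat(), start.hour * 100 + start.minute

def build_pull_sql():
    """Parameterized select against the split interval table, so the
    same statement is reused on every half-hourly run."""
    return ("SELECT row_date, starttime, acd, split, callsoffered, "
            "acdcalls, abncalls FROM hsplit "
            "WHERE row_date = ? AND starttime = ?")
```

Running this a few minutes past the half hour gives the interval rows time to post before the extract fires.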
We found the HAGLOG table (agent login and logout) data to be useful, especially if you have a lot of fluidity in agent skill profiles.
You may also want to export the SYNONYM table and timestamp the changes. This is helpful for backing out recent Dictionary changes and getting back to what the CMS names, etc., looked like at any point in the past.
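Timestamping the synonym changes amounts to diffing today's snapshot against yesterday's and writing the differences to an audit table. A hedged sketch (the `{(item_type, value): name}` snapshot shape is my own convention, not a CMS format):

```python
from datetime import datetime, timezone

def diff_synonyms(old, new, ts=None):
    """Compare two Dictionary synonym snapshots, each a dict of
    {(item_type, value): name}, and emit timestamped change rows
    (ts, item_type, value, old_name, new_name) for an audit table."""
    ts = ts or datetime.now(timezone.utc).isoformat()
    changes = []
    for key in new.keys() - old.keys():          # synonym added
        changes.append((ts, *key, None, new[key]))
    for key in old.keys() - new.keys():          # synonym removed
        changes.append((ts, *key, old[key], None))
    for key in old.keys() & new.keys():          # synonym renamed
        if old[key] != new[key]:
            changes.append((ts, *key, old[key], new[key]))
    return changes
```

Replaying the audit rows in reverse gets you back to what the names looked like on any given date.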
Our biggest challenge was keeping the individual call segment history. This grows really fast, and there was a great deal of complexity in reconciling and validating the ECH records against the interval data rollups.
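The core of that reconciliation is rolling the ECH segments up to the same half-hour buckets the interval tables use and flagging disagreements. A simplified sketch, assuming each segment row carries a date, an HHMM start time, and a VDN (the field names here are mine, not the ECH layout):

```python
from collections import Counter

def bucket_segments(segments):
    """Roll ECH segments up to (date, interval, vdn) call counts,
    bucketing start times to the half hour like the interval tables.
    Each segment is a dict with 'date', 'start_hhmm', and 'vdn'."""
    buckets = Counter()
    for s in segments:
        hh, mm = divmod(s["start_hhmm"], 100)
        interval = hh * 100 + (0 if mm < 30 else 30)
        buckets[(s["date"], interval, s["vdn"])] += 1
    return buckets

def reconcile(ech_counts, interval_counts):
    """Return {key: (ech_count, interval_count)} wherever the two
    disagree. A starting point only -- calls spanning an interval
    boundary can legitimately land in adjacent buckets."""
    keys = ech_counts.keys() | interval_counts.keys()
    return {k: (ech_counts.get(k, 0), interval_counts.get(k, 0))
            for k in keys
            if ech_counts.get(k, 0) != interval_counts.get(k, 0)}
```

In practice we found most "mismatches" were boundary effects, so treat the output as a worklist to investigate rather than proof of data loss.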