CentreVu Open Database Connectivity (ODBC)

Jul 1, 2002

Has anyone used this feature?

We are looking to warehouse CMS data, utilising the ODBC option. This would be done during a low-use period, i.e. overnight.

Sabio, a reseller, is trying to dissuade us from using this; they say it is not stable to do it this way. They are proposing to do it themselves, at a huge cost, to avoid these problems.

Has anyone hit any problems using the ODBC option? Are they just trying to fleece us?

Any other information you have about this would be a great help.

Nick
 
ODBC is a hog by design, but it should work OK, depending on what you are trying to accomplish.
Overnight should not be a problem, but keep in mind when your system runs its batches.
If you want an effective, lean data transfer from your system to another host, you can also use the CMSQL/ISQL interface.
This is a command-line tool that is already included with CMS, which you can use to run SQL queries against the Informix data.
I'll give you an example:
/cms/toolsbin/cmsql -q < /export/home/cmssvc/cmstbls.sql > /export/home/cmssvc/cmstbls.txt
This will run the query file cmstbls.sql and put the result in cmstbls.txt.
The cmstbls.sql file, which you need to create, should contain this string:
select * from cmstbls
This, of course, could be any table you like; I just used cmstbls as an example, but it could also be d_split or any other.
You can also play around with the SQL strings if you only want data for one date range, or want to do make-up jobs, for example.
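For instance (just a rough, untested sketch), if you only wanted one month's worth of daily split data, the .sql file could contain something like the following, assuming the d_split table and its row_date column; check the date format against your Informix settings:

select * from d_split where row_date between '06/01/2002' and '06/30/2002'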

You can put several of these into a shell script, schedule it with cron, and if needed FTP the results out to the host that collects the data from the same shell script.
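As a very rough, untested sketch (the paths, table names, host name and login below are just placeholders you would need to change), the shell script could look something like this:

#!/bin/sh
# Export a couple of CMS tables with cmsql, then FTP the results
# to the host that collects the data. All names are placeholders.
OUT=/export/home/cmssvc

/cms/toolsbin/cmsql -q < $OUT/cmstbls.sql > $OUT/cmstbls.txt
/cms/toolsbin/cmsql -q < $OUT/d_split.sql > $OUT/d_split.txt

# Scripted FTP to the collecting host (placeholder host and account)
ftp -n warehousehost <<EOF
user cmsexport yourpassword
cd /data/cms
put $OUT/cmstbls.txt cmstbls.txt
put $OUT/d_split.txt d_split.txt
bye
EOF

You would then schedule it with a crontab entry, for example to run at 2 am:

0 2 * * * /export/home/cmssvc/cms_export.sh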

It takes a little more work to plan, but it's free, you only have to set it up once, and you never have to touch it again unless you need to change something.
For our organisation it has proven to be more reliable as well.

Rob



 
Nick,
Sabio are feeding you a line there; the ODBC driver is okay for small repetitive functions, but not really advisable for exporting, say, the whole database, although if you are doing this in downtime then you shouldn't have a problem. If you're a 24/7 centre then you may find that trying to dump the DB will crash the CMS server.

There are loads of other options that may prove more stable for your needs; what exactly is it that you are trying to accomplish? We found that whilst we wanted to export everything from the Informix db, we were never going to use about 90% of what we had collected. By rationalising the data we needed, wanted and would never use, we managed to design very basic reports that gave us the info we needed and export it into bog-standard txt files, which we could open in Access (or Excel) and write reports integrating our sales and customer service information.
 
Sorry,
I meant to start that with: Sabio are NOT feeding you a line!!!! (Apologies to all at Sabio!)
 
Thanks for that PD.

We will be choosing our fields from within the tables, at this stage probably only from the Historical Interval tables for Split, Agent, Vector and VDN, and maybe CWC.

It will basically be a daily dump at a quiet time into a warehouse, where we will be building a new db with other data streams.

 
We use it with Microsoft Access for all of our historical reporting and to get around the 5000 call record limitation of the call record table (we export all call records twice a day). Very happy!
 
Nick,
In that case, why don't you experiment with creating a script from CMS to dump the data into your DRS? Scripting is dead easy with CMS: you simply click Script and select the options you require (i.e. straight text file or maybe HTML), then just run the script file using MS Scheduler. This is how I do it, and it negates the need to mess around with the ODBC driver, which is, to all intents and purposes, a bloody ugly app!!!
 
BHodgins,
I was wondering how much strain this puts on your CMS server. How long does it take to get the 5000 records, and does the CPU utilization max out during this period? Also, what type of system is your CMS running on? Thanks in advance for any input on this, as we are looking into this as an alternative to using CentreVu Explorer (which we currently have). CVExpII is nice, but it is useless if you want to export data from its 40+ tables... it's too time-consuming to try to understand them to get the data you want.
 
Hi guys,
It is very interesting to know someone is doing exactly the same as what I am trying to do. The difference is that, under my company's security policy, I do not have any rights to access the CMS server. So to persuade the big boss to allow the data transfer to happen, I have to ask other people to do it for me, and I need to be absolutely certain of every single step of what I am trying to do on the server. Therefore, I need your expertise to show me exactly what needs to be done.
What I am trying to do is to export the Aux code data, talk time, wrap time etc. (a couple of fields) from the CMS database to a SQL 7 database on a nightly basis.
Can anyone help me, please? Thank you in advance.

Daniel
 
Danp129,

It takes about 30 seconds to export 5000 records from the call_rec table to a local Access Table. I've never looked at the CPU utilization during the export, but have not heard any complaints.

I'm using the standard Solaris CMS server.

If you're concerned about the load on the CPU, you could always do a scheduled SQL Server job to run on the hour to pull the records for the previous hour. For example,

At 12:05 am, pull all records where row_date = yesterday and row_time >= 2300

This will catch all call recs from the previous night, from 11pm on.

Then at 1:05 am, pull all records where row_date = today and row_time >=0 and < 100

Then at 2:05 am, pull all records where row_date = today and row_time >=100 and < 200

and so on for all the other hours.

Row_time is an integer; the rightmost two digits are the minutes, and everything to the left is the hour, in 24-hour format.

0 = 12 am
100 = 1 am
200 = 2 am
300 = 3 am
...
1200 = 12 pm
1300 = 1 pm
...
2300 = 11 pm
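
So, as a rough sketch, the 1:05 am pull could be a pass-through query along these lines (call_rec and the row_date/row_time columns as above; TODAY is Informix syntax, so substitute whatever your setup uses for the current date):

select * from call_rec
where row_date = TODAY
and row_time >= 100 and row_time < 200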

Hope this helps.
 