
Issue adding/setting millions of records using web services


oscript

Programmer
Mar 25, 2003
63
GB
Just wondering if anyone else has a nice 'fix' for this general issue?

I have the unenviable job of adding tens of millions of docs to Livelink (lots of small ones!), but I notice in testing that each one produces 10 or so audit entries and perhaps as many category audit rows again. Obviously this affects performance.

I know that I can switch off auditing at the server level (?func=admin.auditopts), but of course that means genuine activity in production won't be tracked either! This is clearly not ideal.

Is there a way of suppressing the writing of audit entries on an individual front end, for example, or some method in web services that I can use (and haven't found) to suppress the unnecessary audit rows?
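For scale, this is roughly how I've been checking the overhead in a test instance (a quick sketch, assuming direct read access to the Livelink schema and that DAUDITNEW carries a DataID column; the pyodbc DSN and the DataIDs below are just placeholders):

import pyodbc

conn = pyodbc.connect("DSN=livelink_db")   # placeholder DSN
cur = conn.cursor()

# DataIDs of a few freshly imported test documents (placeholders)
test_dataids = [123456, 123457, 123458]

for dataid in test_dataids:
    # count the audit rows written for each test document
    cur.execute("SELECT COUNT(*) FROM DAUDITNEW WHERE DataID = ?", dataid)
    print(dataid, "audit rows:", cur.fetchone()[0])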

Any help appreciated,
 
Not unless you change code using OScript on the front ends. Why is the audit not needed? Could you delete the rows with a DB trigger as soon as they are inserted, using some DB logic, if this is just bulk-load data?

Well, if I called the wrong number, why did you answer the phone?
James Thurber, New Yorker cartoon caption, June 5, 1937
Certified OT Developer, Livelink ECM Champion 2008, Livelink ECM Champion 2010
 
Good points.......

I realise that an OScript change was likely, but it's obviously not the path of least resistance (need to make the change, test it and get it past the administration team, etc.), so I was looking for anything else that others might have done.

The issue is that, with so many records and already-large DAUDIT and DAUDITNEW tables, I was hoping to avoid further unnecessary 'pollution', which in this case (old archived records that will never be changed) serves no purpose.

The 'remove it' option (a DB trigger, for example) is another possibility, but it seems back to front and doesn't help with shaving important milliseconds off the time taken to import each record.
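In case it helps, this is the sort of harness I'm using to put a number on those milliseconds, so a run with auditing on can be compared with one with it off in test. Just a sketch: import_one_document() is a placeholder for whatever web-service call actually does the import, and the file names are made up.

import time

def import_one_document(path):
    """Placeholder: call the Livelink web-service import for one file here."""
    ...

def avg_ms_per_record(paths):
    # time a small sample and return the average per-record cost
    start = time.perf_counter()
    for p in paths:
        import_one_document(p)
    elapsed = time.perf_counter() - start
    return (elapsed / len(paths)) * 1000.0   # milliseconds per record

sample = ["doc0001.pdf", "doc0002.pdf", "doc0003.pdf"]   # made-up file names
print("avg ms per record:", avg_ms_per_record(sample))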

This can't be a new issue, so I wondered if anyone had any other ideas?

Steven
 
If the DB trigger can be implemented in a smart way, it would serve your purpose. What I had in mind was that you would know which Livelink servers are your "polluters", so you would need some mechanism to tell whether the rows being inserted are coming from those 'polluting' front ends.

For an OScripter this is a fairly trivial task, as they can write an "override". You would install the module with the overridden audit feature on the polluting front ends, so you know pretty much that the call is coming from your polluters, and you would simply "kill" the call that writes the audit entry.
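If you don't want to go near OScript, a cruder variant of the same idea is a cleanup pass run after each import batch instead of a trigger. A minimal sketch, assuming the import runs under its own Livelink user and that DAUDITNEW records that user in a PerformerID column (verify both against your schema; the DSN and user ID are placeholders):

import pyodbc

BULK_LOAD_USER_ID = 98765        # placeholder: user ID of the import account
BATCH = 10000                    # delete in chunks to keep transactions small

conn = pyodbc.connect("DSN=livelink_db")   # placeholder DSN
cur = conn.cursor()

while True:
    # SQL Server style TOP shown; on Oracle use ROWNUM / FETCH FIRST instead.
    # Repeat for DAUDIT if your rows end up there as well.
    cur.execute(
        "DELETE TOP (?) FROM DAUDITNEW WHERE PerformerID = ?",
        BATCH, BULK_LOAD_USER_ID)
    conn.commit()
    if cur.rowcount < BATCH:
        break

Deleting in small chunks keeps each transaction short so the cleanup doesn't block the importer or blow out the transaction log.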

Frankly, I don't know what you are trying to do. Are these small documents just test stuff, or are they important? In my mind nobody uses a production system for testing, and there should be ways to prevent rapid data growth, clean up, etc. Anyway, if this were a system of record in a regulated industry, the government would not look favourably on you mucking with it.

I would have done it with OScript, because that is what best fits this purpose.

Well, if I called the wrong number, why did you answer the phone?
James Thurber, New Yorker cartoon caption, June 5, 1937
Certified OT Developer, Livelink ECM Champion 2008, Livelink ECM Champion 2010
 
BTW, every big company out there has ways of keeping DAUDIT and DAUDITNEW bloat under control. If you are unsure how, ask OT for suggestions; they should have some for the type of DB in your org.

Well, if I called the wrong number, why did you answer the phone?
James Thurber, New Yorker cartoon caption, June 5, 1937
Certified OT Developer, Livelink ECM Champion 2008, Livelink ECM Champion 2010
 
