Hello,
I have a really odd performance problem with MSSQL. My staff is load testing our SQL Server as part of a disaster recovery effort.
We are using the Northwind database as a test, and our script inserts a random number of rows on each run.
The problem is that the time to insert the rows increases roughly linearly as the script runs, regardless of how many rows a given run inserts.
For example:
Run 1: Time to insert 400 rows: 60 ms
Run 2: Time to insert 800 rows: 100 ms
Run 3: Time to insert 200 rows: 120 ms
.
.
.
Run N-1: Time to insert 10 rows: 3230 ms
Run N: Time to insert 1000 rows: 3403 ms
Has anyone seen a situation like this, or can anyone shed any light on why MSSQL would behave this way?
If you need clarification of the problem, just ask. The procedure is simple enough that I can post it here.
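In the meantime, it looks roughly like this (a sketch only; the Orders columns and values below are Northwind placeholders, not the exact statement we run):

DECLARE @rows INT
DECLARE @i INT
DECLARE @start DATETIME

-- pick a random row count for this run (1 to 1000)
SET @rows = CAST(RAND() * 1000 AS INT) + 1
SET @i = 0
SET @start = GETDATE()

WHILE @i < @rows
BEGIN
    -- OrderID is an identity column, so it is not listed
    INSERT INTO dbo.Orders (CustomerID, EmployeeID, OrderDate)
    VALUES ('ALFKI', 1, GETDATE())
    SET @i = @i + 1
END

-- elapsed time for this run, in milliseconds
SELECT DATEDIFF(ms, @start, GETDATE()) AS ElapsedMs, @rows AS RowsInserted

Each run records its elapsed time the same way, which is how we got the numbers above.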
Best,
Diebels