
Efficient task property reference method wanted

Status
Not open for further replies.

popper (Programmer)
Dec 19, 2002
I have discovered painfully that references to task properties in the form of

ActiveProject.Tasks(ID).Name

and even

TS(ID).Name

after the required

Dim TS As Tasks
Set TS = ActiveProject.Tasks

are hugely expensive statements to use.

I have also learnt (see previous post) that the execution time for traversing a list of N tasks is proportional to N*N rather than the expected N. That is, if you double the number of tasks, you quadruple the execution time.

Put these two effects together, that is, use code that traverses a big task list AND uses these property references for each task and you get a blowout in time.

The two effects are summarised in the table below. The 'Ref' column gives the execution time (in seconds) for some menial code that traverses the entire list of tasks (25, then 50 in number) and, for each task, executes three lines containing a reference of the form 'ActiveProject.Tasks(ID).Name'. The 'No Ref' column relates to identical code except that the three reference statements are replaced by three ordinary assignment statements such as 'z = 1'.

N    Ref     No Ref
25   5.95    0.19
50   23.03   0.74


Compare the two columns to see how expensive reference statements are. Compare the two rows to see how the execution time behaves on doubling the number of tasks.

My question to anyone is: is there a more efficient way of referencing task (and resource assignment) properties than the method shown above, so that I can dramatically reduce the time taken to run my application?
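For illustration, this is the kind of restructuring I have in mind, although I don't know whether it actually helps (a sketch only; the task index 1 is arbitrary):

```vb
Sub CachedReferenceSketch()
    Dim tsks As Tasks
    Dim tsk As Task
    Dim strName As String
    Dim datStart As Date

    Set tsks = ActiveProject.Tasks

    ' Style I use now: every property read re-indexes the collection
    strName = tsks(1).Name
    datStart = tsks(1).Start

    ' Alternative I am asking about: index once, then read all
    ' properties from the cached object reference
    Set tsk = tsks(1)
    strName = tsk.Name
    datStart = tsk.Start
End Sub
```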

Help would be greatly appreciated.

Thank you.
 
I still have no idea what your code does (you only provided unrelated snippets). I took a project that has 1000 tasks and ran the following; the timings for each loop were trivial. While it ran, I also had a copy of Notes open, along with half a dozen instances of IE and one of Excel.

My hardware is a Dell Latitude D610 (many years old): 1.60 GHz, 500 MB RAM, XP SP2, Project Standard 2003.
Code:
Option Explicit
Sub pdqbach()
Dim tsk As Task
Dim tsks As Tasks

Dim var1 As Variant
Dim var2 As Variant
Dim var3 As Variant
Dim strTask As String

Dim lLoop As Long

Dim varTimer1 As Variant
Dim varTimer2 As Variant
Dim varTimer3 As Variant
Dim varTimer4 As Variant
Dim varTimer5 As Variant

Set tsks = ActiveProject.Tasks

'Loop 1000 times
varTimer1 = Timer
For lLoop = 1 To 1000
    strTask = ""
    var1 = var2 + lLoop
    var2 = var3 + lLoop
    var3 = lLoop
    'Debug.Print "1 " & lLoop
Next

'Loop 1000 times and access one MSPrj task
varTimer2 = Timer
For lLoop = 1 To 1000
    strTask = tsks(1).Name
    var1 = var2 + lLoop
    var2 = var3 + lLoop
    var3 = lLoop
    'Debug.Print "2 " & lLoop
Next

'Loop 1000 times; access specific task in MSPrj
varTimer3 = Timer
For lLoop = 1 To 1000
    strTask = tsks(lLoop).Name
    var1 = var2 + lLoop
    var2 = var3 + lLoop
    var3 = lLoop
    'Debug.Print "3 " & lLoop
Next

'Loop 1000 times; access each task in MSPrj
varTimer4 = Timer
For Each tsk In tsks
    strTask = tsk.Name
    var1 = var2 + lLoop
    var2 = var3 + lLoop
    var3 = lLoop
    'Debug.Print "4 " & tsk.Name
Next
varTimer5 = Timer

'Raw Timer values; the elapsed time for each loop is the
'difference between successive values
Debug.Print varTimer1
Debug.Print varTimer2
Debug.Print varTimer3
Debug.Print varTimer4
Debug.Print varTimer5

End Sub
 
Thanks for your reply.

I get similarly quick times when I run your code. I added some TimeScaleValue assignments, which are computationally heavy, and of course the times inflated significantly. But things remained linear: doubling the number of 1-day tasks only doubled the time, as it should.
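For the record, the additions I made were along these lines (a sketch from memory; the combination of timescaled work and daily buckets is just what I happened to use):

```vb
Sub TimescaleSketch()
    Dim tsk As Task
    Dim tsvs As TimeScaleValues
    Dim tsv As TimeScaleValue
    Dim dblWork As Double

    For Each tsk In ActiveProject.Tasks
        If Not tsk Is Nothing Then   ' blank rows appear as Nothing
            ' One TimeScaleValue per day between Start and Finish
            Set tsvs = tsk.TimeScaleData(tsk.Start, tsk.Finish, _
                pjTaskTimescaledWork, pjTimescaleDays)
            For Each tsv In tsvs
                ' Empty buckets return an empty string
                If tsv.Value <> "" Then dblWork = dblWork + CDbl(tsv.Value)
            Next tsv
        End If
    Next tsk
End Sub
```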


With reference to my earlier experiments:
I was unclear about the data. 25 and 50 referred to the number of phases (WBS parents), not tasks as stated; each phase had ten tasks beneath it. I ran the code that follows.

I find that if I double the number of phases, the execution time quadruples. I wonder if you do too. I would appreciate your trying this and explaining the effect.




Sub Macro16()

Dim a As Variant
Dim b As Variant
Dim i As Long
Dim j As Long
Dim namer As String

a = Timer()
For i = 1 To ActiveProject.Tasks.Count

    For j = 1 To 30
        namer = ActiveProject.Tasks(i).Name
        namer = ActiveProject.Tasks(i).Name
        namer = ActiveProject.Tasks(i).Name
        'z = 1   (the 'No Ref' variant replaces the three
        'z = 1    reference lines with these assignments)
        'z = 1
    Next j

Next i

b = Timer() - a
MsgBox ("time is: " & b)

End Sub
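It also occurs to me that the same workload could be written with For Each instead of indexing by i, in case the indexed lookup Tasks(i) is itself what grows with the task count (just a guess on my part):

```vb
Sub Macro16ForEach()
    Dim a As Variant
    Dim t As Task
    Dim j As Long
    Dim namer As String

    a = Timer()
    For Each t In ActiveProject.Tasks
        If Not t Is Nothing Then   ' blank rows appear as Nothing
            For j = 1 To 30
                ' Same three reference reads, but no Tasks(i) lookup
                namer = t.Name
                namer = t.Name
                namer = t.Name
            Next j
        End If
    Next t
    MsgBox "time is: " & (Timer() - a)
End Sub
```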
 
I didn't have the time to run your full test.

I changed the j loop from
For j = 1 To 30
to
For j = 1 To 10

1000 tasks - 64 seconds
900 tasks - 53 seconds
800 tasks - 41 seconds
700 tasks - 31 seconds
600 tasks - 23 seconds
500 tasks - 16 seconds
400 tasks - 10 seconds
300 tasks - 6 seconds
200 tasks - 2 seconds
100 tasks - <1 second

Giving this a little thought, I'm not at all surprised that it isn't a linear progression. My guess (I'm not an operating-systems guru, though I did take a number of CS courses at university) is that although "technically" the entire project file is read into memory and should therefore be immediately available, in reality it is read into virtual memory: the file appears to be fully in memory but is, in fact, partly in memory and partly in an OS swap file.

The more tasks that end up in the swap file, the more time must be spent retrieving them: there is both the computation needed to determine which page to retrieve and, most especially, the physical bandwidth consumed. It's one thing if the data is available from L1 or L2 cache; it's an entirely different thing if it has to come from the swap file.

The file structure for MSProject is not publicly documented and, given that it's probably a highly customized implementation of a sparse matrix, it wouldn't surprise me to find that there are numerous internal linked lists that need to be traversed. (Without seeing the documentation on the file structures, that's just guessing on my part.)

There is also (though I suspect this is generally trivial) the fact that the OS is never fully quiescent but is always performing various operations, including, since VBA is interpreted, garbage collection. The more frequently memory is used and released, the more often the OS has to perform housekeeping. I didn't take a compiler/interpreter-writing course, and the OS course I took was an overview, not the detailed investigation that the CS majors took.

Without greater technical awareness on my part, I can't offer you anything more than:

Yup, the increase in time isn't linear; gonna have to live with it.

As a potential workaround, you could try exporting appropriate MSPrj data to Excel and then running an equivalent macro there. It's possible that the VBA interpreter is optimized to handle Excel structures more efficiently than Project structures since Excel is used far, far more broadly than Project is used.
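A minimal sketch of that export idea, using late-bound Excel automation (untested; the columns and properties chosen are arbitrary):

```vb
Sub ExportTasksToExcel()
    Dim xl As Object      ' Excel.Application, late bound
    Dim wb As Object      ' Workbook
    Dim tsk As Task
    Dim lRow As Long

    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Add
    lRow = 1

    For Each tsk In ActiveProject.Tasks
        If Not tsk Is Nothing Then   ' blank rows appear as Nothing
            wb.Worksheets(1).Cells(lRow, 1).Value = tsk.Name
            wb.Worksheets(1).Cells(lRow, 2).Value = tsk.Duration
            lRow = lRow + 1
        End If
    Next tsk

    xl.Visible = True   ' hand the workbook over for further processing
End Sub
```

Once the data is in worksheet cells, an equivalent macro can loop over a Range (or a Variant array read from the Range) rather than over Project's Tasks collection.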
 