There are two different ways I would approach this.
1. Create a computed column.
You could add a computed column to the table, index it, and then use the computed column in your query. Indexing the computed column persists the converted values, so the function no longer has to run for every row at query time. There is the added overhead of another index on the table, but with only a couple million rows, this should not be a problem.
This, of course, assumes that you have the appropriate permissions to add the computed column.
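As a sketch of the computed-column approach (the table name dbo.Addresses and column name Zip are assumptions here; substitute your actual names, and note the CASE expression mirrors the conversion logic in the function below):

```sql
-- Hypothetical table/column names. PERSISTED stores the computed values
-- in the table, and the index makes them seekable.
Alter Table dbo.Addresses
    Add ZipInt As Case When Zip = 0 Then NULL
                       When Zip > 99999.0 Then Convert(Int, Zip / 10000)
                       Else Convert(Int, Zip)
                  End Persisted;

Create Index IX_Addresses_ZipInt On dbo.Addresses(ZipInt);
```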
2. Simplify the function.
When optimizing a function, it's important to understand where the time actually goes. The slowest thing a function can do is access data stored in a table. You're not doing that, which is good. The next thing to realize is that the more code the function contains, the slower it will run. A function that operates only on its input parameters and immediately returns a single expression will run faster than an equivalent function that uses multiple statements to accomplish the same thing. For example:
Alter Function dbo.gmmastros_ConvertZip(@Zip Float)
Returns Int
As
Begin
    Return Case When @Zip = 0 Then NULL
                When @Zip > 99999.0 Then Convert(Int, @Zip / 10000)
                Else Convert(Int, @Zip)
           End
End
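A quick sanity check of the three branches (sample values are my own; Convert from float to int truncates, so a nine-digit Zip+4 value divided by 10000 drops the +4 extension):

```sql
Select dbo.gmmastros_ConvertZip(0),            -- NULL (no zip on file)
       dbo.gmmastros_ConvertZip(606011234.0),  -- 60601 (Zip+4 stored as one number)
       dbo.gmmastros_ConvertZip(60601.0);      -- 60601 (plain 5-digit zip)
```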
Optimizing the function will cause the query to run faster, but I wouldn't expect any miracles here. The computed column really is the better way to go, if you can.
Microsoft SQL Server MVP
"The great things about standards is that there are so many to choose from." - Fortune Cookie Wisdom