LastDance74
Programmer
We're a small development team within a large organisation developing web apps for SQL Server. We charge our clients for our development work at an hourly rate, but we're having difficulty working out a fair way of charging them for ongoing system support and maintenance. This cost would cover a share of all resources used, i.e. software, hardware and personnel.
At present we are supporting three large systems and a number of small-to-medium systems. User base size and system usage are not necessarily proportional to system size; for example, the smallest system has the largest number of users and transactions per month.
We were looking at working out a usage rate based on three factors - system size, number of users and transactions per month - and then using that rate to bill each client a share of our yearly running costs. This seems unfair, though, as it means that the fewer the systems utilising the resources, the more each one pays.
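To make the idea concrete, here is a minimal sketch of that kind of allocation: each system gets a "usage score" from the three weighted factors, and the yearly running cost is split in proportion to those scores. The weights, system names and figures are all invented for illustration, not taken from our actual setup.

```python
# Hypothetical sketch: split a fixed yearly cost across systems in
# proportion to a weighted usage score. Weights are assumptions.

def allocate_costs(systems, yearly_cost, weights=(0.4, 0.3, 0.3)):
    """systems: dict of name -> (size, users, transactions), where each
    factor is expressed as a fraction of the fleet-wide total."""
    w_size, w_users, w_tx = weights
    scores = {
        name: w_size * size + w_users * users + w_tx * tx
        for name, (size, users, tx) in systems.items()
    }
    total = sum(scores.values())
    # Each system pays its share of the whole yearly cost.
    return {name: yearly_cost * s / total for name, s in scores.items()}

# Made-up example: SystemC is the smallest but has the most users/traffic.
systems = {
    "SystemA": (0.50, 0.10, 0.15),
    "SystemB": (0.30, 0.30, 0.25),
    "SystemC": (0.20, 0.60, 0.60),
}
bills = allocate_costs(systems, yearly_cost=120_000)
```

Note that this sketch always allocates 100% of the running cost, which is exactly the unfairness described above: drop one system from the pool and the remaining ones simply absorb its share.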
How do other people achieve this? Is there a standard cost recovery formula or model we could follow? I'd love to hear other people's ideas.