I worked for a Fortune 200 company for some 15 years. (Not initially, but they did buy the company I worked for.) Little did I know that they had 3 divisions, all with separate retirement account systems. Ours was not the latest & greatest (VFP versus Java), but it was the one home office staff said had the fewest rejected transactions at end of day. Eventually came the day when someone had the bright idea to consolidate them all. Naturally, Java won. Once the transition was complete, I was informed I would be let go along with one other person in IT. The scuttlebutt was that the other was let go so I couldn't make an age complaint; I was decades older than everyone else.
(I later took a programming boot camp in C#/.NET, but while I excelled in class against other newbies, no one would hire me. So I'm on hiatus, sabbatical, whatever, during the pandemic. Advice to others... don't become indispensable to a legacy system, because you're no longer valuable once they move on.)
Anyway, I wanted to tell a story about a major issue I ran into midway through my term there. I was always told to stay focused; I was valued for my diligence, but only where they wanted it. I had noticed over time that retirement loans taken out by retirement account participants were sometimes repaid by a penny or two too much. It went against my grain that one repayment (principal and interest of their own money) sometimes resulted in two payment records, one for the expected amount and the other for the extra penny or two. No one was concerned about that except me. After all, they were paying themselves back. Just a frustrating niggle in the back of my head.
I was often tasked with working in that section of code, but I couldn't get a handle on just what the problem was. Then I generated a quick report on mismatched payments. It was not just a little too much repaid; sometimes it was underpaid too. That got me thinking: why did we tell participants one amount to pay when we were expecting a different amount?
This is illustrative of what I found, not every detail of course. All retirement monies were broken down into money types (such as traditional, Roth, etc.), transaction types (deposits, withdrawals, etc.), and funds (some plans had several dozen funds). Throw in that fund prices were always approximations: the home office sent us values to 3 decimal places, while the new system was capable of up to 6 decimals of precision, though I don't know if they really used it all.
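To make the fractional-penny problem concrete, here is a minimal sketch in Java (the system itself was VFP, and every number here is made up): a 3-decimal fund price multiplied by a share balance gives a value that can't be expressed exactly in whole cents, so some fraction of a penny has to be rounded away.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class FractionalPennies {
    public static void main(String[] args) {
        // Made-up numbers: a 3-decimal fund price as sent by the home office
        BigDecimal price  = new BigDecimal("12.347");
        BigDecimal shares = new BigDecimal("8.215");

        // The raw product carries more precision than a dollars-and-cents payment can hold
        BigDecimal raw   = price.multiply(shares);                 // 101.430605
        BigDecimal cents = raw.setScale(2, RoundingMode.HALF_UP);  // 101.43

        System.out.println("raw value:  " + raw);
        System.out.println("as payment: " + cents);
        // The leftover 0.000605 is a fraction of a penny that has to land
        // somewhere; where it lands depends on when and how you round.
    }
}
```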
Once I sorted it all out, I discovered we were using one order of calculations to tell the participant what to pay, then using the results of a different order of calculations to log what we expected. You can see where this is going... the separate processes were adding up the fractional pennies differently, and in a small percentage of the loans those fractions of pennies accumulated to differing totals.
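Here is a sketch, again in Java with invented per-fund amounts, of how two orders of calculation can disagree: rounding each slice to cents and then summing does not always equal summing first and rounding once.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PaymentMismatch {
    public static void main(String[] args) {
        // Made-up per-fund repayment slices, each carrying fractional cents
        BigDecimal[] slices = {
            new BigDecimal("33.335"),
            new BigDecimal("33.335"),
            new BigDecimal("33.334"),
        };

        // Path A (the quote to the participant): round each slice to cents, then sum
        BigDecimal quoted = BigDecimal.ZERO;
        for (BigDecimal s : slices) {
            quoted = quoted.add(s.setScale(2, RoundingMode.HALF_UP));
        }

        // Path B (the logged expectation): sum the raw slices, then round once
        BigDecimal expected = BigDecimal.ZERO;
        for (BigDecimal s : slices) {
            expected = expected.add(s);
        }
        expected = expected.setScale(2, RoundingMode.HALF_UP);

        System.out.println("quoted to participant:  " + quoted);   // 100.01
        System.out.println("expected by the system: " + expected); // 100.00
    }
}
```

On this made-up data the participant is told to pay $100.01 while the other path expects $100.00, which is exactly the kind of penny-or-two mismatch the report surfaced.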
IT hadn't discovered it. Nor had the Loans team. I presented my results, and the fix was reviewed and implemented. I viewed this as one of my hallmarks, though there was no fanfare. By the time we shut down the system, it was very finely tuned. Isn't that always how it is?
Oh, what about the existing loans? No one asked me to adjust the payment amounts. I was able to confirm that all the impacted loans were 5-year terms (the maximum allowed by law for personal loans). Hearing nothing more, those loans were allowed to mature and close in the following years.