One of the Payroll Specialists called us today, wondering if her session in Darwin was hung and needed to be killed. Darwin, as I may have mentioned before, is a Microsoft Great Plains Dynamics product customized for the Professional Employer Organization (PEO) industry, with SQL Server as the back end. The IT Manager asked me to put a trace on her session in SQL Profiler so that we could ascertain whether her session was indeed hung. As I watched the actions scrolling on the screen, I called the Payroll Specialist and told her that whatever she was doing was still processing.
I began examining the items in the trace to see what was going on. For those who are unfamiliar, GP Dynamics hasn't named its tables, views, and stored procedures in a very intuitive way. Nevertheless, I have been able to deduce the base tables involved from the stored procedure naming convention.
In addition, this exercise has helped me learn and understand SQL programming better. I usually open the procedure as a script in a Create New SQL window. Inside the procedures being executed in the trace, I saw nothing more than a series of SELECT statements. It looks as though Darwin (or really the code behind it) is retrieving information from the table in groups of 25 rows at a time. This particular recordset has close to 5000 rows, so getting 25 rows at a time means the stored procedure will execute 200 times. If each batch takes 5 seconds to execute, it will take about 17 minutes to go through the entire recordset. It's no wonder the process seems hung to the user.
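The arithmetic behind that estimate is quick to sketch; the 25-row batch size and 5-second batch time come straight from what I saw in the trace, so the figures are rough:

```python
import math

ROWS = 5000            # approximate size of the recordset
BATCH_SIZE = 25        # rows fetched per stored-procedure call
SECONDS_PER_BATCH = 5  # rough timing observed in the trace

batches = math.ceil(ROWS / BATCH_SIZE)            # 200 calls
total_minutes = batches * SECONDS_PER_BATCH / 60  # about 17 minutes

print(f"{batches} batches, about {total_minutes:.0f} minutes")
```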
I'm guessing that something is going on behind the scenes as Darwin prepares and posts a payroll. Unfortunately, I don't have access to the code behind it. This may be something we'll want to bring up with Thinkware, the vendor who provides this software to us.
-----
Check out my other blogs:
Daniel Johnson, Jr.
Get That Job!
Journey Inside My Mind Blog
Journey Inside My Mind Podcast
QuotesBlog
Twitter.com/danieljohnsonjr
Connect on LinkedIn
Interesting Things I've Read
Related tags: daniel+johnson+jr microsoft+great+plains sql+server sql+profiler peo
Welcome to my online portfolio, the complement/substitute for my resume. The opinions included herein are my own and do not reflect those of any client or employer, past or present. Please check out the new site: http://danieljohnsonjr.com
Tuesday, June 10, 2008
Troubleshooting with SQL Profiler
Posted by Daniel at 3:51 PM | 0 comments
Monday, February 18, 2008
Finding Tables for Employee Census Report
This is one of the posts where I describe the more technical aspects of a project I've been working on. The contents here are meant mainly to trigger my own memory when needed in the future.
The HR department has requested a census report so that the company can shop around for benefits with other vendors. In typical fashion, the information needs to be laid out in such a way that none of the "canned" reports from the business system will suffice.
I know where to get the demographic information (employee ID, client ID, city, state, zip, salary, etc.), but this is the first time I've had to track down information such as premium rates, employer contribution, employee contribution, and coverage tier.
I sat with the HR director to find out where she's able to see this information. Back at my desk, I launched a SQL Server Profiler Trace, filtering to my login ID and session ID. Then I navigated to those same screens to help identify which objects in SQL Server I need to look at.
From there I've been able to target the specific tables I need for my SQL scripts.
-----
Related tags: daniel+johnson+jr
Posted by Daniel at 4:38 PM | 0 comments
Topics: business intelligence, design, microsoft great plains, peo, programming, SQL
Thursday, December 13, 2007
Reorganizing clients in Darwin provides opportunity for SQL Server clean-up
This is a post where I share more of the technical aspects of my job as a 'conscientious programmer/analyst'.
The company I work for is reorganizing clients into new databases in the Darwin business system (a customized version of Microsoft Great Plains Dynamics), and the business sees this as a great opportunity to clean up a lot of things. This view is especially shared by those of us who work in the IT department.
The 12 current databases are, for the most part, the same in structure; i.e., the same tables, views, stored procedures, and so forth. The company has used these multiple SQL Server databases for specific types of clients, based on their industry classifications, etc.
It's easy to imagine, then, that over time, as clients come and go, the databases would accumulate lots of assorted codes. Mirror that across 12 databases, and it becomes more complicated still. Furthermore, I've discovered that the code descriptions are not consistent from database to database. The need to reorganize clients into new databases thus presents a great opportunity to clean things up.
Yesterday, the Director of IT and the Director of Special Projects asked for a list of active codes for active employees, across all 12 databases. I am the guy they turn to in order to get this done quickly. Because of my experience with how the databases have been set up, I usually know pretty quickly which tables to use in my SQL scripts.
In this particular case, I was interested in the Transaction History table, since it contains the three most important elements my internal customers needed: check date, transaction type, and paycode.
I initially set up the script to pull all paycodes, but I found close to 10,000 codes in use since the business started using Darwin in 2005. I checked with the Director of Special Projects, and she asked me to limit to just those codes in use since October 2007. Thankfully, that narrowed the list to just under 2000. I also included, at her request, the name of the database in which the codes were used. This proved especially helpful, since not all codes are in use in all the databases.
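I can't publish the production schema, but the cross-database query followed this general shape. Here is a minimal sketch using two SQLite in-memory databases in place of the 12 SQL Server databases; the table and column names are made up for illustration, and the key moves are the UNION of per-database SELECTs, the source-database column, and the October 2007 cutoff:

```python
import sqlite3

# Two stand-in databases playing the role of the 12 production databases.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH ':memory:' AS db2")

for db in ("main", "db2"):
    conn.execute(f"CREATE TABLE {db}.TrxHistory (CheckDate TEXT, TrxType TEXT, Paycode TEXT)")

conn.executemany("INSERT INTO main.TrxHistory VALUES (?,?,?)", [
    ("2007-11-02", "Pay", "REG"),
    ("2007-09-15", "Pay", "OT"),    # before the cutoff, filtered out
])
conn.executemany("INSERT INTO db2.TrxHistory VALUES (?,?,?)", [
    ("2007-12-01", "Pay", "REG"),
    ("2007-11-09", "Pay", "BONUS"),
])

# Union the per-database SELECTs, tagging each row with its source database,
# and keep only codes used since October 2007.
rows = conn.execute("""
    SELECT 'main' AS db, Paycode FROM main.TrxHistory WHERE CheckDate >= '2007-10-01'
    UNION
    SELECT 'db2'  AS db, Paycode FROM db2.TrxHistory WHERE CheckDate >= '2007-10-01'
    ORDER BY db, Paycode
""").fetchall()

print(rows)  # [('db2', 'BONUS'), ('db2', 'REG'), ('main', 'REG')]
```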
On my way home last night, I called into Jott to remind myself to set this up as a stored procedure.
Just another way I'm able to help keep the business engine going.
-----
Related tags: daniel+johnson+jr, sql+server, programming, business+intelligence, microsoft+great+plains, payroll, peo, darwin
Posted by Daniel at 12:03 PM | 0 comments
Topics: business intelligence, microsoft great plains, payroll, peo, programming, SQL
Tuesday, November 6, 2007
05 Nov 2007 - Utterz from the road
Three topics discussed:
- Effects of Daylight Saving Time ending one week later this year, especially on the commute home.
- New Media Cincinnati: http://www.newmediacincinnati.com or http://danieljohnsonjr.com/2007/11/new-media-cincinnati-meetup-november.html
- Dashboard-types of reporting I've been doing to provide a good pulse on the company. I love doing this kind of marketing-type of analysis.
Hear all my Utterz: http://www.utterz.com/~h-danieljohnsonjr/list.php
-----
Related tags: daniel+johnson+jr
Posted by Daniel at 9:43 AM | 1 comment
Topics: business intelligence, cincinnati, dashboard, new media, peo
Monday, September 24, 2007
Keeping upper management knowledgeable and salespeople paid
This is another post where I share technical details about a project I have been working on.
SITUATION
Sales executives within the company receive monthly commission checks based on active client employee counts and gross payroll, for clients that they have brought on. In addition, upper management needs to see high-level numbers such as active clients, active employee counts, and gross payrolls - dashboard-type information.
An easy-to-use tool to generate this information did not exist at the user level. Previously, upper management relied on IT or the Controller to generate this information and send it to them.
Through some personnel reorganization, the process for generating this report fell through the cracks. Salespeople were waiting for their commission checks for the previous month, so the project was both urgent and important.
As usual, this information needs to come from the multiple SQL Server databases the company uses to manage client information through the Darwin PEO System, a customized version of Microsoft Great Plains for the Professional Employer Organization (PEO) industry.
TASK
I was asked to develop a tool that upper management can use to generate information themselves. Some of the application requirements and thoughts that guided the development:
- Let users pick the date range, click a button, and have the system produce a report.
- Develop the application quickly to meet the immediate needs of the organization, yet with the ability to be reused whenever upper management so desires.
- Since upper management is most comfortable with Microsoft Excel and will want the data in a workbook anyway, use Excel Visual Basic for Applications (VBA) and ActiveX Data Objects (ADO) within a single Excel workbook to produce the results.
- Choose Excel over Access because the application overhead is low (i.e., no need for tables, forms, reports, etc.).
- Since the company doesn't mark employees and clients as inactive in the system immediately when they are terminated, define an active employee during a date range as a paid employee.
- In addition to a paid employee count, obtain a total check count and gross payroll amount for each client during the date range.
- If an employee received a check and it wasn't voided, it counts.
- Take advantage of server-side processing to achieve the best performance.
I first developed the SQL statement to unite data across the twelve SQL databases, based on prior knowledge of where to find information. Then I wrapped the SQL statement up in a stored procedure, with start and end dates as parameters.
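The stored procedure itself isn't mine to publish, but the aggregation it performs can be sketched. The schema below is invented for illustration; the real rules come from the requirements above: a check counts unless it was voided, and a paid employee is any employee with a counting check in the date range:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PayCheck (ClientID TEXT, EmpID TEXT, CheckDate TEXT, Gross REAL, Voided INTEGER)")
conn.executemany("INSERT INTO PayCheck VALUES (?,?,?,?,?)", [
    ("C1", "E1", "2007-09-07", 1000.0, 0),
    ("C1", "E1", "2007-09-21", 1000.0, 0),
    ("C1", "E2", "2007-09-21",  800.0, 0),
    ("C1", "E3", "2007-09-21",  500.0, 1),  # voided: excluded everywhere
    ("C2", "E4", "2007-08-31",  900.0, 0),  # outside the range
])

# Start and end dates are the parameters the user picks on the form.
start, end = "2007-09-01", "2007-09-30"
rows = conn.execute("""
    SELECT ClientID,
           COUNT(DISTINCT EmpID) AS PaidEmps,
           COUNT(*)              AS Checks,
           SUM(Gross)            AS GrossPayroll
    FROM PayCheck
    WHERE Voided = 0 AND CheckDate BETWEEN ? AND ?
    GROUP BY ClientID
""", (start, end)).fetchall()

print(rows)  # [('C1', 2, 3, 2800.0)]
```

Pushing the counting and summing into the query, rather than doing it row by row in VBA, is the server-side processing the last requirement calls for.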
After testing the procedure with different date ranges to make sure the information was accurate and made sense, I moved on to the Excel piece. I wrote code in Excel VBA and ADO to execute the stored procedure and output the results to a worksheet in the workbook.
Once I had tweaked the completed application to make sure everything ran smoothly, I e-mailed it to the director who requested it.
RESULTS
Within a few minutes I received a phone call from her, telling me how awesome I am. She also sent the application to the owner of the company so that he can run the report as often as he wants.
Now they are able to generate the information in a matter of seconds themselves, versus waiting for the Controller or someone else in IT to generate it for them; or, even worse, spend hours compiling the information themselves.
-----
Related tags: daniel+johnson+jr business+intelligence sql+programming microsoft+great+plains data+mining
Posted by Daniel at 11:07 AM | 0 comments
Topics: ADO, business intelligence, dashboard, microsoft great plains, payroll, peo, programming, requirements, SQL, VBA
Friday, September 21, 2007
Reaching the point of diminishing returns on a project
Over the past month, I have been working on a project to automate payroll entry for a client that has close to 300 employees and is growing. Naturally, it is in the company's best interest to get this payroll automated as much as possible.
To do this I've been developing an application to bridge the payroll info to Microsoft Great Plains Integration Manager.
One of the criteria I've used to evaluate a client's payroll is a consistent layout in a good format, so that I can tell the program where to map regular hours, overtime, cash reimbursements, etc. In some cases I'm able to tweak the payroll to make it into a layout that is consistent enough for me to automate.
This particular client is a construction company, and the only electronic version of the payroll is a text file that is not delimited. The file is basically a report from the client's system, and it has page headers and department footers, as well as breakdowns of withholding, etc. Most of that I can ignore in the program.
My colleague was able to import it into Microsoft Excel, using spaces as the delimiter. From there I saved the file as tab-delimited text, the format used in the other bridge applications I've developed.
The big problem is the inconsistent layout that comes from using spaces to delimit the text: the number of columns depends largely on how much detail is on a particular line. Look at the following three lines of text:
- E 22 Per Diem - C AZ 3 99
- E 22 Per Diem - C AZ 3 XYZ111 99
- E 22 Per Diem - C 99
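Splitting those lines on whitespace shows the problem directly: the token count changes from line to line, so a column position no longer identifies a field. The line contents are copied from the excerpt above:

```python
lines = [
    "E 22 Per Diem - C AZ 3 99",
    "E 22 Per Diem - C AZ 3 XYZ111 99",
    "E 22 Per Diem - C 99",
]

# 9, 10, and 7 tokens respectively: optional fields shift everything after
# them, so "column 7" means something different on each line.
counts = [len(line.split()) for line in lines]
print(counts)  # [9, 10, 7]
```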
This is just one major aspect of complexity that has come from my attempt to automate the payroll. After talking it over with my boss, we realize that we've come to the point of diminishing returns.
I'm off this project until we find another way to parse the payroll information, and I'm able to move on to another project that's in the queue.
Related tags: microsoft+great+plains integration+manager payroll+automation microsoft+access vba programming application+design peo daniel+johnson+jr
Posted by Daniel at 9:53 AM | 0 comments
Topics: bridge application, integration manager, lessons learned, microsoft great plains, payroll, peo, programming
Wednesday, August 22, 2007
Server-side processing rocks the house!
I've spent the last couple of weeks learning how to optimize the search tool we built. The short story is that we're now using a pass-through query in Access to pass parameters to stored procedures on SQL Server. Processing time has gone from about 2 minutes to less than 2 seconds.
Now that I know how to work with stored procedures, I'm thinking of how else to implement them with other types of applications.
BACKGROUND
The company has clients spread across multiple databases for various reasons; namely, specific business rules (e.g., worker's comp rate, state tax rate, unemployment insurance, etc.) apply across an entire database.
Quite often someone in the company receives paperwork from an agency or client with nothing but the client employee's name or their SSN. Searching for this employee has been trial and error, going through each database at a time. Also, in many cases the person in our company just needs to see some basic information about the employee. For example, during tax season we just needed to verify a client employee's address and phone number. Hence the reason for such a tool.
We had initially designed the tool with individual pass-through SELECT queries, uniting them and applying criteria in Access. It was that uniting and applying of criteria that was slowing things down.
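The difference is easy to reproduce in miniature: filtering after pulling every row to the client does strictly more work than letting the server apply the criteria and return only matches. A sketch with SQLite standing in for SQL Server, and invented table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Emp (SSN TEXT, LastName TEXT)")
conn.executemany("INSERT INTO Emp VALUES (?,?)",
                 [(f"{i:09d}", f"Name{i}") for i in range(10_000)])

# Client-side: fetch everything, then filter locally (roughly what the
# earlier version of the tool was doing in Access).
everything = conn.execute("SELECT SSN, LastName FROM Emp").fetchall()
client_side = [r for r in everything if r[1] == "Name42"]

# Server-side: the criteria travel with the query; only matches come back.
server_side = conn.execute(
    "SELECT SSN, LastName FROM Emp WHERE LastName = ?", ("Name42",)).fetchall()

print(len(everything), len(server_side))  # 10000 1
```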
OPTIMIZING
So, I created two stored procedures with parameters: one that searches on an SSN and one that searches on a name. Each stored procedure unites the individual SELECT queries on a database separate from the other 12 being searched.
In VBA code, based on which criteria the user enters, we determine which stored procedure to run. We create a pass-through query to EXECUTE the stored procedure. This pass-through query then contains the search results, which is what we had with the earlier version of the tool.
PREVENTING SQL INJECTION
While reading the literature, I came across the concept of SQL injection and the need to prevent it. Since users are passing text to a SQL statement, we need to prevent the intentional or unintentional insertion of code that would prematurely terminate a text string and append a new command.
I wrote one procedure that scans what users enter in the first and last name fields for "--", "CREATE", and other such items. My coworker gave me some code to prevent users from entering other illegal characters. This double-barreled approach will prevent any SQL injection from occurring.
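A minimal sketch of that scanning approach follows; the token list here is illustrative rather than the production list. Worth noting, too, that parameterized queries, where the driver supports them, are the more robust defense, since they keep user text out of the statement entirely:

```python
# Illustrative blocklist: comment markers, quoting, and statement keywords.
FORBIDDEN = ("--", ";", "'", "CREATE", "DROP", "DELETE", "INSERT", "EXEC")

def is_safe(user_text: str) -> bool:
    """Reject input containing comment markers, quotes, or SQL keywords."""
    upper = user_text.upper()
    return not any(tok in upper for tok in FORBIDDEN)

print(is_safe("Johnson"))                 # True
print(is_safe("x'; DROP TABLE Emp; --"))  # False
```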
COMMUNICATING WITH THE MASSES
I enjoyed sharing the news of the updated tool with the rest of the company. I knew many had not even heard of it. I constructed the email the way I would a friendly blog post, including pictures.
One user immediately replied: "Awesome! Works very fast. Good job." I was contacted by two people in Accounting and Payroll who asked that I put the tool on their machines as well.
Much joy all around, I tell you. And IT looks good.
Posted by Daniel at 2:36 PM | 0 comments
Topics: client relations, design, peo, search tool, SQL
Monday, June 18, 2007
Microsoft Great Plains - Integration Manager in action
The company uses a version of Microsoft Great Plains (GP) that's been customized for the PEO industry. One piece of that is Integration Manager (IM). I've talked about the bridge application a number of times before. Starting with an Excel file from a client, and with some sophisticated programming, we create a tab-delimited text transaction file for Integration Manager. We've set up some profiles in Integration Manager to upload payroll transactions into a batch in the Great Plains system, saving a lot of time and money. In some cases we've seen a 91% drop in the amount of time it takes to run a payroll.
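The transaction file itself is nothing exotic: tab-delimited text with one row per payroll transaction. A sketch of producing one with Python's csv module; the column names here are invented, since the real Integration Manager profile defines the actual layout:

```python
import csv
import io

# Hypothetical payroll transactions destined for an Integration Manager batch.
transactions = [
    {"EmpID": "1001", "Paycode": "REG", "Hours": 40.0},
    {"EmpID": "1001", "Paycode": "OT",  "Hours": 3.5},
    {"EmpID": "1002", "Paycode": "REG", "Hours": 38.0},
]

buf = io.StringIO()  # the real version writes straight to a .txt file
writer = csv.DictWriter(buf, fieldnames=["EmpID", "Paycode", "Hours"],
                        delimiter="\t", lineterminator="\n")
writer.writeheader()
writer.writerows(transactions)

output = buf.getvalue()
print(output)
```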
Over the next few weeks or so, I'll be running Integration Manager for the Payroll department. Here are some numbers for today:

Industry        Transactions  Emps
Trucking             149        51
Trucking             168        90
Restaurant            86        43
Restaurant            94        48
Restaurant            72        39
Restaurant            82        34
Restaurant            94        40
Restaurant            97        40
Restaurant            87        47
Manufacturing        173        98
1002 transactions, for 530 employees of 10 clients.
Posted by Daniel at 5:33 PM | 3 comments
Topics: bridge application, integration manager, microsoft great plains, payroll, peo
Wednesday, June 6, 2007
At a Conference in Downtown Cincinnati
Over the next few days I'll be attending a conference in downtown Cincinnati. I'm really glad that I'll be able to catch the bus instead of driving the hour commute to Dayton.
The conference is being conducted by Thinkware, a vendor that has -- and I'm probably oversimplifying this -- customized Microsoft Great Plains for Professional Employer Organizations (PEOs). I look forward to learning more about the latest versions and seeing how I might be able to use it on my job.
I may or may not provide updates here.
June 7th Update: I found out it's not until today. More to come later perhaps!
Posted by Daniel at 5:46 AM | 2 comments
Topics: conference, peo, training
Sunday, June 3, 2007
Helping company take advantage of a tax credit
I found out last year from the Accounting department at my current company that employers with tipped employees can get a tax credit based on how employees have reported their tips. I guess it's the IRS's way to encourage employers to get their employees to report tips so that they can get the tax from it.
The PEO model is one of co-employment: the client employs someone to do the work while the PEO employs the worker from an HR perspective. If the client doesn't take the tipped employee tax credit (or whatever it's actually called), then the PEO can take it.
For me it has meant developing yet another report from the SQL Server database. The business, of course, wants the report as soon as they can get it, and my boss and I help them understand that it cannot always happen overnight or even in a week.
The report, as many of these I develop lately, involves looking at the transaction history on employees' paychecks to identify reported tips. It sounds simple, and it really is, as long as you know where to look.
Over the past year or so, I've gotten better as I've come to understand where the different types of information are stored; the database and the front end are a modified version of Microsoft Great Plains, and the tables are not always intuitively named.
I probably should push this application out to the Accounting department, since all I did was run it for last year and the current year. Upper management was certainly surprised at how fast I'd turned their request around.
Posted by Daniel at 6:39 PM | 0 comments
Topics: accounting, financial reporting application, peo, programming, SQL
Wednesday, April 25, 2007
Migrating Existing Access Applications to Access 2007
This post is one of the more esoteric ones where I delve into the geeky details of some of my programming work. I know - it's really sexy, isn't it?
I have mentioned the bridge application I developed that helps make payrolls run faster, helping client employees get paid faster, and so forth, using Microsoft Access 2003 with VBA, ADO, Excel, Office, etc.
Some members of the company are starting to migrate to Office 2007, and we can see the entire organization moving there soon. A few weeks ago, I tried opening and running one of the bridge applications in Access 2007, and it bombed horribly, specifically in how I've written it to use the Office 11 FileSearch object.
I just found a few documents on MSDN and TechNet that I hope will help me understand what is involved in the migration:
Posted by Daniel at 12:16 PM | 0 comments
Topics: bridge application, conversions, documentation, payroll, peo, programming, troubleshooting
Tuesday, April 24, 2007
More Benefits Auditing and Follow me on Twitter
I'm going to put up a Twitter badge in the sidebar so you can know about other stuff I'm doing. For now, feel free to follow me on Twitter, and let me know you found me via this website.
Tags: daniel+johnson+jr twitter benefits+auditing analysis
Posted by Daniel at 3:56 PM | 0 comments
Topics: peo, programming, twitter
Quick Turnaround on a Benefits Audit
I can tell I've become branded as the "go-to" guy for certain types of projects. Last week the HR manager came to me and said that she didn't know who else to talk with about a benefits audit the company is doing for one of its clients.
The company and the client are trying to resolve a $20,000 discrepancy that appears to have arisen from November 2003 through November 2006, when the client stopped using the company's health and dental plan. Oh, and since this all needed to be finalized by the end of this month, which was about two weeks away, she needed whatever I could do in a day or so.
The HR manager sent me an email that contained about six Excel workbooks. Each workbook contained several worksheets. These worksheets were benefit summaries for each payroll that had been run during that three-year period, from the client's perspective.
The company needed to compare this information with what existed in the system, but the HR manager recognized that it was very cumbersome to go through the information in its existing format.
With the task at hand, I imported each worksheet into an Access database and compiled the information into a single table that I exported and sent back to the HR manager.
It sounds like I'm making it simpler than it was at the time. I had to write a couple of Excel macros to get the data in the proper format for importing into Access. That and a VBA function to import each worksheet from each workbook did the trick, for the most part.
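Stripped of the Access and Excel plumbing, the consolidation step is just "stack every sheet into one table and remember where each row came from." A sketch with made-up data; the keys stand in for workbook and worksheet names:

```python
# Each "worksheet" is a list of (employee, premium) rows pulled from one of
# the six workbooks; the keys stand in for workbook/worksheet names.
workbooks = {
    "2004-01.xls!Sheet1": [("E1", 120.0), ("E2", 95.0)],
    "2004-02.xls!Sheet1": [("E1", 120.0), ("E3", 80.0)],
}

# One combined table, with a source column so every row stays traceable
# back to the payroll it came from.
combined = [
    (source, emp, premium)
    for source, rows in workbooks.items()
    for emp, premium in rows
]

print(len(combined))  # 4
```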
Posted by Daniel at 1:33 PM | 0 comments
Topics: benefits audit, peo
Friday, March 16, 2007
Handling errors in the payroll bridge application
Clients email the company their payroll files, and the columns on the payroll indicate Regular Hours, Overtime, Vacation, and so forth. To make the bridge application work, the other developer and I map the payroll items in the program based on this layout. It only works, however, when the layout is consistent. Whenever a client changes it by adding or deleting a column, the program generates an error.
One thing I have found so helpful in resolving errors is the Error Handler function. I use the Err object in Visual Basic, but I also add additional information such as the module name and procedure name where the error occurred. The user gets a message box, which is helpful for them, but I also wrote the program to create a log file with the same information so that I can go back and see errors that have occurred.
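The VBA Err-object handler isn't reproducible outside Office, but the pattern translates directly: show the user a friendly message, and also record the module and procedure name somewhere the developer can review later. A Python sketch of the same idea; the names and log format are illustrative:

```python
import datetime

LOG = []  # stands in for the log file the VBA version appends to

def log_error(module: str, procedure: str, err: Exception) -> str:
    """Record an error with enough context to find it later."""
    entry = (f"{datetime.datetime.now():%Y-%m-%d %H:%M:%S} "
             f"[{module}.{procedure}] {type(err).__name__}: {err}")
    LOG.append(entry)  # the real version writes this line to a log file
    return f"An error occurred in {procedure}. The details have been logged."

try:
    columns = ["RegHours", "OT"]
    columns[5]  # simulate a layout change: an expected column is missing
except IndexError as e:
    message = log_error("modImport", "MapPayrollColumns", e)

print(message)
```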
Posted by Daniel at 9:07 AM | 0 comments
Topics: bridge application, peo, programming, troubleshooting
Wednesday, March 14, 2007
Rapid response to payroll layout change
One hour ago I got an e-mail from a payroll specialist, telling me that a client with 107 active employees had added a column to their payroll file and asking if I could fix their bridge application to allow for the new payroll item.
I opened the code, and after I made the changes needed in a few class modules, we were good to go. I tested it out on my machine and then put it out on the network for the payroll specialist to use.
I was able to get this done so quickly because there were only a few places to add some more code. Having the system set up like this was very efficient.
Posted by Daniel at 2:22 PM | 0 comments
Topics: bridge application, peo, programming, troubleshooting
Monday, March 12, 2007
Expanding the 401(k) reporting tool
As the company grows, so does the need to report on clients using the 401(k) plan. We are adding clients to additional databases, and that has meant the 401(k) reporting tool needed to be updated so that the user can identify which clients are participating in the company plan.
Using Microsoft SQL Server Management Studio, I created the additional tables: one for each additional database. Then I added ODBC links to these tables in the 401(k) reporting tool, the desktop application created using Microsoft Access 2003. I have to remind myself that, for each computer using this tool, I have to create the ODBC connection to each database and table. Then I updated the VBA code so that it can recognize and use the new tables as it compiles the information for the reports.
After testing out the changes, everything appeared to be working fine. I contacted the user and updated her version of the tool, refreshing the ODBC links where needed.
The company is now more ready to begin managing the 401(k) program for new clients.
Posted by Daniel at 11:09 AM | 0 comments
Topics: 401(k) report, peo, programming, SQL
Helping 145 employees get paid
Not my normal function, but the payroll specialist was in a pinch, so she called and asked me to run the program that loads transactions into the business system, helping 145 client employees get paid on time.
Posted by Daniel at 9:51 AM | 0 comments
Wednesday, March 7, 2007
When the feature became a problem
When I was initially developing the bridge application a couple of years ago, I had set up the Close button on several forms to also be the Cancel button. That way a user could just press the Esc key to close the form, if they preferred to use the keyboard.
As the applications for various clients have been used, this feature became more of a bane than a benefit. It turns out that users hit the Esc key to undo what they have typed, but the application would close the form instead. This resulted in some records being incomplete, causing errors when the application was processing.
While you can try to anticipate the user's experience as much as possible beforehand, some bugs still show up.
No big deal. I just removed that functionality from every bridge application we have in use so that the program behaves in a way that users expect.
Posted by Daniel at 11:02 AM | 2 comments
Topics: bridge application, lessons learned, peo, programming
Monday, February 12, 2007
Deploying a much-needed search tool in the workplace
We have a limited number of licenses to use the payroll management software in our organization. As we have added more staff whose responsibilities involve using that software, those licenses get used up pretty quickly. It's not uncommon to receive 5 emails every day asking if someone would log out of the system so that someone else can get in to search something or run a payroll.
With some knowledge of the database, we've been able to develop a tool to work around the system by querying the server directly. In fact, this is exactly what I've done when developing the 401(k) reporting tool and other custom tools for the organization.
We have created a search tool that members of the organization can use to return some basic information about an employee. I may have mentioned before that we have several databases for different clients, and many times we don't know which client the employee works for or which database holds his or her records. As a result, the user has to find the employee by trial and error, which is time-consuming.
Our new Search Tool, which is what I'll refer to it as here, returns an employee's address and company name, along with whether he or she is marked as Active in the system. I have to give a lot of credit to my fellow developer, who took the initial idea I had and, with some guidance on the SQL, created the tool that executes the search.
I spent part of the day deploying the tool to the members of the organization most likely to use it: one person from the Risk Management department who coordinates the paperwork related to unemployment claims; one person who is fielding calls from employees about not getting W-2 forms (probably because of an address change); and the receptionist, who helps out with overflow work, especially with respect to returned mail.
We're still building on to the tool while the application is in use, but we are developing in a separate environment. Once again, by having this application on a shared network drive, we can update the file on that one location as needed.
Posted by Daniel at 5:43 PM | 0 comments
Topics: peo, programming, search tool
Tuesday, January 30, 2007
Fixing the client's 401(k) reporting application to make it run better
The company didn't have a process in place to make sure that this one client receives their 401(k) information on a regular basis. I had developed an application last fall for this specific client to produce the information, but there was no one set up specifically to run the application on a regular basis.
I was contacted earlier today to set up this application on several people's computers. Interestingly enough, everyone could get to the application, since it exists on a networked drive, but, to make things as easily accessible as possible, I prefer to put a shortcut to the application on users' desktops.
After I finished that, I sent an email to everyone affected, letting them know about the application, and letting them know how to get the report.
About 15 minutes later, I was summoned to a user's computer because there was a problem with the report details. She was running the report for last week, but the Last Pay Date for several employees was for this week.
As I looked into the program, I realized that all the other details were correct except for this Last Pay Date, which gets updated whenever a payroll is run. I began brainstorming other ways to get the correct date.
One option was to look at the last pay check date for the date range specified, which is from the first of the month to the date the user selects on a form. That would make sure that date for employees who are getting paid this week is the date the user selected.
As I thought about that, I realized I was not finished. What if we run this report a month from now, and an employee had been terminated this month? The client would still want that employee to show up on the list.
I continued brainstorming, and wrote some mock VBA code on my marker board. Satisfied with that, I created a function, tested it out with various scenarios, and eventually came to a solution that will either use the last pay date for the reporting range OR the Last Pay Date field from the other data table.
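That eventual rule is small enough to reconstruct as a function. The field names here are made up; the logic is the one described above: prefer the latest check date inside the reporting range, and fall back to the stored Last Pay Date for employees (such as terminated ones) with no checks in the range:

```python
import datetime as dt

def last_pay_date(check_dates, stored_last_pay_date, range_start, range_end):
    """Latest check date within the reporting range, else the stored field."""
    in_range = [d for d in check_dates if range_start <= d <= range_end]
    return max(in_range) if in_range else stored_last_pay_date

start, end = dt.date(2007, 1, 1), dt.date(2007, 1, 28)

# Employee paid within the range: use the latest in-range check, not the
# Last Pay Date field that a later payroll run has already overwritten.
paid = last_pay_date(
    [dt.date(2007, 1, 12), dt.date(2007, 1, 26), dt.date(2007, 2, 2)],
    dt.date(2007, 2, 2), start, end)
print(paid)  # 2007-01-26

# Employee terminated before the range: fall back to the stored date so the
# client still sees them on the report.
terminated = last_pay_date([], dt.date(2006, 12, 15), start, end)
print(terminated)  # 2006-12-15
```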
I generated the report and sent it off to the client service representative, who will then forward it on to the client.
Posted by Daniel at 6:10 PM | 0 comments
Topics: 401(k) report, client relations, peo, programming, troubleshooting