Welcome to my online portfolio, the complement/substitute for my resume. The opinions included herein are my own and do not reflect those of any client or employer, past or present. Please check out the new site: http://danieljohnsonjr.com


Sunday, June 10, 2007

Every programmer should be on a testing project

I think every programmer should get a chance to be on a testing project, because it will make him or her a better programmer. On one project, I supported the complete software development lifecycle from a software quality assurance perspective, helping to develop a multi-tiered, Intranet-based Product Data Management System. The client was a global consumer products company, and they were creating a system to manage information about their products.

To do this, the testing team, of which I was a part, analyzed business requirements and functional specifications and designed test cases. Then we built test data and executed the test cases. For the most part, each of us concentrated on a specific aspect of the system.

Since my time on this and other testing projects, I've found myself approaching development from the perspective of a tester, and it's made me a better programmer. I wish the same for every programmer.

Monday, February 19, 2007

Lesson learned while on a mission critical conversion project

Here's a project I worked on where I learned an important lesson.

The client was a global consumer products company, and the project was to convert an Access 2.0 application to Access 95, which meant moving from a 16-bit environment to a 32-bit one. I won't claim to understand everything that entails, except to say that there were a lot of calls to the Windows Application Programming Interface (API) that needed to be rewritten. The application was used to prepare the company's profit/loss and balance sheets for all of their business units around the world. Hereafter, I'll call it the "financial reporting application".

Where I went wrong was that I didn't test portions of the program as I converted the code. Incremental testing was a skill I still needed to learn, and I learned it the hard way: once the program was "converted", i.e., all the code compiled with no errors, other application errors began popping up all over.

I created a tool to help me keep track of all the errors and the steps taken to resolve them and provided the client with daily progress reports.

As I think back on it, I would have done much better if I had taken more time to understand how the program worked in its prior environment and tested the program incrementally as I revised the code.
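The habit I was missing can be sketched in a few lines. The original program was Access Basic/VBA; this Python sketch (with an invented routine name) just illustrates the idea of running a small check immediately after converting each routine, rather than waiting for the whole program to compile:

```python
# Illustrative sketch only: "converted_window_title" is an invented stand-in
# for one freshly converted API wrapper routine.

def converted_window_title(handle):
    return "Window %d" % handle

def check(name, actual, expected):
    # Fail loudly the moment a converted routine misbehaves.
    assert actual == expected, "%s: got %r, expected %r" % (name, actual, expected)

# Run the check right after converting the routine, before moving to the next one.
check("converted_window_title", converted_window_title(7), "Window 7")
print("routine verified")
```

Had I done something like this routine by routine, the errors would have surfaced one at a time instead of all at once at the end.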

Tuesday, January 30, 2007

Mining data from 3,000 separate Excel files

One of the consulting projects I worked on was for the finished product exports department at a global consumer products company. The company produces products that get shipped all around the world, but this particular group concentrated on North America.

Because of the North American Free Trade Agreement (NAFTA), the company was able to save millions of dollars each year by taking advantage of tariff exemptions for products shipped among the U.S., Canada, and Mexico. Prior to my joining the project, the company's legal department had determined that they needed to provide documentation that these products were indeed exempt.

The regulations describe the raw materials that compose the finished products, so we needed to look at the constituent level for every product the company manufactures. I was told the company had close to 3,000 Excel spreadsheets, each listing the composition of a product. More information would be forthcoming after I started on the project.

I looked at the Excel files and discovered that the composition was not in a flat layout (headings and rows), and thus, I wouldn't readily be able to import the information into a database table. I would need to manipulate the data into such a layout and then import it into a database.
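To make the reshaping concrete, here is a hypothetical sketch in Python (the actual tool was written in VBA, and the cell layout below is invented for illustration): it walks a block-style composition sheet and emits flat (product, material, percent) rows ready for database import:

```python
# Hypothetical sketch: flatten a block-style composition sheet into flat rows.
# The cell layout is invented; the real files differed.

def flatten_sheet(cells):
    """Turn a label/value block layout into flat (product, material, pct) rows."""
    product = cells[0][1]        # first row holds a label and the product name
    rows = []
    for row in cells[2:]:        # composition lines start after the header row
        material, pct = row[0], row[1]
        if material:             # skip blank padding rows
            rows.append((product, material, float(pct)))
    return rows

sheet = [
    ["Product:", "Widget A"],
    ["Material", "Percent"],
    ["Resin", "60"],
    ["Pigment", "5"],
    ["", ""],
]
print(flatten_sheet(sheet))
# → [('Widget A', 'Resin', 60.0), ('Widget A', 'Pigment', 5.0)]
```

Once every sheet is reduced to rows like these, a straight import into a database table becomes trivial.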

Instead of working through all 3,000 files by brute force, I opted to automate the process. I decided to build a conversion tool in Excel that would search for the relevant contents of each file and copy and paste them into a new spreadsheet in a table layout. I wrote VBA code in a standalone Excel workbook to do the job, along with code to record errors and other anomalies in a log.

I also wrote VBA code in Microsoft Access, using the Microsoft Office object model, to launch the data conversion tool in Excel. I put all the directories where the files were located into a lookup table and had the program go through each directory, performing the conversion on each Excel file and importing each spreadsheet's information into a master database table.
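The driver loop can be sketched like this (again in Python for illustration; the real implementation was Access VBA driving Excel, and the file names here are throwaway): it walks each directory from the lookup table, converts every Excel file, and logs failures instead of halting the run:

```python
import os
import tempfile

# Hypothetical sketch of the driver loop: directories come from a lookup table,
# each file is converted, and anomalies are recorded rather than stopping the batch.
def run_conversion(directories, convert, master, log):
    for directory in directories:
        for name in sorted(os.listdir(directory)):
            if not name.endswith(".xls"):
                continue                      # skip non-Excel files
            path = os.path.join(directory, name)
            try:
                master.extend(convert(path))  # append the flattened rows
            except Exception as exc:          # log the anomaly and keep going
                log.append((path, str(exc)))

# Tiny demo with throwaway files standing in for the real spreadsheets.
tmp = tempfile.mkdtemp()
for name in ("good.xls", "bad.xls", "notes.txt"):
    open(os.path.join(tmp, name), "w").close()

def convert(path):
    if "bad" in os.path.basename(path):
        raise ValueError("unexpected layout")
    return [(path, "Resin", 60.0)]

master, log = [], []
run_conversion([tmp], convert, master, log)
print(len(master), len(log))  # → 1 1  (one file imported, one anomaly logged)
```

The point of the try/except is the same as the error log in the real tool: one malformed spreadsheet shouldn't bring down a run over thousands of files.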

I wound up being able to click a button and watch as the program performed the conversion. I had set up a flag in the lookup table to indicate whether each file was imported successfully. Then I would check the log in Excel for anomalies, handle them appropriately, and retry.
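The flag-and-retry step might look like this as a hypothetical Python sketch (the real flags lived in an Access lookup table): a second pass processes only the files whose flag shows they haven't imported yet.

```python
# Hypothetical sketch: mark each file's status in the lookup table and retry
# only the files that previously failed.
def retry_failed(status, convert, master):
    for path, imported in list(status.items()):
        if imported:
            continue                  # already in the master table
        try:
            master.extend(convert(path))
            status[path] = True       # flip the success flag
        except Exception:
            pass                      # still failing; leave flagged for the next pass

status = {"a.xls": True, "b.xls": False}
master = []
retry_failed(status, lambda p: [(p, "row")], master)
print(status)  # → {'a.xls': True, 'b.xls': True}
```

Because already-imported files are skipped, the retry pass is cheap no matter how many times anomalies have to be fixed and re-run.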

I'll share more about this project in future updates.