Thursday, March 26, 2009

Automating Release Management - the XMLP way

Most organisations have a process where a document containing all the objects of a Peoplesoft project to be migrated to another environment is prepared. This document is the reference point for the DBAs or the release management team responsible for migrating Peoplesoft projects across environments. For a developer, preparing the document is an overhead, especially for a project with a large number of objects - it's close to a painful exercise of going through every object in the project and reflecting it in the document. Amongst the innovative tools developed here at TESCO we had a .NET based Automatic Release Notes generator. The user just had to input the name of the project and the tool would produce a file (with a choice of output type - .doc or .xls) containing the details of all the objects in the project, including the comments attached to each individual object definition wherever possible. But this tool had a few limitations - (a) the output format was not in the template dictated by our standards, (b) the tool was not available locally, so we had to log in to a remote desktop to access it, and (c) the source code was in .NET, which was not appealing to a bunch of Peoplesoft developers!
So the team could not use the tool and had to resort to preparing the release notes manually. It was then that we came up with the idea of writing an XML Publisher report to do the same. XML Publisher is well suited to producing letters and documents because the template is developed directly in MS Word, which was exactly what we needed. For the initial version, we decided to print only the details of the objects in the project and not the comments for each object. Printing the comments would have meant querying the metadata table of each object type, and that would be nearly impossible with a simple PS Query.
The data source of the XMLP report was a simple PS Query that pulled data from the PSPROJECTITEM table. We added some prompts to the PS Query so that the user could input details to be printed in the release notes, like the change number, name of the author, change description etc. It's a very simple template which I am sure your organisation can develop with minimal effort. This tool has automated a non-value-add area in our development process and helped our developers spend more time on development rather than documentation!
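For anyone building something similar, the grouping that the report performs can be sketched outside Peoplesoft too. The snippet below is a minimal illustration only - the object-type codes and labels are hypothetical placeholders, not the real PeopleTools OBJECTTYPE mapping, and the rows would come from your own PSPROJECTITEM query:

```python
from collections import defaultdict

# Illustrative mapping of PSPROJECTITEM.OBJECTTYPE codes to labels.
# Hypothetical subset -- verify the codes against your PeopleTools release.
OBJECT_TYPE_LABELS = {0: "Record", 5: "Page", 7: "Component"}

def build_release_notes(rows):
    """Group (objecttype, objectvalue) rows into a release-notes outline."""
    sections = defaultdict(list)
    for objecttype, objectvalue in rows:
        label = OBJECT_TYPE_LABELS.get(objecttype, "Other (type %d)" % objecttype)
        sections[label].append(objectvalue)
    lines = []
    for label in sorted(sections):          # one section per object type
        lines.append(label)
        for name in sorted(sections[label]):
            lines.append("  - " + name)
    return "\n".join(lines)

# Sample rows, as a query on PSPROJECTITEM might return them
rows = [(0, "MY_RECORD"), (5, "MY_PAGE"), (0, "MY_RECORD2")]
print(build_release_notes(rows))
```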
A small presentation on the report and how to use that is given below:

Tuesday, March 24, 2009

My WebEx session on important Time and Labor tables

I've been wanting to upload audio-visual sessions on Time and Labor to my blog for a pretty long time now. This is because I strongly believe there is a serious lack of authentic Peoplesoft training sources available, and seeing and listening is any day better than plain reading! So here is the first audio cast I have come up with. It is a WebEx recording. I could not find a free tool to convert the recording to a format accepted by a video uploading site, so I had to upload the file to a free file hosting site.
The session is on my Top Six tables in Time and Labor. The session covers the structure and importance of each of the six tables and explains the context and place where they are used in Time and Labor. It should be an informative watch for any person working on Time and Labor. The audio cast is almost 60 minutes long.

You can download the session from this url -

You will need a WebEx player to view the session. That can be downloaded here.
I sincerely hope the session will be useful to you. Please leave your comments here, and also suggest any aspect of Peoplesoft Time and Labor that you might want the next session on.

Simulating Time Collection Devices using Peoplesoft Handler Tester Utility

What would you do if you had to send ad hoc punches from a TCD to Peoplesoft for testing but don't have access to a TCD or an immediate mechanism to send offline punches? Use the Peoplesoft-delivered Handler Tester utility. Handler Tester is a tool where you can feed in an XML file and simulate the invocation of the handler of a Service Operation. It is an excellent utility while developing any service operation.
Any punch time interface that comes from a TCD calls the PUNCHED_TIME_ADD Service Operation. Extract the XML of the message (this can easily be taken from the Service Operation monitor), make appropriate changes to the XML file (like setting the correct EMPLID, PUNCH_DTTM, PUNCH_TYPE etc.) and submit the XML to the tester using the 'Provide XML' button. To fire the handler, click on 'Execute Event'. If the handler fired, the text 'Done' will be displayed below the Execute Event button. Check the Process Monitor to see whether there is an instance of the ST_LOADER AE - the handler calls the ST_LOADER AE, which actually loads the punch time into the TL_RPTD_TIME table. I found it to be a very handy mechanism to simulate the action of TCDs.
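As an illustration of the kind of payload involved, the sketch below builds a punch XML programmatically. The element names here are assumptions for demonstration only - always copy the real PUNCHED_TIME_ADD message shape from the Service Operation monitor as described above:

```python
import xml.etree.ElementTree as ET

# Hypothetical element names -- take the real message structure from
# the Service Operation monitor rather than trusting these.
def build_punch_xml(emplid, punch_dttm, punch_type):
    """Build a sample punch payload to paste into the Handler Tester."""
    root = ET.Element("PUNCHED_TIME_ADD")
    row = ET.SubElement(root, "TL_RPTD_PCH_WS")
    ET.SubElement(row, "EMPLID").text = emplid
    ET.SubElement(row, "PUNCH_DTTM").text = punch_dttm
    ET.SubElement(row, "PUNCH_TYPE").text = punch_type
    return ET.tostring(root, encoding="unicode")

payload = build_punch_xml("KU0001", "2009-03-08T01:30:00", "1")
print(payload)
```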
A snapshot of the Handler Tester page is given below:

Thursday, March 19, 2009

Innovation is a matter of culture

Innovation is undoubtedly the most overused cliche ('value addition' would be a close competitor) in modern management jargon. Infinite media and printing space has been dedicated to this single subject, and I am sure many companies and individuals have made millions selling that idea! But I believed that getting an entire organisation to innovate together was almost an impossibility - until I joined TESCO.
Read the rest of the post here.

Leaving it to XSL to do the dirty work!

The biggest misconception I had about XML Publisher was about its ability to do any logical processing - when I initially saw the product, it was merely a presentation layer to me, far inferior to anything SQRs could do. How wrong I was! This post is about the power of using XSL to do the programming logic within the XML Publisher template to design faster and better reports.

When we initially started designing XML Publisher reports, we stuffed most of the calculations and logic into the data source itself - the majority being PS Queries. This meant we had pretty bulky and costly queries with a number of PS Query expressions, which was not helping the cause at all, with the reports taking unacceptably long to run. But as we got to understand the tool, we started doing more and more of the calculations and logical processing within the template using XSL - the groupings, summations, summaries etc. This has led to dramatic improvements in performance, and we believe that's the right way to approach XMLP reporting.
For example, in one of our reports we had to print the total scheduled hours for all employees in a department along with some other information. The XML schema consisted of individual employees' data along with the DEPTID. A normal for-each-group:row;DEPTID tag was used to print individual departments, but to retrieve the total scheduled hours for a department, an expression was used in the PS Query. This led to a significant performance drop - the query itself was taking over 30 minutes to execute. By eliminating the expression and summing up the scheduled hours for the department in the template itself, the report ran in less than 5 minutes. A further grouping was done inside the DEPTID grouping to sum the scheduled hours of each employee (note that there were multiple rows for a single employee in the data set, but grouping by EMPLID would result in a single distinct row), and the set_variable function was used to add the scheduled hours to a variable.
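To make the template-side approach concrete, here is a small Python sketch (with illustrative tag names, not our actual schema) that mimics what the for-each-group on DEPTID with a running sum does over a flat rowset:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# A flat rowset, as a PS Query data source would produce (illustrative tags).
FLAT_XML = """
<query>
  <row><DEPTID>D100</DEPTID><EMPLID>E1</EMPLID><SCHED_HRS>8</SCHED_HRS></row>
  <row><DEPTID>D100</DEPTID><EMPLID>E1</EMPLID><SCHED_HRS>4</SCHED_HRS></row>
  <row><DEPTID>D100</DEPTID><EMPLID>E2</EMPLID><SCHED_HRS>8</SCHED_HRS></row>
  <row><DEPTID>D200</DEPTID><EMPLID>E3</EMPLID><SCHED_HRS>6</SCHED_HRS></row>
</query>
"""

def dept_totals(xml_text):
    """Mimic the template's for-each-group on DEPTID with a running sum."""
    totals = defaultdict(float)
    for row in ET.fromstring(xml_text).findall("row"):
        totals[row.findtext("DEPTID")] += float(row.findtext("SCHED_HRS"))
    return dict(totals)

print(dept_totals(FLAT_XML))
```

The point is that the data source stays a flat dump of rows; the grouping and summation happen at render time, which is exactly what the XSL in the template does for free.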

In short, keep the data structure of your data source as flat as possible and leave all the dirty processing work to the XMLP template! (And keep learning more XSL to leverage its capabilities fully.)

Friday, March 13, 2009

Experiences in T&L-Absence Management Integration

It's always exciting to work on cross-module assignments, and this post details some of the lessons I've learned piloting the T&L-AM integration for my present organisation.

The integration of T&L and AM can be divided into two distinct categories - integration with NA Payroll as the pay system, and integration with GP as the pay system. One of the biggest drawbacks we found with the delivered integration is that it doesn't work if you've decided to use PI (Payroll Interface) as your pay system. This was a problem because we were not using Global Payroll, and an initial design decision had been made to use Payroll Interface as the pay system of all employees in job data. We had to reverse this decision and use GP as the pay system in job to ensure that we could leverage the delivered integration between AM and T&L.

The purpose of integrating these two modules is two-fold:
  • Bring AM data into T&L and display the same on the timesheet.
  • Feed the absence data to NA Payroll through Time and Labor.
With 9.0, absences can be reported on the timesheet itself, which is a major enhancement - but this post is about versions lower than 9.0.
There are some limitations with the display of absences on the timesheet. Absences will not be loaded onto the timesheet grid, so a user cannot see an absence TRC on the timesheet grid per se - just a greyed-out row with a reported status of approved and the number of hours of absence. This can be pretty confusing and sometimes even useless for the users. Absences are shown in the grid called 'Reported Hours Summary' just below the timesheet, but on the punch timesheet page only the time reporting category is displayed there. This is where the TRC Category, introduced in 8.9, plays an important part. If one creates appropriate TRC Categories for each category of absence and links each category to an absence TRC, the data appearing on this grid can be useful for the users.
Another limitation of the integration with NA Payroll as the pay system is that the integration program can be run only for finalised AM calendar groups. This is a significant limitation, as users will be able to see the absence data on the timesheet only after a calendar is finalised, which happens at the end of a period. To overcome this limitation, we had to customise the process to enable it to run for any absence calendar that is open.
On the other hand, the integration with GP as the pay system can happen after a manager approves an absence and need not even wait till the absence calculation is run. This means that the data seen in T&L can be inaccurate and incomplete.
We faced interesting challenges while designing the payroll interface (a custom interface to local payrolls) to take T&L and Absence data, because of the above design. There was no way we could take the entire 'payable data' from Time and Labor even though it contained the absence data too, since the absence data in Time and Labor was inaccurate and incomplete. This necessitated two interface programs - one to take T&L data from the TL_PAYABLE_TIME table and another to take AM data from the GP_PI_GEN_DATA table.
Also note that any system that has the integration of these two modules will have to closely tie any setup change to one module with the other. For example, any change to the absence code setup or the creation of a new absence code etc. will require similar actions on the T&L TRC end too.

Calendars related to T&L that need periodic updates

This was a question posed today on the popular forum ittoolbox, about all the calendars related to T&L that require periodic updates. Every implementation should have a yearly check on its checklist to ensure that the following are up to date:

1. T&L Time Period Calendars, found under Setup HRMS --> Product Related --> Time and Labor --> Periods. Time Period Calendars are critical for the Time Administration process to find the Period of Interest (POI) and for T&L rules. These calendars have to be built on a periodic basis.

2. Holiday Schedule has to be updated every year to reflect the statutory holidays of a year. The Holiday Schedule is a common HRMS definition shared across various modules like AM, GP, NA Payroll, T&L etc.

3. The Time and Labor dates table, found on the second tab of the Time and Labor installation page, will also have to be built periodically.

4. The Time Zone Offset table, found in the Peopletools, International folder, also has to be updated as necessary. The Time Zone Offset table is used in a number of T&L processes, and they will not run until the table is populated using the 'Generate Query Offset' feature.
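As a rough illustration of what building period calendars amounts to, the sketch below generates contiguous weekly period boundaries - the actual build is of course done by the delivered build process, and the start date here is just an example:

```python
from datetime import date, timedelta

def weekly_periods(first_start, n):
    """Generate n contiguous weekly (begin_date, end_date) pairs."""
    periods = []
    start = first_start
    for _ in range(n):
        end = start + timedelta(days=6)   # a 7-day period, inclusive
        periods.append((start, end))
        start = end + timedelta(days=1)   # next period starts the day after
    return periods

# 52 Monday-to-Sunday periods covering 2009 (example start date)
periods = weekly_periods(date(2009, 1, 5), 52)
print(periods[0], periods[-1])
```

Whatever the period type, the yearly check is simply: does the last built period end on or after the horizon your Time Administration runs will need?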

Wednesday, March 11, 2009

The case of the missing exceptions

Peoplebooks say that if an exception is defined with the 'archive' checkbox ticked, then once the exception is resolved it will remain in the TL_EXCEPTION table. But does that always happen? The answer is no. If you make any manual change that nullifies the criteria of an exception, Time Administration will clear it from the table.

Tuesday, March 10, 2009

Time and Labor and Daylight Saving Time

This was a very interesting piece of change for me - understanding how T&L handles the reported and payable time of time reporters working across the time when daylight saving starts or ends. I was surprised to find that, despite being on the latest tools version, T&L was not completely error-proof in handling DST changes. We have a large number of employees working in the PST time zone, and Pacific time moved ahead by one hour (from GMT-8 to GMT-7) at 2:00 AM on 8th March. So what happens to employees who started work before 2:00 AM and ended work after 2:00 AM? For example, consider the following punch timings:

In: 7th March 10:00:00 PM
Break: 8th March 1:00:00 AM
In: 8th March 1:30:00 AM
Out: 8th March 3:30:00 AM

Though it appears from the punch timings that the employee worked for 5 hours and 30 minutes, the actual hours worked are only 4.5, because at 2:00 AM the clocks gained an hour and it became 3:00 AM. There are many implications of this scenario which every organisation using T&L has to be aware of:

1. A high-severity exception (TLX10076) will be generated if any punches fall within the DST change window - in the case above, between 2:00:00 AM and 2:59:59 AM. Though this is very unlikely if your organisation uses TCDs, it can happen if any manual change is made to the timesheet.

2. For employees working across the DST change window, the difference between the last Out and the first In will not add up to the punch total on the timesheet. In the example above, the punch total on the timesheet will come to 4.5 hours. This is Peoplesoft doing the right timezone calculation and should not be taken as an issue. It nevertheless caused considerable confusion among our users, who even manually adjusted the timesheets in an effort to make the reported time difference equal the punch time total.

3. Payable time will be affected by this change. When the clocks move ahead by an hour, the payable time generated will be one hour less than the actual worked time, and when the clocks move back, the payable time generated will be one hour more than the actual reported time (yes, Peoplesoft overpays your employees!). On 8th March, we found that all employees who worked during the DST change were paid one hour less.
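The arithmetic behind point 2 can be checked with a short Python sketch. Fixed UTC offsets are used here (GMT-8 before the change, GMT-7 after) so the example needs no time zone database:

```python
from datetime import datetime, timedelta, timezone

# On 8 March 2009, US Pacific time jumped from GMT-8 (PST) to GMT-7 (PDT)
# at 2:00 AM local time.
PST = timezone(timedelta(hours=-8))
PDT = timezone(timedelta(hours=-7))

first_in = datetime(2009, 3, 7, 22, 0, tzinfo=PST)
last_out = datetime(2009, 3, 8, 3, 30, tzinfo=PDT)  # after the spring-forward

# Naive wall-clock difference: what the punch times appear to say
wall_clock = datetime(2009, 3, 8, 3, 30) - datetime(2009, 3, 7, 22, 0)

# Offset-aware difference: subtraction happens in UTC, so the lost hour counts
actual = last_out - first_in

print(wall_clock.total_seconds() / 3600)  # 5.5 wall-clock hours
print(actual.total_seconds() / 3600)      # 4.5 hours actually elapsed
```

This is exactly the calculation behind the 4.5-hour punch total above: the timezone-aware difference is an hour shorter than the naive one.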

So how do you prepare for the DST change?
  • It's advisable to have audit queries prepared beforehand to identify all employees who worked across the DST change. Track the reported and payable time of these employees to ensure that there are no discrepancies.
  • Have a communication plan in place asking users not to edit the timesheet around the DST change date/time if possible. Also let users know that a high-severity exception might be generated due to the DST change. There is not much an employee or a manager can do to resolve that exception, so it's better to have the HR support team do it.
  • Payable time will be affected by this change. This means there is a chance of overpaying or underpaying employees who worked across a DST change. I prefer to correct this in T&L itself before passing data to downstream systems. When the DST change happened on March 8th, we had employees whose payable time was one hour less than their actual worked hours. This difference was offset by reporting 1 hour of Regular TRC on the elapsed timesheet.
In conclusion, I feel Time and Labor handles DST changes pretty well. The most important thing is to monitor the system properly, identify the affected employees, and ensure that their reported and payable time are as expected.