Tuesday, September 25, 2007

How to unleash the best of Oracle Features… Part I

In this series, I will cover some of the Oracle features that can be easily implemented to optimize our PeopleSoft environment.
Resumable Space
In today's world, you will hardly find a database that is not monitored for space usage. Even so, a process can still fail because it ran out of space in the UNDO tablespace or a user-defined tablespace, or because the DBA was too late in allocating the required space. The majority of PeopleSoft processes have restart capability. However, there are instances where you simply cannot afford to let a process fail and restart from the last commit/rollback step, for example data conversion during cutover.
The Resumable Space feature provides the ability to suspend a transaction when it hits a space error. The transaction resumes once the error is corrected.
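While a statement is suspended, the DBA can see it in DBA_RESUMABLE and let it resume simply by fixing the space problem. A minimal sketch (SYSADM stands for your PeopleSoft access ID, and the datafile name and size are purely illustrative; the executing user needs the RESUMABLE system privilege):
GRANT RESUMABLE TO SYSADM;
-- See which statements are currently suspended and why
SELECT NAME, STATUS, ERROR_MSG, SUSPEND_TIME FROM DBA_RESUMABLE;
-- Fix the space problem; the suspended statement then resumes automatically
ALTER DATABASE DATAFILE '/u01/oradata/hrprd/psindex01.dbf' RESIZE 4096M;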
Below are the steps to implement this feature for a SQR process.
1. Modify the SQR to include the following procedure which will be called at the start of the process.
begin-procedure SetResumable          ! Set RESUMABLE in current session
begin-SQL
ALTER SESSION ENABLE RESUMABLE;
end-SQL
end-procedure
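By default the statement stays suspended for 7200 seconds (two hours) before erroring out. If a different window or a recognizable name is preferred, the same statement inside the begin-SQL block also accepts TIMEOUT and NAME clauses; the values below are just an illustration:
ALTER SESSION ENABLE RESUMABLE TIMEOUT 14400 NAME 'HX_DATA_CONVERSION';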

2. Create an AFTER SUSPEND trigger to act on any transaction that has been suspended.
CREATE OR REPLACE TRIGGER resumable_alert_notifier
AFTER SUSPEND
ON DATABASE
BEGIN
   -- Log the error
   INSERT INTO RESUMABLE_ERR_LOG
   SELECT SQL_TEXT, ERROR_MSG, SUSPEND_TIME
     FROM USER_RESUMABLE
    WHERE SESSION_ID = (SELECT DISTINCT SID FROM V$MYSTAT);
   -- Send email notification using UTL_SMTP
   -- (code not shown)
END;
/
You can decide the best strategy (notification, timeout, abort, or error logging) for suspended transactions by coding it in the AFTER SUSPEND trigger.
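For example, a minimal sketch of the timeout strategy (the four-hour value is just an illustration) uses the DBMS_RESUMABLE package inside the trigger:
CREATE OR REPLACE TRIGGER resumable_extend_timeout
AFTER SUSPEND
ON DATABASE
BEGIN
   -- Give the DBA up to four hours to add space before the statement errors out
   DBMS_RESUMABLE.SET_TIMEOUT(14400);
END;
/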
Summary
Resumable Space is a cool feature and I recommend considering it for implementation in your production environment to keep business-critical processes from failing when they encounter space issues. As with any implementation, the mantra is “Understand > Plan > Implement in non-production db > Test > Test > Test > Implement in production > Relax”.

Tuesday, September 18, 2007

Data Integration Challenge – Understanding Lookup Process – Part II


Most of the leading products, such as Informatica and DataStage, support all three types of lookup process in their product architecture. The table below lists when to use and when not to use each type of lookup.
Direct Query (uncached lookup in Informatica)
When to use:
  • The lookup process is invoked only once or a very few times
  • The ETL server and the database are co-located or well connected
When not to use:
  • A large volume of source records is being read and a lookup query is executed for every incoming record, which is costly in terms of network load, query parsing, data parsing and disk hits
  • The same set of records is queried again and again

Join Query (Joiner transformation or a join in the Source Qualifier in Informatica)
When to use:
  • Multiple records are returned by the lookup process and all the returned records are required for further processing
  • Both the source and the lookup table are on the same database
When not to use:
  • A source record performs the lookup only when some other condition is true, i.e. not all the records read from the source do a lookup
  • The source and lookup table columns are not indexed on the lookup condition
  • The database memory is fully utilized and outer joins are executed badly by the database

Cached Query (cached lookup in Informatica or hash files in DataStage)
When to use:
  • The lookup process is executed many times
  • There is a large volume of data in the lookup table
  • The same set of records from the lookup table is used by multiple jobs
When not to use:
  • Disk space is a constraint
  • Multiple records from the lookup are required for processing
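As a simple illustration (the table and column names below are hypothetical), the Direct Query approach issues one query per incoming source record, while the Join Query approach resolves the lookup in a single pass on the database:
-- Direct Query lookup: executed once for every incoming source record
SELECT CUST_NAME
  FROM DIM_CUSTOMER
 WHERE CUST_ID = :incoming_cust_id;

-- Join Query lookup: source and lookup resolved in one database pass
SELECT S.ORDER_ID, S.CUST_ID, C.CUST_NAME
  FROM SRC_ORDERS S
  LEFT OUTER JOIN DIM_CUSTOMER C ON C.CUST_ID = S.CUST_ID;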
Advantages of Cached Lookup:
The advantages of using cache-file-based lookups are:
  • The cache file contains only the fields needed by the lookup process, so querying it returns results faster than querying the lookup table, which may carry many more fields
  • The data structure of the cache file is designed so that the ETL server can query it directly, without an additional layer such as SQL
Though user manuals generally say that cache files are best suited for low lookup volumes, in practical scenarios I have seen cache files deliver more performance value when the number of lookup records is huge.
Dynamic Cache: Informatica's dynamic cache and DataStage's hash files both support inserting, updating or deleting records in the cache file. Updating the cache file is useful when we want to keep the cache file and the lookup table in sync.
Handling Multiple Return Records: Handling the return of multiple records by a lookup process is still a challenge, and to my knowledge it is not implemented in any of the leading products. Perhaps in release 9 Informatica's Lookup could take a parameter defining the number of records to return as an array, as its Normalizer transformation does.

In Part III we shall look at some of the things to consider for better performance when using the lookup process.

Monday, September 17, 2007

PSADMIN and Remote Desktop


Ever had problems connecting to the server hosting your NT process scheduler or application server using Remote Desktop, and not being able to use “psadmin” to perform administrative tasks?

Here is what happens when you connect to the server using remote desktop and invoke psadmin.


You will receive an error message like the one above when you try to check the status of the process scheduler.
So, how do you determine whether your NT process scheduler is running? There are multiple ways:
1. Look at Task Manager
2. Look at the log files
3. Check the status using psadmin
In this post, I will cover the last method, since the other two are straightforward.
Here is how you can use psadmin when connected using remote desktop.

1. Ensure that the “Telnet” service is started on the server
2. Go to Start > Run > Telnet <servername>
3. You will get a command prompt window
4. From this prompt, go to your PS_HOME/appserv directory
5. Invoke psadmin

Now you should be able to use the “psadmin” utility for monitoring, configuring, starting, stopping, and so on.


Tuesday, September 11, 2007

PeopleCode Properties???

Some time back, someone on my team asked me: how do we determine when, and by whom, a particular Record PeopleCode was customized? The compare report showed that the PeopleCode was customized, but there were no comments (bad practice!) in the code to show who touched it last.
In Application Designer the Properties button is grayed out when you open PeopleCode, and the properties at the record level will not answer this question either. Here is the SQL you can use to determine who last updated the code and when.
SELECT A.OBJECTVALUE1 RECORD, A.OBJECTVALUE2 FIELD, A.OBJECTVALUE3 EVENT,
       TO_CHAR(A.LASTUPDDTTM, 'YYYY-MM-DD-HH24.MI.SS."000000"'),
       A.LASTUPDOPRID
  FROM PSPCMPROG A
 WHERE A.OBJECTVALUE1 = '&RECNAME'
   AND OBJECTVALUE2 = '&FIELDNAME'
   AND UPPER(OBJECTVALUE3) = UPPER('&EVENTNAME')
/
RECORD FIELD EVENT TO_CHAR(A.LASTUPDDTTM,'YYY LASTUPDOPRID
——— ————— ——— ————————– ————
VOUCHER BUSINESS_UNIT RowInit 2006-11-13-18.25.23.000000 PPLSOFT
VOUCHER BUSINESS_UNIT RowInit 2006-11-13-18.25.23.000000 PPLSOFT

Monday, September 10, 2007

Business Intelligence Utopia – Enabler 3: Data Governance


The “Power of Ten” introduced earlier in this forum is a list of pre-requisites to deliver the real promise of BI. We have already seen the first two – Changes to OLTP systems and Real time Data Integration.
The third enabler in the list is ‘Data Governance’. With increasing volumes of data coupled with regulatory compliance issues, the topic of Data Governance is very much in vogue, to the extent that anybody can look intelligent (beware!) by coining new terms like Data Clarity, Data Clairvoyance etc.
Data Governance, at a very fundamental level, is about understanding the data generated by the business, managing the quantity and quality of that data, and leveraging it to make sound business decisions for the future. In my view, the steps needed in a practical data governance program are:
1) Organizational entity, headed by a Chief Data Officer (CDO), whose task is to formulate and implement decisions related to Data Management across multiple dimensions, viz. Business Operations, Regulatory compliance etc.
2) Comprehensive understanding of the data ‘value chain’ – From the source of origination to its consumption. It is important to understand that the origination and / or consumption can also be outside the organizational boundaries.
3) Understand the types of data within the enterprise by following a ‘divide-and-conquer’ strategy. One of my previous posts on this blog illustrates one way of dividing data into ‘mutually exclusive, collectively exhaustive’ (MECE) categories.
4) Profile data on a regular basis to statistically measure its quality.
5) Set up a Business Intelligence infrastructure that effectively harnesses data assets for making decisions that affect (positively, of course!) the short, medium and long-term nature of the business.
6) A continuous improvement program to ensure that data is optimally leveraged across all aspects of the business. A data governance maturity model, like the one illustrated here, can be envisaged for your organization.
‘Competing on Analytics’, a classic Harvard Business Review article by Thomas Davenport, illustrates the power of fact-based business decisioning. For businesses to realize that power, it is important to recognize that good data, not just ‘any’ data, is a source of competitive advantage.
Data Governance is fundamental to making organizations better, and that is why it figures as number 3 in my list of ten enablers for Business Intelligence Utopia. Informative articles on Data Governance are available at the following link.

Thursday, September 6, 2007

Good Practice for executing PSBUILD scripts

I have had change control call me several times in the past to restore a table that was accidentally dropped when executing a SQL script generated by Application Designer. Whenever you use Application Designer to generate a script with the “Alter by Table Rename” option checked, the script contains the following sequence of statements:
- Start the transaction
- Create temporary table
- Copy from source to temp table
- CAUTION: Drop original table
- Rename table
- Done
The risk of running this script as-is is that if any of the above statements encounters an error, the script still proceeds to run the next statement. So if an error occurs in any statement before the original table is dropped, the drop still executes and we lose the table and all of its data.
Hence, the best practice for running any script created using Application Designer is to include ‘WHENEVER SQLERROR EXIT’ at the beginning of the script. This ensures that the script aborts as soon as an error occurs. Making this part of our Change Control checklist has ensured that this issue does not recur.
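As a minimal sketch (the record name, columns and generated statements below are purely illustrative), the protected script would look like this:
WHENEVER SQLERROR EXIT FAILURE ROLLBACK
-- Generated alter-by-rename steps (abbreviated)
CREATE TABLE PSYHX_TEST (EMPLID VARCHAR2(11) NOT NULL, NEW_FIELD VARCHAR2(10) NOT NULL);
INSERT INTO PSYHX_TEST (EMPLID, NEW_FIELD) SELECT EMPLID, ' ' FROM PS_HX_TEST;
DROP TABLE PS_HX_TEST;
RENAME PSYHX_TEST TO PS_HX_TEST;
-- With WHENEVER SQLERROR EXIT in place, a failure in the CREATE or INSERT stops
-- the script before the DROP, so the original table and its data are preserved.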
Note:
Scripts generated using Change Assistant automatically have ‘whenever sqlerror exit’ at the beginning of the script.

Wednesday, September 5, 2007

Learning Management System drives a Company’s Growth

It was Lewis Hamilton, the young driver in the Formula One championship, who managed a hard-fought win over the other drivers in the recently held Hungarian F1 Grand Prix. Looking back at his performance, there are many factors that made him win the race. Similarly, young players in the corporate world are growing at a pace of 200% compared to their counterparts. There are many aspects that influence a company's growth; one is its ability to maintain its talent pool, another is how it nurtures that talent. Yes, it is achieved through training, knowledge sharing, mentoring, and self-learning by employees. One of the things that drives these activities is the Learning Management System (LMS) that the company uses. It is the LMS that drives the growth of an organisation.
Let's have a look at the Docent Enterprise Suite, which was rated a leader in Learning Management Systems by Gartner a couple of years ago. The suite comprises a Learning Management System (LMS), a Content Delivery System (CDS), and an authoring tool (Docent Outliner) to take care of learning activities as well as the performance evaluation of an individual or group of people within an organisation.
Friends, I have just given a gist of the Docent Learning Management System. Leave your queries as a comment to continue the discussion about the system.

Monday, September 3, 2007

Not able to view record in PeopleSoft Query Manager!!

Some time back, a developer approached me with a request: he had created a new record in Application Designer but was not able to view it in Query Manager. In this post, I will demonstrate the steps required to make a new or existing record visible in Query Manager. The steps below are from PeopleTools 8.47.02.
  1. A new record, HX_TEST, has been created and we would like to use it in a PS Query.
  2. Navigate to PeopleTools > Security > Query Security > Query Access Manager.
  3. Search for the tree that should give access to this record.
  4. Click on the hyperlink and add the record HX_TEST.
[Screenshot: Query_sec1]
  5. Open the appropriate Permission List that has access to HX_TEST_TREE and click on the Query tab.
[Screenshot: Query_sec2]
  6. Click on Access Group Permissions.
[Screenshot: Query_sec3]
  7. Verify the access.
[Screenshot: Query_sec4]
  8. Open PeopleSoft Query Manager and you should now have access to the new record.
[Screenshot: Query_sec5]
[Screenshot: Query_sec6]
Note that this activity is the realm of the PeopleSoft Security Administrator, since it needs careful analysis of the access being granted.