Showing posts with label Implementation Service. Show all posts

Thursday, June 19, 2008

Data Integration Challenge – Carrying Id-Date Fields from Source


Ids in Source System: Sometimes we are in a dilemma about whether to carry the identity (id) fields from the source system into the data warehouse as identity fields as well. A couple of situations push us into this state:
  1. The business users are more familiar with product ids like 1211 or 1212 than with the product names themselves, and they need them in the target system
  2. Why create an additional id field in the target table when we can get a unique identity for each record from the source system?
What are source id fields? They are usually the surrogate keys or unique record keys, like product id or customer id, with which the business may be more familiar than with their descriptions; the descriptions of these ids are mostly found on report printouts. In general, most of the source id fields get mapped to the dimension tables.
Here are the reasons why we should carry the source id fields as-is:
  1. The business is comfortable talking and analyzing in terms of ids rather than descriptions
  2. Source id fields, which are usually numeric or otherwise of small length, are very lookup friendly; using ids for lookup, filter or join conditions when pulling data from source systems is much better than using descriptions
  3. Source id fields enable linking of the data in the data warehouse back to the source system
  4. Just consider the id as another attribute of the dimension and not as a unique identifier
Here are the reasons why we should create an additional id (key) field in addition to the source id field:
  1. Avoiding duplicate keys when the data is to be sourced from multiple systems
  2. The source ids can merge, split or change in any way; we would want to avoid that dependency on the source system
  3. The id field created in the data warehouse would be index friendly
  4. Having a unique id for each record in the data warehouse makes it much easier to determine the number of unique records
Dates in Source System: Another field that usually confuses us is the date field in the source systems. The date field present in the source record tells us when the record arrived in the source system, whereas a date field generated in the data warehouse tells us when that source record arrived in the data warehouse.
The data warehouse record date and the source record date can be the same if the source record moves into the data warehouse on the same day; the two fields will certainly hold different values if there is a delay in the source data arriving into the data warehouse.
Why do we need to store the source date in the data warehouse? This need is very clear: we always perform date-based analysis based on the arrival of the record in the source systems.
Why do we need to generate a data warehouse date? Capturing the arrival of the record into the data warehouse answers queries related to audit and data growth, and also helps determine which new records arrived into the warehouse, which is especially useful for providing incremental extracts to downstream marts or other systems.
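Putting both recommendations together, a dimension table would carry the warehouse-generated key alongside the source id and both date fields. A minimal sketch follows; the table and column names are illustrative, not taken from any particular source system:

```sql
-- Hypothetical product dimension: warehouse surrogate key plus the source id,
-- retaining both the source record date and the warehouse load date.
CREATE TABLE DIM_PRODUCT (
    PRODUCT_KEY     NUMBER        NOT NULL,  -- surrogate key generated in the warehouse
    SRC_PRODUCT_ID  NUMBER        NOT NULL,  -- id carried as-is from the source system
    SRC_SYSTEM_CD   VARCHAR2(10)  NOT NULL,  -- disambiguates ids when sourcing from multiple systems
    PRODUCT_NAME    VARCHAR2(100),
    SRC_RECORD_DT   DATE,                    -- when the record arrived in the source system
    DW_LOAD_DT      DATE,                    -- when the record arrived in the data warehouse
    CONSTRAINT PK_DIM_PRODUCT PRIMARY KEY (PRODUCT_KEY)
);
```

With this shape, an incremental extract for a downstream mart is simply a filter on DW_LOAD_DT being later than the previous extract date, while business analysis continues to use SRC_RECORD_DT.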

Read Brief About: Data Integration Challenge 

Monday, May 26, 2008

Part I – Peoplesoft and Microsoft Active Directory Single sign on Setup


Hi Friends, it's been a long time since I posted in our Pitstop.

Maybe because I have been kept busy with the workload and also the 2008 Formula One Championship. It was too good to watch the thundering drive to the chequered flag by team Ferrari, and I was hardly concentrating on the opponents' pitstops…

Here we go… I would like to bring in the 'technical collaboration' (term taken out of the air) between Microsoft Active Directory and Oracle PeopleSoft. You may find a lot of information on this on the Internet and in other related material of interest to you. So I thought of listing down some of the basic aspects that need attention while we are setting up single sign-on:
  • Gathering the correct 'Connect String' information from the Active Directory (LDAP)
  • Selecting a network analyzer tool to trace the data packet movement, just in case we run into connectivity problems, and
  • Configuring the schema cache on the PeopleSoft environment.
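As a quick sanity check on the connect string before touching PeopleSoft, the directory can be queried directly with a standard LDAP client. This is only a sketch; the host, port, bind DN, base DN and account name below are placeholders for your own environment:

```
# Verify the AD connect string with a simple bind (-W prompts for the password).
# Replace ad.example.com, the bind DN and the base DN with your own values.
ldapsearch -H ldap://ad.example.com:389 \
  -D "CN=psbind,OU=Service Accounts,DC=example,DC=com" -W \
  -b "DC=example,DC=com" "(sAMAccountName=jsmith)" cn mail
```

If this returns the expected entry, the same host, port and bind credentials should work in the PeopleSoft directory configuration.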
Do post queries if you happen to run into any issues when performing single sign-on or when you need more information on the procedure.

I'm happy that I had a very quick pitstop here; back on track, and will be back soon.
Read More About: PeopleSoft

Monday, March 3, 2008

PeopleSoft Fine Grained Auditing – Part II

Now, let's test this policy. Log on to the PeopleSoft environment using the browser and create a PRIVATE query referring to the above table. The query will not have any criteria and will fetch all rows (this table only had 1002 rows).
After executing the query, the audit data is populated in DBA_FGA_AUDIT_TRAIL.
select timestamp, db_user, client_id, object_name
from dba_fga_audit_trail
where object_name = 'PS_ABC_COMPANY_TBL'
/
TIMESTAMP DB_USER CLIENT_ID OBJECT_NAME
--------- ------- --------- ------------------
21-MAY-07 SYSADM  NPAI      PS_ABC_COMPANY_TBL
21-MAY-07 SYSADM  NPAI      PS_ABC_COMPANY_TBL

We can also see the actual SQL executed by the user by selecting the SQL_TEXT column from the same data dictionary view.
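For example, a query along these lines shows the exact statement behind each audit record:

```sql
-- Include SQL_TEXT to see the statement each audited user executed.
select timestamp, db_user, client_id, sql_text
from dba_fga_audit_trail
where object_name = 'PS_ABC_COMPANY_TBL'
order by timestamp;
```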
Summary:
I have shown a small example utilizing FGA for auditing the PeopleSoft database. FGA is a neat feature and allows us to audit a specific rowset instead of auditing all the rows in the table. This is very useful when a table holds both sensitive and non-sensitive information, and you want to audit any unauthorized access to the sensitive columns or rows.
Note 1:
* As of 9i, the FGA feature only allows auditing SELECT statements. 10g supports SELECT, DELETE, UPDATE and INSERT.
Note 2:
If you need to drop the policy, use the SQL below:
begin
  dbms_fga.drop_policy (
    object_schema => 'SYSADM',
    object_name   => 'PS_ABC_COMPANY_TBL',
    policy_name   => 'ABC_COMPANY_TBL_ACCESS'
  );
end;
/
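For reference, a policy like the one dropped above would have been created with dbms_fga.add_policy. The audit_condition and audit_column below are purely illustrative, since the actual policy was defined in Part I:

```sql
begin
  dbms_fga.add_policy (
    object_schema   => 'SYSADM',
    object_name     => 'PS_ABC_COMPANY_TBL',
    policy_name     => 'ABC_COMPANY_TBL_ACCESS',
    audit_condition => 'COMPANY = ''ABC''',  -- illustrative: audit only this rowset
    audit_column    => 'ANNUAL_RT'           -- illustrative: a sensitive column
  );
end;
/
```

With audit_condition and audit_column set, only statements touching that rowset and column generate audit records, which is what makes FGA "fine grained" compared to standard auditing.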
Bug Note:
Do not forget to check out the bug related to FGA
http://www.red-database-security.com/advisory/oracle-fine-grained-auditing-issue.html

Monday, January 28, 2008

Instant Peoplesoft Cloning Refresher Kit


Here are the instant refresher steps for performing a PeopleSoft clone.
  • The target system: the system to which the PeopleSoft instance is to be cloned.
  • The source system: the system from which the PeopleSoft instance is cloned.
  • Oracle 9i with the version 7 patch is installed on the target system.
  • The application server and web server are also installed on the target system.
  • The database structure files from PS_HOME on the source system are copied to a separate drive on the target system.
  • Update the domain names and coordinates in the TNS listener files.
  • Configure the application and web server domain names.
Points to Ponder
  • Before initializing the activity, the very first step is to shut down the source database:
    • Set ORACLE_SID to the domain name, and
    • Connect as "sys as sysdba" and issue shutdown immediate
  • Shut down the application server on the source system
  • Shut down the web server on the source system
  • Go to Oracle\Oradata\<domainname> and copy the *.dat files to the target system
  • Make sure all the control files, data files and redo log files are in place on the destination system
  • Execute 'alter database backup controlfile to trace' on the source system
  • Make sure that the trace file is created under the udump folder as a result of the above command
  • Make the necessary changes in the file and save it as a *.sql script
  • Copy the parameter file named init<domainname>.ora under oracle\ora92\database and edit the coordinates and parameters to reflect the target system's setup
  • Issue startup nomount; and execute the script generated from the source system
  • UPDATE PSDBOWNER SET DBNAME = '<new domain name>'
  • Use the Net Configuration Assistant wizard to configure the listener with the TNS parameters, host name, IP address and system name (target system), and perform a test
  • Using the configuration assistant, test access from PeopleSoft Application Designer and Data Mover with the same user access credentials as the source system database
  • Configure the web server by providing the domain name and PeopleSoft web site name, and run startPIA.exe; make sure that the environment variables are also set for JAVA_HOME.
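The database-side steps above can be sketched as one sequence. Names are placeholders, and the CREATE CONTROLFILE script itself comes from the edited trace file, so treat this only as an outline:

```sql
-- Source system: capture a controlfile creation script
-- (the database must be mounted or open when this runs).
ALTER DATABASE BACKUP CONTROLFILE TO TRACE;

-- Target system: after copying the database files and editing
-- init<domainname>.ora, start the instance and recreate the
-- controlfile from the edited trace script.
STARTUP NOMOUNT;
@create_controlfile.sql   -- the edited trace file saved as *.sql

-- Point PeopleSoft at the new database name (placeholder value).
UPDATE PSDBOWNER SET DBNAME = 'NEWDB';
COMMIT;
```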

Take 3: Jolt Session Pooling Covered….

Before reading this entry (always a friendly warning first!), please read my two earlier blog entries. This entry is a kind of closure to my previous topic, "Jolt Session Pooling". If you have still decided to read this one, you can't complain about not being able to understand it…
In our last blog entry, we talked about Jolt Session Pooling. As I stated earlier, Jolt Session Pooling is enabled by default in Tools version 8.48 and later. Considering this new feature(!), there are some things that change because of this single parameter, starting with Tools 8.48…
We also talked about the servlets (like psc, psp etc.) that have definitions in the web.xml file (under the DOMAIN/PORTAL/WEB-INF directory). So, back to our main point: why does JoltPooling need to be disabled for all the servlet entries in the web.xml file for the "download to excel" button to work? You have probably guessed it already.
The key point here is that when Jolt Session Pooling is enabled, the session is shared across all the servlets. Existing sessions are shared across multiple servlets within the WebLogic server. This is supposed to provide good performance results. There are no dedicated sessions. As of now, I know of some of the effects of this change in an 8.48 environment:
  1. “Download to Excel” button not working
  2. Tuxedo is unable to list the online users on the system
  3. “View Attachment” is not working
If you know of anything else that the enabled "Jolt Session Pooling" option causes in PeopleSoft, please write in the comments below for me to know and learn. My policy is to share the knowledge, and feel free to learn. :)
To know more about Jolt and WebLogic, I would recommend reading the BEA guide "Using BEA Jolt with BEA WebLogic Server". I am going to write about some new things going forward… One of them is my favorite: Inter-Process Communication in the Unix system and PeopleSoft.
Read More About  Jolt Session Pooling

Thursday, December 27, 2007

Fine Grained Access Control for PeopleSoft Database – II

Continuing from the previous post, here are the steps to implement the Fine Grained Access Control feature to mimic the row-level security in your PeopleSoft online Query Manager.
Step 1:
We will create a function QRY_SEC_FUNCTION that will be used by the policy to add the filter.
create or replace function QRY_SEC_FUNCTION (schema_name IN varchar2,
                                             table_name  IN varchar2)
return varchar2
as
  V_OPRID         VARCHAR2(32);
  V_EMPLID        VARCHAR2(20);
  V_CLIENT_INFO   VARCHAR2(1000);
  V_QRYSECRECNAME VARCHAR2(32);
  V_SQL_TEXT      VARCHAR2(4000);
  V_TABLE_NAME    VARCHAR2(32);
begin
  V_CLIENT_INFO := SYS_CONTEXT('USERENV', 'CLIENT_INFO');
  V_OPRID       := SUBSTR(V_CLIENT_INFO, 1, INSTR(V_CLIENT_INFO, ',', 1) - 1);
  V_TABLE_NAME  := TABLE_NAME;
  if V_OPRID is null then
    V_SQL_TEXT := 'EXISTS (SELECT ''X'' FROM PS_PERS_SRCH_QRY A1 WHERE A1.EMPLID = ' || V_TABLE_NAME || '.EMPLID AND A1.OPRID = USER)';
  else
    V_SQL_TEXT := '1=1';
  end if;
  RETURN V_SQL_TEXT;
exception
  when others then
    return '1=1';
end;
/
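Before attaching the function to a policy, the predicate it returns can be checked directly from SQL*Plus; the schema and table arguments are simply passed through, so any pair works for a quick test:

```sql
-- In a plain SQL*Plus session CLIENT_INFO is not set, so the function
-- should return the EMPLID-based EXISTS filter rather than '1=1'.
select QRY_SEC_FUNCTION('SYSADM', 'PS_PERSONAL_DATA') from dual;
```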
Step 2:
Now we will create the policy:
begin
  dbms_rls.add_policy (
    object_name     => 'PS_PERSONAL_DATA',
    policy_name     => 'PERSONAL_DATA_POLICY',
    policy_function => 'QRY_SEC_FUNCTION',
    statement_types => 'select',
    update_check    => TRUE
  );
end;
/
That’s it!!
Now let’s test the result…
As we can see above, now our results from the database match the results from online query. The function has dynamically added the additional criteria similar to what was done by Query Manager.

Summary:
In today's world, it has become critical to ensure that there are no security loopholes in the system that expose data to people who should not be seeing it. The row-level security provided by PeopleSoft helps us secure online access, but we most often forget that users set up directly in the database can bypass this security and have access to all the data. FGAC helps us replicate the online row-level security in the database, thus helping us secure the database further.
Read More About  Fine Grained Access Control

Saturday, December 8, 2007

How to create new Manager Self Service Transaction?

As everyone knows, PeopleSoft itself provides a lot of MSS transactions.
For example:
  1. Transfer Employee
  2. Job Change
  3. Promote Employee
  4. Terminate Employee
Now say, for example, we get a new MSS-related requirement for a Paygroup Change; how would we accomplish it? Here is an insight!!!
Broadly speaking, there are three types of components that need to be built to accomplish any MSS transaction:
  1. Transaction Level Components
  2. Approval related Components (If Required)
  3. Viewing Transaction/Approval Workflow Status related Components
Each of these components is in turn driven by two types of component pages: one for launching the Manager's Direct Reports, and the other for Adding/Approving/Viewing a (Paygroup Change) transaction.
In order to maintain all this transactional and approval workflow status related data, we should create two records to hold the data. As per PS convention, one way to maintain the data is to have:
  1. HR_PAYGRPCHG_DAT – This record holds all the (Paygroup Change related) transactional data
  2. HR_PAYGRPCHG_STA – This record holds all the approval workflow status related data
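As a sketch, the tables behind these two records might look like the following; every column here is hypothetical, chosen only to illustrate the transactional/status split:

```sql
-- Hypothetical transactional record for the Paygroup Change MSS transaction.
CREATE TABLE PS_HR_PAYGRPCHG_DAT (
    EMPLID        VARCHAR2(11) NOT NULL,
    EFFDT         DATE         NOT NULL,
    PAYGROUP_OLD  VARCHAR2(3),
    PAYGROUP_NEW  VARCHAR2(3),
    REQUESTOR_ID  VARCHAR2(30),            -- manager submitting the change
    CONSTRAINT PK_HR_PAYGRPCHG_DAT PRIMARY KEY (EMPLID, EFFDT)
);

-- Hypothetical approval workflow status record for the same transaction.
CREATE TABLE PS_HR_PAYGRPCHG_STA (
    EMPLID        VARCHAR2(11) NOT NULL,
    EFFDT         DATE         NOT NULL,
    WF_STATUS     VARCHAR2(1),             -- e.g. P=Pending, A=Approved, D=Denied
    APPROVER_ID   VARCHAR2(30),
    LASTUPDDTTM   DATE,
    CONSTRAINT PK_HR_PAYGRPCHG_STA PRIMARY KEY (EMPLID, EFFDT)
);
```

Keeping the two records on the same key lets the launching pages join them to decide which direct reports have transactions pending approval.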
Transactional Level Components
This type of component is purely meant for making transactional changes. In our case, the transactional change is nothing but a Paygroup Change for an employee.
  1. Launching Page:

     The Manager's Direct Reports launching component page is constructed either by the Direct Report Setup (using the OPRROWS component to show direct reports) or by the Configure Direct Reports UI setup (using the HR_DR_SELECTION_UI component to show direct reports).
     Navigation: Setup HRMS >> Common Definition >> Direct Reports for Managers

  2. Transaction Page:

     The transactional component page should be built manually as per the requirements. This is where the manager changes the Paygroup of an employee, which in turn fires approval workflows (if required) or inserts the data into the Job component directly.
     Once the data is saved on this page, it should insert the transaction related data into the DAT record and all the approval workflow related data into the STA record.
Approval related Components / Viewing Approval Workflow Status related Components
  1. Launching Page:

     The launching page for both the Approval and Viewing Status related components should be designed manually to populate the Manager's Direct Reports.

     Basically, launching pages are used to show the direct reports of the manager, but in this case the population of the manager's direct reports is restricted. For example, if an MSS transaction needs an approval, then the corresponding employee for that transaction is shown as one of the direct reports on the launching page. Needless to say, the data shown for the manager's direct reports should be determined by both the transactional (HR_PAYGRPCHG_DAT) and status (HR_PAYGRPCHG_STA) records.

  2. Transaction Page:

     In the case of the Approval related components, the main approval page should be provided with two extra buttons to approve or deny the transaction. All other transactional data should be placed on this page in non-editable mode.

     For viewing statuses, the entire page should be in read-only mode. This page should be used only to view the transaction and approval workflow status related details.

Read More About  Self Service Transaction