Monday, January 28, 2008

Instant Peoplesoft Cloning Refresher Kit


Here are the instant refresher steps for performing a PeopleSoft clone.
  • The target system – the system to which the PeopleSoft instance is to be cloned.
  • The source system – the system from which the PeopleSoft instance is cloned.
  • Oracle 9i, with the version 7 patch, is installed on the target system.
  • The application and web server are also installed on the target system.
  • The database structure files from PSHOME on the source system are copied to a separate drive on the target system.
  • Update the domain names and connection details in the TNS and listener files.
  • Configure the application and web server domains.
Points to Ponder
  • Before starting the activity, the very first step is to shut down the source database:
    • Set ORACLE_SID to the domain name, and
    • Connect as SYS (as SYSDBA) and issue SHUTDOWN IMMEDIATE
  • Shut down the application server on the source system
  • Shut down the web server on the source system
  • Go to Oracle\Oradata\<domainname> and copy the *.dat files to the target system
  • Make sure all the control files, data files and redo log files are in place on the destination system
  • Execute ALTER DATABASE BACKUP CONTROLFILE TO TRACE on the source system
  • Make sure that a trace file is created under the udump folder as a result of the above command
  • Make the necessary changes in that file and save it as a *.sql script
  • Copy the parameter file named init<domainname>.ora under oracle\ora92\database and edit the paths and parameters to reflect the target system
  • Issue STARTUP NOMOUNT and execute the script generated from the source system
  • Run UPDATE PSDBOWNER SET DBNAME = '<new domain name>'
  • Use the Net Configuration Assistant wizard to configure the listener with the TNS parameters, host name, IP address and system name (target system), and perform a test
  • Using the Configuration Assistant, test access from PeopleSoft Application Designer and Data Mover with the same user credentials as the source system database
  • Configure the web server by providing the domain name and the PeopleSoft web site name, then run startPIA.exe; make sure the environment variables (such as JAVA_HOME) are also set
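The database-side commands above can be sketched as a SQL*Plus session. This is a sketch only: the script name is illustrative, the domain name is a placeholder, and note that the controlfile trace must be generated while the source database is still open.

```sql
-- On the source system (ORACLE_SID set to the source domain name):
-- generate the create-controlfile script while the database is open,
-- then shut down cleanly before copying the files.
CONNECT sys AS sysdba
ALTER DATABASE BACKUP CONTROLFILE TO TRACE;
SHUTDOWN IMMEDIATE

-- On the target system, after copying the files and editing
-- init<domainname>.ora and the trace script:
STARTUP NOMOUNT
@recreate_controlfile.sql    -- the edited trace script (name is illustrative)

-- Point PeopleSoft at the new database name
UPDATE PSDBOWNER SET DBNAME = '<new domain name>';
COMMIT;
```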

Take 3: Jolt Session Pooling Covered….

Before reading this entry (always a friendly warning first!), please read the two blog entries that I posted earlier. This entry is a kind of closure for my previous topic, "Jolt Session Pooling". If you have still decided to read this one on its own, you can't complain about not being able to understand it…
In our last blog entry, we talked about Jolt Session Pooling. As I stated earlier, Jolt Session Pooling is enabled by default in Tools version 8.48 and later. With this new feature(!), several things change because of that single parameter, starting with Tools 8.48…
We also talked about the servlets (like psc, psp, etc.) that have definitions in the web.xml file (under the DOMAIN/PORTAL/WEB-INF directory). So, back to our main point: why does JoltPooling need to be disabled for all the servlet entries in the web.xml file for the "download to Excel" button to work? You have probably guessed it already.
The key point is that when Jolt Session Pooling is enabled, sessions are shared across all the servlets within the WebLogic server; there are no dedicated sessions. This is supposed to provide good performance. As of now, I know of the following effects of this change in an 8.48 environment:
  1. The "Download to Excel" button not working
  2. Tuxedo being unable to list the online users on the system
  3. "View Attachment" not working
If you know of anything else in PeopleSoft caused by this "Jolt Session Pooling" option being enabled, please write in the comments below so that I can know and learn. My policy is to share knowledge, so feel free to learn. :)
To know more about Jolt and WebLogic, I recommend reading the BEA guide "Using BEA Jolt with BEA WebLogic Server". I am going to write some new things going forward… one of them is about my favorite topic: inter-process communication in Unix and PeopleSoft.

Business Intelligence and Six Sigma

I just finished a Six Sigma project and was left wondering why BI practitioners are not using more of that Six Sigma power in Business Intelligence. Let me delve into this subject a bit more.
The Six Sigma project that I just completed was on “Developing a Function Point based estimation model for ETL loads”. Essentially, I was facing a lot of problems in estimating the effort for ETL (in this case, Informatica) loads that led to “Effort variances” beyond specified limits. So we kicked off a Six Sigma project that had the following DMAIC phases:
1. Define – Definition of the problem (Ex: Estimation process is out of whack)
2. Measure – We measured the effort variances before the start of the project and also set ourselves a target of where it should be.
3. Analyze – Analyzed the root cause of the problem. The solution was to let go of the complexity-based estimation that was done initially and to adopt Function Points. In fact, this FP-based estimation model was presented at the International Software Estimation Colloquium last year and won the runner-up prize (http://www.qaiasia.com/Conferences/sec2007/leadership.htm)
4. Improve – Based on a pilot within the project, the Function points based linear regression model was arrived at and the team was educated on the estimation process. The improvements to the estimation process (effort variances) were measured on a regular basis.
5. Control – Periodic checks to ensure the institutionalization of the process and also fine-tune wherever necessary.
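To make the Improve phase concrete, here is a minimal sketch of a function-point-based linear regression model of the kind described above. The history data and helper function are made up for illustration; a real model would use many more samples and ongoing calibration.

```python
# Fit a simple linear model: effort = slope * function_points + intercept,
# using ordinary least squares over historical (FP, effort) pairs.
def fit_linear(points):
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Past ETL loads: (function points, person-hours) -- made-up sample data.
history = [(10, 22), (20, 41), (30, 62), (40, 79)]
slope, intercept = fit_linear(history)

# Estimate the effort for a new 25-FP load using the fitted model.
estimate = slope * 25 + intercept
```

Once the model is fitted, new estimates become a one-line lookup, and effort variances can be tracked against the model's predictions during the Control phase.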
That, in a nutshell, is what my Six Sigma project was all about. Basically, Six Sigma tries to improve process efficiency by following the phases mentioned above.
Now let's see the connection to Business Intelligence. Analytics, at this stage of its evolution (in the majority of organizations), is used to find the improvement area at a given point in time. The improvement area can be a problem (e.g., a trend chart showing that sales in the West region have dropped by 10% every quarter for the last three quarters) or an opportunity (e.g., the market potential for a product is huge and our share is small). BI is reasonably good at providing this information, and it will only get better. But BI by itself does not enforce the process and execution rigor that successful organizations require.
To summarize, Six Sigma needs an improvement opportunity as the starting point from which to unleash its power to improve processes. BI generates a lot of these opportunities with its DW/Reporting/Analytics components but does not enforce the process implementation rigor. I feel that there is a lot of synergy in bringing the two together – Six Sigma, the left hand, and BI, the right hand, when brought together can earn a lot of claps in the quest to create learning, performing organizations.
Just to sample the power of Six Sigma techniques, please take a look at the following link: http://www.kaushik.net/avinash/2007/01/excellent-analytics-tip-9-leverage-statistical-control-limits.html, which illustrates the use of control charts (one of Six Sigma's potent tools) in metrics/KPI management. Fascinating!
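As a quick sketch of the control-chart idea behind that link: compute a center line and upper/lower control limits as the mean plus or minus three standard deviations, then flag points outside the limits. The sample values here are made up, and real control charts apply additional run rules beyond the 3-sigma test.

```python
import statistics

def control_limits(samples):
    """Return (lcl, mean, ucl) using mean +/- 3 population standard deviations."""
    mean = statistics.mean(samples)
    sd = statistics.pstdev(samples)
    return mean - 3 * sd, mean, mean + 3 * sd

# Weekly values of some KPI -- made-up data for illustration.
weekly_kpi = [102, 98, 101, 99, 100, 97, 103]
lcl, center, ucl = control_limits(weekly_kpi)

# Any point outside the limits signals special-cause variation worth investigating.
out_of_control = [x for x in weekly_kpi if x < lcl or x > ucl]
```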
Agree or disagree, have more thoughts on this topic, think this post is good or rubbish – for anything, please do send in your comments.
Information Nugget: Having talked about execution rigor, let me recommend one of the best books I have read in that area: "Execution – The Discipline of Getting Things Done" by Larry Bossidy and Ram Charan (http://www.amazon.com/Execution-Discipline-Getting-Things-Done/dp/0609610570)

Friday, January 18, 2008

Take 2: Jolt Session Pooling Continued….

In our last blog, we talked about Jolt Session Pooling. That was kind of an introduction to the concept. I am now going much deeper into this parameter to understand how it works in the PeopleSoft context (and I am interested in knowing more about it anyway!)…
A quick look at the web.xml file in the PORTAL/WEB-INF folder gives you a list of the servlets that the PeopleSoft application uses. Some of them are:
  • psc
  • psp
  • cs
  • xmllink
  • PSAttachServlet
  • psreports
  • SchedulerTransfer
  • SyncServer
  • monitor
  • ppmi etc…
I am not going into each and every servlet that PeopleSoft uses and their details; that is out of scope for my explanation here, and primarily because I do not know all of them either… However, an overall understanding will definitely help us understand the architecture underlying the PeopleSoft Internet Architecture.
For each and every servlet that PeopleSoft uses, there is a definition in the web.xml file. For simplicity's sake, this file can be treated as a servlet configuration file, and it is an XML file by nature. If you work with WebLogic and Java J2EE, this file is described in terms of deployment descriptor elements. I find it hard to remember that way, so just to keep things easy, consider this web.xml file a servlet configuration file created as part of application deployment…
We are talking here about PeopleSoft technology, so we don't need all the gory details of WebLogic, just the basics. A WebLogic server has many servlets that are used for processing requests. A servlet connects to Tuxedo using a session pool manager, which assigns a session based on availability. This session then connects to the Tuxedo system (application server) via the Jolt server.
So, back to our main point: why does JoltPooling need to be disabled for all the servlet entries in the web.xml file for the "download to Excel" button to work? You have probably guessed it already.

Wednesday, January 16, 2008

Jolt Session Pooling on the Web Server Configuration

Okay, long time no see! I am back. First of all, let me wish you all a happy and prosperous new year. Let's hope for the best.
I had some project as well as personal commitments that I needed to complete last year. One of my project commitments was, obviously, due to moving to a new project. Coincidentally, I also moved to a new location in another country, which added some more complications to my new project assignment…
I had a goal last year (2007) to become a certified PMP. PMP is an acronym for Project Management Professional, offered by PMI (the Project Management Institute). I took the training from Hexavarsity at the start of the year, and eventually became a PMP after spending quite a considerable amount of time on it. That was the personal commitment that delayed my writing on this blog. So, long story short, I am going to write in this blog again, about all that I know about PeopleSoft and Unix (don't laugh, I will try!).
Let's start! We had a situation recently in our UAT environment: the "Download to Excel" button was not working. You should be aware of which "Download to Excel" button I am talking about. For those who are unable to guess, here it is: it is a small image, looking just like an Excel sheet, shown at the right corner of any tabular output from the PeopleSoft application (for example, Process Monitor, reports, etc.). If you click on this image, you receive an Excel sheet of whatever data you were looking at…
We started doing some research on this issue (still doing it!). You know where to start first. You guessed right: PeopleSoft Customer Connection. They recommended trying to disable JoltPooling on the web server. This directive is available in the web.xml file for the WebLogic web server and is called Jolt Session Pooling.
Starting with Tools 8.48, Jolt Session Pooling is enabled by default. What does this mean? Let me explain from scratch. As you are aware, the web server makes a connection to the app server using Jolt. If you enable Jolt Session Pooling, the user connections between the web server and the app server are simply shared. This setting is expected to minimise system resource usage by sharing user connections across pooled sessions…
web.xml is an XML file that has directives for all the PeopleSoft servlets that the web server uses (for example, psp, psc, etc.). Every servlet has JoltPooling set to true by default starting with PeopleTools version 8.48. All we did was disable JoltPooling on all of these servlets, and the issue was resolved.
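As a sketch of what the change looks like, a servlet entry in web.xml carries a pooling flag that gets flipped to false. The element layout, servlet class and exact parameter name here are illustrative assumptions, not copied from a real 8.48 file, so check your own Tools release before editing.

```xml
<!-- Hypothetical excerpt from PORTAL/WEB-INF/web.xml; names are illustrative. -->
<servlet>
  <servlet-name>psc</servlet-name>
  <servlet-class>psft.pt8.psc</servlet-class>
  <init-param>
    <param-name>joltPooling</param-name>
    <param-value>false</param-value> <!-- enabled (true) by default in Tools 8.48+ -->
  </init-param>
</servlet>
```

The same flag would need to be set on every servlet entry (psp, cs, PSAttachServlet, and so on) for the change to take effect across the application.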
Is the issue really resolved? Not exactly, I would say. We still need to find out why Jolt Session Pooling has to be disabled to make the "Download to Excel" button work. The mystery continues… until next time.

Future of PSS Solutions

Airlines were among the earliest adopters of technology in the fields of reservations, check-in and departure. These systems are mainly mainframe based, running on TPF (IBM) or USAS (Unisys). The airlines' dilemma about moving to new technology probably stems from doubts about whether new technology can deliver split-second response times and handle large volumes of transactions in real time; that is one of the main reasons the majority of airlines still use systems whose technology is 30+ years old.
With the advent of new technology enabling direct contact with the customer, the justification for CRSs shifted from operational efficiency to marketing strategies to the competitive advantage of essential business tools. This progress contributed to fundamental changes in the structure of the industry, driven in large part by the network externalities these systems created and by the possibility of bypassing the intermediate GDS layer.
As we all know, air travel is not an isolated activity, and customers are looking for a comprehensive package of which air travel is only one component. Sensing this, and also under the threat of airlines marketing directly to customers, GDSs have actively started providing connectivity to all the players, thus helping agents provide customer service through a single window. Further, many have started forward integration, reaching out to customers with their own websites.
The new generation of CRS/GDS systems, called GNEs (GDS new entrants), pronounced "genie", are internet-based access and distribution systems that do not require data to be stored in the system, unlike the traditional antiquated CRS/GDS mainframes, which required advance storage in order to book (and where they made most of their revenue in the past). A GNE can search multiple individual travel sites (airlines, car rental agencies, hotels, tour operators, cruise lines, etc.), consolidated travel sites (Expedia, Travelocity, Orbitz, etc.) and other airline/car/hotel/tour/cruise consolidators, wholesalers and discounters, all at the same time; create a virtual data set for use in the GNE; and then present the data under many different parameters and filters for the purpose of creating a travel arrangement, i.e. a Passenger Name Record (PNR). Unlike in traditional CRS/GDS systems, the elements of the PNR do not have to be booked through the same supplier of data; each element can be booked directly with its individual supplier, thus lowering data storage costs. The GNE then creates a master record of the arrangements booked, i.e. a Super PNR (SPNR), and provides a summary of the arrangement for the traveller.
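The Super PNR idea above can be sketched as a tiny data structure: each element of the trip keeps its own supplier and the supplier's booking reference, and the SPNR is just the aggregating record. Everything here (class names, fields, sample suppliers and locators) is a hypothetical illustration, not any real GNE's data model.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    supplier: str   # the individual site the element was booked with directly
    kind: str       # "flight", "hotel", "car", ...
    locator: str    # the supplier's own booking reference

@dataclass
class SuperPNR:
    traveller: str
    elements: list = field(default_factory=list)

    def add(self, element: Element) -> None:
        # Each element is booked with its own supplier; the SPNR only aggregates.
        self.elements.append(element)

    def summary(self) -> str:
        # One consolidated view of the whole arrangement for the traveller.
        parts = [f"{e.kind} via {e.supplier} ({e.locator})" for e in self.elements]
        return f"{self.traveller}: " + "; ".join(parts)

spnr = SuperPNR("J. Doe")
spnr.add(Element("AirlineSite", "flight", "ABC123"))
spnr.add(Element("HotelSite", "hotel", "H-77"))
```

The point of the sketch is that no central store holds the bookings: the SPNR carries only references back to each supplier, which is what lowers data storage costs.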
It is likely that the global distribution systems' new avatar will lead to the fragmentation of airline inventories across different distribution channels. Airlines will seek to increase the proportion of sales they make directly on their own websites, thereby reducing their costs. Providers of reservation channels (GDSs, airlines, portals, etc.) need to go the extra mile and demonstrate their value to corporate clients clearly. Corporate clients are most concerned about ensuring access to the widest possible range of airline products and tariffs while distribution costs are removed from the value chain. It is going to be a challenge for each service provider to stay afloat unless they invest in technology and come out with a 'single point' solution.

Linking BI Technology, Process and People – A Theory of Constraints (TOC) View


With the advent of a new year, let me do a recap of what I have discussed through my 15 odd posts in 2007 and also set the direction for my thoughts in 2008.
I started with the concept (http://blogs.hexaware.com/business_intelligence/2007/06/business-intell.html) of BI Utopia, in which information is available to all stakeholders at the right time and in the right format. The bottom line is to help organizations compete on analytics in the marketplace. With that concept as the starting point, I explored some technology enablers like Real Time Data Integration, Data Modeling and Service Oriented Architecture, and also some implementation enablers like the Agile Framework, calibration of DW systems and Function-Points-based estimation for DW. In my post on Data Governance, I introduced the position of a CDO (Chief Data Officer) to drive home the point that nothing is possible (at least in BI) without people!
To me, BI is about three things – Technology, Process and People. I consider these three the holy triumvirate for successful implementation of Business Intelligence in any organization. Not only is each individual area important by itself, but the most important thing is the link between the three. Organizations that are serious about 'Analytics' should continuously elevate their technology, process and people capability and, more importantly, strengthen the link between them – after all, any business endeavor is only as good as its weakest link.
Theory of Constraints (http://en.wikipedia.org/wiki/Theory_of_Constraints) does offer a perspective, which I feel is really useful for BI practitioners. I will explore more of this in my subsequent posts.
My direction for posts on this blog in 2008 is:
  1. Continue with my thoughts on Business Intelligence along Technology, Process and People dimensions

  2. Provide a “Theory of Constraints” based view of BI with focus on strengthening the link between the 3 dimensions mentioned above.

Almost every interesting business area – Six Sigma, Balanced Scorecard, System Dynamics, Business Modeling, Enterprise Risk, Competitive Intelligence, etc. has its relationship with BI and we will see more of this in 2008.
Please do keep reading and share your thoughts as well.

Friday, January 11, 2008

Determine Portal Navigation for all custom processes

The SQL below lists the complete portal navigation along with the process name. I built it when we had to provide this list to our client to review processes that could be retired. The output is useful for generating an inventory or assisting in KEEP/DROP decisions during an upgrade.
Here it is:
SELECT NAVIGATION, A1.PRCSNAME
FROM
(SELECT LPAD('-',2*(LEVEL-1)) || PORTAL_LABEL "NAVIGATION", PORTAL_URI_SEG2 FROM (SELECT PORTAL_LABEL, PORTAL_PRNTOBJNAME, PORTAL_OBJNAME, PORTAL_URI_SEG2 FROM PSPRSMDEFN A
WHERE PORTAL_NAME = 'EMPLOYEE' ) B
WHERE B.PORTAL_PRNTOBJNAME != ' '
START WITH (B.PORTAL_URI_SEG2 IN
(SELECT D.PNLGRPNAME
FROM PSMENUITEM A, PSMENUDEFN B, PS_PRCSDEFNPNL C, PSPNLGROUP D
WHERE A.MENUNAME=B.MENUNAME
AND A.PNLGRPNAME = C.PNLGRPNAME
AND A.PNLGRPNAME = D.PNLGRPNAME
AND C.PRCSNAME IN
(SELECT PRCSNAME FROM PS_PRCSDEFN
WHERE LASTUPDOPRID !=


Thursday, January 3, 2008

BI Appliances


What is a BI Appliance?
If a data-warehouse-class database product, a reporting product, a data integration product or an all-in-one software package is pre-installed and available in a preconfigured hardware box, then such a "hardware + software" box is called a 'Business Intelligence Appliance'. The very purpose of the appliance model is to hide the complexity and intricacies of the underlying software components and make it as simple as operating a TV.
How did the Appliance Model evolve?
As businesses gathered huge volumes of data, the demand for faster and better ways of analyzing that data increased and the data warehouse evolved as a software technology; there have been continuous efforts to build software systems that are cognizant of data warehouse environments.


  • We have seen IBM and Oracle release their data-warehouse-specific database editions
  • We have seen the growth of data-warehouse-specific databases like RedBrick (now part of IBM), Teradata, Greenplum…
  • We have seen simple list-reporting tools move into proprietary cube data structures, and the emergence of the acronyms MOLAP, HOLAP, ROLAP and DOLAP
  • We have seen a brand-new software market created for ETL and EII products
  • We have seen more new BI-related software applications being created – BAM, CPM, Metadata Management, Data Discovery and lots more getting defined every day in the market…

As many organizations set up their BI infrastructure or enhanced their existing BI environments with the different BI software packages they needed, they also took on different platforms and hardware, and maintaining all of these became frightening. Getting started with a BI project became a bigger project in itself; we needed to spend considerable time not just choosing the right set of BI products but also the supported hardware, dependent software packages and platform. No BI vendor currently addresses the complete BI stack, and this has been a driving factor for more acquisitions.
Products like Netezza (database appliance) and CastIron (ETL appliance) came up with the 'software in a box' concept, where we can buy or rent preconfigured 'hardware + software' boxes, which in a way addresses the 'ready to use' BI market. Many of these boxes contain Linux, open source databases, a web server, message queues and proprietary software.
The appliance-based model is not new; IBM has been renting 'mainframe + software' for decades. IBM addresses the BI market with its 'Balanced Warehouse', a preconfigured 'hardware + software' offering whose OS can vary across Windows, AIX and Linux, with DB2 as the database and data reporting varying across DB2 Cubes, Crystal and Business Objects. HP has similarly come out with its Neoview platform, a revitalized version of the NonStop SQL database and NonStop OS.
A CIO's perennial needs are to shorten the application deployment cycle and reduce server maintenance; appliance-based products meet these KRAs and are being widely accepted.
The Future
More Appliances, Focus on Performance:
We will see more BI appliances coming to market; as the appliance model hides what is underneath, and in many cases the details are not available, the buying focus will be more on what the products deliver than on what is inside them.
Common Appliance Standards:
Getting the best of breed in both software and hardware from a single vendor is not going to happen. We might instead see software and hardware vendors defining a set of basic standards among themselves for the appliance model. New organizations may also evolve, similar to "tpc.org", to define performance standards for appliances. And we might see companies similar to Dell coming up that assemble best-of-breed components and deliver a packaged BI appliance.
More Acquisitions: The current Business Intelligence market landscape can also be interpreted as
  1. Hardware + Software or Appliance based vendors – HP, IBM
  2. Pure software or Non-Appliance based vendors – Oracle, Microsoft, SAP

Once the current BI software consolidation gets established, the next wave of consolidation will be towards companies like Oracle looking to add hardware companies to their portfolios.
Technology                                     Appliance Products
Database                                       Netezza, Teradata, DATAllegro, Dataupia
Data Integration                               CASTIron
Reporting-Dashboard                            Cognos NOW (Celequest LAVA)
Configurable Stack (with third-party support)  IBM Balanced Warehouse, HP Neoview