Tuesday, October 16, 2007

UPGCOUNT – 2

In some cases the two preferred methods for executing UPGCOUNT may not work for you because:
  1. You do not have access to the PSAE executable
  2. You have very limited access to the environment and cannot successfully run the AE
Here is a PL/SQL block that you can use to produce a similar result.
set serveroutput on size 1000000
DECLARE
   -- All application tables (RECTYPE = 0) defined in the PeopleTools metadata
   CURSOR C1 IS
      SELECT RECNAME, SQLTABLENAME
        FROM PSRECDEFN
       WHERE RECTYPE = 0
       ORDER BY RECNAME;
   VAR_COUNT     NUMBER;
   VAR_REC_COUNT NUMBER := 0;
   VAR_SQLTEXT   VARCHAR2(32000);
   VAR_RECNAME   VARCHAR2(64);
   VAR_PAD       VARCHAR2(100);
   VAR_DATE      DATE;
BEGIN
   VAR_DATE := SYSDATE;
   DBMS_OUTPUT.PUT_LINE('Report for Record Rowcount');
   DBMS_OUTPUT.PUT_LINE('START DATE AND TIME: '||TO_CHAR(VAR_DATE, 'MM/DD/YYYY HH24:MI:SS'));
   DBMS_OUTPUT.PUT_LINE(RPAD('RECNAME', 31)||'Row Count');
   DBMS_OUTPUT.PUT_LINE('============================== =========');
   FOR CUR_REC IN C1
   LOOP
      VAR_PAD := RPAD(' ', 31 - LENGTH(CUR_REC.RECNAME));
      -- Records without an SQL table name override use the default PS_<RECNAME> name
      IF NVL(LENGTH(RTRIM(CUR_REC.SQLTABLENAME)), 0) = 0 THEN
         VAR_RECNAME := 'PS_'||CUR_REC.RECNAME;
      ELSE
         VAR_RECNAME := CUR_REC.SQLTABLENAME;
      END IF;
      VAR_REC_COUNT := VAR_REC_COUNT + 1;
      BEGIN
         VAR_SQLTEXT := 'SELECT COUNT(*) FROM '||VAR_RECNAME;
         EXECUTE IMMEDIATE VAR_SQLTEXT INTO VAR_COUNT;
         DBMS_OUTPUT.PUT_LINE(CUR_REC.RECNAME||VAR_PAD||VAR_COUNT);
      EXCEPTION
         WHEN OTHERS THEN
            -- Report tables that could not be counted (e.g. missing or no privileges)
            DBMS_OUTPUT.PUT_LINE(CUR_REC.RECNAME||VAR_PAD||'ERROR - '||SQLERRM);
      END;
   END LOOP;
   DBMS_OUTPUT.PUT_LINE('============================== =========');
   DBMS_OUTPUT.PUT_LINE(' ');
   DBMS_OUTPUT.PUT_LINE('Total Number of Tables = '||VAR_REC_COUNT);
   VAR_DATE := SYSDATE;
   DBMS_OUTPUT.PUT_LINE('END DATE AND TIME: '||TO_CHAR(VAR_DATE, 'MM/DD/YYYY HH24:MI:SS'));
END;
/
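To capture the report to a file, you can run the block from SQL*Plus with spooling turned on. A minimal sketch, assuming the block above has been saved as upgcount.sql (both file names here are just examples):
-- Connect as the PeopleSoft access ID (e.g. SYSADM) so that all
-- application tables are visible, then spool the report to a file.
spool upgcount_before.lst
@upgcount.sql
spool off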

Monday, October 15, 2007

UPGCOUNT – 1

UPGCOUNT is an Application Engine (AE) program in current PeopleTools releases, and there is no delivered page to kick off this process. The output from this report is critical in determining the success of a major migration (Maintenance Pack, multiple bundles, or an upgrade). It helps validate the process and ensure that no data was lost as a result of the migration (running Data Mover scripts, data conversion, etc.).
Here are some methods that you can use to run this AE program.
1. Through App Designer
[Screenshot: Upgcount_1]
  • Open the UPGCOUNT AE program in App Designer
  • Click the Run Program icon
  • Choose the appropriate options in the "Run Request" dialog box
[Screenshot: Upgcount_2]
The output file will be located in your temp folder.
[Screenshot: Upgcount_3]
2. Through the command line
  • Execute the commands below to run UPGCOUNT on UNIX (FDMO is the example database name and process scheduler domain, and VP1/VP1 is the example operator ID and password; substitute your own values):
export PS_SERVDIR=$PS_HOME/appserv/prcs/FDMO
export PS_SERVER_CFG=$PS_HOME/appserv/prcs/FDMO/psprcs.cfg
psae -CT Oracle -CD FDMO -CO VP1 -CP VP1 -R UPGCOUNT1 -AI UPGCOUNT -I 11 -TRACE 135

Business Intelligence Utopia – Enabler 5: Extensible Data Models


Enabler 5 in my list for Business Intelligence Utopia is the ubiquitous, hard-working "Data Model". The data model is the heart of any software system and, at a fundamental level, provides the placeholders in which data elements reside.
Business Intelligence systems, with all their paraphernalia – Data Warehouses, Marts, Analytical & Mining systems, etc. – typically deal with the largest volumes of data in any enterprise, and hence data models are highly venerated in the Data Warehousing world.
At a high level, a good Data Warehouse data model has the following goals (corollary: if you are looking for a data modeler, look for the following traits):
1) Understands the business domain of the organization
2) Understands, at a granular level, the data generated by the business processes
3) Recognizes that business data is an ever-changing commodity, so the placeholders provided by the data model should be relevant not only for the present but also for the future
4) Can be described at a conceptual and logical level to all relevant stakeholders
5) Allows for uncomplicated conversion to the physical world of databases or data repositories that are manipulated by software systems
Extensible data models address all five points mentioned above and, more specifically, have future-proofing as one of their stated goals. Such extensible models are also "consumption agnostic", i.e. they provide comparable levels of performance irrespective of the way the data is consumed.
It is important for Business Intelligence practitioners to understand the goals of their data models before embarking on specific implementation techniques. Entity-Relationship and Dimensional modeling (http://www.rkimball.com) have been the lingua franca of BI data modelers operating at the conceptual and logical levels. Newer techniques like Data Vault (http://www.danlinstedt.com/) also provide some interesting ideas for building better logical models for Data Warehouses.
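As a concrete illustration of extensibility, consider a hypothetical customer dimension built as a Kimball-style Type 2 slowly changing dimension. The table and column names below are illustrative only, not from any delivered model; the point is that surrogate keys and effective dating let the model absorb change without reworking existing facts:
-- Hypothetical Type 2 customer dimension: a changed attribute value becomes
-- a new row, so history is preserved and existing fact rows keep pointing
-- at the surrogate key that was current when they were loaded.
CREATE TABLE CUSTOMER_DIM (
    CUSTOMER_SK      NUMBER        NOT NULL,   -- surrogate key (stable join target)
    CUSTOMER_ID      VARCHAR2(20)  NOT NULL,   -- natural/business key
    CUSTOMER_NAME    VARCHAR2(100),
    CUSTOMER_SEGMENT VARCHAR2(30),             -- attribute tracked for change
    EFF_START_DT     DATE          NOT NULL,   -- row valid from
    EFF_END_DT       DATE,                     -- NULL = current row
    CURRENT_FLAG     CHAR(1)       DEFAULT 'Y',
    CONSTRAINT PK_CUSTOMER_DIM PRIMARY KEY (CUSTOMER_SK)
);
Because facts join on CUSTOMER_SK rather than CUSTOMER_ID, a change in a customer's segment simply closes the current row and inserts a new one; the model extends over time without breaking existing consumers.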
At the physical implementation level, relational databases still form the backbone of the BI infrastructure, supplemented by multi-dimensional data stores. Even in the relational world, traditionally dominated by row-major vendors like Oracle and SQL Server, there are column-major relational databases, such as Sybase IQ, with claims of being built from the ground up for data warehousing.
This article on column-major databases – http://www.databasecolumn.com/2007/09/one-size-fits-all.html – refers to a new DW-specific database architecture called Vertica, which makes for a fascinating read – http://www.vertica.com/datawarehousing. The physical layer is also seeing a lot of action with the entry of data warehousing appliance vendors like Netezza, DATAllegro, etc. (http://www.dmreview.com/article_sub.cfm?articleId=1009168).
The intent of this post can be summed up as:
a) Understand the goals of building data models for your enterprise – Make it extensible and future proof
b) Know the current techniques that help envisage and build data models
c) Be on the look-out for new developments in the data modeling and database world – there is a lot of interesting action happening in this area right now.
Extensible data models, combined with the right technique for implementing them, stand as Enabler 5 in the "Power of Ten" for implementing Business Intelligence Utopia.





Monday, October 8, 2007

PeopleSoft Audit Reports and Migration

I have always recommended the following reports to determine a clean migration, especially when there are a ton of objects to be migrated (PS upgrade, application of Maintenance Packs or bundles, etc.).
Prior to application of the bundles, execute the following:
1. SYSAUDIT
2. DDDAUDIT
3. ALTER AUDIT
4. UPGCOUNT (just prior to the actual migration)
The first three can be executed a few hours or days before the actual migration, as long as you ensure that no changes are migrated to, or performed in, the target environment that might impact these reports. However, the UPGCOUNT process has to run just prior to the actual migration because it provides a row count of every record in the database.
You need to re-run the above reports after completion of the migration. The objectives are as noted below.
1. Ensure that the reports match
2. For any mismatch there has to be an explanation
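One way to make the comparison mechanical is to load the before and after row counts into scratch tables and let SQL report the mismatches. A minimal sketch, assuming you have parsed the two UPGCOUNT outputs into the hypothetical tables UPGCOUNT_BEFORE and UPGCOUNT_AFTER (both with RECNAME and ROW_COUNT columns):
-- Hypothetical scratch tables holding the parsed UPGCOUNT output:
-- CREATE TABLE UPGCOUNT_BEFORE (RECNAME VARCHAR2(30), ROW_COUNT NUMBER);
-- CREATE TABLE UPGCOUNT_AFTER  (RECNAME VARCHAR2(30), ROW_COUNT NUMBER);

-- Records whose row counts changed during the migration;
-- every row returned here needs an explanation in your documentation.
SELECT B.RECNAME,
       B.ROW_COUNT AS BEFORE_COUNT,
       A.ROW_COUNT AS AFTER_COUNT
  FROM UPGCOUNT_BEFORE B
  JOIN UPGCOUNT_AFTER A ON A.RECNAME = B.RECNAME
 WHERE B.ROW_COUNT <> A.ROW_COUNT
 ORDER BY B.RECNAME;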
Note 1:
UPGCOUNT might not match for all records after application of a Maintenance Pack/bundle/upgrade. You need to identify what caused each change and record it, e.g. "Bundle 12 – Data Mover script upgnnnn.dms", in your documentation. This exercise will ensure that you have a clean migration.
Record            Before    After     Comments
PSPRCSRQST        10002     10008     Bundle #11 – execution of SQR program in steps 21, 22, 29
BENEFIT_PARTIC    103110    103212    Bundle #12 – execution of Data Mover script upgnnnn.dms
Note 2:
It is a good practice to ensure that you have a clean SYSAUDIT, DDDAUDIT and ALTER AUDIT report.

Thursday, October 4, 2007

Rise of Data Transformation Services – DTS Packages

Interoperability and communication between applications play a vital role in meeting today's emerging business needs. One component that helps businesses transfer data between different systems is Data Transformation Services packages, famously known as DTS packages. This post covers the differences between SQL Server 2000 and SQL Server 2005 with respect to DTS packages. Enterprise Manager in Microsoft SQL Server 2000 provides an inbuilt menu within its IDE, and DTS packages are accessed using the SQL Server 2000 DTS Designer IDE.
The DTS Designer 2000 allows us to perform the following tasks on a package: 1. designing, 2. executing, and 3. scheduling.
In SQL Server 2005, DTS packages cannot be accessed the same way as in SQL Server 2000. The reason is the advent of Microsoft SQL Server Integration Services (SSIS) in the 2005 version. The two main IDEs of SQL Server 2005 are SQL Server Management Studio and the SQL Server Business Intelligence Development Studio.
Unlike the DTS Designer 2000 that was used to access DTS packages, the SQL Server Business Intelligence Development Studio supports packages through SSIS. Microsoft also provides backward compatibility for accessing DTS packages that were created in SQL Server 2000. Even after a database is migrated from SQL Server 2000 to SQL Server 2005, DTS packages originally created in the SQL Server 2000 environment can be migrated to the 2005 server as well. Since DTS packages are server specific, they can still be opened with the DTS Designer 2000 from the SQL Server 2005 environment, either by 1. opening the SQL Server 2000 Enterprise Manager or by 2. accessing the SQL Server Management Studio.
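Since DTS packages saved to the server are stored in the msdb database, a quick way to inventory them before a migration is a query along these lines (a sketch against the standard sysdtspackages system table; verify the results against your own instance):
-- List the latest saved version of each DTS package stored in msdb
-- (sysdtspackages keeps one row per saved version of a package).
SELECT P.name,
       P.id,
       MAX(P.createdate) AS latest_version_date
  FROM msdb.dbo.sysdtspackages AS P
 GROUP BY P.name, P.id
 ORDER BY P.name;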