Monday, March 28, 2011

Automation Feasibility Checklist


Abstract:
Software testing is an activity aimed at evaluating an attribute or capability of a program or system and determining whether it meets its required results. The difficulty in software testing stems from the complexity of software. Testing is more than just debugging. The purpose of testing can be quality assurance, verification and validation, or reliability estimation. Testing can be used as a generic metric as well.
In this article, we present an Automation Feasibility Checklist (AFC) that helps make automation testing effective and efficient.
Introduction:
Manual test cases are the entry point for any automation, but several issues can hinder the automation process, including:
  • Lack of clarity in test steps
  • Lack of test data
  • Traceability coverage
  • Identification of Dependencies/Pre-requisites of test cases in a business scenario
  • Presence of logical closure
Identifying these problems early reduces the effort spent in manual execution and paves the way for effective automation.
Automation Feasibility Checklist (AFC):
Automation Feasibility Checklist is used to identify whether the manual test case is feasible for automation or not. The following are the criteria to determine the automation feasibility of the test cases:
Essential Criteria
  • Dependencies/Pre-requisites
  • Detailed Test steps
  • Test Data availability
  • Expected results
  • Traceability
Optional Criteria
  • Meagre Application Knowledge
  • Subject Matter Expert’s (SME) support
  • Duplication of test steps
  • Availability of multiple sets of data
Snapshot of AFC
Benefits of Automation Feasibility Checklist:
  • Reduces the Manual execution effort.
  • Improves Automation efficiency.
  • Helps to derive effective Manual test cases.
  • Controls and avoids risks in automation.
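The checklist above can be applied mechanically: a test case qualifies for automation only if every essential criterion is met, while the optional criteria simply raise its priority. The sketch below illustrates this in Python; the criterion keys and the test-case dictionary are our own illustrative names, not part of any tool.

```python
# Illustrative AFC evaluation: essential criteria gate feasibility,
# optional criteria are reported but do not block automation.
ESSENTIAL = ["dependencies", "detailed_steps", "test_data",
             "expected_results", "traceability"]
OPTIONAL = ["application_knowledge", "sme_support",
            "step_duplication", "multiple_data_sets"]

def automation_feasible(test_case):
    """A test case qualifies only if every essential criterion is met."""
    return all(test_case.get(c, False) for c in ESSENTIAL)

def feasibility_report(test_case):
    """Summarize which essential criteria are missing and which optional ones are met."""
    missing = [c for c in ESSENTIAL if not test_case.get(c, False)]
    optional_met = [c for c in OPTIONAL if test_case.get(c, False)]
    return {"feasible": not missing,
            "missing_essential": missing,
            "optional_met": optional_met}
```

A report for a test case lacking test data would flag `test_data` under `missing_essential`, telling the team exactly what to fix before attempting automation.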

Thursday, March 17, 2011

Welcome back … Good Times!


As the winter recedes and spring is ready to bloom in the northern hemisphere, the fears of doom and gloom recede too. If one looks at the IT industry worldwide through the lens of India-based software service providers, the sense of déjà vu is back. Fond memories of the good times come rushing back.

Now you may ask: what makes me say this? After a dismal period ranging from late 2008 to early 2010, the tide has turned. After a few years, this could well be the first year where Fortune 500 companies may post a year-on-year increase in their planned IT spend. Further, the stress that had engulfed everyone from Wall Street to Main Street, and everyone in the middle, has dwindled, and the early signs of a return to growth have clearly emerged. The visible trends and sound bites cut across most industry verticals, which further reinforces the confidence that the IT industry could register healthy growth in 2011.

You could also ask me: have we gone back to the good old days? I would say: well, that is where we want to go, but we are not there yet. While the mood is buoyant, all the pieces of the jigsaw puzzle have not fallen into place yet. There are pockets of concern: continental Europe is still battling sovereign debt, there are geo-political uncertainties in the Middle East and North Africa, and natural calamities in Japan. Hence, it may take a little while longer for all the regions to buzz in unison. Having sounded the cautionary note, the mood has certainly swung in favor of committing investments, driving growth and leveraging the winds of change.

If you agree with this, you may ask: what does the return to growth mean to me? It certainly signals good times! It means activity levels will remain high. New clients will be added. New deals will be signed. Clients will reap the benefits of IT outsourcing and service providers will deliver value. The economics will work in favor of both economies, and employees on both sides will be happy to see more greenbacks (or whatever the color of your currency).
So, join me in ushering in the Good Times again!
===============================================================
The author of the piece is: Sreenivas V. He is the Chief Strategy Officer at Hexaware Technologies Limited. The contents on the blog above are his personal views.

Tuesday, March 15, 2011

Configuring Informatica File Transfer Protocol


Informatica File Transfer Protocol can be used to transfer or move files from different environments into a pre-defined Landing Zone. It can also be used to transfer files to destination folders or directories. The Integration Service can use FTP to access any machine it can connect to, including mainframes.

Configuring FTP in Informatica Workflow
To use FTP file sources and targets in a session:
  • Create an FTP connection object in the Workflow Manager and configure the connection attributes
  • Configure the session to use the FTP connection object in the session properties.
  • Specify the Remote filename in the connection value of the Session properties.
Guidelines
  • Specify the source or target output directory in the session properties. If not specified, the Integration Service stages the file in the directory where the Integration Service runs on UNIX, or in the Windows system directory.
  • Sessions cannot run concurrently if they use the same FTP source or target file located on a mainframe.
  • If a workflow containing a session that stages an FTP source or target from a mainframe is aborted, the same workflow cannot be run again until the connection times out.
  • Configure an FTP connection to use SSH File Transfer Protocol (SFTP) while connecting to an SFTP server. SFTP enables file transfer over a secure data stream. The Integration Service creates an SSH2 transport layer that enables a secure connection and access to the files on an SFTP server.
  • To run a session using an FTP connection for an SFTP server that requires public key authentication, the public key and private key files must be accessible on nodes where the session will run.
Configuring Remote Filename
  • Remote Filename: The remote file name for the source or target. Enter the indirect source file name if an indirect source file is used. Use 7-bit ASCII characters for the file name; the session fails if it encounters a remote file name with Unicode characters. If a path is provided along with the source file name, the Integration Service ignores the path entered in the Default Remote Directory field. The session fails if the file name and path are enclosed in single or double quotation marks.
  • Is Staged: Stages the source or target file on the Integration Service. Default is "Not staged".
  • Is Transfer Mode ASCII: Changes the transfer mode. When enabled, the Integration Service uses ASCII transfer mode; use ASCII mode when transferring files on Windows machines to ensure that the end-of-line character is translated properly in text files. When disabled, the Integration Service uses binary transfer mode; use binary mode when transferring files on UNIX machines. Default is disabled.
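The Remote Filename rules above (7-bit ASCII only, no surrounding quotation marks) can be checked up front rather than discovered as a session failure. Here is a small sketch of such a validator; the function name is ours, not part of Informatica.

```python
# Hypothetical pre-flight check for the Remote Filename rules described above.
def valid_remote_filename(name):
    """Return True if the name satisfies the documented Remote Filename rules."""
    if name.startswith(("'", '"')) or name.endswith(("'", '"')):
        return False  # quoted file name with path makes the session fail
    try:
        name.encode("ascii")  # Unicode characters make the session fail
    except UnicodeEncodeError:
        return False
    return True
```

Running this over a list of remote file names before scheduling a workflow catches the two documented failure modes early.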

Tuesday, March 1, 2011

Informatica – User Defined Functions


Informatica User Defined Functions are similar to built-in functions: they are created once and executed many times. Transformation logic that is common across ports is an ideal candidate for a User Defined Function.
Transformation Logic implemented without User Defined Functions

The validation IIF(ISNULL(LTRIM(RTRIM(INPUT))),'TRUE','FALSE') is performed in multiple ports.
The disadvantage of this approach is that any change to the validation has to be made in every port.
This can be addressed by creating a User Defined Function and incorporating the logic there.
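The "define once, call from many ports" idea is the same as a reusable function in any language. As an analogy only (not Informatica syntax), the null/blank check above corresponds to:

```python
# Analogy to IIF(ISNULL(LTRIM(RTRIM(INPUT))),'TRUE','FALSE'):
# define the check once and call it wherever needed.
def is_null_or_blank(value):
    """Return 'TRUE' if the input is NULL (None) or contains only spaces."""
    if value is None:
        return "TRUE"
    return "TRUE" if value.strip() == "" else "FALSE"
```

If the business later decides that, say, tabs also count as blank, the change happens in one place instead of in every port, which is exactly the benefit the UDF provides.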
Steps to Create User Defined Functions
Step 1 : Right-click on the User-Defined Functions folder in a repository folder in the Designer.
Click on “New”
Step 2: In Editor add the transformation logic / validation that needs to be performed.
Click ok and validate the UDF.
User Defined Function – Type:
Public if the function is callable from any expression. Private if the function is only callable from another user-defined function.
To call a User Defined Function from a port, prefix the function name with :UDF, for example :UDF.NULLCHECK(INPUT) (where NULLCHECK is whatever name you gave the function).

Monday, February 28, 2011

Xcelsius Dashboards – using QAAWS and Live Office

Hi All,
Hope you did not encounter any issues in integrating Xcelsius with SQL Server Reporting Services and SharePoint data. In this blog, I would like to explain how to integrate an Xcelsius dashboard with Query as a Web Service (abbreviated QAAWS) and Live Office.
Introduction
  • QAAWS is a Business Objects web service that gets data from a Universe using queries and exposes a URL that can be used to fetch data at run-time from the database using the universe metadata.
  • Live Office integrates with Microsoft Office, embedding up-to-the-minute corporate data in Microsoft PowerPoint, Excel, and Word documents.
  • The server component of QAAWS is installed automatically when BusinessObjects Enterprise XI Release 2 Service Pack 2 is installed with Web Intelligence.
  • The server component of Live Office is installed automatically with BusinessObjects Premium while keys need to be purchased for Professional edition.
  • Client tool for QAAWS needs to be purchased as a separate license.
  • Client components of both Live Office and QAAWS are available in the collaterals CD in the Add-ons folder.
Xcelsius Connector to be used
  • Using Query as a Web Service client tool installed in the user’s machine, login and connect to the universe from where data needs to be fetched for the dashboard
  • Generate a WSDL (Web Services Description Language) definition, similar to creating a query in a Desktop Intelligence report. Result objects and filter objects are defined in the WSDL.
  • Prompts can be specified in case parameters need to be passed between the queries.
  • In the Xcelsius dashboard, the Query as a Web Service component should be used to fetch data using the URL created through the client tool as described above.
  • Once installed on a machine, the Live Office client appears as an additional toolbar option in the Office tools.
  • Using the menu, data from Web Intelligence reports and Crystal reports can be fetched into Excel that can be a source of information for the dashboard.
  • Similar to QAAWS, the Web Intelligence and Crystal Report queries can have prompts to pass parameters between queries at run-time.
  • In the Xcelsius dashboard, the Live Office component can be used to fetch data from Web Intelligence reports and Crystal reports.
Dashboard Features
  • For a dashboard shown as sample that depicts the Service Requests (SRs) handled for a client, there are three queries to fetch the data.
    • SRs raised, opened and closed
    • High and Critical SRs open more than 3 days
    • High and Critical SRs opened today
  • Live Office queries are used to fetch data for the first two, and QAAWS for the third.
  • Live Office queries are used when data needs to be fetched in the form of cross-tabs which is not possible through QAAWS.
  • Using both QAAWS and Live Office, it is possible to fetch data from more than one data source.
  • On clicking a specific month, details of the SRs would be displayed in a new window.  The details are typically a drill down in business intelligence terminology.
  • Drill downs
    • Drill down can be provided from the main dashboard to another Xcelsius dashboard with more details or a Web intelligence / Crystal report
    • Even PDF documents that have been generated from other tools can be viewed in the Xcelsius dashboards
    • Flash variables can be used to pass data from the main dashboard to the drill down dashboard
    • opendocument URL can be used to call drill down reports and pass parameters from the dashboard to the drill down reports
  • Security
    • Xcelsius dashboards can handle the security similar to web intelligence documents.  The dimension level security specified in the universe, applies to the dashboards too.
    • In the specific sample shown, if the SRs are from different applications, it is possible for managers to view only the applications for which they are responsible.
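An openDocument drill-down URL of the kind mentioned above typically takes the following shape; the server, document name (SR_Details) and prompt name (Month) below are placeholders for illustration, so verify the exact syntax against your own deployment:

```
http://<server>:<port>/OpenDocument/opendoc/openDocument.jsp?sDocName=SR_Details&sType=wid&lsSMonth=January
```

Here sDocName identifies the target document, sType=wid indicates a Web Intelligence document, and lsS<PromptName> passes a single value to the prompt, which is how the clicked month travels from the main dashboard into the drill-down report.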
We have provided ways of integrating Xcelsius dashboards with live data using QAAWS, Live Office, Reporting Services and SharePoint Consumer, Provider and Param components.
Let me know if you would like to know more on other dashboard topics. Get back to me in case of any specific queries. I would be happy to assist you. Happy reading!

Business Objects Query Builder – Part II


SELECT SI_NAME FROM CI_INFOOBJECTS WHERE SI_KIND='Webi' AND SI_NAME LIKE 'Annual%' AND SI_RUNNABLE_OBJECT=1
3. To extract the list of Web Intelligence documents scheduled in a specified period of time
SELECT SI_NAME FROM CI_INFOOBJECTS WHERE SI_KIND='Webi' and SI_RUNNABLE_OBJECT=1 and
SI_NEXTRUNTIME between '2010.07.08.09' and '2010.07.08.11'
4. To return all report folders containing a string
SELECT * FROM CI_INFOOBJECTS WHERE SI_NAME LIKE '%Service%' AND SI_KIND='Folder'
5. To return all Universe folders containing a string
SELECT * FROM CI_APPOBJECTS WHERE SI_NAME LIKE '%Sales%' AND SI_KIND='Folder'
6. To see what type of rights you have for your BO software
SELECT SI_NAME from CI_SYSTEMOBJECTS where SI_NAMEDUSER=0 AND SI_KIND='User'
7. To find all Crystal and Webi reports (not instances)
Select si_id, si_name from ci_infoobjects where (si_kind = 'CrystalReport' or si_kind = 'Webi') and si_instance = 0 and si_children = 0
  • To find all Crystal reports (not instances or shortcuts)
select si_id, SI_NAME, si_owner, SI_PARENT_FOLDER, si_children, SI_PROCESSINFO.SI_FILES, SI_PROCESSINFO.SI_LOGON_INFO, SI_PROCESSINFO.SI_RECORD_FORMULA from CI_INFOOBJECTS where (si_kind = 'CrystalReport') and si_instance = 0 and not si_name like 'Shortcut to%'
  • To find all failed instances
select SI_NAME, SI_OWNER, SI_AUTHOR, SI_STATUSINFO from CI_INFOOBJECTS where SI_SCHEDULEINFO.SI_OUTCOME>=2 and SI_SCHEDULEINFO.SI_STARTTIME>='2011.01.01'
  • To find successful instances
select SI_NAME, SI_OWNER, SI_AUTHOR, SI_STATUSINFO, SI_SCHEDULEINFO from CI_INFOOBJECTS where SI_SCHEDULEINFO.SI_OUTCOME<2 and SI_SCHEDULEINFO.SI_STARTTIME>='2011.01.01'
  • To find successful instances of a particular report after a specific date
select SI_NAME, SI_OWNER, SI_AUTHOR, SI_STATUSINFO, SI_SCHEDULEINFO from CI_INFOOBJECTS where SI_SCHEDULEINFO.SI_OUTCOME<2 and SI_SCHEDULEINFO.SI_STARTTIME>='2011.01.01' and SI_NAME = 'Test.rpt'
  • To find scheduled instances for a specific time range
select SI_NAME, SI_SCHEDULEINFO.SI_SUBMITTER, SI_SCHEDULEINFO.SI_STARTTIME from CI_INFOOBJECTS where SI_SCHEDULEINFO.SI_STARTTIME>='2011.01.01.16.00.00' and SI_SCHEDULEINFO.SI_STARTTIME<'2011.01.02.13.00.00' order by SI_SCHEDULEINFO.SI_STARTTIME
  • To find successfully scheduled reports (not instances) scheduled after a certain date
select si_id, SI_NAME, si_owner, SI_PARENT_FOLDER, si_children, SI_PROCESSINFO.SI_FILES, SI_PROCESSINFO.SI_LOGON_INFO, SI_PROCESSINFO.SI_RECORD_FORMULA, SI_SCHEDULEINFO.SI_STARTTIME, SI_SCHEDULEINFO.SI_SUBMITTER, SI_SCHEDULEINFO.SI_DESTINATION, SI_SCHEDULEINFO.SI_UISTATUS from CI_INFOOBJECTS where SI_SCHEDULEINFO.SI_OUTCOME = 1 and si_instance = 0 and SI_SCHEDULEINFO.SI_STARTTIME>='2008.11.01'
  • To find recurring instances
select si_id, SI_NAME, si_owner, SI_PARENT_FOLDER, si_children, si_recurring, SI_PROCESSINFO.SI_FILES, SI_PROCESSINFO.SI_LOGON_INFO, SI_PROCESSINFO.SI_RECORD_FORMULA, SI_SCHEDULEINFO.SI_STARTTIME, SI_SCHEDULEINFO.SI_SUBMITTER, SI_SCHEDULEINFO.SI_DESTINATION, SI_SCHEDULEINFO.SI_UISTATUS from CI_INFOOBJECTS where not si_name like 'Shortcut to%' and si_recurring=1 and SI_SCHEDULEINFO.SI_STARTTIME>='2008.11.01'
  • To find users who have logged in since a specified date, or whose user ID was created after a specified date but who may not have logged in
select si_name, SI_CREATION_TIME, si_lastlogontime from ci_systemobjects where si_kind = 'user' and (si_lastlogontime > '2008.11.01.04.59.59' or SI_CREATION_TIME > '2009.04.01.04.59.59')
  • To find reports that have not been scheduled
select SI_NAME, SI_OWNER, SI_AUTHOR, SI_SCHEDULEINFO, SI_PARENT_FOLDER from CI_INFOOBJECTS where (si_kind = 'CrystalReport' or si_kind = 'Webi') and si_instance = 0 and si_children = 0 and SI_SCHEDULEINFO.SI_SCHED_NOW = 0
  • To find all users logged in to Business Objects at a given point of time
SELECT TOP 1000 * FROM CI_SystemObjects WHERE si_kind = 'Connection' AND si_parent_folder = 41 AND si_authen_method != 'server-token' ORDER BY si_name
  • To get the list of Crystal report data connections from BO Enterprise
SELECT SI_NAME FROM CI_APPOBJECTS WHERE SI_KIND='MetaData.DataConnection'
  • To find the universe used by a report
SELECT SI_ID, SI_NAME, SI_WEBI, SI_OWNER
FROM CI_INFOOBJECTS, CI_SYSTEMOBJECTS, CI_APPOBJECTS
Where PARENTS("SI_NAME='Webi-Universe'","SI_NAME ='Your Universe Name'")
  • To get all recurring reports from a specific folder
SELECT * FROM CI_INFOOBJECTS WHERE si_parent_folder = '3711' and
SI_recurring = 1
  • To get all recurring reports from a specific folder, not paused
SELECT * FROM CI_INFOOBJECTS WHERE si_parent_folder = '3711' and
SI_recurring = 1 and SI_SCHEDULEINFO.SI_SCHEDULE_FLAGS = '0'
  • To get all recurring reports from a specific folder, all recurring paused
SELECT * FROM CI_INFOOBJECTS WHERE si_parent_folder = '3711' and
SI_recurring = 1 and SI_SCHEDULEINFO.SI_SCHEDULE_FLAGS = '1'
  • To get the list of users who are logged in to Business Objects XI at a given point of time
SELECT TOP 3000 * FROM CI_SystemObjects WHERE si_kind = 'Connection' AND si_parent_folder = 41 AND si_authen_method != 'server-token'
ORDER BY si_name
  • To get all Webi reports from the repository
Select SI_ID, SI_NAME From CI_INFOOBJECTS Where SI_PROGID='CrystalEnterprise.Webi' And SI_INSTANCE=0
  • To get Full Client reports from the repository
SELECT SI_ID, SI_NAME, SI_FILES FROM CI_INFOOBJECTS WHERE SI_KIND in ('Webi', 'FullClient')
  • To get all reports from the repository
Select SI_ID, SI_NAME From CI_INFOOBJECTS Where SI_PROGID='CrystalEnterprise.Report' And SI_INSTANCE=0
  • To get all universes from the repository
Select SI_ID, SI_NAME, SI_WEBI, SI_KIND From CI_APPOBJECTS where SI_KIND ='Universe'
  • To get all users from the repository
SELECT SI_ID, SI_NAME FROM CI_SYSTEMOBJECTS WHERE SI_PROGID='CrystalEnterprise.USER'
  • To get all groups from the repository
Select * from CI_SYSTEMOBJECTS Where SI_KIND='UserGroup'
  • To get all folders from the repository
Select SI_ID, SI_NAME From CI_INFOOBJECTS Where SI_PROGID='CrystalEnterprise.Folder'
  • To get all categories from the repository
Select SI_ID, SI_NAME From CI_INFOOBJECTS Where SI_KIND='Category'
  • To get all personal categories from the repository
Select SI_ID, SI_NAME From CI_INFOOBJECTS Where SI_KIND='PersonalCategory'
Hope all these queries come in handy. In the forthcoming post, I will discuss Business Objects file repository servers in detail.

Monday, February 14, 2011

Provisioning to two Active Directory Domains with Oracle Identity Manager – Connector Cloning – Part I


In many large enterprises, there can be two Active Directory Domains in use (sometimes more than two), for example, one for India users and one for North America users (assuming the company has two major locations). This requires two AD Connector instances to be created in OIM for provisioning and reconciliation purposes. The OIM Connector Guide for Active Directory User Management provides the following description for creating copies of the connector to provision to multiple target systems. However, detailed instructions are not available in the connector guide.

From the Oracle Connector Documentation (Oracle Identity Manager Connector Guide for Microsoft Active Directory User Management – Release 9.1.1 – E11197-11 – Page 186):
Section: 4.15.1
To create a copy of the connector:
  1. Create copies of the IT resource, resource object, process form, provisioning process, scheduled tasks, and lookup definitions that hold attribute mappings.
  2. Create a copy of the Lookup.AD.Configuration lookup definition. In the copy that you create, change the values of the following entries to match the details of the process form copy that you create.
      1. ROUserID
      2. ROUserManager
      3. ROFormName
      4. ROUserGUID
  3. Map the new process tasks to the copy of the Lookup.AD.Configuration lookup definition.
Initially I was not sure how to set up the cloning. I had two Active Directory Domains. When users are created in OIM, access policies will identify to which domain each user has to be provisioned. However, I had to set up two AD Connectors for these two domains.
Based on my investigation, following AD Connector Specific objects are involved:
  1. Copy of the IT Resource
  2. Copy of the RO
  3. Copy of the Process form
  4. Copy of the Provisioning Process
  5. Copy of the Scheduled Tasks
  6. Copy of the Lookup Definitions
  7. Copy of the Reconciliation Rule
First, you need to export the relevant objects as an XML file, rename them by manually editing the XML file, then re-import them. One recommendation is to run your XML file through "xmllint --format" on Linux; that makes it more readable, so it is easier for you to edit (thanks to Oracle Support for providing this xmllint tip).
Here are the steps for cloning a connector – based on my personal experience:
  1. Identify all the connector objects used by the Active Directory Connector (mostly the objects in the table below, but I am still not sure whether I covered all of them; please let me know if I missed any).
  2. Export these Objects using Deployment Manager Export Utility. This will create an XML File during the export.
  3. Once you have the XML file, you need to identify and replace the values for the objects in the XML file. This is the main reason you should be aware of the AD Connector objects.
  4. Then, you can import this manipulated XML file into the OIM System. I faced errors during the import. I will write about those errors in the next post.
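Step 3, rewriting object names inside the Deployment Manager export, can be scripted. The sketch below is a minimal illustration; the old/new name pairs are examples based on the object table in this post, and your connector's objects will differ. Note that a blind string replacement can over-match, so review the result (for example with "xmllint --format" as recommended above) before importing.

```python
# Hypothetical sketch of renaming AD Connector objects inside the
# Deployment Manager export XML before re-importing it as a clone.
RENAMES = {
    "AD IT Resource": "AD IT Resource Domain2",
    "AD User": "AD User Domain2",
}

def clone_export(xml_text, renames):
    """Replace every occurrence of each original object name with its clone name."""
    for old, new in renames.items():
        xml_text = xml_text.replace(old, new)
    return xml_text
```

For example, `clone_export('<Resource>AD IT Resource</Resource>', RENAMES)` yields the same element with the cloned name, leaving all tags and attributes untouched.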
AD Connector Objects:
Object Type: Object Name for AD Connector
  1. IT Resource: AD IT Resource
  2. Resource Object: AD User
  3. Process Form: UD_ADUSER, UD_ADUSRC
  4. Provisioning Process: AD User
  5. Scheduled Tasks: Target Recon
  6. Lookup Definitions: Many…
  7. Child Tables: UD_ADUSER*
In my current OIM System, I have the default connector configured to the First AD Domain. The cloned connector is configured to the second AD Domain. I thought it was confusing. So, I had a question about this and received the below information from Oracle Support. Hope it is useful.
The best approach is to import the connector once and then clone it twice, once per domain, leaving the original objects installed but unused. That way, when you upgrade to a newer connector version on top of the existing one, you update the original unused template objects and then clone the changes onto the two domain objects.
The second method, keeping the installed AD Connector for one domain and cloning the AD Connector for the second domain, will also work.
I liked the approach of keeping the two connectors cloned. You may prefer the other approach, but it is up to you to decide.
I will write a continuation of this post later. Until then…