Tuesday, December 27, 2011

Cloud Computing: Easily Configurable Computing Resources


What is Cloud Computing?


Cloud Computing is a model that enables access to configurable computing resources, easily provisioned with no or only minimal service-provider involvement.

“If the user does not know the physical location and configuration of the system that delivers the services of computation, software, data access, and storage”, then it is Cloud Computing.

Cloud Computing is the flexible use of IT services available in real time via the Internet or an intranet, giving users global access to standardized services. Cloud services require no (or minimal) initial fixed-cost investment and are charged on a usage basis.

What can be achieved with Cloud?


Within Cloud Computing, varieties of technically innovative solutions are combined and can deliver the potential for an innovative business approach leading to:
  • Cost Reduction
  • Cost Structure Improvement
  • Variability of Cost
  • Flexibility of Services
  • New business models
  • Significantly reduced time to market for new products
Steps to Optimize Cloud Computing:
Is the Business ready to use Cloud?

If yes, which services offered within the Cloud are the most suitable to support the Business Model?

Before engaging in Cloud Computing, initiate a project that reviews existing processes and products. The review process should go through the following steps in order to optimize use of Cloud Services:

Step 1: Standardization – Evaluate the actual level of process automation within the business being serviced.

Step 2: Virtualization – Identify the opportunities (from an IT and process perspective) within the existing operating model to virtualize services.

Step 3: Automation – Analyze the level of process automation and identify the potential of using the Cloud for the virtualized processes.

Step 4: Cloud Computing – Introduce the selected services into the Cloud and define how to integrate Cloud services into the existing IT and process environment.

When should a company go for cloud computing?


There is a growing belief that over the next few years, Cloud Computing will become a major stimulus for change in how corporations view and use Information Technology.

Cost efficiency, scalability and availability are the main drivers in the discussion around Cloud Computing. Security and privacy are the main issues that need to be dealt with when using services in the Cloud.

From an end user perspective, Cloud Services will offer small and midsize companies access to a level of technology that is currently not available to them. Large corporations will be able to make their IT environment more flexible and reduce their cost base. The speed of new product developments will increase significantly and will offer major advantages for the end customer.

Conclusion:


All major providers offer services in the Cloud that, when properly implemented and used, can bring enormous advantages to large, small and midsize companies. The challenges in implementing Cloud Computing are similar to those inherent in IT sourcing engagements and can be dealt with successfully when addressed in a structured way.

Tuesday, December 6, 2011

HP Extensibility Accelerator


The rapid adoption of Rich Internet Application (RIA) technologies and Web 2.0 innovations has significant implications both for web users and for the teams involved in the functional testing of web-enabled applications. The advent of Web 2.0 applications has created unprecedented challenges for organizations that focus on automated functional testing. Web 2.0 applications can leverage various technologies on the client side and through web browsers, and the client side of the application now processes more scripted code and richer presentation frameworks than in traditional environments. This shift of processing to the client side challenges the capabilities of every toolset designed for the functional testing of web-enabled applications.

Testing challenges in Web 2.0:

Some of the typical challenges are as follows:
  • Web pages are dynamic and asynchronous.
  • Portions of web pages can be refreshed automatically to give users updates on sports scores, stock quotes, or the current activities of people they connect with via social networking sites.
  • Users have more control than ever before. Via sites such as iGoogle, users can create their own home pages that bring together information and content from across the web, such as local weather forecasts, headlines from prominent news outlets, and videos from YouTube.
  • The client side of the application processes more scripted code and rich presentation frameworks than in traditional environments.
Why HP Extensibility Accelerator?
HP set out to provide advanced tools for testing Web 2.0 technologies with HP Functional Testing, and the result is a new accelerator for functional testing that overcomes these challenges. The Extensibility Accelerator for HP Functional Testing provides a Visual Studio-like IDE that accelerates and facilitates the design, development and deployment of HP QuickTest Professional Add-in Extensibility support sets.

These support sets extend the HP Functional Testing Web Add-in so you can test Web controls that are not supported out-of-the-box.

Evolution of HP Extensibility Accelerator:

Extensibility is enhanced and accelerated with the new HP Extensibility Accelerator for Functional Testing, which provides an environment that speeds up the development of Web Add-in Extensibility toolkit support sets.

What is HP Extensibility Accelerator?

HP Extensibility Accelerator for Functional Testing is a separate utility that can be used on a machine with or without an installed copy of HP Functional Testing.

It provides a user interface and special tools that help us define new test object classes, map those test object classes to the controls in our application, and teach QTP how to identify the controls, perform operations on the controls and retrieve their properties.

Features of HP Extensibility Accelerator:
  • Lets you create and define test object classes for your custom controls using JavaScript functions.
  • Provides JavaScript editing capabilities and debugging tools to facilitate the writing of these functions.
  • Maps the test object classes to the controls in your application and automatically identifies the rules that teach HP QuickTest Professional how to recognize them.
  • Simplifies the process of creating and editing the test object and toolkit configuration XML files through its IDE.
  • Provides deployment capabilities that let you automatically deploy your new toolkit support set to HP QuickTest Professional, or package it so you can share it with other HP QuickTest Professional users.
Conclusion:

With the HP Extensibility Accelerator for Functional Testing, we’re making it easy for our users and partners to create their own extensibility assets and extend our software to support web controls that are not supported out of the box. With the hundreds of Ajax toolkits in use today and new ones coming out each month, the HP Extensibility Accelerator provides an extremely important set of tools for your organization.

The software itself can be installed and used on a machine that does not have HP QuickTest Professional on it. Custom toolkits developed with the software can then be deployed on one or more systems that are running HP QuickTest Professional.

Pre-requisites for General Automation


Automation testing is the most preferred way to achieve our testing goals on time, but it in turn depends on many other factors. We will look at some of them below.

Why Test Automation?


Below are some reasons to go for automation:

  • New releases and bug fixes in working modules: automate your testing procedure when you have a lot of regression work.
  • Testing web applications with multiple users simultaneously: automate your load testing to create virtual users and check the load capacity of your web application.
  • Testing applications where code changes frequently: automate your testing when your GUI is almost frozen but you have a lot of frequent functional changes.

Risks associated with Automation:


There are many risks associated with test automation. We need to pay attention to each of the points below to reap the benefits of automation.
  • Skilled resources:
For automation you need resources with some programming knowledge.

“Do the resources have sufficient programming knowledge for automation testing?”

If not, do they have the technical capabilities or programming background to adapt easily to new technologies?
  • Initial cost of automation is very high:
The initial setup cost of automation, i.e. the cost of purchasing the automation tool, training, and maintaining the test scripts, is very high.
  • Do not automate your UI if it is not fixed:
Beware before automating the user interface. If the user interface is changing extensively, the cost of script maintenance will be very high; basic UI automation is sufficient in such cases.
  • Is your application stable enough to automate?
It is a bad idea to automate testing work early in the development cycle (unless it is an agile environment); script maintenance cost will be very high in such cases.
  • Are you thinking of 100% automation?
You cannot automate 100% of your testing work. In areas like performance testing, regression testing and load/stress testing you have a chance of getting close to 100% automation, but areas like user interface, documentation, installation, compatibility and recovery must be tested manually.
  • Do not automate tests that run once:
Identify application areas and test cases that will run only once and are not included in regression, and leave them out of the automation suite.
  • Will your automation suite have a long lifetime?
Every automation script suite should have a long enough lifetime that its building cost is definitely less than the cost of equivalent manual execution.

It is a bit difficult to analyze the effective cost of each automation script suite. As a rough rule, your automation suite should be used or run at least 15 to 20 times across separate builds (a general assumption; it depends on the application's complexity) to yield a good ROI.
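The break-even reasoning above can be sketched as a small calculation. The cost figures in the example are purely illustrative assumptions, not measurements from any real project:

```python
def breakeven_runs(build_cost: float, maintain_cost_per_run: float,
                   manual_cost_per_run: float) -> float:
    """Number of runs after which automation becomes cheaper than manual testing.

    Automation pays off once:
        build_cost + runs * maintain_cost_per_run < runs * manual_cost_per_run
    """
    saving_per_run = manual_cost_per_run - maintain_cost_per_run
    if saving_per_run <= 0:
        raise ValueError("automation never pays off: no saving per run")
    return build_cost / saving_per_run

# Illustrative numbers: building the suite costs 40 hours, each automated
# run costs 0.5 hours of maintenance, each manual pass costs 3 hours.
runs = breakeven_runs(build_cost=40, maintain_cost_per_run=0.5,
                      manual_cost_per_run=3)
print(runs)  # 16.0 runs -- in line with the 15-20 builds rule of thumb
```

With these assumed costs the suite breaks even after 16 builds, which matches the general 15-20 run guideline; plug in your own project's numbers to get a meaningful estimate.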

Conclusion:


Automation testing is the best way to accomplish most testing goals with effective use of resources and time, but you should be cautious when choosing the automation tool.
Be sure to have skilled staff before deciding to automate your testing work; otherwise the tool will sit on the shelf yielding no ROI, and handing expensive automation tools to unskilled staff leads to frustration.
Before purchasing an automation tool, make sure it is the best fit for your requirements. No tool will match your requirements 100%, so identify the limitations of the best-matching tool and use manual testing techniques to cover them. Open-source tools are also a good option for starting with automation.

SAP TAO and Panaya – Complements, Not Competitors


Panaya is a new Software-as-a-Service solution that reduces the cost and risk of making changes to ERP systems. The company was founded in 2005 by Yossi Cohen, an expert in the field of application management.


Designed for business users, from CIOs to business analysts, Panaya automatically analyzes the impact of pending business process changes and informs all stakeholders on a need-to-know basis. Panaya can be used during implementation, maintenance and upgrades.


Utilizing a SaaS model, Panaya can be deployed within a few hours to instantaneously lessen the risk of business disruption and reduce ERP maintenance costs.


What does Panaya provide?
  • Effective Test Management
  • Shows you what will break
  • Tells you exactly what to test
  • Automatic code corrections
  • No more test maintenance: test scripts always stay up-to-date
  • Creates test scripts for you, as you go
  • Tests can be run manually or automatically
  • Test results are documented for you
What can be achieved by using Panaya?
  • Cut 70% of your SAP upgrade cost and effort.
  • Cut 50% of your planning and development costs.
  • Enable companies that use SAP to save up to 50% of their application lifecycle costs.
  • Minimize the risks associated with system changes.
  • Utilizing cloud-based simulation to analyze the impact of pending changes, Panaya automatically pinpoints which custom programs will break as a result of an upgrade or support package implementation.
  • No Installation required.
  • Reduce risk of post go-live downtime.
How does it complement SAP TAO?
  • The main prerequisite for automation using SAP TAO is the availability of manual test cases in a standard format. Panaya provides manual test cases as you go, saving a lot of time and effort.
  • SAP TAO offers high reusability and easy maintenance, with minimal effort required during the maintenance phase; Panaya can really help TAO when an SAP upgrade happens.

Monday, December 5, 2011

HexaFLOW GRABBER


1. Introduction: In the current generation of ERP and web-based applications, the automation process is accelerated by the BPT Accelerator framework, which speeds up the test design phase by acting as a ready reckoner. However, a gap analysis of the existing Accelerator framework was required to make the Accelerator more efficient.
Having been pioneers in providing BPT Accelerator solutions across the globe for almost a decade, we performed such a gap analysis. It highlights that the effort distribution in the BPT Accelerator workflow is still biased toward the test design phase and can be fine-tuned.
2. Abstract: In the new era of automation, organizations focus mainly on minimizing script maintenance effort, thereby increasing return on investment (ROI), and on finding a framework in which functional subject matter experts (SMEs) or business analysts can build automation test scripts. This paper promotes an optimal approach to accelerating automation, reducing the effort spent specifically on the test design phase through a simplified and swift test design process.
It proposes a groundbreaking solution: a new framework that transcribes manual interactions into automation scripts. The solution is intended to be adopted for various ERP, Windows and web-based applications, and to reduce the gap that exists between functional knowledge and technical feasibility.
3. Challenges Faced: The BPT Accelerator framework provides an exclusive and easy solution for automation testing, but leaves behind several areas of improvement and key challenges to be addressed by an evolved concept or framework.
Many challenges must be addressed by any solution that aims to increase the efficiency of BPT Accelerators dramatically.
3.1 High dependency on manual mapping of the correct components against the business process flow:
BPT Accelerators include a ready-to-go jump start kit that covers nearly 40% of the test design phase in any automation project starting from scratch. The Accelerators come with three solutions that lay a framework on top of traditional automation to drive the complete test design phase. Using the highly reusable jump start kit and the Accelerator solutions, the test design phase can be automated. However, there remains a major dependency on Accelerator experts with detailed knowledge of the business components: sequencing the components as per the business flow, parameterization, and creation of data tables for iterative execution.
3.2 Non-existence of an application-independent solution:
Although the BPT Accelerator can leverage automation script development for multiple applications, there is no single acceleration solution that handles business flows across various ERPs and web applications simultaneously. This creates the need for a solution that can design test scripts across different applications (ERP and web) using the BPT framework.
3.3 Need for a one-stop package solution:
An end-to-end acceleration solution that packages all aspects of automation testing and renders quality at its best, performing verification, validation and synchronization of the scripts to serve the customer's requirements, is the need of the hour.
3.4 JIT (Just In Time): Considering the role automation scripting plays in testing delivery, there is a high dependency on frozen requirements and a stable instance, which makes the positioning of automation critical and calls for swift and speedy delivery. Customer expectations also point in the same direction, with the same quality assured.
3.5 Erroneous test design: Test design involves critical decisions in replicating the business flow as an automation flow; human errors, assumptions and misconceptions often prove very costly in time, effort and money. This pushes the industry to evolve a solution that transcribes the business flow by itself and reduces human intervention even during the test design phase.
3.6 Application and version compatibility: Beyond the expectations above, an additional challenge is that the solution must be compatible with any application and any version.
4. Proposed Solution(s): The following section throws more light on the proposed solution and the approach that could be adopted to remedy the various challenges faced by the current acceleration industry.
4.1 Application and version compatibility: The solution must be capable of capturing the flow in the application (ERP and web-based applications). It therefore requires a medium that can capture the events or steps performed in the GUI of ERP and web-based applications.
4.1.1 HP QuickTest Professional: Analysis of the applications under test calls for a tool that can capture all the steps performed in the GUI and interpret them into the expected BPT test. The best solution for capturing the test steps is HP QuickTest Professional.
• The solution is apt for capturing the manual flow, since the BPT Accelerator already has HP QuickTest Professional as a major prerequisite. The capability of understanding the manual flow can thus be added with no additional investment in this first and major stage.
• The solution can capture the flow on any version of any ERP application, since the required add-ins are installed in QTP to recognize the test objects in the AUT for executing the designed automation scripts.
4.2 Manual intervention: The initial challenge of understanding and capturing the business flow can be handled by HP QTP, the functional automation tool. Next, we need a solution to override the manual intervention in:
• manual execution to confirm
o whether the application is stable
o whether the data and flow are valid
• creation of the Business Process Test in HP Quality Center
• stitching the business components in the BPT as per the business flow
• synchronization and proof of execution
These challenges need to be addressed by the solution.
4.2.1 HP Quality Center automation: Since the first manual-intervention challenge is handled by HP QuickTest Professional, let us concentrate on a solution that interprets the captured flow and determines which component has to be placed, and in what sequence.
Manual intervention in the second and third points can be eliminated by automating HP Quality Center to create the Business Process Test and stitch the business components in sequence; here the interpretation plays the major role, and through it the challenge of stitching the business components is addressed.
4.3 Synchronization and proof of execution:
The key artifact of any testing, manual or automated, is the proof of execution. It provides evidence both for capturing defects and for confirming that the application works as expected. The solution can be extended here: a capture-screen business component can be included in the manual flow, as and when required, with the screen name, through the control bars.
Synchronization is another key feature expected in an automation script, and including it gives the solution an added edge: a synchronization business component can be added to the flow along with the specific time needed for a screen to load or an action to complete.
4.4 Tangible ROI – Business Benefits:
An analysis was done comparing the current system with a system in which the Flow Transcriber is included in the workflow for script design, and an ROI study was performed on the same. Comparing the effort distribution of a regular BPT Accelerator project against a project that includes the Flow Transcriber shows a drastic change in the effort spent on the test design phase of the overall BPT Accelerator workflow. The table and the graph below show the difference in the effort needed to design a medium-complexity scenario in real time.
Using this solution, the test design phase of the BPT Accelerator workflow is drastically reduced in real business use.
Accelerator Activities – Flow Transcriber Implementation
(Efforts in minutes)

Activity                                         Using Accelerator   Using Flow Transcriber
Review of manual test case
  - Data identification
  - Manual execution                                    30                   30
Test design phase
  - Creation of Business Process Test
  - Stitching of Business Process Test using
    pre-defined components
  - Parameterization of the components in the
    workflow
  - Creation of the data table                         170                   60
Consolidation                                           15                   15
Execution
  - Debug and execute in Test Plan
  - Execute in Test Lab                                 45                   45
Report log validation                                   10                   10

Table: Accelerator Effort Distribution vs. Flow Transcriber Implemented Accelerator Effort Distribution
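The table's figures can be totaled to quantify the claimed saving; the numbers below are taken directly from the table above:

```python
# Effort in minutes per activity: (using Accelerator, using Flow Transcriber)
efforts = {
    "Review of manual test case": (30, 30),
    "Test design phase": (170, 60),
    "Consolidation": (15, 15),
    "Execution": (45, 45),
    "Report log validation": (10, 10),
}

total_acc = sum(a for a, _ in efforts.values())   # total with Accelerator only
total_ft = sum(f for _, f in efforts.values())    # total with Flow Transcriber
design_saving = 1 - efforts["Test design phase"][1] / efforts["Test design phase"][0]

print(total_acc, total_ft)                             # 270 160
print(f"{design_saving:.0%} less test design effort")  # 65% less test design effort
```

In other words, for this medium-complexity scenario the Flow Transcriber cuts the test design phase from 170 to 60 minutes (about 65%) and the overall scenario effort from 270 to 160 minutes.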
Graph: Comparative Effort Distribution Graph – Test Design Phase
5. Business Benefits
• Extensive reduction of the test design phase.
• Ready-to-run scripts provided in negligible time.
• Reduced manual intervention in the BPT script creation phase.
• Instant parameterization and data table creation.
• Synchronization and screenshots can be included in the midst of the design phase through user-interactive controls.
• Cuts across boundaries to support various ERPs and web applications
o A single solution for various ERPs and web applications
Figure: Compatibility of HFG across various Applications
6. Future Direction / Long-Term Focus
• Inclusion of the UI Scanner – the UI Scanner tool is not included in the current concept; in the long term it is planned to be included in the flow.
• Verification and validation – testing of an application is complete only when verification and validation are provided; there is good scope in the Flow Grabber to include this enhancement.
7. Conclusion
A complete solution for the test design phase of any accelerator project, irrespective of the ERP application, is a necessity for optimizing the accelerator. This requirement has been taken up, and analysis and development are under way to evolve the solution into a tool named Hexa Flow Transcriber. It is expected to grow into a complete package for the test design phase of an accelerator project.

Thursday, November 24, 2011

Business Objects Administration – Cache Management

Cache management in Business Objects is our topic today. We can manage the cache on the server side as well as the client side. Let us see how this is done.
1. Managing cache on the server side
Changing the Business Objects registry parameter MaximumObjectsToKeepInMemory
Memory usage is controlled by the number of objects stored in the object cache. This is governed by the Windows registry key “MaximumObjectsToKeepInMemory”, which specifies the maximum number of objects that the CMS stores in its memory cache. Increasing the number of objects reduces the number of database calls required and greatly improves CMS performance. However, placing too many objects in memory may leave the CMS too little memory to process queries. The upper limit is 100,000 and the default setting is 40,000.
Business Objects Registry Parameter
The other place to set this parameter is on the CMS server’s command line. The parameter is called “-maxobjectsincache” and is invoked by adding it to the command line in the CMC.
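For illustration, a registry export for the setting described above might look like the fragment below. The key path is a placeholder assumption, not the documented location; verify the actual path for your BusinessObjects Enterprise version and back up the registry before editing it. 0xEA60 is 60,000 in hexadecimal, a value between the 40,000 default and the 100,000 upper limit.

```
Windows Registry Editor Version 5.00

; Hypothetical key path -- verify against your BusinessObjects installation
[HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 12.0\CMS]
; 60,000 objects kept in the CMS memory cache
"MaximumObjectsToKeepInMemory"=dword:0000EA60
```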

2. Managing cache on the client side
On the client side we can clear the cache in the web browser and in the Java Control Panel.
Clearing cache in web browser
To clear the cache in Internet Explorer (IE7) follow these instructions:
Tools > Internet Options > General > Browsing history > Delete…
Clearing cache in Java Control Panel
To clear the cache in Java Control Panel follow these instructions:
Control Panel > Java > General > Delete Files…
Hopefully you will be much more comfortable dealing with the cache after reading this post. Keep reading!

Wednesday, November 16, 2011

Configuring Oracle Unified Directory (OUD) 11g as a Directory Server


I used Oracle Unified Directory (OUD) version 11.1.1.5.0 for a local test deployment, and I have tried to collect as much configuration information as possible in this post.
There are three possible configuration options for OUD:
  • as a Directory Server
  • as a Replication Server
  • as a Proxy Server
The Directory Server provides the main LDAP functionality in OUD. The Proxy Server can be used for proxying LDAP requests, and the Replication Server is used for replication from one OUD instance to another, or even to an ODSEE (formerly Sun Java Directory) server. You can read my previous posts on OUD here and here.
In this post, we will talk about configuring OUD after installation as a Directory Server. You can read about OUD installation in my previous post here.
Once installation is complete, you will find the following files in the $ORACLE_HOME directory.
-rwxr-x---  1 oracle oracle 1152 May 17 11:16 oud-proxy-setup

-rwxr-x---  1 oracle oracle 1482 May 17 11:16 oud-proxy-setup.bat

-rwxr-x---  1 oracle oracle 1180 May 17 11:16 oud-replication-gateway-setup

-rwxr-x---  1 oracle oracle 1510 May 17 11:16 oud-replication-gateway-setup.bat

-rwxr-x---  1 oracle oracle 1141 Aug 10 16:50 oud-setup

-rwxr-x---  1 oracle oracle 1538 May 17 11:15 oud-setup.bat
In this listing, the .bat files are used on Windows. So, on Linux (which is what I am using), we will be using the following files.
  • oud-setup – To configure Directory Server
  • oud-replication-gateway-setup – To configure Directory Replication Server
  • oud-proxy-setup – To Setup Proxy Server
You can run the script shown below.
$ ./oud-setup
OUD Instance location successfully created - /u01/oracle/Middleware/Oracle_OUD1/../asinst_2
Launching graphical setup...

The graphical setup launch failed. Check file /tmp/oud-setup-8836874387532698932.log for more details.

Launching command line setup...

Oracle Unified Directory 11.1.1.5.0
Please wait while the setup program initializes...

What would you like to use as the initial root user DN for the Directory
Server? [cn=Directory Manager]:
Please provide the password to use for the initial root user:
Please re-enter the password for confirmation:

On which port would you like the Directory Server to accept connections from
LDAP clients? [1389]: 389

ERROR: Unable to bind to port 389. This port may already be in use, or you
may not have permission to bind to it. On UNIX-based operating systems,
non-root users may not be allowed to bind to ports 1 through 1024
On which port would you like the Directory Server to accept connections from
LDAP clients? [1389]:

On which port would you like the Administration Connector to accept
connections? [4444]:
Do you want to create base DNs in the server? (yes / no) [yes]:

Provide the base DN for the directory data: [dc=example,dc=com]:
Options for populating the database:

1) Only create the base entry
2) Leave the database empty
3) Import data from an LDIF file
4) Load automatically-generated sample data

Enter choice [1]: 1

Do you want to enable SSL? (yes / no) [no]: yes
On which port would you like the Directory Server to accept connections from
LDAPS clients? [1636]:

Do you want to enable Start TLS? (yes / no) [no]: yes
Certificate server options:

1) Generate self-signed certificate (recommended for testing purposes
only)
2) Use an existing certificate located on a Java Key Store (JKS)
3) Use an existing certificate located on a JCEKS key store
4) Use an existing certificate located on a PKCS#12 key store
5) Use an existing certificate on a PKCS#11 token

Enter choice [1]:
Provide the fully-qualified host name or IP address that will be used to
generate the self-signed certificate [ut1ef1]:

Do you want to start the server when the configuration is completed? (yes /
no) [yes]:

Setup Summary
=============
LDAP Listener Port: 1389
Administration Connector Port: 4444
LDAP Secure Access: Enable StartTLS
Enable SSL on LDAP Port 1636
Create a new Self-Signed Certificate
Root User DN: cn=Directory Manager
Directory Data: Create New Base DN dc=example,dc=com.
Base DN Data: Only Create Base Entry (dc=example,dc=com)

Start Server when the configuration is completed

What would you like to do?

1) Set up the server with the parameters above
2) Provide the setup parameters again
3) Print equivalent non-interactive command-line
4) Cancel and exit

Enter choice [1]: 3

Equivalent non-interactive command-line to setup server:

oud-setup \
--cli \
--baseDN dc=example,dc=com \
--addBaseEntry \
--ldapPort 1389 \
--adminConnectorPort 4444 \
--rootUserDN cn=Directory\ Manager \
--rootUserPassword ****** \
--enableStartTLS \
--ldapsPort 1636 \
--generateSelfSignedCertificate \
--hostName ut1ef1 \
--no-prompt \
--noPropertiesFile

What would you like to do?

1) Set up the server with the parameters above
2) Provide the setup parameters again
3) Print equivalent non-interactive command-line
4) Cancel and exit

Enter choice [1]: 4
No configuration performed. OUD Instance directory deleted.
$
Then you need to run oud-setup with the options provided, to create the directory server.
$ ./oud-setup           --cli           --baseDN dc=example,dc=com           --addBaseEntry           --ldapPort 1389           --adminConnectorPort 4444           --rootUserDN cn=Directory\ Manager           --rootUserPassword ******           --enableStartTLS           --ldapsPort 1636           --generateSelfSignedCertificate           --hostName ut1ef1           --no-prompt           --noPropertiesFile
OUD Instance location successfully created - /u01/oracle/Middleware/Oracle_OUD1/../asinst_2
An error occurred while parsing the command-line arguments:  An unexpected error occurred while attempting to initialize the command-line arguments:  Argument “bat” does not start with one or two dashes and unnamed trailing arguments are not allowed
Here, the problem is the rootUserPassword value. The unquoted ****** is a shell glob, so the shell replaced it with all the file names in the current directory (one of them ending in .bat), and the setup failed. Replace it with the actual password for "cn=Directory Manager" as shown below.
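The failure comes from the shell, not from oud-setup: an unquoted run of asterisks is a filename glob, so the shell substitutes matching file names before oud-setup ever sees the argument. A minimal sketch reproducing the expansion (the file names here are made up for illustration):

```shell
# Reproduce the glob expansion in an empty scratch directory.
cd "$(mktemp -d)"
touch alpha.bat beta.sh        # simulate files present in the working directory

unquoted=$(echo ******)        # the shell expands ****** to all matching file names
quoted=$(echo '******')        # quoting preserves the literal asterisks

echo "$unquoted"
echo "$quoted"
```

Quoting the password argument (or, if your oud-setup version supports it, supplying the password via a file-based option instead of on the command line) avoids the expansion entirely.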
$ ./oud-setup \
  --cli \
  --baseDN dc=example,dc=com \
  --addBaseEntry \
  --ldapPort 1389 \
  --adminConnectorPort 4444 \
  --rootUserDN cn=Directory\ Manager \
  --rootUserPassword pass_t3st \
  --enableStartTLS \
  --ldapsPort 1636 \
  --generateSelfSignedCertificate \
  --hostName ut1ef1 \
  --no-prompt \
  --noPropertiesFile
OUD Instance location successfully created - /u01/oracle/Middleware/Oracle_OUD1/../asinst_2

Oracle Unified Directory 11.1.1.5.0
Please wait while the setup program initializes...

See /tmp/oud-setup-5822533240188214866.log for a detailed log of this operation.

Configuring Directory Server ..... Done.
Configuring Certificates ..... Done.
Creating Base Entry dc=example,dc=com ..... Done.
Starting Directory Server ......... Done.

To see basic server configuration status and configuration you can launch /u01/oracle/Middleware/asinst_2/OUD/bin/status
$ cd bin
$ ./status

>>>> Specify Oracle Unified Directory LDAP connection parameters

How do you want to trust the server certificate?

1) Automatically trust
2) Use a truststore
3) Manually validate

Enter choice [3]: 1

Administrator user bind DN [cn=Directory Manager]:

Password for user 'cn=Directory Manager':

--- Server Status ---
Server Run Status: Started
Open Connections: 1

--- Server Details ---
Host Name: ut1ef1
Administrative Users: cn=Directory Manager
Installation Path: /u01/oracle/Middleware/Oracle_OUD1
Instance Path: /u01/oracle/Middleware/asinst_2/OUD
Version: Oracle Unified Directory 11.1.1.5.0
Java Version: 1.6.0_26
Administration Connector: Port 4444 (LDAPS)

--- Connection Handlers ---
Address:Port : Protocol : State
-------------:------------------------:---------
-- : LDIF : Disabled
0.0.0.0:161 : SNMP : Disabled
0.0.0.0:1389 : LDAP (allows StartTLS) : Enabled
0.0.0.0:1636 : LDAPS : Enabled
0.0.0.0:1689 : JMX : Disabled

--- Data Sources ---
Base DN: dc=example,dc=com
Backend ID: userRoot
Entries: 1
Replication: Disabled

$
Now, your newly created OUD Directory Server is running in the machine. You can check this with the ldapsearch command.
$ ldapsearch -h localhost -p 1389 -D "cn=Directory Manager" -w pass_t3st -s sub -b "dc=example,dc=com" "(objectclass=*)" cn
dn: dc=example,dc=com
$
The search returns the single base entry created during setup, as shown above.
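Each entry in LDIF output begins with a dn: line, so counting those lines gives you the number of entries returned. A small sketch using a captured sample of the output above (the attribute lines are assumed for illustration, not taken from a live server):

```shell
# Count entries in captured ldapsearch LDIF output: one "dn:" line per entry.
ldif='dn: dc=example,dc=com
objectClass: top
objectClass: domain
dc: example'

entries=$(printf '%s\n' "$ldif" | grep -c '^dn: ')
echo "$entries"    # one entry, as expected for a base-entry-only directory
```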
Here are some of my Observations:
  • If you want to use the standard ports 389/636 for your Directory Server, you need to run the setup as root. You then need to run the start-ds and stop-ds commands as root as well.
  • There are six scripts to set up OUD components (three for Unix/Linux and three for Windows environments).
  • You can set up a new TLS-based certificate as part of configuring a new Directory Server.
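The first observation follows from the standard Unix rule that ports below 1024 are privileged. A quick sketch of a check you could script before running the setup:

```shell
# Warn when a chosen LDAP port requires root privileges (ports below 1024 are privileged on Unix).
ldap_port=389

if [ "$ldap_port" -lt 1024 ]; then
    verdict="privileged: run oud-setup (and start-ds/stop-ds) as root"
else
    verdict="unprivileged: a normal user can run oud-setup"
fi
echo "$verdict"
```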
Okay, that's all for now. We will meet in another post. Until then!

Monday, October 24, 2011

Advanced Replication Setup for High availability and Performance


In my personal opinion, Oracle leads the market in directory product offerings (LDAP Directories). From Oracle Internet Directory (OID) to the latest Oracle Unified Directory (OUD), Oracle provides a variety of LDAP Directory products for integration.
With the increasing demand for mobile and cloud computing, there is a need to standardize LDAP deployments for Identification, Authentication and (sometimes) Authorization (IAA) services. With a scalable, high-performing, highly available, stable and secure LDAP Directory, these IAA services become easier to integrate with cloud or mobile applications.

Introduction

Oracle Unified Directory (OUD) is the latest LDAP Directory offering from Oracle. As mentioned in my previous post, OUD comes with three main components:
  • Directory Server
  • Proxy Server
  • Replication Server
Here, the Directory Server provides the core LDAP functionality (I assume you already know what an LDAP Directory Server is). The Proxy Server distributes and load-balances LDAP requests across Directory Server instances. And the Replication Server is used for replicating (copying) data from one OUD instance to another, or even to an ODSEE server (we will talk more about replication in this post). You can read my first post on OUD here. In this article, I will write about the Replication Server and an advanced replication setup for Oracle Unified Directory.
Many people want a step-by-step guide (a kind of cheat sheet) for setting up replication in something like OUD or OID. Unfortunately, I am not going to give you that here. In my personal opinion, a cheat sheet is not the right approach and will not help in the long run with gaining concepts or knowledge. We first need to give importance to the basic concepts behind how something works.

First of all, read OUD Documentation

Read the product documentation before you plan your deployment. You can find the OUD documentation here; that link is for OUD version 11.1.1, so make sure to refer to the latest product manual. The documentation provides a lot of detail about the product and saves a lot of investigation time later. For replication, you need to start with the "Architecture Reference" guide.

When do you want to setup replication?

There should be a reason, right? If there is no reason, then there is no need to set up replication at all. Instead, you can have a beer and pass the time happily doing something else.
Ideally, you need replication for high availability and performance. Usually, there will be multiple OUD Directory Server processes running in production. Let's say we need around four OUD Directory Servers (and four more for business continuity/disaster recovery).
Unfortunately, there is no single process that updates all eight OUD Directory Servers in our example. We need a mechanism to synchronize the directory entries across these servers. For this, we use the OUD Replication Server component.

Securing the Replication Traffic

We don't want network sniffers capturing critical user information (this is possible even inside the internal network). We need to encrypt the traffic between the replication servers; do not consider setting up Replication Server communication without encrypted traffic.
Since OUD serves identity data, all of its network traffic is prone to sniffing attacks. Always use encrypted, secure connections to OUD or to any LDAP Directory.

Deciding a Replication Method to use

The next important thing is to decide which replication method you are going to use. This is mostly site specific, and you need to know a lot of details before deciding on a replication method. I am planning to use the following sample architecture for this post. Let's understand our sample OUD architecture first.
Here are the quick components of the architecture:
  • We have one master OUD server called PROD-01. All updates to the directory happen here. Most probably, an HR system will update the directory. Updates can also happen through a custom-developed LDAP application plug-in, or through an Identity and Access Management (IAM) system such as Oracle Identity Manager or Tivoli Identity Manager.
  • PROD-02 will be used alongside PROD-01 for high availability and performance in this production deployment.
  • In the disaster recovery deployment, we have the PROD-03 and PROD-04 servers. These servers need to synchronize user data from the master server PROD-01.
One way to keep the servers in sync is to have the IAM system provision users into every OUD Directory Server directly. However, that provisioning can be time consuming, because it is treated as updating several separate LDAP Directories. A better way to achieve this is to use a Replication Server.
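With OUD, replication between two Directory Servers is typically enabled with the dsreplication utility. The sketch below only assembles and prints a candidate invocation for the PROD-01/PROD-02 pair; the host names, ports and flags shown are placeholders for this sample architecture, so check dsreplication enable --help on your version before running anything:

```shell
# Sketch only: assemble a dsreplication invocation for PROD-01 <-> PROD-02.
# Host names, replication ports and the --secureReplication flags are
# placeholders, not a verified command for any live deployment.
cmd='dsreplication enable \
  --host1 prod-01 --port1 4444 --bindDN1 "cn=Directory Manager" \
  --replicationPort1 8989 --secureReplication1 \
  --host2 prod-02 --port2 4444 --bindDN2 "cn=Directory Manager" \
  --replicationPort2 8989 --secureReplication2 \
  --baseDN dc=example,dc=com --adminUID admin'

printf '%s\n' "$cmd"
```

The --secureReplication flags tie back to the previous section: replication traffic between the servers should always be encrypted.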
We will continue setting up the Replication Server for this architecture. Let's meet in another post. Until then!

Thursday, September 1, 2011

Business Objects Administration – Managing Timeout settings

Hello Bloggers,
From this post onwards, we are going to look in more detail at Business Objects administration and best practices.
Let us start with timeout settings. As we moved from 2-tier to 3-tier architecture (DeskI to WebI), timeouts became significant because they are tied to report running time. This post will be handy for those who struggle with timeout issues and want to know the various levels at which timeouts can be applied in a Business Objects Enterprise deployment.
Timeouts can be configured at the following levels.

1. Designer level

In Designer Application Go to: File > Parameters > Controls
Configure the "Limit execution time to" setting, which limits how long the SQL query is allowed to execute. This has a great impact on reports that contain multiple queries.
Universe Parameters

2. Web Application Server level (Tomcat)

  • On the Business Objects installation server, go to Program Files\Business Objects\BusinessObjects Enterprise 12.0\warfiles\WebApps\InfoViewApp\WEB-INF
  • Open web.xml in a text editor.
  • Change the session-timeout value to the one desired. The default value is 20 minutes.
<session-config>
<session-timeout>60</session-timeout>
<!-- 60 minutes for session objects (default is 20) -->
</session-config>
Similarly, modify web.xml on the following folders:
  • AdminTools – for Query Builder
  • CmcApp – for CMC
  • CrystalReports – for Crystal Reports
  • InfoViewAppActions – for InfoView

3. Central Management Console

Go to: Home > Servers > Web Intelligence Processing Server > Properties
You can configure various timeout settings here, such as cache timeout, idle document timeout, and idle connection timeout. Set these values as desired.
Similarly, modify the settings for the Desktop Intelligence Processing Server and the Crystal Reports Processing Server.
And finally configure the Auto-save time delay option for Web Intelligence under Business Objects Applications.

4. Central Configuration Manager

Go to: CCM > Web Intelligence Report Server > Properties > Command
Add "-RequestTimeout xxx" to its command line (where xxx is the duration in milliseconds).
Configure this value as desired (for 60 minutes, the value would be 3600000).
Apply the same switch to the other report servers as needed.
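The switch takes milliseconds, so the value is simply minutes × 60 × 1000. A one-line sanity check for the conversion:

```shell
# Convert a desired timeout in minutes to the milliseconds expected by -RequestTimeout.
minutes=60
timeout_ms=$(( minutes * 60 * 1000 ))
echo "$timeout_ms"    # 3600000
```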
Hope this is useful for the timeout settings in your BOE deployment.
Watch this space, as there is more to come on BOE administration. Keep reading!

Tuesday, August 23, 2011

Track Data Changes

What is Data tracking?
Data tracking is a new feature available in Web Intelligence in BOXI 3.1. Data tracking places your current report data in context by highlighting changed data and displaying the previous value of a dimension or measure along with its current value.
Data Tracking Options:
Web Intelligence XI 3.1 highlights changed data according to parameters you set. You control the formatting of the changed data, the types of changed data highlighted and the amount of data change that triggers data highlighting.
Reference Data:
When you track data changes, you select a particular data refresh as a reference point. This data is known as the reference data. When you display the data changes, Web Intelligence places your current data in context by showing how it relates to the reference data.
Data Refresh with Data Tracking Options and Reference Data:
Later, when you refresh the same report, Web Intelligence compares the new data with the reference data (described above) and shows the changes according to the parameters you set in the data tracking options.
Advantages:
We can focus our analysis on key areas and avoid wasting time exploring irrelevant data.
Read more about Data Tracking