
Thursday, March 29, 2012

Strategies For Testing Data Warehouse Applications


Introduction:

There is an exponentially increasing cost associated with finding software defects later in the development lifecycle. In data warehousing, this is compounded by the additional business cost of using incorrect data to make critical business decisions. Given the importance of early defect detection, let's first review some general goals of testing an ETL application.

The sections below describe the common strategies used to test a data warehouse system:
Data completeness: 

Ensures that all expected data is loaded into the target table. A short validation sketch follows the checklist below.

1. Compare record counts between source and target, and check for any rejected records.
2. Check that data is not truncated in any column of the target table.
3. Check that unique values are loaded into the target and that no duplicate records exist.
4. Apply boundary value analysis (for example, only data from year >= 2008 should be loaded into the target).
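
A minimal sketch of how these completeness checks might be scripted, assuming the source and target are reachable through standard DB-API connections; the database files, table names (src_orders, tgt_orders) and columns are hypothetical placeholders, not from the original system:

    import sqlite3

    src = sqlite3.connect("source.db")   # placeholder source database
    tgt = sqlite3.connect("target.db")   # placeholder target database

    # 1. Record counts should match between source and target.
    src_count = src.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt_count = tgt.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

    # 3. No duplicates should exist on the business key in the target.
    dupes = tgt.execute(
        "SELECT order_id, COUNT(*) FROM tgt_orders "
        "GROUP BY order_id HAVING COUNT(*) > 1"
    ).fetchall()
    assert not dupes, f"duplicate keys found: {dupes}"

    # 4. Boundary check: only rows with year >= 2008 should have been loaded.
    early = tgt.execute(
        "SELECT COUNT(*) FROM tgt_orders WHERE order_year < 2008"
    ).fetchone()[0]
    assert early == 0, f"{early} rows older than 2008 were loaded"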

Data Quality:

1. Number check: if a value is stored in the source with a prefix, for example xx_30, but the target should hold only 30, validate that the value is loaded without the prefix (xx_).

2. Date check: dates must follow the agreed format, and the format should be the same across all records. Standard format: yyyy-mm-dd.

3. Precision check: precision values should appear as expected in the target table.

Example: the source holds 19.123456, but the target should display it as 19.123 or as a rounded value (for example 20), depending on the rule.

4. Data check: based on the business logic, records that do not meet certain criteria should be filtered out.

Example: only records where date_sid >= 2008 and GLAccount != 'CM001' should be loaded into the target table.

5. Null check: some columns should hold NULL based on the business requirement.

Example: the Termination Date column should be null unless the employee's Active Status column is 'T' or 'Deceased' (see the sketch after this list).
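
A minimal sketch of these row-level quality checks, written against a single record fetched as a dictionary; the column names (load_date, amount, termination_date, active_status) are illustrative assumptions:

    import re

    DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")   # standard yyyy-mm-dd format

    def quality_issues(row):
        issues = []
        # 2. Date check: every date must follow the agreed yyyy-mm-dd format.
        if not DATE_RE.match(row["load_date"]):
            issues.append(f"bad date format: {row['load_date']}")
        # 3. Precision check: amounts should carry at most three decimal places.
        if round(row["amount"], 3) != row["amount"]:
            issues.append(f"unexpected precision: {row['amount']}")
        # 5. Null check: termination_date must be empty unless the status
        #    is 'T' or 'Deceased'.
        if row["termination_date"] is not None and \
                row["active_status"] not in ("T", "Deceased"):
            issues.append("termination_date populated for an active employee")
        return issues

    sample = {"load_date": "2012-03-29", "amount": 19.123,
              "termination_date": None, "active_status": "A"}
    print(quality_issues(sample))   # a clean row returns an empty list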

Note: data cleansing rules are decided during the design phase.

Data cleansing:

Unnecessary columns should be deleted before loading into the staging area.

1. Example: if a name column contains extra spaces, the spaces must be trimmed before loading into the staging area; an expression transformation can perform the trim.

2. Example: if the telephone number and STD code arrive in separate columns but the requirement is a single column, use an expression transformation to concatenate the two values into one column (a sketch of both rules appears below).
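
A minimal sketch of the two cleansing rules above, written as plain Python in place of an Informatica expression transformation; the field names (name, std_code, telephone) are illustrative:

    def cleanse(record):
        cleaned = dict(record)
        # 1. Trim stray spaces from the name before it reaches the staging area.
        cleaned["name"] = record["name"].strip()
        # 2. Concatenate the STD code and telephone number into one column.
        cleaned["phone"] = record["std_code"] + record["telephone"]
        return cleaned

    row = {"name": "  John Doe ", "std_code": "044", "telephone": "2345678"}
    print(cleanse(row)["name"], cleanse(row)["phone"])   # John Doe 0442345678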

Data Transformation: all the business logic implemented in the ETL transformations should be reflected in the target data. A small sketch of how such a rule can be verified appears below.
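
A minimal sketch of a transformation check: re-apply the documented business rule to the source data and compare the result with what was actually loaded. The rule (a bonus calculation) and the column names are hypothetical, used only to illustrate the technique:

    def expected_bonus(row):
        # Hypothetical business rule under test: 10% bonus for grade 'A',
        # otherwise 5%.
        return row["salary"] * (0.10 if row["grade"] == "A" else 0.05)

    source_rows = [{"emp_id": 1, "salary": 1000.0, "grade": "A"}]
    target_rows = {1: {"emp_id": 1, "bonus": 100.0}}   # as loaded by the ETL

    for row in source_rows:
        expected = expected_bonus(row)
        actual = target_rows[row["emp_id"]]["bonus"]
        assert expected == actual, \
            f"emp {row['emp_id']}: expected {expected}, got {actual}"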

Integration testing:

Ensures that the ETL process functions well with other upstream and downstream processes.

Example:
1. Downstream: suppose you change the precision of a column in one transformation. For example, if EMPNO is a column with a data type of size 16, that precision should be the same in every transformation where the EMPNO column is used (see the metadata check sketch after this list).

2. Upstream: if the source is SAP BW, data is extracted through generated ABAP code that acts as the interface between SAP BW and the mapping. Whenever an existing mapping is modified, the ABAP code must be regenerated in the ETL tool (Informatica); otherwise, wrong data will be extracted because the ABAP code is out of date.
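
A minimal sketch of the downstream check above: verify that the EMPNO column keeps the same declared precision in every table it flows through. It assumes a database that exposes an information_schema and a connection whose execute() returns rows (as sqlite3 does); get_connection() is a placeholder, not a real helper:

    EMPNO_PRECISION_SQL = """
        SELECT table_name, numeric_precision
        FROM information_schema.columns
        WHERE column_name = 'EMPNO'
    """

    def check_empno_precision(conn, expected=16):
        # Flag every table where EMPNO's declared precision drifts from the
        # agreed size.
        mismatches = [(table, prec)
                      for table, prec in conn.execute(EMPNO_PRECISION_SQL)
                      if prec != expected]
        assert not mismatches, f"EMPNO precision drift: {mismatches}"

    # Usage (placeholder connection):
    # check_empno_precision(get_connection())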

User-acceptance testing:

Ensures the solution meets users’ current expectations and anticipates their future expectations.
Example: make sure no values are hard-coded.

Regression testing:

Ensures existing functionality remains intact each time a new release of code is completed.

Conclusion:

Taking these considerations into account during the design and testing portions of building a data warehouse will ensure that a quality product is produced and prevent costly mistakes from being discovered in production.

Monday, January 2, 2012

HP Sprinter – New Era in Manual Testing


Problems faced in manual testing:


Manual software testing is the oldest mode of testing software and is still in practice. But all these years of practice have not made it as routine as one would expect. We first need to accept the facts mentioned below.
  • Manual testing can often be very tedious and time-consuming. The productivity of a tester depends on multiple sources (test script, test data, defects tracked, application under test).
  • In today's world, where software needs to operate across multiple operating systems and web browsers, manual testing adds a large amount of time to releasing an application.
  • Manual testing is often error prone. Test steps are easily missed, test data is often entered incorrectly, and defects are often captured incorrectly, thereby decreasing the overall quality of the application, increasing the risk it poses, and increasing costs due to the associated rework.
What is HP Sprinter?


HP Sprinter is an easy-to-use solution from HP that delivers accurate and efficient manual software testing, fully integrated with HP Application Lifecycle Management and HP Quality Center.

How does HP Sprinter ease manual testing?


With this new era of HP Sprinter, manual software testing does not have to be tedious, error prone, or time consuming anymore. Some of the features are:
  • HP Sprinter dramatically reduces the time needed to perform manual software tests and increases their accuracy and effectiveness.
  • Manual tests are launched from HP Application Lifecycle Management or HP Quality Center into HP Sprinter, where the tester carries out the test.
  • The actions and results of the test are recorded and saved within HP Application Lifecycle Management or HP Quality Center.
  • Defects can also be logged directly within HP Application Lifecycle Management or HP Quality Center without leaving HP Sprinter, helping bridge the gap that exists with developers.
  • HP Sprinter also handles automated injection of data into the fields under test, increasing the speed and accuracy with which a test can be executed.
  • HP Sprinter allows screen capturing, screen annotation, and movie recording. It can also automatically record and log a tester's activities and actions during exploratory testing without pre-defined steps.
  • HP Sprinter's mirror testing capabilities allow users' actions to be automatically replicated across multiple systems hosting multiple environment configurations.
Conclusion:


HP Sprinter thus helps streamline manual testing and improve collaboration and communication. It increases execution speed, improves productivity, reduces costs, and accelerates application delivery.

Thanks for reading this article. Know more about: HP Sprinter

Wednesday, April 28, 2010

QATS @ Hexaware

Welcome to the world of QATS.
Our Motto
Strive for perfection
Our Goal
Testing to make the application foolproof
What we do
We at QATS live, eat, and sleep testing. Innovation is in our blood. We do not just talk about innovation; we live it.
We never take things for granted. We love our work and we leave no stone unturned. This philosophy helps differentiate us from others. The team strives to bring that wow effect to every engagement we undertake.
The WOW effect
A testimony to the innovation at QATS is now also on the Microsoft site. We hope you will keep coming back here.
We promise to keep serving you with more info, tips, and solutions that will make your life easier.
For more details on the complete suite of QATS offerings, check out this link: http://www.hexaware.com/new_testing.htm