
Tuesday 31 December 2013

Agile!!! An Effective Testing Process

AGILE!!! AGILE!!! AGILE!!! It is one of the most common words we hear in the testing community these days. What does agile mean? Why did agile come into the picture? What makes agile different from regular testing practice?

In the dictionary, "agile" means quick and nimble. In software testing, it means making the testing process quicker and smarter. The main agenda of the agile process is to deliver the business value desired by the customer at frequent intervals, while working at a sustainable pace.

Agile principles 

The Agile Manifesto (Wikipedia) is based on the following twelve principles:

  1. Customer satisfaction by rapid delivery of useful software
  2. Welcome changing requirements, even late in development
  3. Working software is delivered frequently (weeks rather than months)
  4. Working software is the principal measure of progress
  5. Sustainable development, able to maintain a constant pace
  6. Close, daily cooperation between business people and developers
  7. Face-to-face conversation is the best form of communication (co-location)
  8. Projects are built around motivated individuals, who should be trusted
  9. Continuous attention to technical excellence and good design
  10. Simplicity—the art of maximizing the amount of work not done—is essential
  11. Self-organizing teams
  12. Regular adaptation to changing circumstances
There are several models used by organizations during the software life cycle. I believe Waterfall, the traditional testing approach, is the basis for all of them. The waterfall model is strictly linear: once a phase has finished, its outputs are taken as the inputs to the next phase. However, due to its sequential nature, this model cannot deal with iterations and evolution. We cannot alter the requirements in the middle of a phase, and testing cannot start until the development phase is completed. Any bug identified during a phase should be solved in that same phase; bugs identified in the later phases of this model consume more money as well as time. The main disadvantage of this model is that there is no going back: once a phase is done, we cannot revisit it. From a tester's point of view, the traditional testing approach looks like this:
Image Courtesy: Google Images
To overcome the pitfalls of the traditional testing approach, several advanced models have been developed. Among them, the V-Model is the most popular, and many organizations have started following it. Typically, the V-Model looks like this:

                             
Image Courtesy: Google Images

In short, the V-Model is a Verification and Validation model. The left arm, read top to bottom, is the verification phase, and the right arm, read bottom to top, is the validation phase. It is called a "V" because most phases on the left have a corresponding phase of activity on the right. Coding starts only after all the design and documentation work is complete, and the corresponding test design is prepared in parallel with each phase.
  • The BRS is prepared and verified during requirements analysis, and the user acceptance test design is prepared alongside it.
  • The SRS is prepared and verified during system design, and test cases and scenarios are prepared for all the functional specifications.
  • The High Level Design (HLD) is prepared during the architectural phase based on the specifications, with the integration test design prepared in parallel.
  • The Low Level Design (LLD) is prepared during the modular design phase, and the coding phase officially starts based on the LLD.
  • The validation part starts after the coding phase, by executing tests according to the respective test design plan.
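The left-to-right pairing of the V can be summarised as a simple mapping in code. This is purely illustrative; the LLD/unit-test pairing is the conventional one for the V-Model, assumed here rather than stated above:

```python
# Each development (verification) phase on the left arm of the V is paired
# with the test (validation) design activity prepared in parallel with it.
v_model_pairs = {
    "Requirement analysis (BRS)": "User acceptance test design",
    "System design (SRS)":        "System/functional test cases",
    "High Level Design (HLD)":    "Integration test design",
    "Low Level Design (LLD)":     "Unit test design",   # conventional pairing
    "Coding":                     "Test execution (validation)",
}

for dev_phase, test_activity in v_model_pairs.items():
    print(f"{dev_phase}  <->  {test_activity}")
```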
The advantage of the V-Model is that verification and validation are done in parallel within the same phase. It gives more importance to a strict process flow for developing a quality product.

The disadvantage is that if any change is made in the middle of a phase, all of its design documents need to be updated from the initial level. Client involvement during software development is negligible.

The main differences we can observe in the agile process are customer involvement during the development process and frequent deliverables. Interestingly, all the phases run at a parallel pace, and new requirements are accepted even in the middle of development. Here, software is developed in an incremental way: a module is designed and developed, immediately given for testing, and then customer feedback is taken. In the same way, another module is developed, integrated with the previous module, and released for customer feedback. The same process is iterated n number of times. Typically, the life cycle of the agile methodology looks like this:

Image Courtesy: Google Images

The figure above shows that the agile process is iterative, integrating module by module. To fulfil the agenda of the agile methodology, the team has to put in some hard work. Apart from the phases shown in the figure, a few other activities are also part of the agile process: regular team meetings and face-to-face communication.

Team meetings are one of the most common practices and often the easiest to implement. Work is divided into time-boxed iterations called sprints, carried out by a Scrum team, with the Scrum Master acting as the team lead/project lead. All regular meetings should be time-boxed and facilitated by the Scrum Master. The agenda of the meeting mainly covers the tasks done by each team member and the blockers they are stuck on; the Scrum Master coordinates the road blockers with the development team. A Project Report (PR) meeting should also be held on a weekly or monthly basis with the customer, so that the client can easily know the status of the project.

A face-to-face session is arranged among SMEs, developers, and testers: a defect management meeting about the high-level defects in the application. They discuss and prioritize all the high-level defects and conclude whether each one is valid; if valid, what its impact on the application is, who can fix it, and so on. Walkthroughs should also be arranged regularly to cover changed parts, new working processes, or any additional requirements, ensuring that both the development and testing teams share the same information.

That's it. Agile is just Develop -> Integrate -> Test -> Feedback -> Release, with sufficient iterations until the customer's requirements are fulfilled. A disadvantage of the agile process is the number of regular meetings, especially if the facilitator is not well organized. And if the project starts without a proper plan, the intended outcome will definitely be deviated from. However, these disadvantages can be mitigated if you have a good consultant at hand.
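The Develop -> Integrate -> Test -> Feedback loop can be sketched in a few lines of Python. The function names here are placeholders for illustration, not any real framework's API:

```python
def agile_release(modules, develop, integrate, test, get_feedback):
    """Run the Develop -> Integrate -> Test -> Feedback loop, one module
    per iteration, until every module is part of the product."""
    product = []                                  # modules integrated so far
    for module in modules:
        built = develop(module)                   # develop the current module
        product = integrate(product, built)       # integrate with the rest
        assert test(product), f"regression after adding {module}"
        get_feedback(product)                     # customer feedback each cycle
    return product

# Toy usage: "developing" a module is just upper-casing its name.
log = []
result = agile_release(
    ["login", "search"],
    develop=lambda m: m.upper(),
    integrate=lambda product, built: product + [built],
    test=lambda product: True,                    # stub: everything passes
    get_feedback=lambda product: log.append(len(product)),
)
```

The point of the sketch is the shape of the loop: testing and feedback happen inside every iteration, not once at the end as in waterfall.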


Happy Testing...




A Journey towards Testing...:)


SDLC Vs STLC

Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC) both get equal emphasis during software development. Before I started my journey in testing, I knew only one of those names, SDLC, because that's what I read during my graduation. Until then, I thought of testing as just a method. Waterfall is the base and the earliest of all the later life cycle models, and in every model "testing" is just a phase. So I had a doubt: if testing is only a phase in a life cycle model, why does it have a separate life cycle of its own?
So I Googled that name and read most of the articles and blogs. I got the answer, but I was not convinced by their explanations. They said strategy, plan, environment, testing levels, and that's it; I found no systematic reasoning in them. But in one post I saw the difference written by a testing geek. I was impressed with the explanation, and it convinced me about how the software life cycle phases relate development to testing. Below is a comparison of SDLC and STLC, phase by phase.


1. Requirements Gathering
   SDLC: Requirements gathering is done by the business analyst. The development team analyses the requirements from the design, architecture, and coding perspective.
   STLC: The testing team also reviews and analyses the requirements. It identifies the testing requirements (what types of testing will be required) and reviews the requirements for the logical and functional relationships between features/modules, so that any gaps are caught at an early stage.

2. Design
   SDLC: The technical architect works on the high-level and low-level design of the software. The business analyst works on the UI design of the application.
   STLC: The test architect, generally the test lead/manager, does the test planning and identifies the high-level testing points. Basically, requirement detailing is done in this phase.

3. Coding or Development
   SDLC: The development team does the actual coding based on the designed architecture.
   STLC: The testing team writes the detailed test cases.

4. Testing
   SDLC: Actual testing is carried out in this phase: unit testing, integration testing, system testing, etc.
   STLC: Test execution and bug reporting; manual and automation testing are done, and the defects found are reported. Re-testing and regression testing are also done in this phase. But I don't agree with this statement. If I relate the testing phase to STLC, I would say it is testing of the test cases and test plans themselves, i.e. basically a review of test cases, test scenarios, etc.

5. Deployment
   SDLC: The application is deployed on the production environment for real end users.
   STLC: Final testing and implementation are done in this phase, and the final test report is prepared. I don't agree with this statement either. For software, deployment is basically when it is installed for real use; in STLC terms, deployment would be when the test cases get used, i.e. the execution of the test cases.

6. Maintenance
   SDLC: Basically, it includes post-production/deployment support and enhancements.
   STLC: Most people say maintenance testing is carried out in this phase. My definition is: updating and maintaining the test plans and test cases required for testing the support requests and enhancements that are part of maintenance.
 Comparison courtesy: http://www.softwaretestingstuff.com

Thanks for the author...
Happy Testing :)

 

Saturday 9 November 2013

How to write a better test case!!!

 Image courtesy: http://www.mindmeister.com

Starting with the basic definition: a test case is a set of input pre-conditions and output post-conditions that are verified for a particular test condition identified from the given specification. My own definition is that a test case is nothing but investigating the functionality in various ways, and its outcome is determined by a Pass/Fail status after execution. Everywhere, requirements are mapped to just two types of cases, positive test cases and negative test cases, and the same then carries on to the sub-requirements too.

Usually, test cases are written to estimate the test coverage of the application. Most companies that follow standards author test cases before testing starts. It is better to write test cases before starting the official testing practice than to do endless ad-hoc testing.

1. Before you start writing a test case, read the functional document for the application area. Analyze the test environment carefully and understand the expected behaviour of the test area.
2. Try to prepare a checklist of all the functionalities. That checklist can serve as your test scenarios, and our target is to identify various cases for each scenario.
3. For every front-end design point mentioned in the document, prepare at least one positive and one negative case.
4. Ensure your test cases cover all the points mentioned in the specification document. Functional and GUI cases should be written separately.
5. Start with the high-priority scenarios, i.e. the features that matter most to the application.
6. Test steps must be clear and accurate but not too long. Avoid duplication of test cases.
7. Test data should be prepared for every test case. Expected and actual results must be logged during execution.
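To illustrate point 3, here is how one positive and two negative cases for the same requirement might look as plain Python assertions. The requirement and the function are hypothetical, invented just for this example:

```python
import re

# Hypothetical requirement: usernames must be 3-12 alphanumeric characters.
def is_valid_username(name: str) -> bool:
    """Return True when the username satisfies the (hypothetical) spec."""
    return bool(re.fullmatch(r"[A-Za-z0-9]{3,12}", name))

# Positive test case: input the spec says must be accepted.
def test_valid_username_is_accepted():
    assert is_valid_username("tester01")

# Negative test cases: inputs the spec says must be rejected.
def test_too_short_username_is_rejected():
    assert not is_valid_username("ab")            # below 3 characters

def test_special_characters_are_rejected():
    assert not is_valid_username("bad!name")      # '!' is not alphanumeric
```

Notice that the negative cases are not an afterthought: each one targets a distinct way the requirement can be violated.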

With the above characteristics, a better test case can be written. A typical test case template contains:

Test Case id: A unique ID number.
Test Scenario: Can come from our checklist.
Test Case: The test case identified from the scenario: positive, negative, and so on.
Test Case Description: Proper and accurate steps for executing the test case.
Test Data: The input data to be used while executing the test case.
Expected Result: The result as per the specification document.
Actual Result: The result observed in the application after executing the test case.
Status: Pass/Fail.
Review Comments: Comments written by the reviewer.
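The template above maps naturally onto a small data structure. This is only an illustrative sketch, not any test management tool's real schema:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """One row of the test case template described above."""
    case_id: str             # Test Case id
    scenario: str            # Test Scenario (item from the checklist)
    case: str                # positive / negative case from the scenario
    description: str         # proper, accurate steps to execute
    test_data: str           # input data used for execution
    expected_result: str     # result as per the specification document
    actual_result: str = ""  # filled in during execution
    status: str = ""         # Pass / Fail, derived after execution
    review_comments: str = ""

    def executed(self, actual: str) -> "TestCase":
        """Record the actual result and derive the Pass/Fail status."""
        self.actual_result = actual
        self.status = "Pass" if actual == self.expected_result else "Fail"
        return self
```

Keeping the expected result as data makes the Pass/Fail decision mechanical, which is exactly what point 7 above (logging expected vs actual) is asking for.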


Happy Testing...





Friday 18 October 2013

How to report a BUG???

Firstly, what do we call it: a BUG or a DEFECT?
As far as I'm concerned, both are practically the same. We call it a defect when the tester raises the issue, and a bug once the developer officially accepts the defect.

It's a professional war :) between a developer and a tester. However, it's a friendly battle :) and only one can have the upper hand: either quality code or a strong tester.
When a bug or a defect is identified, a tester should report it to the respective developer. But how? What steps need to be followed while assigning a bug to the developer?



With the following template, I believe a bug/defect can be reported accurately:

1. Title: A short, single-line description of the bug.
2. Identified by: Name of the tester.
3. Date: Date the bug was identified.
4. Assigned to: Name of the respective developer.
5. Environment: The environment in which the bug was identified, such as Windows, Linux, or Solaris.
6. Build no: The build release number in which the bug was identified.
7. Bug Type: States what type of bug it is. Typically these are:
  • Functional: Bugs where the behaviour deviates from the expected flow.
  • Usability: When an end-to-end scenario can only be accomplished in a different way instead of the intended way.
  • GUI: Bugs that affect the presentation, look, or layout of pages, images, and form elements.
8. Bug Severity: This reflects the impact of the bug on the application. Severity levels can differ as per the process followed by each company; my severity levels are critical, high, medium, and low.
  • Critical: The screen/application encounters unexpected errors, cannot be tested further, and the bug needs to be resolved immediately. Ex: cannot log in to the app.
  • High: The functionality in the screen/app deviates from the expected result. Ex: clicking a link takes you to page X instead of the intended page Y.
  • Medium: A record is saved into the database but shown improperly in the user interface.
  • Low: Bugs that do not interfere with core functionality and are just annoyances that may or may not ever be fixed; typically spelling mistakes or colour legends. Ex: search results display in the wrong format in different browsers.
9. Priority: How fast the bug has to be resolved; decided by the developer/manager.
10. Test data: The test data used while testing, through which the bug was identified.
11. Module name: Name of the module in which the bug was identified.
12. Screen name: Name of the screen, under the respective module, where the bug was identified.
13. Description: Detailed description of the bug, with proper steps to reproduce it.
14. Root cause: The proper reason the bug was caused.
15. Attachment: A proper screenshot of the bug, if required.
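For teams that track bugs in code rather than a spreadsheet, the template can be sketched as a small data structure. This is a hypothetical illustration, not the schema of any real bug tracker:

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    """Severity levels from the template above."""
    CRITICAL = 1   # blocks further testing, resolve immediately
    HIGH = 2       # functionality deviates from the expected result
    MEDIUM = 3     # saved correctly but shown improperly in the UI
    LOW = 4        # annoyances: spelling mistakes, colour legends

@dataclass
class BugReport:
    title: str           # short, single-line description
    identified_by: str   # name of the tester
    date: str            # date the bug was identified
    assigned_to: str     # name of the respective developer
    environment: str     # e.g. Windows, Linux, Solaris
    build_no: str        # build release the bug was found in
    bug_type: str        # Functional / Usability / GUI
    severity: Severity
    description: str     # detailed steps to reproduce

    def summary(self) -> str:
        """One-line summary, handy for triage lists and meeting agendas."""
        return f"[{self.severity.name}] {self.title} (build {self.build_no})"
```

Making severity an enum (rather than free text) prevents the "Critical" vs "critcal" typos that make triage filters miss bugs.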

"Although the tester has classified the bug, the lead or manager has the right to re-classify it."

Happy Testing...






Thursday 27 June 2013

What should you test first when a new functionality change occurs?

In my experience, when a build is released for testing, as a team we must first properly read the release notes provided by the development team, and then prioritize our testing levels:

1. Sanity Testing (initial): Go through each and every link in the application, ensure that there are no blockers, and finally certify that the build is ready for further testing.

2. Ad-hoc Testing for a while: Test the application with random scenarios for one or two hours.

"After all, we are most likely to find bugs in the new features released with the build, so why do this sanity pass and a sample of ad-hoc testing?"

"Of course, I accept that our first priority is testing the new functionality. But before moving to that, the tester has to certify that the build was properly deployed, which is the initial step before actual testing starts, isn't it? After successful completion of sanity testing, we proceed according to the release notes."


3. Executing Test Scenarios (high level): After a while of ad-hoc testing, the tester must execute all the high-priority scenarios according to the release notes, which I mean as the positive way of testing. The tester can surely filter out the high-priority bugs at this stage itself.

4. Executing Test Scenarios (low level): Exercise the application in depth, covering the medium- and low-priority scenarios along with the negative scenarios.

5. Re-Testing: Retesting the fixed bugs.
After a few iterations of the same process, the final iteration ends the big task with...


6. Regression Testing: Ensuring the new changes do not affect the existing functionality.
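The ordering above, run sanity first and stop if it fails, then work down through the priorities, can be sketched without any test framework. The pages, scenarios, and statuses here are stubs, invented for illustration:

```python
def sanity_checks() -> bool:
    """Level 1: every main page loads with no blockers (statuses stubbed)."""
    pages = {"/login": 200, "/home": 200, "/search": 200}
    return all(code == 200 for code in pages.values())

def high_priority_scenarios():
    """Level 3: positive scenarios taken from the release notes."""
    return [("new_feature_happy_path", True)]      # (name, passed)

def low_priority_scenarios():
    """Level 4: in-depth, medium/low-priority and negative scenarios."""
    return [("empty_search_rejected", True)]

def run_build_testing() -> str:
    # Certify deployment first: if sanity fails, the build is rejected
    # outright and the deeper (more expensive) levels never run.
    if not sanity_checks():
        return "build rejected"
    results = high_priority_scenarios() + low_priority_scenarios()
    failed = [name for name, passed in results if not passed]
    return "all passed" if not failed else f"failed: {failed}"
```

The key design point is the early return: the high- and low-level suites are only worth running once the build itself is certified testable.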

Having recently started my journey in testing, this is the kind of process I have experienced when a build is released to the testing team.

HAPPY TESTING!!!






Wednesday 26 June 2013

Testing is nothing but Common Sense

It's the 2nd of January 2012, and over a year has passed since I started my journey towards testing. In this one-year span I have learned a lot, during my training period and afterwards, and I can see a lot of difference in myself.

During my journey, I learnt why to test, what to test, and how to test. I started with the question "Why to test?", followed by "What to test?", and now I find myself researching "How to test?" all the time.

"Why to Test?"
I can say testing is essential before any product or application gets released into the market. Even a matchstick is tested before being released to the market, so why wouldn't we test our application, which is far more valuable? An organization with a good product earns a good reputation; in a similar fashion, a company with a good testing team will always deliver a high-quality product. But never forget: "No product is 100% bug free." :)

"What to Test?"
I would describe this as testing beyond the scope. We testers have to think out of the box, in a generic way, from an end user's perspective. Testing only the functional specifications written in the document is like rewriting the textbook examples; a person achieves more when he can solve the exercise problems.

"How to Test?"
A tester thinking with a little common sense will achieve good results; that's what I believe and have experienced. A developer uses logical thinking to pour his logic into the code, but somehow loses his common sense while deploying it (I don't blame him for it; it's just a human mistake). A tester, however, has to use his common sense first and his logical thinking second. In my experience, I have seen many places where people emphasise bug counts over quality, and I believe that comes from a lack of common sense. Ultimately, we need our product to be released with fewer bugs, and that can be achieved easily by adding common sense to our testing.

With a functional document as reference, anyone can test the product, but testing beyond the document is nothing but common sense, and that's the difference between an end user, a developer, and a tester. I use my common sense while testing, so I'm proud to be a tester, and I can strongly say to anyone that

 "TESTING IS NOTHING BUT COMMON SENSE"

TESTING WITH COMMON SENSE GIVES BETTER RESULTS. AGREE????



