User's Manual
V1.2
XQual
Cidex 436, chemin des Tourres, 06330 Roquefort-les-Pins, France
Table of contents

1. OBJECTIVES
2. GENERAL OVERVIEW
3. BASICS
3.1 USER TREE
3.1.1 CREATE A COMPANY
3.1.2 CREATE A HIERARCHY IN THE COMPANY
3.1.3 CREATE A USER
3.1.4 SUBMIT A NEW ABSENCE
3.1.5 EDIT AN ABSENCE
3.1.6 CHECK THE CALENDARS
3.1.7 CHANGING ADMIN PASSWORD
3.2 SUT TREE
3.2.1 CREATE A SUT
3.2.2 CREATE A SUT INHERITING REQUIREMENTS FROM ANOTHER SUT
3.3 AGENT TREE
3.3.1 ADD MY LOCAL HOST
3.3.2 ADD ALL NECESSARY HOSTS
3.4 REQUIREMENT TREE
3.4.1 CREATE A CATEGORY
3.4.2 CREATE A REQUIREMENT
3.4.3 EDIT A REQUIREMENT
3.5 SPECIFICATION TREE
3.5.1 CREATE A SPECIFICATION
3.5.2 EDIT A SPECIFICATION
3.6 PROJECT TREE
3.6.1 CREATE A PROJECT
3.6.2 CREATE A TASK
3.6.3 CREATE A SPRINT
3.6.4 EDIT A SPRINT
3.6.5 ALLOCATE SOME RESOURCES TO A SPRINT
3.6.6 ASSOCIATE SOME TASKS TO A SPRINT
3.6.7 DAILY UPDATE PROGRESS OF THE TASKS OF A SPRINT
3.6.8 SEE THE VELOCITY CHARTS/CHECK THE STATUS OF THE SPRINT
3.7 TEST TREE
3.7.1 CREATE A TEST
3.7.2 ASSOCIATE A TEST TO SPECIFICATIONS
3.7.3 CREATE A TESTCASE
3.7.4 MODIFY THE TESTPLAN OF A TESTCASE
3.8 CAMPAIGN TREE
3.8.1 CREATE A CAMPAIGN
3.8.2 ORDER THE TESTS IN THE CAMPAIGN
3.8.3 CREATE A CAMPAIGN SESSION
3.8.4 RUN A CAMPAIGN SESSION
3.8.5 SEE THE EXECUTION DETAILS
3.8.6 GET THE RESULTS
3.8.7 CHECK THE PROGRESSION OF A CAMPAIGN
3.9 DEFECT TREE
3.9.1 CREATE A DEFECT
3.9.2 EDIT A DEFECT
3.9.3 LINK A FAILED TEST TO A DEFECT
4. GETTING MORE INTO THE DETAILS
4.1 LOCALIZATION
4.2 INTERNATIONALIZATION
4.3 CHANGE TRACKING
4.4 TESTING COVERAGE/TRACEABILITY MATRIX
4.4.1 GLOBAL REQUIREMENTS TRACEABILITY MATRIX
4.4.2 GLOBAL SPECIFICATIONS TRACEABILITY MATRIX
4.4.3 DETAILED REQUIREMENTS COVERAGE
4.4.4 DETAILED SPECIFICATIONS COVERAGE
4.4.5 DETAILED TESTS COVERAGE
4.4.6 DETAILED CAMPAIGN COVERAGE
4.5 SCHEDULING TEST EXECUTION
4.6 TRACKING THE TEST IMPLEMENTATION
4.7 DEFECTS REPORTING
4.7.1 PER-USER REPORTS
4.7.2 PER-SEVERITY REPORTS
4.7.3 PER-PRIORITY REPORTS
4.7.4 SUBMISSION/RESOLUTION RATES
4.8 ATTACHMENTS
4.8.1 ADD AN ATTACHMENT
4.8.2 DOWNLOAD/OPEN AN ATTACHMENT
4.9 GENERATE DOCUMENTATION
4.9.1 REQUIREMENTS BOOKS
4.9.2 SPECIFICATIONS BOOKS
4.9.3 PROJECTS BOOKS
4.9.4 TESTPLANS
4.9.5 TEST REPORTS
4.10 CUSTOMIZING THE DOCUMENTS
4.10.1 CHANGING THE LOGO
4.10.2 CUSTOMIZING THE REPORTS
4.11 SEARCHING
4.11.1 BY NAME
4.11.2 BY ID
4.11.3 ADVANCED AND PLAIN TEXT SEARCH
4.11.3.1 Requirements
4.11.3.2 Specifications
4.11.3.3 Tasks
4.11.3.4 Tests
4.11.3.5 Defects
4.12 IMPORTING DATA
4.12.1 FROM CSV
4.12.1.1 Tests and testcases
4.12.2 FROM XML
4.12.2.1 Tests and testcases
4.12.2.2 Requirements
4.12.2.3 Specifications
2. General overview
Access to XStudio
XStudio is restricted to users who have suitable credentials. A login process is used to authenticate each user before he can use the system. Each user is then granted a list of permissions.
By default, an admin account is created at installation time. The admin has (among others) the permission to create users. If you do not know your credentials, contact your XQual administrator.
Each of these entities can be smoothly and conveniently organized in separate trees.
3. Basics
Here are the main entities managed in XStudio's Data Model:

Company: Several companies will be involved in the testing process:
• the company that delivers the product to test
• the company in charge of writing the testplan
• the company in charge of implementing and executing the tests
Of course, all these companies can be the same.

User: Each company has users that will be involved in the testing process as:
• author of the testplan
• test developer
• test operator
• task performer

SUT (System Under Test): What we want to test. This can be a software or a hardware target.

Agent: Tests can all be run locally, or on any host having XAgent installed and running. All hosts with XStudio or XAgent MUST be referenced in the Agent tree.

Requirement: Features required for the SUT.

Specification: Deduced from the requirements, the specifications precisely detail each function of the SUT.

Project: A generic project.

Sprint: Some tasks are associated to a sprint. An intermediate deliverable will come out of each sprint.

Category: Generally, there will be different categories of tests for one single product. A category is characterized by a unique way to run all the tests under this category.

Test: Tests are developed based on the specifications. Each test must verify one particular item of the specifications. Tests can include different testcases.

Table 1 - Entities
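The relationships in Table 1 (a SUT has requirements, requirements are refined into specifications, specifications are verified by tests, and tests contain testcases) can be sketched as simple data classes. This is an illustrative model only; the class and field names are assumptions and do NOT reflect XStudio's actual schema or API.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified mirror of the entities in Table 1.
# Names and fields are illustrative only, not XStudio's real model.

@dataclass
class Testcase:
    name: str

@dataclass
class Test:
    name: str
    testcases: list = field(default_factory=list)   # a test can include several testcases

@dataclass
class Specification:
    name: str
    tests: list = field(default_factory=list)       # tests verify items of a specification

@dataclass
class Requirement:
    name: str
    specifications: list = field(default_factory=list)  # specs are deduced from requirements

@dataclass
class Sut:
    name: str
    version: str                                    # a SUT is identified by name and version
    requirements: list = field(default_factory=list)

# Example chain: one requirement refined into one specification covered by one test.
login_test = Test("Login succeeds with valid credentials", [Testcase("valid password")])
login_spec = Specification("Login dialog behaviour", [login_test])
auth_req = Requirement("The product must authenticate users", [login_spec])
sut = Sut("MyProduct", "1.0", [auth_req])
print(len(sut.requirements))  # → 1
```

Walking this chain from SUT down to testcase is what makes coverage and traceability reporting possible later on.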
As you will see, XStudio makes heavy use of trees. Trees allow managing entities in a very flexible way. Each tree is associated with a few icons:
Overlay Represents
Create
Delete
Move
Copy
Edit
View
Initialized
Playing
Paused
Stopped
Open
Add
Remove
Search
Select
Special item (read-only)
Move to the right
Move to the left
Move up
Move down
Show items (available for closable items such as sprints, tasks or defects)
XStudio’s GUI is based on a simple and clean design:
• a left panel including a tree
• a right panel:
o showing information concerning the element selected in the tree
o proposing a toolbar allowing the user to execute specific actions on the element selected in the tree
3.1 User Tree
Here is a typical user tree. It immediately shows useful information:
• the total number of users
• the number of users in each company
• the number of users in each folder
• the status of the users (enabled/disabled indicated by the color of the icon)
3.1.1 Create a company
Note: this company entity will be automatically shared in the SUT tree.
3.1.2 Create a hierarchy in the company
Then, it’s important to well define the working groups and teams in the company. This can be
achieved by creating a complete tree of folders and sub
sub-folders.
folders. To do this:
• in the tree, select a company
• on the right panel, click on the create folder button
• enter the name of the folder and submit
• immediately, the folder appears in the tree
Of course, a sub
sub-folder
folder can be created into a folder.
Redo the same operation until you are satisfied with the organisation.
3.1.3 Create a user
• fill the Details tab with the user name, password, preferred language, email address and location. Do not forget to check the Enabled checkbox: a disabled user cannot log into the system.
• use the Rights tab to select what this user will be able to do. For now, just check the first checkbox, corresponding to the root folder (this automatically grants ALL the rights to this user)
• click on submit
• immediately, the user appears in the tree
3.1.4 Submit a new absence
Each user can enter absences. After submission, absences have the status new. Once the manager has set them as approved, the status changes accordingly.
3.2 SUT Tree
A SUT is an abstract object representing the target we want to test. It can be a hardware device or a software component. The SUT must be detailed enough so that we can identify it easily. A SUT should be uniquely defined through its name and version.
3.3 Agent Tree
All hosts on the network that will have XStudio or XAgent installed MUST be included in the tree in order to be able to execute tests.
3.4 Requirement Tree
Requirements are all the conditions the SUT should comply with. Generally, writing the requirements list is the first thing to do before working on detailed specifications. In a perfect world, the SUT should come with a list of requirements, but this may not be the case.
Entering the requirements of the SUT is optional. We do encourage you to define the requirements completely, though: this information is very useful for coverage reporting.
You can add the complete description of each requirement in the requirement tree, or you can decide to just point to the external requirements document(s).
Overlay: Represents
New: New requirement
Ack: The requirement is being reviewed by the person who is supposed to sign it off
Approved: The requirement has been approved
Table 4 - Requirements Status
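The requirement statuses above form a small lifecycle (New, then Ack while under review, then Approved). The sketch below models it as a transition table; the exact transitions allowed (in particular whether a reviewer can send a requirement back to New) are an assumption for illustration, since XStudio controls them through user rights.

```python
# Hedged sketch of the requirement status lifecycle described above.
# The "Ack -> New" (send back) transition is an assumption, not documented behaviour.
ALLOWED = {
    "New": {"Ack"},               # a new requirement goes out for review
    "Ack": {"Approved", "New"},   # reviewer signs off, or sends it back (assumption)
    "Approved": set(),            # terminal state in this sketch
}

def advance(status: str, target: str) -> str:
    """Move to `target` if the transition table allows it."""
    if target not in ALLOWED[status]:
        raise ValueError(f"illegal transition {status} -> {target}")
    return target

s = "New"
s = advance(s, "Ack")
s = advance(s, "Approved")
print(s)  # → Approved
```

The same pattern applies unchanged to the specification statuses in section 3.5, which use the same New/Ack/Approved overlays.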
3.4.3 Edit a requirement
To edit a requirement:
• in the tree, select a requirement
• on the right panel (Details tab), edit the information you wish and submit
Note: Depending on the rights you’ve been granted, you may or may not be able to set the status to a specific state.
3.5 Specification Tree
Here is a typical specification tree. It immediately shows useful information:
• the total number of specifications
• the number of specifications in each category
• the number of specifications in each folder
• the status of each specification (indicated by the color of the icon)
• the priority of each specification (indicated by the column)
You can add the complete description of each specification in the specification tree, or you can decide to just point to the external specification document(s).
Overlay: Represents
New: New specification
Ack: The specification is being reviewed by the person who is supposed to sign it off
Approved: The specification has been approved
To edit a specification:
• in the tree, select a specification
• on the right panel (Details tab), edit the information you wish and submit
Note: Depending on the rights you’ve been granted, you may or may not be able to set the status to a specific state.
3.6 Project Tree
Here is a typical project tree. It immediately shows useful information:
• the total number of projects
• the total number of sprints
• the total number of tasks
• the number of sprints in each project
• the number of tasks in each project
• the status of each sprint (indicated by the color of the project icon)
• whether the tasks have already been affected to a sprint or not (indicated by the color of the task icon: a grayed icon indicates that the task is already affected to a sprint)
The scrum methodology defines the notion of sprints. A sprint is the result of an iteration in a project. So, to deliver a product, you will deliver several intermediate releases, each corresponding to a sprint. A sprint is generally at most 2 or 3 weeks long. A number of tasks will be associated to each sprint, and at the end of the sprint a demo of all the features developed can be done.
Overlay: Represents
Idle: Idle sprint
Running: The sprint is currently running
Finished: The sprint is finished
To edit a sprint:
• In the tree, select a sprint
• on the right panel (details tab), edit the information you wish and submit
• the duration (in effective man.days) is automatically updated
Note: Depending on the rights you’ve been granted, you may or may not be able to set the status to
a specific state.
This screen is a bit different from the others since the left panel includes 2 separate areas:
• the usual tree including all the tests in the system
• a sub-tree at the bottom that lists the testcases corresponding to the currently selected
test
Tests can be arranged by placing them in specific folders. For instance, the tests can be organized
the same way as the specifications. If you’re testing a software API, a common approach is
to have one test per function/method and to group these functions together. You can then extend
your test suite by adding stress tests, negative tests, etc., which will probably require additional folders.
Tests and testcases are obviously the most important part of the Data Model definition. The
normal process is to start from the specifications and create as many tests and testcases as necessary
to check that all specifications are properly implemented.
3.7.1 Create a test
To create a test:
• in the tree, select a folder (create one if necessary)
• on the right panel, click on the create test button
• a dialog box including three tabs is displayed
• fill the Details tab with the name and priority of the test (leave the canonical path blank)
• fill the Testplan tab with the prerequisites and the general description of the test. You can
use the formatting tools (wiki-style) to format the text. Later, in reports, this text will appear
correctly formatted.
• pick one user in the Author tab that will be registered as the author of the test
• click on submit
• immediately, the test appears in the tree
One testcase must be able to check one specific function under particular conditions. The sum of
all the testcases makes one test. For a test to succeed, all of its testcases must succeed.
To create a testcase:
• in the tree, select a test
• on the right panel, click on the create testcase button
• a dialog box including two tabs is displayed
• fill the Details tab with the index (defining the order in which the testcases will be executed
within a test), name and general description of the testcase. You can use the formatting
tools (wiki-style) to format the text. Later, in reports, this text will appear correctly formatted.
Do not forget to check the Implemented checkbox. Non-implemented tests are NOT
executed when running a campaign. In case of manual tests, always check the
Implemented checkbox.
• select the Testplan tab and define all the steps and checks that will be needed in this
testcase:
o add a step:
in the tree, select the root folder
click on the create step button
enter the description of the step and submit
o add parameters (opt.):
in the tree, select the parameters node
click on the create parameter button
enter the description of the parameter and submit
repeat the operation if you need to specify more parameters
o add checks (opt.):
in the tree, select the checks node
click on one of the create boolean operators buttons
click on the new operator and click on the create check button
enter the description of the check and submit
repeat the operation if you need to specify more checks (you can mix as
many different boolean operators as you want)
o repeat the operation for every step and submit
• click on submit
• immediately, the testcase appears in the sub-tree on the left panel
Once all the tests have been defined and implemented, you have to run them. To do so, you’ll first
have to gather all the tests you wish to run in a campaign. For instance, it may be of interest to
define a campaign including all the tests of a specific category (or just a subset). A campaign is, by
definition, just an ordered list of tests.
At this stage, you would be able to run a campaign, but what happens if you run a campaign several
times? Of course, you want to be able to retrieve the results from each run of a campaign. This is
where the Campaign Session comes in.
You can create as many campaign sessions from a campaign as you want. This allows you to archive
independently all the results and reports of each execution of the tests.
3.8.1 Create a campaign
To create a campaign:
• in the tree, select a folder (create one if necessary)
• on the right panel, click on the create campaign icon
• a dialog box including two tabs is displayed
• select one or several tests in the list and use the move buttons to position the
test(s) wherever you want in the list
• click on submit
3.8.3 Create a campaign session
To create a campaign session:
• in the tree, select a campaign
• on the right panel, click on the create campaign session button
• a dialog box including seven tabs is displayed
• If the campaign includes tests to be executed manually (the tests are part of one or several
categories where you chose manual.jar or simple_manual.jar as launcher), then you will
get additional dialog boxes such as:
3.8.5 See the execution details
To see the details of the campaign session execution:
• in the tree, select a campaign session
• on the right panel, select the Content tab
At this point, a report analysis must be done by the test operator. This analysis should lead to
associating all failed testcases with some defects.
• fill the Details tab with the name, description, steps to reproduce, reproducibility, platform,
operating system, status, severity and priority
• pick one user in the Assigned to tab who will be registered as the one assigned to resolve
the defect
• in the Found in tab, check all the SUTs on which this defect can be observed
• click on submit
• immediately, the defect appears in the tree
Overlay Represents
New New defect
Assigned The defect has been assigned to a person to resolve it
Ack The defect is being investigated
Resolved The defect has been declared as resolved
Closed The fix has been verified and the defect has been closed
To edit a defect:
• In the tree, select a defect
• on the right panel (details tab), edit the information you wish and submit
Assigned:
Correction target date
Ack:
Correction target version
Completion %
Resolved/Closed:
Correction type
Correction description
Correction patches
Note: For better tracking purposes, when setting a defect’s status to Resolved, you should also pick
one SUT in the Fixed in tab.
• check all the defects that need to be associated to the test execution
• toggle the select filter button
(opt.) re-toggle to display only the selected requirement
• click on submit
4. Getting more into the details
4.1 Localization
XStudio is entirely localized. There are currently 4 languages supported:
• English
• French
• Italian (partially)
• Spanish (partially)
To have XStudio running in a specific language, you just need to set your profile with one of those
four languages:
4.2 Internationalization
A user is given a language (localization) which is used to display the application in a specific
language, but is also associated with a location (internationalization) which is used to know which
public holidays and week-ends this user will benefit from.
Each country can be easily configured: public holidays can be added, edited and deleted. In
addition, each country is associated with some week-end settings. Most countries in
the world use Saturday and Sunday for the week-end, but some others (such as Israel,
Qatar, Algeria etc.) use Friday and Saturday, and still others (such as Saudi Arabia)
Thursday and Friday.
The default settings are supposed to be correct, but the administrator of the system can customize
them if necessary. These settings are accessible through the Settings menu entry.
Hence, all the calendars are affected by this change as each user may have different week-ends.
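The week-end examples above can be restated as a small lookup table (an illustrative sketch only; it simply restates the examples given in the text, and the actual settings are edited through XStudio's Settings menu, not in code):

```python
# Per-country week-end settings, restating the examples from the text.
# Illustrative only; XStudio stores these in its own configuration.
DEFAULT_WEEKEND = {"Saturday", "Sunday"}
WEEKENDS = {
    "Israel": {"Friday", "Saturday"},
    "Qatar": {"Friday", "Saturday"},
    "Algeria": {"Friday", "Saturday"},
    "Saudi Arabia": {"Thursday", "Friday"},  # as stated in this manual
}

def is_weekend(country, day):
    """True if `day` is part of the week-end in `country`."""
    return day in WEEKENDS.get(country, DEFAULT_WEEKEND)
```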
Note that the calendar tree is still expandable as usual. This greatly facilitates the reading i.e. when
you want to know the details about why a user is overloaded in a certain time frame.
A legend (including some gradient colors for the workload) has been added to help the reading.
To check the history of an element, click on it and select the Changes tab on the right panel. The
panel shows a table gathering all the dated changes. Here are for instance changes you could see
on a defect:
Note: You can edit directly the details of a user by clicking on his name in the changes table.
4.4 Testing Coverage/Traceability Matrix
One of the major interests in using a test management tool is the ability to track the coverage of your
testing. As you’ve already seen in previous chapters, requirements are linked to specifications and
specifications are linked to tests. From this data, it is possible to generate some coverage metrics.
Those metrics are computed per category and are retrievable from requirements, specifications
and tests.
This traceability matrix is also present in the requirement book (that can be generated from
XStudio).
To get campaign coverage metrics:
• switch to the Campaigns tab on the left panel
• in the tree, select a campaign
• on the right panel, select the Coverage tab
• the panel shows useful information such as:
o percentage of the coverage
o the list of specifications fully, partially or not covered by the campaign
o the list of requirements fully, partially or not covered by the campaign
4.5 Scheduling test execution
When some tests are aimed at being executed completely automatically (using some specific
automatic launcher), you can schedule a campaign to be executed on a regular basis. To do this,
you need to create a Schedule. Then, at the right time, a campaign session will be automatically
created and executed by XStudio.
To create a schedule:
• in the tree, select a campaign
• on the right panel, click on the create schedule button
• a dialog box including seven tabs is displayed
• fill the Details tab with the name and the description of the schedule
• in the Scheduling tab, select the days and the time when you want the sessions to be
created and executed. Don’t forget to check the Enabled checkbox. A disabled schedule
will not generate any test execution
• (opt.) select the Test operator tab and pick the user who will (virtually) run the sessions
• select the Agent tab and pick the agent on which the sessions will be run. This agent
needs to have XAgent installed and running
• select the SUT tab and pick the SUT on which the sessions will be run
• select the Configuration tab and pick the configuration you wish for each category
involved in this schedule. Once a schedule is created, it is impossible (by design) to
change the associated configurations. If no configuration is available, you will need to
create one
• (opt.) select the CC Emails tab and check some users to receive a notification when the
campaign sessions are completed
• click on submit
• immediately, the schedule appears in the tree
When the time comes for an agent to execute the tests, the agent will create the campaign
session and then execute it. From any station hosting XStudio, after a refresh, the campaign
session will appear under the schedule node in the tree.
4.7 Defects Reporting
A defect tracking system is good only if it provides a simple and efficient way of retrieving
information about defects. XStudio generates extensive reporting on defects.
4.7.2 Per-severity reports
To get per-severity reports:
• select the tab corresponding to the desired group of defects:
o Active Defects (New, Assigned and Ack)
o Passive Defects (Resolved and Closed)
o All Defects
• select if you want some data about:
o Major defects (Blocking, Major)
o Minor defects (Minor, Cosmetic, Enhancement)
• The report will display the corresponding trends and an additional pie chart representing
the current status
4.7.3 Per-priority reports
To get per-priority reports:
• select the tab corresponding to the desired group of defects:
o Active Defects (New, Assigned and Ack)
o Passive Defects (Resolved and Closed)
o All Defects
• select if you want some data about:
o High Priority defects (High)
o Low Priority defects (Normal and Low)
• The report will display the corresponding trends and an additional pie chart representing
the current status
4.7.4 Submission/Resolution rates
To get submission/resolution rates reports:
• select the Resolution Rates tab
The top part of the panel shows the progression of the number of Active and Passive defects as
well as the metrics as of now and as of the last record.
The bottom part of the panel shows the submission and resolution rates.
4.8 Attachments
You have the ability to attach files (whatever the format is) to some entities managed by
XStudio. This is an extremely powerful way of centralizing/sharing documents. Attachments can be
created for the following entities:
Requirements tree:
• category
• folder
Specifications tree:
• category
• folder
Test tree:
• category
• folder
• test
• testcase
Campaign tree:
• campaign session
• testcase execution (only uploadable programmatically by the launcher/test)
Defect tree:
• defect
Whenever it’s possible, two panels are displayed: one for attachments directly attached to the
entity and another one for the attachments inherited from the ancestor nodes. It is possible to move
directly to one of these parents by clicking on the anchors.
4.8.1 Add an attachment
To add an attachment:
• click on the Create attachment button
• a dialog box is displayed
• pick a file and submit
To do so:
• switch to the Requirements tab on the left panel
• in the tree, select a category or a folder
• click on the Create report button
• first select the destination folder for the book
4.9.2 Specifications books
To generate specification books, follow the same instructions as those for generating
requirement books (but from the Specifications tab on the left panel).
4.9.3 Projects books
To generate projects books, follow the same instructions as those for generating requirement
books (but from the Projects tab on the left panel).
4.9.4 Testplans
To generate testplans, follow the same instructions as those for generating requirement books
(but from the Tests tab on the left panel).
4.9.5 Test reports
To generate test reports, follow the same instructions as those for generating requirement books
(but from the Campaigns tab on the left panel).
4.10 Customizing the documents
All the documents that you can generate from XStudio are customizable.
4.10.2 Customizing the reports
All documents that are generated by XStudio are first internally generated in XML, then possibly
transformed into a different format using specific XSLT transforms:
For instance, let’s imagine you want to generate a custom requirement book. You can create your
own XSLT file. For example, you can take the file requirementHTML_RawResult.xslt as a basis
and modify it as much as you want, rename it requirementHTML_My_Own_Report1.xslt and
ensure it is located along with the other .xslt files (in the export/xsl folder – either locally if using
only a standalone install, or on the Apache/Tomcat server to share this report with everybody
connecting to this server).
From now on, when you try to generate a requirements book, the type My_Own_report1 will be
available in the combo box:
The internal process is then the following:
• several SQL requests are made to the database to get all the necessary information
• a structured XML document including all the required data is generated
• if the user chose XML, the XML is saved on the disk in the requirements folder with the
correct name
• if the user chose HTML, the XML document is transformed using the XSLT corresponding
to the selected type.
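The combo-box behaviour suggests the available report types are derived from the .xslt file names in the export/xsl folder. The following sketch illustrates that naming convention (an assumption drawn from the text, not XStudio's actual discovery code):

```python
# Derive report-type names from requirementHTML_<Type>.xslt file names,
# matching the naming convention described in the text. Illustrative only.
import re

def report_types(xslt_filenames):
    """Return the <Type> part of every requirementHTML_<Type>.xslt name."""
    types = []
    for name in xslt_filenames:
        m = re.match(r"requirementHTML_(.+)\.xslt$", name)
        if m:
            types.append(m.group(1))
    return types
```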
4.11 Searching
4.11.1 By name
All the trees in XStudio include an indexing system allowing you to search in real time for an entity
in the tree by its name. To experiment with it yourself, just start entering some text in the Search
field of the tree.
To automatically select a searched item in the tree, you can move using the arrows keys within the
suggestion list and validate by pressing Enter or just click on the item in the list.
4.11.2 By Id
The data model gives a unique identifier to each entity managed by XStudio. The Ids are of the
form:
Template Represents
R_<id> Requirement
S_<id> Specification
T_<id> Test
TC_<id> Testcase
C_<id> Campaign session
D_<id> Defect
TA_<id> Task
To search for an entity from its Id, just type the Id in the Search Id field and
validate.
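The Id scheme above is easy to work with programmatically. A small sketch (illustrative only; the prefix table simply restates the one above):

```python
# Map each Id prefix to its entity type, as in the table above.
PREFIXES = {
    "R": "Requirement", "S": "Specification", "T": "Test",
    "TC": "Testcase", "C": "Campaign session", "D": "Defect", "TA": "Task",
}

def parse_id(entity_id):
    """Split e.g. 'TC_123' into ('Testcase', 123); None if malformed."""
    prefix, _, number = entity_id.partition("_")
    if prefix in PREFIXES and number.isdigit():
        return PREFIXES[prefix], int(number)
    return None
```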
4.11.3.1 Requirements
Procedure:
• in the requirement tree, select a category
• on the right panel, select the Search tab
• check all the checkboxes (Status and Priority) that match your search criteria
• (opt.) type some text in the Text to search field and press Enter
Note: The results list is updated in real time and you can reach one particular defect by just clicking on
it. The results table can be ordered by clicking on the column headers.
4.11.3.2 Specifications
Procedure:
• in the specification tree, select a category
• on the right panel, select the Search tab
• check all the checkboxes (Status and Priority) that match your search criteria
• (opt.) type some text in the Text to search field and press Enter
4.11.3.3 Tasks
Procedure:
• in the project tree, select the root folder
• on the right panel, select the Search tab
• type some text in the Text to search field and press Enter
4.11.3.4 Tests
Procedure:
• in the test tree, select the root folder
• on the right panel, select the Search tab
• type some text in the Text to search field and press Enter
4.11.3.5 Defects
Searching defects based on a combination of some very specific criteria is extremely important to
control the quality of the product.
To do so:
• in the defect tree, select a category
• on the right panel, select the Search tab
• check all the checkboxes (Reported by, Assigned to, Found in, Fixed in, Status,
Severity and Priority) that match your search criteria
• (opt.) type some text in the Text to search field and press Enter
To do so:
• from the menu, select File > Import from CSV
• a dialog box is displayed
• pick the Tests and Testcases (with testplan) option
• click on the Open button and select the file you wish to import
• the raw data area displays the content of the file
• click on Submit
4.12.2 From XML
The XML format used is the same as the one used to export tests/testcases in XML.
4.12.2.2 Requirements
To import requirements from XML:
• from the menu, select File > Import from XML
• a dialog box is displayed
• pick the Requirements option
• click on the Open button and select the file you wish to import
• the raw data area displays the content of the file
• click on Submit
4.12.2.3 Specifications
To import specifications from XML:
• from the menu, select File > Import from XML
• a dialog box is displayed
• pick the Specifications option
• click on the Open button and select the file you wish to import
• the raw data area displays the content of the file
• click on Submit
5. The expert’s corner
5.1 Tests attributes
You can create and associate custom attributes to your tests. These attributes can be used later in
filters to automatically select some tests matching some specific criteria.
There are two types of attributes: static and dynamic. Dynamic attributes are identified with the
icon. A dynamic attribute is an attribute that you can overwrite when creating a campaign session.
• click on Submit
• immediately, the attribute appears in the tree
• check this attribute
• give a value to this attribute
• click on Submit
• immediately, the modified attribute appears in the tree
o click on submit
o click on the preset filter button
o filtering will be performed and the content tab will highlight and pre-select all the
tests matching the filter. You can of course unselect or select new tests from this
list
• click on submit
• immediately, the campaign appears in the tree
Note: You can reach one particular campaign session by just clicking on it.
5.4 Duration of test execution
5.4.1 Estimated duration of a campaign
As we’ve seen before, a campaign is made of a list of tests. If a test has already been run several
times, XStudio can provide an estimated duration based on previous executions, as well as the
probability of exactness.
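The manual does not document the estimation formula, so the following is only a plausible sketch: take the mean of the previous run durations as the estimate, and derive a rough confidence figure (the "probability of exactness") from how much the runs disagree. Both choices are assumptions for illustration:

```python
# Hypothetical estimator: mean of past durations, confidence shrinking
# with the relative spread of those durations. Not XStudio's formula.
from statistics import mean, pstdev

def estimate_duration(previous_durations):
    """previous_durations: list of past run durations (e.g. in seconds)."""
    if not previous_durations:
        return None, 0.0
    est = mean(previous_durations)
    spread = pstdev(previous_durations)
    confidence = 1.0 if est == 0 else max(0.0, 1.0 - spread / est)
    return est, confidence
```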
You may have different types of test implementation that require completely different approaches to
execute them. To each type of test (i.e. tests developed in different languages) corresponds a
“category”. For any category of tests you have, you’ll need a specific Launcher.
Several launchers can live together. You just need to add all the necessary launchers (.jar files) in
the <Install_folder>/bin/launchers folder.
To get more information on how to develop your own launcher, read the “Developer’s guide”
Launcher Description
manual.jar For manual testing (step-by-step procedure)
simple_manual.jar For manual testing (all-in-one procedure)
autoit.jar For AutoIt test scripts
perl.jar For perl test scripts
tcl.jar For TCL test scripts
testcomplete7.jar For AutomatedQA® TestComplete 7 tests
testpartner.jar For Compuware® TestPartner tests
visualstudio.jar For Microsoft® VisualStudio Team System (Test Edition) tests
beanshell.jar For Beanshell test scripts
squishqt.jar For Froglogic® Squish for Qt tests
squishweb.jar For Froglogic® Squish for the web tests
junit3.jar For JUnit v3 tests
junit4.jar For JUnit v4 tests
pyunit.jar For PyUnit tests
nunit.jar For NUnit tests (so covering any .NET tests: C#, J#, C++/CLI, Managed C++,
VisualBasic.NET)
testng.jar For TestNG tests
marathon.jar For Jalian® Marathon tests
exe.jar For executable tests
6.1.1.1 Configuration
The manual.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > set testcase as failed as soon as one step fails: If set to true, when one step fails in the
testcase, the complete testcase is set as failed and the remaining steps are skipped.
Default value is: false
Timing > delay between testcases (ms): Default value is: 0
These values can be changed while creating the campaign session from XStudio.
• Go back to previous test:
• Go back to previous testcase:
• Restart current testcase:
• Pause current testcase:
• Resume current testcase:
• Go forward to next testcase:
• Go forward to next test:
6.1.1.3 Timeout per test
While executing manual tests, the launcher provides instructions to the operator to execute actions
(and do verifications). The tests are configurable so that some timing restrictions can be added. For
instance, it is possible to give the operator 10 minutes maximum to execute the
actions/verifications (i.e. you need to test that a message appears on the screen of the SUT in less
than 10 minutes; if the operator doesn't validate that he saw that message within the 10 minutes,
then the test is automatically set as failed).
If you want to set some specific timeouts to a test you just need to associate to this test one or
several of the following attributes (with a value for the timeout):
When manually executed, each test will be applied with its specific timeout values.
Note also that all these attributes are dynamic. This means that when creating the campaign
session, you can overwrite any of these default values (on each test if necessary) to customize one
campaign session execution:
6.1.1.4 Tutorial: Creating and executing manual tests
In this tutorial we will learn how to setup from scratch a manual test suite, execute it and analyze
the results.
• fill the Details tab with the index (defining the order in which the testcases will be executed
within a test), name (i.e. “first test case”) and general description of the testcase. You can
use the formatting tools (wiki-style) to format the text. Later, in reports, this text will appear
correctly formatted. Do not forget to check the Implemented checkbox. Non-implemented
tests are NOT executed when running a campaign. In case of manual tests, always check the
Implemented checkbox.
• select the Testplan tab and define all the steps and checks that will be needed in this
testcase:
• select the Testplan tab and define all the steps and checks that will be needed in this
testcase:
o add a step:
in the tree, select the root folder
click on the create step button
enter the description of the step and submit
o add parameters (opt.):
in the tree, select the parameters node
click on the create parameter button
enter the description of the parameter and submit
repeat the operation if you need to specify more parameters
o add checks (opt.):
in the tree, select the checks node
click on one of the create boolean operators buttons
click on the new operator and click on the create check button
enter the description of the check and submit
repeat the operation if you need to specify more checks (you can mix as
many different boolean operators as you want)
o repeat the operation for every step and submit
• click on submit
• immediately, the testcase appears in the sub-tree on the left panel
You can create other testcases and other tests using the same procedure.
• In the test tree, select the newly created test (i.e. “basic test”) (not the testcase!) and
press the "start" button
• Check in the Agent tab of the pop up window that your PC is selected by default
• Select the "My browser 1.0" in the SUT tree
• Select a pre-existing configuration in the Configuration tab
• If no configuration is available:
o click on the create configuration button
o a dialog box is displayed
o enter the name of the configuration
o fill in all the forms displayed and submit
• Press the Submit button
Some popup dialog boxes will ask the operator to execute some operations and/or verify some
assertions.
• When the test is finished, you can control the results in the campaign tree (a new
campaign and campaign session have been automatically created in the campaign tree)
• In the content tab, you can select a test, then a testcase (in the dynamic sub-tree) and
check the complete log of the execution
To create a campaign:
• in the tree, select a folder (create one if necessary)
• on the right panel, click on the create campaign icon
• a dialog box including two tabs is displayed
• fill the Details tab with the name of the campaign
• in the Content tab, select all the tests you want to be part of this campaign
• click on submit
• immediately, the campaign appears in the tree
Refer to the manual launcher section for the configuration and tutorial.
6.1.3.1 Configuration
The autoit.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path: This must indicate where all the .au3 scripts are located.
This is a root path. Each test in XStudio has a canonical
path that will be appended to this path.
This path MUST NOT include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.3.2 Requirements
1) Each test in XStudio must have its dedicated .au3 script. The name of the script MUST be
equal to the name of the test.
2) The .au3 script must be able to parse the argument testcaseIndex passed during interpretation.
This allows the script to execute different routines depending on the testcase index.
The interpreter is executed by the launcher using this syntax:
AutoIt3.exe <testRootPath>/<testPath>/<testName>.au3 /debug /testcaseIndex=<testcaseIndex>
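Putting the configuration and the invocation together, the launcher presumably assembles the command line as follows (an illustrative Python sketch; the real launcher is a Java .jar and may differ):

```python
# Build the AutoIt interpreter command from the configured root path,
# the test's canonical path and its name. Illustrative sketch only.
def autoit_command(test_root_path, test_path, test_name, testcase_index):
    # The configured root path must not end with a slash (see above).
    script = f"{test_root_path}/{test_path}/{test_name}.au3"
    return f"AutoIt3.exe {script} /debug /testcaseIndex={testcase_index}"
```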
4) The .au3 script must generate a log.txt file during its execution. This file MUST describe all the
actions performed by the test as well as the result of each action. This file will be parsed by the
launcher and all the information will be passed/stored automatically in the XStudio database. The
log.txt MUST respect a specific format: each line MUST include the string "[Success]",
"[Failure]" or "[Log]", or the line will not be treated. Based on this information, the testcase will be
flagged as passed or failed.
6.1.3.3 Tutorial: Creating and executing AutoIt tests
In this tutorial, we will learn to run some AutoIt test scripts.
6.1.3.3.1 Prerequisites
Install AutoIt in the folder C:\Program files\AutoIt3
#include <File.au3>
$RESULT_FILENAME = "Log.txt"
FileDelete($RESULT_FILENAME)
$FLAG_FILENAME = "test_completed.txt"
FileDelete($FLAG_FILENAME)
; log a success
logFileAndConsoleWrite($RESULT_FILENAME, "[Log] first message")
logFileAndConsoleWrite($RESULT_FILENAME, "[Log] second message")
logFileAndConsoleWrite($RESULT_FILENAME, "[Success] testcase succeeded")
;logFileAndConsoleWrite($RESULT_FILENAME, "[Failure] testcase failed")
6.1.4.1 Configuration
The perl.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path: This must indicate where all the .pl scripts are located.
This is a root path. Each test in XStudio has a canonical
path that will be appended to this path.
This path MUST NOT include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.4.2 Requirements
1) Each test in XStudio must have its dedicated .pl script. The name of the script MUST be equal
to the name of the test.
2) The .pl script must be able to parse the argument testcaseIndex passed during interpretation.
interpretat
This allows the script to execute different routines depending on the testcase index.
The interpreter is executed by the launcher using this syntax:
perl.exe <testRootPath>/<testPath>/<testName>.pl /debug /testcaseIndex=<testcaseIndex>
3) When the .pl script has executed all its actions, it MUST create an empty test_completed.txt file.
Indeed, the executions of the Perl scripts are asynchronous. This mechanism allows the launcher
to know when the test is completed. A timeout of 10 minutes is predefined. If the .pl script did not
create the test_completed.txt file within the first 10 minutes, then the launcher considers the test has
crashed and skips it.
4) The .pl script must generate a log.txt during its execution. This file MUST describe all the actions
performed by the test as well as the result of each action. This file will be parsed by the launcher
and all the information will be passed/stored automatically in the XStudio database. The log.txt
MUST respect a specific format: each line MUST include the strings "[Success]", "[Failure]" or
"[Log]" or the line will not be treated. Based on this information, the testcase will be flagged as
passed or failed.
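The manual's own example for this launcher is a Perl script; purely as an illustration of the same contract (testcaseIndex argument, log.txt format, test_completed.txt marker), here is a minimal sketch in Python. The routine body and file names other than log.txt and test_completed.txt are placeholders, not part of XStudio's API:

```python
import sys

def run_testcase(index):
    # Hypothetical placeholder: a real test script would perform its
    # actions here and report (ok, message) for each of them.
    return [(True, "action for testcase %d completed" % index)]

def main():
    # Requirement 2: parse the testcaseIndex argument (/testcaseIndex=<n>)
    index = 0
    for arg in sys.argv[1:]:
        if arg.startswith("/testcaseIndex="):
            index = int(arg.split("=", 1)[1])

    # Requirement 4: every meaningful log.txt line must contain
    # "[Success]", "[Failure]" or "[Log]"
    with open("log.txt", "w") as log:
        log.write("[Log] starting testcase %d\n" % index)
        for ok, message in run_testcase(index):
            log.write("[%s] %s\n" % ("Success" if ok else "Failure", message))

    # Requirement 3: signal completion with an empty test_completed.txt
    open("test_completed.txt", "w").close()

if __name__ == "__main__":
    main()
```

A real Perl test script would follow the same three steps in the same order.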
6.1.5 TCL Launcher (tcl.jar)
The TCL launcher allows interfacing with TCL (.tcl) scripts.
It has been tested with TCL 8.5.
6.1.5.1 Configuration
The tcl.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the .tcl scripts are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.5.2 Requirements
1) Each test in XStudio must have its dedicated .tcl script. The name of the script MUST be equal
to the name of the test.
2) The .tcl script must be able to parse the argument testcaseIndex passed during interpretation.
This allows the script to execute different routines depending on the testcase index.
The interpreter is executed by the launcher using this syntax:
tclsh85.exe <testRootPath>/<testPath>/<testName>.tcl /debug /testcaseIndex=<testcaseIndex>
4) The .tcl script must generate a log.txt during its execution. This file MUST describe all the
actions performed by the test as well as the result of each action. This file will be parsed by the
launcher and all the information will be passed/stored automatically in the XStudio database. The
log.txt MUST respect a specific format: each line MUST include the strings "[Success]",
"[Failure]" or "[Log]" or the line will not be treated. Based on this information, the testcase will be
flagged as passed or failed.
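The flagging rule above can be illustrated with a small sketch. The manual does not spell out the precedence between markers, so this sketch assumes that any "[Failure]" line fails the testcase and that "[Log]" lines are informational only; the function name is hypothetical:

```python
def flag_testcase(log_lines):
    """Classify a testcase from its log.txt lines, per the launcher rules:
    lines without "[Success]", "[Failure]" or "[Log]" are not treated.
    Assumption (not stated in the manual): any [Failure] line fails the
    testcase; otherwise at least one [Success] line passes it."""
    saw_success = False
    for line in log_lines:
        if "[Failure]" in line:
            return "failed"
        if "[Success]" in line:
            saw_success = True
        # "[Log]" lines are informational and do not affect the verdict
    return "passed" if saw_success else "unknown"
```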
6.1.6 AutomatedQA® TestComplete v7 Launcher (testcomplete7.jar)
The AutomatedQA® TestComplete v7 launcher allows interfacing with TestComplete tests.
6.1.6.1 Configuration
The testcomplete7.xml file allows pre-configuring the launcher with some default values:
Parameter Description
TestComplete7 > TestComplete install path    This must indicate where all the TestComplete scripts are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.6.2 Requirements
The tests are executed by the launcher using this syntax:
The test will be marked as passed or failed depending on the return code of the execution.
The TestComplete xml log is also attached to the testcase execution in XStudio.
6.1.7 Compuware® TestPartner Launcher (testpartner.jar)
The Compuware® TestPartner launcher allows interfacing with TestPartner scripts.
6.1.7.1 Configuration
The testpartner.xml file allows pre-configuring the launcher with some default values:
Parameter Description
TestPartner > Test root path This must indicate where are located all the test scripts. This is
a root path. Each test in XStudio has a canonical path that will
be appended to this path.
This path MUST include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.7.2 Requirements
The tests are executed by the launcher using this syntax:
C:\Program Files\TestPartner\tc.exe -d <dsn> -u <username> -p <password> -r <projectName> -s <testRootPath>/<testName>
6.1.8.1 Configuration
The visualstudio.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where the tests are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.8.2 Requirements
The tests are executed by the launcher using this syntax:
<runnerPath> /testcontainer:<libraryName> /test:<testPath>.<testName> /resultsfile:<resultsFilePath>
The test will be marked as passed or failed depending on the log file generated by VisualStudio.
The xml file is parsed by the launcher. The xml log and the execution trace of the command are
also attached to the testcase execution in XStudio.
6.1.9 Beanshell Launcher (beanshell.jar)
The Beanshell launcher allows interfacing with beanshell test scripts.
It is provided with an additional module that first downloads and installs the product (but you can
easily remove this part and recompile the launcher).
It has been tested with Beanshell 2.0b4.
6.1.9.1 Configuration
The beanshell.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the Beanshell scripts are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.9.2 Requirements
The launcher starts by downloading a setup program and installing the product. The product
<installerTemplateName> (using the SUTVersion) is downloaded from <downloadServerURL> and
copied in <folderContainingtheInstallers>. Then the product is installed by running the executable
with the option /S (this assumes the installer is a NSIS installer).
1) Each test in XStudio must have its dedicated .bsh script. The name of the script MUST be
equal to the name of the test.
2) The .bsh script must be able to parse the testcaseIndex argument passed during interpretation.
This allows the script to execute different routines depending on the testcase index.
The interpreter is executed by the launcher using this syntax:
3) When the .bsh has executed all its actions, it MUST create an empty test_completed.txt file.
Indeed, the executions of the Beanshell scripts are asynchronous. This mechanism allows the launcher
to know when the test is completed.
4) The .bsh script must generate a log.txt during its execution. This file MUST describe all the
actions performed by the test as well as the result of each action. This file will be parsed by the
launcher and all the information will be passed/stored automatically in the XStudio database. The
log.txt MUST respect a specific format: each line MUST include the strings "[Success]",
"[Failure]" or "[Log]" or the line will not be treated. Based on this information, the testcase will be
flagged as passed or failed.
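The completion mechanism in requirement 3) amounts to polling for the test_completed.txt marker file until a timeout expires. A minimal sketch, assuming a simple polling loop (the function name and the poll interval are hypothetical; the 10-minute default comes from the Perl launcher section):

```python
import os
import time

def wait_for_completion(directory, timeout_s=600, poll_s=1.0):
    """Poll for the empty test_completed.txt marker file.

    Returns True if the test signalled completion within the timeout,
    False if the launcher should consider it crashed and skip it."""
    marker = os.path.join(directory, "test_completed.txt")
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if os.path.exists(marker):
            return True
        time.sleep(poll_s)
    return False
```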
6.1.10 Froglogic® Squish Launcher (squish.jar)
The Froglogic® Squish launcher allows interfacing with Squish tests.
It has been tested with Squish-3.4.4.
6.1.10.1 Configuration
The squish.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the Squish tests are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.10.2 Requirements
1) Each test in XStudio must have its dedicated Squish script. The name of the script MUST
be equal to tst_<testName>.
6.1.10.3 Tutorial: Creating and executing Squish tests
In this tutorial, we will learn to run some Squish test scripts. We will use Squish for Java but this
can be applied for any other Squish application (Java, Qt, Web, etc.).
6.1.10.3.1 Prerequisites
Install Squish for Java in the folder C:\tools\squish-3.4.4-java-win32.
Create a file utils.js in the folder C:\src\squish\lib with the following content:
function globalLog() {
    test.log("Global Log example", "Traces");
}
Using the Squish IDE, create a new test suite (using the JavaScript language, with no AUT
application) called suite_fake in C:\src\squish\testsuites.
function localLog() {
    test.log("Log example", "Some useful information");
    test.warning("Warning example", "Some warnings to highlight");
}
function main() {
    source(findFile("scripts", "utils.js"));
    source(findFile("scripts", "objects.js"));
    globalLog();
    localLog();
    test.compare(1, 1);
    test.compare(2, 2);
    test.compare(3, 4);
}
function main() {
    source(findFile("scripts", "utils.js"));
    source(findFile("scripts", "objects.js"));
    globalLog();
    localLog();
    test.compare(1, 1);
    test.compare(2, 2);
    test.compare(3, 3);
}
AUT =
CLASS =
CLASSPATH =
CWD =
ENVVARS = envvars
HOOK_SUB_PROCESSES = 1
LANGUAGE = JavaScript
NAMINGSCHEME = MULTIPROP
TEST_CASES = tst_success tst_failure
USE_WHITELIST = 1
WRAPPERS = Java
6.1.10.3.2 Create a dedicated category for Squish tests and create two tests
• create a category Squish associated to the launcher squish.jar
• under this category, create (somewhere in the tree) two tests with names success and
failure with a canonical path set to suite_fake.
Refer to JUnit v4 for the details as these 2 launchers are very similar.
6.1.12.1 Configuration
The junit4.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the JUnit tests are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.12.2 Requirements
The tests are executed by the launcher using this syntax:
<javaInstallPath>/bin/java.exe -classpath <junitJarPath>;<additionalClassPath>;<testRootPath>
org.junit.runner.JUnitCore <testPath>.<testName>
The test will be marked as passed or failed depending on the log file generated by JUnit. The text
file is parsed by the launcher. The log and the execution trace of the command are also attached to
the testcase execution in XStudio.
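As an illustration only, the command line above can be assembled like this (the helper name and paths are hypothetical; `os.pathsep` yields `;` on Windows, matching the separator shown in the manual):

```python
import os

def junit_command(java_install_path, junit_jar_path, additional_class_path,
                  test_root_path, test_path, test_name):
    """Build the argument list for running one JUnit 4 test class,
    following the syntax given in the manual."""
    classpath = os.pathsep.join(
        [junit_jar_path, additional_class_path, test_root_path])
    return [
        os.path.join(java_install_path, "bin", "java.exe"),
        "-classpath", classpath,
        "org.junit.runner.JUnitCore",
        "%s.%s" % (test_path, test_name),
    ]
```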
6.1.13.1 Configuration
The pyunit.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the PyUnit tests are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.13.2 Requirements
The tests are executed by the launcher using this syntax:
The test will be marked as passed or failed depending on the log file generated by PyUnit. The text
file is parsed by the launcher. The log and the execution trace of the command are also attached to
the testcase execution in XStudio.
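For reference, a PyUnit (unittest) test that such a launcher could run looks like this minimal, self-contained sketch; the class and method names are illustrative, not required by XStudio:

```python
import unittest

class ExampleTest(unittest.TestCase):
    """A minimal PyUnit test class; names are illustrative only."""

    def test_addition(self):
        self.assertEqual(1 + 1, 2)

    def test_string(self):
        self.assertTrue("XStudio".startswith("X"))
```

Saved as e.g. test_example.py, it can be run with `python -m unittest test_example`.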
6.1.14.1 Configuration
The nunit.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the NUnit tests are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.14.2 Requirements
The tests are executed by the launcher using this syntax:
<nunitConsolePath>/bin/net-<netVersion>/nunit-console.exe /run:<testPath>.<testName> <testRootPath>/<assemblyName>
The test will be marked as passed or failed depending on the log file generated by NUnit. The text
file is parsed by the launcher. The log and the execution trace of the command are also attached to
the testcase execution in XStudio.
6.1.15.1 Configuration
The testng.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the TestNG tests are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.15.2 Requirements
The tests are executed by the launcher using this syntax:
<javaInstallPath>/bin/java.exe -classpath <testNGJarPath>;<additionalClassPath>;<testRootPath>
org.testng.TestNG -testclass <testPath>.<testName>
6.1.15.3 Tutorial: Creating and executing TestNG tests
TODO
6.1.16 Jalian® Marathon Launcher (marathon.jar)
The Jalian® Marathon launcher allows interfacing with Marathon tests.
It has been tested with Marathon 1.2.1.1.
6.1.16.1 Configuration
The marathon.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the Marathon tests are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
6.1.16.2 Requirements
The tests are executed by the launcher using this syntax:
<javaInstallPath>/bin/java.exe -classpath <marathonClassPath> -Dmarathon.home="<marathonHome>" -Dpython.home="<marathonHome>" net.sourceforge.marathon.Main -batch -xml <testRootPath>/marathon_report.xml <testRootPath>
The test will be marked as passed or failed depending on the xml log file generated by Marathon.
The xml file is parsed by the launcher. The logs and the execution trace of the command are also
attached to the testcase execution in XStudio.
6.1.17.1 Configuration
The exe.xml file allows pre-configuring the launcher with some default values:
Parameter Description
General > Test root path    This must indicate where all the .exe files are located. This is a root path. Each test in XStudio has a canonical path that will be appended to this path. This path MUST not include an ending slash.
These values can be changed while creating the campaign session from XStudio.
6.1.17.2 Requirements
1) Each test in XStudio must have its dedicated .exe file. The name of the executable MUST be
equal to the name of the test.
2) The .exe file must be able to parse the argument testcaseIndex passed during execution. This
allows executing different routines depending on the testcase index.
3) In asynchronous mode, when the .exe has executed all its actions, it MUST create an empty
test_completed.txt file. This mechanism allows the launcher to know when the test is completed.
A timeout is predefined for this. If the executable did not create the test_completed.txt file within
the timeout value, then the launcher considers the test has crashed and skips it.
4) In synchronous mode, the returned code is used to determine if the test passed or failed: a
returned code equal to 0 will be understood as a success, any other value will be interpreted as a
failure.
5) In asynchronous mode, the executable must generate a log.txt during its execution. This file
MUST describe all the actions performed by the test as well as the result of each action. This file
will be parsed by the launcher and all the information will be passed/stored automatically in the
XStudio database. The log.txt MUST respect a specific format: each line MUST include the
strings "[Success]", "[Failure]" or "[Log]" or the line will not be treated. Based on this information,
the testcase will be flagged as passed or failed.
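The synchronous-mode rule in 4) can be sketched as follows; the helper name is hypothetical, and a real launcher would also capture the executable's output:

```python
import subprocess

def run_synchronous(command):
    """Run a test executable synchronously and flag it per the rule above:
    a return code of 0 means success, any other value means failure."""
    exit_code = subprocess.call(command)
    return "passed" if exit_code == 0 else "failed"
```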
6.1.18 Simulation Random Launcher (random.jar)
The random launcher is for demo purposes only. It basically simulates a time-consuming operation
and generates random results.
With minimal effort, you can make all these tests (maybe several thousands of tests)
manageable by XStudio. To do this, you just have to develop 2 launchers (10 lines of code each).
[Diagram: custom tests stored in the XStudio database, executed through a custom launcher driving custom executables]
public void run(int testId, String testPath, String testName, int testcaseIndex) {
    String[] command = new String[] {testPath + testName};
    Runtime run = Runtime.getRuntime();
    Process p = run.exec(command);
    int exitValue = p.waitFor();
}
[Diagram: custom tests executed through a class launcher driving Java class files]
Of course, the launcher interface/API is wider than just a run() method but the run() method is the
principal one.
I'm currently working on the Developer's Guide that will explain in detail how to develop your own
launcher.
7. Memory Profiling
The application is regularly profiled to check any possible memory leak. This is to maintain a good
robustness of the application. Latest measurements showed the following results:
XStudio needs between 20 and 150 Mbytes to run smoothly (less than Firefox!).
XAgent needs between 5 and 10 Mbytes to run smoothly.
Figures
Figure 1 - User Tree .......... 9
Figure 2 - Calendar Monthly View .......... 12
Figure 3 - Calendar Half-Yearly View .......... 13
Figure 4 - SUT Tree .......... 14
Figure 5 - Agent Tree .......... 16
Figure 6 - Requirement Tree .......... 18
Figure 7 - Specification Tree .......... 21
Figure 8 - Project Tree .......... 23
Figure 9 - Test Tree .......... 29
Figure 10 - Campaign Tree .......... 32
Figure 11 - Defect Tree .......... 39
Figure 12 - Custom Launchers .......... 107
Tables
Table 1 - Entities .......... 7
Table 2 - Tree Buttons .......... 7
Table 3 - Overlay Icons .......... 8
Table 4 - Requirements Status .......... 20
Table 5 - Specification Status .......... 22
Table 6 - Sprint Status .......... 25
Table 7 - Defect Status .......... 40
Table 8 - Unique Identifier Templates .......... 63
Table 9 - XStudio's Default Launchers .......... 77
Acronyms
Acronym Meaning
SUT System Under Test