sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +===================================================+
         +=======    Quality Techniques Newsletter    =======+
         +=======            February 2005            =======+
         +===================================================+

QUALITY TECHNIQUES NEWSLETTER (QTN) is E-mailed monthly to
subscribers worldwide to support the Software Research, Inc. (SR),
eValid, and TestWorks user communities, and to provide other
interested parties with information of general use to the worldwide
internet and software quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged, provided that the entire QTN
document/file is kept intact and this complete copyright notice
appears in all copies.  Information on how to subscribe or
unsubscribe is at the end of this issue.  (c) Copyright 2005 by
Software Research, Inc.

========================================================================

                       Contents of This Issue

   o  Real vs. Synthetic Web Performance Measurements: A Comparative
      Study, by John Bartlett & Peter Sevcik

   o  SQRL Reports Available

   o  International Journal of Information Technology and Web
      Engineering

   o  DCTester Portal Website

   o  eValid: Latest News and Updates

   o  ROI Dashboard Offered

   o  Mini-Survey on Test Training, by Franco Martinig

   o  QTN Article Submittal, Subscription Information

========================================================================

Real vs. Synthetic Web Performance Measurements: A Comparative Study
                                 by
                    John Bartlett & Peter Sevcik

      Summary:  This new study finds that synthetic Web
      performance tests give organizations a false picture of the
      user experience.  Specifically, synthetic measurements
      frequently understate the actual response time seen by real
      users by as much as 50% and overstate the response time of
      other applications by as much as 350%.

                         Executive Summary

With respect to accuracy, our analysis shows that synthetic
measurements randomly and frequently understate the actual response
time seen by real users by as much as 50% and overstate the response
time of other applications by as much as 350%.  These error margins
are much too broad to serve as the basis for infrastructure
management or SLA enforcement.  When comparing the results of real
versus synthetic measurements over time, we also found poor
correlation of response time trends.  Establishing accurate
baselines is an essential first step in proactive detection of
service level degradation or traffic anomalies.
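
To make those error margins concrete, here is a small worked example
(a sketch with illustrative numbers of our own, not data from the
study):

    # Illustrative numbers only, not measurements from the study.
    def percent_error(synthetic, real):
        """Signed error of a synthetic reading relative to real users."""
        return (synthetic - real) / real * 100.0

    # A probe reporting 2.0 sec when real users see 4.0 sec
    # understates response time by 50%.
    print(percent_error(2.0, 4.0))   # -50.0

    # A probe reporting 9.0 sec when real users see 2.0 sec
    # overstates it by 350%.
    print(percent_error(9.0, 2.0))   # 350.0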

With respect to coverage, our analysis shows that synthetic agents
usually interact with less than 10% of total website URLs and less
than 15% of the ISPs connecting real users to the site. It is
dangerous to extrapolate the performance characteristics of an
entire application environment and its user communities from this
small sample.  In one site analyzed, we found that 190 synthetic
agents were used to represent 1.7 million total website visitors.

The principal risk of using synthetic measurement for service level
management is that service problems can impact an unknown number of
users in unknown locations, with an immediate and negative impact on
the business.  This can leave the IT organization without the data
it needs to detect and analyze service problems and take the
necessary corrective actions to restore service.

                        Permission Statement

You may excerpt small parts of this report as long as you add the
following citation:  Source: NetForecast Report 5077:
                     http://www.netforecast.com

========================================================================

                         SQRL Report No. 26

         Inspection of Software with Incomplete Description

                       By:  Michael Sharygin

         http://www.cas.mcmaster.ca/sqrl/sqrl_reports.html

Abstract:  The software quality problem has become increasingly
visible, as software has emerged as the dominant factor in
determining system costs.  Software inspection, or static analysis
of software artifacts, is commonly recognized as an efficient way of
finding defects.  Unfortunately, in software inspection we often
face the problem of non-existent, unsatisfactory, or inconsistent
software documentation.

This thesis investigates existing inspection methods and derives a
generic inspection process for assuring quality and for evaluating
and specifying post-release software products whose underlying
description is incomplete.  The inspection strategy refers to a set
of scenarios and to the associated comparison scheme to be used for
diagnosing software defects.

To be rigorous and systematic, the process of inspecting software
with an incomplete description needs an evolvable base of software
artifact analysis, in order to avoid a disparate understanding of
the software system under study.  To realize this, a generic process
within the suggested framework employs a lifecycle-centric approach
to software inspection.  To facilitate the process, we propose
stepwise-abstraction system recovery while reverse engineering the
system.

We also introduce an experiment investigating the development of a
rigorous approach to effective software inspection.

                         SQRL Report No. 25

Use of Tabular Expressions in the Inspection of Concurrent Programs

                         By:  Xiao-Hui Jin

         http://www.cas.mcmaster.ca/sqrl/sqrl_reports.html

Abstract:  This thesis presents a systematic, rigorous inspection
approach for concurrent programs.  The approach has been
successfully applied to a classic concurrent program, the
Readers/Writers problem.

In the inspection process, we rewrite the concurrent program by
assigning each primitive statement a label; the transfer of control
from statement to statement is made explicit. Auxiliary variables
are used to record extra information for inspection without
affecting the original intent of the program. The resulting program
is a non-deterministic sequential program with the same behavioral
effect as the original concurrent program. The rewritten program is
then examined by checking the truth-value of a system invariant that
fully captures the program structure.  A decreasing quantity defined
over the program states is also used to show clean completion of the
program.
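
As a rough illustration of this rewriting step (a minimal sketch of
ours, not code from the thesis), the fragment below runs two labeled
"reader" threads as one non-deterministic sequential program and
checks a system invariant after every statement:

    import random

    # Sketch: two reader threads, each with two labeled statements.
    # 'pc' holds each thread's current label, making the transfer of
    # control from statement to statement explicit.
    state = {"readers": 0, "pc": {"T1": "L1", "T2": "L1"}}

    def step(tid, s):
        """Execute the single labeled statement thread tid is at."""
        if s["pc"][tid] == "L1":      # L1: enter -> readers += 1
            s["readers"] += 1
            s["pc"][tid] = "L2"
        elif s["pc"][tid] == "L2":    # L2: exit  -> readers -= 1
            s["readers"] -= 1
            s["pc"][tid] = "DONE"

    def invariant(s):
        # The invariant to inspect: the reader count stays in bounds.
        return 0 <= s["readers"] <= 2

    # Non-deterministic sequential execution: repeatedly pick any
    # unfinished thread and run its next labeled statement.
    while True:
        enabled = [t for t, pc in state["pc"].items() if pc != "DONE"]
        if not enabled:
            break
        step(random.choice(enabled), state)
        assert invariant(state), state

    print("clean completion:", state)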

We use tabular expressions (program-function tables) to describe the
function of the program.  Each column in the table is inspected
individually, so the program is divided into small components that
can be conquered with ease.  The correctness of the whole program
follows from the correctness of the columns examined through the
inspection.

========================================================================

                      International Journal of
             Information Technology and Web Engineering
                 (http://www.idea-group.com/ijitwe)

                   Inaugural Issue: January 2006
                 Submission Deadline April 10, 2005

                   Editor-in-Chief: David C. Rine
                   Professor of Computer Science
                      George Mason University
                        DavidCRine@aol.com

               Co-Editor-in-Chief: Ghazi I. Alkhatib
                       Senior Lecturer of MIS
                    Qatar College of Technology
                    alkhatib.JITWENG@qu.edu.qa

An official publication of the Information Resources Management
Association.  Published quarterly (Print ISSN: 1554-1045; Electronic
ISSN: 1554-1053).

Organizations are continuously overwhelmed by a variety of new
information technologies, many of them Web-based.  These new
technologies capitalize on the widespread use of network and
communication technologies for seamless integration of various
issues in information and knowledge sharing within and among
organizations.  This emphasis on integrated approaches is unique to
this journal and dictates a cross-platform and multidisciplinary
strategy for research and practice.

                      The Mission of the Journal

The main objective of the journal is to publish refereed papers in
the area covering Information Technology (IT) concepts, tools,
methodologies, and ethnography, in the contexts of global
communication systems and Web engineered applications.  In
accordance with this emphasis on the Web and communication systems,
the journal publishes papers on IT research and practice that
support seamless end-to-end information and knowledge flow among
individuals, teams, and organizations.

This end-to-end strategy for research and practice requires emphasis
on integrated research among the various steps involved in
data/knowledge (structured and unstructured) capture (manual or
automated), classification and clustering, storage, analysis,
synthesis, dissemination, display, consumption, and feedback.  The
secondary objective is to assist in the evolution and maturation of
IT-dependent organizations, as well as individuals, in an
information and knowledge based culture and commerce, including
e-commerce.

                              Coverage

Topics to be included are the following.  Please refer to the web
site for a more detailed list of topics.

  * Web systems architectures, including distributed, grid
    computing, and communication systems processing
  * Web systems design and performance engineering studies
  * Web user interfaces design, development, and usability
    engineering studies
  * RFID research and applications in web engineered systems
  * Mobile, location-aware, and ubiquitous computing
  * Ontology and semantic Web studies
  * Software agent-based systems for Web-engineered applications
  * Integrated user profile, provisioning, and context-based
    processing
  * Security, integrity, privacy and policy issues
  * Case studies validating Web-based IT solutions
  * Data and knowledge capture, quality, validation, and
    verification issues
  * IT readiness and technology transfer studies
  * IT Education and Training
  * Virtual teams and virtual enterprises: communication, policies,
    operation, creativity, and innovation

In addition to complete research articles, the journal will publish
book reviews and research notes on current, innovative, substantial,
and novel concepts and research areas, to foster collaborative
research and encourage further exploration and exploitation by
academicians and practitioners.  Both research and instructional
case studies of success stories of IT-augmented systems development
will be solicited from practitioners in business and government
organizations.

                             Publisher

The International Journal of Information Technology and Web
Engineering will be published by Idea Group Inc., publisher of the
"Idea Group Publishing", "Information Science Publishing", "IRM
Press", "Cybertech Publishing", and "Idea Group Reference" imprints.
The inaugural issue of this journal is due for publication in
January 2006.

========================================================================

                      DCTester Portal Website
                                 by
                            Jeff Rashka

              http://www.effectivesoftwaretesting.com/

The DCTester Portal website has been updated to reflect events in
the area for 2005.  Check out the seminars, tutorials and
conferences.  Also note that new jobs are posted on the website each
week.

Of special note, visit the site to learn about the free seminar on
23 February that addresses practices for testing across the entire
development life cycle to effectively validate application
functionality and performance.  The approach applies a business
context to the test management effort, where critical success and
high-risk functions are prioritized.

Also, obtain a free white paper, associated with the approach
addressed in the seminar, to learn how you can implement a risk-
based test approach that will enable you to build higher quality
applications while meeting deadlines. Visit the site below today to
read "Implementing a Risk-based Approach to Ensuring Quality".
Discover the benefits of risk-based testing and what you can do to
accurately identify and prioritize testing, while controlling costs
and resources.

      Free white paper: http://www.compuware.com/media.asp?cid

========================================================================

                   eValid: Latest News & Updates

eValid is the premier WebSite Quality Testing & Analysis Suite.
eValid solutions help organizations maintain e-Business presence,
improve WebSite quality and performance, reduce down time, prevent
customer loss, and control costs.

eValid's Web Analysis and Testing Suite is comprehensive, yet
scalable and easy to use, and applies to a wide range of web
applications.  Because it is built entirely inside an IE-compatible
full-featured browser, it guarantees 100% realistic user experience
results.

              New Variable/Value Manipulation Commands
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The latest revisions of eValid V5 include these new features:

* A new capability to provide a context to a script at playback
  time, via a new "EnvironmentVariableFile" command (a sketch of the
  idea appears after this list).  For details on how this command
  operates, see:

  http://www.soft.com/eValid/Products/Documentation.5/Technical/environment.variables.html

* A new selectable option for site analysis, including a Browse...
  file selection option, to permit you to choose where to store your
  site analysis filter reports.

  Assuming you have enough machine capacity, you can now launch
  multiple simultaneous eValid site analysis scans.

* Expansion of the currently available set of Reserved Variables
  that allow you to instantiate important fixed local values into an
  eValid test script at playback time.  See this description for
  details of how this works:

  http://www.soft.com/eValid/Products/Documentation.5/Technical/reserved.variables.html
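
As a rough sketch of what an environment-variable file mechanism
does (the key=value file format and ${NAME} placeholder syntax below
are our assumptions for illustration; see the documentation links
above for eValid's actual syntax):

    # Sketch only: file format and placeholder syntax are assumed.
    def load_context(path):
        """Read NAME=VALUE pairs, skipping blank and comment lines."""
        context = {}
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                name, _, value = line.partition("=")
                context[name.strip()] = value.strip()
        return context

    def expand(script_text, context):
        """Substitute ${NAME} placeholders with values from context."""
        for name, value in context.items():
            script_text = script_text.replace("${" + name + "}", value)
        return script_text

    # Same script, different context per playback run, e.g.:
    #   ctx = load_context("staging.vars")
    #   print(expand("open ${BASE_URL}/login", ctx))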

                         CeBit Show Details
                         ^^^^^^^^^^^^^^^^^^
eValid will be represented at the upcoming CeBit show in Hannover,
Germany, 10-16 March 2005.  If you are planning on being at CeBit,
please visit:

                  Business Consulting Saxony (BCS)
              http://www.sw-products.de/evalid_eng.htm

You'll find BCS people at CeBit Hall 4, Booth F-064.

German-speaking readers are invited to review the German Language
Brochure (PDF):

http://www.soft.com/eValid/Alliances/Germany/eValidFlyer.pdf

And don't forget to check the German language White Paper, "Qualität
von WebPräsenzen -- Eine Herausforderung" ("Quality of Web Presences
-- A Challenge").

You can read this entire paper in PDF format at:

http://www.soft.com/eValid/Alliances/Germany/WP.WebSiteQualityChallenge.pdf

            New Log Filtering Options Support Monitoring
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The latest builds of the eValid website test engine include two
capabilities that will be of interest to users who run eValid in
monitoring mode:

* Options via a new Message Log Filter that let you select which
  messages are put in the Message Log.  For details see:

  http://www.soft.com/eValid/Products/Documentation.5/Settings/project.log.filters.html

* A capability to create a Custom Log as a subset of the Event Log,
  based on a set of user-specified string matches (a sketch of the
  idea appears after this list).  For details see:

  http://www.soft.com/eValid/Products/Documentation.5/Settings/project.log.filters.html
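
The idea behind the Custom Log is easy to picture (a minimal sketch
of ours, not eValid's implementation):

    # Keep only the event-log lines that contain any of the
    # user-specified match strings.
    def custom_log(event_lines, matches):
        return [ln for ln in event_lines if any(m in ln for m in matches)]

    events = [
        "PAGE  http://example.com/        load complete",
        "ERROR http://example.com/x.gif   404 Not Found",
        "TIMER checkpoint A               1.25 sec",
    ]
    # A custom log holding only error and timing entries:
    for ln in custom_log(events, ["ERROR", "TIMER"]):
        print(ln)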

                        HTTP Error Reporting
                        ^^^^^^^^^^^^^^^^^^^^
A new capability for monitoring HTTP errors has been added to the
eValid playback engine.  Users can select to have HTTP errors
reported as WARNINGs or ERRORs.  In addition, detailed timing logs
generated by eValid now include the specific byte size and download
time of each page component separately.  For complete details see:

http://www.soft.com/eValid/Products/Documentation.5/Settings/project.log.filters.html
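
In outline, the severity selection works like this (a sketch under
our own assumptions, not eValid's actual reporting code):

    # Report any HTTP error (status >= 400) at the severity the
    # user selected: "WARNING" or "ERROR".
    def report_http(status, severity="WARNING"):
        if status >= 400:
            return f"{severity}: HTTP {status}"
        return None             # successful responses pass silently

    for code in (200, 404, 503):
        msg = report_http(code, severity="ERROR")
        if msg:
            print(msg)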

                     New Data Import Capability
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^
A new eValid capability now simplifies importing data via the
CallScript, GoScript, and _eValid commands, and via the -DATA
command line switch.

The new Value Extraction feature gives eValid users a simple way to
import complex settings for use in processing complex parametric
scripts.

For complete usage information please see:

http://www.soft.com/eValid/Products/Documentation.5/Generate/value.extract.html

                New 3D-SiteMap Capabilities, Options
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The 3D-SiteMap portrays page-to-page dependency information that the
eValid site analysis process obtains by systematically scanning a
WebSite and then analyzing the dependencies between all of the pages
it viewed.

A new release of the eValid 3D-SiteMap display engine is now
available.  The new additions to the 3D-SiteMap engine include:

o A capability to limit the displayed dependencies to an adjustable
  depth for children and/or parents of a chosen root node (see the
  sketch after this list).

o An option to show only the "immediate family" of a chosen root
  (base) URL.  This lets you see only the key parts of any 3D-
  SiteMap.
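
To picture the depth limit, here is a minimal sketch of ours (not
the 3D-SiteMap engine's code): a breadth-first walk that collects
only the pages within an adjustable distance of the chosen root.
Reversing the link direction gives the corresponding parent view.

    from collections import deque

    def family(graph, root, depth):
        """Pages reachable from root in at most 'depth' link hops."""
        seen, queue = {root}, deque([(root, 0)])
        while queue:
            node, d = queue.popleft()
            if d == depth:
                continue
            for child in graph.get(node, []):
                if child not in seen:
                    seen.add(child)
                    queue.append((child, d + 1))
        return seen

    site = {"/": ["/a", "/b"], "/a": ["/a1"], "/a1": ["/deep"]}
    print(family(site, "/", 1))   # depth 1: {'/', '/a', '/b'}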

You can see a complete working example along with the updated 3D-
SiteMap Summary Documentation at this page:

http://www.soft.com/eValid/Products/Documentation.5/Mapping/3Dsitemap.html

         Enhanced Adaptive Playback for ButtonClick Command
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Changes in the adaptive playback support have been made for eValid's
ButtonClick command.  The new implementation provides for increased
flexibility in how eValid adapts to a changed button name or changed
button location.  With this change, fewer tests will fail because of
inconsequential changes in the underlying website.  The improved
method is available to regular eValid users as part of their regular
maintenance subscription.

           Command Line Switches and Error Codes Expanded
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To support expanded use of eValid in monitoring applications and
other unattended operational roles, we have added a number of new
exit codes.  The complete Playback Error Codes documentation gives
all of the details; see:

http://www.soft.com/eValid/Products/Documentation.5/Technical/error.codes.html

Of special note is the addition of error/exit codes for the new
eV.Manager batch mode, which automatically repeats application of a
test suite a fixed number of times in immediate succession.

See the eV.Manager Command Line Switches description at:

http://www.soft.com/eValid/Products/Documentation.5/Technical/interface.html#eV.Manager
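
For unattended batch runs, the driving loop amounts to something
like this sketch (the command line shown is hypothetical; consult
the eV.Manager documentation above for the real switches):

    import subprocess

    def run_suite(cmd, repeats):
        """Run a test suite 'repeats' times; stop on a bad exit code."""
        for i in range(repeats):
            result = subprocess.run(cmd)
            if result.returncode != 0:
                print(f"run {i + 1} failed with exit code "
                      f"{result.returncode}; aborting")
                return result.returncode
        return 0

    # Hypothetical invocation, for illustration only:
    # run_suite(["evalid", "--batch", "suite.evs"], repeats=5)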

                 Product Download Location, Details
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Here is the URL for downloading eValid if you want to start [or re-
start] your evaluation:

http://www.soft.com/eValid/Products/Download.5/down.evalid.5.phtml?status=FORM

                   Contact Us With Your Questions
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
We welcome your questions about eValid and its applications.  We
promise a response to every question in ONE BUSINESS DAY if you use
the WebSite request form:

http://www.soft.com/eValid/Information/info.request.html

========================================================================

                       ROI Dashboard Offered

Finding accurate and reliable benefit data from software technical
and management improvements can be a hit or miss proposition.

Through its newly launched web-based ROI Dashboard at:

http://r.vresp.com/?TheDataandAnalysisCe/5d905f23d1/283909/254d27cbd2/79227cd

the Data & Analysis Center for Software (DACS) now offers a place
where the software community can find benefit data from software
technical and management improvements.

The ROI Dashboard augments and updates the DACS Report "A Business
Case for Software Process Improvement"

http://r.vresp.com/?TheDataandAnalysisCe/70891d5a0c/283909/254d27cbd2/79227cd

with the latest published data on benefits. The ROI Dashboard
graphically displays open and publicly available data and provides
standard statistical analysis of the data.

The data used in the ROI Dashboard was collected from open
literature:

    * Articles in Communications of the ACM, CrossTalk, the IBM
      Systems Journal, IEEE Transactions on Software Engineering,
      Computer, Software, the Journal of Systems and Software,
      Software Practice and Experience, and Software Process
      Improvement and Practice.

    * Papers presented at conferences, such as conferences sponsored
      by the Software Engineering Institute and Software Engineering
      Process Group (SEPG) conferences.

    * Data taken from technical reports.

    * Data collected by the DACS through surveys.

The DACS ROI Dashboard provides the user with capabilities to
examine the empirical distribution of the impact of software
technology improvements on the following variables:

    -Quality
    -ROI
    -Productivity

The processes/methodologies covered include:

    -SEI CMM/CMMI
    -SEI Team Software Process (TSP)
    -SEI Personal Software Process (PSP)
    -Inspections
    -Reuse
    -Cleanroom

========================================================================

                    Mini-Survey on Test Training
                                 by
                          Franco Martinig
                       Martinig & Associates
             Editor of the Methods and Tools Newsletter
                      www.methodsandtools.com

Our last poll question was: How many weeks of training on software
testing have you completed in your professional life?

                ---------------------------  -----
                None                           43%
                Less than one week             19%
                One week (5 days)               7%
                One to two weeks                7%
                Two weeks to one month          6%
                More than one month            18%
                ---------------------------  -----
                Number of participants:        240
                ---------------------------  -----

As you can see, a large majority of the participants (62%) received
little or no testing-related training from their employers.  I think
that this situation is typical of the importance given to the
testing phase in many software development projects.  The time is
often limited and there are few processes, tools and infrastructure
available to optimise the efforts of the developers.  The lack of
training is just another factor that limits the efficiency of
testing efforts.

At the other end of the spectrum, you see a significant percentage
with more than one month of training.  This result is probably
connected with the profile of M&T readers: around 20% of them work
in the software quality area, and people working in this area should
have more testing-related training than the average developer.

========================================================================

========================================================================
    ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------
========================================================================

QTN is E-mailed around the middle of each month to over 10,000
subscribers worldwide.  To have your event listed in an upcoming
issue, E-mail a complete description and full details of your Call
for Papers or Call for Participation to:
<http://www.soft.com/News/QTN-Online/subscribe.html>

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should
  provide at least a 1-month lead time from the QTN issue date.  For
  example, submission deadlines for "Calls for Papers" in the March
  issue of QTN On-Line should be for April and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items appearing in QTN represent the
opinions of their authors or submitters; QTN disclaims any
responsibility for their content.

TRADEMARKS:  eValid, HealthCheck, eValidation, InBrowser TestWorks,
STW, STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR,
eValid, and TestWorks logo are trademarks or registered trademarks
of Software Research, Inc. All other systems are either trademarks
or registered trademarks of their respective companies.

========================================================================
        -------->>> QTN SUBSCRIPTION INFORMATION <<<--------
========================================================================

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, to
CHANGE an address (an UNSUBSCRIBE and a SUBSCRIBE combined) please
use the convenient Subscribe/Unsubscribe facility at:

       <http://www.soft.com/News/QTN-Online/subscribe.html>.

               QUALITY TECHNIQUES NEWSLETTER
               Software Research, Inc.
               1663 Mission Street, Suite 400
               San Francisco, CA  94103  USA

               Phone:     +1 (415) 861-2800
               Toll Free: +1 (800) 942-SOFT (USA Only)
               FAX:       +1 (415) 861-9801
               Web:       <http://www.soft.com/News/QTN-Online>