                      sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +===================================================+
         +=======    Quality Techniques Newsletter    =======+
         +=======              March 2004             =======+
         +===================================================+

QUALITY TECHNIQUES NEWSLETTER (QTN) is E-mailed monthly to
subscribers worldwide to support the Software Research, Inc. (SR),
eValid, and TestWorks user communities, and to provide other
interested parties with information of general use to the worldwide
internet and software quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN, provided that the
entire document/file is kept intact and this complete copyright
notice appears in all copies.  Information on how to subscribe or
unsubscribe is at the end of this issue.  (c) Copyright 2004 by
Software Research, Inc.

========================================================================

                       Contents of This Issue

   o  2nd National Software Summit & Workshop Series

   o  Five Reasons Why People Don't Do Automated Testing

   o  Testing and Certification of Trustworthy Systems

   o  1st International Workshop on Integration of Testing
      Methodologies

   o  eValid: Latest News and Information

   o  Preventing the Internet Meltdown

   o  2nd European Workshop on Object Orientation and Web Services

   o  eValid: A Quick Summary

   o  Software Test Managers Roundtable: Working Effectively with
      Outsourcing

   o  QTN Article Submittal, Subscription Information

========================================================================

           2nd National Software Summit & Workshop Series

          Senior Executives, Legislators and Academics to
        Consider Global Outsourcing, R&D and Competitiveness
              in Creation of National Software Agenda

                    <http://www.cnsoftware.org>.

The second National Software Summit (NSS2), themed "Software: The
Critical Infrastructure Within the Critical Infrastructures!", will
convene in Washington, D.C., May 10-12 at the J.W. Marriott hotel.
NSS2 will gather an invited group of senior software and technology
leaders from industry, academia and government to address growing
concerns over the role of software and the software industry in our
nation today.

The objective of this summit meeting is to develop findings and
recommendations leading to the creation of a national software
public policy agenda, as well as a follow-on action plan.

Phillip Bond, United States Undersecretary of Commerce for
Technology, will keynote the summit at a dinner address the evening
of May 10th. Additional keynotes the following morning will be
delivered by Amit Yoran, National Cybersecurity "Czar" with the
United States Department of Homeland Security; Dr. Alan Merten,
president of George Mason University; and John Chen, chairman,
president and CEO of Sybase, Inc. Sybase software is widely deployed
within corporate and government infrastructures, particularly in the
financial services, government, defense and telecommunications
sectors.  Dr. William A. Wulf, president of the National Academy of
Engineering, will also address the summit participants at their May
11th luncheon.

"The first software summit was held in 1995 with Steve Case, then
CEO of AOL, and Arati Prabhakar, then Director of NIST, delivering
the keynote addresses," said Dr. Alan Salisbury, President of the
CNSS and General Chairman for NSS2.  "In a post-9/11 world, it's
particularly important that we elevate the attention given to the
primacy of software in maintaining both national security and global
economic leadership, and to advance a national software agenda. NSS2
will start this process."

As the theme of NSS2 indicates, software underpins virtually all of
the nation's critical infrastructures, but policy on this critical
part of our economy is ill-defined.  Issues of major concern that
will be addressed at the conference include the trustworthiness of
software, the education and qualifications of the software
workforce, the adequacy of current software research and
development, the impacts of outsourcing, and the competitiveness and
stability of the nation's software industry.

To frame these issues, NSS2 is being preceded by a series of four
one-day workshops:

  * Trustworthy Software Systems, co-chaired by Dr. Jeff Voas, CTO
    of Cigital; Rick Linger of the CMU Software Engineering
    Institute; and Prof. Bret Michael of the Naval Postgraduate
    School.

  * The Software Workforce, chaired by Harris Miller, CEO of the
    Information Technology Association of America (ITAA).

  * Software Research & Development, co-chaired by Prof. Bill
    Scherlis of Carnegie-Mellon University; Dr. Larry Druffel, CEO of
    the South Carolina Research Authority (SCRA); and Tony Jordano,
    corporate vice president of SAIC.

  * The Software Industry, chaired by Jim Kane, CEO of the Software
    Productivity Consortium (SPC).

NSS2 is being hosted by the Center for National Software Studies
(CNSS) in cooperation with a growing list of organizations, including
the Council on Competitiveness, the Information Technology
Association of America (ITAA), the Software Productivity Consortium
(SPC), and the IEEE Reliability Society.  Participating universities
include George Mason University, George Washington University,
Carnegie Mellon University and the Naval Postgraduate School.

Additional information on NSS2 and the workshop series can be found
at <http://www.cnsoftware.org/nss2>.

About the Center for National Software Studies

Headquartered in Washington D.C., the Center for National Software
Studies is a not-for-profit organization whose mission is to elevate
software to the national agenda, and to provide objective expertise,
studies and recommendations on national software issues.  More
information about the Center is available at
<http://www.cnsoftware.org>.

Contact:  Dr. Alan B. Salisbury, President, Center for National
Software Studies, (703) 319-2187.


========================================================================

         Five Reasons Why People Don't Do Automated Testing

eValid is the industry-leading provider of Web quality assurance,
functional testing, and load testing solutions.  In the course of our
work we have run across many developers and QA pros for whom
automated testing has been a real blessing.  It has enabled them to
deploy high quality, critical business applications on time and with
high confidence.

We've also run into people who claim to have no use for automated
testing.  We suspect they've been burned by old-school client/server
testing tools, which, sorry to say, have earned a reputation as
money pits.  Many of these tools sat on a shelf gathering dust while
shell-shocked users retreated to manual testing methods.

So, with apologies to various pundits, eValid presents the "Top Five
Bogus Excuses for Not Using Automated Testing Tools."

          Reason 1. I don't have time for formal testing --
                     our schedule is too tight!

This is an oldie but goodie.  Has there ever been a development
project that went according to schedule?  Does anyone know of a
project where QA time wasn't cut at least in half in order to meet
the deployment timeline?  Automated testing can actually help a
great deal in this scenario, enabling you to perform more tests (and
more types of tests) in less time.  Put another way, automated
testing tools let you spend less time scripting and more time
testing.

                   Reason 2. It's too expensive!

Don't let a few bad apples spoil the whole bunch.  Not all automated
testing tools are overpriced, and not all vendors are looking to
gouge you. eValid testing tool licenses start at less than $1,000
and our hosted load testing solutions can bring down the cost even
more while eliminating the need for you to develop a load-testing
infrastructure.  And don't even get us started about the cost of
unplanned application downtime.  A leading technology research group
estimates that an online retailer will lose nearly $60,000 an hour
if a critical application goes down.  An online brokerage will lose
$537,000 for just five minutes of downtime.  Given those figures,
doesn't it make sense to fix potential problems before they lead to
downtime?

                 Reason 3. They're too hard to use!

We know you've been hurt before. Legacy testing tools, most of which
were originally developed for client/server environments, can be a
bear to use.  Some even require proprietary scripting languages.
But a new class of Web-based testing solutions enables you to
create very complex scripts, with no programming, in just minutes.
If it's been a while since you evaluated testing tools (say, more
than two years), it would be worth your while to see what's out
there now.

               Reason 4. We load tested it manually!

We'll try to break this to you gently -- you can't load test
applications manually unless your expected load is smaller than your
development team, and you can duplicate the production environment
in your office.  Companies have actually said to us, "We're all set
-- we all stayed late one night and logged on to the application
simultaneously -- it worked fine!"  Chances are, your application
will find its real-world load a little more taxing.  Automated load
testing is the only way to see how your application will truly hold
up under a variety of load scenarios.
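
To make the point concrete, here is a minimal sketch of automated
load generation, using only the Python standard library.  The target
URL and user count are hypothetical placeholders, and this is not
eValid's implementation -- a real load tool drives far richer
sessions than this single-page fetch:

    # Minimal load-generation sketch; standard library only.
    # URL and USERS are hypothetical placeholders.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://www.example.com/"   # application under test
    USERS = 50                        # simulated concurrent users

    def one_session(user_id):
        """One simulated user: fetch the page, return response time."""
        start = time.time()
        urllib.request.urlopen(URL).read()
        return time.time() - start

    with ThreadPoolExecutor(max_workers=USERS) as pool:
        times = list(pool.map(one_session, range(USERS)))

    print("%d users: avg %.2fs, worst %.2fs"
          % (USERS, sum(times) / len(times), max(times)))

Even a toy like this exposes what a late-night logon party cannot:
response times under genuinely simultaneous, repeatable load.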

                 Reason 5. We were very careful --
                       there aren't any bugs!

This is our favorite of them all.  No developer likes to think
there could be any problems with his or her application -- but the
fact is, eValid has helped thousands of companies test their
applications, and we have NEVER been through a test that didn't find
at least one problem.  More often than not, we find several major
ones.

One of our consultants has observed a particular psychological
phenomenon within teams that trot out excuse number five.  It's
modeled after the better-known "Four Phases of Grief," and we
call it the "Four Phases of Software Testing."

The Four Phases of Software Testing are:

   >> Over-Confidence: "You're just here to verify that my
      application is perfect."

   >> Denial: "You're not testing it right -- there's nothing wrong
      with that feature. Try it again."

   >> Pleading: "Oh no! Can you help me fix it?"

   >> Acceptance: "OK, now we know how to avoid that situation next
      time. What are we testing next?"

By the time they reach Acceptance, most people have converted.  Or
so one would hope!

========================================================================

          Testing and Certification of Trustworthy Systems

               Part of the Software Technology Track

                        Thirty-eighth Annual
         HAWAII INTERNATIONAL CONFERENCE ON SYSTEM SCIENCES

                        January 3 - 6, 2005
                   Hilton Waikoloa Village Resort
                    on the Big Island of Hawaii

                   <http://www.hicss.hawaii.edu>

DESCRIPTION

The specification, development, and certification of trustworthy
computing systems hold great research challenges.  Modern society is
increasingly dependent on large-scale systems for operating its
critical infrastructures, such as transportation, communication,
finance, healthcare, energy distribution, and aerospace.  As a
result, the consequences of failures are becoming increasingly
severe.  These systems are characterized by heterogeneous
distributed computing, high-speed networks, and extensive
combinatorial complexity of asynchronous behavior.  Effective
methods for testing and certification of trustworthy systems are in
great demand.  This minitrack provides a venue for research results
and will contribute to their practical application in the software
systems of the future.

The minitrack focuses on advanced techniques for testing and
certification of trustworthy systems.  The following topics
represent potential research areas of interest:

  * New techniques for testing and certification of software
    systems
  * Testing and certification metrics
  * Testing trustworthiness attributes such as reliability,
    security, and survivability
  * End-to-end integration testing methods and tools
  * Test case generation
  * Existence and correctness of testing oracles
  * Object-oriented testing methods and tools
  * Integrating quality attributes into testing and certification
  * Engineering practices for testing and certification
  * Automated tools for testing and certification support
  * Testing in system maintenance and evolution
  * Specification methods to support testing in system
    certification
  * Roles and techniques for correctness verification in
    system certification
  * Industrial case studies in testing and certification
  * Technology transfer of testing and certification techniques

                        CONTACT INFORMATION

  * Richard C. Linger, Software Engineering Institute, Carnegie
    Mellon University, 4500 Fifth Avenue, Pittsburgh, PA 15213.
    Phone: (301) 926-4858 E-mail: rlinger@sei.cmu.edu

  * Alan R. Hevner, Information Systems & Decision Sciences, College
    of Business Administration, University of South Florida, 4202
    East Fowler Ave., CIS1040, Tampa, FL 33620. Phone: (813)
    974-6753 E-mail: ahevner@coba.usf.edu

  * Gwendolyn H. Walton, Dept. of Mathematics & Computer Science,
    Florida Southern College, 111 Lake Hollingsworth Dr, PS Bldg
    Room 214, Lakeland, FL 33801. Phone: (863) 680-6283 E-mail:
    gwalton@flsouthern.edu

CONFERENCE ADMINISTRATION

Ralph Sprague, Conference Chair
Email:  sprague@hawaii.edu

Sandra Laney, Conference Administrator
Email:  hicss@hawaii.edu

Eileen Robichaud Dennis, Track Administrator
Email: eidennis@indiana.edu

2005 CONFERENCE VENUE

Hilton Waikoloa Village (on the Big Island of Hawaii)
425 Waikoloa Beach Drive
Waikoloa, Hawaii 96738
Tel: 1-808-886-1234
Fax: 1-808-886-2900


========================================================================

                     1st International Workshop
              on Integration of Testing Methodologies

           <http://antares.sip.ucm.es/~forte2004/itm2004>

                  1-2 October, 2004, Toledo, SPAIN

                      in conjunction with the
             24th IFIP WG 6.1 International Conference
     on Formal Techniques for Networked and Distributed Systems
                 <http://antares.sip.ucm.es/~forte2004>


                             Motivation

The growing complexity of present-day systems, as well as the
critical application areas in which they are used, has made it more
difficult to avoid introducing the possibility of catastrophic
failures.  Testing, the process of checking that a system possesses
a set of desired properties and behaviors, has become an integral
part of innovation, production and operation of systems in order to
reduce the risk of failures and to guarantee the quality and
reliability of the software used.

Even though testing activities are considered to be rather
important, we have to overcome the problem that different testing
communities use different methods. We may roughly identify two
testing communities:  Testing of software and testing of
communicating systems. Until very recently, research had been
carried out with almost no interactions between these two
communities, although they have complementary know-how.  Thus,
there is an urgent need to find a synthesis between the different
techniques developed in isolation by each community.

Further, the existing techniques and tools for testing are not
adequately applied, or even known, in industry.  It is necessary to
transfer know-how, facilitate their introduction, and increase
awareness of the benefits that can be obtained by adopting or
improving this type of technology in both large and small companies.

                        Scope and Objectives

This workshop is held under the auspices of the Marie Curie
Research and Training Network TAROT (Training And Research On
Testing) and in cooperation with the Spanish MCyT project MASTER
(Advanced Methods for Testing and Performance Evaluation). However,
participation is open (and encouraged!) to researchers outside
these projects.  We welcome submissions on topics regarding
interactions between the two communities.  Nevertheless, since this
is the first edition of the workshop, the boundaries of its scope
are not rigidly fixed.  Suggested topics include:

  * Adaptation of techniques and methods used in one of the
    communities to problems of the other community.

  * Testing of systems needing a dual approach. For example, testing
    e-commerce systems may need both software testing and
    communicating systems testing approaches.

  * Industrial applications and experiences in using academic
    methods.

                         Program Committee

* Antonia Bertolino, ISTI-CNR, Italy
* Richard Castanet, LABRI, France
* Ana Cavalli, GET-INT, France
* Fernando Cuartero, Universidad Castilla La-Mancha, Spain
* Rachida Dssouli, Concordia University, Canada
* Jens Grabowski, University of Goettingen, Germany
* Rob Hierons, Brunel University, UK
* Teruo Higashino, Osaka University, Japan
* Dieter Hogrefe, University of Goettingen, Germany
* Pascale Le Gall, University of Evry, France
* David Lee, Bell Laboratories, USA
* Manuel Nunez (Chair), Universidad Complutense de Madrid, Spain
* Farid Ouabdesselam, IMAG, France
* Mauro Pezze, Universita degli Studi di Milano Bicocca, Italy
* Fernando Rubio, Universidad Complutense de Madrid, Spain
* Ina Schieferdecker, FOKUS, Germany
* Jan Tretmans, Nijmegen University, The Netherlands
* Hasan Ural, University of Ottawa, Canada
* Umit Uyar, City University of New York, USA
* Joachim Wegener, DaimlerChrysler AG, Germany
* Nina Yevtushenko, Tomsk State University, Russia

========================================================================

                eValid: Latest News and Information

         Here's a roundup of the latest news from eValid...

                OnePage Benchmark Comparison Results

The idea that you really need to assess performance at the user's
desktop -- the "last mile" of the web connection -- is a very good
one.

But we've been worried for some time that companies using well-
known, highly publicized performance benchmarks may be getting less
than complete information.  For example, the download time measured
at an ISP using HTTP/S checks from a Perl script will differ
drastically from that measured within eValid running over a DSL
connection and including full page rendering time.

If you rely on those numbers you may be getting a false impression.
In fact, in our recent experiments we found from ~3:1 up to ~8:1
differences between our measured full page download times and
others' timings.
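
The gap is easy to reproduce.  The sketch below, which is not
eValid's implementation and uses a hypothetical URL, times the base
HTML alone (roughly what an HTTP check script measures) and then
adds the time to fetch every embedded resource -- still short of
full rendering, but already substantially longer:

    # Contrast base-HTML timing with a fuller download timing.
    # Standard library only; URL is a hypothetical placeholder.
    import time
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    URL = "http://www.example.com/"

    class ResourceParser(HTMLParser):
        """Collect URLs of embedded images, scripts, and stylesheets."""
        def __init__(self):
            HTMLParser.__init__(self)
            self.resources = []
        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ("img", "script") and attrs.get("src"):
                self.resources.append(attrs["src"])
            elif tag == "link" and attrs.get("href"):
                self.resources.append(attrs["href"])

    # 1. Base HTML only -- what a simple HTTP check measures.
    start = time.time()
    html = urllib.request.urlopen(URL).read().decode("utf-8", "replace")
    base = time.time() - start

    # 2. Base HTML plus embedded resources -- closer to, but still
    #    less than, what a browser-based measurement observes.
    parser = ResourceParser()
    parser.feed(html)
    start = time.time()
    for ref in parser.resources:
        try:
            urllib.request.urlopen(urljoin(URL, ref)).read()
        except (OSError, ValueError):
            pass  # skip unreachable or malformed resource references
    full = base + (time.time() - start)

    print("base HTML: %.2fs   with resources: %.2fs" % (base, full))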

The OnePage Benchmark Summary Report is at:
<http://www.soft.com/eValid/Benchmarks/OnePage/summary.html>

Our comparisons between eValid data and similar measurements made
with different technology are given at:
<http://www.soft.com/eValid/Benchmarks/OnePage/comparison.html>

                   JavaScript Interface Available

We've added a JavaScript interface to eValid.  This new interface
provides a convenient way to:

  o Issue a JavaScript function call from within the eValid script
    file for immediate execution.

  o Issue an eValid command from within the JavaScript on a current
    page.

The aim is to enhance the operation of eValid in complex
applications that require interaction between the script that is
currently playing and the web pages currently in the eValid browser.

Full details at:
<http://www.soft.com/eValid/Products/Documentation.40/Technical/js.script.html>

                       May Training Dates Set

Mark your calendar for eValid training: 5-7 May 2004 in San
Francisco.  Complete details at:
<http://www.soft.com/eValid/Training/blurb.html>


========================================================================

                 "Preventing the Internet Meltdown"

                         Spring/Summer 2004
                    Los Angeles, California, USA

                   <http://www.pfir.org/meltdown>

People For Internet Responsibility (PFIR) is pleased to
preliminarily announce an "emergency" conference aimed at preventing
the "meltdown" of the Internet -- the risks of imminent disruption,
degradation, unfair manipulation, and other negative impacts on
critical Internet services and systems, in ways that will have a
profound effect on the Net and its users around the world.

We are planning for this conference (lasting two or three days) to
take place as soon as possible, ideally as early as this coming
June, with all sessions and working groups at a hotel in convenient
proximity to Los Angeles International Airport (LAX).

A continuing and rapidly escalating series of alarming events
suggests that immediate cooperative, specific planning is necessary
if we are to have any chance of avoiding the meltdown.  "Red flag"
warning signs are many.  A merely partial list includes attempts to
manipulate key network infrastructures such as the domain name
system; lawsuits over Internet regulatory issues (e.g. VeriSign and
domain registrars vs. ICANN); serious issues of privacy and
security; and ever-increasing spam, virus, and related problems,
along with largely ad hoc or non-coordinated "anti-spam" systems
that may do more harm than good and may cause serious collateral
damage.

All categories of Internet users and a vast range of critical
applications are at risk from the meltdown.  Commercial firms,
schools, nonprofit and governmental organizations, home users, and
everybody else around the world whose lives are touched in some way
by the Internet (and that's practically everyone) are likely to be
seriously and negatively impacted.

Most of these problems are either directly or indirectly the result
of the lack of responsible and fair planning for Internet operations
and oversight.  A perceived historical desire
for a "hands off" attitude regarding Internet "governance" has now
resulted not only in commercial abuses, and the specter of lawsuits
and courts dictating key technical issues relating to the Net, but
has also invited unilateral actions by organizations such as the
United Nations (UN) and International Telecommunications Union (ITU)
that could profoundly affect the Internet and its users in
unpredictable ways.

Representatives from commercial firms, educational institutions,
governmental entities, nonprofit and other organizations, and any
other interested parties are invited to participate at this
conference.  International participation is most definitely
encouraged.

The ultimate goal of the conference is to establish a set of
*specific* actions and contingency plans for the Internet-related
problems that could lead to the meltdown.  These may include (but
are not limited to) technical, governance, regulatory, political,
and legal actions and plans.  Scenarios to consider may also include
more "radical" technical approaches such as "alternate root" domain
systems, technologies to bypass unreasonable ISP restrictions, and a
wide range of other practical possibilities.

It is anticipated that the conference will include a variety of
panels focused on illuminating specific aspects of these problems,
along with potential reactions, solutions, and contingency planning
for worst-case scenarios.  Breakout working groups will be available
for detailed discussion and planning efforts.  Formal papers will
not be required, but panel members may be asked to submit brief
abstracts of prepared remarks in advance to assist in organizing the
sessions.

If you may be interested in participating (no obligation at this
point, of course) or have any questions, please send an e-mail as
soon as possible to:  .

                        Contact Information
Lauren Weinstein
lauren@pfir.org or lauren@vortex.com or lauren@privacyforum.org
Tel: +1 (818) 225-2800
Co-Founder, PFIR - People For Internet Responsibility - http://www.pfir.org
Co-Founder, Fact Squad - http://www.factsquad.org
Co-Founder, URIICA - Union for Representative International Internet
    Cooperation and Analysis - http://www.uriica.org
Moderator, PRIVACY Forum - http://www.vortex.com
Member, ACM Committee on Computers and Public Policy
http://www.pfir.org/lauren

Peter G. Neumann
neumann@pfir.org or neumann@csl.sri.com or neumann@risks.org
Tel: +1 (650) 859-2375
Co-Founder, PFIR - People For Internet Responsibility - http://www.pfir.org
Co-Founder, Fact Squad - http://www.factsquad.org
Co-Founder, URIICA - Union for Representative International Internet
    Cooperation and Analysis - http://www.uriica.org
Moderator, RISKS Forum - http://risks.org
Chairman, ACM Committee on Computers and Public Policy
http://www.csl.sri.com/neumann

David J. Farber
dave@farber.net
Tel: +1 (412) 726-9889
Distinguished Career Professor of Computer Science and Public Policy,
    Carnegie Mellon University, School of Computer Science
Member of the Board of Trustees EFF - http://www.eff.org
Member of the Advisory Board -- EPIC - http://www.epic.org
Member of the Advisory Board -- CDT - http://www.cdt.org
Member of Board of Directors -- PFIR - http://www.pfir.org
Co-Founder, URIICA - Union for Representative International Internet
    Cooperation and Analysis - http://www.uriica.org
Member of the Executive Committee USACM
http://www.cis.upenn.edu/~farber

========================================================================

    2nd European Workshop on Object Orientation and Web Services

                             ECOOP 2004

                    Oslo, Norway - 14 June 2004

       <http://www.cs.ucl.ac.uk/staff/g.piccinelli/eoows.htm>

Themes and Objectives:

The question of how Web Services could and should change system and
solution development is very much open. Are Web Services just about
standards, or do they imply a new conceptual framework for
engineering and development? Similarly open is the question of how
requirements coming from system and solution development could and
should make Web Services evolve.  In particular, methodologies as
well as technologies based on the object-oriented conceptual
framework are an established reality. How do Web Services and
object-orientation relate? How can Web Services leverage the
experience built into current object-oriented practices?

The overall theme of the workshop is the relation between Web
Services and object orientation.  This relation can be explored from
different perspectives, ranging from system modelling and
engineering to system development, management, maintenance, and
evolution. Aspects of particular interest are the modularization of
a system into components and the (possibly cross-domain) composition
and orchestration of different modules.  Components and composition
are closely connected with the issue of reuse, and an important
thread of discussion within the workshop will address the way in
which Web Services impact reuse.

The objective of the workshop is twofold: assessing the current work
on Web Services, and discussing lines of development and possible
cooperation.  Current work includes research activities as well as
practical experiences.  The assessment covers an analysis of driving
factors and a retrospective on lessons learned. The identification
and prioritization of new lines of research and activity is a key
outcome of the workshop. In particular, the intention is to foster
future cooperation among the participants.


Organisers:

Anthony Finkelstein, Dept. of Computer Science, University College
London, UK.

Winfried Lamersdorf, Dept. of Computer Science, University of
Hamburg, Germany.

Frank Leymann, IBM Software Group Germany and University of
Stuttgart, Germany.

Giacomo Piccinelli, Dept. of Computer Science, University College
London, UK.

Sanjiva Weerawarana, IBM T. J. Watson Research Centre and University
of Moratuwa, Sri Lanka.

========================================================================

                      eValid: A Quick Summary
                       http://www.e-valid.com

Readers of QTN are probably aware of SR's eValid technology
offering that addresses website quality issues.

Here is a summary of eValid's benefits and advantages.

  o InBrowser(tm) Technology.  All the test functions are built into
    the eValid browser.  eValid offers total accuracy and natural
    access to "all things web."  If you can browse it, you can test
    it.  And, eValid's unique capabilities are used by a growing
    number of firms as the basis for their active services
    monitoring offerings.

  o Functional Testing, Regression Testing.  Easy-to-use, GUI-based
    record and playback with a full spectrum of validation functions.
    The eVmanage component provides complete, natural test suite
    management.

  o LoadTest Server Loading.  Multiple eValid instances play back
    multiple independent user sessions -- unparalleled accuracy and
    efficiency.  Plus: No Virtual Users!  Single- and multiple-machine
    usage with consolidated reporting.

  o Mapping and Site Analysis.  The built-in WebSite spider travels
    through your website and applies a variety of checks and filters
    to every accessible page.  All done entirely from the users'
    perspective -- from a browser -- just as your users will see
    your website.

  o Desktop, Enterprise Products.  eValid test and analysis engines
    are delivered at moderate costs for desktop use, and at very
    competitive prices for use throughout your enterprise.

  o Performance Tuning Services.  Outsourcing your server loading
    activity can surely save your budget and might even save your
    neck!  Realistic scenarios, applied from multiple driver
    machines, impose totally realistic -- no virtual user! -- loads
    on your server.

  o Web Services Testing/Validation.  eValid tests of web services
    begin by analyzing the WSDL file and creating a custom HTML
    testbed page for the candidate service (see the sketch after
    this list).  Special data generation and analysis commands
    thoroughly test the web service and automatically identify a
    range of failures.

  o HealthCheck Subscription.  For websites up to 1000 pages, eValid
    HealthCheck services provide detailed analyses of smaller websites
    in a very economical, very efficient way.

  o eValidation Managed Service.  Being introduced this Fall, the
    eValidation Managed WebSite Quality Service offers comprehensive
    user-oriented detailed quality analysis for any size website,
    including those with 10,000 or more pages.
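
As a rough illustration of the first step in the Web Services bullet
above, the sketch below -- not eValid's analyzer; the file name is a
hypothetical placeholder -- lists the operations a WSDL document
advertises, each of which becomes a candidate test target:

    # List the operations advertised in a WSDL file.
    # Standard library only; "service.wsdl" is a hypothetical local
    # copy of the candidate service's WSDL document.
    import xml.etree.ElementTree as ET

    WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

    tree = ET.parse("service.wsdl")
    for port_type in tree.iter("{%s}portType" % WSDL_NS):
        for op in port_type.iter("{%s}operation" % WSDL_NS):
            print(op.get("name"))   # each operation is a test target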

       Resellers, Consultants, Contractors, OEMers Take Note

We have an active program for product and service resellers.  We'd
like to hear from you if you are interested in joining the growing
eValid "quality website" delivery team.  We also provide OEM
solutions for internal and/or external monitoring, custom-faced
testing browsers, and a range of other possibilities.  Let us hear
from you!

========================================================================

                 Software Test Managers Roundtable:
                Working Effectively with Outsourcing

May 22-23, Melbourne, Florida, at the Florida Institute of Technology

Join in a discussion of how to manage testing -- or deal with the
management of testing -- as testing and/or programming within the
software project is outsourced.  This year's workshop is facilitated
by Scott Barber (http://www.perftestplus.com/), James Bach
(http://www.satisfice.org), and Cem Kaner
(http://www.testingeducation.org).

Meetings of the Software Test Managers Roundtable are primarily
practitioners' meetings, focusing on the needs of the working test
manager. Researchers, educators, consultants, and non-testing
development managers are welcome at the meeting, but the goal and
focus of the meeting are the development of ideas and solutions to
be used by working test managers.

This is a participatory workshop. In the 14 hours of formal
sessions, we expect to cover five to seven presentations.  Each
presenter will speak for 10 to 60 minutes.  Following this, we
discuss the presentation. In past sessions, discussions have run
from 1 minute to 3 hours. During the discussion, a participant might
ask the presenter simple or detailed questions, describe consistent
or contrary experiences or data, present a different approach to the
same problem, or (respectfully and collegially) argue with the
presenter. The agenda is intentionally flexible. For example, some
presentations might inspire the group to break into brainstorming
sessions or breakout groups for further discussion.

Materials shared at the workshop will be posted at the website of
Florida Tech's Software Testing Education Research Lab,
<http://www.testingeducation.org>, and will be available for reuse
in software testing courses.

For further information, please contact Cem Kaner
.

Cem Kaner, Professor of Software Engineering
Director, Center for Software Testing Education & Research
Florida Institute of Technology, 150 West University Blvd., Melbourne,
FL 32901.
http://www.kaner.com, http://www.testingeducation.org,
http://www.badsoftware.com
Senior author of:
   Lessons Learned in Software Testing
   Testing Computer Software, and
   Bad Software: What to Do When Software Fails.

Co-sponsored by
-  The Association for Software Testing, and
-  Florida Tech's Center for Software Testing Education & Research

========================================================================
    ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------
========================================================================

QTN is E-mailed around the middle of each month to over 10,000
subscribers worldwide.  To have your event listed in an upcoming
issue E-mail a complete description and full details of your Call
for Papers or Call for Participation to .

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should
  provide at least a 1-month lead time from the QTN issue date.  For
  example, submission deadlines for "Calls for Papers" in the March
  issue of QTN On-Line should be for April and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items appearing in QTN represent the
opinions of their authors or submitters; QTN disclaims any
responsibility for their content.

TRADEMARKS:  eValid, HealthCheck, eValidation, TestWorks, STW,
STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR, eValid,
and TestWorks logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================
        -------->>> QTN SUBSCRIPTION INFORMATION <<<--------
========================================================================

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, to
CHANGE an address (an UNSUBSCRIBE and a SUBSCRIBE combined), please
use the convenient Subscribe/Unsubscribe facility at:

       <http://www.soft.com/News/QTN-Online/subscribe.html>.

               QUALITY TECHNIQUES NEWSLETTER
               Software Research, Inc.
               1663 Mission Street, Suite 400
               San Francisco, CA  94103  USA

               Phone:     +1 (415) 861-2800
               Toll Free: +1 (800) 942-SOFT (USA Only)
               FAX:       +1 (415) 861-9801
               Web:       <http://www.soft.com/News/QTN-Online>