+===================================================+
+=======   Quality Techniques Newsletter    =======+
+=======           April 2004               =======+
+===================================================+

QUALITY TECHNIQUES NEWSLETTER (QTN) is E-mailed monthly to subscribers worldwide to support the Software Research, Inc. (SR), eValid, and TestWorks user communities and to other interested parties to provide information of general use to the worldwide internet and software quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary circulation is encouraged by recipients of QTN, provided that the entire document/file is kept intact and this complete copyright notice appears in all copies. Information on how to subscribe or unsubscribe is at the end of this issue. (c) Copyright 2003 by Software Research, Inc.

========================================================================

Contents of This Issue

   o  2nd National Software Summit
   o  The Walk-Around, by Boris Beizer
   o  ATVA-2004: Automated Technology for Verification and Analysis
   o  Software Reliability Engineering, by John Musa
   o  13th International Symposium of Formal Methods Europe
   o  eValid Summary
   o  27th International Conference on Software Engineering (ICSE 2005)
   o  International Workshop on Web Engineering
   o  QTN Article Submittal, Subscription Information

========================================================================

                    2nd National Software Summit

Senior Executives, Legislators and Academics to Consider Global Outsourcing, R&D and Competitiveness in Creation of National Software Agenda

                    <http://www.cnsoftware.org>

The second National Software Summit (NSS2), themed "Software: The Critical Infrastructure Within the Critical Infrastructures!", will convene in Washington, D.C., May 10-12 at the J.W. Marriott hotel. NSS2 will gather an invited group of senior software and technology leaders from industry, academia and government to address growing concerns over the role of software and the software industry in our nation today. The objective of this summit meeting is to develop findings and recommendations leading to the creation of a national software public policy agenda, as well as a follow-on action plan.

Phillip Bond, United States Undersecretary of Commerce for Technology, will deliver the summit's keynote dinner address on the evening of May 10th. Additional keynotes the following morning will be delivered by Amit Yoran, National Cybersecurity "Czar" with the United States Department of Homeland Security; Dr. Alan Merten, president of George Mason University; and John Chen, chairman, president and CEO of Sybase, Inc. Sybase software is widely deployed within corporate and government infrastructures, particularly in the financial services, government, defense and telecommunications sectors. Dr. William A. Wulf, president of the National Academy of Engineering, will also address the summit participants at their May 11th luncheon.

"The first software summit was held in 1995 with Steve Case, then CEO of AOL, and Arati Prabhakar, then Director of NIST, delivering the keynote addresses," said Dr. Alan Salisbury, President of the CNSS and General Chairman for NSS2.
"In a post-9/11 world, it's particularly important that we elevate the attention given to the primacy of software in maintaining both national security and global economic leadership, and to advance a national software agenda. NSS2 will start this process." As the theme of NSS2 indicates, software underpins virtually all of the nation's critical infrastructures, but policy on this critical part of our economy is ill defined. Issues of major concern that will be addressed at the conference include the trustworthiness of software, the education and qualifications of the software workforce, the adequacy of current software research and development, the impacts of outsourcing, and the competitiveness and stability of the nation's software industry. To frame these issues, NSS2 is being preceded by a series of four one-day workshops: Trustworthy Software Systems, co-chaired by Dr. Jeff Voas, CTO of Cigital, Rick Linger of the CMU Software Engineering Institute, and Prof. Bret Michael of the Naval Postgraduate School The Software Workforce, chaired by Harris Miller, CEO of the Information Technology Association of America (ITAA) Software Research & Development, co-chaired by Prof. Bill Scherlis of Carnegie-Mellon University, Dr. Larry Druffel, CEO of the South Carolina Research Authority (SCRA), and Tony Jordano, corporate vice president of SAIC The Software Industry chaired by Jim Kane, CEO of the Software Productivity Consortium (SPC). NSS2 is being hosted by the Center for National Software Studies (CNSS) in cooperation with a growing list of organizations, including the Council on Competitiveness, the Information Technology Association of America (ITAA), the Software Productivity Consortium (SPC), and the IEEE Reliability Society. Participating universities include George Mason University, George Washington University, Carnegie Mellon University and the Naval Postgraduate School. Additional information on NSS2 and the workshop series can be found at www.cnsoftware.org/nss2. About the Center for National Software Studies Headquartered in Washington D.C., the Center for National Software Studies is a not- for-profit organization whose mission is to elevate software to the national agenda, and to provide objective expertise, studies and recommendations on national software issues. More information about the Center is available at <http://www.cnsoftware.org>. Contact: Dr. Alan B. Salisbury, President, Center for National Software Studies, (703) 319-2187. ======================================================================== The Walk-Around Note: This article is taken from a collection of Dr. Boris Beizer's essays "Software Quality Reflections" and is reprinted with permission of the author. We plan to include additional items from this collection in future months. Copies of "Software Quality Reflections," "Software Testing Techniques (2nd Edition)," and "Software System Testing and Quality Assurance," can be purchased directly from the author by contacting him at. 1. Evaluations One of my more interesting assignments as a consultant is evaluating the state of an organization's esting and QA practices. It is an intensive process. At the heart of it is the thirty to fifty interviews that I usually conduct over the course of a week. This is in addition to a thorough review of all pertinent documentation such as test plans, process documents, organization charts, etc. As important as the interviews and document reviews are, the most important part of the process is what I call the "Walk-Around." 
It takes no more than a half-hour, but in that half-hour you can often get at the heart of issues, so that the interviews are primarily used to confirm and detail what the walk-around revealed. While my primary use of the walk-around is as an evaluator, the same techniques are very useful for a prospective employee who wants to get a handle on what kind of organization she is thinking of joining.

Organizations that are willing to spend the $75,000 or so that a typical evaluation costs are usually sincere, at some level, in improving themselves. If they weren't, why would they spend the money? I have found that organizations that go in for external evaluations fall into several patterns, or combinations of these patterns:

1.1. Winners.  Over-all, the organization is in good shape. They are profitable. The software they ship has the quality appropriate to their market. But they want to improve their effectiveness. They may want to reduce climbing testing costs by improving test efficiency. They foresee structural changes in the industry and want to be sure to be on top of such changes. Their market share may have slipped a half-point or so and they have identified user-perceived quality, and therefore testing, as a component of that slippage. Competition is getting more intense and they look to process improvement as a key factor in maintaining that competitiveness. All excellent reasons for being evaluated. One of the more pleasant things I find about such winners is that they are often very harshly self-critical: they perceive themselves as being far worse than they actually are. I love working with winners.

1.2. Basket Cases.  About the only valid self-perception they have is that they are in deep trouble over software. The quality isn't there, the software is always very late, the users are screaming, the market share has plunged, personnel turnover looks like a turnstile in a New York City subway and, worst of all, they are so far back on the quality process and testing curves that they're unlikely to last out the year. Sad cases. They're looking for miracles -- miracles which no ethical evaluator will offer. The problem is that it isn't always easy to spot the basket cases because, on the surface, they could look just like the winners.

1.3. The Mythologists.  These are organizations that have conned themselves into believing that they are really great, and they're looking to the evaluator not to recommend real improvements, but to give them a rubber stamp of approval. They know all the buzzwords, they seem to have all the right things in place (e.g., SQA, independent testing, TQM, etc.), but none of it is real. It is all fluff and fake, devoid of substance. These are even harder to help than losers -- who at least realize that they're in trouble. And again, based on superficial signs, they may look just like the winners.

1.4. The Con Artists.  Their whole approach is cynical. They don't want to change the way they're doing software, because as far as they're concerned, the way they did it back in 1965 was just great. However, because of customer pressure, especially government pressure, or because of perceived marketing advantage, they decide that they had better get themselves certified as ISO-9000 compliant and/or CMM level V. Last I heard, the going price for an ISO-9000 kit is about $50,000 -- a full set of procedures, documents, etc. needed for passing certification. I haven't heard of it yet, but I'm sure that there are some unscrupulous experts out there peddling similar kits for the CMM.
This is the worst type -- and I have even been conned into doing an assessment for such groups because doing such an assessment was part of the con.

There are many variations of these extremes, and each of the four above types could co-exist within one organization. What makes the evaluation job both difficult and interesting is that the superficial signs (process documents, organization charts, titles on the door) could be virtually identical. So how do you tell them apart? That's what the walk-around is for. But before the walk-around, it's a good idea to do some document reviews.

2. Documentation Review

Before you go out there, ask to see samples of all pertinent documentation: process definition documents, internal standards, external standards they claim to follow, organization charts, internal and external training documents (as they pertain to quality issues, testing, and tools), sample test plans and scripts, change control documents, issue resolution documents, metrics, etc. What matters is not the documents themselves, or even whether they do or don't have such documents. Excellent organizations may have no formal documents, and the worst con-artists always have the best. The documents are merely a point of reference from which you can tell what is real and what is merely paper. You're looking for differences between the formal documents (if any) and reality. Having seen the documents gives you a perspective for the in-depth interviews, but more important, helps you set a mental agenda for the walk-around.

3. The Walk-Around

You can usually do this in 15 to 30 minutes. Often, it takes almost no time because I do it as I am walking from one interview to the next. In no particular order, here are the things to look at:

3.1. Bulletin Boards.  Things to look for on bulletin boards:

3.1.a. Metrics.  Copious bug statistics, bug reports from the field, bug resolution rates, testing progress. Lots and lots of real metrics on the board. Metrics on the boards tell you that there isn't just a number-crunching group that publishes metrics that no one sees or uses. However, no metrics on the boards could mean that these are transmitted as messages on the intranet. Be sure to check.

3.1.b. Conferences and Brown Bag Sessions.  Internal technology conferences, brown-bag technology discussion groups, external technology conference announcements, sign-up sheets for training seminars (of the right kind -- e.g., technology, not motivational): all good. Motivational sessions, meetings of the third subcommittee on the quality imperative initiative, mass meetings to explain the latest reorg, new procedures for getting paperclips: all bad.

3.1.c. Social Stuff.  Neutral. Too little and it may be a very harsh place to work. Too much and they may be too busy feeling good about themselves to do good software.

3.2. Conference Room Agenda.  Many places post an agenda for the coming week for the use of the conference rooms and/or training rooms. What's being scheduled and how big a part of the time does it take? Assume that the room will be filled to, say, 75% capacity and figure out how many work hours are being spent on productive stuff and how much on procedural garbage (a small worked sketch of this arithmetic appears at the end of this article). As a rule of thumb, I'm dubious about most meetings. I can see 1-1 meetings, three-person meetings, four-person meetings, and of course, one-many meetings to explain policy items.
But as for the typical many-many meetings, dump them; they're more often a forum for avoiding responsibility by blaming it on the anonymous consensus of a meeting than they are a mechanism for actually doing anything.

3.2.a. Productive.  Among productive uses I include: project kick-off meetings; tools and technology vendor presentations (if not overdone); almost any technology training; "How I used this neat trick to improve..." kinds of presentations; formal (e.g., Fagan) inspections; and of course, a consultant's evaluation of the organization.

3.2.b. Counter-productive.  Any kind of rah-rah, motivational, "quality is good" garbage. Brainstorming sessions. Creativity enhancement meetings. Weekly progress review meetings. Pre-weekly progress review coordination and preparation meetings. Reorganizational meetings. Most "coordinating committee" meetings. Quality initiative task force subcommittee meetings.

3.3. The Library.  Ask if there is one. If there isn't, it could be a very bad sign. Just because there is one doesn't mean it's any good, and even if it is good, that doesn't mean it is used.

3.3.a. Books.  See what books they have on testing and QA. Don't just go by the shelves, because all the good ones could be out. Look at the card catalog. If you don't see Beizer, of course, dump them as losers. Kidding aside, though, what authors do they have? Are the most important books there and in sufficient quantities? If the books are on the shelf, look at the cards and see how many have been read and if they are all being read by the same three people. In the best libraries, all the good books are there, often in multiple copies, and they are being read by lots of people.

3.3.b. Journals and Trade Publications.  Same with technical journals and trade publications. I expect to see a broad spectrum of journals, including IEEE SE, ACM TOSEM, ACM SIGSOFT, IEEE Software, conference proceedings, QA newsletters, etc. I expect it to be at least as good as my personal library. Check to see if the subscriptions are current. If they dropped the technical journals two years ago, they're probably on an irrecoverable downward slide. Look for trade journals also, both free and especially paid subscriptions. Is anybody reading these journals or are they also show-and-tell shelfware like the books?

3.3.c. Librarian.  I hope to see a full-time, professional librarian. That's the only real way to get and maintain a good organizational library. Typically, that means an MS or MA from a recognized library training school such as Drexel University. Amateur librarians are no more effective than amateur software developers or amateur testers. What kind of person is this librarian? Is she a facilitator who works hard to see to it that people have the books and journals they need when they need them, or is she some throwback to the 19th century -- a guardian of books? Professionally trained librarians these days rarely have that guardian mentality and yes, 90% of them are women; but don't be put off by a male librarian. They can be just as effective.

3.4. Personal Bookcases.  This one can be tricky, because more and more these days, what used to be on the personal bookcase is now available on the intranet. If you don't see the right stuff on the personal bookcases, be sure to cross-check with other forms of availability.

3.4.a. Books and Journals.  People are what they read. Also, what they display on their personal bookcase tells you what they think is important (to display, that is.
What they think is actually important could be totally different). How up-to-date are these guys? Look for process documents and books. Then open a few and see if they have actually been read. If the pages are pristine -- warning sign. Dog-ears, coffee stains, lots of yellow highlights and red underlines -- good signs.

3.4.b. Training Manuals.  I put heavy emphasis on technology and tools training. Process training is also acceptable. But motivational garbage and "Ten steps toward greater creativity"? Brraaap! However, too many tools manuals could be indicative of massive shelfware and tools chaos. Also, remember that much of this stuff could be available on the intranet, so it pays to check if you don't see anything technical on their shelves.

3.4.c. Personal Stuff.  Tells you a lot about the person. If the personal stuff repeats on many shelves and forms a pattern, it can tell you a lot about the underlying culture -- a quality-neutral issue in itself unless overdone, but good to be aware of. By overdone, I mean things like half of the people having the same set of books on sport X. That probably means that the sport is more important to them than the software they're supposed to be producing. I'm more comfortable with organizations that show lots of diversity in the personal stuff. I'm also suspicious of organizations that don't have any personal stuff on the shelves at all.

3.5. The Cubicles.  I'm a throwback. I personally don't like cubicles, never have, and never will. But I suppose that I must accept this almost universal fact of organizational life. But they can tell you a lot about an organization.

3.5.a. Testers Versus Developers.  Except, perhaps, by what's on the bookcases, I hope that I can't tell a developer's cubicle from a tester's. I expect to see the same quality furniture, the same kind of terminals and computers, the same condition of cleanliness or chaos. At most, I'll accept different colors, but even that is probably bad. In the best groups, not only can't I distinguish between a tester's and a developer's cubicle, but I can't even tell if I'm in a test group or a developer group because the two are so thoroughly intertwined.

3.5.b. Screens and Equipment.  One screen (e.g., computer) per cubicle is the mandatory minimum these days: many organizations have two screens and some may have three. What computer is powering those screens? Is it an Intel 8086 with a 5-meg hard drive? Or the latest model high-powered workstation? Don't put too much stock in printers, though. Shared printers are rapidly becoming the norm, and they become less important as people increasingly work with electronic documents only.

3.5.c. Neat Versus Chaotic.  I guess that I'm a middle-of-the-roader when it comes to the neatness/chaos axis. What counts here is not the individual, but the pattern. Messiness perception is cultural. Beizer's theory on messiness: in Western culture, horizontal chaos is considered messy while vertical chaos is considered neat. That is, if your stuff is spread out horizontally all over your desk, you are messy by Western standards. However, if you take that same mess and transfer it to hanging file folders in a file cabinet (or do the equivalent in a disc directory), you are socially acceptably organized. Professor Cornelius Weygant, long the graduate student advisor at the Moore School of Electrical Engineering, U of P, was the most horizontal person I ever met.
Not only was his desk piled up in mounds two feet high (no kidding, not an exaggeration), but so were all the credenzas in his huge office. One of the most brilliant and well-organized persons I have ever had the pleasure to deal with. So what's the point? Diversity again. We all have different degrees of horizontal/vertical orientation and different degrees of tolerance and effectiveness under superficial chaos. If the cubicles, as a group, appear to be shifted to one or the other extreme, look out! It could mean that the naturally chaotic people don't have the neatniks to keep them under control. It could, at the other extreme, mean authoritarian despotism to the right of Attila the Hun.

3.6. The Cafeteria.  Organization cafeterias can be so much more than a place to eat. When you go to lunch in the cafeteria, you do another walk-around.

3.6.a. The Food.  Check the food. Is it good quality? Is it fairly priced? Is there real diversity? Is it comfortable? Really clean? These external signs tell you a whole lot about how management values its employees. Don't look for lavish or exotic. If the cafeteria is too good, it might mean that they're more into keeping people happy than into writing good software.

3.6.b. Who's Sitting Where?  Executive dining rooms are definitely out in the software business. However, many organizations have places that can be subdivided, or temporary facilities, for those unavoidable high-level (meaning suits and ties) meetings.

3.6.c. Brown Bag Sessions.  I love seeing a notice on the wall that says that the northeast corner is reserved today for a presentation on technology X by speaker Y.

3.6.d. Demeanor.  Are they content? Do they avert their eyes as you walk by? Do their voices drop to whispers? I always dress as they do, so they can't tell by suits and stuff who I am, but I am usually accompanied to lunch by high-level managers and the people know who they are. Check it out from day to day and see if there is de facto management/peon segregation. I usually can't avoid the official accompaniment on the first day, but after that I try to go on my own and sit at random places with my ears open. I can usually manage to sit in three different places during that lunch period. You learn a lot by what people talk about, and also by what they don't talk about in front of strangers. Be careful of how you read the signs, though: on more than one occasion I have been politely asked if I would mind moving to another table because they were discussing stuff which was company confidential -- perfectly reasonable of them to ask, and appropriate for me to comply with a smile and no offense taken.

3.7. A Caution.  First impressions are just that, and they could be wrong. Don't make your mind up on just the basis of the walk-around. Every tentative conclusion you come to must be confirmed through interviews and documentation. Not only because that's the ethical thing to do, but because no one will believe you, even if you are right, if you make recommendations based on a 15-30 minute walk-around. For calibration, about one out of ten of the conclusions gotten from the walk-around turns out to be dead wrong. And it only takes one wrong conclusion to cause the other party to totally reject all your recommendations. And if it is as a prospective employee that you do the walk-around, that one-out-of-ten could be the reason you wrongly passed up what could have been the best job of your career.
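To make the conference-room arithmetic in section 3.2 concrete, here is a minimal Python sketch of one way to tally a posted weekly agenda into productive versus procedural person-hours. The agenda entries, room capacities, and productive/counter-productive labels below are hypothetical illustrations; only the 75% fill assumption comes from section 3.2.

    # Minimal sketch of the conference-room arithmetic from section 3.2.
    # The agenda entries and the "productive" labels are made-up examples;
    # the 75% fill factor is the assumption suggested in the article.

    FILL_FACTOR = 0.75  # assume rooms run at about 75% of capacity

    # (meeting title, room capacity, duration in hours, productive?)
    weekly_agenda = [
        ("Project kick-off",                12, 1.5, True),
        ("Tool vendor presentation",        20, 1.0, True),
        ("Fagan inspection, module X",       6, 2.0, True),
        ("Quality initiative subcommittee", 25, 2.0, False),
        ("Weekly progress review",          30, 1.0, False),
        ("Pre-review coordination meeting", 15, 1.0, False),
    ]

    productive_hours = 0.0
    procedural_hours = 0.0
    for title, capacity, hours, productive in weekly_agenda:
        person_hours = capacity * FILL_FACTOR * hours
        if productive:
            productive_hours += person_hours
        else:
            procedural_hours += person_hours

    total = productive_hours + procedural_hours
    print(f"Productive person-hours: {productive_hours:6.1f}")
    print(f"Procedural person-hours: {procedural_hours:6.1f}")
    print(f"Share spent on procedural garbage: {procedural_hours / total:.0%}")

Run against a real posted agenda, the interesting number is usually the last one: the share of person-hours going into procedural meetings rather than productive work.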
========================================================================

                            ATVA 2004
          2nd International Symposium on Automated Technology
                     for Verification and Analysis

                     National Taiwan University
           Sunday 31 October - Wednesday 3 November 2004
                  http://cc.ee.ntu.edu.tw/~atva04

Encouraged by the success of the first ATVA in December 2003 and the promise of related research and industry in East Asia, we are very happy to announce its continuation, ATVA 2004. The emphasis of ATVA 2004 will continue to be on mechanical and informative techniques that give engineers valuable feedback to quickly converge their designs to their specifications. ATVA 2004 will be held back-to-back with APLAS 2004 (Asian Symposium on Programming Languages and Systems, 4-6 November 2004) in Taipei.

SCOPE OF INTEREST

The scope of interest includes the following research areas: parametric analysis (parameter synthesis), automated synthesis, optimization, performance analysis, automated tool supports, model-checking theory, theorem-proving theory, state-space reduction techniques, languages in automated verification, real-time systems, embedded systems, infinite-state systems, Petri nets, UML, synthesis, practice in industry, decidability and complexity issues, and case studies. Special emphasis will be on the algorithms, complexities, tools, and experiments of automated verification and analysis.

SPECIAL TRACKS

ATVA 2004 will also have three special tracks with independent paper reviewing processes: (1) Design of Secure/High-reliable Networks, (2) HW/SW Coverification and Cosynthesis, and (3) Hardware Verification.

KEYNOTE SPEAKERS

   Dr. Robert Kurshan (Cadence, USA)
   Prof. Rajeev Alur (U. Pennsylvania, USA)

========================================================================

                  Software Reliability Engineering
                        by JOHN D. MUSA
                 http://members.aol.com/JohnDMusa/

Many of you know that my book "Software Reliability Engineering" recently went out of print unexpectedly, and I received a lot of inquiries asking what could be done to make it available again. This message is in response.

I have talked with McGraw-Hill and they have agreed in principle to revert the rights to me. I have been investigating the new publishing technology of "Print on Demand" and have decided that it would be excellent for this book. Hence I am working expeditiously to prepare it for publication in this new format. I have asked those concerned for their best time estimates; they are two months for the reversion process and four months for publication preparation. Thus I hope to have the book available again this fall.

When it is available, anyone anywhere in the world will be able to order ONE or more books directly from the printer over the internet (also by phone, fax, or snail mail if you wish) and have it custom printed (approximately two days) and shipped directly to you. There are presses in the US and Europe, so shipping should also be fast. What really sold me on this approach was that the cost of individual custom printing will be no more than the current cost of the book. (You can also order indirectly through internet booksellers and retail booksellers if you wish, but I don't see any advantage in involving a middleman.) The key to this is the highly automated technology, which gets rid of all the risks and costs of keeping an inventory. It also makes it possible to keep the book available economically as long as even a few people want to buy it.
If you would like to be notified when the book is available, please send an email to j.musa@ieee.org with the subject line "Book." I will also post progress information on my website <http://members.aol.com/JohnDMusa/>.

Would you please forward this message to your email networks, since my own is not extensive? I understand there's a theory that states that if it gets forwarded N (N = ~6?) times, we will reach the vast majority of those who need this information. Our apologies to anyone who may receive more than one copy in our attempt not to miss someone.

Best regards,
John

========================================================================

       The 13th International Symposium of Formal Methods Europe
                      Newcastle upon Tyne, UK
                         18-22 July 2005
                     www.csr.ncl.ac.uk/fm05/

FM'05 is the thirteenth in a series of symposia organized by Formal Methods Europe, an independent association whose aim is to stimulate the use of, and research on, formal methods for software development. The symposia have been notably successful in bringing together innovators and practitioners in precise mathematical methods for software development, industrial users as well as researchers.

Submissions will be welcomed in the form of original papers on research and practice, proposals for workshops and tutorials, and entries for the exhibition of software tools, publications and companies. FM'05 welcomes papers in all aspects of formal methods for computer systems, including, but not restricted to, the following:

   * introducing formal methods in industrial practice (technical, organizational, social, psychological aspects)
   * reports on practical use and case studies (reporting positive or negative experiences)
   * formal methods in hardware and system design
   * reusable domain theories
   * theoretical foundations (specification and modelling, refining, verification, calculation, etc.)
   * tool support and software engineering
   * environments for formal methods
   * method integration

CONTACT INFORMATION

General Chair:
   John Fitzgerald, University of Newcastle upon Tyne, UK
   John.Fitzgerald@ncl.ac.uk

Programme Chairs:
   Ian Hayes, University of Queensland, Australia
   Ian.Hayes@itee.uq.edu.au
   Andrzej Tarlecki, Warsaw University, Poland
   tarlecki@mimuw.edu.pl

Organisers:
   Claire Smith & Jon Warwick, University of Newcastle upon Tyne, UK
   Claire.Smith@ncl.ac.uk, Jon.Warwick@ncl.ac.uk

Workshops Chair:
   Juan Bicarregui, Rutherford Appleton Laboratory, UK
   J.C.Bicarregui@rl.ac.uk

Tutorials Chair:
   Neil Henderson, University of Newcastle upon Tyne, UK
   Neil.Henderson@ncl.ac.uk

Exhibitions & Sponsorship:
   Joan Atkinson, University of Newcastle, UK
   Joan.Atkinson@ncl.ac.uk

========================================================================

                      eValid: A Quick Summary
                      http://www.e-valid.com

eValid technology incorporates virtually every quality and testing functionality in a full-featured browser. Here is a summary of the main eValid benefits and advantages.

o InBrowser(tm) Technology.  All the test functions are built into the eValid browser. eValid offers total accuracy and natural access to "all things web." If you can browse it, you can test it. And, eValid's unique capabilities are used by a growing number of firms as the basis for their active services monitoring offerings.

o Mapping and Site Analysis.  The built-in WebSite spider travels through your website and applies a variety of checks and filters to every accessible page. All done entirely from the users' perspective -- from a browser -- just as your users will see your website.
o Functional Testing, Regression Testing.  Easy-to-use, GUI-based record and playback with a full spectrum of validation functions. The eVmanage component provides complete, natural test suite management.

o LoadTest Server Loading.  Multiple eValid browsers play back multiple independent user sessions -- unparalleled accuracy and efficiency. Plus: no virtual users! Single and multiple machine usage with consolidated reporting.

o Performance Tuning Services.  Outsourcing your server loading activity can surely save your budget and might even save your neck! Realistic scenarios, applied from multiple driver machines, impose totally realistic -- no virtual users! -- loads on your server.

o Web Services Testing/Validation.  eValid tests of web services begin by analyzing the WSDL file and creating a custom HTML testbed page for the candidate service. Special data generation and analysis commands thoroughly test the web service and automatically identify a range of failures.

o Desktop, Enterprise Products.  eValid test and analysis engines are delivered at moderate costs for desktop use, and at very competitive prices for use throughout your enterprise.

o HealthCheck Subscription.  For websites up to 1000 pages, eValid HealthCheck services provide detailed analyses of smaller websites in a very economical, very efficient way.

o eValidation Managed Service.  Being introduced soon, the eValidation Managed WebSite Quality Service offers comprehensive, user-oriented, detailed quality analysis for any size website, including those with 10,000 or more pages.

Resellers, Consultants, Contractors, OEMers Take Note

We have an active program for product and service resellers. We'd like to hear from you if you are interested in joining the growing eValid "quality website" delivery team. We also provide OEM solutions for internal and/or external monitoring, custom-faced testing browsers, and a range of other possibilities. Let us hear from you!

========================================================================

                            ICSE 2005
         27th International Conference on Software Engineering
                          15-21 May 2005
                     St. Louis, Missouri, USA

              http://www.icse-conferences.org/2005/
    http://www.cs.wustl.edu/icse05/Downloads/ICSE05_CFP_General.pdf

The theme of ICSE 2005 is "Software Everywhere." It acknowledges the increasingly important role software plays in the life of our society through the technology that sustains it. The theme also highlights the growing level of responsibility our profession and its members are expected to assume. As such, an important goal of this meeting will be to reach out to other disciplines that have an impact upon, or benefit from, software engineering know-how.

General Chair:
   Gruia-Catalin Roman (Washington University in St. Louis, USA)

Program Chairs:
   William Griswold (University of California, San Diego, USA)
   Bashar Nuseibeh (The Open University, UK)

PROGRAM

Lasting impact on our profession and on society at large is the overarching goal that shaped the programmatic agenda for ICSE 2005. Format changes, novel initiatives, exceedingly high expectations, an exceptionally talented team, and an unprecedented level of support by the local corporate community are some of the ingredients bound to facilitate a fertile exchange of ideas and experiences likely to affect the professional life of each participant.
The conference will offer an exciting program of events, including keynote talks by leaders in the field, invited talks along specialized themes, tutorials, workshops, and technical paper presentations on innovative research, the cutting edge of practice, and new developments in software engineering education. High quality submissions are invited for papers describing original unpublished research results, meaningful experiences, and novel educational insights. Topics of interest include, but are not restricted to:

   * Software requirements engineering
   * Software architectures and design
   * Software components and reuse
   * Software testing and analysis
   * Theory and formal methods
   * Computer supported cooperative work
   * Human-Computer Interaction
   * Software processes and workflows
   * Software security
   * Software safety and reliability
   * Reverse engineering and software maintenance
   * Software economics
   * Empirical software engineering and metrics
   * Aspect-orientation and feature interaction
   * Distribution and parallelism
   * Software tools and development environments
   * Software policy and ethics
   * Programming languages
   * Object-oriented techniques
   * AI and knowledge-based software engineering
   * Mobile and ubiquitous computing
   * Embedded and real-time software
   * Internet and information systems development

========================================================================

              International Workshop on Web Engineering
            (http://www.ht04.org/Workshops/WebEngineering/)

  in conjunction with ACM Hypertext 2004, Santa Cruz, August 9-13, 2004
                        (http://www.ht04.org/)

  "Hypermedia Development & Web Engineering Principles and Techniques:
                          Put Them In Use."

The goal of the workshop is to bring together researchers and developers from academia and industry in order to exchange ideas about emerging problems during current web projects, and to discuss recent and innovative results that may help them. The main broad topics of the workshop are:

   - Web Engineering: modeling and development of web-based Information Systems, including data management, hypermedia development, web software engineering, web services, e-commerce, etc.

   - Hypermedia Development: linking, navigational design, presentation aspects, development methods, evaluation & metrics, etc.

Both fields have provided several important research results, especially during the last decade. However, very few of them have been transferred to real-life web information systems. Web engineers need time to study all the research results in the fields of Web Engineering and Hypermedia, or in related fields like multimedia, data management, software engineering and network engineering. As this is too time-consuming a task to be accomplished within the strict timeline of a web project, web and hypermedia research results are not exploited adequately (if at all) during the development of current web information systems. These research results (plus several more coming up every year) constitute a very complex information space that itself needs to be engineered, in order to be provided to developers in a meaningful and comprehensive way.
INTENDED AUDIENCE

   * Hypertext/Hypermedia Designers and Researchers
   * Web Information Systems Developers
   * Web Engineers and Integrators
   * Web Project Managers
   * Web Services Providers
   * Users of Web Information Systems
   * Web e-commerce Systems Developers
   * Web Systems Network Designers

WORKSHOP ORGANISERS

Sotiris Christodoulou
   High Performance Information Systems
   Computer Engineering and Informatics, University Of Patras, Greece
   Phone: +30 2610 993805   Fax: +30 2610 997706
   e-mail: spc@hpclab.ceid.upatras.gr
   Web: http://www.hpclab.ceid.upatras.gr/spc/

Michail Vaitis
   Department of Geography, University of the Aegean
   GR-811 00 Mytilene, Greece
   Phone: +30 22510 36433   Fax: +30 22510 36409
   e-mail: vaitis@aegean.gr
   Web: http://www.aegean.gr/Geography/eng/staff/cv/vaitis-eng.htm

Siegfried Reich
   Salzburg Research Forschungsgesellschaft
   Jakob Haringer Strasse 5/III, 5020 Salzburg, Austria
   Phone: +43 662 2288 211   Fax: +43 662 2288 222
   e-mail: sreich@salzburgresearch.at
   Web: http://www.salzburgresearch.at/~sreich/

========================================================================
========================================================================

          ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------

========================================================================

QTN is E-mailed around the middle of each month to over 10,000 subscribers worldwide. To have your event listed in an upcoming issue, E-mail a complete description and full details of your Call for Papers or Call for Participation.

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should provide at least a 1-month lead time from the QTN issue date. For example, submission deadlines for "Calls for Papers" in the March issue of QTN On-Line should be for April and beyond.

o Length of submitted non-calendar items should not exceed 350 lines (about four pages). Longer articles are OK but may be serialized.

o Length of submitted calendar items should not exceed 60 lines.

o Publication of submitted items is determined by Software Research, Inc., and may be edited for style and content as necessary.

DISCLAIMER: Articles and items appearing in QTN represent the opinions of their authors or submitters; QTN disclaims any responsibility for their content.

TRADEMARKS: eValid, HealthCheck, eValidation, TestWorks, STW, STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR, eValid, and TestWorks logo are trademarks or registered trademarks of Software Research, Inc. All other systems are either trademarks or registered trademarks of their respective companies.

========================================================================

            -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

========================================================================

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, or to CHANGE an address (an UNSUBSCRIBE and a SUBSCRIBE combined), please use the convenient Subscribe/Unsubscribe facility at:
<http://www.soft.com/News/QTN-Online/subscribe.html>.

               QUALITY TECHNIQUES NEWSLETTER
               Software Research, Inc.
               1663 Mission Street, Suite 400
               San Francisco, CA 94103 USA

               Phone:     +1 (415) 861-2800
               Toll Free: +1 (800) 942-SOFT (USA Only)
               FAX:       +1 (415) 861-9801
               Web:       <http://www.soft.com/News/QTN-Online>