Thursday, June 10, 2010

Golf Balls

In the philosophy class, the professor stood at the lectern, which was covered with various objects, and waited for the students to quiet down. Then he took a large, empty mayonnaise jar and filled it with golf balls. He asked the students whether the jar was full. They said it was.

Then the professor took a box of pebbles and poured it into the jar, shook it lightly, and the pebbles settled into the spaces between the golf balls. Again he asked the students whether the jar was full. Again they said it was.

Next the professor took a box of sand and poured it into the jar. Naturally, the sand filled everything else. He asked once more whether the jar was full. The students answered with a unanimous "yes".

Then the professor took two cans of beer from the desk and poured their contents into the jar, filling the space between the grains of sand. The students laughed.

"Now," said the professor when the laughter died down, "I want to tell you that this jar represents your life. The golf balls are the important things in your life - your family, your health, your children, your friends, your passions and the things you love - things that, if you lost everything else and only they remained, would still make your life full. The pebbles are the other things - your job, your house, your car. The sand is everything else - the small stuff."

And he went on: "If you put the sand into the jar first, there will be no room for the pebbles or the golf balls. The same happens with life. If you spend your time and energy on the small stuff, you will never have room for the things that matter to you. Watch out for the things that threaten your happiness. Play with your children.

Take your partner out to dinner. There will always be time to clean and tidy the house.

Take care of the golf balls first - the things that are really worth it. Set your priorities. The rest is just sand."

One of the students raised her hand and asked, "And what was the point of the beer?"

The professor smiled. "I'm glad you asked. I just wanted to show you that no matter how full your life is, there will always be room for a couple of beers."

Author unknown

Monday, September 21, 2009

Call Web service from QTP through XSL transformation

It all started with issues at work - again. I had to plan how to call Web services as part of our testing project. As I didn't like the clumsy QTP add-in facilities for Web services, I browsed for a better solution. I went through several options. One of them came from Stefan Thelenius' articles on XML automation. I played with it and was not satisfied: it required too much typing, and that makes it error-prone.

Then I came across a Web service proxy generator that uses XSLT to produce VBScript and JavaScript. I wonder why this brilliant idea has not been used on a larger scale before. In essence, it is very simple.

The idea

Web services are described by the Web Services Description Language (WSDL). WSDL itself is XML. XSLT, on the other hand, is used to transform XML documents into other formats. So why not use XSLT to transform a WSDL into something useful? This is what Jacco Vonk presented in his article, and he went further: he transforms the WSDL into VBScript code that is ready for use. (JavaScript can also be generated, but I don't care about that since QTP doesn't either.) The VBScript code contains one or more proxy classes and functions that you can use to invoke the Web service described by the WSDL.
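To make the idea more concrete, here is a hypothetical sketch of the kind of VBScript such a transformation can produce. The service name, namespace, operation and endpoint below are invented for illustration; they are not the exact output of Jacco Vonk's stylesheet or of my modified one. Only the overall shape matters: one class per service, one function per WSDL operation, and a SOAP envelope posted through MSXML2.XMLHTTP.

' Hypothetical shape of a generated proxy class (all names are illustrative).
Class TemperatureServiceProxy
    Private m_Url

    Private Sub Class_Initialize()
        m_Url = "http://example.com/TemperatureService.asmx"   ' endpoint taken from the WSDL
    End Sub

    ' One function per WSDL operation; the parameters are serialized into the SOAP body.
    Public Function CelsiusToFahrenheit(ByVal celsius)
        Dim http, envelope, response, tagStart, tagEnd, posStart, posEnd

        envelope = "<soap:Envelope xmlns:soap=""http://schemas.xmlsoap.org/soap/envelope/"">" & _
                   "<soap:Body><CelsiusToFahrenheit xmlns=""http://example.com/"">" & _
                   "<Celsius>" & CStr(celsius) & "</Celsius>" & _
                   "</CelsiusToFahrenheit></soap:Body></soap:Envelope>"

        Set http = CreateObject("MSXML2.XMLHTTP")
        http.open "POST", m_Url, False
        http.setRequestHeader "Content-Type", "text/xml; charset=utf-8"
        http.setRequestHeader "SOAPAction", "http://example.com/CelsiusToFahrenheit"
        http.send envelope

        ' Extract the single return value from the SOAP response.
        response = http.responseText
        tagStart = "<CelsiusToFahrenheitResult>"
        tagEnd = "</CelsiusToFahrenheitResult>"
        posStart = InStr(response, tagStart) + Len(tagStart)
        posEnd = InStr(response, tagEnd)
        CelsiusToFahrenheit = Mid(response, posStart, posEnd - posStart)
    End Function
End Class

' Usage in a QTP action or in plain VBScript:
Dim proxy
Set proxy = New TemperatureServiceProxy
MsgBox proxy.CelsiusToFahrenheit(100)   ' expected: 212

In the generated code all of this boilerplate comes from the stylesheet; the test only instantiates the class and calls the operation.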

The issues

I picked up the idea and upgraded Jacco Vonk’s XSLT. I tried it with several WSDLs and I found several issues.

The first one, which broke the generated VBScript code, was that the WSDL target namespace sometimes ended with '/' and sometimes did not. I resolved it by introducing an XSLT variable, serviceNamespace.

The second issue was that the XSD double type was not handled. Simple: I just added one more check to the long lines starting with xsl:if.

Next, I added casting to string in several places and replaced '+' with '&' for string concatenation.
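For anyone wondering why the '&' replacement matters: this is plain VBScript behavior, not something specific to the template. With a numeric operand, '+' attempts addition instead of concatenation, while '&' always concatenates. A tiny illustration:

Dim celsius
celsius = 100
' MsgBox "<Celsius>" + celsius                        ' '+' tries arithmetic here and raises a type mismatch
MsgBox "<Celsius>" & CStr(celsius) & "</Celsius>"     ' '&' concatenates; CStr makes the conversion explicit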

The biggest fight

that I fought, though, was with the namespaces. Different WSDLs use different prefixes for the namespaces, and especially for http://www.w3.org/2001/XMLSchema. This namespace is used to define the Web service response types, and correct recognition of these return types was crucial for generating the correct types in the VBScript code. So, how do you find the prefix for a namespace? XPath gives us fantastic instruments to traverse, search and filter XML, but mainly for the elements and attributes in the document body; it is not so easy with namespaces and their prefixes in the root node. After a week of experimenting unsuccessfully with XPath expressions, I finally found an article on the IBM site that resolved my issue almost at once.

Apparently the idea of transforming WSDL was not so new after all. I still wonder why it is not used more widely and better. Or did I just not find the info?

The article Processing WSDL documents with XSLT discussed exactly the issues I had. Unfortunately, that was not enough. After a lot of trial and error I realized that declaring a variable for the XSD schema prefix at the top of the stylesheet is not sufficient; it has to be declared in every template that uses it. Finally it worked.

At least I thought so. The next issue was that the SOAP responses were arbitrary: they were using their own prefixes… So, instead of fighting, I decided to work around the problem and amended the 'TextBetween' function to strip the namespace prefixes out of the response.
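Here is a minimal sketch of a TextBetween-style helper that applies this workaround. The real function in the downloadable library may differ in details; the point is only to strip the prefixes before looking for the tags.

' Strip namespace prefixes such as "soap:" or "ns1:" from element names,
' then return the text between the two tags (empty string if either tag is missing).
Function TextBetween(ByVal text, ByVal startTag, ByVal endTag)
    Dim re, posStart, posEnd

    Set re = New RegExp
    re.Global = True
    re.Pattern = "(</?)[A-Za-z0-9_\-]+:"      ' "<ns1:Result>" becomes "<Result>"
    text = re.Replace(text, "$1")

    TextBetween = ""
    posStart = InStr(text, startTag)
    If posStart = 0 Then Exit Function
    posStart = posStart + Len(startTag)
    posEnd = InStr(posStart, text, endTag)
    If posEnd = 0 Then Exit Function
    TextBetween = Mid(text, posStart, posEnd - posStart)
End Function

MsgBox TextBetween("<ns1:Result>42</ns1:Result>", "<Result>", "</Result>")   ' displays 42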

There are certainly more issues to be resolved - for example, when the response is more complex than a single return value. I suspect that more namespace issues might arise. So far, so good - it has worked out.

Wrap it up

The final step was to use all of the above in QTP. I intended to use Microsoft's command-line utility to execute the XSLT when I came across this article. It made me slap my forehead. Why use an external tool when I can do it directly in QTP through .NET? This is how the final solution got its shape.
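For reference, this is roughly what the transformation looks like when run through .NET inside QTP. DotNetFactory is QTP's built-in bridge to .NET classes; the file names and paths below are placeholders, and the actual library from the download wraps these calls in a reusable function.

' Run the WSDL-to-VBScript transformation with System.Xml.Xsl.XslCompiledTransform.
Dim xslt
Set xslt = DotNetFactory.CreateInstance("System.Xml.Xsl.XslCompiledTransform", "System.Xml")

xslt.Load "C:\WS\wsdl2vbs.xsl"                         ' the XSL template
xslt.Transform "C:\WS\TemperatureService.wsdl", _
               "C:\WS\TemperatureServiceProxy.vbs"     ' WSDL in, VBScript proxy out

' Make the generated proxy classes available to the rest of the test.
ExecuteFile "C:\WS\TemperatureServiceProxy.vbs"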

Limitations and to do

So far it works with simple Web services that return a single value or an object with simple member types. I haven't worked out the situation where the response is an array of values or something else. I also haven't tested this solution in a production environment yet, so please be cautious.
Note also that this is still very experimental and I did it for fun.

Download and how to use it

Use the code of the sample QTP test - uncomment the code blocks to test with different Web services. Associate the code library required by the QTP test. The essential piece is the XSL template. You can find them all here.

P.S. Did I mention that the debugging was a nightmare?

Thursday, September 10, 2009

My cat and my car

Can you see why I like both my cat and my car?


Still not sure what I mean?
And for those not yet convinced:
I was very surprised when I realized the resemblance. So I am asking: is it just by chance? Do you have similar observations?

Saturday, January 03, 2009

A quest to automated test generation

It all started with problems at work. I wasn't happy with how the automated testing was going and I wanted to find ways to improve it. OK, back at home with the long Christmas and New Year holidays ahead, let's see what the best practices in managing automated tests are, I said, and googled the terms. I found several articles; one of the best has part I and part II. And this was just

The beginning

After that I found out about the Google Test Automation Conference. Hm, strange that I hadn't heard about it before. Here my direction started to shift. The first video I watched was about Advances in Automated Software Testing Technologies. I was so impressed that I watched at least twenty more. Not all of them were that interesting, except for this one, which compared Selenium and WebDriver in a really funny way. What changed the direction completely was this presentation. It is called Specification based Testing.

After watching it, I immediately and breathlessly searched for the download link - no luck. It was just a prototype based on a student's work. So my quest began.

In the last few years I've asked myself

Many questions

For each of these questions I've started research several times, each round lasting several hours or a day. I got answers to my questions, mostly unsatisfying ones. For example: is there a free or open-source alternative to Quality Center? Answer: no.

Is there a system that summarizes all the knowledge on a project in such a way that team members can ask questions and get responses from the system? Yes. Such are the expert systems and the knowledge bases. The usual thing to do these days is a wiki. Boring! Most wikis don't even have full-text search. Yeah, they allow for sharing knowledge, but just that. I achieve the same with a dozen gigabytes of shared documents and my favorite desktop search tool. (An interesting approach to desktop search is implemented by Aduna in their product AutoFocus. It is free! Here are some impressive screenshots. Whether it is really useful is another matter.)

Oops! I drifted away from the actual question but this is how my quest went – jumping from subject to subject all the time.

Back to the question. Expert systems are probably the solution, but they are expensive and require a lot of… well, expert work. So, back to the knowledge bases. Ironically ignoring the wikis, I thought, let's see what knowledge management systems are out there. Well, same as the expert systems, and I am not convinced that they fit what I need.

After watching the first video above, I thought that my leading question was whether there is a way to automatically generate the code for automated tests. The answer is yes, definitely. You can find a lot of ways to do it online. But it is not what a tester might expect. Most of the automation is about unit tests, especially using JUnit.

So, the leading question, as the title suggests, became: is it possible to automate test case generation?

I started googling. I thought that using Firefox with the Search Cloudlet plug-in would help; unfortunately it did not. It wasn't of much use because it shows the tags most common for a search term, and what I was researching wasn't that common.

So, I started googling (did I say that already?) for requirements based testing. Of course, it wasn't the right term. I added 'generation' to the search. Then 'case'. After that I added 'automatic'. Then I replaced some terms, and so on, until something meaningful was found. This was the main process of searching - long, iterative and not very productive. Any suggestions for doing it better when you are looking for something that is not that common?

Then I said to myself that there are smart guys out there who must already have thought of some language to describe software requirements or specifications in a formal way. Search for requirements language, markup language, etc. Of course it must be XML-based. You know why. I also tried UML. What I got was code generation.

Then I realized: oh, poor us, the testers! There are hundreds of tools or attempts to generate application code from UML and other languages that describe software systems. There are tools to generate code for unit tests. There are also tools for automated building and continuous integration, but all of these are for developers! What about us? The response was flashing on the display: "It is all about us, the developers!"

This fruitless search lasted for two days with no advancement. I figured out that what I was looking for had to be a really formal language. It should allow for describing software requirements and then deriving test cases from them. I also realized that I would not find such a tool. No one cares about testers that much. So let's see what is needed to develop such a tool.

I had known about OWL (Web Ontology Language) since 2003, when I was very deep in XmEdiL development. At that time OWL was still a draft. I didn't understand it very well and I ignored it as WTH.

Semantic! Semantic!

I turned to the site of one of my favorite companies, Altova (why it is a favorite is another story). The breakthrough came from one of their product pages - SemanticWorks. There is really nothing special about it; it just gave me the proper terms to search for.

While watching the GTAC videos at the beginning, I noticed that many presenters, as well as their audiences, hated XML. I really don't understand why. I love XML, and w3.org is a site I visit from time to time to see where things are going. I certainly missed OWL becoming a recommendation in 2004 and most certainly underestimated it. While reading about it and its grandma RDF, I realized one more thing: OWL and RDF are not just a "formal representation of a set of concepts within a domain and the relationships" between them. They can actually contain data representing those concepts. Oh, stupid me! I had been ignoring OWL in my previous research.

I also figured out that OWL is too complex to expect to code it by hand. Tools are available - the commercial SemanticWorks, the free Protégé and NeOn Toolkit for Eclipse, and countless others. You have to be an expert knowledge engineer to author a good and usable ontology for a certain domain. Not to mention that you need a domain expert as well. So?

I came across Controlled natural language. Hm… WTF? And then: wow! It looks like CNL might be the answer for writing requirements and specifications that a computer can understand and that do not need a knowledge engineer. It might also avoid the need for heavy NLP. At semanticweb.org I found that many people, companies and foundations are working on the semantic web, but there are only pieces of something that is yet to be built. I also found several attempts to use controlled languages to produce OWL.

I visited many 2G wikis that are supposed to bring semantics to common wikis. An early example is this one. An extension to MediaWiki is Semantic_MediaWiki. Another example is Semantic MediaWiki+. The developing company states that "SMW+ is a flexible knowledge management tool". Yeah, right! And how useful is that really? How much typing and clicking does it require? And you have to know exactly what you are looking for, and to which knowledge category it belongs, in order to find it.

There are also applications that really work, though probably not exactly the way someone (that's me) might expect. For example, Freebase is supposed to be a semantic analog of Wikipedia. dbpedia, on the other hand, attempts to extract semantics directly from Wikipedia. I personally question how useful it is in its current form. I have visited both several times over the past years and I don't see much use for them for testers' needs.

A very curious example of using RDF, SPARQL, AJAX and other technologies is paggr. Just download this screencast and watch it very carefully. It has not yet been released publicly and I am impatient to see it delivered.

For those of us who have been in the telecom area, here is an example of a semantic application. Go to Internet Business Logic and click on Tutorial, Part 1. It shows how simple declarative sentences are used to do billing. It is really impressive. You can actually ask questions and get explained responses! It is clumsy, but still, it is an early bird.

There were some other interesting applications, but the crown of my quest was ACEwiki. It is "making use of the controlled natural language ACE... that looks like natural English". Just watch the demo video. Using almost plain English, it verifies the semantic consistency of what you have typed in and can reason over it. It can also answer questions - even ones whose answers were never explicitly entered. I was stunned! It can also produce OWL.

OK, you might say, we have had Prolog for almost forty years now and it resolves logical problems. I was stunned because you don't need to be a Prolog programmer or a knowledge engineer to use a tool like ACEwiki. And you get OWL, and you have both the expert system and the knowledge base. True, it is small and it needs improvements, but it works!

Imagine a system that uses such an approach to collect software requirements and specifications from business analysts, domain experts and software architects. Now imagine that you can ask this system questions and get explained answers that were never explicitly stated! And now imagine that the test cases are automatically generated from the underlying OWL. Oh yes, too much imagination… OK, the generated output would probably be only test case skeletons, as the details of the system implementation will not be stored in the OWL itself. Probably SML and others will fill the gap.

Another ACE-based application is ACE View. It integrates with Protégé and, in addition to answering questions, it displays implicit entailments derived from the entered axioms after running one of the semantic reasoners, FaCT++ or Pellet.

ACE View is relatively easy to install, as opposed to ACEWiki, which is worth the hard job of installing Java, Ant, Prolog and so on. I played with both and it was a real pleasure.

I almost forgot to mention that many semantic reasoners are based on good old Prolog and its open-source incarnation, SWI-Prolog. Prolog is almost my age but still very much alive ;-)

ACE is an academic development but appears mature enough to have various implementations. I couldn't find many implementations of ScenarioML, which is the basis for the first video I watched.

Another academic work I found is Rabbit, which also allows for authoring ontologies through controlled natural language. Its ROO Rabbit application is also based on Protégé. Even though this presentation states that it produces better results, I wasn't convinced by the results after playing with it.

Yet another academic work is GINO - Guided Input Natural Language Ontology Editor - but I was not able to find much about it.

For the Bulgarian readers, it might also be curious to learn that Sirma has a subsidiary called Ontotext. They have products and are actively working in the area of computer semantics.

For the lovers of Firefox, an early attempt to adopt semantics is The Tabulator. It is an FF extension that allows for browsing and editing metadata directly on the web pages.

Another interesting fact I found is that the European Commission is sponsoring numerous projects in the semantic area.

Future

While browsing OWL tools, I found out that w3.org was developing yet another XML-based language - SML. This stands for Service Modeling Language and is meant to describe computer systems and their interrelationships.

At some point during the quest it struck me: what I am thinking about is the generation of functional tests. What about GUI tests? There are dozens of languages that describe GUIs, so there should be a way to derive tests from these GUI descriptions. And there should also be a way to devise, for example, integration tests from IDL... Apparently there is an immense amount to research, depending on the kind of tests you need. This is something to look into in the future.

I also leave to the future the research on business processes and rules. The starting question would be: is OWL equipped to represent business rules? Or should there be yet other languages and tools?

Conclusion

In short, I crawled the net for a week, almost all day and night, with short breaks for a snack and a nap. I visited hundreds of sites, read most of them, downloaded gigabytes of software, trial and free, installed it and played with it.

Did I get an answer to my question? Yes, the answer is yes, but it is still too early. Companies, foundations, universities, professors and students are working on the technologies. Just wait, and wake up on the bright day when The Tool is built and fits testers' needs. Or roll up our sleeves and start developing The Better Tool ourselves right now!

Was it confusing? It was confusing for me as well, but pleasurable. I wish you quests like this.