An Introduction to Ignite UI and Combo with Load On Demand, Auto Complete, Remote Filtering, Auto Suggestion and Picture in Combo Item

Recently I worked with Infragistics Ignite UI and found it an interesting and easy way to build rich web features with only a few steps of configuration. I've shared my experience and knowledge in a CodeProject article. If any of you are interested, you can read it from here.

Here I’m quoting a part of my written article:

You have probably already worked with, and become a fan of, tools and frameworks like jQuery, AngularJS and Sencha/ExtJS. So why choose Ignite UI? With my admittedly limited experience, I found it very easy and interesting to achieve many complex functionalities in different types of controls. It takes only a few steps and lines of code; you just need to know how to configure it. If you want to learn more, Ignite UI says it better than I can. Here I'm quoting two paragraphs from them:

“Fire Up Your Hybrid HTML5 & jQuery Development with the Leading Edge UI Toolset for Any Browser, Platform, or Device.”

“The best HTML5 applications call for stellar Data Visualization and rocket-fast performance, and nobody else gives you the tools to deliver across every browser, platform and device better than Ignite UI (formerly NetAdvantage for jQuery). Prepare to launch modern UIs that deliver the versatility of HTML5 without sacrificing resources, time, or money. With Ignite UI, your apps will boldly go where no app has gone before.”

I hope you will enjoy it.


PhantomJS an Amazing Headless WebKit Scriptable with a JavaScript API


When I first read about PhantomJS, I was astonished to see how many rich features it has and how easy they are to put to use! It is really handy for developers and testers, and it is easy to learn. I'm not going to discuss it in detail here; its documentation and getting-started guide are well written and provide lots of real-life examples. So if you are interested in learning it, you can visit this page.

A useful application of PhantomJS is capturing a web page as an image in different output formats without involving any installed browser. You need very few lines of code to achieve this, and you can capture the whole page at once. Here is the sample code (taken from here) to capture a web page:

// Create a headless page and render github.com to a PNG file.
var page = require('webpage').create();

page.open('http://github.com', function () {
    page.render('github.png');  // save the whole rendered page as an image
    phantom.exit();
});

Recently I created a Windows application called CaptureWebPage. I used phantomjs.exe and .NET/C# to build it. Here is a screenshot of the app:

Screen of CaptureWebPage


If you have already visited that page or downloaded my application CaptureWebPage from CodeProject, then I'm sure you are also amazed by the features of PhantomJS. I would be really grateful if you would share your own experience with PhantomJS.
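In essence, CaptureWebPage just drives phantomjs.exe from C#. Here is a minimal sketch of that idea, assuming the capture script shown above is saved as capture.js; the file names and paths are only illustrative, not the actual CaptureWebPage source:

using System.Diagnostics;

class PhantomRunner
{
    static void Main()
    {
        // Launch PhantomJS with the capture script; no browser window is involved.
        var startInfo = new ProcessStartInfo
        {
            FileName = "phantomjs.exe",   // assumes phantomjs.exe is on PATH or beside the app
            Arguments = "capture.js",     // the script shown above, saved locally (hypothetical name)
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(startInfo))
        {
            process.WaitForExit();        // github.png is ready once PhantomJS exits
        }
    }
}

A real application would of course pass the target URL and output file as arguments to the script instead of hard-coding them.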

Basic IIS Authentication using .NET/C#

A few days ago I was working on a project that required access to a wiki page. Whenever I tried to open the page in a web browser, it asked for a username/password. It was not a regular login page where you enter your credentials in a form; rather, the prompt was coming from IIS basic authentication.

If a site uses a standard login form, I can easily log in to it using .NET/C#; I simply need to pass my username/password as POST data. For IIS basic authentication, I need to provide that information to the site in a different way. Here is the code that I used for my purpose:

WebRequest webRequest = WebRequest.Create(finalUrl);
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;

// Register a "Basic" credential for the target URL
CredentialCache credentialCache = new CredentialCache();
credentialCache.Add(new System.Uri(finalUrl), "Basic", new NetworkCredential(userName, password));

webRequest.Credentials = credentialCache;
webRequest.PreAuthenticate = true;   // send the Authorization header with the request

HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse();

contentType = response.ContentType;
System.IO.Stream stream = response.GetResponseStream();

System.IO.StreamReader sReader = new System.IO.StreamReader(stream);
pageSource = sReader.ReadToEnd();

Look at the commented lines of this code segment. I have to specify "Basic" authentication as the second parameter when adding the entry to the credential cache, and before requesting the page I need to indicate that it requires pre-authentication [webRequest.PreAuthenticate = true;]. These two options let me satisfy the IIS basic authentication first and then make my intended request to the site.

I hope this code block will also help you!
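As a side note, here is a minimal sketch of the form-login approach mentioned at the start of this post, where the credentials are simply sent as POST data. The URL and form field names (username, password) are only illustrative and must match the site's actual login form:

using System;
using System.IO;
using System.Net;
using System.Text;

// Hypothetical form login: field names depend entirely on the target site's form.
string loginUrl = "https://example.com/login";
string postData = "username=" + Uri.EscapeDataString(userName) +
                  "&password=" + Uri.EscapeDataString(password);
byte[] body = Encoding.UTF8.GetBytes(postData);

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(loginUrl);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = body.Length;

using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(body, 0, body.Length);   // send the credentials as the request body
}

using (HttpWebResponse loginResponse = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(loginResponse.GetResponseStream()))
{
    string responseHtml = reader.ReadToEnd();    // the page returned after the login attempt
}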

A Silly Mistake Can Lead to a Big Mistake

Writing code and testing software are both challenging tasks in software development, and it is never easy to keep the two fully in sync. It is challenging for a developer to write almost flawless code, and it is equally challenging for a tester to cover almost every testing path.

Very often we are in a situation where we are forced to skip some minor analysis, and we leave some very minor flaws in the code or in the testing. Because we skipped that analysis, we assume those minor flaws won't affect the system's major functionality. But in reality a very minor flaw can turn the whole system into an unusable one.

A few days back I was asked to develop a tool to modify an SQL file that was generated from a replicated database, and then execute the modified SQL against a pgSQL database. The original file size was 70-80 MB and the total number of lines was about 600K.

The main task was to filter out some specific SQL blocks from the file and then write the modified output into two sets of files: one set was a single file containing all of the modified SQL, and the other set consisted of multiple files containing the modified SQL split into several chunks.

Since the number of lines was huge, it was difficult to test them all manually, but I tried my best and the output appeared to match the requirement. I opened the files manually and they seemed to be OK. I executed the second set of files (the chunked SQL blocks) and they ran successfully against the target database. BUT when I executed the first set, the one containing the whole modified SQL, it showed a syntax error.

I started to analyze the file and found that a few of the very last lines were missing. It was also very interesting that the missing portion started in the middle of a line; I was sure my logic would either write a complete line or skip it completely, so a fraction of a line should have been impossible.

At first glance I assumed it was a memory-related problem and started reducing the original file size, but I found the same problem every time. That really shocked me, since the other chunk files, even ones bigger than the problematic one, were OK. Then I went to the source code and found that every StreamReader and StreamWriter I used for reading/writing files was properly closed except the one writing the problematic file. I modified the source code, ran the tool again, and the problem was solved.
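The fix boils down to making sure the writer is always closed, and therefore flushed. Here is a minimal sketch of the pattern with hypothetical file names; the real tool's SQL-filtering logic is omitted:

using System.IO;

// Wrapping the reader and writer in using blocks guarantees they are closed
// and their buffers flushed, even if an exception occurs part-way through.
using (StreamReader reader = new StreamReader("original.sql"))
using (StreamWriter writer = new StreamWriter("modified.sql"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        string modified = line;        // the real tool rewrites specific SQL blocks here
        writer.WriteLine(modified);
    }
}   // without the using (or an explicit Close/Flush), the last buffered lines can be lost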

As we know, an open stream normally keeps a file locked, but in my case I was still able to use the file, which made me think that closing the stream wasn't a big issue. Yet that silly mistake made my application useless: the unclosed writer never flushed its last buffered lines. That is why every developer and tester should watch out for these silly mistakes; we need to analyze even the most minor cases and sharpen our ability to foresee their consequences.

Software Testing in Proper Test Environment is a Great Challenge in SDLC

There is no doubt that software testing is a great challenge in the SDLC. To get a clear picture of the application under test, you must prepare properly and on time, and carry out that preparation in the proper places. On the other hand, all the plans and hard work around software testing will be in vain if the testing is not done in the proper test environment.

In my professional experience I have reported some "bugs" that existed only because the application was not tested in the proper test environment, and similarly I have missed some real bugs for the very same reason.

A proper test environment is not an easy term to pin down; it may involve hundreds of checks before testing any application. Here are some of them:

> Is the application built from the proper SVN revision?

> Is the application installed/deployed in the proper way?

> Which OS is the application running on? Which version?

> Which browser is it running in? Which version?

> Where is it running? Location, time, time zone, daylight saving?

> and so on…

Very recently I missed a bug because of an improper test environment. A desktop application runs mainly in the USA and communicates with the server database through web services. The application is used by many users from several places in the USA. Recently two date fields were added to it. The DB columns for these fields are of type Date, so they carry no time or time-zone information.

In the meantime the client complained to us that they often saw different dates than the ones they had entered in those fields. So I started to investigate the problem and executed different scenarios, but couldn't reproduce the bug. I tested the application in both the local and the live environment. The dates are selected from a calendar control and are saved correctly in the DB server. Since the data type was Date, I assumed there would be no time-zone issues.

Just out of curiosity I set different time zones on the machine running the application and the machine running the server, and then started debugging. The web service returned the data to the client side through a DataSet. When the data were assigned to a DataTable on the client side, I was surprised by the returned values in different environments. Here are the environments and the returned values:

Environment #1:
a) App on local PC (time: 14:50, time zone GMT+6)
b) Web service on live PC (time: 03:50, time zone GMT-5)
c) Original Start Date = 12/05/2010 and End Date = 12/11/2010
d) The DataTable now has Start Date = 12/05/2010 11:00:00 AM, End Date = 12/11/2010 11:00:00 AM.
e) So Start Date and End Date are shown as 12/05/2010 and 12/11/2010.

Environment #2:
a) App on local PC (time: 15:01, time zone GMT-5)
b) Web service on live PC (time: 04:01, time zone GMT-5)
c) Original Start Date = 12/05/2010 and End Date = 12/11/2010
d) The DataTable now has Start Date = 12/05/2010 12:00:00 AM, End Date = 12/11/2010 12:00:00 AM.
e) So Start Date and End Date are shown as 12/05/2010 and 12/11/2010.

Environment #3:
a) App on local PC (time: 15:01, time zone GMT+13)
b) Web service on live PC (time: 04:01, time zone GMT-5)
c) Original Start Date = 12/05/2010 and End Date = 12/11/2010
d) The DataTable now has Start Date = 12/05/2010 06:00:00 PM, End Date = 12/11/2010 06:00:00 PM.
e) So Start Date and End Date are shown as 12/05/2010 and 12/11/2010.

Environment #4:
a) App on local PC (time: 04:10, time zone GMT-5)
b) Web service on local network (time: 04:01, time zone GMT+6)
c) Original Start Date = 12/05/2010 and End Date = 12/11/2010
d) The DataTable now has Start Date = 12/04/2010 01:00:00 PM, End Date = 12/10/2010 01:00:00 PM.
e) So Start Date and End Date are shown as 12/04/2010 and 12/10/2010 respectively. And this was the bug caused by the different time zones.

Later I learned that whenever date/time information is passed to or retrieved from a web service through a DataSet/DataTable, the time is automatically adjusted for the time zone during SOAP XML serialization [see here].
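One way to avoid this shift for date-only columns is to tell the DataSet not to attach a time-zone offset when serializing them, via DataColumn.DateTimeMode. Here is a minimal sketch of that idea; the table and column names are only illustrative, not the actual application schema:

using System;
using System.Data;

// Mark date-only columns so the DataSet serializer does not apply a time-zone offset.
DataTable table = new DataTable("Schedule");
table.Columns.Add("StartDate", typeof(DateTime));
table.Columns.Add("EndDate", typeof(DateTime));

// DateTimeMode must be set before the table contains data.
table.Columns["StartDate"].DateTimeMode = DataSetDateTime.Unspecified;
table.Columns["EndDate"].DateTimeMode = DataSetDateTime.Unspecified;

table.Rows.Add(new DateTime(2010, 12, 5), new DateTime(2010, 12, 11));
// When this table travels through the web service, the dates are serialized without
// an offset, so 12/05/2010 stays 12/05/2010 regardless of the client's time zone.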

I could not reproduce the bug until the time zones were actually different, even with everything else set up correctly. That is how important it is to test an application in the proper environment.

Database Migration Testing

Quite often you are required to perform database migration testing. The task isn't easy, but if you follow some instructions one by one it becomes much easier. In my professional experience I've executed three types of database migration testing: MySQL to SQL Express, pgSQL to SQL Server, and SQL Server to pgSQL.

When you execute such a task you will face many DB-specific challenges, but here I'm presenting only the cases that are common to every database migration task.

The following cases should be considered in a database migration task:

  1. Unless specified otherwise, the migrated database should have an equal number of tables, views, stored procedures, user-defined data types, … and an equal number of columns for each of those objects (a quick way to spot-check this is sketched after this list).
  2. Unless specified otherwise, the migrated database should use the same names, with the same letter case, for tables, views, stored procedures, user-defined data types and columns.
  3. Unless specified otherwise, the column/parameter order in tables, views, stored procedures and user-defined data types should be the same as in the original database.
  4. The type and size, including precision, of every column should be the same as in the original database.
  5. If the migrated database doesn't offer exactly the same type as the original database, it should use the type most similar to the original one. Special consideration should be given to numeric, bit, binary, image and date/time [with/without time zone] data types.
  6. The Primary Key, Foreign Key, Unique Key, Default, Check, NULL and NOT NULL constraints on the columns of every table in the migrated database should be the same as in the original database.
  7. Unless specified otherwise, the constraint names on every table should be the same as in the original database.
  8. The CASCADE DELETE and CASCADE UPDATE actions of the foreign key(s) should be the same as in the original database.
  9. Unless the migrated database has a limitation, the migrated data should be the same as the data in the original database.
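As an example of the first check, here is a minimal C# sketch that compares object counts between two databases via INFORMATION_SCHEMA. The connection strings are placeholders; this version uses the SQL Server provider, and for pgSQL or MySQL you would swap in the corresponding ADO.NET provider with the same queries:

using System;
using System.Data.SqlClient;

class SchemaSpotCheck
{
    // Runs a COUNT(*) query against one database and returns the result.
    static int CountObjects(string connectionString, string query)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }

    static void Main()
    {
        // Placeholder connection strings for the original and migrated databases.
        string sourceConn = "Server=source;Database=OriginalDb;Integrated Security=true;";
        string targetConn = "Server=target;Database=MigratedDb;Integrated Security=true;";

        // INFORMATION_SCHEMA views exist in SQL Server, MySQL and PostgreSQL,
        // so the same queries can be reused on each side of the migration.
        string tables  = "SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES";
        string columns = "SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS";

        Console.WriteLine("Tables/views: {0} vs {1}",
            CountObjects(sourceConn, tables), CountObjects(targetConn, tables));
        Console.WriteLine("Columns: {0} vs {1}",
            CountObjects(sourceConn, columns), CountObjects(targetConn, columns));
    }
}

A matching count is of course only a spot check; the name, order, type and constraint cases in the list above still need object-by-object comparison.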

Take Care of Your Alternative Email!

An alternative email ID is an easy way of recovering a forgotten password, and many of us use this feature in many applications. To use it we keep two email IDs: one is the primary one for everyday use, and the other (the alternative email ID) is for recovering the forgotten password of the primary one.

Since we use the primary email ID very frequently, we take care of that ID and its password regularly. BUT have you thought about your secondary email ID? If its password is weak, it may cost you dearly! If someone learns that password, you can lose everything: first they will reset the password of your primary email ID [email addresses are basically public], and then … everything!

So be careful with your secondary or alternative email ID and its password. For security you can use an unconventional secondary email ID [so that it is hard for anyone to guess] and, of course, a strong password. Those who use more than two email IDs should be just as careful about all of their passwords.