Sunday, October 31, 2010

How to touch every bit of a block of memory ?

While sanitizing a block of memory, I came across a technique today.

The algorithm is as follows:

a) Generate a random string (call it RANDSTR) of n bytes
b) Take the one's complement of that string (~RANDSTR)
c) Fill the buffer with RANDSTR
d) Fill the buffer with ~RANDSTR

This makes sure that every bit of the buffer is driven both to 0 and to 1, which should sanitize the contents.

Saturday, October 30, 2010

Organize Source code tree based on functional unity

In a team development environment, we need to arrive at a Work Breakdown Structure. Since expertise varies among the developers, we try to take into consideration the strengths and weaknesses of the developers in the team.

To a certain extent, the architecture of the application will be colored by the team composition as well.

Try to have an architecture where layers have some kind of "Closure of Operations" and communicate through well-defined, evolvable (!) interfaces. We need to distinguish the above from the Work Breakdown Structure.

When we do the WBS, there is a potential for having modules organized along those lines (organizing the code along political lines, aka political units). We should instead aim for an organization where functional units are clubbed together. With the help of modern version control systems, it is possible to have such an arrangement.

To Sum Up,
        Architecture for the team ( I know it is counter-intuitive )
        Source code organization based on Functional Unit
        Once some kind of stability has arrived , Collapse some Layers for Performance


If a word is named after a person who once lived, we call it an eponym. Boolean logic, Darwinian evolution, etc. are examples.

Mechanisms , Policies and Newbies

Most software projects fail because mechanism and policy matters get intertwined in the system. If someone is trying to upload a resource to some server, why should the protocol client (HTTP, FTP) details be part of the code which implements that functionality?

Most newbies, in their quest to finish the task, do not give (we cannot blame them!) much thought to this when they write the code.

One can abstract the functionality:

bool UploadFile ( const char *url , const char *filename ) {
    // do the file upload !
    return true;
}

Since we have abstracted out the details of the file upload, in the application code one can write:

bool DoAction () {
    if ( AskConfirmation() )
        return UploadFile ( url , filename );
    return false;
}

The implementation of UploadFile can be changed independently, and "file upload" is a verb to be executed by the person who writes the application code, based on the business context.

The file upload mechanism and the application contexts (policy) are clearly separated. The biggest success of Unix lies in this approach: Unix separates mechanism from policy. Using shell scripts, we can set the policy matters.

In most projects, we need to write a lot of infrastructure code (mechanisms) to act as a base for policy matters to be implemented. For newbies, when they finish writing the mechanism part, the project is finished. To their surprise, the real battle starts from there. These mechanisms have to be plugged together with business rules (policy) and state management rules to have a working system.

Then, infrastructure code gets glued in all sorts of ways, where the programmer who wrote the stuff won't have any clue of what he has done. Now the system will slip into a messy state, and the programmer's life will get screwed as well.

Try to learn about mechanism vs. policy as applied to computer programming.

Cargo Cult Programming - a new term encountered today

I was going through Eric Lippert's blog and encountered this new term. I found it amusing because I got a new meme (at least a name for a phenomenon which I regularly come across) and found a wiki page for the subject too.

Personality type - Can we know ourselves through introspection ?

I am a firm believer in introspection as a means to more knowledge. Even though we cannot be fully objective, I try to step into others' shoes to analyze the impact (good or bad) of my words and decisions on them. This has reduced conflicts with others.

This happens on normal occasions. At times, caution is thrown out and the musings become "sensitive" for the other person. Instinct dominates reason on such occasions.

Because of our preoccupation with ourselves, we cannot be objective, as our innate instincts will have more say in our affairs. Rather than being remorseful about this, consider it a "normal" thing.

Introspection is an ideal which we should aim for, while expecting failure on most occasions.

Friday, October 29, 2010

What my son thinks about me ?

I just fired up paint and was looking for a diagram which I created some time back and I stumbled upon this

The Microsoft and Programming Language evolution on the ground

Yesterday, I happened to learn about C# 5.0 (C# 4.0 had arrived early this year!) and its support for declarative (or transparent) asynchronous programming. The language compiler will generate the necessary plumbing code behind the scenes.

M$ is doing a wonderful job by bringing into the mainstream some of the ideas which were considered academic curiosities, available only to researchers. If you take a look at the C# programming language, it has support for most of the programming paradigms in vogue and, IMHO, it is the best (THE!) programming language around.

The language "shamelessly" borrows features from other programming languages. Of course, they implement them really well. And there are innovations like LINQ in C#.

Why are they doing this at a brisk pace ?

M$ is the biggest consumer of these features. Take generics, for example: it was supposed to be there in C# 1.0, but schedule constraints delayed its appearance on the scene (until C# 2.0).

They added support for lambdas, type inference, anonymous types and extension methods in C# 3.0. In this case, all of the above features were implemented to enable the implementation of the LINQ technology. The above features, along with generics, became mainstream .NET features because of the gradual adoption of LINQ.

In version 4, they added support for dynamic (or duck) typing. The reasons given are enterprise application integration and that dynamic language support is a cool feature. IMHO, Microsoft does not want to lag behind in the dynamic languages fad.

With the advent of multi-core processors, there needs to be support for declarative or implicit parallelism; enter C# 5.0 asynchrony. It can revolutionize some applications. When the compiler generates the code for you, it will in most cases be more accurate than a human.

Microsoft implements support for all these features at the platform level, and it is the duty of the respective language teams to map them into the syntax of their language. To be a first-class .NET language, every language "should" implement these features.

What Microsoft is actually doing here ?

In the 90s, Microsoft invented COM for itself and marketed it as a great way to write applications. The fact of the matter is that Microsoft was its biggest consumer, and it saved the company millions of dollars.

The .NET platform was considered (allegedly) a political move to scuttle the "leak" of its developer community towards Java. The fact of the matter (probably) is that Microsoft did not want to rewrite their code for Intel Itanium and AMD, so they required a virtual machine platform.

For the C# language, the story is the same. This time, M$ exposes the internals to the developer, and it gives opportunities to "some" (me included!) to talk about programming languages, lexical closures, dynamic binding, type inference and type synthesis, to name a few.

 Now , when you see a C# programmer , you have to ask which dialects he or she speaks.

What is the problem with this ?

Most of the systems are on .NET 1.1 and .NET 2.0. Even in places where the stuff is running under 2.0, the code still uses 1.1. Newer systems written in MVC are the only exception. By the time MVC 1.0 was production ready, M$ had already released C# 3.0.

Any feature after .NET 2.0 should be embedded deep inside the frameworks. Otherwise, you are asking for trouble in a corporate environment. The average developer does not have the inclination to learn all this stuff, and the usage of all these features hampers his productivity.

Like M$ itself does, we need to expose FAT (coarse-grained) APIs for the application code, using basic data types; all these features should go into the implementation of these FAT interfaces. This will isolate a newbie or an average developer from these features.
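To illustrate the point (here in C++ rather than C#, with invented names): the coarse-grained entry point that application code sees uses only basic data types, while the "clever" template machinery stays buried in the implementation.

```cpp
#include <algorithm>
#include <iterator>
#include <string>
#include <vector>

namespace detail {
    // Advanced features (templates, lambdas) live behind the API...
    template <typename T, typename Pred>
    std::vector<T> filter(const std::vector<T> &xs, Pred keep)
    {
        std::vector<T> out;
        std::copy_if(xs.begin(), xs.end(), std::back_inserter(out), keep);
        return out;
    }
}

// ...while the FAT, coarse-grained API exposes only strings and
// vectors to the application programmer.
std::vector<std::string> FindCustomersInCity(const std::vector<std::string> &customers,
                                             const std::string &city)
{
    return detail::filter(customers, [&city](const std::string &c) {
        return c.find(city) != std::string::npos;
    });
}
```

The application developer calls FindCustomersInCity with plain data; nothing in the signature forces him to learn the generic plumbing underneath.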

Why I love these things ?

Learning and using all this new stuff in your current favorite language is a pleasure. It gives me the opportunity to brag about various trivia concerning the behind-the-scenes activity of the compiler or tool. It gives immense intellectual pleasure to think and know about how these exciting things are implemented.

Most programmers are not able to comprehend it. Some people think they know it, and with my partial understanding (to fully understand this stuff, you need a PhD in programming languages), I have been able to appear "geeky". I can make noise which is hard to decipher!

What do I do on the ground ?
      a) Stick with basic C# (1.1) wherever possible. This will help people from other languages program using the if/else/select/update/delete model.
      b) Use generics in the application code only if absolutely necessary.
      c) Use FAT interfaces for writing the application code.
      d) Use interfaces to decouple an API from its implementation.
      e) Design micro frameworks (to organize the code and to let application developers focus on functionality).
      f) Usage of .NET 2.x features is forbidden in the application code; they will reside in some assemblies.
      g) Always use static typing.
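Item (d) can be sketched as follows (a C++ rendering of the idea; the names are illustrative): application code holds only the interface, so the implementation can be swapped, say for a mock in tests, without touching any caller.

```cpp
#include <memory>
#include <string>

// The API the application codes against.
struct IUploader {
    virtual ~IUploader() = default;
    virtual bool Upload(const std::string &url, const std::string &file) = 0;
};

// One concrete implementation; replaceable (FTP, mock, ...) without
// changing any caller.
class HttpUploader : public IUploader {
public:
    bool Upload(const std::string &url, const std::string &file) override {
        // real HTTP plumbing would go here
        return !url.empty() && !file.empty();
    }
};

// Factory: the only place that names the concrete type.
std::unique_ptr<IUploader> MakeUploader() {
    return std::make_unique<HttpUploader>();
}
```

Keeping the concrete type out of every signature except the factory is what makes the decoupling real.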

Any other danger ?

Even though C# is a clear winner over Java by a mile, learning C# is a two-year effort. Read books like Accelerated C# and Effective C# and you will understand.

Workflow is the culprit

When we bid for a project, we focus on the entities and the foundation code for the system infrastructure. As we near completion, in the case of enterprise projects, you need to take care of the dreaded workflows. On top of the workflow specific to the domain in question, we have states and transitions to keep the system in a known state. This complicates the glue code between the modules. Then mechanism and policy get intertwined to wreak havoc on the code.

Next time you bid for a project, look at the workflow!

Thursday, October 28, 2010

Every Technology is difficult - Soon it becomes a TLA !

TLA stands for Three Letter Acronym. Every programmer learns a lot of new technologies in his or her career. Some of them are damn difficult to learn initially. Once you have mastered one, it becomes just another TLA in your resume, and the ideas from it sink in as your own.

Custom frameworks - putting developers under career risk ?!!

I blogged about writing a micro framework to keep sanity in the application code. By writing a "quick and dirty" framework, the code can be organized, and the software designer's model can be realized in a much better manner than by writing ad hoc code.

When I told this to a friend of mine, his reply startled me: "You have put the careers of the people under you at risk!" Yes, there is an element of truth in what he said.

My argument, instead, is that the code is transparent, and anyone who works on the project can take a look at it to understand it, or I can explain the stuff.

Without some kind of boundary, adaptation and refactoring will be difficult!

" Specialization is for Insects " - a revisit

I came across this in the book "The Design and Evolution of C++", written by Bjarne Stroustrup. Till date, I was thinking that he coined it. While surfing the net, I came across a link which goes as follows:

A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.

-Robert A. Heinlein


This brings me to a phenomenon which I have noticed ever since blogs emerged as an effective medium of communication. I have observed "narrow" specialization among technical bloggers. They have personal blogs (personal blogs!) for .NET technologies, Java, C#, C++, Javascript, etc.

This may help the blog consumers. But without them (the bloggers) knowing, the people who produce the content are slowly and steadily becoming "useless" for team projects. I can understand people who blog to promote their products (like a Silverlight team lead producing a blog for its prospective users) and people who cannot express their opinion on the street blogging about their cherished views (through a specialized blog).

Take a person who blogs only about .NET or Java or some other technology; it will turn out to be a marketing exercise. This is good for consultants, corporations, etc.

What about a programmer who works for a company? The statistical probability of him working with the technology he loves is very low. Even if he is fortunate, the team dynamics might rob him of the opportunity to work with his favorite technology. If you specialize too much, you are less employable material for most small companies, where a person is supposed to work with multiple technologies for a given context.

Now that blogs have emerged as a marketing tool for corporations, and those guys are promoting individuals who blog about their latest and greatest offering, the recognition nuisance will follow suit. Once they get the award (or reward), there is a catch. To maintain the loyalty, you are forced to blog about every crap which the corporation produces. This reduces the employability of the person even further.

Once you are a top-shot "awardee", you feel some kind of aura around you and start talking about stuff which has no relevance on the ground (that is, shipping software), and you become a niche technology specialist. (In the formative years of my career, I had an undue liking for C/C++ which nearly killed my employability.)

I come across a lot of employment opportunities which might be good for people around me. Because of their "compulsion" to specialize, I am not in a position to recommend them. The people who got the award produce a set of silent aspirants for that award at a ratio of 1:50, and it turns out that everyone is under compulsion to specialize!

Blog about new technologies and specialized topics if you want to become an independent consultant.

If you are a programmer who wants to be employed by a company or team, avoid the danger of specialization and too much liking for the modern fads. I have seen excellent software engineers on the ground who do not blog much! They dominate the industry, and they have a dislike for bloggers. (I have faced their wrath ever since I began to blog actively!)

BJAM - Why It is named so ?

I came across Boost's BJAM utility some time back, when I compiled Boost using Visual Studio. While sifting through the book "Practical Development Environments", I understood the expansion of the term.

JAM stands for Just Another Make, and BJAM is Boost's JAM.

Names for Your System Ingredients - It helps !

In an application which is being developed, as we near the completion of the project, a lot of implicit requirements which are not in the scope (on the radar) pop up. Those requirements are mostly workflow related.

Most projects focus on the fundamental entities and their interactions, with some system-related foundation code thrown in. When we want to fit that into a business context, a lot of additional things which are not part of the scope ought to be there for a "black box system". (In this context, black box means doing maintenance and housekeeping jobs through the UI. To start the ball rolling, we can populate data through batch jobs and DB consoles.) This is true for every software project.

As a result, we write more and more code to satisfy those requirements. I felt that the coherency of the code would get weakened if we continued to do so without a "mental layering". The code in question is .NET based server infrastructure.

So, I named the server code as follows:

                     a ) Server Pull ( Client has to push it )
                     b ) Server Push ( Client has to pull it )
                     c ) Back Office & Administration.

Suddenly, things began to get back into shape, and we are in a position to incrementally test as well as deploy in a phased manner. Sometimes, naming something gives more clarity.

Wrote a Micro framework for an application

Today, I wrote some code to transparently handle stuff like authentication, page navigation, security and logging, so that the application programmer can focus on implementing the functionality.

The ideas were taken from the Microsoft Patterns and Practices site. Some code which I wrote to teach framework internals was ported to C# for this purpose.

Wednesday, October 27, 2010

Serendipity - +ve vs -ve one

I like the concept behind the word serendipity and have written and talked (even in a Barcamp presentation!) about it. Today, a friend of mine sent me a blog post regarding this. The blog entry deals with the problems associated with people who try to plan ahead and, as a consequence, miss a lot of good (and not so good) things.

I think that people always confuse serendipity with luck. I think luck is +ve serendipity.

A couple of days back, in a discussion, I mentioned to a person some "sensitive material" which is there with me. Somehow or other, the discussion drifted to the topic and, in a spurt of exuberance, I offered the material to the person in question. There is a risk in offering that, because it might not fit into the political view of the person in question and can end up in a "strained" relationship.

I thought about it a lot and finally yielded to my instinct and gave it away.

These kinds of things have happened in the past, and to see the twist, I have taken such risks. Most often, it has turned out positive and has taught me new lessons about life in general. Our mental blocks and perceptions about others need not have any rationality. Since we analyze a lot, we do not act when it is necessary.

Thus , LUCK is +ve SERENDIPITY and misfortune is -ve SERENDIPITY.

The original post can be retrieved from

Monday, October 25, 2010

STL - It is living up to the standard !

I just happened to get a mail from my friend ASS (Ashik, Son of Salahudeen!) about an implementation of STL from Electronic Arts. When we think of STL (Standard Template Library), it is the STL bundled by the compiler vendor. The original implementation was from HP (Alex Stepanov and Meng Lee). Then SGI (remember Silicon Graphics Incorporated?) had an implementation. This is now the third open source implementation I know of. There are commercial offerings from Dinkumware, EDG (?), etc.

Now , STL is a standard !

Invited speakers vs Voluntary speakers

I regularly go and give presentations at various events. Barring two occasions, I have not been paid for that. A couple of my friends also do this, and the story is the same for them. They sometimes want to have a psychological battle with the organizers to force them to pay us. Based on my experience, most events which I have been part of are user groups, informal gatherings, college seminars (how can we ask for payment from a struggling college union?), community gatherings, etc. (Some do provide excellent accommodation if you reach early! And on a couple of occasions my company has paid for transport and accommodation.)
I consider getting a spot one of the biggest rewards.

Today, I came across a term called "Invited Speakers" (as opposed to people who volunteered and came through a Call for Proposals process), and I can tell my friends that you ought to be an invited speaker to get paid. As a practicing software professional, I might not be able to don the hat of a professional speaker.
I speak because there are listeners to what I say. (Otherwise, how will I offload the things which I learn?)

Computer Programming = ( Abstract Mathematics + Concrete Engineering ) >> 1

I am always interested in learning about the idea of computation and the abstract mathematical underpinnings behind it. After reading books on predicate logic (and working with pencil and paper), lambda calculus, the relational model and Turing machines, and writing compilers and interpreters for functional and imperative languages, I feel I have some notion of the topic.

This gives immense pleasure. What is the relevance of these topics to computer programming as a profession? (Other than intellectual insights!) I have seen people write excellent queries without knowing relational algebra, people without any idea of lambda calculus manipulate closures, and people without knowledge of compilers and interpreters talk about virtual machines, and they seem to understand the stuff. (If the code written by them is taken as understanding!)

This has always bothered me. Computer programming as a discipline is only sixty years old, and at this point in time we are still figuring out how to engineer a device which embodies the idea of computation. So we are in a primitive state. To progress, we need to think at a more fundamental level as well. Model-driven learning (of the computation model) will enhance comprehension and will help us manipulate the system much better than blind manipulation.

While writing software products , these computation models can be a template for your system model. What are your thoughts on it ?

Forking Open Source Projects - Exception is not the Rule

As the computing infrastructure and programming models get better, some projects might look aged, and there might be a tendency among people to create a new fork to add the extra stuff available.

I have noticed a curious pattern where the original "limited" project survives and outlives the "newbie" (the forked one). The decision of the primary stakeholders of the (original) project might have come from a software architecture and code entropy point of view. Adding a new feature might wreak havoc on the existing code base.

If it ain't broken, don't fix it! Some do not listen to this and survive to tell the tale. But do you have the time and money to take the risk?

A Brilliant way to put security in perspective

I was just flipping through a book titled "19 Deadly Sins of Software Security - Programming Flaws and How to Fix Them". The authors of the book are heavyweights in the security industry.

What caught my attention was the foreword by one Amit Yoran (former director at the Homeland Security dept.).

I am summarizing his foreword:

   Computation is based on the theory of deterministic machines, and we expect software to act as a proxy for our intention to control the machines (which realize computation in real life). Due to the stack of technologies we rely on to carry information from the user down to the metal, non-determinism can creep in because of subtle issues in each layer. This non-determinism manifests as security flaws.

Sunday, October 24, 2010

Being Nice - It is an Investment , Like any Investment ROI can be negative

I know a person who instinctively modulates his behavior to appear politically correct, altruistic, honest, etc. The people who pay the price for it are his dependents, immediate family members and close friends. He thinks that the world is unfair to him.

I convinced him (at least that is what I believe!) yesterday that being nice is not a moral matter. It is a mechanism to minimize disorder, with the expectation that people might reciprocate. They cannot always reciprocate, because of their unique (temporal!) condition. So, most often, this investment can give negative returns.

NFC - a "disease" which I suffer from , what about you ?

I have been suffering from a disease called "Need For Cognition" (NFC) at a personal level for more than twenty years now. It is a phenomenon I see in about 10% of the people I meet in my personal and professional life.

I need to understand and analyze (remember my blog title!) something before I use it. This is called NFC.
As a human being, I do not have the cognitive ability to understand everything (even 1%) before using it. This cripples you in some way, and you get tense about things you need not worry about.

Certain tools (and ideas) are meant to be used in a "mechanical" fashion! NFC is good for modeling things, but at the ground level you need to be tactical and obey your instincts.

An example of Human Bias

I happened to take a book from my shelf yesterday when there was a power breakdown. It was a book written by Noam Chomsky titled "On Language". To be precise, it is an omnibus edition containing his widely appreciated "Language and Responsibility" and "Reflections on Language".

"Language and Responsibility" is a series of interviews with him on a broad range of topics, from politics and linguistics to philosophy.

While reading his political views, I was getting uncomfortable. Why should I? I was perplexed. I closed the book and reflected upon it. (I could not read even the first paragraph. I got uncomfortable after reading the title of the chapter, even before I started reading his views!)

Noam Chomsky holds "anti-establishmentarian" (I am not so sure whether such a word exists; I mean it as being against the governing system prevalent there) views regarding American governance. He has been referred to by leftists and Islamic fundamentalists as a "brave heart" who shares their view of the American government. He has nothing to do with Marxist philosophy and, at the same time, paradoxically, he is a Jew!

I moved on to the second book, "Reflections on Language", which was about linguistic structures, cognition, etc. However much you try to be objective, your belief system will fail you.

Later, after some time, I did go back to the first part and read it. He is correct from a moral philosophy perspective. Can we attach morality to nation states?

Started Investigation on EF and NO SQL DBs

Yesterday, I attended a session on Entity Framework and another on RavenDB at the K-MUG Community Tech Day. I had taken a look at NHibernate a couple of years back and have even used it in a production application. Along with my exposure to LINQ to SQL, I am already familiar with what an ORM is.
But since MS has got into the game, they will add some kind of improvisation for sure. With a clean mind, I am planning to approach EF.

Having worked in the CAD/CAM industry as a software developer for a while, I do understand the value of a non-relational approach to data storage. When hierarchies are deep, a relational schema is overkill. But now it has become an industry fad. When I went to PyCon India, I saw the NoSQL people getting battered. I have just started with it.

CINT - An Interpreter for C/C++ Programs

While researching reflection support for C/C++, I happened to revisit CINT, a fantastic interpreter for C/C++ programs. Using CINT, you can learn C++ much faster than with a compile, link, run, debug tool chain.

CINT has been embedded into many applications, and even when I fiddled with it in the year 2000, it was a sound system. It works on all major platforms.

Will StackOverflow kill User Groups ?

Nowadays, I am seeing less activity in local user groups. Most forums are dying a slow death because of the lack of participation. Now SO is the favorite site for getting code snippets, troubleshooting, etc.

I believe local user groups won't die (and should not die), as they serve a wider purpose beyond immediate programming tasks. Every profession needs a platform for social interaction. Otherwise, it will be "boring" to work in that particular profession.

Participate in any event, be it a user group meeting, a developer summit or a barcamp, and you will find avenues to network, freshen your perspective and re-evaluate your priorities. Moreover, you will carry home a ton of new memes.

I am writing this after attending K-MUG's Community Tech day event.

Dynamic Type Synthesis for C/C++ Programming Language

I have been playing with virtual machine languages like Java and .NET for a long time. After working with them, the C/C++ programming languages seem a bit crude for some tasks, like XML manipulation. The ease with which we can map an XML document to some Java or C# type is necessary for efficient (in human cost) usage of XML documents.

I am thinking about a structure manipulation language (a DSL!) to make this happen for C/C++ programmers. (BEA Tuxedo has such a library.)

The idea is as follows .....

Imagine there is a C struct:

typedef struct {
    int   age;
    char  name[50];
    float salary;
} TEST;

and an XML document, say Person.xml:

<Person>
    <age>36</age>
    <name>John</name>
    <salary>1000.50</salary>
</Person>


In the field manipulation language:

          INT A;
          STRING B[35];
          FLOAT C;

   PERSON.A = Person.age;
   PERSON.B =;
   PERSON.C = Person.salary;

The interpreter will create a memory buffer with a layout similar to that of the C structure, and the result can be cast to a C/C++ structure or copied onto one.

TEST *p = (TEST *)ParseAndMap("Person.xml" , "Person.Fml");

do the stuff with p;

Community Tech Day @ Kochi by INETA and K-MUG (Part 4 )

Generics - a Revisit by Yanesh Tyagi

After the tea, there was a session by Yanesh Tyagi, who is associated with a consulting company based in Kochi. He took all of us back to a time when there were no generics and convinced the audience that compilers could not catch certain issues at compile time. Since his session was interactive in nature, some people shared their experience of unknowingly passing an Image object to a method which was expecting something else.

He then showed a series of examples, putting things in perspective. He reminded the audience that generics go beyond type-safe collections. He explained Action, Predicate and the Wintellect Power Collections. He really impressed the audience with his presentation skills.

.NET Serialization and XML - By Praseed Pai (me ! )
The session started by defining the meaning of the word serialization, with an illustration as given below.

Then he talked about binary serialization and XML serialization, and explained the advantage of XML serialization in a scenario where iPhone based applications interact with a .NET server infrastructure. The demos were centered around synthesizing types from an XML document using XSD serialization and the C# dynamic/ExpandoObject scenario.

After the event, the key members of K-MUG had a discussion at a nearby restaurant, and all of us were happy about another event well organized.

Community Tech Day @ Kochi by INETA and K-MUG (Part 3)

Security Auditing and Accountability By Manu Zachariah
After the lunch, it was Manu Zachariah's turn to give a very entertaining presentation on the Windows login mechanism and associated events. He started with NT LAN Manager and explained why that system is dated. He explained the login IDs associated with login events and exceptions.

Then he explained Active Directory based authentication and put Kerberos ticketing based authentication into perspective. For the next thirty minutes, he dissected Kerberos and explained the workflow and the actors in it.

NoSQL databases in .NET Apps by Shiju Varghese
He started with the state of the art in the relational database world and explained why the relational model is not suitable for all occasions. Social networking web sites are primary users of these databases. He did touch upon the scalability of relational databases. (They do have some bottlenecks in horizontal scalability.)

He then explained the NoSQL ecosystem and put emphasis on RavenDB, as it is built for the .NET platform. There was some source code to accompany his talk, and it showed how the stuff gets persisted to the DB. RavenDB can be embedded into an application as well.

It was afternoon tea time, once again!

Saturday, October 23, 2010

Community Tech Day @ Kochi by INETA and K-MUG (Part 2 )

TSQL Tips and Tricks by Vinod Kumar
After the tea, it was the turn of Vinod Kumar to give an entertaining session on TSQL techniques. He made his session interactive by asking questions, and it proved to the audience that they did not know much about something they thought they were familiar with. His suggestion that the SLA (Service Level Agreement) should determine the backup and restore policy was the key takeaway for me. He gave performance tips such as avoiding functions in TSQL statements and exploiting the indexes, to name a few.

He showed a novel technique to sort IP addresses based on the way hierarchyid values are stored in the SQL Server DB. He talked about the row constructor feature and associated caveats. When using a feature, weigh the pros and cons, or we might be in for some surprises (unless we know what we are doing).

I thought he should have been given more time to finish his presentation. But, like any other event, time is a limited resource.

Entity Framework by Shalvin PD
In which he demonstrated how to create a simple contact management system without writing a single line of SQL code. The session started with a discussion of the conceptual schema definition, the physical schema definition and the mapping layer definition. The peek behind the code-behind and the data type mapping between objects and relational data was useful.

He then showed how we can data-bind a query result, by writing a filter in LINQ, to a WPF combo box and grid control. The biggest takeaway from the session was that one can write applications without writing much SQL, with the help of Entity Framework and LINQ.

It was Lunch time ! I was into it already by then.

Community Tech Day @ Kochi by INETA and K-MUG (Part 1 )

INETA and K-MUG conducted a one-day developer event on 23rd October, 2010 @ IMA Hall, Kochi. Being active in the K-MUG circuit, I regularly participate and take sessions at these events. This time was not an exception.

The details of the event are available @

I started around 7.15 am from Aluva and got a low-floor bus to the Kaloor Stadium. Around 7.45 am, I reached the Kaloor stadium. I got hold of an auto rickshaw and moved towards the venue. On the way, I picked up Satheesh Kamath (an active K-MUG participant) and thanks to that the auto trip became free for me. (He paid for it!)

At the venue, I met Sreejumon (K-MUG president) who had arrived from Bangalore early that morning. After the usual pleasantries and chit-chat, we exchanged notes about the event.

Then Vinod Kumar arrived with a smile and a trophy (a team accompanied him from Bangalore to bring it!) to be presented to K-MUG for being the best developer user group from a tier-2 city.

Raj Chaudhuri ( ) arrived after that and I gave him company while he was smoking. Amal and Shalvin arrived at the scene and there was a slight drizzle as well. I had met Raj in person in the year 2001 at TechEd, when he was much younger. While talking to him, I felt he is still the same person I met a decade back.

As the clock started ticking, people began to flock in and we understood that despite the rain and a couple of recruitment drives, there would be people. I had invited a couple of my ILUG friends and one of them turned up with a classmate who incidentally lives near my location.

I met Sanil ( ) and I was surprised to learn from him that they are already using the Harbour project to run their application under Windows. I am a great proponent of that project as a mechanism to run MS-DOS xBase applications on multiple platforms.

The ball started rolling around 9.30 am (late by 15 minutes) with Sreejumon welcoming the guest speakers and the audience. Next, it was the turn of Vinod Kumar to give a short speech and present the award to the K-MUG volunteers. We received the award.

After the award, Vinod Kumar invited Praseed Pai (that is me!) to start the session.

Plunj into Mono by Praseed Pai
The talk centered around the speaker's exploration of the Mono tool chain. The session started by introducing Mono as a cross-platform development environment using C# as the primary programming language. After giving the URL of the project, he dived into some serious demos.

The first demo was about , an accounting package for education purposes. The code was written in the year 2007 using .NET 1.1 and contained close to 7,500+ lines of code. By retargeting the backend DB to SQLite, it was ported to Windows, Linux and Mac OS X. The demo was supposed to be done on a Mac OS X machine. Since he forgot to bring the projector cable (the Mac has a lot of such idiosyncrasies), it was shown on a Windows Vista machine. He has blogged about it @

The next demo was the compilation of the ( ) compiler using Mono; without a single change, the code compiled and ran correctly on the Mono runtime.

The session concluded by promoting a new meme: "Cross Platform C#".

Telerik productivity tools by Raj Chaudhuri
Without a microphone, he jumped into the middle of the audience and started his talk. He talked about the Mono project for five minutes and shared his experience with Mono. He did mention that his application runs correctly under the Mono runtime. I was thinking that WinForms mostly works fine but still has some problems. (Maybe developing without Visual Studio is a problem; deployment is a different matter.)

Then he started getting into the meat of his presentation. As usual, he started with an anecdote where he happened to consult for a friend ("Never make customers out of friends; you can make friends out of customers" - Raj Chaudhuri!) who asked for features available in desktop environments (like Excel and Outlook) in a web app. In this context, he explained a lot of tools from the company which can enhance developer productivity. His energy and style of presentation made this presentation remarkable. (IMHO, he pitched for a vendor without irritating the audience.) I was surprised to know that at least five companies in Kochi are already using these productivity tools.

He announced a special discount for these products for Community Tech Day event. While demonstrating the controls , he walked through the underlying JavaScript code and it gave people lot of insights into the working of these controls.

At a personal level, it was nice to see and listen to him after a decade. His e-book ( ) had helped me save time emulating some IL instructions for my compiler ( ).

Then , It was tea time !

Friday, October 22, 2010

For a person with a hammer, everything around is a nail

This old adage summarizes a phenomenon I have noticed happening consistently in my life. Take, for instance, a technology (framework) like Ruby on Rails. The other day, I wrote a hello world program using Ruby. There is a possibility that I might investigate it for fun. I have heard from a lot of people that RoR is how MVC should be!

If I manage to invest a week or so investigating RoR, then, like any other human being, the propensity to get a return on the investment (of time) will prompt me to see software engineering from a Ruby developer's point of view. Soon, this urge might lead me to work on a project involving RoR.

A friend of mine invested some time with Entity Framework. Now he sees ORM everywhere!

STUB Your Implementation before you begin…!

While engineering software, there is a tendency among software engineers to understand the sample code a bit deeper before incorporating the stuff into their application. This takes away valuable time from the project's schedule.

Ideally, stub the code (say, if you want to send an email from your application):

extern "C" bool DispatchEmail( char *sender, char *recipients,
                               char *body, char **filenamestoattach )
{
    // ------------- Paste your code here....
    return true; // stub: pretend the email went out
}


You can continue with the next feature and come back later to implement this. This can give some benefit in developing the application.

Thursday, October 21, 2010

An Educational Tool for "wannabe" Business Programmers !

I have started porting a C# 1.1 based accounting package to run on Windows, Linux and Mac OS X using Mono. I have already successfully ported the chart of accounts module (with the necessary SQLite DB wiring) and the entire package runs under Windows, Linux and Mac OS X on the Mono runtime.

This (Windows WinForms based) program has been used to introduce financial accounting concepts to at least 50+ programmers. There is an auxiliary asset by the title "Financial Accounting in 60 Minutes". This slides + Excel spreadsheet + workshop model presentation has been successfully presented at some companies as well.

I am planning to open source it soon! I will be showing a sneak preview tomorrow at the K-MUG Community Tech Days ( ).

Who said mono is just an academic curiosity ? I do not believe it.

SPAM stands for Simple Program for Accounting and Management !

I will be embedding ( ) into it to make this a scriptable accounting application. It runs with Mono on Windows, Linux and Mac OS X as well.

A Presentation and it's aftermath

I am planning to present the Mono tool chain to a group of developers and newbies @ Community Tech Day, to be held at Kochi, Kerala (on 23rd October, 2010).

Rather than showing some trivial use cases , I thought of showing something useful.

The first demo is about running my open source compiler ( ) under Linux and Mac OS X. The code base contains 3000+ lines of code and was never meant to be multi-platform. Still, the stuff worked correctly. (This convinced me of the viability of using Mono for serious development, when I tried it last February.)

Next, I tried to port an accounting package (7,500+ LOC) written by my wife to work with Mono on Windows, Linux and Mac OS X. The DB used was SQL Server 2005 Express Edition.

a) I re-wrote the DAL part ( some 50+ lines of code ) to run against SQLite DB.
b) Changed the Config to take SQLite connection string.

I compiled and ran it under Visual Studio and it worked fine.

Next, I went to Fedora 10 to do the same. After half an hour's effort, I could run the stuff there with actual data being retrieved from and sent to the DB. I used Mono 2.6.x to do the job.

On Mac OS X and Windows using Mono, I have hit some problems with the SQLite provider. There are two providers available, Mono.Data.SQLite.dll and System.Data.SQLite.dll, and both are showing problems. That has nothing to do with the Mono C# compiler. In the coming days, I will investigate the stuff and probably send a patch to the project.

I am able to compile and run all the example programs for my .NET Serialization and XML demo under Mono on Windows, Linux and Mac OS X. The provider on Fedora works fine. I am yet to fix the problems on Mac OS X and Windows.

It was a good learning experience! I liked the SQLite source code distribution technique. They merge all the source files into one file (the amalgamated source, in their parlance). This helped me compile the source very easily on Mac OS X and Windows. I relived the fond memories of my C/C++ programming days. At the job level, nowadays I prefer to work with C# or Java!

Wednesday, October 20, 2010

Mono 2.8 has support for C# 4.0 !

I have got Mono 2.6.7 on my desktop, my laptop and my office Mac OS X laptop. For a session which I am supposed to take, I tried to compile some VS 2010 programs using Mono. I could not compile a program which was using a dynamic variable.

I installed the Mono 2.8 toolchain from the Mono site and I could easily compile my stuff.

Mono has got three compiler drivers now

a) mcs - Mono C# command line compiler ( .NET 1.x )

b) gmcs - has got support for generics ( .NET 2.x and above )

c) dmcs - has got support for dynamic ( .NET 4.x ! )

Tuesday, October 19, 2010

Plunj (Plunge!) into Mono

I am giving a talk at K-MUG's Community Tech Day program at IMA Hall (23rd October 2010), Kochi on the Mono tool chain.

It is a silent revolution which is happening around us in the software engineering world. The Mono project has come of age and is filling a void in the Linux platform: the lack of a good desktop application development toolchain and language there.

Mono has implemented about 90% of the .NET Framework libraries. They are only a few months behind Microsoft in their framework class implementation. With some discipline, we can develop applications which run under Microsoft's .NET, Mono on Windows, Mono on Mac OS X and Mono on GNU/Linux.

The era of "Cross Platform C#" has arrived. IMHO, people who have invested their time and effort in learning .NET and C# should leverage their expertise to run their applications under Windows, Linux and Mac OS X.

Since the binaries are compatible, you can develop under Windows (using Visual Studio) and deploy under Windows, Linux and Mac OS X.

The session draws on the author's experience in "flirting" with the Mono tool chain. The session will cover SQLite's DB provider, to help you migrate your database to a small-footprint database system. (This is essential in scenarios where the DB is hosted on a non-Windows platform.) I have used the SQLite provider to write a DB synchronization mechanism for an iPad application.

For more, drop in @ the venue !

The details and registration formalities are available @

Monday, October 18, 2010

SAX Parser - It is not for newbies !

There are two dominant standards in XML parsing strategy. One is the W3C's DOM API and the other is the SAX API. Small-footprint memory devices provide a SAX API. This is understandable, as it saves memory, which is important for such devices.

Using a SAX parser with a reference-counted object model like that of Objective C can frustrate young and not-so-young (read: experienced) programmers. I wasted some hours fixing such a bug.

I recommend that every company write a DOM adapter for small documents, or write a C-style callback to simplify the stuff. The Builder pattern is useful in this scenario. That is especially true when you want to synthesize a type from an XML document.

Sunday, October 17, 2010

Application Software development - Why a Proprietary Platform is "compelling" ?

I have found out that, in an institutional setting, every decision has got a risk point and there should be a plan to mitigate it. (At least in principle!)

When an organization goes about selecting the platforms and tool chains to base their application on, the decision makers require an exit clause to shift the blame onto a known software vendor. Based on the feedback I have got, most managers say that in the case of proprietary platforms the technical support is there and the blame gets shifted to these companies. For every manager, his skin is more precious than the organization's savings.

The argument for FOSS stems from the presence of libraries to do most of the stuff which a system is supposed to do. It is true in most cases. But the savings one might get from moving onto these systems do not justify the risk for the decision makers. For companies like IBM and Oracle, using Linux and FOSS libraries is a way to leverage existing stuff to get a head start. The presence of "demand side learning" and economics justify this.

Take a domain oriented company. The technical personnel won't have the in-depth knowledge to manipulate the stacks of libraries, and they make a mess out of the stuff.

In the diagram above, the green line is the logical scenario: the information flows from the top of the stack to the bottom in a linear fashion. In the case of system software consumers (domain oriented companies), if they choose FOSS libraries they will end up taking the black path.

This is because each of the layers has got its own versioning cycle and the bugs associated with it.

These domain oriented companies might get the red path if they piggyback their software on a proprietary platform.

Protocol clients - It either works or does not work

When you write software which is supposed to work in a multi-vendor ecosystem, compliance with standard protocol formats is mandatory (de facto or de jure). Since most companies in our part of the world do not venture into such software development, it is hard to convince the stakeholders that the (mandated) requirements cannot be watered down for a faster development time. You can't dilute a protocol implementation.

Most of the protocols meant for document exchange have an elaborate web of standards compliance to even reach the primary document. I call these "Domino Protocols". We need to navigate the protocol path like a domino fall. Try to skip one step, and you end up with gibberish.

Protocol software => "It either works ! or does not work ! " ( As simple as that ! )


I am a great fan of procedural programming in the case of database access. In my .NET programs, I use a class by the name SQLAccess to simplify access to the SQL Server DB.

Today, I ported the stuff to work with Mono and the SQLite database.

// Test.cs
// A Simple SQLAccess class for the SQLite client....!
// Written by Praseed Pai K.T.
// gmcs Test.cs -r:System.Data.dll -r:Mono.Data.SqliteClient.dll
// mono Test.exe

using System;
using System.Collections.Generic;
using System.Text;
using System.Data;
using Mono.Data.SqliteClient;

namespace DbLayer
{
    public class SQLAccess
    {
        private SqliteConnection _con = null;
        private SqliteCommand _cmd = null;
        string _constr;

        public SQLAccess(string constr)
        {
            _constr = constr;
        }

        public DataSet Execute(string SQL)
        {
            _con = new SqliteConnection(_constr);
            _con.Open();
            _cmd = new SqliteCommand(SQL, _con);
            SqliteDataAdapter da = new SqliteDataAdapter(_cmd);
            DataSet ds = new DataSet();
            da.Fill(ds);
            _con.Close();
            _con = null;
            return ds;
        }

        public IDataReader ExecuteQuery(string SQL)
        {
            _con = new SqliteConnection(_constr);
            _con.Open();
            _cmd = new SqliteCommand(SQL, _con);
            IDataReader rs = _cmd.ExecuteReader();
            return rs;
        }

        public bool Close()
        {
            if (_con != null)
                _con.Close();
            return true;
        }

        public bool ExecuteNonQuery(string SQL)
        {
            try
            {
                _con = new SqliteConnection(_constr);
                _con.Open();
                _cmd = new SqliteCommand(SQL, _con);
                _cmd.ExecuteNonQuery();
                _con.Close();
                _con = null;
                return true;
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message);
                return false;
            }
        }
    }

    public class Tester
    {
        public static void Main(string[] args)
        {
            string connectionString = "URI=file:Sqlite.db";
            SQLAccess sq = new SQLAccess(connectionString);
            DataSet ds = sq.Execute("SELECT * from test");
            DataTable dt = ds.Tables[0];
            int cnt = dt.Rows.Count;
            foreach (DataRow drs in dt.Rows)
            {
                String s = drs["ONE"] + "|" + drs["TWO"];
                Console.WriteLine(s);
            }

            IDataReader dr = sq.ExecuteQuery("SELECT * from test");
            while (dr.Read())
            {
                string FirstName = dr.GetString(0);
                string LastName = dr.GetString(1);
                Console.WriteLine("Name: " + FirstName + " " + LastName);
            }
            sq.Close();
        } // public static void Main...
    } // class Tester
} // namespace DbLayer

Here is my console log...

[sandhya@localhost pai]$ gmcs Test.cs -r:System.Data.dll -r:Mono.Data.SqliteClient.dll
[sandhya@localhost pai]$ mono Test.exe
Name: hell narak
Name: heaven swarg
Name: downunder pathal
[sandhya@localhost pai]$ 

Saturday, October 16, 2010

Tyranny of small decisions - we are what we are because of it !

We always need to decide a future course of action based on the current context, the available information and physical constraints (and any number of other factors).

Since a person can be in multiple contexts, we need to take the decision we consider optimal for each context. This series of locally optimal decisions can create a global extremum of the wrong sort (a minimum when we are looking for a maximum, and vice versa).

This has been studied by game theorists. See

System re-engineering - A Philosophical approach (it might work !)

Every programmer or system designer is confronted with a scenario where he is responsible for deciphering an existing system to tweak it to suit a business requirement or two.

How does one approach it? Even if the designer or re-engineer has got good domain expertise, the footprint of the former designers will be left in the system. Unlocking the fundamental assumptions of the system in question (at the time of inception) is key to tweaking its parameters.

To do that, one should understand the cognitive model of the previous designer. Since most systems evolve over time in a Darwinian sense, we need to understand the forces which have influenced the design over a period of time.

We need to attack the system from both ends. At a bottom-up level, look at the information structures (data structures, tables, algorithms etc.) to understand the implementation model. From a top-down perspective, understand the mechanisms which are in place to enable the bottom-up implementation of the system. (We are assuming that the domain expertise is equivalent!)

Next, understand the stakeholders (I mean, the players in the system), analyze the interactions which can occur between the players and remove the improbable. Try to understand the composition of the previous team, and also analyze the general mindset of programmers of the time when this system was in its inception stage.

A "manual" static analysis of the code base needs to be done in order to gain further insight into the assumptions made about the use case, the business context in which it runs, etc.

The code can be classified into

                   a) Platform Agnostic code
                   b) Minimal Platform dependency
                   c) Coupled with Platforms

If the system was architected properly by the previous developers, by this stage we will be in a position to make some incremental changes to the system.

The problem is that, due to code entropy, the layers collapse and tactical tweaks make things much more complex. Then the tweak-and-pray model of development is the only option.

Have a team of functional testers at your disposal. This model will work, as we have got plenty of messy systems (the educational system, the fiscal system, public administration systems) which are made to work, albeit with blemishes.

Every system works because, behaviorally, it seems to work most of the time.

An unusual book

I happened to go to a post dated 2005 and saw a mention of Ruby on Rails. From that site, I got a link to a well-written book.

Please feel free to check it out @

Program Efficiency vs Programmer Efficiency

Objective C and C++ are extensions to a base language. The former owes its origin to the Smalltalk language and the latter to the Simula system.

The reason for basing both languages on top of C is different.

a) Bjarne Stroustrup (developer of C++) did not want to invent a new religion (a new language from scratch) to implement Simula-like classes and virtual functions.

b) Brad Cox (developer of Objective C) wanted to add Smalltalk-like features to the C language as a pure extension.

The first one is statically (strongly) typed and the latter is dynamically typed.

Nowadays, we can mix C, C++ and Objective C in a single source file or a single executable (by linking the object files) using the Apple and GNU extensions.

IMHO , Objective C is good for Application frameworks where programmer productivity is more important. C++ is good for performance. ( ie Program efficiency )

Try to write a number-crunching subroutine or a matrix computation package in Objective C, and you will understand what I mean.

Friday, October 15, 2010

Wife Song !

Wife , Wife , Wife , every where it's same
she will nag you for your words
Think twice , before you say
Say to her , what she wants to hear
Curb your instincts, to mollify her
Man wants a baby , she wants a hubby
Even Stalin , feared his wife
Lose, lose to make the marriage win
Once you are hers , she will look after you
In the process , your spirit is gone !

Best way to tackle Snobs is to Yield !

Nowadays, in every sphere there are snobs. We are also part of the process. I have found a wonderful way to tackle such people.

An old man in my locality is interested in talking to me because I listen to his stories about the heroics of his two sons. As the days went by, he began to boast too much.

One day he irritated me so much in front of my friends at a marriage ceremony, and the following happened:

Old man => " Now my son is a very busy person and he does not have time to come home "
Me => " Is he a slave to work for all the seven days of a week ?"
Old man => " He is a business development manager for the whole of north american region ! "
Me => " If a man is working more than five days a week in an institutional setting, either he is an inefficient guy or an asshole who ends up doing the work of others! "

People around me laughed! This irritated the man and he began to abuse some people around us.

Me => "Uncle , You want to prove to me that your sons are smarter than me and by virtue of giving
birth to them , you are smarter than my father , right ?"

He got stunned

Me =>"I agree to that ! "

Everyone laughed once again, and hearsay is that he has never tried to snub others since.

A useless kind of polymorphism in C/C++

The C language does not support function overloading. Today, while reading a thread on WinMain vs main, I happened to see a post where someone mentioned variable number of parameters support for main. (I do know that the printf family of functions has variable parameter support using the va_arg/va_list/va_end stuff.)

I think main is treated by most linkers the same way; they look for

main(int , char **)  ( argc/argv scenario )
main(int , char **, char ** );  ( argc/argv/envp )

That post also claimed that you can pass any number of parameters in the __cdecl calling convention mode. Even though one can pass any number of parameters, without some kind of type indicator one cannot process them.


int __cdecl abs( ... ) {
    return 1;
}

int main( )
{
    int c = abs(21, 3, 4);
    int d = abs();
    int n = abs("ssfsdffd");
    return 0;
}
How do I process the data inside abs without adding a format string in front, or making an assumption that the arguments are of specific types (in a certain order)? The above program compiles correctly in Visual C++ and executes as well.
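For contrast, the conventional fix can be sketched in C: lead the variable arguments with a count (or a format string, as printf does) so the callee knows how many values to pull and of what type. The name sum_ints is mine, not from the post I read.

```c
#include <stdarg.h>

/* A variadic function that can actually process its arguments:
   the first parameter tells it how many ints follow. */
static int sum_ints(int count, ...)
{
    va_list ap;
    int total = 0;

    va_start(ap, count);
    for (int i = 0; i < count; i++)
        total += va_arg(ap, int);   /* the caller promised these are ints */
    va_end(ap);

    return total;
}
```

With the count in front, sum_ints(3, 21, 3, 4) knows to pull exactly three ints; without that indicator, as the post above shows, the callee has no portable way to find out what it was given.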

What is the need for WinMain? Why can't Microsoft use main ?

While searching the web, I have seen the above question. Then a series of posts speculating about the stuff starts, and I have not seen it reach any conclusion.

Only some engineer at M$ (who was part of the decision making process) will be able to explain it. It might be the case that he just suggested to the management that they should do this, and it got approved as they could sell an SDK with a sexier name (the Windows SDK!).

Maybe M$ was interested in having their name stamped on the venerable C/C++ entry point. When M$ came up with WinMain, there was no ANSI C or ANSI C++ standard. They might have thought that the standards committee would consider their entry point as well.

The real reason might have been

Windows started its life under MS-DOS. To make a program run in the co-operative multitasking environment as an application with windows, buttons and menus, they needed a different executable layout. The extension of the binary was .exe in the case of both DOS and Windows programs.

Every Windows executable has got a DOS stub at the start. If the Windows environment is not started (using WIN), the DOS executable part will run ("This program requires Microsoft Windows" stuff).

These special programs need additional parameters like the current instance handle (HINSTANCE - a void *), the previous instance handle (hPrevInstance), a char * command line (an LPSTR, not converted to the argc/argv stuff) and how the window should appear during startup (ShowWindow(nCmdShow)) etc.

If M$ had tried to make main work with both DOS and Windows, they would have needed to make the programmer call a series of API functions to get environment-specific stuff like handles.

This might be the reason why there is WinMain.

Installing SQLite under Fedora 10 and using it from Mono/C#

Currently, I am using SQLite in my office as part of an iPad project. Out of curiosity, I installed it on my Fedora 10 machine.

@the terminal

yum install sqlite

This installed sqlite db engine and sqlite development package.

I created a database by the name Sqlite.db and created a table.

@ the terminal

sqlite3 Sqlite.db

create table test( one varchar(100) , two varchar(100) );

I populated some data in the table; here is a log of my session.

[root@localhost sqlmono]# sqlite3 Sqlite.db
SQLite version 3.5.9
Enter ".help" for instructions
sqlite> select * from test;
sqlite> delete from test where one = 'pathal';
sqlite> select * from test;
sqlite> insert into test values ('downunder' , 'pathal' );
sqlite> select * from test;

I modified a program from the Mono web site to test access from Mono/C#.

using System;
using System.Data;
using Mono.Data.SqliteClient;

public class Test
{
    public static void Main(string[] args)
    {
        string connectionString = "URI=file:Sqlite.db";
        IDbConnection dbcon;
        dbcon = (IDbConnection) new SqliteConnection(connectionString);
        dbcon.Open();
        IDbCommand dbcmd = dbcon.CreateCommand();
        string sql =
            "SELECT one, two " +
            "FROM test";
        dbcmd.CommandText = sql;

        IDataReader reader = dbcmd.ExecuteReader();

        while (reader.Read()) {
            string FirstName = reader.GetString(0);
            string LastName = reader.GetString(1);
            Console.WriteLine("Name: " +
                FirstName + " " + LastName);
        }
        // clean up
        reader.Close();
        reader = null;
        dbcmd.Dispose();
        dbcmd = null;
        dbcon.Close();
        dbcon = null;
    }
}
I compiled the stuff at the command line and ran it...!

[root@localhost sqlmono]# gmcs test.cs -r:System.Data.dll -r:Mono.Data.SqliteClient.dll
[root@localhost sqlmono]# mono test.exe
Name: hell narak
Name: heaven swarg
Name: downunder pathal
[root@localhost sqlmono]# 
[root@localhost sqlmono]# 

Hope this helps !

Highway men of the 1980s

In the mid 80s, there was a group of men visible all across Kerala who would always be on their motorbikes. Those guys traveled a lot on their vehicles.

Since they traveled on the highway, I used to call them "Highway Men".

Today, I happened to ask a colleague of mine whether he is a highwayman or not. (He loves to travel great distances on his motorbike.)

Thursday, October 14, 2010

Offline mode vs Connected mode

Nowadays, various mobile devices are being programmed with enabling applications. For business reasons, the stakeholders want the system to work only in connected mode. That is a functional requirement, and the system architecture should not get influenced by the "connected mode" expectation.

Always design a system with an offline-mode architecture. Some day you might need it. The architectural complexity is the same in both mechanisms. A connected-mode system needs to queue transactions in some way before sending them across the wire. Why can't you make that queue persistent?
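A minimal sketch of the idea in C, under my own naming (queue_push, queue_drain and the sender callback are illustrative, not from any real library): the connected-mode "send buffer" is just an append-only file, so the write path is identical whether the wire is up or down.

```c
#include <stdio.h>
#include <string.h>

/* Enqueue one transaction record by appending a line to the queue file.
   Going offline costs nothing: the record simply waits on disk. */
static int queue_push(const char *path, const char *txn)
{
    FILE *f = fopen(path, "a");
    if (!f) return -1;
    fprintf(f, "%s\n", txn);
    fclose(f);
    return 0;
}

/* Drain the queue: hand each record to the sender callback, then
   truncate the file once everything has gone across the wire. */
static int queue_drain(const char *path,
                       void (*send)(const char *txn, void *user), void *user)
{
    char line[256];
    FILE *f = fopen(path, "r");
    int n = 0;
    if (!f) return 0;                 /* no file yet: empty queue */
    while (fgets(line, sizeof line, f)) {
        line[strcspn(line, "\n")] = '\0';
        send(line, user);
        n++;
    }
    fclose(f);
    f = fopen(path, "w");             /* truncate: queue is empty again */
    if (f) fclose(f);
    return n;
}

/* Example sender: just counts deliveries. */
static void count_sent(const char *txn, void *user)
{
    (void)txn;
    (*(int *)user)++;
}
```

A real system would want atomic renames and acknowledgements before truncating, but the architectural point stands: connected mode is merely offline mode with a very short queue.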

Architecture and Testing - a case of kissing cousins

These days , I spend some time reading the book Beautiful Architecture. It gives a lot of insight into the challenges faced in various system scenarios.

Once we have a tentative architecture ( architecture evolves ! ) , we need to engineer it. The engineered system needs to go through some kind of verification process before we have a usable system.

The verification process depends on the family of system we are writing. That means , for an architecture there is a corresponding verification system ( a test framework ! ). I have started believing that an architect
should design systems with testability in mind.

Architecture and testing seem to be close cousins who parted ways in the name of division of labour. I have decided to buy the Beautiful Testing book today.

Too often , too much emphasis is given to functional testing and people forget the system testing part. IMHO , if a system is reasonably architected and tested , functional conformance can be achieved much faster.

void * - The most flexible data type !

Objective-C has got a complex mechanism to invoke a function through a reference. The concepts of selectors and delegates come into play , and they complicate things.

Apple's NSXMLParser has got a SAX-like event driven API , which adds more complexity. This matters when we write business systems where data is king. Too often , the use case is simply the creation of a dictionary ( NSDictionary * ) from an XML file.

We created a synchronous C-style function to hide the complexities of XML parsing and dictionary creation , using another C function as a callback. The callback was passed on to the parsing class as a void * . It saved
us a lot of time !
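A hedged sketch of the pattern in plain C : the callback travels through the parser as a void * and is cast back before being invoked. To keep the sketch self contained , the "XML" is faked as a key=value string ; the names ( parse_into_dictionary , kv_callback ) are my own , not the actual project's API.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical callback signature: receives one key/value pair
   extracted from the input plus a user supplied context pointer. */
typedef void (*kv_callback)( const char *key , const char *value , void *context );

/* A stand-in for the synchronous parse routine. The callback pointer
   arrives as an opaque void * and is cast back to kv_callback before
   each invocation -- the same trick we used with the XML parsing class. */
void parse_into_dictionary( const char *data , void *callback , void *context )
{
    char buffer[256];
    strncpy( buffer , data , sizeof(buffer) - 1 );
    buffer[sizeof(buffer) - 1] = '\0';

    char *pair = strtok( buffer , ";" );
    while ( pair != NULL ) {
        char *eq = strchr( pair , '=' );
        if ( eq != NULL ) {
            *eq = '\0';
            /* cast the opaque pointer back to the callback type */
            ((kv_callback)callback)( pair , eq + 1 , context );
        }
        pair = strtok( NULL , ";" );
    }
}

/* A sample callback that just counts the pairs it receives. */
void count_pairs( const char *key , const char *value , void *context )
{
    (void)key; (void)value;
    ++*(int *)context;
}
```

Strictly speaking , ISO C does not guarantee that a function pointer round-trips through void * , but the cast works on every mainstream compiler and is what made the API so flexible for us.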

Dynamic Languages - I think there are use cases

Being a programmer schooled in statically typed languages , I have always viewed dynamic languages with some skepticism. I was not impressed when C# added support for dynamic in C# 4.0.

The worry was that it would start an era of "sloppy coding". I still believe that , wherever possible , you should apply static typing. But when we are writing business integration systems , dynamic typing is a blessing , as we can focus
on the task at hand.

Dynamic typing is not the absence of typing. It is just a case of deferring the type decision to runtime. When you want to consume a return value from an external system , it can be of great help. ( I am writing this after seeing my first Groovy system which is in production ! )

Groovy , Scala and Ruby

Recently , I have been seeing excitement around me regarding the above mentioned programming languages. As always , I show some prejudice against a new programming tool chain when it arrives on the scene.

I have heard that RoR's ( Ruby on Rails ) idiom of Convention over Configuration has been carried around and incorporated into various application frameworks. MVC frameworks are one example where this has been adopted.

A colleague of mine who recently joined my team talked about Groovy today. After a couple of minutes , I heard terms like closures and fast development cycles from him. He also told me about an application he delivered after joining our office , with Grails as the backend and Adobe Flex as the front end.

This evening , he showed me the application he developed and we had a discussion on how he started working with Groovy. The previous company he worked for was struggling with deadlines while working in Java. Groovy was suggested as an alternative platform for fast development by
a person from the United States. He claims a great boost in productivity after adopting the Groovy and Grails framework.

It was an eye opener for me ... !

Wednesday, October 13, 2010

Ubuntu 9 vs Debian 10

I was trying to help a friend of mine set up the Qt libraries for development. He developed the stuff on Fedora 12 and wants to compile and run the same stuff on Ubuntu and Debian.

I called a couple of my Linux aficionado friends and got the reply that in Ubuntu xx it is installed here , and in Debian you can find it somewhere else....

I think the era of Linux balkanization has arrived ....!

Tuesday, October 12, 2010

English has got a Germanic Origin !

English is a West Germanic language that arose in the Anglo-Saxon kingdoms of England and spread into what was to become south-east Scotland under the influence of the Anglian medieval kingdom of Northumbria.

Source :-

A Picture is worth 1K words

I happened to search for "Programming Techniques" on Google , and the first link I hit was

From this document , I found a diagram which summarizes the steps in creating an executable.

To re-quote a great mathematician of the past :

"It is impossible for any person to explain how the C programming language works. I have a truly marvelous diagram for which my bandwidth is too low to upload."

Source Quote :

“It is impossible for any number which is a power greater than the second to be written as a sum of two like powers. I have a truly marvelous demonstration of this proposition which this margin is too narrow to contain.” - Pierre de Fermat  ( x^n + y^n <> z^n , for n > 2 )

Go to Page 14 , Figure 2-2   ( You have to do some homework to see this truth ! )

A quote which I stumbled upon.

I happened to stumble upon a page today which contained the following quote

"In a world with an increasing number of platform and language bigots, it's important to keep the ultimate goal of actually fixing the problem in view and that's what I do."

Source :-

He was one of the chief organizers of PyCon India , 2009/2010

Google Technology User Group meeting @ UST Global , Trivandrum

You can read about it @

Why do people want to die fast ?

These days , I see a lot of talk about pension plans and retirement plans. Even people in their early twenties are talking about them. I wonder why they want to die fast !

Monday, October 11, 2010

Good , Bad and the Smart

These days , there is pressure to be both a "good" person and a "smart" person. These are contradictory in nature. In a world which is getting poorer day by day , the very survival of a person depends upon balancing the quest to be good with the "crimes" one needs to commit to make a living.

If you do not commit "crimes" , you should associate with someone who has committed them. That is what happens in a corporate organization. Most employees can claim moral one-upmanship by donating to charities , working in NGOs which claim to do social work , being religious , etc. The money to do these
kinds of things might be coming from a dubious source !

After the Industrial Revolution , there is a separation of concerns regarding our survival. There is physiological survival ( a concern for man up until that time ) and social survival ( these days , this dictates everything ). For social survival , there is a necessity to commit some "bad" act to make a decent living , or to associate with someone who has the guts to do it for you.

I do not think a person can be both good and smart. IMHO , it is good to be smart.

Sunday, October 10, 2010

Parsing XML - Reflective Languages are better

Just now , I used the libxml parser to parse a couple of XML documents. What can be achieved in a couple of lines of .NET has to be coded by hand in the C/C++ programming languages. This is true regardless of the XML parsing strategy. ( With DOM and SAX the effort is the same )

Reflective platforms like .NET and Java do this kind of thing without much of a problem. With C# 4 , you can dynamically create an object from XML elements using ExpandoObject.

Saturday, October 09, 2010

Who said TV is bad ?

My son is fond of Disney XD , Cartoon Network , Pogo and Chutti TV , to name a few. These channels show
the same video footage with English , Tamil and Hindi audio.

Of late , my son has been uttering Hindi like a native speaker , and he mixes English and Hindi along with Tamil to communicate with me. When I probed further , it turned out he is fond of seeing the same show in multiple languages.

To give an example

He => "Meim Khatron Ka Khiladi hoom...." ( "I am a daredevil...." )

Me => " Kya .....! " ( " What .....! " )

The reply was "Aap hey mera Khatra ! " ( "You are my danger ! " )

This can usher in a new era of kids with excellent linguistic skills.

E++ is as important as C++ for programmers

A friend of mine ( Ashok K Shenoy ) created the meme E++ in the mid nineties to sum up the lack of English competency among the programmers of the day in Kerala.

E++ is not yet another programming language. It is an attempt to elevate the English language to the level of the C++ programming language. In 1995 , people were crazy about learning C/C++ , and people who failed at that
became Visual Basic , PowerBuilder and Delphi programmers ( Java had not arrived on the scene yet ; it arrived in late 1995 ). The guys who failed at the above decided that Windows was weird and moved on to become GNU/Linux advocates !

I will give an anecdote (it is a real story !) here.

I was very happy after reading a paragraph from a book with the title Hard Disk Secrets.

The paragraph goes as follows :

"If you have got incredible drive , you will attempt anything which fascinates you and soon
you realize that you can do almost all the things"

I was so enthused by the statement that I showed it to a friend ( a love-hate one ! ) , and I did not
get the kind of response I was expecting. This prompted me to ask him what he felt.

His reply , translated into English , was as follows : "Who does not know that if you have got a better
drive , you can store more data ?". I was stunned , and in our friend circle to this day , when we feel
that someone's English is pathetic , the statement "his drive seems to have a problem" will be uttered
by another in the group.

Due to globalization and the emphasis on English speaking , things have improved a lot. Still , English
comprehension is not given much importance. This inhibits the communication of information.

The other day , there was a discussion about English proficiency and a question popped up which went as follows :

"In Kerala , English is one of the key factors in success. What about Britain and the US ?"

My reply to that was "the content of your speech".

IMHO , reading books is one of the key factors which enhance your competency to use the appropriate phrases and idioms to suit the occasion. As always , content is king.

Community Tech Day @ Kochi

The Kerala Microsoft User Group ( K-MUG ) , in collaboration with Microsoft Corporation , is conducting a developer conference @ Kochi on 23rd October , 2010. The sessions include XML serialization , Transact-SQL ( T-SQL ) tips and tricks , web application security , NoSQL databases , and the Entity Framework , to name a few.

The sessions will focus on the latest Visual Studio release. If you are a .NET developer , do not miss this event.

Venue : IMA Hall , Kaloor , Kochi ( Behind JN stadium )
Date : 23rd October , 2010

The details of the event , including the ( free ! ) registration , are available

realloc is not a function , it is a memory manager

I was discussing the C/C++ memory management functions with a friend of mine today , and to clarify how they work , I wrote a program. This might be of educational value.

// rt.cpp
// A simple Visual C/C++ program to demonstrate that
// the realloc function can be used as a kernel
// for implementing malloc , calloc , free etc.
// The program is written for demonstration purposes
// only
// Written by Praseed Pai K.T.
// cl rt.cpp
// rt

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

//  This is a wrapper over the realloc function...
void *t_realloc( void *ptr , size_t size ) {
   return realloc(ptr,size);
}

// malloc implemented using realloc
void *t_malloc( size_t size ) {
   return t_realloc( 0 , size );
}

// free implemented using realloc
void *t_free( void *ptr ) {
    return t_realloc(ptr,0);
}

// calloc implemented using t_malloc ( in turn calls t_realloc );
// calloc must also zero the block
void *t_calloc( size_t elem , size_t elemsize ) {
   char *ptr = (char *)t_malloc(elem*elemsize);
   if ( ptr )
       memset( ptr , 0 , elem*elemsize );
   return ptr;
}

//  The Entry point ...
int main( int argc , char **argv )
{
     long *ptr = (long *)t_calloc( 5 , sizeof(long) );
     long i=0;
     long *tptr = ptr;

     if ( tptr == 0 )
            return -1;

     while ( i<5 )
        *tptr++ = i++;

     for( int j=0; j<5; ++j )
        printf("%ld\n",ptr[j] );

     ptr = (long *)t_realloc( ptr , 10*sizeof(long) );
     for( int j=0; j<10; ++j )
        ptr[j] = j;

     for( int j=0; j<10; ++j )
        printf("%ld\n",ptr[j] );

     t_free(ptr);
     return 0;
}

Friday, October 08, 2010

A C/C++ Programming riddle !

An intern who works with me on a project came today with a simple C program written using GCC.

The program was retrieved from  ( it was posted by Anand in the BarCamp Kerala Google group )

I am attaching the question here for ease of reading :

A small technical doubt in C programming

Reply if anyone knows.

This is a small program ; compile it with gcc and see the result :

#include <stdio.h>
#include <stdlib.h>

int main()
{
    float a = 0;
    printf("\n %f \n", a * -1);
    printf("\n %f \n", abs(a * -1));
    return 0;
}

I got the result as follows



Why is it so?

I was puzzled by a*-1 producing -0.0000 ! I quickly wrote a small Visual C++ program to test it , and the issue did not come up there.

I booted an Ubuntu machine and wrote the code. Here , it did behave as described in the above post :

#include <stdio.h>
#include <math.h>
#include <stdlib.h>

int main(int argc , char **argv)
{
   float a = 0;
   printf("\n %f \n", a * -1);
   printf("\n %f \n", abs(a * -1));
   return 0;
}

[sandhya@localhost puzzle]$ ./a.out
[sandhya@localhost puzzle]$ 

I wrote another C/C++ program to dump the IEEE floating point bit pattern for the float value. ( This technique will work only on a 32-bit system. )

#include <stdio.h>
#include <math.h>
#include <stdlib.h>

float myabs( float a )
{
   if ( a < 0.0 )
       return -a;
   return a;
}

int main( int argc , char **argv )
{
  volatile float a = 0;
  printf("\n %f\n",a*-1.0 );
  printf("\n %f \n",myabs(a*-1.0));
  float c = myabs(a*-1.0 );
  printf("\n %f \n",c);

  //  The lines below get the bit representation of an IEEE 754 single
  //  precision floating point value:
  //  take the address of c => &c ( float * );
  //  cast it to void * from float *;
  //  then cast it into an unsigned long * ( we want every bit ).
  unsigned long *p = (unsigned long *) ((void *)&c);

  // Get the long value pointed to by p
  unsigned long t = *p;

  // Iterate over the bit pattern, printing 1 for the presence and 0 for
  // the absence of each bit. The Intel CPU stores data in little endian format.
  while ( t > 0 ) {
    printf("%c",t&1 ? '1' : '0' );
    t = t >> 1;
  }
  return 0;
}

Here is the output I got

[sandhya@localhost puzzle]$ g++ rsa.cpp
[sandhya@localhost puzzle]$ ./a.out



[sandhya@localhost puzzle]$ 

The above output clearly shows that the sign bit is 1 , and that is the reason why we get -0.000. ( Read the bits in reverse order. )
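As the program's own comment notes , the unsigned long * cast is only safe on a 32-bit system. A more portable sketch of the same bit dump copies the float into a fixed-width integer with memcpy ( the function names here are my own ) :

```c
#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Copy the 32 bits of an IEEE 754 single precision float into a
   uint32_t. Unlike the unsigned long * cast, this stays correct on
   64-bit systems and avoids strict aliasing problems. */
uint32_t float_bits( float f )
{
    uint32_t bits;
    memcpy( &bits , &f , sizeof(bits) );
    return bits;
}

/* Print the bits most significant first: sign, exponent, mantissa. */
void print_float_bits( float f )
{
    uint32_t bits = float_bits( f );
    for ( int i = 31 ; i >= 0 ; --i )
        printf( "%c" , ( bits >> i ) & 1 ? '1' : '0' );
    printf( "\n" );
}
```

For -0.0f this prints a 1 followed by thirty one 0s , i.e. the pattern 0x80000000 : sign bit set , magnitude zero.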

At first , I did not have a clue why this happened. I wrote another C/C++ program to examine the assembler output generated for a*-1 :

#include <stdio.h>

int main( int argc , char **argv )
{
        float a = 0;
        printf("\n %f \n", a * -1);
        return 0;
}

After compiling the stuff using the -c flag ( it only generates an object file ) , I disassembled it using objdump :

[sandhya@localhost puzzle]$ gcc -c di.c
[sandhya@localhost puzzle]$ objdump -S di.o

di.o:     file format elf32-i386

Disassembly of section .text:

00000000 <main>:
   0:    8d 4c 24 04              lea    0x4(%esp),%ecx
   4:    83 e4 f0                 and    $0xfffffff0,%esp
   7:    ff 71 fc                 pushl  -0x4(%ecx)
   a:    55                       push   %ebp
   b:    89 e5                    mov    %esp,%ebp
   d:    51                       push   %ecx
   e:    83 ec 24                 sub    $0x24,%esp
  11:    b8 00 00 00 00           mov    $0x0,%eax
  16:    89 45 f8                 mov    %eax,-0x8(%ebp)
  19:    d9 45 f8                 flds   -0x8(%ebp)
  1c:    d9 e0                    fchs   
  1e:    dd 5c 24 04              fstpl  0x4(%esp)
  22:    c7 04 24 00 00 00 00     movl   $0x0,(%esp)
  29:    e8 fc ff ff ff           call   2a <main+0x2a>
  2e:    83 c4 24                 add    $0x24,%esp
  31:    59                       pop    %ecx
  32:    5d                       pop    %ebp
  33:    8d 61 fc                 lea    -0x4(%ecx),%esp
  36:    c3                       ret    
[sandhya@localhost puzzle]$ 

The flds / fchs instructions above are responsible for the a*-1 = -0.0 behaviour.

a*-1 is transformed into

load a
change sign

So the magnitude was zero , and the instruction simply set the sign bit. I was focusing on the a*-1 aspect ; the abs(a*-1) part was answered by Jayakrishnan ( I am pasting his reply from the BarCamp Kerala site ) :

The code is wrong.

He is calling the integer abs() function on a floating point number. He should be calling fabs() unless abs() has been overridden for double/float.

Since printf() cannot validate its parameters, the second printf() assumes the value is a float/double and tries to print it out as such.

However, and here I am guessing... since the first printf() call actually prints a double i.e. -0.0 i.e. 8 bytes, and since the second printf() call takes an int as a parameter, only 4 bytes of the int and 4 bytes of the earlier double parameter are used to create a double which is printed out.

I assume that the 4 bytes int overwrites the portion of the double that does the least damage (no damage in this case being 0 itself). However the sign and the other significant portions of the old parameters gets used and you end up with something very close to the previous parameter (actually the same parameter since it is integer value 0).

In fact if you modify the first printf() to printf("\n %f  \n", -234.0), the program would most likely print -234.00000000 for the second printf() also.

While the mistake is itself a typical newbie mistake.... the explanation of why in this case requires a good understanding of C.

- JK
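JK's diagnosis points at the standard fix : call fabs() from <math.h> instead of the integer abs(). A small hedged sketch of the corrected code ( negate_and_abs is my own illustrative name ) :

```c
#include <stdio.h>
#include <math.h>

/* abs() takes an int, so abs(a * -1) silently truncates the float
   and confuses the varargs of printf(). fabs() is the right call
   for floating point values: it clears the sign bit, so even the
   negative zero produced by a * -1 comes out as +0.0. */
double negate_and_abs( float a )
{
    return fabs( a * -1.0 );
}
```

With this version , both printf() calls print 0.000000 , and the -0.0000 oddity disappears.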

I tested a*-1 = -0.00 once again with the program given below :

#include <stdio.h>
#include <math.h>
#include <stdlib.h>

float rabs( float a ) {
    if ( a < 0 )
        return -a;

    return a;
}

int main(int argc , char **argv)
{
   float a = 0;
   printf("\n %f \n", rabs(a * -1));
   return 0;
}

What was solved by using fabs comes back when I write a custom absolute value function without the CPU's floating point instructions : rabs compares -0.0 < 0 , which is false under IEEE 754 ( -0.0 compares equal to 0.0 ) , so the negative zero passes through untouched.

Even if the sign bit is on , the math subsystem correctly interprets the data ; that is why using fabs solves the issue. A good learning experience.