Geeks With Blogs
Srijith Sarman | Time, space and living

Transferring a large volume of data over the wire is indeed a challenge for developers. Distributed technology offers many options. Remoting or web services? That's the commonly asked question. Of course, the technology keeps moving, with names like WCF now in front of us.

But my aim here is simply to transfer a large amount of data across the network, so let's set all these technologies apart. Whatever the technology, I would be willing to choose it, provided it suits my requirements.

My requirements are:

1. It should be able to transfer huge data, say 100 MB, without any loss.

2. It should be fast; I can't compromise on speed. I would be happy if this 100 MB reached the other end within a second (a little over-ambitious, isn't it?).

3. It should not eat up all the available bandwidth. There should be a provision to control bandwidth consumption.

4. There should not be any out-of-memory errors.

5. Binary data should be transferred as binary itself.

So I have web services, remoting, Enterprise Services and a straight DSN connection on my list. To set up a DSN client-server application, I would need a WAN or LAN environment, and each client would also need a separate license, so that option is dropped. DCOM is also dropped, since it's an older technology and it's a shame to use DCOM in this century (just kidding...).

Remoting is obviously faster than web services. But considering the first requirement, I have to cut remoting from my list. Why? Can't remoting be used in a scenario to transfer 100 or 200 MB of data? Yes, of course, but it will give us a lot of pain to implement. MSDN notes that remoting may put a huge load on the server once the data grows beyond about 10 MB, and may throw out-of-memory errors. This happens because the binary data gets buffered on the server: remoting is implemented over a straight TCP channel, so the serialization layer ends up buffering the data even if we send it as chunks. Now it's up to us to prevent this from happening and implement remoting effectively. That means we have to write a suitable algorithm. A nice problem for practice!
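One way to avoid the server buffering the whole payload is to let the client pull the data in fixed-size pieces, one round trip per chunk. A minimal sketch of the idea follows; the class and method names (`FileTransfer`, `GetChunk`) are illustrative, not a real API, and in a real remoting setup this object would be registered as a server-activated type.

```csharp
using System;
using System.IO;

// Hypothetical remoted object: the client calls GetChunk repeatedly,
// advancing 'offset', so neither side ever holds the full file in memory.
public class FileTransfer : MarshalByRefObject
{
    private const int ChunkSize = 64 * 1024; // 64 KB per round trip

    // Returns the bytes starting at 'offset', or an empty array at end of file.
    public byte[] GetChunk(string path, long offset)
    {
        using (FileStream fs = File.OpenRead(path))
        {
            fs.Seek(offset, SeekOrigin.Begin);
            byte[] buffer = new byte[ChunkSize];
            int read = fs.Read(buffer, 0, buffer.Length);
            if (read < buffer.Length)
            {
                byte[] trimmed = new byte[read];
                Array.Copy(buffer, trimmed, read);
                return trimmed;
            }
            return buffer;
        }
    }
}
```

The client loops until it receives a short or empty chunk, adding each chunk's length to its offset. Keeping the chunk small also gives a natural throttling point for the bandwidth requirement.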

For the second requirement, I thought passing data in binary format would do. We have to implement a suitable algorithm that effectively serializes complex data types such as DataSets. There are some widely popular utility classes, such as the DataSet surrogates and the ghost serializer, and each has its own advantages and disadvantages. I feel it's best to write your own algorithm, one that suits your requirement.
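As a sketch of the binary route: if you are on .NET 2.0, a DataSet can be told to serialize itself in binary form via its RemotingFormat property, which is far more compact than the default XML DiffGram. A minimal helper might look like this (the class name `DataSetSerializer` is just for illustration):

```csharp
using System;
using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static class DataSetSerializer
{
    // Serializes a DataSet in binary form (.NET 2.0's RemotingFormat.Binary),
    // avoiding the bloated XML representation on the wire.
    public static byte[] ToBinary(DataSet ds)
    {
        ds.RemotingFormat = SerializationFormat.Binary;
        BinaryFormatter bf = new BinaryFormatter();
        using (MemoryStream ms = new MemoryStream())
        {
            bf.Serialize(ms, ds);
            return ms.ToArray();
        }
    }
}
```

On .NET 1.1, where RemotingFormat does not exist, the DataSet surrogate classes mentioned above serve the same purpose.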

There are many tools available for data compression. The most popular is of course SharpZipLib. But considering speed alone, I prefer the LZO compressor, which outperforms SharpZipLib. LZO is written in C++, so it is not managed code, whereas SharpZipLib is written in C#.

Now, taking the fifth requirement: we know that passing a byte array straight through a web service channel causes it to be encoded as base64Binary, which definitely affects performance. There are mainly two methods available on the scene: one is DIME and the other is MTOM. DIME is not obsolete, as a lot of development still happens on .NET 1.1. MTOM is very simple to implement, but powerful too: we can return byte[] as it is; we only need to modify some configuration settings, which are very well specified in the WSE 3.0 documentation. In DIME we need to attach the binary data to the request SoapContext. Since the attached data travels outside the SOAP packet, it is not fully secured. That wouldn't be a major issue for many of us, though, as there may be sufficient security measures in the environment.
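To put a number on the base64 penalty: base64 emits 4 characters for every 3 input bytes, so the on-the-wire size grows by a third even before the XML framing around it. A quick check:

```csharp
using System;

class Base64Overhead
{
    static void Main()
    {
        // 3 MB of raw binary data (3,145,728 bytes, divisible by 3).
        byte[] payload = new byte[3 * 1024 * 1024];
        string encoded = Convert.ToBase64String(payload);

        // 4 output characters per 3 input bytes: 3 MB becomes 4 MB.
        Console.WriteLine(encoded.Length); // 4194304
        Console.WriteLine((double)encoded.Length / payload.Length); // ratio of 4/3
    }
}
```

MTOM and DIME sidestep this entirely by carrying the bytes as a raw binary attachment rather than text inside the SOAP body.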

We can also implement a chunking algorithm which splits the binary data into several chunks. As you know, the default maximum request size in machine.config is 4 MB; we can modify this value according to our requirements, considering the network capacity.
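The splitting step itself is simple. A sketch (the class name `Chunker` is just for illustration), keeping each chunk under whatever limit the config allows:

```csharp
using System;
using System.Collections.Generic;

public static class Chunker
{
    // Splits a buffer into pieces of at most 'chunkSize' bytes;
    // the final chunk holds whatever remains.
    public static List<byte[]> Split(byte[] data, int chunkSize)
    {
        List<byte[]> chunks = new List<byte[]>();
        for (int offset = 0; offset < data.Length; offset += chunkSize)
        {
            int size = Math.Min(chunkSize, data.Length - offset);
            byte[] chunk = new byte[size];
            Array.Copy(data, offset, chunk, 0, size);
            chunks.Add(chunk);
        }
        return chunks;
    }
}
```

Splitting 10 bytes with a chunk size of 4 yields three chunks of 4, 4 and 2 bytes. The receiver reassembles in order; if chunks can arrive out of order, tag each with its sequence number.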

Anyway, there are lots of developments happening in this field, which often confuses a developer. Lots of new technologies are coming up at high speed. In a few months, I think we will be able to browse all TV programmes. Wouldn't it be funny, minimizing BBC and maximizing FOX TV? (It's started happening, though.)

Posted on Saturday, December 23, 2006 5:01 PM | Back to top


Comments on this post: Transferring large volumes of binary data



Copyright © Srijith Sarman | Powered by: GeeksWithBlogs.net