Tuesday, March 13, 2018

Uploading large files from browser to ASP.Net web server - WebSockets


Most applications are now web applications. Inside the browser we are in a restricted environment: we cannot code freely to leverage the machine's full capabilities. One area where this limitation bites is uploading large files, which is a normal use case. Small files are easy to upload using a file input control, but as the file size grows we get into trouble.
There are so many solutions out there, including third-party controls. But what if we need to write one from scratch? What are the options?


We can use ActiveX controls, which help unleash the potential of the machine. With the arrival of the HTML5 File API we have more control in the HTML itself, and many third parties make use of it. If there is a way to use ActiveX or third-party controls, the problem is solved. Otherwise, or if you think the existing solutions are not enough, continue reading.


There are many ways to upload files. Below are the principles taken for the solution approach:


  • Never use ActiveX
  • Read the file without exhausting the browser's memory (chunking)
  • Support files of at least 25 GB
  • Support scaling
There are reasons behind these principles, but they are not relevant at this point. The first solution is discussed below in this post.


One of the modern ways is WebSockets. Below is a repo which demos how a file can be read using async and sync methods and sent via WebSockets.
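The chunking principle behind that demo can be sketched as follows. This is an illustrative sketch, not code from the repo: a Uint8Array stands in for the file, while in the browser the same slicing would use File.prototype.slice plus a FileReader (or FileReaderSync inside a worker), sending each chunk over the WebSocket before reading the next so the whole file never sits in memory at once.

```javascript
// Slice a large file into fixed-size chunks; each chunk would be pushed
// through the socket (e.g. socket.send(chunk)) before the next one is read.
function* chunks(data, chunkSize) {
  for (let offset = 0; offset < data.length; offset += chunkSize) {
    // subarray creates a view over the buffer, so no per-chunk copy is made
    yield data.subarray(offset, Math.min(offset + chunkSize, data.length));
  }
}

const file = new Uint8Array(10 * 1024 + 5);           // pretend 10 KB + 5 B file
const parts = [...chunks(file, 4096)];

console.log(parts.length);                            // 3
console.log(parts.reduce((n, c) => n + c.length, 0)); // 10245
```

With a real File object the loop body would become an async read of `file.slice(offset, offset + chunkSize)`, paced by server acknowledgements so the socket's send buffer does not grow unbounded.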


Feel free to surf the repo and comment. More development notes will be added soon.


  • Using a Web Worker with the FileReaderSync API seems faster than the async FileReader API.


Below are some limits:
  1. The connection has to stay open, which affects scaling.
  2. Socket connection management is tedious.
  3. The server is slower at reading and writing the files; the client reports it has sent all the data well before the server catches up.


Tuesday, March 6, 2018

Azure @ Enterprise - Limits on AppServices

The AppService mechanism offers a free tier to host Azure WebApps. Though this has a 2-minute execution timeout, it is a good place to host web sites. If we ask whether this is enterprise friendly, we will end up with different answers. There is an enterprise-friendly mechanism called AppServiceEnvironment for hosting applications in an isolated environment, i.e. an environment which is not open to the public. An enterprise can host its internal services in it.

Though technically we can host any .Net application, there are a good number of limits. For example, a legacy application which uses a COM component cannot run in AppService. This post is a journey towards one such limit.


Puppeteer is a browser automation framework in NodeJS, specifically for Chrome. It helps to start a headless browser instance and control it via JavaScript. Puppeteer deserves a separate post of its own. Ideally, if we host the code in an Azure NodeJS WebApp, it should work.

But it fails due to the unavailability of some APIs in WebApp.

Details on WebAPI limits 

Below is the error message:
Error: spawn UNKNOWN
    at _errnoException (util.js:1022:11)
    at ChildProcess.spawn (internal/child_process.js:323:11)
    at Object.exports.spawn (child_process.js:502:9)
    at Function.launch (D:\home\site\wwwroot\node_modules\puppeteer\lib\Launcher.js:81:40)
    at Function.launch (D:\home\site\wwwroot\node_modules\puppeteer\lib\Puppeteer.js:25:21)
    at Server.<anonymous> (D:\home\site\wwwroot\server.js:14:39)
    at emitTwo (events.js:126:13)
    at Server.emit (events.js:214:7)
    at parserOnIncoming (_http_server.js:602:12)
    at HTTPParser.parserOnHeadersComplete (_http_common.js:116:23) code: 'UNKNOWN', errno: 'UNKNOWN', syscall: 'spawn' }
started server.js

How do we reach the conclusion that Puppeteer is not supported in Azure WebApp? Let's look at the below issue logged in their GitHub project.

This will take us to the Azure Functions page, which says Windows-based Azure Functions don't support headless browsers. Really, are there Windows-based functions and Linux-based functions?

Unfortunately yes. The below link explains the same.

This again takes us to the Kudu page where the limit is documented.

Moral of the story: read the docs and understand what they really mean.

Tuesday, February 27, 2018

Swagger API Test UI for easing PoC development


Developing PoCs (Proofs of Concept) or samples to prove that some approach or technology suits our project is something every software engineer does as part of their job. Also, if something goes wrong with existing systems in production, we often create a sample to isolate and reproduce the problem for further analysis. Those in an Architect role have to be involved even more in these research kinds of activities. Please note that creating these PoCs is slightly different from prototyping, as a prototype is more about visualizing the concept in the initial phase only.

The goal of a PoC, when it is used to troubleshoot an issue, is to mimic the actual application in trouble as closely as possible. If the issue is in production, time will be too limited to create an app from scratch. And "closest" should mean closest in terms of technology, threading model, security, etc.


If it is a Windows application or console application, it is easy to develop a PoC and run it the same way the original application runs. Even if the original application is triggered in the context of a different user configured from Windows Task Scheduler or similar, we can still mimic it. But most applications are web apps nowadays. If we want to reproduce an issue in a web app, it would be difficult with a Windows console app unless we run the console application under the same service account as the app pool which runs the web app; even then, the threading model and so on will be different. If we develop the PoC as a web application, it may take more time than a simple console application, which adds delay.

One possible solution

So the best way is to develop a WebAPI and host it using the same app pool as the original application, provided there are permissions to host. Suppose we have those permissions; the next concern is the ease of testing with the WebAPI alone. If we have to develop a UI for that, it would take more time. So what is the solution?


Enter Swashbuckle, after the long introduction, which helps us render a UI for the available WebAPI methods. This lets us test the WebAPIs from a simple UI dynamically generated by it.

Since there are so many links out there which explain how to use it, let's follow the DRY principle. The links below show how to set up Swashbuckle in the WebAPI.


There would be many different views on solving the original problem of creating PoCs. Feel free to comment with your technique for rapidly creating PoCs that mimic the real application.

Tuesday, February 20, 2018

Skype v/s Visual Studio - Pair programming via Skype for business meeting

Last week I was doing a pair programming session with a colleague who works from the other side of the globe. The screen sharing was done through 'Skype for Business'. It was about a generic retry mechanism to be used with Azure KeyVault. Unlike the retry code which is already available, we had to get the renewed key from Azure KeyVault in certain scenarios.

I was sharing my screen. Whenever we were in an intense debate and I made a code change, the screen sharing got lost. I would continue with my arguments when the change being tried was proposed by me, and when I asked why he was silent, the other guy told me he had lost the sharing. It happened 4-5 times while we were trying changes I had proposed. Then I started wondering what my colleague might be thinking about the screen sharing. Would he think I was intentionally cutting the screen share? Never, because we have known each other for a long time and we both know how software from Microsoft works.

When it got frustrating, we decided to investigate. If we cannot solve our own computer problem, how can we solve others' problems with computers? Retrying the steps one by one a couple of times revealed the root cause.

Root cause

It's the Ctrl+Shift+S shortcut. As a habit from college, where the computers were desktops and the power could go at any time, I used to save immediately after typing something. That habit continues even now, in the era of using laptops or even mobiles for programming. The shortcut in Skype for Business to end screen sharing is also Ctrl+Shift+S.

The real solution is to depend on Ctrl+Shift+B for building, which saves all the files. But habits are really difficult to change.


Tuesday, February 13, 2018

Azure @ Enterprise - Moving databases from SQL VM to SQL Azure


An enterprise will have many databases running in its existing systems. If those systems are legacy, the databases might use all the legacy features which SQL Azure (the PaaS offering, not SQL Server in an Azure VM) does not support. How do we move such databases to SQL Azure as part of Azure adoption? If anyone wonders why SQL Azure is not backward compatible with standalone SQL Server versions: welcome to PaaS.

Research & Solution

If we google, we can find many options for moving an on-premise database to SQL Azure. Some of those are as below.

After doing good research, the best option was found to be the .bacpac mechanism using the SQLPackage utility. As we can see in any production database, the file groups will be all over the place to increase performance. The bacpac mechanism using SQLPackage eliminates the file-group issue in its latest versions.


But it may not be an easy and hurdle-free migration road. Below are some issues.

SQLPackage fails on large tables

SQLPackage.exe has its own timeouts. When there are large tables, those timeouts may be hit and the tool will error out. When it errors out, there could be a message as follows.

Processing Table '[dbo].[large tables with millions of rows]'.
*** A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)


The message suggests a network issue, not specifically a timeout. But after trial and error, this turned out to be related to the timeouts of the SQLPackage.exe utility. It has some parameters to control the timeout. The usage is as follows:

sqlpackage.exe /Action:Import /tsn:tcp:<databaseserver>.database.windows.net /tdn:<database name> /tu:<user> /tp:<password> /sf:<local path to file.bacpac> /p:DatabaseEdition=Premium /p:DatabaseServiceObjective=P15 /p:Storage=File /p:CommandTimeout=0 /TargetTimeout:600

The two parameters that did the trick are the timeout settings /p:CommandTimeout=0 and /TargetTimeout:600; the right values depend on the size of the database. /p:Storage=File is a must when dealing with large databases anyway; the alternative keeps everything in memory, which may drain quickly.

SQL Azure Server supports only one collation

If the on-premise application serves globally, the collation of the database might differ from the collation supported by the SQL Azure server.

If anyone is confused about whether there is a SQL Server behind the PaaS offering, yes, there is one. This easily makes us think that there would be real VMs running behind the SQL PaaS offering.

Coming back to the problem: we can have a database in SQL Azure with a different collation from the SQL Azure instance. But if there are stored procedures which need to access the system objects, they fail. The default answer would be to change the collation of the SQL Azure server, but unfortunately that is not supported. The SQL Azure instance is effectively hard-coded to the 'SQL_Latin1_General_CP1_CI_AS' collation. Microsoft has their own reasons, it seems. But as users, what can we do?


Modify our SQL code to include collation clauses, or change the collation of our database to that of the SQL Azure instance, which is 'SQL_Latin1_General_CP1_CI_AS'. It is simple to say "change the collation", but in an enterprise that means a sequence of approvals, especially if the collation is set as a standard across multiple applications.


It is nice and good to use Azure. It works well when the application is built from scratch and cloud native. But when it comes to migrating existing applications, it's a nightmare. Sometimes it feels like Azure is not mature enough for the enterprise.



Tuesday, January 9, 2018

WCF - Concurrency, InstanceContextMode and SessionMode & request handling behavior

Though WebAPI is penetrating the service world in .Net, WCF still has a place in .Net. The main reasons are its support for protocols other than HTTP, and the many legacy systems out there which have to be maintained until the world has fully moved to cloud native.

WCF is easy to develop, but when it comes to production there are many internals to know beyond just writing the service. The major area to understand before putting any WCF service into production is how the service is going to serve requests. Will there be one service object? How many requests will be processed in parallel? If more than one request is being processed, how many service objects will there be to process them? Is it one-to-one or one-to-many? If one service object processes many requests, what does that mean for thread safety? The service object here means the object of the class which implements the service contract interface. Hopefully everyone knows who creates the object of the implementation class.

There are many articles out there which explain how the service processes requests. So this post consolidates everything in one place to avoid googling.

MSDN Article 1 & Article 2 - These are the best articles if someone already has enough understanding of the concepts. For someone really fresh to WCF, they are not a great place to start.

Sivaprasad Koirala on instancing & concurrency - This is a good place for freshers to start. He explains with diagrams, and there is a good sample which can be used to understand the behavior. But it covers only instancing and concurrency; when instancing is combined with concurrency mode and session, the behavior changes.

If we are in a hurry and have to understand the behavior quickly, the best site is given below.


It explains the combinations and tells what happens in each. But the unlucky blog post has no comments even now, after 3 years.

It does not end here. There is throttling behavior which might need to be tweaked, and a security mode which gets enabled automatically for some bindings even though we don't need it, reducing throughput.

Tuesday, January 2, 2018

ReST based Web Sites & ReSTful Navigation


ReST is widely regarded as an architectural pattern for integrating systems. Its APIs mainly point to resources and perform operations on them. The URL often follows a cleaner hierarchical path pattern, without the many key-value pairs of conventional URL schemes. People often follow the ReST-based URL format for APIs, but it is not widely adopted for web sites.

This post aims to investigate bringing ReST-based URL schemes to web sites, similar to APIs.

Why should I create web sites with ReSTful URLs?

The same benefits of ReST-based API URLs apply here as well. The URL will be easy to remember. It is easy to have separation of concerns. New features can be implemented totally separately, using their own areas / virtual directories; there is no need to mix with existing screens even if they are related. Simple, short URLs beat lengthy, story-telling URLs.

e.g. www.mycompany.com/employeelist.aspx?empId=1 can be easily represented as a resource path such as www.mycompany.com/employees/1
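As an illustration, a key-value URL like the one above can map to a resource path such as /employees/1. A minimal sketch of routing such a path follows; the route shape and the function name are assumptions for illustration, not from any real site.

```javascript
// Illustrative parser for a resource-style URL scheme: the path itself
// identifies the resource type and the resource id, with no query string.
function parseResourcePath(pathname) {
  const match = /^\/employees\/(\d+)$/.exec(pathname);
  return match ? { resource: 'employees', id: Number(match[1]) } : null;
}

console.log(parseResourcePath('/employees/1'));       // { resource: 'employees', id: 1 }
console.log(parseResourcePath('/employeelist.aspx')); // null
```

In a real site this dispatch would live in the routing layer (URL rewriting, MVC routes, or an SPA router), but the idea is the same: the hierarchy of the path, not query parameters, names the resource.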

Why aren't more web sites following the ReSTful pattern?

One reason could be the difficulty of following the pattern. In a product company, the development team gets more freedom to select URLs; then again, if the product owners don't know what ReST is or its advantages, they may influence the URL pattern. On the other side, the consulting industry is heavily driven by client demands. Though regular clients may not ask for particular URLs, semi-technical clients may.

Some tips for ReSTful web site URLs

Below are some tips for designing web site URLs in a resource-oriented way.

No operation-oriented screens

The screens should point to resources and display them. For example, a URL like the /employees/1 path above displays the employee resource.


Lists and details screens

If the URLs display the resources, how do we edit them? Which screen edits those resources? The better way is to use the list-and-details mechanism.

If we display the resources in a list, they can be edited in the same list itself and saved using a button. Here the single-page application (SPA) concept helps us more than navigating to an edit page.

Circular navigation

If the resources have circular relations, the navigation may also become circular.
For example, the employee page may show the department he belongs to as a hyperlink. Clicking that will navigate to the department page, which lists the employees in that department, including the manager. Going to the manager's page may display the employees under him, and clicking the same employee's link there will end up on the same employee page where we started.

Multiple navigation paths

Similarly, there can be multiple navigation paths to reach one resource. For example, the home page of a company may show the departments and the various projects it is currently doing. Navigating via a department as well as via a project may end up at the same employee page.

A powerful search experience to navigate

If a resource is buried deep in the hierarchy, it would be difficult to find without multiple clicks. So it is better to have a search mechanism where the resource can be searched for and then navigated to by clicking the associated URLs.