Monday, March 18, 2013

Cache Multiple Versions of a user control using VaryByParam, VaryByControl, and VaryByCustom


Recently, I was working on optimizing application performance, and we had a user control that was taking a long time to load. It contains user-role-based information: authenticated and authorized users are able to see certain links and portions of pages.

Initially, I thought that since this is user-specific content, storing it in Session would be a good idea, but I didn't find a way to store data in Session the way OutputCache does. Hence I decided to use user-specific control caching instead.

Here are the ways we can achieve it:

VaryByParam:

VaryByParam works based on parameters passed in the URL's query string or in a form POST. It takes a semicolon-separated list of query string or form POST parameter names that the output cache will use to vary the user control.

<%@ OutputCache Duration="120" VaryByParam="PageTitle" %>

Example and a few points to note:

1. If your URL is http://www.xyz.com?PageTitle=Home, it will cache the control for that particular URL for the specified duration.

2. If you move to another page, like "Contact Us", it will cache a different version of the same user control.

3. If there are multiple parameters in the query string/form post, again multiple versions of the control are cached. E.g. if you also want to cache the control based on user ID, add the parameter as follows:

<%@ OutputCache Duration="120" VaryByParam="PageTitle;UserID" %>

Note
You can pass an authenticated token in the query string as the UserID value; passing the SessionID or other user PII in the URL as plain text is not recommended.
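
To see the multiple cached versions in action, a minimal sketch of such a control could look like the following (the MenuControl name and file are hypothetical); the rendered timestamp only changes when a new PageTitle/UserID combination arrives or when the 120-second duration expires:

<%-- MenuControl.ascx (hypothetical) --%>
<%@ Control Language="C#" AutoEventWireup="true"
    CodeFile="MenuControl.ascx.cs" Inherits="MenuControl" %>
<%@ OutputCache Duration="120" VaryByParam="PageTitle;UserID" %>

Rendered at: <%= DateTime.Now.ToString("HH:mm:ss") %>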


VaryByControl:

You can cache multiple versions of a user control by simply declaring it in an .aspx file more than once. As with user controls that are not cached, you can include a cached user control in an ASP.NET page as many times as needed for your application. Unless you set the Shared property to true for the user control, multiple versions of the control output will be stored in the cache.
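
For example, a page can reference the same cached control more than once (the file and tag names below are just placeholders); without Shared="true" every page that hosts the control keeps its own cached copies, while Shared="true" lets all pages reuse one set of cached versions:

<%-- Default.aspx (hypothetical) --%>
<%@ Register TagPrefix="mycontrols" TagName="control1" Src="~/WebUserControl.ascx" %>

<mycontrols:control1 ID="ucHeaderLinks" runat="server" />
<mycontrols:control1 ID="ucFooterLinks" runat="server" />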

How to create:

1. Create a control that posts back to itself.

2. To cache the user control based on its properties, specify the fully qualified names of the properties in the VaryByControl attribute of the OutputCache directive (or in the varyByControls argument of the PartialCachingAttribute). Multiple properties, if any, should be separated by semicolons.


<%@ Control Language="C#" AutoEventWireup="true" 
    CodeFile="WebUserControl.ascx.cs" 
    Inherits="WebUserControl" %>
<%@ OutputCache Duration="60" 
    VaryByControl="WebUserControl.param1;WebUserControl.param2"
    VaryByParam="none" Shared="true" %>


Or you can include the PartialCaching attribute on the user control's code-behind class:

[PartialCaching(60, null, "WebUserControl.param1;WebUserControl.param2", null, true)]
public partial class WebUserControl : System.Web.UI.UserControl
{
    public string param1 { get; set; }
    public string param2 { get; set; }
}


Or, another way to cache the control based on the combination of both values would be:

[PartialCaching(60, null, "WebUserControl.BothParams", null, true)]
public partial class WebUserControl : System.Web.UI.UserControl
{
    public string param1 { get; set; }
    public string param2 { get; set; }

    public string BothParams    
    {
        get { return String.Concat(param1, param2); }
    }

}

The last parameter (true) specifies Shared, and 60 is the cache duration in seconds. Refer to the MSDN article "How to: Cache Multiple Versions of a User Control Based on Parameters".

You can also assign the cache duration in the user control's code-behind:

[PartialCaching(60, null, "WebUserControl.BothParams", null, true)]
public partial class WebUserControl : System.Web.UI.UserControl
{
    ...
    protected void Page_Load(object sender, EventArgs e)
    {
        // Set the cache duration programmatically (60 seconds).
        this.CachePolicy.Duration = new TimeSpan(0, 0, 60);
    }
}
Alternatively, you can assign it in the code-behind of the page where the user control is referenced, using the ID of the user control.

For example, if the user control on the .aspx page is:

<mycontrols:control1 ID="ucControl1" runat="server" param1="15" param2="20" />

then in the code-behind of the .aspx page, you would write:

this.ucControl1.CachePolicy.Duration = new TimeSpan(0, 0, 60);
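
Duration is not the only knob: the ControlCachePolicy object exposed through CachePolicy also lets you switch to sliding expiration or add a custom variation programmatically. A rough sketch (ucControl1 as referenced above; the calls only take effect if the control itself was compiled with output caching enabled):

protected void Page_Load(object sender, EventArgs e)
{
    // CachePolicy has an effect only when the control itself is cacheable
    // (PartialCaching attribute or an OutputCache directive on the .ascx).
    if (this.ucControl1.CachePolicy.SupportsCaching)
    {
        this.ucControl1.CachePolicy.Duration = new TimeSpan(0, 0, 60);
        this.ucControl1.CachePolicy.SetSlidingExpiration(true);
        this.ucControl1.CachePolicy.SetVaryByCustom("SessionID");
    }
}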


NOTE
If both the user control and the page are cached and the page's output cache duration is less than that of the user control, the user control will remain cached until its own duration expires, even after the rest of the page is regenerated for a request. For example, if page output caching is set to 50 seconds and the user control's output caching is set to 100 seconds, the user control expires once for every two times the rest of the page expires.
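
A quick sketch of that situation, directives only (the page name is hypothetical):

<%-- SomePage.aspx: the whole page output is cached for 50 seconds --%>
<%@ OutputCache Duration="50" VaryByParam="None" %>

<%-- WebUserControl.ascx: the control output is cached for 100 seconds --%>
<%@ OutputCache Duration="100" VaryByParam="None" %>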


VaryByCustom:

This one is my favorite. You can play with customized values. The key to creating a custom cache variation is understanding that ASP.NET uses a simple string comparison to determine whether a cached result should be returned instead of processing the page. For example, say we want to cache a certain page by SessionID. We add the OutputCache directive like this:

<%@ OutputCache Duration="60" VaryByParam="None" VaryByCustom="SessionID" %>


Now, in global.asax, we must override the GetVaryByCustomString method, like this:


public override string GetVaryByCustomString(HttpContext context, string arg)
{
  if (arg.ToLower() == "sessionid")
  {
    HttpCookie cookie = context.Request.Cookies["ASP.NET_SessionID"];
    if (cookie != null)
      return cookie.Value;
  }
  return base.GetVaryByCustomString(context, arg);
}
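
Since the original problem in this post was role-based links, another variation worth sketching (not from the directive above, just an example) is to vary the cached output by the current user's role; in practice you would merge this branch into the same GetVaryByCustomString override:

<%@ OutputCache Duration="60" VaryByParam="None" VaryByCustom="UserRole" %>

public override string GetVaryByCustomString(HttpContext context, string arg)
{
    // Hypothetical "UserRole" key: one cached copy per role bucket.
    if (arg.ToLower() == "userrole")
    {
        if (context.User != null && context.User.Identity.IsAuthenticated)
        {
            // e.g. admins see extra links, so they get their own cached version.
            return context.User.IsInRole("Admin") ? "role:admin" : "role:member";
        }
        return "role:anonymous";
    }
    return base.GetVaryByCustomString(context, arg);
}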

If you have multiple keys, you can supply multiple values separated by ';' and handle them in the GetVaryByCustomString method as shown below:

<%@ OutputCache Duration="60" VaryByParam="None" VaryByCustom="SessionID;Key1" %>


public override string GetVaryByCustomString(HttpContext context, string custom)
{
    string[] keys = custom.Split(new char[] { ';' });
    string result = string.Empty;

    foreach (string key in keys)
    {
        switch (key)
        {
            case "Key1":
                // Key1 logic; as an example, vary by the page URL.
                result += context.Request.Path;
                break;
            case "SessionID":
                HttpCookie cookie = context.Request.Cookies["ASP.NET_SessionID"];
                if (cookie != null)
                    result += cookie.Value;
                break;
        }
    }

    if (!string.IsNullOrEmpty(result))
    {
        return result;
    }

    return base.GetVaryByCustomString(context, custom);
}



That’s it. Simple, elegant, beautiful :)


Wednesday, March 6, 2013

System.Web.HttpException: The remote host closed the connection. The error code is 0x80072746

Today we got an error while downloading a huge file (1 GB+) via HTTP. The existing code was working fine for smaller files.

Exception:

System.Web.HttpException: The remote host closed the connection. The error code is 0x80072746
 at System.Web.Hosting.ISAPIWorkerRequestInProcForIIS6.FlushCore(Byte[] status, Byte[] header, Int32 keepConnected, Int32 totalBodySize, Int32 numBodyFragments, IntPtr[] bodyFragments, Int32[] bodyFragmentLengths, Int32 doneWithSession, Int32 finalStatus, Boolean& async)
   at System.Web.Hosting.ISAPIWorkerRequest.FlushCachedResponse(Boolean isFinal)
   at System.Web.Hosting.ISAPIWorkerRequest.FlushResponse(Boolean finalFlush)
   at System.Web.HttpResponse.Flush(Boolean finalFlush)
   at System.Web.HttpResponse.Flush()


This exception generally happens when you have download functionality in your ASP.NET application and the user starts the download process, but the download does not complete for one of the following reasons:
  • User cancels the download
  • User navigates to another page
  • User closes the page
  • The connection closes


We have a file download page that allows the user to download files of different sizes, varying from a few KB to several GB. We also bundle multiple files into a single .zip file.


Root Cause:

The default value of the Response.Buffer property is "true", so the ASP.NET process writes output to a buffer instead of the output stream. When the page finishes processing, the buffered output is flushed to the output stream and sent to the browser. This is usually done to avoid multiple data transfers over the network and thus optimize performance by sending the overall page output in a single response.

But when a page writes file content to the Response, we usually flush the buffer immediately after writing the content to the output stream, so that the file content is sent to the browser right away instead of waiting for the page to complete its execution. In general, the following is the code we usually use:

Response.OutputStream.Write(buffer, 0, buffer.Length);
Response.Flush();
Response.End();

Now:

  • The user starts to download the file from the download prompt (as a result of the Response.OutputStream.Write()); data is stored in the buffer, the buffer fills up, and a stream of binary data starts to flow from server to client.
  • Because the file is large, the response times out based on the executionTimeout configured in web.config (see the config sketch after this list), or the download process is cancelled.
  • Meanwhile, the ASP.NET process tries to Flush() the buffered content to the output stream, which has already been closed by the client.
  • The exception occurs.
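
For reference, that execution timeout is the httpRuntime setting in web.config. A minimal sketch (the 600-second value is just an example; the timeout is only enforced when debug is off):

<system.web>
  <!-- executionTimeout is in seconds and is honored only when debug="false". -->
  <compilation debug="false" />
  <httpRuntime executionTimeout="600" />
</system.web>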



Solution:

Buffering the response on the server can degrade the system, and it gets worse when multiple users try to download files simultaneously. Hence, just set

Response.Buffer = false;

Further, if you have a large file (in the GB range), it is a good idea not to buffer it at all for better performance. Just open the stream and start writing to the response stream. As we are not buffering at the server level, the data flows directly to the client and the file download starts immediately. So far I've successfully downloaded a 1.5 GB file from server to client this way, and I'm sure it will work for jumbo files too.


var resp = HttpContext.Current.Response;

// Turn off buffering so data is written straight to the client.
resp.Buffer = false;
resp.AddHeader("Content-Disposition", "attachment; filename=test.txt");
resp.ContentType = "application/zip";   // set a content type that matches your file

using (FileStream rstream = new FileStream(@"C:\DownloadHuge.txt", FileMode.Open))
{
    int totalbytes = 0;

    byte[] buffer = new byte[256];
    int x = rstream.Read(buffer, 0, 256);
    while (x > 0)
    {
        // Write each chunk directly to the (unbuffered) response stream.
        resp.OutputStream.Write(buffer, 0, x);
        totalbytes += x;
        x = rstream.Read(buffer, 0, 256);
    }
}

resp.Flush();
resp.End();
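
One extra safeguard worth considering (not part of the original fix; it assumes System.IO and System.Web are referenced) is to stop writing as soon as the client disconnects, which avoids flushing into a closed connection in the first place. A rough sketch of the same idea wrapped in a helper, with an arbitrary 64 KB chunk size:

// Hypothetical helper: stream a file without server-side buffering and
// bail out as soon as the client disconnects.
private static void StreamFileToClient(HttpResponse resp, string path)
{
    resp.Buffer = false;
    resp.ContentType = "application/octet-stream";
    resp.AddHeader("Content-Disposition",
        "attachment; filename=" + Path.GetFileName(path));

    byte[] buffer = new byte[64 * 1024];   // 64 KB chunks
    using (FileStream source = File.OpenRead(path))
    {
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Stop early if the user cancelled the download or closed the page.
            if (!resp.IsClientConnected)
                return;

            resp.OutputStream.Write(buffer, 0, read);
        }
    }

    resp.Flush();
}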

Although this solution seems simple, it took a lot of time to figure out the issue.

Friday, March 1, 2013

Download large file using WCF


Recently, we were facing issues when we needed to download files larger than 2 GB (4 GB, 5 GB, etc.) from a WCF service.

It worked up to 2 GB by default because maxReceivedMessageSize was set to Int32.MaxValue, which is 2147483647 bytes (about 2 GB). What if we need to download a file larger than 2 GB? If you try, you will get the following error:

"The maximum message size quota for incoming messages (2147483647 ) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element"

After searching the internet, the following options were suggested, but none of them worked:

1. Only change transferMode to Streamed.
2. Set the max value for all the elements, like:

<binding name="BasicHttpBinding_Service" messageEncoding="Mtom" maxReceivedMessageSize="2147483647" maxBufferPoolSize="2147483647" maxBufferSize="2147483647" transferMode="Streamed"  useDefaultWebProxy="true" sendTimeout="05:00:00" closeTimeout="05:00:00"/>

Finally, I got it to work after tweaking the client config as follows:

<binding name="BasicHttpBinding_Service" messageEncoding="Mtom" maxReceivedMessageSize="4294967294" maxBufferPoolSize="65536" maxBufferSize="65536" transferMode="Streamed"  useDefaultWebProxy="true" sendTimeout="05:00:00" closeTimeout="05:00:00"/>

If you look closely, we have changed the following properties:
1. maxReceivedMessageSize="4294967294", which is 4 GB
2. maxBufferPoolSize="65536", which is 64 KB
3. maxBufferSize="65536", which is 64 KB
4. transferMode="Streamed"
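
If you prefer configuring the client in code rather than in config, a rough programmatic equivalent of the binding above (using System.ServiceModel; the timeouts and sizes simply mirror the config) would be:

var binding = new BasicHttpBinding
{
    MessageEncoding = WSMessageEncoding.Mtom,
    TransferMode = TransferMode.Streamed,
    MaxReceivedMessageSize = 4294967294L,   // ~4 GB
    MaxBufferSize = 65536,                  // 64 KB
    MaxBufferPoolSize = 65536,              // 64 KB
    SendTimeout = TimeSpan.FromHours(5),
    CloseTimeout = TimeSpan.FromHours(5),
    UseDefaultWebProxy = true
};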

Restricting the maximum incoming message size is not enough in this case. The MaxBufferSize property is required to constrain the memory that WCF buffers. It is important to set this to a safe value (or keep it at the default value) when streaming. For example, suppose your service must receive files up to 4 GB in size and store them on the local disk. Suppose also that your memory is constrained in such a way that you can only buffer 64 KB of data at a time. Then you would set MaxReceivedMessageSize to 4 GB and MaxBufferSize to 64 KB. Also, in your service implementation, you must ensure that you read only from the incoming stream in 64-KB chunks and do not read the next chunk before the previous one has been written to disk and discarded from memory.
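
Following that guidance, the client-side consumption of the streamed response could look roughly like this (client and GetFile are placeholders for your own proxy and streamed operation, which must return a Stream); each 64 KB chunk is written to a temporary file before the next one is read:

// 'client.GetFile(...)' stands in for whatever streamed operation your service exposes.
string tempPath = Path.GetTempFileName();

using (Stream source = client.GetFile("DownloadHuge.zip"))
using (FileStream target = File.Create(tempPath))
{
    byte[] buffer = new byte[64 * 1024];   // matches maxBufferSize (64 KB)
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Write each chunk to disk before reading the next one, so the
        // whole file never has to fit in memory.
        target.Write(buffer, 0, read);
    }
}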

Note
1. As this opens a Stream to your client, pay special attention to how you consume it. I would recommend not using a MemoryStream, because for a large file a MemoryStream will exhaust your box and affect system performance. In a web application, multiple users may try to download the same file simultaneously, which can exhaust your box.

2. As a solution, download the file to a temporary location and work with it from there as per your need, for better performance.

3. If you have a requirement to download more than 4 GB, change the maxReceivedMessageSize value as per your need and leave the other values unchanged.

Hope this helps anyone who has a similar problem. I'll be happy to hear feedback/comments that can help me improve.

Thanks!
Tarun