Monday, July 31, 2017

Sitecore SQL Session State Provider: What you need to know


Background 

While working with Sitecore Support to troubleshoot a SQL session issue that we encountered in a high-traffic, scaled Sitecore environment running Sitecore 8.1 Update 2, we discovered that the root cause was a connection-leaking bug in the SessionStateStoreProvider that puts unnecessary load on SQL Server, eventually making it unresponsive.

The purpose of this post is to arm you with the information that you need to implement a stable SQL Session State in your Sitecore deployment.

Symptoms 

When the issue / outage occurred, the exceptions in the Sitecore logs were the following:

 Exception: System.Data.SqlClient.SqlException  
 Message: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No such host is known.)  
 Source: .Net SqlClient Data Provider  
  at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, UInt32 waitForMultipleObjectsTimeout, Boolean allowCreate, Boolean onlyOneCheckConnection, DbConnectionOptions userOptions, DbConnectionInternal& connection)  
  at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal& connection)  
  at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)  
  at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)   
  at System.Data.SqlClient.SqlConnection.TryOpenInner(TaskCompletionSource`1 retry)  
  at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)  
  at System.Data.SqlClient.SqlConnection.Open()  
  at System.Data.SqlClient.SqlConnection.Open()  
  at Sitecore.SessionProvider.Sql.SqlSessionStateStore.UpdateItemExpiration(Guid application, String id) 
  at Sitecore.SessionProvider.Sql.SqlSessionStateProvider.ResetItemTimeout(HttpContext context, String id)  
  at System.Web.SessionState.SessionStateModule.BeginAcquireState(Object source, EventArgs e, AsyncCallback cb, Object extraData)  
  at System.Web.HttpApplication.AsyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()  
  at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)  
 Nested Exception  
 Exception: System.ComponentModel.Win32Exception  
 Message: No such host is known  

Our New Relic application monitoring reported that the GetExpiredItemExclusive SQL stored procedure running against the session database was the most time-consuming operation and was responsible for the highest throughput between our Sitecore and SQL servers.

We discovered that once the execution of the stored procedure got above the 10,000 calls per minute range, the application would start having trouble and would eventually stop responding.

Root Cause

It was determined that .NET was naturally creating a number of session state provider objects, and during high traffic periods, the number got so large that it caused too much load on the SQL server and eventually caused the application to stop responding.

Stabilizing Session State

The Patch 

Sitecore Support issued us a patch and noted that the issue is fixed in Sitecore 8.2 Update 2 onward.

From a high level, the change involved Sitecore using their own factory and creating session state objects manually. 

The issue was registered as bug #98800. It's important to note that all prior versions will require a ticket to request the patch. 

We implemented the patch by following these steps:
1) Place the provided 'Sitecore.Support.98800.dll' assembly in the /bin folder of the website.
2) Change the session state provider type from
type="Sitecore.SessionProvider.Sql.SqlSessionStateProvider, Sitecore.SessionProvider.Sql"
to
type="Sitecore.Support.SessionProvider.Sql.SqlSessionStateProvider, Sitecore.Support.98800"

We made this change in both Web.config and Sitecore.Analytics.Tracking.config.
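
For context, here is roughly where that type swap lands in Web.config, assuming the standard custom session state provider setup (attribute values other than the type are illustrative and should match your existing configuration):

 <sessionState mode="Custom" customProvider="mssql" cookieless="false" timeout="20">  
   <providers>  
     <add name="mssql"  
       type="Sitecore.Support.SessionProvider.Sql.SqlSessionStateProvider, Sitecore.Support.98800"  
       connectionStringName="session"  
       sessionType="private" />  
   </providers>  
 </sessionState>  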

Not Stable Yet

About 3 days later, our site was brought to its knees again. New Relic showed that the GetExpiredItemExclusive SQL stored procedure calls were once again well above the 10,000 calls per minute range.

Configuration Update

Working with Sitecore Support again, we increased the Session Provider polling interval from the default 2 seconds to 60 seconds and also increased the SQL connection timeout to 300 seconds.

The polling interval is the number of seconds between checks of the session database for expired sessions; under the covers, each check executes the GetExpiredItemExclusive SQL stored procedure.

The final configurations looked like this:

Sitecore.Analytics.Tracking.config

 <add  
  name="mssql"  
  type="Sitecore.Support.SessionProvider.Sql.SqlSessionStateProvider, Sitecore.Support.98800"  
  connectionStringName="session"  
  pollingInterval="60"  
  compression="true"  
  sessionType="shared"/>  
Web.Config

 <add   
  name="mssql"   
  type="Sitecore.Support.SessionProvider.Sql.SqlSessionStateProvider, Sitecore.Support.98800"   
  connectionStringName="session"   
  pollingInterval="60"   
  compression="true"   
  sessionType="private"/>  
ConnectionStrings.config

 <add name="session"   
    connectionString="user id=xxx;password=xxx;Data Source=xxx,1433;Database=Sessions;MultiSubnetFailover=TRUE; Connection Timeout=300" />  

Status

With the patch in place, and the final configuration updates, the application has been stable and has survived extremely high traffic days. 

An example of one of these days: 40,000 requests per minute, 7500 simultaneous users and 142,000 page views per hour.

Takeaways

If you intend to use SQL Session State for your Sitecore implementation, and are running a version of Sitecore prior to 8.2 Update 2, you need to create a ticket with Sitecore support to request the patch.

After that, it's critical that you increase your polling interval configuration from the default 2 seconds to something higher, as we did; 60 seconds has worked well for us.

If you have any questions, feel free to submit a comment and I will help you out to the best of my knowledge about this issue.

Monday, June 19, 2017

How to trigger xDB Pattern Cards using jQuery AJAX with Sitecore MVC


Background 

Sitecore makes it easy for content marketers to assign Profile Cards to pages when implementing a behavioral profile strategy, but more often than not, there are areas of today's modern Sitecore sites where this is a bit challenging.

For the same reason that you may want to trigger Goals or Outcomes using jQuery AJAX, an event driven way to trigger Pattern Cards would be equally useful.

My interest in achieving this was sparked while working on an eCommerce site where we implemented many dynamic lightboxes / modals during the ordering flow, and we wanted to apply part of our behavioral profiling strategy there.

NOTE: The P’s of Sitecore Personalization can be somewhat tricky to understand, so if you need to brush up on the lingo before reading further, I suggest that you take a look at Mike Shaw's post on Profile Cards and Other P’s of Sitecore Personalization.


Architecture 

I am a fan of using Sitecore.Services.Client, and as I explained in my post below on enabling xDB tracking for Sitecore.Services.Client and Web API, making your controllers xDB / session aware is pretty easy. If you prefer, you can most certainly use a regular MVC controller.

When thinking through the architecture, I wanted to be able to trigger multiple pattern cards using percentages as it would provide the most flexibility.

I needed to be sure that I had the right score calculation checks in place, as I wanted the same results as if the cards were triggered from a page where they had been assigned using the Content or Experience Editor.

You will see in the code that follows that I needed to ensure the percentages summed to 100 when multiple cards were supplied, and that if none of the cards had an assigned percentage, the percentages were distributed evenly across them.

Pattern Card Model
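
The model itself just needs to carry which profile the card belongs to, which pattern card to trigger, and an optional percentage. A minimal sketch (class and property names here are illustrative, not the exact model from my solution):

 public class PatternCardScore  
 {  
   // Name of the profile the pattern card belongs to (for example, "Persona")  
   public string ProfileName { get; set; }  
  
   // The pattern card to trigger within that profile  
   public string PatternCardName { get; set; }  
  
   // Optional share of the total score (0-100); when no card specifies one, the split is even  
   public int? Percentage { get; set; }  
 }  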


Controller Action

This controller action may be a bit long-winded, but I wanted to demonstrate the flow of logic, from top to bottom.
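
At a high level, the action validates the percentages per the rules above and then scores the visitor's interaction profile. Here is a condensed sketch of that flow, shown as a plain MVC controller for brevity (you could equally use a Sitecore.Services.Client controller as mentioned earlier). The Tracker profile calls are the assumed Sitecore 8.x xDB API, so verify them against your version:

 using System.Collections.Generic;  
 using System.Linq;  
 using System.Web.Mvc;  
 using Sitecore.Analytics;  
  
 public class PatternCardController : Controller  
 {  
   [HttpPost]  
   public ActionResult Trigger(List<PatternCardScore> cards)  
   {  
     if (cards == null || !cards.Any() || Tracker.Current == null || Tracker.Current.Interaction == null)  
     {  
       return new HttpStatusCodeResult(400);  
     }  
  
     // If no card carries a percentage, split the weight evenly; otherwise the sum must equal 100  
     if (cards.All(c => !c.Percentage.HasValue))  
     {  
       cards.ForEach(c => c.Percentage = 100 / cards.Count);  
     }  
     else if (cards.Sum(c => c.Percentage ?? 0) != 100)  
     {  
       return new HttpStatusCodeResult(400, "Percentages must add up to 100");  
     }  
  
     foreach (var card in cards)  
     {  
       // Assumed Sitecore 8.x xDB API: fetch the named interaction profile, score it with the  
       // pattern card's weighted key values, then recalculate the matching pattern  
       var profile = Tracker.Current.Interaction.Profiles[card.ProfileName];  
  
       if (profile == null)  
       {  
         continue;  
       }  
  
       // profile.Score(weightedKeyValues);   // Dictionary<string, double> built from the pattern card item  
       // profile.UpdatePattern();  
     }  
  
     return new HttpStatusCodeResult(200);  
   }  
 }  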


Trigger using jQuery

Finally, here is an example button click event where we post an array of Pattern Card objects to our controller action.

Note: if you are using Sitecore.Services.Client or Web API, there is a known issue that will force you to change your data to be a single anonymous object instead of a raw array, as reflected in the sketch below.
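
With that in mind, a sketch of the client-side call might look like this (the URL assumes the default /api/sitecore MVC controller route; adjust it, and the wrapping property name, to match your own controller):

 $('#triggerButton').on('click', function () {  
   var patternCards = [  
     { ProfileName: 'Persona', PatternCardName: 'Bargain Hunter', Percentage: 60 },  
     { ProfileName: 'Persona', PatternCardName: 'Researcher', Percentage: 40 }  
   ];  
  
   $.ajax({  
     url: '/api/sitecore/PatternCard/Trigger',  
     type: 'POST',  
     contentType: 'application/json',  
     // Wrap the array in a single object (see the SSC / Web API note above)  
     data: JSON.stringify({ cards: patternCards }),  
     success: function () {  
       console.log('Pattern cards triggered');  
     }  
   });  
 });  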

Thursday, June 8, 2017

The easy way to enable xDB tracking for Sitecore.Services.Client and Web API

There are several posts on the web that talk about the fact that Sitecore.Services.Client (SSC) and Web API don't allow xDB tracking because they are session-less by default.

In this older post, Pavel explains the reason for this and provides a solution to make SSC or Web API session aware: http://jockstothecore.com/xdb-tracking-the-untrackable-part-1

In my post, I will demonstrate how to do this in 5 lines of code.

FXM Is Your Huckleberry

As described in that article, the key is to give the target route a session-aware handler right after the SSC route (or Web API route) has been registered in the initialize pipeline.

FXM has a BeaconSessionRouteHandler already built in, because its SSC controllers track activity in xDB from external sites using the magical Beacon script.

So, using what FXM already gives you, all you need is the following "Session State Enabler" Processor in the initialize pipeline, right after the ServicesWebApiInitializer processor.

 Code

 using System.Web.Routing;  
   
 using Sitecore.FXM.Service.Handler;  
 using Sitecore.Pipelines;  
   
 namespace MyProject.Pipelines.Initialize  
 {  
   public class EnableEntityServiceSessionStateProcessor  
   {  
     public void Process(PipelineArgs args)  
     {  
       // Grab the EntityService route that the ServicesWebApiInitializer processor registered  
       var route = RouteTable.Routes["EntityService"] as Route;  
  
       if (route != null)  
       {  
         // Swap in FXM's session-aware handler so requests to this route get session state (and xDB tracking)  
         route.RouteHandler = new BeaconSessionRouteHandler();  
       }  
     }  
   }  
 }  
   

Config

 <configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">  
  <sitecore>  
   <pipelines>  
    <initialize>  
     <processor type="MyProject.Pipelines.Initialize.EnableEntityServiceSessionStateProcessor, MyProject"  
     patch:after="processor[@type='Sitecore.Services.Infrastructure.Sitecore.Pipelines.ServicesWebApiInitializer, Sitecore.Services.Infrastructure.Sitecore']" />  
    </initialize>  
  </sitecore>  
 </configuration>  

 That's it! You can now perform xDB tracking in your SSC or Web API controllers.
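
For example, once the handler is wired up, a sketch like the following becomes possible inside a Web API controller (assuming analytics is enabled and the request has an active tracker; the controller and action names here are illustrative):

 using System.Web.Http;  
 using Sitecore.Analytics;  
  
 public class VisitorController : ApiController  
 {  
   [HttpGet]  
   public IHttpActionResult CurrentContact()  
   {  
     // Without the session-aware route handler, Tracker.Current is null for these requests  
     if (Tracker.Current == null || Tracker.Current.Contact == null)  
     {  
       return NotFound();  
     }  
  
     return Ok(new { contactId = Tracker.Current.Contact.ContactId });  
   }  
 }  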


Tuesday, May 23, 2017

Sitecore Ecommerce Reporting using Google Tag Manager's Data Layer

A requirement on my last couple of projects was to implement Google Tag Manager's (GTM) data layer in order to get eCommerce data reporting in Google Analytics.

I have seen several custom implementations of this in Sitecore, most of which involve developers writing ugly spaghetti code in a view rendering that spits out the required data layer push script.

In this post, I will show you a clean way of implementing the data layer that you can use as a guide in your own implementation if it is a requirement on your project.

This post assumes that your analytics team has configured the data layer in GTM and that they have provided you with the requirements to generate a data layer script in specific pages driven by targeted events.


What Is A Data Layer?

Before we get started, it's useful to understand what the data layer actually is.

The data layer is a piece of script that contains the information and variables that you want Google Tag Manager to read and then report to Google Analytics - including eCommerce data.

Here is a great post that will help you to understand the true value of the data layer: Data Layer Demystified

Use Case

As previously mentioned, I was required to generate a data layer push script to track a visitor's activity on a Sitecore eCommerce site, along with the ability to allow purchase information to be sent to the data layer on the "Thank You" page after a purchase was complete.

As you can tell from the sample eCommerce script that was provided to me (below), it is dynamic based on the items a visitor purchased.

 digitalData = [{  
  page: {  
   category: {  
    pageType: 'menu item',  
   },  
   pageInfo: {  
    experienceType: 'desktop',  
    sysEnv: 'prod'  
   }  
  },  
  user: {  
   profile: {  
    profileInfo: {  
     loginStatus: 'logged-in',  
     profileID: '12345'  
    }  
   }  
  },  
  ecommerce: {  
   purchase: {  
    actionField: {  
     id: 'T12345', //Unique transaction ID. Required for purchases and refunds.  
     affiliation: 'Catering',  
     revenue: '35.43', // Total transaction value, including tax and shipping.  
     tax:'4.90',  
     shipping: '0', //Always set to '0'.  
     coupon: '' //Always set to empty string  
    },  
    products: [{  
     'name': 'Fruit Tray', //Product Name  
     'id': '12345', //Product SKU  
     'price': '26', //Product Price  
     'brand': '', //  
     'category': 'Trays', //Product Category  
     'variant': 'Small',  
     'quantity': 1  
    },{  
     'name': 'Barbeque Sauce', //Product Name  
     'id': '12345', //Product SKU  
     'price': '2', //Product Price  
     'brand': '', //  
     'category': 'Add On', //Product Category  
     'variant': '',  
     'quantity': 1  
    }]  
   }  
  }  
 }];  

My goal was to build the top section of the script on every eCommerce page, giving the data layer information about my visitor, and then add in the eCommerce section of the script when they completed a transaction.

POCO Time

You will notice that the data layer script itself is in JSON data format. So, I decided to create a nice clean POCO that I would populate with the required data and then serialize and output to my pages.

I created the following C# classes from the JSON:

   public class DataLayerModel  
   {  
     public Page page { get; set; }  
     public User user { get; set; }  
     public Ecommerce ecommerce { get; set; }  
   }  
   public class Category  
   {  
     public string pageType { get; set; }  
   }  
   public class PageInfo  
   {  
     public string experienceType { get; set; }  
     public string sysEnv { get; set; }  
     public string destinationURL { get; set; }  
     public string pageName { get; set; }  
   }  
   public class Page  
   {  
     public Category category { get; set; }  
     public PageInfo pageInfo { get; set; }  
   }  
   public class ProfileInfo  
   {  
     public string loginStatus { get; set; }  
     public string profileID { get; set; }  
   }  
   public class Profile  
   {  
     public ProfileInfo profileInfo { get; set; }  
   }  
   public class User  
   {  
     public Profile profile { get; set; }  
   }  
   public class ActionField  
   {  
     public string id { get; set; }  
     public string affiliation { get; set; }  
     public string revenue { get; set; }  
     public string tax { get; set; }  
     public string shipping { get; set; }  
     public string coupon { get; set; }  
   }  
   public class Product  
   {  
     public string name { get; set; }  
     public string id { get; set; }  
     public string price { get; set; }  
     public string brand { get; set; }  
     public string category { get; set; }  
     public string variant { get; set; }  
     public int quantity { get; set; }  
   }  
   public class Purchase  
   {  
     public ActionField actionField { get; set; }  
     public List<Product> products { get; set; }  
   }  
   public class Ecommerce  
   {  
     public Purchase purchase { get; set; }  
   }  

Hydrating The Data Layer

The next order of business was to write a method that would create and populate the object based on my data layer class.

My method would take the data layer eCommerce object as a parameter, and add it to the parent data layer object if it was populated.

I added this into my repository layer using two methods.

The first method (GetDataLayerModel) generated the data layer object and took an eCommerce object as a parameter. As I mentioned before, it added the eCommerce object to my data layer object if it was populated.

     public DataLayerModel GetDataLayerModel(Ecommerce commerceModel)  
     {  
       var currentItem = Sitecore.Context.Item;  
       var pageName = HttpUtility.JavaScriptStringEncode(currentItem.Fields[DisplayConstants.TitleField].Value);  
       var pageType = currentItem.TemplateName;  
       var experienceType = DisplayConstants.DesktopExperienceType;  
       var sysEnv = Sitecore.Configuration.Settings.GetSetting("DataLayerEnvironmentName");  
       var destinationUrl = HttpContext.Current.Request.Url.AbsoluteUri;  
       var loginStatus = DisplayConstants.NotLoggedIn;  
       var profileId = "";  
  
       var crnUser = new CrnUserModel();  
  
       if (crnUser.IsAuthenticated && !crnUser.IsSitecoreDomain)  
       {  
         loginStatus = DisplayConstants.LoggedIn;  
       }  
  
       if (Tracker.Current != null && Tracker.Current.Contact != null)  
       {  
         profileId = Tracker.Current.Contact.Identifiers.Identifier;  
       }  
  
       var dataLayerModel = new DataLayerModel  
       {  
         page = new Page  
         {  
           pageInfo = new PageInfo  
           {  
             experienceType = experienceType,  
             sysEnv = sysEnv,  
             destinationURL = destinationUrl,  
             pageName = pageName  
           },  
           category = new Category  
           {  
             pageType = pageType  
           }  
         },  
         user = new User  
         {  
           profile = new Profile  
           {  
             profileInfo = new ProfileInfo  
             {  
               profileID = profileId,  
               loginStatus = loginStatus  
             }  
           }  
         }  
       };  
  
       if (commerceModel != null)  
       {  
         dataLayerModel.ecommerce = commerceModel;  
       }  
  
       return dataLayerModel;  
     }  

The second method (GetCommerceDataLayer) takes the transaction data and creates a data layer eCommerce object, which is passed to the GetDataLayerModel method shown above.

     public Ecommerce GetCommerceDataLayer(orderModel orderModel, menuResponse oloMenuResponse)  
     {  
       var ecommerce = new Ecommerce  
       {  
         purchase = new Purchase  
         {  
           products = new List<Product>(),  
           actionField = new ActionField  
           {  
             affiliation = DisplayConstants.CateringAffiliation,  
             tax = oloMenuResponse.Order.TaxAmount.ToString(CultureInfo.InvariantCulture),  
             revenue = oloMenuResponse.Order.TotalAmount.ToString(CultureInfo.InvariantCulture),  
             id = oloMenuResponse.Order.SubmitOrderNumber.ToString(CultureInfo.InvariantCulture),  
             shipping = "0",  
             coupon = string.Empty  
           }  
         }  
       };  
  
       foreach (var lineItem in orderModel.LineItems)  
       {  
         var dataLayerProduct = new Product  
         {  
           id = lineItem.ItemTag,  
           name = lineItem.Name,  
           price = lineItem.RetailPrice.ToString(CultureInfo.InvariantCulture),  
           quantity = lineItem.Quantity,  
           variant = string.Empty,  
           category = string.Empty  
         };  
  
         ecommerce.purchase.products.Add(dataLayerProduct);  
  
         if (!lineItem.Modifiers.Any())  
         {  
           continue;  
         }  
  
         foreach (var modifier in lineItem.Modifiers)  
         {  
           var dataLayerModifierProduct = new Product  
           {  
             id = modifier.ItemTag,  
             name = modifier.Name,  
             price = modifier.RetailPrice.ToString(CultureInfo.InvariantCulture),  
             quantity = modifier.Quantity,  
             category = DisplayConstants.CateringAddOn,  
             variant = string.Empty  
           };  
  
           ecommerce.purchase.products.Add(dataLayerModifierProduct);  
         }  
       }  
  
       return ecommerce;  
     }  

Coding the Controller and View

The next order of business was to create a controller and view that would be added to my target pages.

I created a simple controller and coded the action to grab my eCommerce transaction data, and pass it over to the data layer methods that I had created before.

Action

My controller action looked like this:

     public ActionResult DataLayer()  
     {  
       Ecommerce eCommerceTransaction = null;  
  
       //Check for order info in session  
       var orderResponse = _contactRepository.GetSubmitOrderResponse();  
       var order = _contactRepository.GetSubmitOrder();  
  
       if (order != null && orderResponse != null)  
       {  
         eCommerceTransaction = _analyticsRepository.GetCommerceDataLayer(order, orderResponse);  
  
         //Kill session objects  
         _contactRepository.SetSubmitOrderResponse(null);  
         _contactRepository.SetSubmitOrder(null);  
       }  
  
       var model = _analyticsRepository.GetDataLayerModel(eCommerceTransaction);  
       return View(model);  
     }  

View

My view was very simple; it took my data layer object, and serialized it.

 @model MyProject.Foundation.Domain.Models.DataLayer.DataLayerModel  
 @using Newtonsoft.Json  
  
 @{  
   if (Model != null)  
   {  
     <script>  
       window.digitalData = @Html.Raw(string.Format("[{0}];", JsonConvert.SerializeObject(Model)))  
     </script>  
   }  
 }  

Adding the Sitecore Component

With the code in place, I created the Controller Rendering item in Sitecore.




Adding the Component to your Target Pages

In my implementation, I statically bound my Controller Rendering to my eCommerce Layout.

You can most certainly add the controller rendering to a specific placeholder that you have set up. Just be aware that the data layer code needs to be within the head element of your page, right before your Google Tag Manager code:

 <head>  
 <!-- Your regular head stuff here -->  
   @{  
     //Constant value is the ID of my Controller Rendering shown above {CFE19999-FC9C-42C0-A3C0-A6F0FCFB8519}  
     @Html.Sitecore().Rendering(MyProject.Feature.Analytics.Constants.RenderingConstants.DataLayer)  
     //Google tag manager script (I am loading in a regular partial view below)  
     Html.RenderPartial("~/Views/Identity/GoogleTagManager.cshtml");  
   }  
 </head>  

Final Results

With all the pieces of the puzzle in place, the only thing left to do was to check the script in my pages to confirm that it was being built out correctly.

Here is a sample script generated by a regular eCommerce page:

 <script>  
       window.digitalData = [{"page":{"category":{"pageType":"Standard"},"pageInfo":{"experienceType":"desktop","sysEnv":"Dev","destinationURL":"https://SC81U2/OrderStuff","pageName":"Category"}},"user":{"profile":{"profileInfo":{"loginStatus":"logged-in","profileID":"ID-123456789"}}},"ecommerce":null}];  
 </script>  

Here is a sample script that was generated from an eCommerce "Purchase Complete / Thank You" page that had the eCommerce / transaction data loaded:

 <script>  
       window.digitalData = [{"page":{"category":{"pageType":"Standard"},"pageInfo":{"experienceType":"desktop","sysEnv":"Dev","destinationURL":"https://SC81U2/Thankyou","pageName":"Category"}},"user":{"profile":{"profileInfo":{"loginStatus":"logged-in","profileID":"ID-123456789"}}},"ecommerce":{"purchase":{"actionField":{"id":"2218523","affiliation":"Catering","revenue":"116.65","tax":"7.15","shipping":"0","coupon":""},"products":[{"name":"Fruit Cup","id":"FRUIT_CUP","price":"2.19","brand":null,"category":"Breakfast","variant":"Small Fruit Cup","quantity":50}]}}}];  
 </script>  
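
One small refinement to consider: the "ecommerce":null and "brand":null entries above are just empty properties being serialized. If your analytics team would rather have those keys omitted entirely, Json.NET can skip nulls; a minimal tweak to the view's serialization call (confirm first that GTM is happy with the keys missing):

 window.digitalData = @Html.Raw(string.Format("[{0}];", JsonConvert.SerializeObject(  
   Model,  
   new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore })))  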


Tuesday, March 7, 2017

Securing your mLab Cloud Service for Sitecore MongoDB databases

The recent string of ransomware attacks on MongoDB databases, which left over 30,000 servers compromised, has made most Sitecore clients skittish about the security of their hosted Sitecore MongoDB databases.


Almost all of the posts out there reference the generic MongoDB security checklist as what you should implement to protect your MongoDB installation.

With this being said, the following questions should be on your mind:

  1. How does this list apply to my mLab Cloud hosted MongoDB service?
  2. Are my mLab MongoDB databases as secure as they possibly can be?

With the new Sitecore Azure PaaS offering picking up steam, it's more important than ever to understand mLab's security considerations, as mLab on Azure is the default option that clients are turning to.

The purpose of this post is to help you understand how secure your client's current mLab environment is, or how to secure a new database cluster that you may be working on.

The items being referenced can be found within mLab's security documentation at http://docs.mlab.com/security.

Dedicated Cluster

Every client should be on a Dedicated Cluster plan of some sort.

These plans offer a number of optional security enhancements on top of baseline protections; for example, all deployments always have authentication enabled, no matter what.

Private Environment

An optional security enhancement is using an mLab Private Environment.

This feature allows an mLab deployment to be created in a VPC that can be peered with your own VPC, limiting connections to those coming from the customer-owned network. This is especially useful for applications that have dynamic scaling and non-static IP addresses.

You can read more about mLab Private Environments at http://docs.mlab.com/private-environments/ 

Encryption at Rest

This will encrypt any data as it resides on disk: http://docs.mlab.com/security/#encryption-at-rest

The feature is currently only supported on AWS and Google Cloud Platform and NOT Azure.


Encryption during Transit (SSL)

Without this feature enabled, any communication with your mLab deployment that is not originating from within AWS or Azure is going to take place across the open internet and will be susceptible to packet sniffing.

Even with custom firewall rules in place to limit access to only the IP address(es) (or address ranges) you specify, traffic between the database and the client applications and networks is vulnerable to snooping.

Whether creating a new deployment or upgrading an existing deployment, you can enable SSL support for MongoDB connections directly from the mLab management portal.

It's an extra $80 a month, but it's well worth the investment to ensure privacy, security and data integrity.

The details around this feature can be found here: http://docs.mlab.com/ssl-db-connections/.
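
Once SSL is enabled on the deployment, your Sitecore xDB connection strings just need the ssl option added. A sketch (hosts, ports and database names are placeholders; repeat for tracking.history and tracking.contact as applicable):

 <add name="analytics"  
    connectionString="mongodb://username:password@xxxxxx.mlab.com:12345/sitecore_analytics?ssl=true" />  
 <add name="tracking.live"  
    connectionString="mongodb://username:password@xxxxxx.mlab.com:12345/sitecore_tracking_live?ssl=true" />  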

Custom Firewall Rules

This feature lets you define custom firewall rules so that your database only allows network access from your application infrastructure.

Access can be limited to specific IP address ranges and/or to Amazon EC2 security groups (AWS only).

If you are using AWS, your Security Group must be in EC2-Classic and exist in AWS us-east-1 (the same AWS Region as your database). If your app is in EC2-VPC, consider migrating this deployment to an mLab Private Environment: http://docs.mlab.com/private-environments/

More information about this feature can be found at http://docs.mlab.com/security/#custom-firewalls

Two-factor Authentication for the mLab management console

2FA is optional by default for account users.

Access to the mLab management console provides full and complete access to any deployment within the account, including the ability to create and download backups as well as delete/modify deployments. 

Making 2FA a requirement will reduce the potential for undesired access.

Final Note

These security enhancements are all optional, but also recommended.

mLab's baseline security practices provide a reasonable degree of security, but as you very well know, security is not a binary subject and there are always ways to increase the overall security of a deployment.


Monday, January 23, 2017

Sitecore Cleanup Monitor - Proactively keeping an eye on your Event Queue, History and Publish Queue tables


Background

There are several horror stories floating around the web about the Event Queue bringing Sitecore down to its knees.

Brian Pedersen
https://briancaos.wordpress.com/2016/08/12/sitecore-event-queue-how-to-clean-it-and-why/
https://briancaos.wordpress.com/2014/10/23/sitecore-eventqueue-deadlocks-how-to-solve-them-and-how-to-avoid-them/

Andy Cohen
https://blog.horizontalintegration.com/2016/02/09/sitecore-eventqueue-strikes-again/

I have experienced trouble myself:
http://sitecoreart.martinrayenglish.com/2016/08/diagnosing-content-management-server.html

The Last Straw 

There is a bug in pre-8.1 Update 3 releases (I am on 8.1 Update 2) that will cause the Event Queue table in the Core database to be flooded with timestamp data from your Sitecore servers in a scaled environment.

The issue was related to the property:changed event that was being added to the Event Queue: every 10 seconds, each Sitecore instance would call the SetTimestampForLastProcessing method.

There was no need to inform other instances about the update to the local instance's last-processed timestamp, so Sitecore Support provided me with a patch that simply used the event disabler to suppress the event.
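
The general pattern behind that kind of fix is the standard Sitecore event disabler scope, roughly like this (an illustration of the technique, not the actual patch code):

 using Sitecore.Data.Events;  
  
 // Events raised inside the disabler scope are not propagated, so the local timestamp update  
 // no longer lands in the Event Queue as a property:changed event  
 using (new EventDisabler())  
 {  
   // ... perform the SetTimestampForLastProcessing update here ...  
 }  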

Here is a copy of the patch for download if you are having this problem: https://www.dropbox.com/s/lpjhil5rf9dri0n/Sitecore.Support.99697.zip?dl=0

After experiencing this and other problems in the past, I decided to take action.

Sitecore Cleanup Monitor Module 

The Event Queue was my initial focus, but per Sitecore's Performance Tuning Guide, in order to keep Sitecore running optimally we need to keep the Event Queue, History and Publish Queue tables below 1,000 rows: https://sdn.sitecore.net/upload/sitecore7/70/cms_tuning_guide_sc70-72-a4.pdf. The reason is SQL deadlocking: https://technet.microsoft.com/en-us/library/ms177433(v=sql.105).aspx.

With all this being said, I decided to put together a module that would keep an eye on these key tables.

The module consists of 3 agents that will monitor the Event Queue, Publish Queue and History tables to ensure that they don't exceed a set threshold.



Why would you use it?

In many cases, Sitecore's default cleanup agents just aren't efficient enough in cleaning up these key Sitecore tables.

This module allows you to be proactive instead of reactive, so that you don't have to log into your SQL instance to manually run queries to clean up your tables, usually after the $#!,$h has hit the fan.

How does it work? 

When due, the agent will check the row count of the target table in each database (core, master and web), and if the count is above the set threshold, it will remove the oldest rows, bringing the row count down to the threshold. It won't do anything to tables with row counts that are below the threshold.

You can set how often you want each agent to run, and what you want your threshold / table row count to be. You also don't need to use all three agents. If you only want to monitor the Event Queue for example, simply comment or remove the other agents from the module's config file.
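
Each agent is registered like any other Sitecore agent. A sketch of what an entry in the module's config file looks like (the type, method and parameter names here are illustrative; check the config file that ships with the module for the real ones):

 <agent type="Sitecore.Cleanup.Agents.EventQueueCleanupAgent, Sitecore.Cleanup" method="Run" interval="01:00:00">  
   <!-- Maximum number of rows to keep in the Event Queue table of each database -->  
   <RowThreshold>1000</RowThreshold>  
 </agent>  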

You can monitor its activity by examining your Sitecore logs. Here is a snapshot example:


Installation and Configuration

Documentation, full source code and package download is available from my GitHub repository: https://github.com/martinrayenglish/Sitecore.Cleanup

The module is available on the Sitecore Marketplace: https://marketplace.sitecore.net/Modules/S/Sitecore_Cleanup_Monitor.aspx