
Performance-related considerations

nairdo edited this page Apr 10, 2013 · 9 revisions

Speed is a primary feature of Rock ChMS. Before writing any code, think about performance, and when you write code, code for performance.

Caching Best Practices

Many of the most important entities are cached by the Rock framework (Rock.Web.Cache namespace) including:

  • AttributeCache
  • BlockCache
  • BlockTypeCache
  • CampusCache
  • DefinedTypeCache
  • DefinedValueCache
  • EntityTypeCache
  • FieldTypeCache
  • GlobalAttributesCache
  • PageCache
  • SiteCache

Whenever dealing directly with any of these entities, you should generally read them from cache:

    // example by id
    var campus = Rock.Web.Cache.CampusCache.Read( campusId );

    // example by guid
    var definedValue = DefinedValueCache.Read(
         Rock.SystemGuid.DefinedValue.CHECKIN_SEARCH_TYPE_PHONE_NUMBER );
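Internally these classes follow a read-through pattern: on first access the entity is loaded and stored, and later reads return the stored copy, skipping the database entirely. A minimal, generic sketch of that idea (illustrative only, not Rock's actual implementation):

```csharp
using System;
using System.Collections.Concurrent;

// Illustrative read-through cache; NOT Rock's actual implementation.
public static class SimpleCache<TKey, TValue>
{
    private static readonly ConcurrentDictionary<TKey, TValue> _items =
        new ConcurrentDictionary<TKey, TValue>();

    // Return the cached value, invoking the loader only on first access.
    public static TValue Read( TKey key, Func<TKey, TValue> load )
    {
        return _items.GetOrAdd( key, load );
    }
}
```

Subsequent Read calls for the same key never touch the loader, which is why reading a cached entity is so much cheaper than a database round trip.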

Transaction Queue

Every effort should be made to return the page to the user as quickly as possible. Consider using a transaction for any processing that can be deferred until after the response is sent.

Rock has a built-in transaction queue that defers work until after the page request completes. A block can create a transaction, add it to the queue, and move on. Page analytics is an example: to capture data for pages that have been viewed, a transaction is added to the queue instead of writing to the database while the user waits. In many cases this yields a nearly 100x increase in responsiveness.

WARNING: Transactions in the queue are not persisted. In the event of a server crash, shutdown, or reboot, items in the queue are lost, so use the queue only for non-critical data events.

Using Transactions

A transaction type class must be created for each type of transaction. It must implement the ITransaction interface, which has a single method, Execute(). For example, to implement the page analytics feature described above, a PageViewTransaction.cs class was created with an Execute method consisting of:

    /// <summary>
    /// Execute method to write some data to a file in a Logs folder.
    /// </summary>
    public void Execute()
    {
        string directory = AppDomain.CurrentDomain.BaseDirectory;
        directory = Path.Combine( directory, "Logs" );

        // check that the directory exists
        if ( !Directory.Exists( directory ) )
            Directory.CreateDirectory( directory );

        // create the full path to the file
        string filePath = Path.Combine( directory, "pageviews.csv" );

        // write to the file, disposing the writer so the data is flushed
        using ( StreamWriter w = new StreamWriter( filePath, true ) )
        {
            w.Write( "{0},{1},{2},{3},{4},{5}\r\n", DateViewed.ToString(), PageId.ToString(),
                SiteId.ToString(), PersonId.ToString(), IPAddress, UserAgent );
        }
    }

To use this transaction type in a block, instantiate an object, set its properties, and add it to the transaction queue with RockQueue.TransactionQueue.Enqueue(). Using our working example, this is how the Rock page loader uses PageViewTransaction to record page views:

    PageViewTransaction transaction = new PageViewTransaction();
    transaction.DateViewed = DateTime.Now;
    transaction.PageId = PageInstance.Id;
    transaction.SiteId = PageInstance.Site.Id;
    if ( CurrentPersonId != null )
        transaction.PersonId = (int)CurrentPersonId;
    transaction.IPAddress = Request.UserHostAddress;
    transaction.UserAgent = Request.UserAgent;

    RockQueue.TransactionQueue.Enqueue( transaction );

The Rock queue manager wakes up periodically (currently every 60 seconds) and drains the queue by calling each transaction's Execute() method through the interface.
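That drain step amounts to dequeuing and executing transactions until the queue is empty. A self-contained sketch of the idea (ITransaction is redeclared here as a stand-in for Rock's interface, and QueueSketch is not Rock's actual queue manager):

```csharp
using System.Collections.Concurrent;

// Stand-in for Rock's ITransaction interface: a single Execute() method.
public interface ITransaction
{
    void Execute();
}

public static class QueueSketch
{
    // Thread-safe queue; blocks enqueue into it between drain cycles.
    public static readonly ConcurrentQueue<ITransaction> TransactionQueue =
        new ConcurrentQueue<ITransaction>();

    // Invoked periodically by a timer (Rock currently wakes up every 60 seconds)
    // to execute everything that has accumulated since the last cycle.
    public static void Drain()
    {
        ITransaction transaction;
        while ( TransactionQueue.TryDequeue( out transaction ) )
        {
            transaction.Execute();
        }
    }
}
```

Because the queue is drained on a timer rather than per-request, a burst of page views costs each request only a cheap Enqueue call.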

Sample code can be found in Rock.Transactions. Although this mechanism is very simple, it is also very powerful.

CONSIDERATION: Transactions are meant for short-running tasks. They are not cost-free: they still operate in the IIS context and still consume processing and memory. Long-running tasks should be developed with other alternatives, such as Rock Jobs.

SQL Profiler

After you create a new Block or Entity, run SQL Profiler while exercising your new code; you might discover performance implications you were not aware of. See the Avoiding EF Deferred Execution topic in the Tips and Tricks section of the Entities documentation for a related situation.
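The pitfall that topic covers is easy to reproduce in miniature. LINQ queries are deferred: the query body reruns on every enumeration, and with Entity Framework every enumeration of an un-materialized query can mean another round trip to SQL Server, which the profiler will surface. A LINQ-to-Objects sketch of the behavior (illustrative; no database involved):

```csharp
using System;
using System.Linq;

public static class DeferredExecutionDemo
{
    // Returns how many times the query predicate ran across two Count()
    // calls followed by a ToList(). With a deferred query the predicate
    // reruns on every enumeration.
    public static int Run()
    {
        int predicateCalls = 0;
        var source = Enumerable.Range( 1, 5 );

        // Nothing executes here; the query is only defined.
        var evens = source.Where( n => { predicateCalls++; return n % 2 == 0; } );

        int a = evens.Count();      // first enumeration: predicate runs 5 times
        int b = evens.Count();      // second enumeration: 5 more times
        var list = evens.ToList();  // third enumeration, materialized once

        return predicateCalls;      // 15: three full passes over the source
    }
}
```

Calling ToList() (or ToArray()) up front materializes the results once; with EF that is the difference between one SQL query and one query per enumeration.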
