This is a total repost of Stacia’s blog post from this morning so that hopefully everyone will see it.  So pretend that Stacia wrote this and that I didn’t.

Yesterday, Denny Cherry (blog|twitter) and I co-presented a 24HOP session for the Fall 2011 lineup, “So How Does the BI Workload Impact the Database Engine?” 24HOP stands for 24 Hours of PASS and is a semiannual roundup of speakers from the SQL Server community. Initially, this event consisted of 24 consecutive sessions, each lasting an hour, but later it became a two-day event with 12 consecutive sessions each day. The sessions are free to attend and cover a great range of topics across the full spectrum of SQL Server. Even if you missed previous 24HOP events, you can always go back and view recordings of sessions that interest you at the 24HOP site for Spring 2011 and Fall 2010.

And if you missed Denny and me yesterday, a recording will be available in a couple of weeks and I’ll update this post with a link. Our hour-long session for 24HOP was a sneak preview of our upcoming half-day session of the same name that we’ll be presenting at the PASS Summit in Seattle on Thursday, October 13, 2011 from 1:30 PM to 4:30 PM. In our half-day session, we’ll dig into the details and spend more time on database engine analysis, whereas in our 24HOP session, we focused on reviewing the architecture and highlighting the connection between BI components and the database engine.

We were able to answer a few questions at the end, but one question in particular couldn’t be answered in a sentence or two in the time allotted: How much RAM do I need to plan for Integration Services (SSIS)? Andy Leonard (blog|twitter) did manage a succinct response: All of it! I, on the other hand, am not known for being succinct, so I deferred the question to this post.

Andy is right that SSIS wants as much memory as you can give it, which can be problematic if you’re executing an SSIS package on the same box as SQL Server. On the other hand, there are also benefits to executing the package on the same box, so there is no one-size-fits-all solution, and what works for one data integration scenario might not be right for another. A lot depends on the CPU and RAM resources a given server has and how much data is involved. To know how much horsepower you need, you’re going to have to do some benchmark testing with your packages. Here are some good resources for SSIS if you’re concerned about memory:

  • Top 10 SQL Server Integration Services Best Practices from the SQL Customer Advisory Team (blog|twitter): This article provides an overview of best practices (as the name implies!) and includes links to information about using performance counters to monitor resource usage and about optimizing the Lookup transformation, which is one of the big memory consumers in SSIS. (There’s a quick counter-sampling sketch right after this list.)
  • SQL Server 2005 Integration Services: A Strategy for Performance, a whitepaper by my friend, former colleague, and co-author of my first book, Elizabeth Vitt. Although it was written for SSIS 2005, the principles related to tuning packages and how to benchmark still apply. The significant changes between SSIS 2005 and SSIS 2008 with regard to performance were improvements in thread management and in the Lookup transformation.
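
If you want to see how hungry a package really is before committing hardware, those SSIS Pipeline performance counters mentioned in the SQLCAT article are the place to start. Here’s a minimal Python sketch that shells out to the built-in Windows typeperf utility to sample a few of them while a package runs. The exact counter paths are assumptions on my part, especially the versioned “SQLServer:SSIS Pipeline 10.0” object name (that’s the SSIS 2008 flavor), so verify the names in Performance Monitor on your own server first.

```python
import subprocess

# Counters worth watching while a package executes. The "SQLServer:SSIS
# Pipeline 10.0" object name is an assumption (it varies by SSIS version) --
# check Performance Monitor on your server for the exact spelling.
COUNTERS = [
    r"\Memory\Available MBytes",
    r"\SQLServer:SSIS Pipeline 10.0\Buffer memory",
    r"\SQLServer:SSIS Pipeline 10.0\Buffers spooled",
]

def sample_counters(samples=12, interval_seconds=5):
    """Sample the counters every few seconds and print typeperf's CSV output."""
    cmd = ["typeperf"] + COUNTERS + ["-si", str(interval_seconds), "-sc", str(samples)]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)

if __name__ == "__main__":
    sample_counters()
```

Keep an eye on Buffers spooled in particular: if it climbs above zero during a run, the data flow has run out of memory and is writing buffers to disk, which is usually the first sign that the package (or the box it’s running on) needs more RAM.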

Is there a rule of thumb for deciding how much memory you’ll need for SSIS? Well, no less than 4 GB per CPU core is a good place to start. But if that’s not possible, you certainly want to have memory that’s at least two or three times the size of data that you expect to be processing at a given time. So if you’re processing 1 GB of data, you’ll want at least 2-3 GB of memory and, of course, more memory is even better!
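
To make that arithmetic concrete, here’s a tiny Python sketch of those rules of thumb. Combining them by taking the larger of the two numbers is just my simplification, and the function name and parameters are purely illustrative; the real answer still comes from benchmarking your own packages.

```python
def ssis_memory_estimate_gb(cpu_cores, data_volume_gb, multiplier=3):
    """Rough starting point for SSIS memory sizing, in GB (not a guarantee)."""
    per_core_estimate = 4 * cpu_cores            # no less than 4 GB per core
    data_estimate = multiplier * data_volume_gb  # 2-3x the data processed at once
    return max(per_core_estimate, data_estimate)

# Example: an 8-core server processing roughly 10 GB per load window.
print(ssis_memory_estimate_gb(cpu_cores=8, data_volume_gb=10))  # 32 GB to start
```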
