SQL Server 2012 : Optimizing SQL Server Memory Configuration - Min and Max Server Memory, Optimize for Ad-Hoc Workloads


This section discusses some of the most common memory configuration options for SQL Server.

1. Min and Max Server Memory

Min Server Memory (MB) and Max Server Memory (MB) control the allowable size of all SQL Server's memory usage. In SQL Server 2012, these settings govern almost all of the instance's memory allocations rather than just the buffer pool, which makes sizing SQL Server's memory requirements much easier than with previous versions.

As its name suggests, Min Server Memory controls the minimum amount of physical memory that SQL Server will try to keep committed. We say “try” because it can fall under that value if Windows is desperate enough, but to all intents and purposes it sets a floor for SQL Server’s memory usage.

When the SQL Server service starts, it does not acquire all the memory configured in Min Server Memory but instead starts with only the minimum required, growing as necessary. Once memory usage has increased beyond the Min Server Memory setting, SQL Server won’t release any memory below that amount.

Not surprisingly, Max Server Memory is the opposite of Min Server Memory, setting a ceiling for memory usage. Both values can be set using sp_configure or through Management Studio on the Memory page of the SQL Server Properties window.
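
For example, here is a minimal sketch of setting both values with sp_configure (the figures are placeholders; substitute the values you calculate for your own server):

-- Memory settings are advanced options, so expose them first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Placeholder values: a 2GB floor and a 59.5GB (60928MB) ceiling
EXEC sp_configure 'min server memory (MB)', 2048;
EXEC sp_configure 'max server memory (MB)', 60928;
RECONFIGURE;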

Configuring a maximum value is the more important of the two settings, and it will prevent SQL Server from taking too much memory. This is particularly significant on 64-bit systems, where a lack of free physical memory can cause Windows to trim SQL Server’s working set. See the section “Lock Pages in Memory” for a full description of this issue.

There are several different ways to calculate an appropriate value for configuring Max Server Memory, but two of the most straightforward are as follows:

  • Look at SQL Server’s maximum usage.
  • Determine the maximum potential for memory requirements outside SQL Server.

Each of these options is covered in the following sections.

Looking at SQL Server’s Maximum Usage

With this method, you set SQL Server to dynamically manage memory and then monitor the MSSQL$<instance>:Memory Manager\Total Server Memory (KB) counter using Performance Monitor. This counter measures SQL Server’s total buffer pool usage.

The Total Server Memory value will decrease if other requirements outside SQL Server need more physical memory than is currently free, and then increase again to use any free memory. If you monitor this counter for a period of time that is representative for your business (i.e., it includes busy and slack periods), you can then set Max Server Memory to the lowest value that was observed for Total Server Memory (KB), and you won’t have to worry about SQL Server having to shrink its usage during normal operations.
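
If you would rather sample this from T-SQL than from Performance Monitor, the same counter is exposed through the sys.dm_os_performance_counters DMV; a minimal sketch:

-- Current memory manager usage, as reported by the Total Server Memory counter
SELECT counter_name,
       cntr_value AS total_server_memory_kb
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Total Server Memory (KB)';

Sampling this over a representative period would require scheduling the query (for example, with a SQL Server Agent job), which Performance Monitor gives you for free.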

Determining the Maximum Potential for Requirements Outside SQL Server

This option is the most popular, as the aim is to calculate the worst-case scenario for memory requirements other than SQL Server’s. You should allow for the following:

  • 2GB for Windows
  • xGB for SQL Server worker threads. You can find your max workers count by querying sys.dm_os_sys_info (see the query after this list). Each thread will use 0.5MB on x86, and 2MB on x64.
  • 512MB, if you use linked servers, extended stored procedure dlls, or objects created using Automation procedures (sp_OA calls)
  • 1–3GB, for other applications that might be running on the system, such as backup programs or anti-virus software
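
For reference, here is a minimal query for the worker thread count, with the stack arithmetic assuming an x64 instance (2MB per thread):

-- Maximum worker threads and the memory their stacks could consume on x64
SELECT max_workers_count,
       max_workers_count * 2 AS worker_stack_memory_mb
FROM sys.dm_os_sys_info;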

For example, on a server with eight CPU cores and 64GB of RAM running SQL Server 2012, a third-party backup utility, and a virus checker, you would allow for the following:

  • 2GB for Windows
  • 1GB for worker threads (576 threads × 2MB = 1,152MB, rounded down)
  • 512MB for linked servers, etc.
  • 1GB for the backup program and virus checker

That makes a total allowance of 4.5GB, so you would configure Max Server Memory to 59.5GB.
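
If you want to script that calculation, the following sketch reproduces it from sys.dm_os_sys_info (physical_memory_kb is the SQL Server 2012 column name; the 2GB, 512MB, and 1GB allowances are the same assumptions used above):

-- Worst-case external requirements: Windows + worker threads + linked servers + other apps
SELECT (physical_memory_kb / 1024)   -- total physical RAM in MB
       - 2048                        -- 2GB for Windows
       - (max_workers_count * 2)     -- 2MB per worker thread (x64)
       - 512                         -- linked servers, extended stored procedures, sp_OA
       - 1024                        -- backup and anti-virus software
       AS suggested_max_server_memory_mb
FROM sys.dm_os_sys_info;

Because this version doesn’t round the worker thread figure down to a whole gigabyte, it will land slightly below the hand-calculated 59.5GB.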

Both of these options can be valid in different circumstances. On a single SQL Server from which you need to squeeze every drop of performance, you might use option 1 and monitor Total Server Memory to see how often SQL Server has to give memory back to Windows. However, if you had dozens of SQL Servers to manage or a mission-critical server, you might go with option 2, as it would be easier to calculate across multiple servers and is less likely to cause a failure under exceptional circumstances.

Checking That Your Max Server Memory Is Effective

How you decide to configure Max Server Memory when you build a server (there are many opinions on the matter) isn’t as important as measuring its effectiveness, and adjusting it, once the server has run its expected workload. An easy way to do this is with two Performance Monitor counters: MSSQL$<instance>:Buffer Manager\Page Life Expectancy (PLE) (also see the section “Clerks, Caches, and the Buffer Pool”) and Memory\Available MBytes. The balance between these two counters, described in the list below and followed by a query sketch, shows you how effective your Max Server Memory setting is.

  • PLE: Shows you how many seconds SQL Server expects to keep a page in the data cache and is a good measure of memory pressure on SQL Server
  • Available MBytes: Shows how much physical RAM Windows has that isn’t doing anything
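
Available MBytes is a Windows counter, so it has to come from Performance Monitor, but PLE can also be read from T-SQL; a minimal sketch (the object_name filter targets the instance-wide Buffer Manager object rather than the per-NUMA-node Buffer Node counters):

-- Page Life Expectancy in seconds for the whole instance
SELECT cntr_value AS page_life_expectancy_seconds
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Buffer Manager%'
AND counter_name = 'Page life expectancy';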

If your PLE is low (<300 is definitely low but you might choose a higher threshold), then check your Available MBytes to see how much unused RAM is available. Windows starts to aggressively trim (see next section) all application working sets when it has less than 5MB available, so anything close to this on a production server should be considered an urgent problem.

The minimum Available MBytes you should have is 100MB, but even that is cutting it close because any application run on your SQL Server can easily use it up. Instead, aim for 500 or 600MB as a minimum, or even 1GB to be safe. That way, if you need to run any support tools on the server, there will be plenty of RAM for them.

So, if your PLE is low and you have plenty of Available MBytes because you were conservative with your Max Server Memory setting, then you have scope to increase your Max Server Memory, thereby increasing your PLE. Conversely, if your Available MBytes is low because you were aggressive with your Max Server Memory setting and your PLE is very high, then you can reduce your Max Server Memory to give some RAM back to Windows.

Here are some example scenarios to illustrate this point:

  • Max Server Memory is 30GB on a server with 32GB RAM. PLE averages 10,000 and Available MBytes is 90MB. Solution: Lower Max Server Memory by at least 500MB.
  • Max Server Memory is 46GB on a server with 50GB RAM. PLE averages 10 and Available MBytes is 1500MB. Solution: Increase Max Server Memory by 500–1,000MB.
  • Max Server Memory is 60GB on a server with 64GB RAM. PLE averages 50 and Available MBytes is 20MB. Solution: Lower Max Server Memory by 100MB and buy more RAM (quickly).

Lock Pages in Memory

Lock Pages in Memory (LPIM) is used as a work-around for a problem that can occur between Windows and SQL Server, and it was especially bad on older versions of SQL Server, which could run on Windows Server 2003 and earlier.

If there isn’t enough free physical memory in Windows to service a request for resources from a driver or another application, Windows will trim the working set (which refers to the physical memory usage of a process) of all applications running on the server. This is normal behavior and shouldn’t have much noticeable impact.

Windows Server 2003 didn’t cope very well with badly written drivers and could actually force all applications to empty their working sets. This is known as aggressive working set trimming and had a devastating effect on SQL Server’s memory allocation — and therefore performance. So that you could see when this happened, Microsoft added a message to the SQL Server Error Log. Here is an example:

A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 0 seconds. Working set (KB): 1086400, committed (KB): 2160928, memory utilization: 50%.

This behavior changed significantly in Windows Server 2008 and later, which prevents the biggest problem: badly written drivers causing application working sets to be emptied. This won’t affect SQL Server 2012, because it only runs on Windows Server 2008 and later.

In SQL Server 2012, you will still get messages logged when Windows performs working set trimming; a succession of these messages can indicate a gradual decline of SQL Server’s working set, which is still a problem.

Resolving this issue (or avoiding it altogether) can be approached in two ways:

  • Set Max Server Memory appropriately to ensure that Windows and other processes running on the server have enough physical memory to perform their work without asking SQL Server to trim. See the previous section on Min and Max Server Memory for more details.
  • If you’re still seeing the issue (or if its effects are so severe you don’t want to risk seeing it again), you can configure your SQL Server to use Locked Pages in Memory (LPIM).

When LPIM is enabled, SQL Server’s buffer pool pages are “locked” and non-pageable so Windows can’t take them when trimming.

Once the pages are locked, they’re not considered part of available memory for working set trimming. However, only SQL Server buffer pool allocations can be locked — Windows can still trim the working sets of other processes, affecting resources on which SQL Server depends.
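
Besides the Error Log message shown at the end of this section, one way to check whether locked pages are actually in use is to query sys.dm_os_process_memory; a minimal sketch:

-- A nonzero locked_page_allocations_kb indicates LPIM is in effect
SELECT locked_page_allocations_kb,
       physical_memory_in_use_kb,
       page_fault_count
FROM sys.dm_os_process_memory;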

LPIM should be used if you continue to see working set trimming after setting a suitable Max Server Memory, or if the consequences of SQL Server’s working set being trimmed again are too severe to risk.

Whether or not it should be used as a default best practice on all your SQL Servers is a common debate. One perspective is that it’s a work-around and not intended to replace the default behavior on every SQL Server implementation. Administrators who believe this don’t set it unless they know it’s going to fix a problem. Another perspective is that setting it by default on every implementation is a good preventative measure, which avoids working set trimming ever happening.

Ultimately, it comes down to personal choice, and whether or not you enable it by default is less important than understanding what it does and making an educated decision, rather than blindly enabling the feature because someone advised it.


NOTE
Having read what LPIM was introduced to fix, it’s also worth noting that a side-effect of using locked pages is that they require slightly less overhead to manage (because they can’t be moved). This can translate into a real performance benefit on large-scale, high-throughput SQL Servers, so it’s definitely worth testing on your most performance-sensitive servers to see if it helps.

If LPIM is working, you’ll see the following message in the SQL Server Error Log:

Using Locked Pages in the Memory Manager.

2. Optimize for Ad-Hoc Workloads

If an execution plan is never reused, it’s just taking up resources unnecessarily, and unparameterized ad-hoc T-SQL is the most likely cause.

When you execute code in SQL Server, it generates a hash value of your code and uses that to determine plan reuse. If you execute a stored procedure, a hash value is generated from the stored procedure name, and the plan will be reused on each subsequent procedure call regardless of the parameter values used.

If you run the same code outside of a stored procedure (ad-hoc T-SQL), the hash is taken on the whole statement, including any literal values. When you then change the literal values for another execution, the hash is different, so SQL Server doesn’t find a match and generates a new execution plan instead of reusing the previous one.

This situation can lead to a scenario called plan cache bloat, whereby potentially thousands of ad-hoc plans are generated and cached with a usecount of 1 even though the code is fundamentally the same.
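
You can see this for yourself by running the same statement twice with only a literal changed, and then inspecting the plan cache. A sketch, using sys.objects so it works anywhere (the NOT LIKE clause just filters out the diagnostic query itself):

-- Two statements that are identical apart from the literal value
SELECT name FROM sys.objects WHERE object_id = 1;
SELECT name FROM sys.objects WHERE object_id = 2;

-- Each statement gets its own ad-hoc plan with a use count of 1
SELECT cp.usecounts, cp.size_in_bytes, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE cp.objtype = 'Adhoc'
AND st.text LIKE '%sys.objects WHERE object_id%'
AND st.text NOT LIKE '%dm_exec_cached_plans%';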

The ideal solution is to use stored procedures or functions, or to parameterize all your ad-hoc T-SQL; but this can be very challenging, and often unachievable due to complexity and company politics, so Microsoft introduced the Optimize for Ad-hoc Workloads server-level option in SQL Server 2008 to help.

When this option is enabled, SQL Server caches only a small plan stub the first time a piece of ad-hoc T-SQL is executed, rather than the full plan. If the same statement is executed again, the plan is compiled again, but this time it is cached in full. This avoids the scenario of thousands of single-use plans taking up valuable space in cache.
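
Enabling it is a one-line configuration change; for example:

-- 'optimize for ad hoc workloads' is an advanced option
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;

To gauge whether your own plan cache warrants it, start by sizing the cache: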

SELECT count(*) AS 'Number of Plans',
sum(cast(size_in_bytes AS BIGINT))/1024/1024 AS 'Plan Cache Size (MB)'
FROM sys.dm_exec_cached_plans

The preceding code produced the following results on a SQL Server 2012 instance with Max Server Memory set to 32GB:

Number of Plans   Plan Cache Size (MB)
14402             2859

Almost 3GB of memory is being used to cache plans, so it’s significant enough to investigate the usage details. The following script breaks down the plan cache size by cached object type:

SELECT objtype AS 'Cached Object Type',
count(*) AS 'Number of Plans',
sum(cast(size_in_bytes AS BIGINT))/1024/1024 AS 'Plan Cache Size (MB)',
avg(usecounts) AS 'Avg Use Count'
FROM sys.dm_exec_cached_plans
GROUP BY objtype

The results are as follows:

Cached Object Type   Number of Plans   Plan Cache Size (MB)   Avg Use Count
UsrTab               10                0                      222
Prepared             286               72                     4814
View                 216               20                     62
Adhoc                13206             2223                   39
Check                30                0                      7
Trigger              22                5                      1953
Proc                 738               554                    289025

As you can see, most of the plan cache is taken up with ad-hoc plans, with an average use of 39, which is quite low; therefore, you now want to determine how many of those are single-use plans by modifying the earlier cache sizing script:

SELECT count(*) AS 'Number of Plans',
sum(cast(size_in_bytes AS BIGINT))/1024/1024 AS 'Plan Cache Size (MB)'
FROM sys.dm_exec_cached_plans
WHERE usecounts = 1
AND objtype = 'adhoc'

Here are the results:

Number of Plans   Plan Cache Size (MB)
12117             332

This indicates that 332MB of cache is being used for plans that aren’t being reused, which isn’t a huge amount on this server, but it’s completely wasted, so there’s no reason not to get rid of these plans.
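
If you don’t want to wait for those plans to age out of the cache, you can also flush them manually. Note that this sketch clears all ad-hoc and prepared plans, not just the single-use ones, so run it with care on a production server:

-- Flushes ad-hoc and prepared plans from the plan cache
DBCC FREESYSTEMCACHE('SQL Plans');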

The Optimize for Ad-hoc Workloads option ensures that this scenario will never occur, and because it only affects ad-hoc plans, we recommend switching it on by default in all installations of SQL Server.

 