Showing posts with label Memory. Show all posts

Saturday 2 December 2017

Memory + CPU support for all SQL Server versions and editions

Every DBA should have a list like the following one at hand in order to be clear about some aspects before installing or implementing new database environments. It is a compact summary of what the different versions and editions of SQL Server support in terms of memory and CPU resources. Consequently, I share my summary with you, and I hope you find it useful. Let me know any remarks you may have.

SQL Server 2016/2017
SQL Server edition | Maximum memory supported | Maximum compute capacity
Enterprise | OS max | OS max
Standard | 128 GB | 4 CPU or 24 cores
Web | 64 GB | 4 CPU or 16 cores
Express | 1410 MB | 1 CPU or 4 cores
Express with Advanced Services | 1410 MB | 1 CPU or 4 cores

SQL Server 2014
SQL Server edition | Maximum memory supported | Maximum compute capacity
Enterprise | OS max | OS max
Business Intelligence | 128 GB | 4 CPU or 16 cores
Standard | 128 GB | 4 CPU or 16 cores
Web | 64 GB | 4 CPU or 16 cores
Express | 1 GB | 1 CPU or 4 cores
Express with Tools | 1 GB | 1 CPU or 4 cores
Express with Advanced Services | 1 GB | 1 CPU or 4 cores

SQL Server 2012
SQL Server edition | Maximum memory supported | Maximum compute capacity
Enterprise | OS max | OS max
Business Intelligence | 64 GB | 4 CPU or 16 cores
Standard | 64 GB | 4 CPU or 16 cores
Web | 64 GB | 4 CPU or 16 cores
Express | 1 GB | 1 CPU or 4 cores
Express with Tools | 1 GB | 1 CPU or 4 cores
Express with Advanced Services | 1 GB | 1 CPU or 4 cores

SQL Server 2008 R2
SQL Server edition | Maximum memory supported | Maximum compute capacity
Datacenter | OS max | OS max (64 CPU or 256 logical CPU)
Enterprise | 2 TB | 8 CPU or 256 logical CPU
Developer | OS max | OS max
Standard | 64 GB | 4 CPU
Web | 64 GB | 4 CPU
Workgroup | 4 GB (64-bit), OS max (32-bit) | 2 CPU
Express | 1 GB | 1 CPU
Express with Tools | 1 GB | 1 CPU
Express with Advanced Services | 1 GB (4 GB for Reporting Services) | 1 CPU

SQL Server 2008
SQL Server edition | Maximum memory supported | Maximum compute capacity
Enterprise | OS max | OS max (8 CPU or 64 logical CPU)
Developer | OS max | OS max
Standard | OS max | 4 CPU
Web | OS max | 4 CPU
Workgroup | 4 GB (64-bit), OS max (32-bit) | 2 CPU
Express | 1 GB | 1 CPU
Express with Tools | 1 GB | 1 CPU
Express with Advanced Services | 1 GB | 1 CPU

SQL Server 2005
SQL Server edition | Maximum memory supported | Maximum compute capacity
Enterprise Edition | OS max | OS max
Developer Edition | OS max (32-bit), 32 TB (64-bit) | OS max
Standard Edition | OS max, 32 TB (64-bit) | 4 CPU
Workgroup Edition | 3 GB (32-bit) | 2 CPU (32-bit)
Express Edition | 1 GB (32-bit) | 1 CPU (32-bit)
Evaluation Edition | OS max | OS max
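
As a quick complement to the table above, the following script is a minimal sketch to see which version and edition an instance is running, how many logical CPUs it sees, and how much physical memory is visible to it, so you can contrast it with the limits listed. It assumes SQL Server 2012 or later (older versions expose physical_memory_in_bytes instead of physical_memory_kb, as shown in the post below):

-- Version, edition, logical CPUs and visible physical memory of the current instance
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('Edition') AS Edition,
       cpu_count AS LogicalCPUs,
       cast(physical_memory_kb /1024.0/1024.0 as decimal(10,2)) AS PhysicalRAM_GB
FROM sys.dm_os_sys_info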

Thursday 7 September 2017

Getting information about memory used by SQL Server

Naturally, every DBA is asked to report on the memory used by SQL Server. Common questions are about the memory currently reserved and the memory currently in use, and sometimes we may not know how to answer them. How many times did we look at Windows Task Manager to find that information, only to end up frustrated because it did not help much? Over time Microsoft gave administrators more useful tools, such as Resource Monitor, which reports memory information per process; nevertheless, if we want to get that information from SQL Server itself it can be an uphill battle, as it would require writing fairly complex code at the Windows level. With that situation in mind, I decided to share some helpful scripts that will alleviate the pain.

If you need to get the total buffer pool memory used by all databases at the SQL instance level, this is the script for it:

-- Each row in sys.dm_os_buffer_descriptors represents one 8 KB page: 128 pages = 1 MB
SELECT cast( cast( COUNT(*) /128.0/1024.0 as decimal(10,2)) as varchar(10)) + 'GB'  AS TotalUsageBufferPool
FROM sys.dm_os_buffer_descriptors

And if you want to know the buffer pool memory used by each database, you can use this:

SELECT CASE database_id
         WHEN 32767 THEN 'ResourceDb'
         ELSE DB_NAME(database_id)
       END AS DatabaseName,
       cast(COUNT(*) /128.0 as decimal(10,2)) AS [BufferPool(MB)]
FROM sys.dm_os_buffer_descriptors
GROUP BY DB_NAME(database_id), database_id
ORDER BY [BufferPool(MB)] DESC

Finally, and most importantly, here is how to answer the question about the total memory used by the whole SQL instance:
-- SQL2012/2014/2016/2017
select cast(cast(physical_memory_kb /1024.0/1024.0  as decimal(10,2))  as varchar(10)) + 'GB' TotalPhysicalRAM, 
cast(cast(visible_target_kb /1024.0/1024.0  as decimal(10,2)) as varchar(10)) + 'GB' MaxRAM, -- max memory configured at SQL Server level
cast(cast(committed_target_kb /1024.0/1024.0  as decimal(10,2)) as  varchar(10)) + 'GB' ReservedRAM,  -- memory reserved
cast(cast(committed_kb /1024.0/1024.0  as decimal(10,2)) as  varchar(10)) + 'GB' UsedRAM -- memory currently used
FROM sys.dm_os_sys_info

-- for SQL2005/2008/2008R2
SELECT cast(cast(physical_memory_in_bytes /1024.0/1024.0/1024.0  as decimal(10,2))  as varchar(10)) + 'GB' TotalPhysicalRAM, 
cast(cast(bpool_visible /128.0/1024.0  as decimal(10,2)) as varchar(10)) + 'GB' MaxRAM, -- max memory configured at SQL Server level
cast(cast(bpool_commit_target /128.0/1024.0  as decimal(10,2)) as  varchar(10)) + 'GB' ReservedRAM,  -- memory reserved
cast(cast(bpool_committed /128.0/1024.0  as decimal(10,2)) as  varchar(10)) + 'GB' UsedRAM -- memory currently used
FROM sys.dm_os_sys_info
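
If you also want to cross-check the 'max server memory' setting itself (mentioned in the comments above), a simple query against sys.configurations works on all of these versions; this is just a complementary sketch:

-- Configured and running values of the 'max server memory' setting (in MB)
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'max server memory (MB)'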

That is all for now. I hope you find these scripts helpful. Let me know any remarks you may have.

Thursday 20 July 2017

Detecting excessive compilation and recompilation issues

Undoubtedly, recompilation is a big topic to reckon with, especially in database environments processing data that changes rapidly over time, compounded by ad hoc workloads that may cause a CPU bottleneck. It is therefore of paramount importance to detect excessive compilation and recompilation issues and address them to guarantee stable query performance. There are several tools for detecting these issues, such as Performance Monitor, Extended Events, SQL Server Profiler traces, DMVs, etc. When using Performance Monitor, we should concentrate our efforts on analysing the counters SQL Server: SQL Statistics: Batch Requests/sec, SQL Server: SQL Statistics: SQL Compilations/sec, and SQL Server: SQL Statistics: SQL Re-Compilations/sec. What's more, you can save a trace file capturing the events SP:Recompile, SQL:StmtRecompile, and CursorRecompile, and then use the following query to see all the recompilation events:

select spid, StartTime, TextData, EventSubClass, ObjectID, DatabaseID, SQLHandle 
from fn_trace_gettable('C:\RecompilationTrace_01.trc', 1)
where EventClass in (37, 75, 166) -- 37 = SP:Recompile, 75 = CursorRecompile, 166 = SQL:StmtRecompile
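
If you prefer to sample those Performance Monitor counters from T-SQL, they are also exposed through sys.dm_os_performance_counters; this is a minimal sketch (the object_name prefix varies between default and named instances, hence the LIKE):

-- Compilation-related counters exposed by SQL Server itself
SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%SQL Statistics%'
  AND counter_name IN ('Batch Requests/sec', 'SQL Compilations/sec', 'SQL Re-Compilations/sec')

Bear in mind that these values are cumulative since the instance started, so to get per-second rates you need to take two samples and compute the difference.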

Furthermore, we can also capture the Showplan XML for Query Compile event, but doing so has a significant performance overhead because the plan is captured for each compilation or recompilation. So do it only for a very short time, and only while you see a high value for the SQL Compilations/sec counter in Performance Monitor. Once you know where the problem is, you can use the Database Engine Tuning Advisor to see whether any indexing changes improve the compile time and the execution time of the query.
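Because Extended Events were mentioned above as another option, here is a minimal sketch of a session that captures statement recompilations together with the statement text; I am assuming the sqlserver.sql_statement_recompile event and a sample target file path, so adjust both to your environment:

-- Lightweight Extended Events session to track statement-level recompilations
CREATE EVENT SESSION [TrackRecompiles] ON SERVER
ADD EVENT sqlserver.sql_statement_recompile
(
    ACTION (sqlserver.database_name, sqlserver.sql_text)
)
ADD TARGET package0.event_file (SET filename = N'C:\Temp\TrackRecompiles.xel')
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS)
GO
ALTER EVENT SESSION [TrackRecompiles] ON SERVER STATE = START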
Talking of DMVs to diagnose recompilation issues, looking into sys.dm_exec_query_optimizer_info is very helpful; in particular, look at the 'elapsed time' counter, which is the time elapsed due to optimizations, and also at 'final cost'. If you see that the elapsed time is very close to the CPU time, you may conclude that the compilation and recompilation time is responsible for that high CPU use. Another DMV to use is sys.dm_exec_query_stats, whose most relevant columns are sql_handle, total_worker_time, plan_generation_num (the number of times the query has been recompiled), and statement_start_offset. Here is an example to check the top 20 queries that have been recompiled the most:

select top 20 SQLText.text, sql_handle, plan_generation_num, execution_count, dbid, objectid 
from sys.dm_exec_query_stats
   cross apply sys.dm_exec_sql_text(sql_handle) as SQLText
where plan_generation_num >1
order by plan_generation_num desc
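
And to look at the optimizer-level counters mentioned above, a simple sketch against sys.dm_exec_query_optimizer_info is enough (values are cumulative since the instance started):

-- Cumulative optimizer statistics, including the time spent on optimizations
SELECT counter, occurrence, value
FROM sys.dm_exec_query_optimizer_info
WHERE counter IN ('optimizations', 'elapsed time', 'final cost')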

There is a plethora of recommendations for dealing with recompilation and keeping performance in optimal condition, but for now you can take the following options into consideration:
  • Check whether the stored procedure was created with the WITH RECOMPILE option or whether the RECOMPILE query hint was used. If a procedure was created with the WITH RECOMPILE option, since SQL Server 2005 you may be able to take advantage of a statement-level RECOMPILE hint if a particular statement within that procedure needs to be recompiled. Using this hint at the statement level avoids the need to recompile the whole procedure each time it executes, while still allowing the individual statement to be recompiled (see the sketch after this list). 
  • Recompilations can occur due to changes in statistics, and you can use the KEEPFIXED PLAN query hint to make recompilations occur only when needed to ensure correctness, not in response to changes in statistics. With this hint, recompilation can only occur if the underlying table structure or schema referenced by a statement changes, or if a table is marked with the sp_recompile stored procedure, in which cases the plan no longer applies and the recompilation event is triggered. 
  • Using the KEEP PLAN query hint is useful to set the recompilation threshold of temporary tables to be the same as permanent tables. Take a look at the EventSubclass column which displays 'Statistics Changed' for an operation on a temporary table.
  • Turning off the automatic updates of statistics for indexes and statistics that are defined on a table or indexed view prevents recompilations that are due to statistics changes on that object. It is worth noting that turning off the auto-stats option is not always a good idea, because the query optimizer is no longer sensitive to data changes in those objects, which can result in suboptimal query plans. To be honest, I have never turned off this option because I have always preferred to trust SQL Server's judgement and instead to work on optimising queries.
  • Keep in mind that recompilation thresholds for temporary tables are lower than for normal tables, so if the recompilations on a temporary table are due to statistics changes, you can change the temporary tables to table variables. A change in the cardinality of a table variable does not cause a recompilation. The side effect of this approach is that the query optimizer does not keep track of a table variable's cardinality because statistics are not created or maintained on table variables. This can result in less optimal query plans; however, you can test the different options and choose the best one. Generally, temporary tables provide much better performance than table variables when lots of data is involved.
  • Recompilation might also occur as a result of SET option changes, so you can diagnose it by using SQL Server Profiler to determine which SET option changed. It is highly advisable to avoid changing SET options within stored procedures; it is much better to set them at the connection level, and for the overwhelming majority of cases the default SET options work well. It is also very important to ensure that SET options are not changed during the lifetime of the connection.
  • To avoid recompilations that are due to deferred compilations, do not combine DML with DDL and do not create the DDL as a result of conditional IF statements.
  • To avoid recompilation and also to avoid ambiguity between objects, batches should have qualified object names, for example, dbo.Table1, User1.MySP, etc.
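As referenced in the first bullet, here is a minimal sketch of the statement-level hints mentioned above (RECOMPILE, KEEPFIXED PLAN and KEEP PLAN); the table names and filter values are hypothetical placeholders:

-- Statement-level RECOMPILE: only this statement is recompiled on each execution
SELECT OrderID, Total
FROM dbo.Orders            -- hypothetical table
WHERE CustomerID = 42
OPTION (RECOMPILE);

-- KEEPFIXED PLAN: recompile only for correctness, not in response to statistics changes
SELECT OrderID, Total
FROM dbo.Orders
WHERE CustomerID = 42
OPTION (KEEPFIXED PLAN);

-- KEEP PLAN: relaxes the recompilation threshold for temporary tables
SELECT OrderID, Total
FROM #OrdersStaging        -- hypothetical temporary table
WHERE CustomerID = 42
OPTION (KEEP PLAN);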
That is all for now, let me know any remarks you may have.