Tuesday, 25 July 2017
Testing database connectivity by using a Universal Data Link file
After installing a SQL instance, we may need to test the database connectivity from a client host to make sure that everything is working well; for instance, Windows Firewall might be blocking access to the service, or there might be network issues. If there was no time to install SQL Server client tools such as SSMS or SQLCMD to carry out the test, you may be surprised to know that there is a simpler way to do it: via a Data Link (.udl) file. So, in this post I am going to show you how to create and use a Data Link file to test connectivity to a SQL instance. To begin with, open Notepad to create an empty .txt file and save it with the .udl extension, as you can see in the following picture.
After doing that, open the .udl file and you will see a window with four tabs. In the second tab, "Connection", fill in the server name (or SQL instance name) and the credentials accordingly. For instance, I am testing the connectivity to a default SQL instance using Windows Authentication; modify that to suit your needs.
In the first tab, "Provider", we can choose the provider to use in the test. By default, it is "Microsoft OLE DB Provider for SQL Server", and this is also a convenient way to test other providers. It is worth noting that the SQL Server native providers will only be listed if SQL Server client tools are installed locally on the client host from which the test is done. In the "Advanced" tab it is possible to set the timeout value, whereas the "All" tab shows a summary of all the settings, where we can also edit the values of some important connection parameters such as "Language", "Connect Timeout", "Packet Size", "Data Source" and "Initial Catalog".
Finally, we just have to click on "Test Connection" in the "Connection" tab to proceed with the test. If the connectivity to the SQL instance is fine, you will see the message "Test connection succeeded". It couldn't be simpler!
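For reference, a .udl file is nothing more than a small text file holding an OLE DB connection string. Once you save the settings, its content will look roughly like the following sketch (the provider and server name here are just an example, not a recommendation):

[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=master;Data Source=MYSERVER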
That is all for now. I hope you find this post helpful and practical. Let me know any remarks you may have. Stay tuned.
Thursday, 20 July 2017
Detecting excessive compilation and recompilation issues
Undoubtedly, recompilation is a big topic to reckon with, especially in database environments processing data that changes rapidly over time, compounded by ad-hoc workloads that may cause a CPU bottleneck. It is therefore of paramount importance to detect excessive compilation and recompilation issues and address them to guarantee stable query performance. There are several tools for detecting these issues, such as Performance Monitor, Extended Events, SQL Server Profiler traces and DMVs. When it comes to Performance Monitor, we should concentrate our efforts on analysing the counters SQL Server: SQL Statistics: Batch Requests/sec, SQL Server: SQL Statistics: SQL Compilations/sec, and SQL Server: SQL Statistics: SQL Recompilations/sec. What's more, you can save a trace file capturing the events SP:Recompile, SQL:StmtRecompile and CursorRecompile, and then use the following query to see all the recompilation events:

SELECT SPID, StartTime, TextData, EventSubClass, ObjectID, DatabaseID, SqlHandle
FROM sys.fn_trace_gettable('C:\RecompilationTrace_01.trc', 1)
WHERE EventClass IN (37, 75, 166) -- 37 = SP:Recompile, 75 = CursorRecompile, 166 = SQL:StmtRecompile

Furthermore, we can also capture the showplan XML for query compile event, but doing so has a significant performance overhead because the plan is captured for every compilation or recompilation. So, only do it for a very short time, and only if you are seeing a high value for the SQL Compilations/sec counter in Performance Monitor. Once you know where the problem is, you can use the Database Engine Tuning Advisor to see whether any indexing changes improve the compile time and the execution time of the query.

Talking of DMVs to diagnose recompilation issues, looking into 'sys.dm_exec_query_optimizer_info' is very helpful; in particular, look at 'elapsed time', which is the time elapsed due to optimisations, and also 'final cost'. If you see that the elapsed time is very close to the CPU time, you might conclude that the compilation and recompilation time is what accounts for the high CPU usage. Another DMV to use is 'sys.dm_exec_query_stats', whose most important columns are sql_handle, total_worker_time, plan_generation_num (the number of times the query has been recompiled) and statement_start_offset. Here is an example listing the top 20 queries that have been recompiled the most:

SELECT TOP 20 SQLText.text, sql_handle, plan_generation_num, execution_count, dbid, objectid
FROM sys.dm_exec_query_stats
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS SQLText
WHERE plan_generation_num > 1
ORDER BY plan_generation_num DESC

There is a plethora of recommendations for dealing with recompilation and keeping performance in optimal shape, but for now you can take the following options into consideration:
- Check to see whether the stored procedure was created with the WITH RECOMPILE option or whether the RECOMPILE query hint was used. If a procedure was created with the WITH RECOMPILE option, then since SQL Server 2005 you may be able to take advantage of the statement-level RECOMPILE hint if only a particular statement within the procedure needs to be recompiled. Using the hint at the statement level avoids recompiling the whole procedure each time it executes, while still allowing that individual statement to be recompiled (see the sketch after this list).
- Recompilations can occur due to changes in statistics, and you can use the KEEPFIXED PLAN query hint to make recompilations occur only when needed to ensure correctness, not in response to statistics changes. With this hint, recompilation can only occur if the underlying table structure or schema referenced by the statement changes, or if the table is marked with the sp_recompile stored procedure; in both cases the existing plan no longer applies, which triggers the recompilation event.
- Using the KEEP PLAN query hint is useful to make the recompilation threshold of temporary tables the same as that of permanent tables. Take a look at the EventSubClass column, which displays 'Statistics Changed' for an operation on a temporary table.
- Turning off the automatic update of statistics for indexes and statistics defined on a table or indexed view prevents recompilations due to statistics changes on that object. It is worth noting, though, that turning off the auto-update-statistics option is not always a good idea, because the query optimizer is no longer sensitive to data changes in those objects, which can result in suboptimal query plans. To be honest, I have never turned this option off, because I have always preferred to trust SQL Server's judgement and work on optimising the queries instead.
- Keep in mind that recompilation thresholds for temporary tables are lower than for permanent tables, so if the recompilations on a temporary table are due to statistics changes, you can change the temporary tables to table variables. A change in the cardinality of a table variable does not cause a recompilation. The side effect of this approach is that the query optimizer does not keep track of a table variable's cardinality, because statistics are not created or maintained on table variables, which can result in less optimal query plans; test the different options and choose the best one. Generally, temporary tables perform much better than table variables when lots of data is involved.
- Recompilation might also occur as a result of SET option changes, and you can diagnose this by using SQL Server Profiler to determine which SET option changed. It is highly advisable to avoid changing SET options within stored procedures; it is much better to set them at the connection level, and in the overwhelming majority of cases the default SET options work well. It is also very important to ensure that SET options are not changed during the lifetime of the connection.
- To avoid recompilations due to deferred compilations, do not interleave DML with DDL, and do not create DDL as a result of conditional IF statements.
- To avoid recompilation and also to avoid ambiguity between objects, batches should use qualified object names, for example, dbo.Table1, User1.MySP, etc.
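To make a couple of these options concrete, here is a minimal sketch of the statement-level hints mentioned above; the procedure, table and column names are hypothetical:

-- Hypothetical procedure illustrating statement-level hints.
CREATE PROCEDURE dbo.usp_GetOrdersByStatus (@Status INT)
AS
BEGIN
    -- OPTION (RECOMPILE): only this statement is recompiled on each execution,
    -- avoiding the cost of creating the whole procedure WITH RECOMPILE.
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders
    WHERE OrderStatus = @Status
    OPTION (RECOMPILE);

    -- OPTION (KEEPFIXED PLAN): this statement is not recompiled in response to
    -- statistics changes, only when correctness requires it (e.g. schema changes).
    SELECT COUNT(*) AS TotalOrders
    FROM dbo.Orders
    WHERE OrderStatus = @Status
    OPTION (KEEPFIXED PLAN);
END
GO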
Categories:
CPU,
DBA,
Memory,
Performance Tuning,
Recompilation,
Statistics
Friday, 14 July 2017
Ports and Protocols Used by Microsoft SQL Server
Naturally, I have been asked many times about the ports used by SQL Server services and, to be honest, sometimes it took me a while to reply because there is a great number of ports and protocols and it is not easy to remember all of them quickly. Not all of us have to learn everything by rote, so, thinking about it, I decided to share the following lists of useful ports and protocols so that you can have them at hand when needed.
Ports and Protocols Used by Microsoft SQL Server 2000

Service / Purpose | Protocol | Port
Analysis Services | TCP | 2725
Client connections when "hide server" option enabled | TCP | 2433
Clients using Named Pipes over NetBIOS | TCP | 139/445
Microsoft SQL Monitor port | UDP | 1434
OLAP Services connections from downlevel clients (OLAP Services 7.0) | TCP | 2393/2394
SQL over TCP ** | TCP | 1433
Standard URL for a report server (Reporting Services) | TCP | 80 (HTTP) / 443 (SSL)
Ports and Protocols Used by Microsoft SQL Server 2005

Service / Purpose | Protocol | Port
Analysis Services connections via HTTP (default) | TCP | 80
Analysis Services connections via HTTPS (default) | TCP | 443
Clients using Named Pipes over NetBIOS | TCP | 137/138/139/445
Dedicated Administrator Connection | TCP | 1434 by default (local port), but this port is assigned dynamically by SQL Server during startup
Reporting Services on Windows 2003/2008/Vista (default) | TCP | 80
Reporting Services on Windows XP SP2 | TCP | 8080
SQL Server 2005 Analysis Services | TCP | 2383
SQL Server Browser Service | TCP | 2382
SQL Server Integration Services (MSDTSServer) | TCP | 135
SQL Server Resolution Protocol | UDP | 1434
SQL over TCP (default instance) | TCP | 1433
SQL over TCP (named instances) | TCP | 1434 / 1954
Ports and Protocols Used by Microsoft SQL Server 2008/2012/2014/2016/2017

Service / Purpose | Protocol | Port
Analysis Services connections via HTTP (default) | TCP | 80
Analysis Services connections via HTTPS (default) | TCP | 443
Clustering | UDP | 135
Clustering | TCP | 135 (RPC) / 3343 (Cluster Network Driver) / 445 (SMB) / 139 (NetBIOS) / 5000-5099 (RPC) / 8011-8031 (RPC)
Database Mirroring | TCP | No default port. Use the following T-SQL statement to identify which ports are in use: SELECT name, port FROM sys.tcp_endpoints
Dedicated Administrator Connection | TCP | 1434 by default (local port), but this port is assigned dynamically by SQL Server during startup
FILESTREAM | TCP | 139 / 445
Microsoft Distributed Transaction Coordinator (MS DTC) | TCP | 135
Reporting Services Web Services | TCP | 80
Reporting Services configured for use through HTTPS | TCP | 443
Service Broker | TCP | 4022
SQL Server Analysis Services | TCP | 2382 (SQL Server Browser service port for SSAS) / 2383 (clusters will listen only on this port)
SQL Server Browser Service (Database Engine) | UDP | 1434 (might be required when using named instances)
SQL Server Browser Service | TCP | 2382
SQL Server default instance running over an HTTPS endpoint | TCP | 443
SQL Server instance (Database Engine) running over an HTTP endpoint | TCP | 80 / 443 (SSL)
SQL Server Integration Services | TCP | 135 (DCOM)
SQL over TCP (default instance) | TCP | 1433
Transact-SQL Debugger | TCP | 135
Windows Management Instrumentation | TCP | 135 (DCOM)
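By the way, if you need to confirm which TCP port a given instance is actually listening on, a quick check (on SQL Server 2008 and later) is to query the current connections; this is just a convenient sketch, not the only way to find out:

-- Lists the local IP addresses and TCP ports on which the instance is accepting connections.
SELECT DISTINCT local_net_address, local_tcp_port
FROM sys.dm_exec_connections
WHERE local_net_address IS NOT NULL;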
That is all for now. I hope you find this post useful. Let me know any remarks you may have. Stay tuned.
Categories:
DBA,
Ports/Protocols,
Security,
Windows
Sunday, 25 June 2017
Getting the full name of SQL Jobs including the steps in execution
While monitoring a database server, we may need to know some details about the sessions, connections and requests that might be causing performance or blocking issues so that we can take action to fix them. One very important piece of information is the name of the program connected to the database engine. Broadly, it is possible to see the program name by using system stored procedures or DMVs such as 'sp_who2' and 'sys.dm_exec_sessions'. Nevertheless, not every program name is easy to interpret, especially when it comes to SQL Jobs. For instance, if you detected that a SQL Job is the root of the problem and then needed to know which SQL Job it is, the program_name column of 'sp_who2' or 'sys.dm_exec_sessions' would only give you the SQL Job ID, in the following format:
SQLAgent - TSQL JobStep (Job 0x2613DA812CD2D248A9BA377DE6DEF355 : Step 1)
Obviously, we cannot do much with that information because there is no SQL Job name, and even worse, no SQL Job step name. However, we can figure out the name of the SQL Job in msdb.dbo.sysjobs by using the ID:
SELECT * FROM msdb.dbo.sysjobs
WHERE job_id = 0x2613DA812CD2D248A9BA377DE6DEF355

Although it may be relatively easy to get the name of the SQL Job this way, it is not enough, because it is also of paramount importance to know the name of the step in execution, and doing this manually every time it is needed is an uphill battle, especially if there are many SQL Jobs running and causing trouble. Thinking of this situation, I created a script to automate the task of figuring out these details, that is, the name of the SQL Job and of the step that is running. To be more precise, the logic is inside a function called "ufn_GetJobStepNameDesc" that takes the value of the "program_name" column and returns the name of the SQL Job and the step in execution. Let's take a look at the following code, whereby we also restrict the results to the sessions used by SQL Jobs:
SELECT session_id, login_time, login_name, [status], writes,
       logical_reads, [language], DB_NAME(database_id) AS DatabaseName,
       dbo.ufn_GetJobStepNameDesc([program_name]) AS SQLJobDescription
FROM sys.dm_exec_sessions
WHERE [program_name] LIKE 'SQLAgent - TSQL%'

As you can see, the function takes the program_name value, and I have also included other important columns to look at as part of monitoring. Using dbo.ufn_GetJobStepNameDesc([program_name]), the final outcome looks like this: SQLAgent - TSQL JobStep "<Name of the job step>" (Job: <Job name>). For instance: SQLAgent - TSQL JobStep "Updating_Accounts" (Job: SAP_Financial_Process).
Here I share with you my script so that you can check it thoroughly and then make the most out of it.
USE [master]
GO
CREATE FUNCTION [dbo].[ufn_GetJobStepNameDesc] (@step_name_desc VARCHAR(MAX))
RETURNS VARCHAR(MAX)
AS
BEGIN
    DECLARE @full_step_name_desc VARCHAR(MAX)
    DECLARE @jobstep_id_start INT
    DECLARE @jobstep_id_len INT
    DECLARE @jobstep_id INT

    -- Extract the step number that follows ': Step' in the program name.
    SELECT @jobstep_id_start = CHARINDEX(': Step', @step_name_desc) + 7,
           @jobstep_id_len = CHARINDEX(')', @step_name_desc) - @jobstep_id_start
    SET @jobstep_id = CAST(SUBSTRING(@step_name_desc, @jobstep_id_start, @jobstep_id_len) AS INT)

    DECLARE @job_id_start INT
    DECLARE @job_id_len INT
    DECLARE @hexa_job_id VARBINARY(MAX)

    -- Extract the hexadecimal job ID that follows '(Job 0'.
    SELECT @job_id_start = CHARINDEX('(Job 0', @step_name_desc) + 5,
           @job_id_len = CHARINDEX(':', @step_name_desc) - @job_id_start
    SET @hexa_job_id = CONVERT(VARBINARY(MAX), RTRIM(LTRIM(SUBSTRING(@step_name_desc, @job_id_start, @job_id_len))), 1)

    -- Look up the job name and step name in msdb.
    SELECT @full_step_name_desc = 'SQLAgent - TSQL JobStep "' + js.step_name + '" (Job: ' + j.[name] + ')'
    FROM msdb.dbo.sysjobsteps js
    INNER JOIN msdb.dbo.sysjobs j ON j.job_id = js.job_id
    WHERE js.step_id = @jobstep_id AND j.job_id = CAST(@hexa_job_id AS UNIQUEIDENTIFIER)

    RETURN (@full_step_name_desc)
END
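As a quick sanity check, you can call the function directly, passing in a program name like the one shown earlier (the job ID will of course be different on your server):

SELECT dbo.ufn_GetJobStepNameDesc('SQLAgent - TSQL JobStep (Job 0x2613DA812CD2D248A9BA377DE6DEF355 : Step 1)') AS SQLJobDescription;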
Monday, 19 June 2017
Error: could not obtain information about Windows NT group/user
Without any doubt, at times there is a need to use extended stored procedures in SQL Server. For instance, the following error may be raised while using 'xp_logininfo' to get information about the domain account 'MyDomain\MyAccount' (the account you are logged in to SQL Server with). The error may also appear when a SQL Job whose owner is a Windows account tries to authenticate against Active Directory (AD) and the validation fails because of security settings between SQL Server and AD.
Error: 15404, State: 19. Could not obtain information about Windows NT group/user 'MyDomain\MyAccount', error code 0x5.
In order to solve this error, your Network Administrator has to enable the 'Allowed to authenticate' security setting on the domain controllers' computer objects for the account 'MyDomain\MyAccount' in the domain 'MyDomain' by following these steps:
1. Log on to the Domain Controller of the domain 'MyDomain'.
2. Open Active Directory Users and Computers (dsa.msc).
3. Enable 'Advanced Features' under the 'View' menu.
4. Navigate to the domain controller's computer object and open its property window.
5. Click on the 'Security' tab.
6. Add the SQL Server service account 'MyDomain\MyAccount' and enable the 'Allowed to authenticate' setting.
7. Click OK to close the window.
8. Repeat steps 4-7 on each Domain Controller computer object.
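Once the change has been applied on all domain controllers, you can verify it from SQL Server with a quick call like the following, 'MyDomain\MyAccount' being the account in question:

-- Should now return the account type, privilege level and mapped login information from AD.
EXEC xp_logininfo 'MyDomain\MyAccount', 'all';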
Having done that, 'xp_logininfo' will run successfully, bringing the information from Active Directory. That is all for now; let me know any remarks you may have. Thanks for reading. Stay tuned.
Friday, 2 June 2017
sys.dm_exec_requests: total_elapsed_time column might return inconsistent data
While developing useful scripts for administration purposes, I found a bug in the data returned by the 'sys.dm_exec_requests' dynamic management view on SQL Server 2008 R2 SP2 while analysing the total elapsed time of a particular SQL query. Let me expand on that. I detected the error in the 17th second of the execution of the request with 'session_id' 63 (see the picture). Following the sequence of each result, I need to draw your attention to the second query and its second column, where the total elapsed time according to 'sys.dm_exec_requests' should be 17 seconds and not 938, since the previous reading was 16. Now, checking the value of the third column (calculated by subtracting the 'start_time' value from the GETDATE() function), you will verify that this time is accurate whereas the second one is wrong. The values of the second and third columns should be the same, but they are different.
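For the record, the comparison described above can be reproduced with a query along the following lines; session_id 63 is the session from my example, and note that 'total_elapsed_time' is reported in milliseconds:

SELECT session_id,
       total_elapsed_time / 1000 AS elapsed_seconds_dmv,               -- as reported by the DMV
       DATEDIFF(SECOND, start_time, GETDATE()) AS elapsed_seconds_real -- derived from start_time
FROM sys.dm_exec_requests
WHERE session_id = 63;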
To sum up, in this particular and real case, the 'total_elapsed_time' column returned inconsistent information about the elapsed execution time of a process once it exceeded 16 seconds. Despite the fact that I have not seen the same issue in other versions such as SQL Server 2012/2014/2016/2017, it is better not to trust DMVs blindly, so I suggest working with caution. That is all for now; let me know any remarks you may have. Stay tuned.