Tuesday, 25 July 2017
Testing database connectivity by using a Universal Data Link file
After installing a SQL Server instance, we may need to test database connectivity from a client host to make sure everything is working properly; for instance, Windows Firewall might be blocking access to the service, or there might be network issues. Moreover, if there was no time to install SQL Server client tools such as SSMS or SQLCMD to carry out the test, you might be surprised to learn that there is a simpler way to do it: via a Universal Data Link (.udl) file. Consequently, in this post I am going to show you how to create and use a Data Link file to test connectivity to a SQL instance. To begin with, open Notepad to create an empty .txt file and save it with the .udl extension, as you can see in the following picture.
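Behind the scenes, a .udl file is just a small text file holding an OLE DB connection string; the Data Link dialog fills it in for you. As a hedged illustration (the server name MYSERVER and the database are placeholders), a finished .udl file typically looks like this when opened in Notepad:

```ini
[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=master;Data Source=MYSERVER
```

Note that the dialog saves the file in Unicode (UTF-16) encoding, so if you edit it by hand, keep that encoding.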
After doing that, open the .udl file and you will see a window with four tabs. In the second tab, "Connection", fill in the server name (or SQL instance name) and the credentials accordingly. For instance, I am testing connectivity to a default SQL instance using Windows Authentication. Modify that to serve your needs.
In the first tab, "Provider", we can choose the provider to use in the test. By default, it is "Microsoft OLE DB Provider for SQL Server", and this is also a handy way to test other providers. It is worth noting that the SQL Server Native Client providers will only be listed if SQL Server client tools are installed locally on the host from where the test is done. In the "Advanced" tab it is possible to set the timeout value, whereas the "All" tab shows a summary of all the settings and also lets us edit the values of some important connection parameters such as "Language", "Connect Timeout", "Packet Size", "Data Source" and "Initial Catalog".
Finally, we just have to click on "Test Connection" in the "Connection" tab to run the test. If connectivity to the SQL instance is fine, you will see the message "Test connection succeeded". It couldn't be simpler!
That is all for now. I hope you find this post helpful and practical. Let me know any remarks you may have. Stay tuned.
Thursday, 20 July 2017
Detecting excessive compilation and recompilation issues
Undoubtedly, recompilation is a big topic to reckon with, especially in database environments processing data that changes rapidly over time, compounded by ad hoc workloads that may cause a CPU bottleneck. It is therefore of paramount importance to detect excessive compilation and recompilation issues and address them to guarantee stable query performance. There are several tools for detecting these issues, such as Performance Monitor, Extended Events, SQL Server Profiler traces, and DMVs. When it comes to using Performance Monitor, we should concentrate on analysing the performance counters SQL Server: SQL Statistics: Batch Requests/sec, SQL Server: SQL Statistics: SQL Compilations/sec, and SQL Server: SQL Statistics: SQL Recompilations/sec. What's more, you can save a trace file capturing the events SP:Recompile, SQL:StmtRecompile, and CursorRecompile, and then use the following query to see all the recompilation events:

select spid, StartTime, Textdata, EventSubclass, ObjectID, DatabaseID, SQLHandle
from fn_trace_gettable('C:\RecompilationTrace_01.trc', 1)
where EventClass in (37, 75, 166) -- 37 = SP:Recompile, 75 = CursorRecompile, 166 = SQL:StmtRecompile

Furthermore, we can also capture the showplan XML for query compile event, but doing so has a significant performance overhead because it is captured for each compilation or recompilation, so only do it for a very short time, and only while you see a high value for the SQL Compilations/sec counter in Performance Monitor. Once you know where the problem is, you can use the Database Engine Tuning Advisor to see whether any indexing changes improve the compile time and the execution time of the query.

Talking of DMVs for diagnosing recompilation issues, looking into 'sys.dm_exec_query_optimizer_info' is very helpful; in particular, look at the elapsed time counter, which is the time elapsed due to optimizations, and also the final cost counter. If you see that the elapsed time is very close to the CPU time, you may conclude that compilation and recompilation time is contributing to that high CPU use. Another DMV to use is 'sys.dm_exec_query_stats', whose most important columns are sql_handle, total_worker_time, plan_generation_num (the number of times the query has recompiled), and statement_start_offset. Here is an example that returns the top 20 queries that have recompiled most often:

select top 20 SQLText.text, sql_handle, plan_generation_num, execution_count, dbid, objectid
from sys.dm_exec_query_stats
cross apply sys.dm_exec_sql_text(sql_handle) as SQLText
where plan_generation_num > 1
order by plan_generation_num desc

There is a plethora of recommendations for dealing with recompilation and keeping performance in optimum condition, but for now you can take into consideration the following options:
- Check to see whether the stored procedure was created with the WITH RECOMPILE option or whether the RECOMPILE query hint was used. If a procedure was created with the WITH RECOMPILE option, since SQL Server 2005 you may be able to take advantage of a statement-level RECOMPILE hint if a particular statement within that procedure needs to be recompiled. Using this hint at the statement level avoids the need to recompile the whole procedure each time it executes, while still allowing the individual statement to be recompiled.
- Recompilations can occur due to changes in statistics, and you can use the KEEPFIXED PLAN query hint to make recompilations occur only when needed for correctness, not in response to changes in statistics. With this hint, recompilation can only occur if the underlying table structure or schema referenced by a statement changes, or if a table is marked with the sp_recompile stored procedure, both of which mean the cached plan no longer applies and so trigger the recompilation event.
- Using the KEEP PLAN query hint is useful to set the recompilation threshold of temporary tables to be the same as permanent tables. Take a look at the EventSubclass column which displays 'Statistics Changed' for an operation on a temporary table.
- Turning off the automatic updates of statistics for indexes and statistics that are defined on a table or indexed view prevents recompilations that are due to statistics changes on that object. However, turning off the auto-stats option is not always a good idea, because the query optimizer is no longer sensitive to data changes in those objects, which can result in suboptimal query plans. To be honest, I have never turned off this option; I prefer to trust SQL Server's judgement here and work on optimising the queries instead.
- Keep in mind that recompilation thresholds for temporary tables are lower than for normal tables, so if the recompilations on a temporary table are due to statistics changes, you can change the temporary tables to table variables. A change in the cardinality of a table variable does not cause a recompilation. The side effect of this approach is that the query optimizer does not keep track of a table variable's cardinality because statistics are not created or maintained on table variables. This can result in less optimal query plans; test the different options and choose the best one. Generally, temporary tables provide much better performance than table variables when lots of data is involved.
- Recompilation might also occur as a result of SET option changes, and you can diagnose this by using SQL Server Profiler to determine which SET option changed. It is highly advisable to avoid changing SET options within stored procedures; it is much better to set them at the connection level, and for the overwhelming majority of cases the default SET options work well. It is also very important to ensure that SET options are not changed during the lifetime of the connection.
- To avoid recompilations that are due to deferred compilations, do not mix DML with DDL, and do not create DDL inside conditional IF statements.
- To avoid recompilation and also to avoid ambiguity between objects, batches should have qualified object names, for example, dbo.Table1, User1.MySP, etc.
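To make the hint-based options above concrete, here is a hedged sketch (the table, column, and procedure names are hypothetical) showing the statement-level RECOMPILE hint inside a procedure and the KEEPFIXED PLAN hint on a standalone query:

```sql
-- Hypothetical procedure: only the volatile statement is recompiled,
-- not the whole procedure (statement-level hint, SQL Server 2005+).
CREATE PROCEDURE dbo.usp_GetRecentOrders @CustomerId int
AS
BEGIN
    -- Stable statement: reuses its cached plan.
    SELECT c.Name
    FROM dbo.Customers AS c
    WHERE c.CustomerId = @CustomerId;

    -- Volatile statement: compiled fresh on every execution.
    SELECT o.OrderId, o.OrderDate
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId
    OPTION (RECOMPILE);
END;
GO

-- KEEPFIXED PLAN: recompile only for correctness (schema changes,
-- sp_recompile), not in response to statistics changes.
SELECT o.OrderId
FROM dbo.Orders AS o
WHERE o.OrderDate >= '20170101'
OPTION (KEEPFIXED PLAN);
```

Note that OPTION (RECOMPILE) also prevents the statement's plan from being cached, so it trades plan reuse for a plan tailored to the current parameter values.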
Categories:
CPU,
DBA,
Memory,
Performance Tuning,
Recompilation,
Statistics
Friday, 14 July 2017
Ports and Protocols Used by Microsoft SQL Server
Naturally, I have been asked many times about the ports used by SQL Server services, and to be honest it sometimes took me a while to reply because there are a great number of ports and protocols and it is not easy to remember them all quickly. Not all of us learn everything by rote, so with that in mind, I decided to share the following lists of useful ports and protocols so that you can have them at hand when needed.
Ports and Protocols Used by Microsoft SQL Server 2000

| Service / Purpose | Protocol | Port |
| --- | --- | --- |
| Analysis Services | TCP | 2725 |
| Client connections when "hide server" option enabled | TCP | 2433 |
| Clients using Named Pipes over NetBIOS | TCP | 139/445 |
| Microsoft SQL Monitor port | UDP | 1434 |
| OLAP Services connections from down-level clients (OLAP Services 7.0) | TCP | 2393/2394 |
| SQL over TCP | TCP | 1433 |
| Standard URL for a report server (Reporting Services) | TCP | 80 (HTTP) / 443 (SSL) |
Ports and Protocols Used by Microsoft SQL Server 2005

| Service / Purpose | Protocol | Port |
| --- | --- | --- |
| Analysis Services connections via HTTP (default) | TCP | 80 |
| Analysis Services connections via HTTPS (default) | TCP | 443 |
| Clients using Named Pipes over NetBIOS | TCP | 137/138/139/445 |
| Dedicated Administrator Connection | TCP | 1434 by default (local port), but this port is assigned dynamically by SQL Server during startup |
| Reporting Services on Windows 2003/2008/Vista (default) | TCP | 80 |
| Reporting Services on Windows XP SP2 | TCP | 8080 |
| SQL Server 2005 Analysis Services | TCP | 2383 |
| SQL Server Browser Service | TCP | 2382 |
| SQL Server Integration Services (MsDtsServer) | TCP | 135 |
| SQL Server Resolution Protocol | UDP | 1434 |
| SQL over TCP (default instance) | TCP | 1433 |
| SQL over TCP (named instances) | TCP | 1434 / 1954 |
Ports and Protocols Used by Microsoft SQL Server 2008/2012/2014/2016/2017

| Service / Purpose | Protocol | Port |
| --- | --- | --- |
| Analysis Services connections via HTTP (default) | TCP | 80 |
| Analysis Services connections via HTTPS (default) | TCP | 443 |
| Clustering | UDP | 135 |
| Clustering | TCP | 135 (RPC) / 3343 (Cluster Network Driver) / 445 (SMB) / 139 (NetBIOS) / 5000-5099 (RPC) / 8011-8031 (RPC) |
| Database Mirroring | TCP | No default port. Use the following T-SQL statement to identify which ports are in use: SELECT name, port FROM sys.tcp_endpoints |
| Dedicated Administrator Connection | TCP | 1434 by default (local port), but this port is assigned dynamically by SQL Server during startup |
| Filestream | TCP | 139 and 445 |
| Microsoft Distributed Transaction Coordinator (MS DTC) | TCP | 135 |
| Reporting Services Web Services | TCP | 80 |
| Reporting Services configured for use through HTTPS | TCP | 443 |
| Service Broker | TCP | 4022 |
| SQL Server Analysis Services | TCP | 2382 (SQL Server Browser Service for SSAS) / 2383 (clusters will listen only on this port) |
| SQL Server Browser Service (Database Engine) | UDP | 1434 (might be required when using named instances) |
| SQL Server Browser Service | TCP | 2382 |
| SQL Server default instance running over an HTTPS endpoint | TCP | 443 |
| SQL Server instance (Database Engine) running over an HTTP endpoint | TCP | 80 / 443 (SSL) |
| SQL Server Integration Services | TCP | 135 (DCOM) |
| SQL over TCP (default instance) | TCP | 1433 |
| Transact-SQL Debugger | TCP | 135 |
| Windows Management Instrumentation | TCP | 135 (DCOM) |
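Since named instances and endpoint-based services often use dynamic ports, it helps to confirm what a running instance is actually using rather than relying on the defaults above. As a hedged sketch, the following T-SQL, run on the instance itself, shows the TCP port of the current connection and any configured TCP endpoints:

```sql
-- Port used by the current connection
-- (NULL for shared-memory or named-pipe sessions).
SELECT local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;

-- All TCP endpoints defined on the instance
-- (T-SQL, mirroring, Service Broker, etc.).
SELECT name, protocol_desc, type_desc, port
FROM sys.tcp_endpoints;
```

Alternatively, the SQL Server error log records the listening ports at startup ("Server is listening on ...").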
That is all for now. I hope you find this post useful. Let me know any remarks you may have. Stay tuned.
Categories:
DBA,
Ports/Protocols,
Security,
Windows
HELLO, I'M PERCY REYES! I've been working as a senior SQL Server Database Engineer for over 20 years; I'm a three-time Microsoft Data Platform MVP. I'm a cryptographer conducting research on cryptographic Boolean functions and their applications.