
SET NOEXEC is my new friend

I had never really played around with the option SET NOEXEC ON. Turns out it can be a helpful way to validate larger batch scripts before actually making changes, because it ensures the batch compiles. You can verify syntax with the "Parse" option in SSMS, but parsing doesn't compile. Compilation checks more than your syntax; it also validates that the objects you reference actually exist.

The execution of statements in SQL Server has two phases: compilation and execution. This setting is useful for having SQL Server validate the syntax and object names in Transact-SQL code when executing. It is also useful for debugging statements that would generally be part of a larger batch of statements. (MSDN #188394)

I previously did this basic verification by generating an estimated execution plan, which had the benefit of finding compilation errors as well. However, with large batch jobs it could be slow and problematic, since it had to process and return execution plans for every statement, which I didn't need. SET NOEXEC ON is a nice way to validate scripts before running them, without the overhead of estimated execution plans. Additionally, you can handle transactional rollbacks and prevent cascading problems when running a batch that changes your database by setting NOEXEC ON when an error is encountered. Red Gate SQL Compare does this elegantly in its generated scripts.
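A minimal sketch of how I use it (the table and column names below are made up for illustration):

```sql
-- Compile the batch without executing it. Syntax errors AND missing
-- object references are reported, but no data is touched.
SET NOEXEC ON;

UPDATE dbo.MyImportantTable  -- hypothetical table
SET SomeColumn = 1
WHERE SomeId = 42;

-- Re-enable execution once the batch validates cleanly.
SET NOEXEC OFF;
```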

Running very large scripts is not a strong area for SSMS

out of memory, argggh!

Am I the only one that has experienced the various out-of-memory issues with SSMS? Not according to Google!

lovingly crafted in the forges of... well... dbForge

I'm a huge fan of Devart's products. I've reviewed their SQL Complete add-in in the past; it's the single most used tool in my SQL arsenal. It vanquishes nasty unformatted code into a standard, lined-up format I can easily read. The hundreds of options to customize the formatting make it the most customizable formatter I've found for SQL code. SQL Complete, however, is a plugin for SSMS. The same functionality is native in their alternative to SQL Server Management Studio, dbForge Studio. I highly recommend checking it out. It's affordable, especially if you compare against other products that offer less... and they have a dark theme, muaaah!

execute script that is far too large

I'll post more detail on some of the other features when time permits, but one notably cool feature is the "execute large script" option.


You can see the progress and the updates in the output log, but the entire script isn't slowing down your GUI. In fact, you can just putter along and keep coding.

Other options to accomplish the same thing include executing via SQLCMD, PowerShell, or breaking things up into smaller files. This just happened to be a pretty convenient option!
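For reference, running a big script through sqlcmd instead looks something like this (the server, database, and file paths are placeholders):

```cmd
REM Execute a large script file without ever loading it into SSMS.
REM -S server, -d database, -i input file, -o output log, -I quoted identifiers
sqlcmd -S localhost -d MyDatabase -I -i "C:\Scripts\HugeDeploymentScript.sql" -o "C:\Scripts\output.log"
```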

Have I switched?

I haven't switched to it as my primary development environment for two reasons. First, extensions: I have quite a few that work in SSMS, like SSMS Tools, SSMS Toolpack, and some Red Gate functionality, and I lose those by switching over to dbForge Studio. Second, some of the keyboard shortcuts I'm used to, like delete line, aren't in there. Regrettably, they also don't support importing a color scheme from Visual Studio, so you lose out on sites like https://studiostyl.es/. Other than a few minor quibbles like that, I'm pretty happy with it. They've done a great job, and the skinning of the app is great, giving you the option of dark or light themes.

The Devart apps were provided to me for evaluation, but that hasn't biased my recommendation.

Database Stuck in Single-User Mode Due to Botched Restore

Working in a development environment, I botched a restore. After attempting to overwrite my database with a previous version, I had it stuck in single-user mode.

SSMS provided me with helpful messages such as this: Changes to the state or options of database 'PoorDb' cannot be made at this time. The database is in single-user mode, and a user is currently connected to it. Additionally, I was told I was the deadlock victim when attempting to set the user mode back to multi-user.

From there I looked at several articles from Stack Overflow and various other blogs, and followed the recommended steps, such as the one sketched below.
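A minimal sketch of the usual first recommendation:

```sql
-- Kick other connections out and return the database to multi-user mode.
-- ROLLBACK IMMEDIATE rolls back any open transactions in the database.
USE master;
ALTER DATABASE PoorDb SET MULTI_USER WITH ROLLBACK IMMEDIATE;
```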

I even added a step to kill the connections to it using a statement helpfully posted by Matthew Haugen, along the lines of the sketch below.
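The original statement isn't preserved here, but a typical kill-all-connections script looks something like this:

```sql
-- Build and run KILL commands for every session connected to the database.
DECLARE @kill varchar(8000) = '';

SELECT @kill = @kill + 'KILL ' + CONVERT(varchar(5), session_id) + ';'
FROM sys.dm_exec_sessions
WHERE database_id = DB_ID('PoorDb');

EXEC (@kill);
```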

Finally, I went through and removed all my connections from master based on an additional post. No luck. Stopped my monitoring tools; no luck. At this point, it felt like a Monday for sure.

Since I was working in a development environment, I went all gung-ho and killed every session with my login name (there seemed to be quite a few), except for the spid doing the killing. Apparently, the blocking process was executing from master, probably the incomplete restore that hadn't successfully rolled back. I'll have to improve my transaction handling on this, as I just ran it straight through with no error checks.

VICTORY!

What a waste of time, but at least I know what to watch out for next time: make sure my actions are checked for errors and rolled back. I'm going to just blame it on the darn SSMS GUI. Seems like a convenient scapegoat this time.

Successfully pushed out my changes with the following script:

What was I thinking? Deleting myself from localdb?

I was testing a batch file to add a user to a LocalDB instance. I assumed that my user, as an admin on the machine, wouldn't have an issue adding myself back... didn't think that one through too carefully. Executing any type of SQLCMD against it denied me. SSMS denied me. No SA had been set up on it, so I couldn't log in as SA either. I looked for various solutions and ended up uninstalling and reinstalling (localdb)\v11.0 so that I'd stop being denied permissions.

This, however, didn't fix my issue. The solution that ended up working for me came from dba.stackexchange.

I ended up deleting everything in the v11.0 Instances folder and then issuing the following command: `sqllocaldb.exe c v11.0`

Resulting in the message: `LocalDB instance "v11.0" created with version 11.0.3000.0.`

Success! The instance was created, and then I was able to log in with SSMS. Apparently today was my day for learning some LocalDB permissions lessons. What a blast... I could have avoided this if I had simply used a test login, or had set up SA with a proper password for logging in. #sqlfail

Some simple examples of querying XML with SQL

XML is a beast if you've never tackled it. Here are some simple examples of what I discovered as I experimented and walked through obtaining values out of an XML column.
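A minimal sketch of the kind of queries I mean (the table and element names here are made up):

```sql
-- A throwaway table with one XML column to experiment against.
DECLARE @orders TABLE (Id int, OrderXml xml);

INSERT INTO @orders VALUES
(1, '<Order><Customer>Contoso</Customer><Item Qty="3">Widget</Item></Order>');

-- .value() extracts a single scalar via an XPath expression.
SELECT Id,
       OrderXml.value('(/Order/Customer)[1]', 'varchar(50)') AS Customer,
       OrderXml.value('(/Order/Item/@Qty)[1]', 'int')        AS Qty
FROM @orders;

-- .nodes() shreds repeating elements into rows; CROSS APPLY joins them back.
SELECT Id, i.x.value('.', 'varchar(50)') AS Item
FROM @orders
CROSS APPLY OrderXml.nodes('/Order/Item') AS i(x);
```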

Enabling Instant File Initialization

I found a couple of good walkthroughs on enabling instant file initialization. However, I'm still becoming familiar with the nuances of various setups, and I found it confusing trying to map the correct user/group to enable this option. In my case, I had the SQL Server service running under NT SERVICE\MSSQLSERVER, and as such this login wasn't showing up when trying to find groups/users to add to the necessary permissions. Lo and behold...

I typed it in manually and it worked. If time permits I'll update this article later with a more technical explanation, but for now this is just a quick functional post to show what resolved the issue: add the service account or group (whatever account SQL Server runs under) to the "Perform volume maintenance tasks" privilege in the Local Security Policy.
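If you want to confirm it took effect, one common check is to turn on trace flags 3004 and 3605, create a throwaway database, and search the error log: with instant file initialization working, only the log file should show "Zeroing" messages, since log files are always zero-initialized.

```sql
-- Log file-zeroing activity to the SQL Server error log.
DBCC TRACEON (3004, 3605, -1);

CREATE DATABASE IFITest;  -- throwaway database for the check

-- Search the current error log for zeroing messages. If instant file
-- initialization is enabled, only the .ldf should appear here.
EXEC sp_readerrorlog 0, 1, N'Zeroing';

DROP DATABASE IFITest;
DBCC TRACEOFF (3004, 3605, -1);
```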

(Screenshots: Instant File Initialization, steps 1-4)

Upgrade from SQL 2014 Evaluation to Developer Edition

I couldn't find documentation showing that an upgrade from the SQL Server 2014 Evaluation edition to Developer edition was possible. I just successfully converted an Evaluation instance to Developer edition.

  1. Obtain the key for Developer edition (in my case I had to download the .ISO from MSDN downloads and go through the installation wizard to get it)
  2. Run the Installation Center app for SQL Server 2014
  3. Select Edition Upgrade
  4. Plug in your new serial from the Developer edition

Pretty simple, but I thought I'd post a confirmation for anyone wanting to know the upgrade path to Developer was an option. It's probably somewhere in the MSDN documentation, but I couldn't find it with a quick search.

Utilizing the power of table parameters to reduce IO, improve performance, decrease pollution, and achieve world peace...

I was dealing with a challenging dynamic SQL procedure that allowed a .NET app to pass in a list of columns and a view name, from which it would generate a select statement. Due to requirements at the time, I needed the flexibility of the MAIN proc, which generated a dynamic select statement, while overriding certain requested views by executing a stored proc instead of the dynamic SQL.

During this, I started looking into the string parsing being done on a comma-delimited list of numbers to look up (the primary key). I figured I'd explore the benefits of a user-defined table type and pass the list of ids from the .NET application as a table-valued parameter instead of a comma-delimited list.

Some great material I came across indicated the overhead might be a little higher client side, but that the benefits to cardinality estimation and giving SQL Server a real table to work with can far outweigh the initial startup cost when dealing with lots of results to join against. The main thing I wanted to address first, which I couldn't find any clear documentation on, was the memory footprint. I saw mention in various sources that a TVP can have a lower memory footprint in SQL Server's execution because, as intermediate storage, it can be referenced by pointer rather than copied each time, unlike parsing a comma-delimited list into another variable.

I get that passing the stored proc a table variable means it's working with the provided object, but what about the portability of that object? In my case, there are at least two levels at work: the MAIN proc and the CHILD proc. The CHILD proc needs access to the same list of ids. The dynamic statement in the MAIN proc also needs the list of ids. Currently, it builds the list of ids by inserting the delimited values into a table variable.

Could I instead pass the actual table-valued parameter around, since it's a readonly object, and hopefully keep referring to the same instance instead of having separate copies created each time? This could reduce the IO requirements and tempdb activity by having a single TVP used by both the MAIN and CHILD procs.

(Screenshots: TVP Testing 1-5, including summarized IO)

The footprint is reduced when dealing with IO from the child statement, because it keeps pointing to the same in-memory object. I validated this further by examining a more complex version of the same query that compares the comma-delimited list against executing a nested stored procedure, which in turn has dynamic SQL that needs the table parameter passed to it. The results show it keeps pointing to the same temp object!

(Screenshot: TVP Test 6)
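A minimal sketch of the pattern (the type, proc, and view names are made up for illustration):

```sql
-- A user-defined table type holding the list of ids.
CREATE TYPE dbo.IdList AS TABLE (Id int PRIMARY KEY);
GO

-- CHILD proc receives the same TVP; no re-parsing, no extra copy.
CREATE PROCEDURE dbo.ChildProc @Ids dbo.IdList READONLY
AS
    SELECT v.*
    FROM dbo.SomeView AS v            -- hypothetical view
    JOIN @Ids AS i ON i.Id = v.Id;
GO

-- MAIN proc takes the TVP once and hands it to both the dynamic SQL
-- and the child proc, instead of each one parsing a delimited string.
CREATE PROCEDURE dbo.MainProc @Ids dbo.IdList READONLY
AS
    -- Dynamic statement can reference the TVP via sp_executesql.
    EXEC sp_executesql
        N'SELECT COUNT(*) FROM @Ids AS i;',
        N'@Ids dbo.IdList READONLY',
        @Ids = @Ids;

    EXEC dbo.ChildProc @Ids = @Ids;   -- pass the same object down
GO
```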

In summary, a table-valued parameter can end up being pretty powerful when you have a list of values that needs to be referenced by several statements or passed to nested procs. Disclaimer: this is working within the constraints of what I have to release soon, so I'm not saying nested procs with dynamic SQL in both MAIN and CHILD are a great practice, but sometimes you gotta do what you gotta do!

Restoring a database that doesn't exist

When restoring a database that doesn't exist, say when a client sends a database to you, you can't use the Restore Database option, because there is no matching database to restore over. To get around this, use the Restore Files and Filegroups option and then restore the database.

(Screenshots: the Restore Database option unavailable; Restore Files and Filegroups)

Another option I found interesting was the support for loading databases hosted on a file share. Brent Ozar has an article on hosting databases on a NAS that I found interesting. I haven't tried it yet, but I think it has a great use case for dealing with various databases loaded from clients. If you haven't read any material by him... then my question is why are you reading mine? His whole team is da bombiggity... stop reading my stuff and head on over there!
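You can also skip the GUI entirely; a plain RESTORE works fine for a database that doesn't exist yet (the paths and logical file names below are placeholders — check yours with RESTORE FILELISTONLY first):

```sql
-- Inspect the logical file names inside the backup first.
RESTORE FILELISTONLY FROM DISK = N'C:\Backups\ClientDb.bak';

-- Restore to a brand-new database, moving the files to local paths.
RESTORE DATABASE ClientDb
FROM DISK = N'C:\Backups\ClientDb.bak'
WITH MOVE N'ClientDb_Data' TO N'C:\Data\ClientDb.mdf',
     MOVE N'ClientDb_Log'  TO N'C:\Data\ClientDb.ldf',
     RECOVERY;
```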

SQL Sentry Plan Explorer PRO is worth it...

{{< admonition type="info" title="Updated: 2017-04-21" >}} Another great bit of news from reviewing this older post I wrote... SQL Sentry Pro is now a free tool thanks to the generosity of the Sentry One team! It's a must have. Go download it for sure. {{< /admonition >}}

{{< admonition type="info" title="Updated: 2015-04-28" >}} I created a few AutoHotkey scripts and solved the problem of collapsing panes and a few other annoyances, which has improved my experience a little. Also noted one major improvement that would help with tuning: aggregating the total IO and stats, rather than only showing each individual statement. I've found the need to compare two very different plans to see the total writes/reads variation and the impact on IO, but I'm having to use another tool to parse STATISTICS IO and run totals, then come back to SQL Sentry Plan Explorer for the other details. The tool could be improved by adding totals/sums to better compare various runs of plans. I can make do without it, but it forces a lot of workarounds for now. {{< /admonition >}}

I'll post more later, but after a full day of query tuning on a difficult view, I'd definitely say the cost for PRO is worth it. I'm a fan of SQL Sentry Plan Explorer (free), and recently decided to push for a license of this tool at work. Turns out it was well worth it. The ability to measure variance in plans with small changes, without cluttering up SSMS with 20 versions, was incredibly helpful and time saving. There are a few quirks that really bother me, but not enough to negate the benefits of this tool.

Perks

- Save a continual session while troubleshooting a query
- Evaluate logical IO easily in the same view
- Save comments on each plan version run to identify the changes you made and what impact they had

Negatives

- Not integrated with SSMS or a preferred text editor, so the built-in text editor is extremely sparse on features.
- No ability to easily sum logical IO and compare to another plan; you have to open two tabs and eyeball them. That is the biggest frustration: no easy side-by-side comparison without opening the same session twice and eyeballing it.
- NO KEYBOARD SHORTCUTS. GEEZ is that frustrating as you try to power through some changes, copy cells/IO, and more.

Overall: love the product. I hope they enhance the productivity and efficiency aspects more, as that's the only area where I see it falling short. Here are some screenshots from my work with it today. I also compared the final IO with http://statisticsparser.com/index.html

(Screenshots: SQL Tuning Sessions 1-5)