
Best Tools for Taking Notes In Development

{{< admonition type="info" title="Updated: 2016-04-08" >}} I've been working through some issues with the Clarify app on my personal computer, something I messed up. @trevordevore with @clarifyapp worked through it with me step by step, providing some great personal assistance. He also gave me some tips on my research into merging in a PHP script to automatically upload my images to Cloudinary for hosting. I just finished producing some documentation with Clarify 2, and I have to say it's one of the best tools I can think of for producing stellar documentation without requiring the user to spend much time on formatting. I think the points in this post still stand; I wish they'd continue to expand it a little more. Maybe an extension that replicates a library of screenshots like Snagit, or something else for quick ad hoc annotation and pasting, would be welcome. Overall, these are minor concerns given the great results the app already produces. {{< /admonition >}}

Developers have a lot to juggle.

Do a few of these sound familiar?

  • Juggle current sprint tasks
  • Resolve outstanding issues needing immediate attention
  • Research/Professional Development
  • Long term development objectives that you can only get fragments of time to work on
  • Performance testing - and being able to remember numbers/figures with all the moving pieces
  • With whatever time is left, write an insightful blog post that will be hailed by millions

I've been on a long quest to find the best tool to help me personally document and stay up to date with these tasks. Documenting and task management overlap easily; for the purpose of these notes, I'm going to focus primarily on the documenting aspect. Whether you use OneNote, Trello, Word, or any other system, the most important thing is an easy way to recap and evaluate all the work you've done.

A few goals for a best fit

  1. Screenshots are a key for me (see next section if you wonder why)
  2. Annotation of screenshot images are a must
  3. Easily able to copy and paste to reuse the material or export in a way that others could consume
  4. Basic authoring, nothing fancy needed, just basic bold/italics, etc. Something to help visually organize, as raw text can be hard to skim through when there's a lot of content.

Why do I care so much about screenshots?

The problem with solely using text for content is the variety of content we run across.

For instance, when I'm performance tuning a database, I'll have text-based details from SSMS and a grid of results that may be relevant. I may use a performance tuning tool like Qure Profiler, or find a diagnostic graph from perfmon that I want to preserve. That variety of formats is daunting to capture in a usable way in a documentation tool without screenshots. Since the other tools present information in a usable, but not always exportable, format, a screenshot lets me capture the data I reviewed without spending time trying to get things just right.

I also find visually walking through a problem easier to remember when re-reviewing in the future. Your mileage may vary.

Tools I Evaluated

Disclaimer: Please note that for most of these tools I received a free license with the understanding that I'd review them. I DO NOT base recommendations on that. A piece of crud is a piece of crud regardless of whether it's free. I normally like to use products for an extended period before writing about any of them.

My final review rating, after intense self-examination, contemplation of my navel for hours on end, and the hope of lucrative contracts becoming a professional software reviewer...

Epic: Highly Recommended

  1. Clarify 2
  2. Snagit
  3. Camtasia Studio

Close: Good product

  1. Greenshot

    1.  Great free screenshot utility. Best general screenshot tool besides Snagit that I've found.

  2. OneNote

    1.  Great for OCR scans of text, but basically useless for annotating/commenting on an image.

Benched: Not a good fit for me (not reflective of all users)

  1. Ashampoo Snap 8

    1.  UI was OK, but I found it lacking in keyboard shortcuts and power-user usage. The annotations were dated and seemed cartoonish for the most part, so I wasn't very happy with the end result. I think the app is a good one; it just needs some redesign on a few elements to make it cleaner and more modern. The canvas expansion was very unintuitive as well, making it unfriendly for merging several screenshots together.
    
  2. Screenpresso

    1.  Organizes by folder/project, which was a cool concept. Overall very limited, and nothing really stood out for a paid product.
    
  3. ShareX

    1.  Very extensible. One of my favorite features was the _on-screen annotations_, which let you capture an area of the screen and do basic annotations on it without any apparent GUI. Nice! Overall I can't recommend it due to the learning curve and complexity. It's extremely powerful, but that comes at the cost of being very complex to configure and get working. Documentation was OK, but for some of the more advanced setup of custom share destinations I found it difficult to find help.
    
  4. ActivePresenter (free version)

    1.  Great capturing of screenshots and mouse movements into a training video based on detected movement/actions. This was a great find; however, I can't recommend it for anyone but those with no work budget, as it's extremely clunky for development/notes/personal workflow usage. It's very busy and settings are hard to find, but then again... free!
    

All three of the top-rated products are commercial. All three have solid functionality, and I really find them beneficial to my development process. They each have a different scope in what they accomplish, so I could see myself using each of them in various ways from this point forward.

Clarify 2

This was a gem I discovered and initially passed over. The site documentation focused a lot on documenting steps for tutorials/help, and since it looked polished I went ahead and tried it. So far, it's my favorite discovery for documenting/notes in development and blogging. This tool has the polish of a well-designed app while still having some powerful extensibility. It focuses very well on one thing: documenting step-by-step work. That ties directly into blogging and tech documentation. In a blog, I'm covering specific areas and organizing my thoughts in blocks.

On the tech side, when I'm evaluating a SQL performance problem or tuning effort, I'm systematically going through comparisons of before and after, impact assessment, and identifying the improvements against a baseline. This tool supports that workflow perfectly.

The export options are pretty polished. I did a quick walk-through for an issue at work and got several compliments on the polished documentation I produced. The best part is this took minimal effort.

Pros

  • Absolutely intuitive and clean
  • Lack of over complication with unneeded features. Instead, it focuses on usability.
  • The annotation style is my favorite, very modern and minimal, not like some that use more cartoonish annotation styles.
  • Rounding of images on canvas as a default, looks polished.
  • Easy manipulation of canvas, no complication in trying to add multiple images
  • Fantastic export options: polished PDF, copy to clipboard as rich text, and export to WordPress (this article was written entirely with this tool).

Cons

  • Keyboard shortcuts are lacking
  • Could really use an extension of functionality to capture single screenshots and annotate them like Snagit. As it stands, the tool is focused more on the documentation aspect, but with some enhancements it would work as a screenshot annotation tool. Sort of like a single step in their documentation tool, without all the rest.

Clarify 2 Image Annotation Editing

This image shows some of the useful annotation powers that clarify offers. The canvas that images are placed on can easily be re-sized with a mouse drag and images pasted.

Snagit

I avoided this product for a while because I honestly thought it was so mainstream that it wouldn't really benefit a power user/developer. I'm glad to say I was wrong. It's a well-designed product that I've come to use as my primary screen capture utility, along with Clarify.

Pros

  • Wealth of export functionality
  • Good annotation capability with shapes, text, and so on.
  • Can capture screen video for a quick tutorial or walkthrough, no extensive editing options on this however. This is the focus of Camtasia.
  • Ability to tag screenshots (so bug screenshots could be easily re-reviewed)

Cons

  • Keyboard shortcuts are a little lacking with annotations
  • Some of the effects/annotations seem a little cheesy to me, but there are plenty of ways to find a good fit that formats in the style I like.
  • No rounded corner presets for the image canvas if you want that. There are workarounds, but nothing built in.

Snagit Extensibility with output

A wealth of export functionality is in Snagit, along with the additional plugins you can pull in.

Snagit Profile Extensibility with scripted profiles

Along with keyboard shortcuts and powerful export options, I found you can create capture profiles, letting you set up a shortcut (keyboard or menu driven) to capture and save in a specific format, basically scripting together several steps you'd otherwise repeat manually.

Snagit Annotations

Annotations are pretty comprehensive, with functionality to splice out the middle of images (say, a toolbar), merge multiple images, and more. Pretty well designed, and along with Clarify 2 the best implementation of any of the utilities I've experimented with so far.

I did like the ability to "curve" an arrow on the annotations, as it let you smoothly draw an annotation that could curve around content without obfuscating it.

Camtasia

Camtasia crosses over from screenshot territory into screen recording. This functionality can be very powerful, as I've found the ability to walk through tech issues with a recording invaluable. I'll probably blog with Camtasia eventually as I get more time to do audio recordings. The primary benefit I've found is easy GIF creation with full annotations, blurring, and other effects that help when providing examples while protecting sensitive connection/context information.

I've found Camtasia tremendously intuitive, as I've already worked with NLEs (non-linear editors), and the process of putting together a video with it is very simple. A product I've tried that compares somewhat is ActivePresenter, which is great for creating tutorials with a lot of the "whitespace" removed by detecting activity via keyboard and clicks.

However, I'd say ActivePresenter still has a usability hurdle and is not intuitive compared to the TechSmith offering, making it a great option if you have _zero_ budget, but not a good option if you really want to create a quality video quickly.

Conclusion

Quality tools can help the development workflow. I personally think a tremendous amount of value comes from utilizing images for varied capturing of work in addition to typed notes, as you can gain a lot of information this way. Hopefully, my suggestions might help spark a few new ideas for great tools to help with your workflow.

Dynamic SQL & Quotename

Not quite fineprint, but sure feels like it!

QUOTENAME can be a pretty cool function to simplify your dynamic SQL, as it eases some of the escaping of strings. However, I normally use it for table/column names, and so hadn't run into a "gotcha" with this function until today. It's limited to 128 characters, and if you pass in more than 128 characters it yields NULL. Yep... you could be trying to track down that NULL string somewhere in your concatenation for a while... only to find out this silent error is occurring. I'd like to thank NoSqlSolution for mentioning this and helping me go back to the other window I had open and reread it... I guess sometimes it pays to read the darn BOL.
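A minimal repro of the gotcha (the names here are just placeholders):

```sql
-- QUOTENAME happily escapes identifiers up to 128 characters...
SELECT QUOTENAME(N'MyTable');                    -- [MyTable]

-- ...but silently returns NULL once the input exceeds 128 characters.
SELECT QUOTENAME(REPLICATE(N'x', 129));          -- NULL

-- In concatenated dynamic SQL, that NULL swallows the entire string:
DECLARE @sql nvarchar(max) =
    N'select * from ' + QUOTENAME(REPLICATE(N'x', 129));
SELECT @sql;                                     -- NULL, not an error
```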


XML Attribute VS Element Assignment when working with SQL

XML handling with aliases

I find it interesting the difference in behavior with querying XML between column assignment, and quoted alias naming. It's a generally understood best practice to not use the deprecated syntax of column aliasing using a quoted name. For example:

select
    [escapedWithBracketsIsGood] = case when raining then cats else dogs end
    ,NoEscapeNeededPerGoodNamingPractices = case when rains then pours else friday end
    ,case when writtenThisWay then ICringe else UseAssignmentWay end as NormalWayMostFolksDoIt
    ,'BadNaming' = case when food then eat else hungry end
    ,case when work then eat else JobLess end as 'VeryBadNaming'

You can see the difference in naming. The bottom two are deprecated syntax from older naming standards and should typically be avoided. Aliasing is pretty straightforward, and the variation in your alias pattern doesn't normally have an actual impact on the results or handling... except, apparently, when working with XML.
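As a quick illustration of the attribute-vs-element distinction in the heading (table and column names here are invented), FOR XML PATH uses the alias itself to decide where a value lands:

```sql
-- An alias starting with @ becomes an attribute on the row element;
-- a plain alias becomes a child element.
SELECT
    ProductID AS [@id],          -- attribute
    Name      AS [ProductName]   -- element
FROM (VALUES (1, N'Widget')) AS p (ProductID, Name)
FOR XML PATH('Product');
-- <Product id="1"><ProductName>Widget</ProductName></Product>
```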

Image of XML Difference

Further reading

MSDN covers this in more detail and precision on this page. XML handling is a newer area for me, as I'd previously avoided it like the plague. However, when working with apps, creating XML configuration files can sometimes be useful, so exploring this can help in understanding how to manipulate XML and even obtain values from it.

Split personality text editing in SSMS with Sublime Text 3

My previous post showed a demonstration of the multi-cursor editing power of Sublime Text 3 when speeding up your coding with SQL Server. There is a pretty straightforward way to set up Sublime (or another preferred text editor) to open the same file you are editing in SQL Server Management Studio without much hassle. I find this helpful when the type of editing might benefit from some of the fantastic functionality in Sublime.

External Tool Menu

Go to Tools > External Tools

external tool menu

Setup Sublime Commands to Open

| Setting | Value |
| --- | --- |
| Title | Edit in Sublime |
| Command | `C:\Program Files\Sublime Text 3\sublime_text.exe` |
| Arguments | `$(ItemPath):$(CurLine):$(CurCol)` |
| Initial Directory | `$(ItemDir)` |

Limitation: Unsaved temporary files from SSMS (e.g. an unnamed SqlQuery temp file) open empty, so this didn't work for me until the file was named/saved. Once saved, it worked perfectly, even bringing the cursor position in Sublime to match what was currently in SSMS, and you can edit in Sublime and SSMS together.

Refresh File3

Tools > Options > Environment > Documents. You can set the auto-refresh to happen in the background if you wish, or manually accept the refresh when SSMS detects the change. If the auto-refresh happens while you are editing, it sometimes caused me to redo some work (or Ctrl+Z) in Sublime, but for the most part it's pretty seamless.

Multi-Cursor Editing SQL feels like the movie Inception just became real

  • Yes... multicursor editing is epic
  • No... SSMS doesn't support multi-cursor editing the way it should.
  • Yes... you can do some basic editing with multiple lines using alt-shift
  • No... it doesn't come close to what you can do with Sublime.

The cool thing is you can open text in Sublime synced with the SSMS cursor position and switch between the two with a shortcut. That will be a post for another day; I'm just telling you now to whet your appetite. If you can't tell, I love shortcuts. Sublime + AHK pretty much covers most text editing needs you'll ever have. Feel free to send me a check for all the time you'll save from my revelation. I apologize in advance for the video quality. I plan on recording a better one in the future, after I wrap my head around the awesomeness of Camtasia Studio 8.

Multi-cursor editing #2

Brent Ozar's Training: Chocolate & Cowboy Hats Included

It was entertaining to listen to a technical wizard fighting the obsession of waiting to eat chocolate with 3k viewers watching. Kendra wore about 4-5 cowboy hats in an effort to help those of us who wear many hats feel welcome... Now that's the kind of training I enjoy! No pretense, just honest, real, insightful training, with enjoyable humor included for free.

I highly recommend attending the WebEx presentations occurring today and tomorrow with Brent Ozar and his amazing team. They are giving back to the community with some excellent training and Q&A. I attended the Shared Storage discussion yesterday and wasn't disappointed (I don't think they've ever disappointed with their training!). Register for the training now, as today is the second of the 3 days, with the first training occurring here @ 10am CST. Brentozar Event Registration

Monitoring SQL Server on a budget

Cheap ain't easy

There are a lot of tools out there, and very few that are polished, have a good UI, and offer reasonable monitoring functionality without costing an arm and a leg. One such tool I've recently begun to appreciate is MiniDBA. I was generously provided with a license to evaluate and continue testing, and have recently had an actual chance to start using it more in my environment.

The cost of MiniDBA is one of the most affordable I've found for a live monitoring tool with a good UI design (eye candy is critical for monitoring a server, as we all know). At the time of this post's original date, there is a simple free version for monitoring a single instance on the local machine. This free version is awesome if you have a VM running full-time, as you could leave it running to monitor the instance you care about. Paying $50 for Developer or $100 for Enterprise gives you more flexible management with alerts, multiple servers, and a service to collect the data instead of having to run the GUI app the whole time.

Wait stats

Wait stats are the first place to typically go to when analyzing the delays a server may face. MiniDBA offers a few cool ways of looking at the data, including getting the diff on waits since the point in time you started looking at it, helping isolate the waits that really matter to you right now.

Get alerts on critical server issues

I'd love to see this be more extensible/customizable, but it's a good start. A "resolved" time would also be great when reviewing the history, to be able to see how long it took before an issue was resolved.

General health check on "best practices"

Again, some really cool stuff in here. I'd love more opportunity to expand or customize these, as I have a boatload of custom DMV queries for evaluating best-practice setup conditions on a SQL Server. It would be great to extend this more.

Active connections

Pretty straightforward, but one plus is it offers the ability to view the execution plan for each SPID, potentially saving a few steps. Note the execution plans are not shown on the server-level "SQL" tab, but at the database level. This reminds me of a less thorough sp_WhoIsActive.

Other features

There are features to look at like:

  • table sizes
  • index sizes
  • files in the database
  • memory
  • default trace
  • last 3000 transaction log entries
  • locks on objects

Visual monitoring

The key to a great monitoring tool, of course, is not just a bunch of text data thrown at you, but a great visual representation of various facts so you can easily identify something wrong. I think the developer did a great job in providing a useful "dashboard". More customization, or the ability to look at a specific point in time (like SQLSentry offers), would be great, but at the same time the scope of the MiniDBA project seems to focus on simplicity and not offering so much that it becomes complicated.

I'd say for the price, the value is pretty good for a team looking for a simple tool with a few visual ways of looking at performance, while still getting some active connection monitoring. There are a lot of other options out there for monitoring, including the built-in functionality, but for value this is a pretty good option, as it focuses on simplicity, usability, and not being a $1000+ per server license.

Stranger Danger... The need for trust with constraints

I ran into errors with a database upgrade caused by a violation of a foreign key constraint. I don't know how it happened. I figured that while I was at it, I'd go ahead and evaluate every single check constraint in the database to see if I could identify any other violations, because they shouldn't be happening.

Improve the execution plan by checking the data

In my reading, I found out that checking the constraints can enable them to be marked as trusted. Trusted constraints can then be used to build a better execution plan. I knew that constraints could help execution, but didn't know they could have a trusted or untrusted state.
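You can see which constraints SQL Server currently distrusts straight from the catalog views, for example:

```sql
-- Foreign keys and check constraints the optimizer will NOT lean on
-- until their data is revalidated.
SELECT name, 'FOREIGN KEY' AS constraint_type
FROM sys.foreign_keys
WHERE is_not_trusted = 1
UNION ALL
SELECT name, 'CHECK'
FROM sys.check_constraints
WHERE is_not_trusted = 1;
```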

Brentozar to the rescue

I'm serious, this guy and his team are awesome. This one team and their web resources have single-handedly helped me gain more understanding of SQL Server than any other resource. I love how they give back to the community, and their communication is always full of humor and good examples. Kudos! Anyway, commendation aside, the explanation from sp_Blitz was fantastic at summarizing the issue.

> After this change, you may see improved query performance for tables with trusted keys and constraints. - Blitz Result: Foreign Keys or Check Constraints Not Trusted

As the site further mentions, this validation can cause locks and performance issues, so it might be better done off hours. The benefit might be worth it though!

My adaptation of the check constraint script

I appreciate the script as a starting point (see link above). I adapted it to run individually on each check constraint and log the errors that occurred. It runs through all FK and CHECK constraints in the current database, checking the data behind each constraint so it can be marked as trusted.
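A rough sketch of the idea for the foreign key half (my adaptation, not the exact script): generate a `WITH CHECK CHECK CONSTRAINT` statement per constraint, run each one individually, and log any failures instead of stopping:

```sql
DECLARE @sql nvarchar(max);
DECLARE fk_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT N'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.'
         + QUOTENAME(t.name)
         + N' WITH CHECK CHECK CONSTRAINT ' + QUOTENAME(fk.name)
    FROM sys.foreign_keys AS fk
    JOIN sys.tables AS t ON t.object_id = fk.parent_object_id
    WHERE fk.is_not_trusted = 1;

OPEN fk_cursor;
FETCH NEXT FROM fk_cursor INTO @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC sys.sp_executesql @sql;  -- revalidates the data; marks trusted on success
    END TRY
    BEGIN CATCH
        PRINT ERROR_MESSAGE();        -- a violation: log it and keep going
    END CATCH;
    FETCH NEXT FROM fk_cursor INTO @sql;
END
CLOSE fk_cursor;
DEALLOCATE fk_cursor;
```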

Red Gate SQL Source Control v4 offers schema locks

It looks like the rapid release channel now has a great feature for locking database objects that you are working on. Having worked in a shared environment before, this could have been a major help. It's like a poor man's version of checking out an object in Visual Studio, except on database objects! With multiple developers working in a shared environment, this might help reduce conflicting changes on the same object.

Note that this doesn't look to evaluate dependency chains, so there is always the risk of a dependent object being impacted. I think though that this has some promise, and is a great improvement for shared environment SQL development that uses source control.

Qure Analyzer Benchmark Testing

{{< admonition type="warning" title="Updated 2018-03-18" >}} I wouldn't recommend this tool at this time. Development seemed to cease, resulting in me not being able to use it with later versions of SQL Server. When I came back recently to check on it, the app had been sold to a new company. It didn't seem to have progressed much based on a quick look, and I'm not really interested in it at this point. Instead, other solutions like the new DEA (Database Experimentation Assistant from Microsoft) would be where I'd spend more effort. {{< /admonition >}}

The problem of monster views

I've been working on tuning an application that performs a lot of Entity Framework calls as well as stored procedure activity. The stored procedures could be processing a large or small amount of data, depending on the configuration of the client. The major issue was that the source of data for the client application was a view with 20+ joins. In breaking down the logic, most of the joins really just added additional columns of data. The logical grouping meant that mostly 2-3 tables at a time joined to provide a set of columns based almost exclusively on the source table. I needed to get away from this pattern, as I was finding tremendous issues with cardinality estimation in SQL Server, with the optimizer sometimes producing 20GB nested loop join worktables.

Simplify

The incredible, masterful... well... actually quite simple solution... simplify!

My proposed solution was to break up the large view into small pieces, which I identified by logically grouping the joins/related columns. Instead of 20+ joins to return the columns needed, I'd provide 10 separate selects of roughly 2 tables each. These would be processed as a DataSet on the client, instead of returning a single large DataTable. The next task was to quantify the improvement, evaluating the possibility of pulling larger batches of base accounts at once with the new format, vs. the smaller batch size the large view required to function at all.
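A hypothetical sketch of the shape of the change (all table and column names invented for illustration): each narrow select covers one logical group of columns, and the client assembles the result sets into a DataSet:

```sql
-- Before: one monster view, e.g. SELECT ... FROM dbo.vAccountEverything (20+ joins)

-- After: several narrow selects against a batch of base account ids.
SELECT a.AccountId, a.Name, a.Status              -- core account group
FROM dbo.BaseAccount AS a
JOIN #AccountBatch AS b ON b.AccountId = a.AccountId;

SELECT a.AccountId, ad.Line1, ad.City, ad.Zip     -- address group (2-3 tables)
FROM dbo.BaseAccount AS a
JOIN dbo.AccountAddress AS ad ON ad.AccountId = a.AccountId
JOIN #AccountBatch AS b ON b.AccountId = a.AccountId;

-- ...and so on for the remaining column groups.
```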

First approach was STATISTICS IO & SQL Stress

Using the SQL Stress tool along with evaluating STATISTICS IO and TIME was my first course of action. However, the problem I ran across was that I needed to run this dozens of times in a row and evaluate the impact on client performance as well. SQL Stress provided a great way to run the query manually, but with the input variables from the app, I really wanted to exercise the app as the end user would, requesting the batch job on the largest number of rows I could get. That way I was truly matching all the input from the app and getting timings from it. In addition, I was looking for a good way to evaluate the workloads against each other.
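The manual measurement looked roughly like this (the procedure name and parameter are placeholders):

```sql
-- Surface IO and timing details for a single manual run.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

EXEC dbo.GetAccountData @BatchSize = 500;  -- hypothetical proc under test

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```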

The right tool for the analysis = Qure Profiler

I'd used ClearTrace before and found it helpful, but it was hard to consume comparison-type reports easily. I needed something to help me identify the improvement or degradation in performance, and Qure Profiler solved this for me, turning something very messy into a much simpler proposition (which also helped with my final research presentation to the big kahunas). DBSophic's Qure Profiler has had some great reviews for assisting in database profiling, but I hadn't had a chance until recently to fully dive into using it. Since I was doing batch testing, I figured now would be a great time to re-review it. The numerous variables at play made it pretty complex to quantify any improvement based on a single run alone, so I brought Qure Profiler back up to help me out. Necessity is the mother of experimentation.

Initial Performance Evaluation Run (Batch level analysis)

I evaluated runs on the largest current grouping, which was 1232 base accounts. I would later evaluate larger batch sizes, but started simple.

Actual Execution Details with Qure Profiler

I tested again with Qure Profiler set to minimum batch info. I also ran DBCC FREEPROCCACHE to better ensure the plan was correctly rebuilt for the new counts, and that the comparison had a fresh start as far as impact on disk IO.
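For reference, the cache clearing between runs was just the standard commands (use these on a test server only; they affect the whole instance):

```sql
DBCC FREEPROCCACHE;      -- drop cached plans so each run recompiles fresh
DBCC DROPCLEANBUFFERS;   -- optionally also clear the buffer pool for cold IO numbers
```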

Comparing 100 to larger batch sizes in the base account request

This final comparison shows a 42% improvement using 500 rows at a time. This seemed to be a good compromise at this point: increased batch sizes while still maintaining lower logical reads. The next step was to test against a larger database to evaluate scalability.

Actual Execution Results on Larger database

Evaluating against a database with about 500GB of data, the best execution time for the base account count (tongue twister) seems to be the 1000 batch size at this time. It returns the results in the shortest duration with the lowest impact on reads. DBCC FREEPROCCACHE was completed on each step.

Qure Profiler workload comparison

I set the baseline as the 100 batch size (the current way of processing the request). Qure provided an extremely powerful side-by-side comparison of the two workloads. The best value is still the 1000 batch size, which showed the lowest logical reads at that point.

Comparing to the original methodology of a huge view with 20+ joins

Against two databases, both performed significantly better with the simplified approach. One database was much faster, while another that had been completely timing out with 20GB+ nested loop join worktables finally ran without incident. Major win! The changes look to have positively improved overall performance for both databases, which represent two much different usage patterns (and data stored in them). Comparing the new simplified procs (batch size 1000) vs. the original views (batch size 100) showed:

  • 27% shorter per execution
  • 77% less CPU impact
  • 81% less logical read impact

Additional functionality I haven't even touched on

Qure Profiler offers additional functionality that I haven't had a chance to touch on. It can normalize the count of events to compare an even 100 against another 100, even when the second trace ran longer and caught the event more than 100 times. Check out their page for more details.

TL;DR Summary

Long post. If you read this far, I commend you for either being a great page scroller, or for being tenacious and finding all my graphs super interesting. If that's you, you get #SqlCred. I'm finding that SQL Server workload comparison is one of the harder things to do right. There are a lot of variables to take into consideration, and even with a simple test on batch size changes like I did, aggregating the comparison results into usable statistics and a presentable format can be daunting. I highly recommend the DBSophic product as a great utility to save a lot of time in doing this comparison. This tool goes into my top SQL Server performance tuning tools for sure. I'm going to be evaluating their more advanced Qure Optimizer soon as well, as it might help identify issues I've missed in tuning by evaluating adjustments against a test copy of a database.

Are there any cons?

I didn't run into any significant issues that impacted my ability to use it. I did hit a few stability issues that I had to work around by restarting the app a few times, but for the time it saved, I wasn't complaining too much.

  1. Need to offer an option to use Extended Events. This is on their roadmap, apparently. XEvents should help lower the impact of profiling on the workload being tested.
  2. Some app errors related to closing workload comparison tabs. I reported this bug and hopefully they'll continue to improve the application.
  3. Maybe some GUI improvements to make it more in line with modern UI standards? Just a personal preference. I didn't find the tab-based comparison approach the most intuitive. Not terrible, but it would be a nice improvement in the future.

What could they improve?

  1. Offer a comparison report option that wouldn't require as many screenshots, and instead summarize the selected items in a pane you could copy as an image. This would be slick!