2016

ManicTime - Timetracking Automation Done Right

{{< admonition type="info" title="Updated: 2020-04-29" >}} broken image links removed {{< /admonition >}}

Tracking time is always a beast. With the amount of context switching many developers do, it can be tough to remember how much time went to each project. Whether a company is tracking effort on sprints or hours on a client project, or (as in my case) a dev just wants to better evaluate their productive use of time, this app fills gaps that others don't.

For instance, I've tried tools such as Toggl and found them useful, but they require a lot of diligence to work with. There is very little automatic categorization of time, and many of those tools are focused on a timer-based approach that requires you to start and stop the timer. ManicTime approaches this differently. It has the typical stopwatch, countdown, and Pomodoro-type functionality a time-tracking tool might offer, but in addition it provides a captured timeline of activity with various forms of metadata that is easy to review and parse for categorization.

This categorization of information can take the form of broad details such as:

  1. Development
  2. Personal
  3. Browsing

or be specific based on user input, such as:

  1. Development, Reticulating Splines
  2. Personal, Contemplating Navel
  3. Project Manhattan, Task 666: Optimizing Nuclear Db Performance

Manually entered information is a big driver of tagging accuracy, but ManicTime expands this by allowing dynamic tags based on matching applications, document titles, web page titles, calendar entries, and more. This offers a vast range of metadata captured on your system to generate a more accurate allocation of your time. The only real negative is that it is not as simple as something like Toggl. However, with a little work, you can work an entire day and quickly recap at the end, categorizing your effort with very little fuss and a high degree of accuracy.

If you find yourself forgetting to hit the stop/start button on your time tracker and want to consider a better option for tracking your effort, then look no further.

I've used this application over the last year and found a lot of value in it, so I figured I'd share a little, as it's become one of the tools in my essentials pack. Disclaimer: I was provided with a license a while ago. This doesn't impact my review; I just like finding great tools that help devs have a better workflow, and I only end up reviewing the ones I've really found useful.

Overview

Time Categories

Overview of Day's Activities

The list of all activities is organized into several timelines, whether applications, document titles extracted from applications, the calendar, or even calendar feeds. This allows a variety of ways to go back through and easily organize and track time. One recent improvement that I completely love is the screenshots integrated into the application timeline. This lets you keep a running screenshot log of activity, so you can easily go back through and remember exactly what was being done at the time. A very useful implementation!

Tracking Exclusion

Note that you can choose to go off record, as well as specifically only track certain times of day. This is a good option for those that have a work laptop that they might leave running and only want to report on certain periods of active work time.

Autotagging

Autotagging is where this tool gets powerful. The concept is to tag activity automatically based on application, window title, URL, or another parsed value. This means you can categorize your effort with minimal manual work.

Regex Parsing

I've yet to figure out dynamic tags based on regex parsing, as the tool doesn't give you a preview to test and refine results. Once I figure this out, or the app improves this ability, I think the additional timelines will be very handy: you could have one timeline focused on dynamic parsing and grouping of projects based on document/Chrome titles that doesn't interfere with the categorization another timeline might use. This is a usability issue that I hope to see improved in the future. It has a lot of potential.
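To illustrate the kind of extraction a regex rule can do (the titles and pattern below are invented for illustration and are not ManicTime's actual rule syntax):

```powershell
# Hypothetical window titles; the pattern mimics a dynamic-tag rule
# that pulls a project number out of a document title.
$titles = 'PROJ-1234 - Spec.docx - Word',
          'Contemplating Navel - Notepad',
          'PROJ-5678 - Budget.xlsx - Excel'

foreach ($title in $titles) {
    if ($title -match '(?<project>PROJ-\d+)') {
        "{0} -> tagged as {1}" -f $title, $Matches['project']
    }
    else {
        "{0} -> untagged" -f $title
    }
}
```

The same named-capture idea is what makes a regex-driven timeline useful: the captured group becomes the tag, and anything that doesn't match stays untagged for manual review.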

Multiple Autotag Timelines

This is something I've recently been exploring, as it provides the capability to create automatic tagging of apps for different purposes. For instance, you might set up one rule that parses project numbers and sends them to an AutoProject timeline aggregating the totals, and another timeline for categorization of apps/websites. Another use might be one timeline focused on categorizing web usage, while another focuses on app usage.

Tagging

Away Time Tagging

You can have ManicTime prompt you when you return to your computer, or after it has detected a set number of idle minutes on your system. This can help ensure that if you are off at a meeting or otherwise away from your PC, you are still tracking the time you spent.

Narrow down untagged time quickly

There are a variety of ways to filter the timeline down to only untagged activities, either as the current selection or as the only thing shown. This can help identify gaps in what you've reviewed.

Statistics & Reports

Generate Timesheet Report

Some Nice Visual Statistics Available

Other Statistics Available

These are listed based on the selected groups, tags and more.

ManicTime Server

ManicTime offers server functionality that can be used to generate reports on staff members and usage. This functionality is not for the non-technical user; I found it a little challenging to get things set up, so the current iteration isn't designed as a simple central solution for all devices. With a better setup/configuration experience (no domain user logins, etc.) and perhaps more of a Google Drive/Dropbox-type sync, I think the solution would be fantastic for tracking time across various devices. Due to the setup issues I had with the server, I wasn't able to include tracking from the new ManicTime Android client. Homegrowing your own tracking solution with Tasker and a custom timeline might not be a difficult project, since the format for consuming external timeline information is documented. I haven't gone to that effort, but it's an intriguing concept.

Daily Retrospective

Being able to retroactively tag and categorize effort at the end of the day, without constantly toggling a stopwatch, is the core benefit. You can still use the built-in stopwatch/Pomodoro/countdown, but if you get pulled onto multiple tangents, this tool makes it easy to go back and categorize throughout the day... IF your primary work is driven by using your computer. Since I'm approaching this from a developer-tool point of view, it's a perfect fit!

Last Thoughts

Fantastic app with a unique approach. The cost is a little high, but it's an independent app, and supporting the development of a really specialized tool can be a good thing; I'm not sure they'd be able to continue development if it were a lifetime purchase (those seem to have gone away over time). As an office tool for better tracking and reporting on time (for instance, when working with clients), I believe it might just pay for itself. I'd like to see smoother integration with the server components to become a better cloud tracking mechanism, allowing Android, PC, and Mac to all feed a solid reporting mechanism for families on the go. The app seems more focused on enterprise/business tracking, though, so this might not be implemented. I'll continue using it and finding great value in tracking my time with the least amount of work. For those looking for a solution, give it a shot. They have a lite version available with fewer features, so you can take a swing at it.

Dynamically Set Powershell Variables from json

I created this small snippet to allow a list of values from a json file to be turned into variables to work with. For working with a fixed list of configuration values, this might be helpful to reduce some coding effort.
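One way to do this, as a sketch (the file name and property names are hypothetical):

```powershell
# Sketch: turn each top-level property of a json config file into a variable.
# Example config.json contents: { "ServerName": "localhost", "Timeout": 30 }

$config = Get-Content -Raw -Path '.\config.json' | ConvertFrom-Json

# PSObject.Properties exposes each parsed json property as name/value pairs
foreach ($prop in $config.PSObject.Properties) {
    Set-Variable -Name $prop.Name -Value $prop.Value
}

# The values are now addressable directly as $ServerName and $Timeout
```

The nice part of this pattern is that adding a new configuration value only requires editing the json file, not the script.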

The Traditional Birthday Song Is Terrible

The traditional birthday song is terrible. It's never really changed. It's like singing a dirge. It's really, really hard for people to sing anywhere close to on key. We all sing it because we have to, but there is this feeling of regret, like "I'll do it for you, but just because I love you." It is followed by "many mourns" from the closest available family clown. Apparently, the roots go back to the 19th century, and Wikipedia says:

> In 1988, Warner/Chappell Music purchased the company owning the copyright for US$25 million, with the value of "Happy Birthday" estimated at US$5 million.... In February 2016 Warner/Chappell settled for US$14 million, paving the way for the song to become public domain. (Wikipedia)

Let's leave this tainted legacy behind. I propose a radical change. Ditch it for something fun. Make a new family tradition.

This is much more like Rend Collective's reminder to always be practicing the Art of Celebration :-) It's super easy to sing the first time, promotes foot stomping and hand clapping, promotes dancing and jumping, and overall, I feel it conveys the feeling a birthday should have. New lyrics:

**The Art of Celebration Birthday Song**
- created after much deliberation and refinement in 1 min at the house of Sheldon Hull
Hey hey hey, It's your birthday day
Hey hey hey, It's your birthday day
Hey hey hey, It's your birthday day
Sing Loud, Sing Proud, It's Your Birthday
- Creative Commons Zero CC0, so do whatever you want with this bit of musical genius

Scan folder of dlls to identify x86 or x64 compiled assemblies

Point this at a directory of dlls and you can get some of the loaded assembly details to quickly identify what processor architecture they were compiled for. I did this because I wanted to explore a large directory of dlls and see if a Visual Studio build had mixed x86 and x64 assemblies together. Some dlls with invalid assembly header information were found, and the script skips those with warnings.
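A minimal sketch of the approach (the folder path is a placeholder): .NET's `AssemblyName.GetAssemblyName` reads the assembly header without fully loading the dll, and `BadImageFormatException` marks the invalid or native files to skip:

```powershell
# Report each managed dll's ProcessorArchitecture (MSIL, X86, Amd64, ...).
Get-ChildItem -Path 'C:\build\output' -Filter '*.dll' -Recurse | ForEach-Object {
    $dll = $_
    try {
        $asm = [System.Reflection.AssemblyName]::GetAssemblyName($dll.FullName)
        [pscustomobject]@{
            Name         = $dll.Name
            Architecture = $asm.ProcessorArchitecture
        }
    }
    catch [System.BadImageFormatException] {
        # Native dlls or corrupt headers land here; warn and continue.
        Write-Warning "Skipping $($dll.Name): not a valid managed assembly"
    }
}
```

Mixed output (some `X86`, some `Amd64`) would indicate the build produced inconsistent architectures; `MSIL` means AnyCPU.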

Attaching Database Using SMO & Powershell

Steve Jones wrote a great article on this automation titled The Demo Setup - Attaching Databases with Powershell. I threw together a completed script and modified it for my functionality here. MSDN documentation on the functionality is located here: Server.AttachDatabase Method (String, StringCollection, String, AttachOptions). I see some definite room for improvement in future work on this, such as displaying percentage complete, but I did not implement that at this time.
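A minimal sketch of calling that documented overload, assuming the SqlServer module provides SMO and that the instance, paths, and owner below are placeholders for your environment:

```powershell
# Requires a reachable SQL Server instance; names/paths are hypothetical.
Import-Module SqlServer

$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'localhost'

# Server.AttachDatabase takes a StringCollection of the physical files
$files = New-Object System.Collections.Specialized.StringCollection
[void]$files.Add('C:\Data\MyDb.mdf')
[void]$files.Add('C:\Data\MyDb_log.ldf')

# Overload: AttachDatabase(name, files, owner, AttachOptions)
$server.AttachDatabase('MyDb', $files, 'sa',
    [Microsoft.SqlServer.Management.Smo.AttachOptions]::None)
```

The `AttachOptions` enum also offers values like `RebuildLog` for the case where the ldf is missing.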

For the nested error handling, I found a great example of handling the error output in Aggregated Intelligence: Powershell & SMO - Copy and attach database. If you don't utilize logic to handle nested errors, your PowerShell error messages will be generic. Handling the nested error property is a must for debugging any errors you run into. http://blog.aggregatedintelligence.com/2012/02/powershell-smocopy-and-attach-database.html
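A sketch of the unwrapping idea, using a hand-built nested exception in place of a real SMO failure:

```powershell
# SMO tends to bury the real SQL error several InnerExceptions deep;
# walking the chain surfaces the root message instead of a generic wrapper.
$inner = [System.Exception]::new('Unable to open the physical file')
$outer = [System.Exception]::new('Attach failed', $inner)

try {
    throw $outer
}
catch {
    $ex = $_.Exception
    while ($null -ne $ex.InnerException) {
        $ex = $ex.InnerException
    }
    Write-Warning "Root cause: $($ex.Message)"
}
```

The same `while` loop dropped into the `catch` block of any SMO call turns "An exception occurred while executing..." into the actual engine error.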

If you want to see some great examples of PowerShell scripting for restores with progress reporting and more, I recommend taking a look at this post, which has a very detailed PowerShell script example: SharePoint Script - Restoring a Content Database.

Parallel Powershell for Running SQL

This is just a quick look; I plan on diving in further, as I'm still working through some of the changes being made in the main parallel modules I utilize for SQL Server. In the meantime, if you are looking for a quick way to leverage parallel query running, take a look at PSParallel. I've avoided PowerShell Jobs/Workflows due to their limitations and the performance penalty I've seen associated with them. For my purposes, I've explored PSParallel and PoshRSJob and found them helpful for running longer queries, as I can have multiple threads running across the servers/databases of my choice with no query windows open in SSMS. PoshRSJob is under more active development, but be aware it has a higher learning curve, as it doesn't handle the implicit import of external variables that PSParallel does. You'll have to work through more issues initially to understand how to correctly pass parameters and how the different scopes of runspaces impact updating shared variables (things get deeper with synchronized hashtables and more :-) ). Hope this helps get you started if you want to give parallel query execution a shot. Here's a function using PSParallel to get you started. Let me know if it helps.
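As a rough sketch of the idea (assuming the PSParallel module is installed; the `Invoke-Parallel` cmdlet name, servers, and queries here are my placeholders, so treat this as a starting point rather than a verified drop-in):

```powershell
# Fan a set of long-running queries out across runspaces instead of
# keeping multiple SSMS query windows open.
Import-Module PSParallel

$targets = @(
    @{ Server = 'sql01'; Query = 'EXEC dbo.RebuildIndexes' },
    @{ Server = 'sql02'; Query = 'EXEC dbo.RebuildIndexes' }
)

$targets | Invoke-Parallel -ThrottleLimit 4 {
    # $_ is one server/query pair; runs on its own runspace thread
    Invoke-Sqlcmd -ServerInstance $_.Server -Query $_.Query
}
```

The throttle limit caps concurrent runspaces, which matters when the queries are IO-heavy on a shared instance.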

Get Backup History for All Databases in Server

Here's a quick snippet to get a listing of the last database backups that occurred on a server. Most solutions provide a single backup listing, but not the brief summary of the last backup details I was looking for.
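A sketch of one way to build that summary, assuming a reachable instance and the standard msdb backup history tables (the server name is a placeholder):

```powershell
# One summary row per database with its most recent backup dates.
$query = @"
SELECT d.name AS DatabaseName,
       MAX(b.backup_finish_date) AS LastBackupFinish,
       MAX(CASE WHEN b.type = 'D' THEN b.backup_finish_date END) AS LastFullBackup
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
       ON b.database_name = d.name
GROUP BY d.name
ORDER BY d.name;
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Query $query
```

The `LEFT JOIN` matters: databases that have never been backed up still appear, with a NULL backup date, which is exactly the gap you want such a report to expose.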

Pizzicato Pizza

Had a blast making this song. Unfortunately, creating video of slow-motion pizza cutting was going to be a bit awkward, as I didn't feel like asking a pizza place to let me stand behind the counter and film pizza assembly. :-) Best listened to with headphones, not a phone speaker. :-)

This was an experimentation with several new tools for me.

Song Creation

I first used PreSonus Studio One 3 to record some basic track parts like the guitar. I then added drums using BFD3 (which I've previously reviewed); using its tools to adjust and randomize the velocity and humanize the rhythm helped create something that sounds more realistic. Then I recorded a fun guitar riff and used Melodyne Essentials, which is included in Studio One, to convert the guitar part into MIDI. This let me turn the recorded guitar part into a MIDI song part, and I chose to replace it with pizzicato strings, as I really liked the sound of the Presence XT string section. I've never heard a post-rock style song with pizzicato strings, so I figured I might bring a little variety to the post-rock mix.

Video Editing

The video editing was extremely simple. I used Edius 8.2, which is a fantastic NLE for broadcast video editing; I plan on writing or creating a video presentation soon to help others interested in video editing see a little more of that process. I graded with Magic Bullet Looks, as I liked the grain and dark feel of the image. Using the stabilization effect in Edius and slowing the speed to 35% (originally 1080p 60fps from a Panasonic GH3), the resulting imagery was great. I recorded it handheld in my backyard on a rainy day; I thought the close-up of the rain puddling would be interesting for something, and I finally found a use. Hope you enjoy my experiment. I'll work later on some walkthroughs for anyone interested in converting audio to MIDI, basic editing with Edius, and some workflow examples with PreSonus Studio One.

Data Compare on Temporal Tables

I hadn't seen much talk about doing data comparisons on temporal tables, as they are a new feature. I went through the exercise of comparing current to historical data to see how Red Gate and Devart handled it. I'm part of the Friends of Red Gate program, so I love checking out their latest updates, and I'm also a regular tester of Devart's tools, which are also fantastic. Both handled temporal tables with aplomb, so here's a quick walkthrough of how I did this.

SSMS 2016 View Of Temporal Table

{{< admonition type="info" title="Updated: 2020-04-29" >}} broken image links removed {{< /admonition >}}

With the latest version of SSMS, you can see the temporal tables labeled and expanded underneath the source table.
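For context, a minimal system-versioned table pair like the one in this walkthrough can be created as follows (the column definitions are my assumptions; the Customers/Customers_Archive names match the example, and the server name is a placeholder):

```powershell
# SQL Server 2016 temporal table: the WITH clause names the history table
# that SSMS then displays nested underneath the source table.
$ddl = @"
CREATE TABLE dbo.Customers
(
    CustomerId INT NOT NULL PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL,
    ValidFrom  DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Customers_Archive));
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Query $ddl
```

Every update or delete against `dbo.Customers` then writes the prior row version into `dbo.Customers_Archive`, which is what the comparisons below run against.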

Red Gate SQL Data Compare 12

To begin the comparison process, you need to do some custom mapping, which requires navigating into the Tables & Views settings in SQL Data Compare.

Unmap the existing options

To remap Customers to Customers_Archive, we need to select the tables and choose to unmap Customers and Customers_Archive from their default matches. This is two unmapping operations.

Setup Compare Key

Go into the comparison settings on the table and designate the key as the value to compare against. For the purposes of this example, I'm just using the key; you can change this however you see fit for your comparison scenario.

Remove any columns from comparison desired

In this example, I'm removing the datetime2 columns being used, to instead focus on the other columns.

Compare results

If no results come back, try turning off the Checksum comparison setting in the compare options (it's there to improve initial compare performance). With it on, I got no results; once I turned it off, the comparison results came back correctly.

Conflict Row

This entry was matched in DbForge SQL Data Compare as a conflict, due to matching the key in a non-unique manner. The two tools take slightly different approaches; here is how it surfaces in RG Data Compare.

Conflict Entry Only In Destination

The entry that DbForge identifies as a potential conflict shows up here under Only In Destination.

Diff Report

Both tools report differences. RG's tool keeps the diff report as simple CSV output. This is fine in the majority of cases, though I'm hoping eventually for additional XLSX and HTML diff reports similar to DbForge's. With CSV output, you could easily consume the information in Power BI, Excel, or even... SQL Server :-) No screenshot on this, as it's just CSV output.

Devart SQL Data Compare

Going into the mapping, you can see support for Customers and Customers_Archive, the temporal history table. In this case, I mapped the current table against the temporal table to compare the current data against the change history.

Choose the key column to compare against

As a simple example, I just provided the primary key. You could get creative with this though if you wanted to compare specific sets of changes.

Handling Conflicts differently

The conflict is handled differently in the GUI than in Red Gate's tool, as DbForge provides a separate tab indicating a conflict. Their documentation says it "Appears only if there are conflict records (records having non-unique values of the custom comparison key)." DbForge Data Compare for SQL Server Documentation - Data Comparison Document

Diff Report

The diff reports provided by DbForge Data Compare are very well designed, and have some fantastic output options for allowing review/audit of the rows.

Diff Report Details

Here is a sample of a detail provided on the diff report. One feature I found incredibly helpful was the bold highlighting on the columns that had diffs detected. You can trim down the report output to only include the diff columns if you wish to further trim the information in the report.

Overall, good experience with both, and they both support a lot of flexibility with more specialized comparisons.

Fixing Untrusted Foreign Key or Check Constraint

Untrusted constraints can appear when you alter/drop foreign key relationships and then add them back without the proper syntax. If you are deploying data through several tables, you might want to disable foreign keys on those tables during the deployment to ensure all the required relationships have a chance to insert their data before validation.

Once you complete the update, you should run a check statement to ensure the foreign key is trusted. The difference in the check syntax is actually ridiculous.... This statement re-enables the constraint but does not validate the existing rows against the foreign key:

alter table [dbo].[ChickenLiver] check constraint [FK_EggDropSoup]

This statement checks the existing rows for adherence to the foreign key relationship and succeeds only if the FK validates successfully. It also flags the metadata so the database engine knows the key is trusted:

alter table [dbo].[ChickenLiver] with CHECK CHECK constraint [FK_EggDropSoup]

I originally worked through this after running sp_Blitz and working through the helpful documentation explaining Foreign Keys or Check Constraints Not Trusted.

Untrusted check constraints and FKs can actually impact query performance, leading to a less optimal plan: the query engine can't assume the constraint or foreign key actually holds for the existing rows, so it can't use that guarantee to simplify the plan.

I forked the script from Brent's link above and modified it to iterate through and generate the script for running the check against everything in the database. This could be modified to be server-wide if you wish. Original DMV query credit goes to Brent; the tweaks for running them against the database automatically are my small contribution.
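The core of that generation can be sketched like this (a simplified stand-in for the forked script, not the script itself; server and database names are placeholders):

```powershell
# Emit a WITH CHECK CHECK statement for every untrusted FK in one database.
$sql = @"
SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.'
     + QUOTENAME(o.name)
     + ' WITH CHECK CHECK CONSTRAINT ' + QUOTENAME(fk.name) + ';'
       AS FixStatement
FROM sys.foreign_keys AS fk
JOIN sys.objects AS o ON o.object_id = fk.parent_object_id
WHERE fk.is_not_trusted = 1;
"@

# Each returned row is a runnable ALTER statement you can review, then execute.
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'MyDb' -Query $sql
```

Generating the statements rather than executing them directly leaves room to review: validating a large table takes a scan, so you may want to schedule the fixes.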

Note: I wrote on this a while back and totally missed that I had covered it. For an older perspective: Stranger Danger... The need for trust with constraints