Over the last two days I’ve written code that automatically logs all SQL commands (database queries, invocations of stored procedures, etc) in MadPoker, my reference application.

If you wish to skip the backstory, go directly to the solution.

Reference Application

What do I mean by “reference application”?

I intend to spend time in the near future learning AngularJS, the client-side JavaScript framework used by my new employer. I’ve always been stronger on server-side development, so I’m excited to improve my client-side programming skills. I plan to write code to convince myself I understand AngularJS concepts and syntax as I read Pro AngularJS and watch Pluralsight courses. I can’t test this code in a vacuum; I need to add it to a functioning website. So I’ll add it to MadPoker, a website I’m developing that enables users to play poker against bots. I call MadPoker my “reference application” because I’ve used the latest ASP.NET Core technologies to build it, and whenever I need to refresh my memory of how a particular feature is enabled in ASP.NET Core, I refer to MadPoker.

I realize AngularJS has been superseded by Angular. But my employer uses AngularJS, not Angular, so learning AngularJS is my priority; I can learn Angular afterward.

Motivation

Before diving into AngularJS code, I want to ensure I’ve built a solid architecture upon which I can rapidly build web services and websites. So over the last few months I’ve written code that leverages the latest ASP.NET Core APIs, Dapper, and Refit to provide the following application features.

  • Automatic logging of all page and service method invocations.
    • Trace Messages
    • Performance Measurements (elapsed time)
    • Metrics (counts of page hits and service method calls)
  • Developer-Initiated Logging
  • Automatic flow of Correlation ID from client to service (to relate logs as code executes across processes).
  • Automatic Exception Logging
  • Automatic flow of exception details from service to client (recursively embeds inner exceptions and call stack).
  • Type-safe invocation of REST web service methods via C# proxy classes.
  • SQL-focused database queries and ORM (no LINQ obfuscation of SQL).
  • Identity Management
    • User Registration
    • Password Hashing
    • Claims-Based Permissions
    • Tamper-Proof JWT Tokens
    • Policy-Based Access Control
  • Dependency-injection of thread-safe singleton services.

I’ve organized my code into an ErikTheCoder namespace of C# classes, classes derived from Microsoft classes, ASP.NET Core middleware, and extension methods; developed and version-controlled my code in public GitHub repositories; and published the compiled binary class libraries as a series of ErikTheCoder packages on NuGet. You’re welcome to use my source code or reference my NuGet packages in your own projects.

One feature missing from my custom-built tech stack is automatic logging of all SQL commands issued by the application. By “automatic” I mean logging should occur without requiring me (or any other developer) to insert SQL logging statements throughout application code. SQL commands should be intercepted and logged before they’re executed. Such a feature would aid troubleshooting of application issues by revealing the exact SQL queries issued by code running in the service tier (the web tier of my application does not query the data tier). Of course this feature must not adversely affect performance of the application.

I’m happy to say I wrote a SQL logging feature over the last couple of days that meets all these requirements. It took longer than I expected because I ran into numerous obstacles, which I’ll discuss in this blog post.

False Starts

Dapper: The first idea I had was to identify an event or virtual method provided by Dapper that I could respond to or override. If Dapper fired an event just before executing a SQL command, or provided a virtual method that it called just prior to executing a SQL command, I could hook into it and log the command. This would have the disadvantage of only logging SQL commands sent through Dapper. However, my reference application uses Dapper exclusively (it does not execute SQL commands directly via ADO.NET or other ORMs), so this would be an acceptable solution. Unfortunately, Dapper does not provide such an event or virtual method. Dapper’s creator, Marc Gravell, said so himself: “Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this.” I chose not to enter the labyrinth of code that is mini-profiler.

SqlConnection: My second idea was to sub-class the SqlConnection class. Dapper operates on the SqlConnection my application provides to communicate with Microsoft SQL Server. This would have the disadvantage of only logging SQL commands sent to a Microsoft SQL Server database. However, my reference application stores data in SQL Server exclusively (it does not communicate with Oracle, MySQL, or other databases), so this would be an acceptable solution. Unfortunately, Microsoft has sealed the SqlConnection class, so it’s not possible to inherit from it.

IDbConnection: My third idea was to implement the IDbConnection interface using an adapter pattern. I’d reference a SqlConnection instance in a private field and implement each method defined by IDbConnection, logging each SQL command before executing it via the private SqlConnection instance. This would have the advantage of logging all SQL commands regardless of whether they were issued via ADO.NET, Dapper, or any other ORM against SQL Server or any other database. (ADO.NET, which defines IDbConnection, sits very low in the Microsoft .NET tech stack.) Unfortunately, Dapper didn’t play along with my plan. It threw the following exception.

Exception Type = System.InvalidOperationException
Exception Message = Async operations require use of a DbConnection or an IDbConnection where .CreateCommand()
returns a DbCommand
Exception StackTrace = at Dapper.SqlMapper.TrySetupAsyncCommand(CommandDefinition command, IDbConnection cnn,
Action`2 paramReader) in C:\projects\dapper\Dapper\SqlMapper.Async.cs:line 402
at Dapper.SqlMapper.QueryRowAsync[T](IDbConnection cnn, Row row, Type effectiveType,
CommandDefinition command) in C:\projects\dapper\Dapper\SqlMapper.Async.cs:line 477
at ErikTheCoder.Identity.Service.Controllers.AccountControllerBase.LoginAsync
(LoginRequest Request)

Thankfully, the programmers who wrote Dapper included a detailed exception message that explains exactly what to do. In order to use Dapper’s async methods (as MadPoker does), Dapper must be provided with a connection that returns a DbCommand class (or sub-class), not an IDbCommand interface. Nick Craver, one of the Dapper developers, explains this is because “IDbCommand pre-dates any async functionality in ADO.NET, no async methods exist on the interface.”

Solution

Having written a class that implements IDbConnection and compiles successfully but throws an exception at runtime, I was 90% done. I merely had to refactor the class to sub-class DbConnection, which required implementing a few more methods. This Dapper accepted.
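
A minimal sketch of the approach looks like this (simplified, not the actual ErikTheCoder code: the connection wraps a SqlConnection, the Action<string> log delegate stands in for my logging library, and the LoggedDbCommand class is shown next).

using System;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using System.Threading;
using System.Threading.Tasks;

public class LoggedDbConnection : DbConnection
{
    private readonly SqlConnection _connection;
    private readonly Action<string> _log;

    public LoggedDbConnection(string connectionString, Action<string> log)
    {
        _connection = new SqlConnection(connectionString);
        _log = log;
    }

    public override string ConnectionString
    {
        get => _connection.ConnectionString;
        set => _connection.ConnectionString = value;
    }

    public override string Database => _connection.Database;
    public override string DataSource => _connection.DataSource;
    public override string ServerVersion => _connection.ServerVersion;
    public override ConnectionState State => _connection.State;

    // Delegate connection management to the wrapped SqlConnection.
    public override void Open() => _connection.Open();
    public override Task OpenAsync(CancellationToken cancellationToken) => _connection.OpenAsync(cancellationToken);
    public override void Close() => _connection.Close();
    public override void ChangeDatabase(string databaseName) => _connection.ChangeDatabase(databaseName);

    protected override DbTransaction BeginDbTransaction(IsolationLevel isolationLevel) =>
        _connection.BeginTransaction(isolationLevel);

    // This is the hook Dapper requires: CreateCommand() must return a DbCommand sub-class.
    protected override DbCommand CreateDbCommand() =>
        new LoggedDbCommand(_connection.CreateCommand(), _log);

    protected override void Dispose(bool disposing)
    {
        if (disposing) _connection.Dispose();
        base.Dispose(disposing);
    }
}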

Similarly, I sub-classed DbCommand.
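
Again, a simplified sketch (not the published code) shows the idea: the command logs its type, text, and parameters immediately before delegating execution to the wrapped command.

using System;
using System.Data;
using System.Data.Common;
using System.Threading;
using System.Threading.Tasks;

public class LoggedDbCommand : DbCommand
{
    private readonly DbCommand _command;
    private readonly Action<string> _log;

    public LoggedDbCommand(DbCommand command, Action<string> log)
    {
        _command = command;
        _log = log;
    }

    // Delegate all properties to the wrapped command.
    public override string CommandText { get => _command.CommandText; set => _command.CommandText = value; }
    public override int CommandTimeout { get => _command.CommandTimeout; set => _command.CommandTimeout = value; }
    public override CommandType CommandType { get => _command.CommandType; set => _command.CommandType = value; }
    public override bool DesignTimeVisible { get => _command.DesignTimeVisible; set => _command.DesignTimeVisible = value; }
    public override UpdateRowSource UpdatedRowSource { get => _command.UpdatedRowSource; set => _command.UpdatedRowSource = value; }
    protected override DbConnection DbConnection { get => _command.Connection; set => _command.Connection = value; }
    protected override DbParameterCollection DbParameterCollection => _command.Parameters;
    protected override DbTransaction DbTransaction { get => _command.Transaction; set => _command.Transaction = value; }

    public override void Cancel() => _command.Cancel();
    public override void Prepare() => _command.Prepare();
    protected override DbParameter CreateDbParameter() => _command.CreateParameter();

    // Log, then delegate execution to the wrapped command.
    public override int ExecuteNonQuery() { LogCommand(); return _command.ExecuteNonQuery(); }
    public override object ExecuteScalar() { LogCommand(); return _command.ExecuteScalar(); }
    protected override DbDataReader ExecuteDbDataReader(CommandBehavior behavior) { LogCommand(); return _command.ExecuteReader(behavior); }
    protected override Task<DbDataReader> ExecuteDbDataReaderAsync(CommandBehavior behavior, CancellationToken cancellationToken)
    { LogCommand(); return _command.ExecuteReaderAsync(behavior, cancellationToken); }

    // Write the command type, SQL text, and parameter values to the log.
    private void LogCommand()
    {
        _log($"Database command type = {CommandType}.");
        _log($"Database command text = {CommandText}.");
        foreach (DbParameter parameter in _command.Parameters)
            _log($"Database command parameter {parameter.ParameterName} = {parameter.Value}.");
    }
}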

To use these logged database connections and commands, inject an ILoggedDatabase singleton during application startup.
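
A sketch of that registration in Startup.ConfigureServices might look like the following. The LoggedDatabase class name and constructor arguments shown here are illustrative assumptions (as is the ILogger registration); see the Data project on GitHub for the actual types.

public void ConfigureServices(IServiceCollection services)
{
    // Register a single thread-safe instance that serves all requests.
    services.AddSingleton<ILoggedDatabase>(serviceProvider => new LoggedDatabase(
        serviceProvider.GetRequiredService<ILogger>(),       // zero-latency, thread-safe logger (assumed registered earlier)
        Configuration.GetConnectionString("MadPoker")));     // connection string used by the logged SqlConnection
}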

Then reference the ILoggedDatabase (the _database variable below) in website or WebAPI controllers when asynchronously opening database connections.
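
For example, a controller might do the following (a sketch: the OpenConnectionAsync method name and the User class are assumptions based on the description above).

using System.Threading.Tasks;
using Dapper;
using Microsoft.AspNetCore.Mvc;

public class AccountController : Controller
{
    private readonly ILoggedDatabase _database;

    public AccountController(ILoggedDatabase database) => _database = database;

    [HttpPost("/account/login")]
    public async Task<IActionResult> LoginAsync(LoginRequest request)
    {
        // The connection returned here is the DbConnection sub-class shown above,
        // so every Dapper call below is logged before it executes.
        using (var connection = await _database.OpenConnectionAsync())
        {
            const string sql = @"
                select u.Id, u.Username, u.PasswordManagerVersion, u.Salt, u.PasswordHash
                from [Identity].Users u
                where u.Username = @username and u.Confirmed = 1 and u.Enabled = 1";
            var user = await connection.QuerySingleOrDefaultAsync<User>(sql, new { username = request.Username });
            if (user is null) return Unauthorized();
            // ... verify the password hash, load roles and claims, issue a JWT ...
            return Ok(user);
        }
    }
}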

Because these logged database connections and commands leverage my zero-latency, thread-safe logger to write to files and/or a database, they do not adversely impact application performance. The logging API calls do not block waiting for I/O to complete; they merely add log messages to an in-memory queue. A ThreadPool thread removes log messages from the queue and writes them to disk and/or a database. The logging code guarantees thread-safety by using the BlockingCollection and ConcurrentQueue classes. In fact, the entire login process for MadPoker completes in only 27 milliseconds. This includes an HTTP form post, three SQL queries, and JSON serialization and deserialization between the website and service tiers.
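
The sketch below illustrates that producer / consumer pattern (it is not the actual ErikTheCoder.Logging implementation): callers enqueue messages without blocking, and a single background task drains the queue and performs the file I/O.

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

public class QueuedLogger : IDisposable
{
    // BlockingCollection wraps a ConcurrentQueue, so Add and GetConsumingEnumerable are thread-safe.
    private readonly BlockingCollection<string> _messages = new BlockingCollection<string>(new ConcurrentQueue<string>());
    private readonly Task _consumer;

    public QueuedLogger(string logFilename)
    {
        // A ThreadPool task drains the queue and writes messages to disk.
        _consumer = Task.Run(() =>
        {
            using (var writer = new StreamWriter(logFilename, append: true))
            {
                foreach (var message in _messages.GetConsumingEnumerable()) writer.WriteLine(message);
            }
        });
    }

    // Callers never block on I/O; they merely enqueue the message.
    public void Log(string message) => _messages.Add(message);

    public void Dispose()
    {
        _messages.CompleteAdding(); // Signal the consumer to finish writing queued messages, then exit.
        _consumer.Wait();
        _messages.Dispose();
    }
}

Querying the performance log confirms the end-to-end timing of the login operation.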

select p.*
from [Logging].PerformanceLogsLastDay p
where p.Timestamp > '7/6/2019 10:32:06 PM'
order by p.Id

Id Timestamp CorrelationId HostName AppName ProcessName OperationName OperationDuration
782 2019-07-06 22:32:14.1466667 19A60DCA-6D57-413A-AF94-B32EA151453F THECOLONEL MadPoker Identity Service /account/login 00:00:00.0198530
783 2019-07-06 22:32:14.1500000 19A60DCA-6D57-413A-AF94-B32EA151453F THECOLONEL MadPoker Website /account/login 00:00:00.0268323

The three SQL queries (and their parameters) issued during the login process are logged.

Database command type = Text
Database command text =
select u.id, u.Username, u.PasswordManagerVersion, u.Salt, u.PasswordHash,
u.EmailAddress, u.FirstName, u.LastName
from [Identity].Users u
where u.Username = @username
and u.Confirmed = 1
and u.Enabled = 1.
Database command parameter Username = emadsen.

Database command type = Text
Database command text =
select r.Name
from [Identity].UserRoles ur
inner join [Identity].Users u on ur.UserId = u.Id
inner join [Identity].Roles r on ur.RoleId = r.Id
where u.Id = @id
order by r.Name asc.
Database command parameter Id = 2.

Database command type = Text
Database command text =
select c.[Type], uc.[Value]
from [Identity].UserClaims uc
inner join [Identity].Users u on uc.UserId = u.Id
inner join [Identity].Claims c on uc.ClaimId = c.Id
where u.Id = @id
order by c.[Type] asc, uc.[Value] asc.
Database command parameter Id = 2.

You may review the full source code in my Data project on GitHub.

7 Replies to “Automatically Log All SQL Commands”

  1. Thanks for this post.
    Just wanted to know why you think miniprofiler is a labyrinth of code?
    Also, any reason why you implemented logging yourself rather than using a library like Serilog?

    1. Regarding miniprofiler, all I mean is it’s a lot of code. I’d either need to a) read it (or the relevant portion of it) to determine if it does what I need with the performance characteristics I need or b) add its binary package to my projects, observe it, and guess if it’s doing what I need it to do. Why take a dependency on a library that implements way more features than I need?

      The main reason I rolled my own is because logging is an easy feature to implement and homegrown solutions are not dangerous. It’s not an encryption cipher (that is, something better left to experts). It’s beneficial for troubleshooting, its features are minimal and obvious, and it can be implemented in a few hundred lines of code. See the Motivation section of the ReadMe file (https://github.com/ekmadsen/Logging#motivation) for a more complete answer.

  2. Great! Exactly what I was looking for. I am using the logging to create complete database scenarios for automated tests.
    And with your solution, I can use any database I like without the need for special providers.
