More than mediocre.

Data Migrations and Microsoft Data Platform Reliability Engineering.

When it comes to data integrity, availability, or security, there is no place for mediocre solutions. Databases are at the centre of most operations, and a small mistake - an outage, data loss, a data breach - can lead to financial and reputational loss. Do not let any of this happen to you. I specialise in Data Migrations as well as SQL Server and Power BI technologies. For over 20 years I have been helping businesses migrate and integrate data between disparate enterprise solutions, and design, develop, and maintain reliable database platforms. I am also the creator of SQLWATCH.IO - an Open Source SQL Server Performance Monitor.

Get in touch to find out how I can help you.

Move data to Azure Archive Storage using PowerShell

Concept You may have come across the term multi-tiered storage. This means that the storage solution has multiple arrays: fast but expensive, and slow but cheap. Files accessed very frequently are stored on very fast SSD disks; files accessed less frequently are stored on the much cheaper but much slower spinning disks. Files designated for …
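As a minimal sketch of the archive tier in practice, assuming the Az.Storage PowerShell module and placeholder account, key, and container names:

```powershell
# Requires the Az.Storage module: Install-Module Az.Storage
Import-Module Az.Storage

# Placeholder storage account and key - substitute your own
$context = New-AzStorageContext -StorageAccountName 'mystorageaccount' `
                                -StorageAccountKey $env:STORAGE_KEY

# Upload a file straight into the Archive tier so it never incurs hot-tier cost
Set-AzStorageBlobContent -File 'C:\Backups\old-backup.bak' `
                         -Container 'archive' `
                         -Blob 'old-backup.bak' `
                         -Context $context `
                         -StandardBlobTier Archive
```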

Dynamic data sources in PowerBI Desktop

Abstract PowerBI Desktop is a data analytics and presentation tool. Defining data sets is very easy and usually involves creating a connection string to the source data and defining the objects to pull data from. Alternatively, for SQL databases, we can write a custom SQL script to query the source data. However, if we want to pull data from …

Generate realistic test data

As data professionals, we often need test data, whether functional, to satisfy business logic criteria, or non-functional, to satisfy performance requirements. We must also not store any sensitive or personal information in non-production systems, as doing so could be against the General Data Protection Regulation (GDPR). A common approach is to refresh test environments …
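A toy sketch of the idea in PowerShell - the column names and value pools below are made up purely for illustration:

```powershell
# A naive generator: builds synthetic customer rows from value pools,
# so no real personal data ever leaves production.
$firstNames = 'Anna','James','Priya','Tomasz','Mei'
$lastNames  = 'Smith','Kowalski','Patel','Chen','Jones'

$rows = 1..1000 | ForEach-Object {
    [pscustomobject]@{
        CustomerId = $_
        FirstName  = Get-Random -InputObject $firstNames
        LastName   = Get-Random -InputObject $lastNames
        SignUpDate = (Get-Date).AddDays(-(Get-Random -Maximum 3650))
    }
}

# Write to CSV ready for bulk load into the test database
$rows | Export-Csv -Path '.\test-customers.csv' -NoTypeInformation
```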

How to intelligently auto-cycle ERRORLOG using T-SQL

In my previous post, I described different ways to read a very large SQL Server log (ERRORLOG) – 5GB to be exact. However, that was a reaction to something that had already happened. In this post, I will show you how to prevent it from happening in the first place.
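The post itself does this in T-SQL; as a rough PowerShell-driven sketch of the same idea, assuming the SqlServer module and sysadmin permissions (the instance name and threshold are placeholders):

```powershell
# Cycle the ERRORLOG only when the current log exceeds a size threshold
$thresholdBytes = 100MB

# xp_enumerrorlogs lists the logs; archive number 0 is the live ERRORLOG
$logs = Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'EXEC master.dbo.xp_enumerrorlogs;'
$currentSize = ($logs | Where-Object { $_.'Archive #' -eq 0 }).'Log File Size (Byte)'

if ($currentSize -gt $thresholdBytes) {
    # sp_cycle_errorlog closes the current log and starts a new one
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'EXEC master.dbo.sp_cycle_errorlog;'
}
```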

Rapid SQL Server test instances with Docker and Azure

As a DBA, a developer, and, more importantly, the creator of SQLWATCH.IO, I need the ability to rapidly deploy and test different SQL Server configurations, or to test different upgrade variations from one version of SQLWATCH to another. This is a laborious, time-consuming task, as I either have to build a new SQL Server instance of a specific …
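For context, a throwaway instance can be as simple as the following, assuming Docker is installed (the container name and SA password are placeholders):

```powershell
# Spin up a disposable SQL Server 2019 container for testing
docker run -d --name sqlwatch-test `
    -e 'ACCEPT_EULA=Y' `
    -e 'MSSQL_SA_PASSWORD=YourStrong!Passw0rd' `
    -p 1433:1433 `
    mcr.microsoft.com/mssql/server:2019-latest

# Tear it down when the test run is finished
docker rm -f sqlwatch-test
```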

Add colours to the ERRORLOG

When dealing with large logs, it is often difficult to find the information we are looking for. To make it easier, we can use PowerShell to add colours to the ERRORLOG based on string patterns. Note that this will only work when using the Select-String command. Remote ISE Even though we need physical access to the ERRORLOG files, …
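The post builds on Select-String; purely to illustrate the effect, here is a simplified sketch using -match instead (the log path is a typical default and will vary by instance):

```powershell
# Stream the ERRORLOG and colour each line by pattern
$log = 'C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\Log\ERRORLOG'

Get-Content -Path $log | ForEach-Object {
    if     ($_ -match 'error|failed') { Write-Host $_ -ForegroundColor Red }
    elseif ($_ -match 'warning')      { Write-Host $_ -ForegroundColor Yellow }
    else                              { Write-Host $_ }
}
```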

How to read large SQL Server ERRORLOG (the 5GB test)

Let’s take a look at different ways of reading the SQL Server ERRORLOG. I often come across large ERRORLOGs, mainly because they are not being recycled frequently enough, or not at all. Sometimes, despite frequent recycling, the ERRORLOG can grow to an unmanageable size. This recently happened to me during a large transaction rollback over a synchronous Availability Group. Within a few hours, the ERRORLOG grew to 5GB in size due to “Sending page request for Undo” messages.
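For a flavour of the streaming approaches, a small sketch (the log path is a typical default and will vary by instance):

```powershell
# Two ways to read a multi-gigabyte ERRORLOG without loading it all into memory
$log = 'C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\Log\ERRORLOG'

# 1. Stream in 10,000-line batches and filter as you go
Get-Content -Path $log -ReadCount 10000 |
    ForEach-Object { $_ -match 'Sending page request' }

# 2. Jump straight to the end - often all you need after an incident
Get-Content -Path $log -Tail 500
```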

Database Development in Visual Studio

A lot of database administrators and developers like to use SQL Server Management Studio (SSMS) to make changes to the database schema directly. As a production DBA, I can definitely say that there are situations where this is acceptable and even desired; however, in most cases a Visual Studio Database Project is a much better approach. I made the switch from developing in SSMS to Visual Studio several years ago and have never looked back.
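For instance, a database project compiles to a .dacpac, which can then be deployed repeatably from the command line - a sketch assuming SqlPackage.exe is on the PATH and using placeholder names:

```powershell
# Publish the compiled project, bringing the target database in line with it
& SqlPackage.exe /Action:Publish `
    "/SourceFile:.\bin\Release\MyDatabase.dacpac" `
    /TargetServerName:localhost `
    /TargetDatabaseName:MyDatabase
```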

SQL Agent Jobs Timeline with dbatools.io (download)

An important element of performance monitoring is knowing which SQL Agent jobs run at what time, to understand and avoid potential clashing. Notice the emphasis on what time - time correlation is very important when it comes to performance, but finding it is not always easy. I wrote a simple PowerShell script to help visualise which jobs run and when.
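A minimal sketch of the data-gathering half, assuming the dbatools module; the property names are as I recall dbatools exposing them, so verify them against your version:

```powershell
# Pull job run history and shape it into start/end pairs ready for a timeline visual
Import-Module dbatools

Get-DbaAgentJobHistory -SqlInstance 'localhost' -ExcludeJobSteps |
    Select-Object Job, StartDate, EndDate, Duration |
    Export-Csv -Path '.\job-timeline.csv' -NoTypeInformation
```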

SQL Server Performance Dashboard using PowerBI (download)

I often help improve the performance of SQL Server instances and applications. Performance metrics in SQL Server are exposed via Dynamic Management Views (DMVs). However, DMVs only provide a view of the current state, with no history. This makes it particularly difficult to draw a bigger picture of how the system behaves over time and what problems occur during overnight batch processing or at peak operational times - for example, when users log in to the system at 8am or leave for lunch at 1pm. To address this deficiency, I have built a simple yet comprehensive SQL Server Performance Dashboard in PowerBI.
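To illustrate the gap: DMVs can be snapshotted on a schedule to build the missing history. A sketch assuming the SqlServer module and a hypothetical history table you would create beforehand:

```powershell
# Persist a point-in-time copy of the performance counters DMV;
# dbo.perf_counter_history and the target database are placeholders
$query = @"
INSERT INTO dbo.perf_counter_history (snapshot_time, object_name, counter_name, cntr_value)
SELECT SYSDATETIME(), object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters;
"@

# Run from SQL Agent or a scheduled task, e.g. every minute
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'SQLWATCH' -Query $query
```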
