Attach to Process


To have Azure keep sending emails repeatedly for your Health Check alerts, uncheck the checkbox that says “Automatically Resolve Alerts”.


If you want to see what kind of error Axios throws, it is better to use console.log(error.toJSON()) than console.log(error). The former lists out all the properties on the error object, while the latter only prints the error message itself.


Instead of adding TODO comments, track them as tasks so they don't get lost.

Source: TODO: Post an Article


Looking for a new way to do end-to-end testing on modern web apps? Check out Playwright. The tests are pretty easy to follow because they're basically a set of instructions written in code. So as you might have guessed, it caters more to developers than non-technical folks.


I am going through the Pro ASP.NET Core 6 book. After trying to run the command libman install bootstrap@5.1.3 -d wwwroot/lib/bootstrap from page 72 (Chapter 4) of the book, I started getting the error listed below:

[LIB002]: The "bootstrap@5.1.3" library could not be resolved by the "cdnjs" provider

Turns out it was an issue with the library manager package that was installed. The fix is to install a newer one. More info can be found here:

https://github.com/Apress/pro-asp.net-core-6/blob/main/errata.md


Was looking for a way to add a list of objects to the appsettings.json file, then retrieve that list and turn it into something I can check against in a controller class. This answer works wonderfully.


React Hook Form — seems like a cool alternative form handler for React.


SQL to MongoDB Mapping Chart — good reference on how to write MongoDB queries based on expected SQL Server queries.


Reasons to blog if you're a software developer:

  • It helps you write better emails at work.
  • It helps you write better user stories and acceptance criteria.
  • It helps you communicate your ideas better.

If you're noticing a pattern there, it's all about being a better communicator. And you want that, because better communication also helps you advance in your career.


Series: #DevNotes Tags: #Axios #ASPNetCore #Azure #Blogging #JavaScript #MongoDB #Playwright #React #SqlServer #Tests

Discuss... or leave a comment below.

I seem to have a need for this every 6 months or so. So I'm putting it up on this site to make it easier for me to look up.

This SQL Server query allows you to search for specific text inside all stored procedures on a database. The query returns the names of the matching stored procedures.

SELECT name 
FROM   sys.procedures 
WHERE  Object_definition(object_id) LIKE '%Search-Text%' 

Source: Search text in stored procedure in SQL Server

Tags: #SqlServer #Scripts

Discuss... or leave a comment below.

This is sound advice. Using -- to comment out scripts in SQL is generally okay. But there is a risk of the script breaking when it suddenly gets collapsed into a single line. So the better practice is to use /* comment goes here */ instead. Hats off to Brent Ozar's post for this tip.

Link: Never, Ever, Ever Start T-SQL Comments with Two Dashes


I figured I would add some sample script to illustrate the problem. So here goes.

Original Script

delete from dbo.Customers
--where createDate >= '20211201'
where customerId = 99

Take, for example, the script above. You originally wrote that script to delete all Customer records in the database created on or after December 1st. You then realize that it will delete more records than you wanted, so you comment out the createDate filter and use a customerId filter instead.

Now in its current form, the above script would work just fine. You run it and it will delete the record where customerId is equal to 99. All is good.

But what happens if that script is somehow parsed by some other tool, and the resulting script is displayed on just one line? Here is what that script looks like.

Broken Script

delete from dbo.Customers --where createDate >= '20211201' where customerId = 99

As you can see from the resulting broken script listed above, instead of just deleting the record where customerId is equal to 99, you will now be deleting all the records in the Customers table. Yikes!
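For comparison, here is the same script with the filter commented out using a block comment instead. Even if some tool collapses it onto a single line, it still only deletes the record where customerId is 99.

delete from dbo.Customers
/* where createDate >= '20211201' */
where customerId = 99

Collapsed onto one line, it stays safe:

delete from dbo.Customers /* where createDate >= '20211201' */ where customerId = 99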

Tags: #SQL #SqlServer

Discuss... or leave a comment below.

Recently had need of a script that can tell me the time difference, in seconds, between two DateTime values in SQL Server. Here is what worked for me:

SELECT DATEDIFF(S, StartDateTime, EndDateTime) AS DurationInSeconds;

Note the first parameter, S, tells the DATEDIFF function to return the difference in Seconds. You can pass in other values as well, like MS for milliseconds or HH for hours.
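For example, the same calculation in milliseconds or hours would look like this:

SELECT DATEDIFF(MS, StartDateTime, EndDateTime) AS DurationInMilliseconds;
SELECT DATEDIFF(HH, StartDateTime, EndDateTime) AS DurationInHours;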

You can find out more about the DATEDIFF function here.

Tags: #Scripts #SqlServer

Discuss... or leave a comment below.

If you're trying to publish a SQL Server database project using Visual Studio, but you're not seeing the Publish dialog box show up, check your other virtual desktops in Windows. I've noticed that every once in a while, the Publish dialog box shows up on another virtual desktop instead of the one you're currently working on.

Tags: #SqlServer #VisualStudio

Discuss... or leave a comment below.

Recently had need of a SQL Server script that can tell me if there are open transactions on the database. The script below worked for me:

USE master
GO
SELECT spid,
       program_name,
       nt_username,
       loginame,
       DB_NAME(s.dbid) AS DatabaseName,
       CR.text AS Query
FROM   sysprocesses s
       CROSS APPLY sys.dm_exec_sql_text(sql_handle) CR
WHERE  open_tran = 1

Source: Welcome To TechBrothersIT: DBA – How To Find Open Transactions In SQL Server

Tags: #Database #Scripts #SqlServer

Discuss... or leave a comment below.

Unintentionally running data manipulation scripts on a Production database is obviously bad. In this post, I'll share some tips for working with Production databases so the same thing doesn't happen to you.

1. Check your Database Connections

Double-check your database connection before running any scripts. Oftentimes, this simple check is all you need to avoid accidentally deleting data in a Production database.

2. Start a New Instance of SQL Server Management Studio

When connecting to a Production database, start up a new instance of SQL Server Management Studio. The idea is to keep Production scripts separate from scripts for other databases. This helps you distinguish between database connections, and hopefully stops you from running a script that will mess up the Production database.

3. Work off Database Backups

Better yet, if you have the option, restore a backup copy of the Production database on a dev box and work off that. This way, you can mess with the data and not worry about affecting Production.

Note: Make sure restoring a database to a local dev box is allowed per your company policy.
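If it helps, here is a rough sketch of what such a restore could look like. The database name, backup path, and logical file names below are all made up; adjust them to your environment (RESTORE FILELISTONLY will show you the actual logical names inside the backup).

-- Hypothetical names and paths; check the backup's logical file names first:
-- RESTORE FILELISTONLY FROM DISK = 'C:\Backups\MyAppDb_PROD.bak';

RESTORE DATABASE MyAppDb_Dev
FROM DISK = 'C:\Backups\MyAppDb_PROD.bak'
WITH MOVE 'MyAppDb'     TO 'C:\SqlData\MyAppDb_Dev.mdf',
     MOVE 'MyAppDb_log' TO 'C:\SqlData\MyAppDb_Dev_log.ldf',
     REPLACE;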

4. Use Database Transactions

If the script you are running has the potential to change a database's data or schema – like if it has the words ALTER, CREATE, DELETE, INSERT, UPDATE, or any other statement that can modify the contents of a database – then you should be wrapping the script in a transaction. BEGIN TRAN is your friend. I do this even when I'm running the scripts on a local database instance. It might sound like overkill to use transactions on scripts running on non-PROD databases, and maybe it is. But what's an extra 10 seconds or so of doing that, if it prevents you from accidentally blowing up the Production database? I do it for peace of mind. It's part of my routine now. Anytime I write a script that has the words ALTER, CREATE, DELETE, INSERT or UPDATE, I wrap it in a transaction. It doesn't even matter what database I'm connected to.

This is how I wrap scripts in a transaction:

BEGIN TRAN

-- ALTER, CREATE, DELETE, INSERT, UPDATE Statements go here

ROLLBACK TRAN
COMMIT TRAN

I use it this way: I highlight everything from BEGIN TRAN down to just before the ROLLBACK TRAN statement, run that, and double-check the results. This gives me the opportunity to either ROLLBACK or COMMIT depending on the results.
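Here is a concrete sketch of that workflow, using a hypothetical delete against a Customers table as the statement being tested:

BEGIN TRAN

DELETE FROM dbo.Customers
WHERE  customerId = 99;

-- Highlight and run everything down to here, then check the row count
SELECT @@ROWCOUNT AS RowsDeleted;

-- If the row count looks wrong, run the ROLLBACK; if it looks right, run the COMMIT
ROLLBACK TRAN
COMMIT TRAN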

Also, you'll notice above that I have a ROLLBACK TRAN just before the COMMIT TRAN statement. This is just an extra safety precaution that can be especially useful when you're sending out scripts to someone else. They might accidentally run the scripts in a Production database, but the ROLLBACK TRAN will make sure nothing bad happens.

5. Close Instances of SQL Server Management Studio

Before going home or logging off from work, close any instances of SQL Server Management Studio, especially those that you used to connect to a Production database. You'll want to do this because closing an SSMS instance while you have an open transaction will cause it to display a warning. An open transaction has the potential to block queries and cause timeouts. That warning will remind you to either rollback or commit your transactions before logging off.
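If you want to double-check whether the query window you're about to close still has an open transaction, @@TRANCOUNT reports the number of open transactions for the current session:

SELECT @@TRANCOUNT AS OpenTransactionCount;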

That's it for today. I hope those tips are helpful. If you have your own tips for working with Production databases, do share them by leaving a comment below or sending me a message.

Tags: #Database #Scripts #SqlServer

Discuss... or leave a comment below.

For the past week or so, I've noticed that my C drive was constantly running out of space. Normally, I have a little over 200 GB of free space on my C drive. I ran the SpaceSniffer tool to figure out what was taking up all the space. It pointed me to a set of temp files used by SQL Server. One file in particular, “templog.ldf”, was taking up 40 GB of space. The other tempdb files were 17 GB each, and there were 8 of them. My understanding is that these files should automatically shrink back down, so why weren't they?

I know what caused those files to increase in size. I've been restoring large database backups locally, and I've done it many times throughout the week. The stored procedures in those databases use temp tables a lot. So, I'm not surprised the SQL Server temp files got so big. But I'm still not sure why those files were not shrinking back down to their normal sizes. They kept their size even after republishing to an empty database. If that were always the case, then I would have noticed this issue years ago. Yet, it has only been happening recently.

While I don't exactly have an explanation for why those files remained huge, I did find a way to shrink them back down. The solution was simple: restart the SQL Server service. Once I did this, the temp files went back to their regular sizes. Restarting the PC would also work, but I haven't done that in a while since I started working from home. I believe that contributed to the issue of the files keeping their huge sizes. So, going forward, I will restart my work PC every weekend.
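If you want to see how big the tempdb files currently are, a query along these lines should do it (size is stored in 8 KB pages, hence the math):

SELECT name,
       type_desc,
       size * 8 / 1024 AS SizeMB
FROM   tempdb.sys.database_files;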

Tags: #Database #SqlServer

Discuss... or leave a comment below.

As software developers, one of the problems we have to be mindful of is our tendency to “reinvent the wheel”. By this I mean the tendency for developers to write a new application to solve a problem that could have been solved by simply using existing applications/tools. When faced with the problem of importing all ObjectIds from a MongoDB collection into a SQL Server table, I thought I needed to write a new migration/utility app. Turns out I didn't need to write any code at all, well, except for the MongoDB query of course, but that's beside the point. Here is how I solved this problem using free tools available on the Internet.

Solution

First up is the MongoDB query that will get you all ObjectIds from a collection. I use the free Robo 3T app to query MongoDB.

db.getCollection('Collection').find({},{_id:1});

Robo 3T has a “view results in text mode” option that will display the results in JSON format. Select that and then run the query listed above. You will then be presented with results that will look similar to what I have below.

/* 1 */
{
    "_id" : ObjectId("5cba07dcca67bd08e8a6b3e2")
}

/* 2 */
{
    "_id" : ObjectId("5cba07b4ca67bc33705ded1d")
}

/* 3 */
{
    "_id" : ObjectId("5cba07baca67bc33705ded1e")
}

So now we have a list of ObjectIds from MongoDB. Imagine if you got back thousands of ObjectIds. It would be tempting at this point to say something like, “Hey, I need to write a console app that can parse these results.” I know I did. And I did write such an app, but eventually I realized I didn't need to. All I needed was Notepad++. Actually, any text editor with a good enough “search and replace” feature will work.

After copying those results from Robo 3T into Notepad++, you can then use the “search and replace” functionality to transform the results into INSERT scripts that you can run in SQL Server.

So the first thing you need to do is comment out all those brackets { and }. To do this, you simply search for { or } and replace them with --{ or --}. Adding -- to the start of a line will comment it out in SQL Server. You can also opt to simply delete all those brackets; SQL Server won't care if there are blank lines between the INSERT scripts. At this point your text file in Notepad++ will look similar to what I have below.

/* 1 */
--{
   "_id" : ObjectId("5cba07dcca67bd08e8a6b3e2")
--}

/* 2 */
--{
   "_id" : ObjectId("5cba07b4ca67bc33705ded1d")
--}

/* 3 */
--{
   "_id" : ObjectId("5cba07baca67bc33705ded1e")
--}

The second thing you need to do is transform the lines with the ObjectId values in them into SQL Server INSERT statements. To do this, go through two steps:

  1. Search for "_id" : ObjectId(" and replace it with INSERT INTO [dbo].[MongoDbObjectIds] ([ObjectId]) VALUES ('. (This is assuming you have a MongoDbObjectIds table in SQL Server with an ObjectId column; there's a sketch of such a table further below.)
  2. Search for ") and replace it with ');. This will round out the INSERT statements. At this point, you should have valid INSERT statements that can be used in SQL Server. They should look similar to the ones I have below.
   /* 1 */
   --{
      INSERT INTO [dbo].[MongoDbObjectIds] ([ObjectId]) VALUES ('5cba07dcca67bd08e8a6b3e2');
   --}

   /* 2 */
   --{
      INSERT INTO [dbo].[MongoDbObjectIds] ([ObjectId]) VALUES ('5cba07b4ca67bc33705ded1d');
   --}

   /* 3 */
   --{
      INSERT INTO [dbo].[MongoDbObjectIds] ([ObjectId]) VALUES ('5cba07baca67bc33705ded1e');
   --}

All that's left to do now is copy the INSERT statements, run them in SQL Server Management Studio, and you are done.
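In case you don't already have the target table, here is a minimal sketch of the MongoDbObjectIds table those INSERT statements assume. The CHAR(24) type is just one reasonable choice, since ObjectIds are 24-character hex strings:

CREATE TABLE [dbo].[MongoDbObjectIds]
(
    [ObjectId] CHAR(24) NOT NULL
);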

Now it must be noted that this approach to getting ObjectIds out of MongoDB and into SQL Server is useful only if you are doing it once or twice. As soon as you have to repeat this process multiple times throughout the week, or worse, throughout the day, then by all means write a migration/utility app to automate the process. It might take you longer to finish, but the reusability of said migration application will pay for itself in the future.

Hope you found this post helpful. Happy Friday to everyone!

Tags: #MongoDB #SqlServer #Scripts #Database

Discuss... or leave a comment below.

One of the best additions to SQL Server 2016 is native support for JSON. There are a number of articles already covering how to query JSON in SQL Server, so I won't cover those. What particularly gives me headaches, though, is querying a JSON array in SQL Server. Hopefully the scripts below can help someone else.

The SELECT script below can be used when you are working with a very basic JSON array that doesn't even have a Property for the array. In this example, we have an array of OrderIds.

DECLARE @OrderIdsAsJSON VARCHAR(MAX) = '["1", "2", "3"]';

SELECT [value] AS OrderId
FROM OPENJSON(@OrderIdsAsJSON)
WITH
(
       [value] BIGINT '$'
);

The second SELECT script below can be used when you are working with a JSON array that does have a Property defined for the array. In this example, we have an array of OrderIds with a defined OrderId property.

DECLARE @OrderIdsAsJSON VARCHAR(MAX) = '{"OrderId":["4", "5", "6"]}';

SELECT [value] AS OrderId
FROM OPENJSON(@OrderIdsAsJSON, '$.OrderId')
WITH
(
       [value] BIGINT '$'
);

Update: 2019-07-17

Adding another example for a JSON array that came from an Int list in C#. This is how you could query it in SQL Server.

DECLARE @OrderIds VARCHAR(MAX) = '{"$type":"System.Int64[], mscorlib","$values":[4, 5, 7, 999]}';

SELECT [value] AS OrderId
FROM OPENJSON(@OrderIds, '$."$values"');
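Once the array is parsed, the result can be treated like any other rowset. For example, assuming a hypothetical dbo.Orders table with an OrderId column, you could filter it against the parsed list like this:

DECLARE @OrderIds VARCHAR(MAX) = '{"$type":"System.Int64[], mscorlib","$values":[4, 5, 7, 999]}';

SELECT o.*
FROM   dbo.Orders o
       INNER JOIN OPENJSON(@OrderIds, '$."$values"') ids
               ON o.OrderId = CAST(ids.[value] AS BIGINT);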

#SqlServer #Scripts #Database #JSON

Discuss... or leave a comment below.