Find all the unused SQL database files on your SQL server

If you're like me, you have been frustrated by finding stray mdf, ldf, or ndf files on your SQL server from databases deleted long ago that should have been cleaned up but someone forgot.

To permanently solve this problem I have written a couple of PowerShell scripts and configured a scheduled task that emails our DBA group whenever SQL database files (mdf, ldf, ndf) are sitting on any local drive attached to a SQL server without being used by a database on that server.

First up we have a simple helper function that allows us to run SQL statements from any machine with .NET installed, without needing any SQL-specific utilities:

function Invoke-SQL {
    param(
        [string] $dataSource = ".\SQLEXPRESS",
        [string] $database = "MasterData",
        [string] $sqlCommand = $(throw "Please specify a query.")
    )

    # Connect with integrated security so no credentials need to be stored in the script
    $connectionString = "Data Source=$dataSource; " +
            "Integrated Security=SSPI; " +
            "Initial Catalog=$database"

    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $command = New-Object System.Data.SqlClient.SqlCommand($sqlCommand, $connection)
    $connection.Open()

    # Run the query and capture the results in a DataSet
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter $command
    $dataSet = New-Object System.Data.DataSet
    $adapter.Fill($dataSet) | Out-Null

    $connection.Close()
    $dataSet.Tables
}
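For example, you could call it like this (the server name and query here are just placeholders for your own):

# Example usage; replace MySQLServer with one of your servers.
Invoke-SQL -dataSource "MySQLServer" -database "master" -sqlCommand "SELECT name FROM sys.databases"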

Next up we have Get-SQLDatabaseFilesNotUsed:

function Get-SQLDatabaseFilesNotUsed {
    param ( 
        [parameter(Mandatory = $true)][string]$SQLServerName,
        [switch]$IncludeEqual
    )

    $SQLCommand = @"
set nocount on
SELECT DB_NAME([database_id]) AS [Database Name], 
        [file_id], name, physical_name, type_desc, state_desc, 
        CONVERT( bigint, size/128.0) AS [Total Size in MB]
FROM sys.master_files WITH (NOLOCK)
ORDER BY DB_NAME([database_id]) OPTION (RECOMPILE);
"@

    # Ask the SQL instance which database files it is currently using
    $Results = Invoke-SQL -dataSource $SQLServerName -database master -sqlCommand $SQLCommand 
    $SQLDatabaseFilesInUse = $Results | select @{Name="fullname";Expression={$_.physical_name -replace "\\\\","\" <#SQL server allows \\ in paths and just uses it like \#> }}

    # Find every mdf/ldf/ndf file on every local drive of the SQL server
    $SQLDatabaseAndLogFilesOnDisk = Invoke-Command -ComputerName $SQLServerName -ScriptBlock { 
        $Filesystems = Get-PSDrive -PSProvider FileSystem
        foreach ($FileSystem in $Filesystems) {
            Get-ChildItem $FileSystem.Root -Recurse -Include *.mdf,*.ldf,*.ndf -File -ErrorAction SilentlyContinue | select fullname 
        }
    }

    # Filter out files that share the SQL extensions but are not databases
    $SQLDatabaseAndLogFilesOnDisk = $SQLDatabaseAndLogFilesOnDisk | 
    Where fullname -notlike "C:\windows\winsxs*" | 
    Where fullname -notlike "C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Binn\*" |
    select fullname

    # Return the comparison objects and let the caller decide how to format or email them
    Compare-Object -ReferenceObject $SQLDatabaseFilesInUse -DifferenceObject $SQLDatabaseAndLogFilesOnDisk -Property FullName -IncludeEqual:$IncludeEqual
}

One thing to note: if you have additional database files that you want to keep on disk and exclude from this check, just add another Where fullname -notlike "<path where you have files you don't want checked>" | line alongside the existing ones that filter out files sharing the same extensions as SQL database files even though they are not actually databases.
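For example, if you keep copies of template databases in a folder you never want flagged (the path below is purely hypothetical), the filtering section of the function would grow to something like this:

# D:\TemplateDatabases is a hypothetical path you want excluded from the check
$SQLDatabaseAndLogFilesOnDisk = $SQLDatabaseAndLogFilesOnDisk | 
Where fullname -notlike "C:\windows\winsxs*" | 
Where fullname -notlike "C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Binn\*" |
Where fullname -notlike "D:\TemplateDatabases\*" | 
select fullname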

Lastly we have a script file Test-SQLDatabaseFilesNotUsed.ps1 that you will want to call from your scheduled task:

. c:\scripts\Invoke-SQL.ps1
. c:\scripts\Get-SQLDatabaseFilesNotUsed.ps1

$OutputMessage = ""

$FromAddress = "scheduledtasks@tervis.com"
$ToAddress = "WindowsServerApplicationsAdministrator@tervis.com"
$Subject = "SQL Server has data or log files that are not being used"
$SMTPServer = "cudaspam.tervis.com"

$OutputMessage = Get-SQLDatabaseFilesNotUsed -SQLServerName sql

if ($OutputMessage)
{
    Send-MailMessage -From $FromAddress -to $ToAddress -subject $Subject -SmtpServer $SMTPServer -Body ($OutputMessage | FT -autosize | out-string -Width 200) 
}

Now save all of these in a C:\scripts directory and create a scheduled task with whatever schedule you like (I have ours run once per week) that executes:

Powershell.exe -noprofile -file c:\scripts\Test-SQLDatabaseFilesNotUsed.ps1
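If the server is running Windows Server 2012 or later you can also register the task straight from PowerShell instead of clicking through Task Scheduler; here is a minimal sketch (the weekly Sunday 6 AM schedule and the SYSTEM account are just example choices):

# Minimal sketch using the ScheduledTasks module (Windows 8 / Server 2012+).
# The schedule and the SYSTEM account below are example choices, not requirements.
$Action  = New-ScheduledTaskAction -Execute "Powershell.exe" -Argument "-noprofile -file c:\scripts\Test-SQLDatabaseFilesNotUsed.ps1"
$Trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 6am
Register-ScheduledTask -TaskName "Test-SQLDatabaseFilesNotUsed" -Action $Action -Trigger $Trigger -User "SYSTEM"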

This has already saved me over 100 GB, and from this point forward you won't have to worry about space being eaten up by the database files that time forgot.


Why the least supported, under-marketed, mostly unknown feature of the Kindle is its killer feature

As a knowledge worker in our digital age there are innumerable ways at our disposal to gain knowledge, each with its respective pros and cons. We have Google, Wikipedia, YouTube, blogs and their associated RSS feeds, ServerFault and its family of sites, podcasts, iTunes U, the Kindle, etc.

I will cover most of these areas eventually, but today I want to talk about why the Kindle is critical to your ability to gain access to a specific domain of knowledge that had, up until recently, been completely inaccessible to us.

There is a lot of great content out there via all of the mechanisms mentioned. I listen to podcasts (PacketPushers, .NET Rocks, RunAs Radio, PowerScripting Podcast, etc.) incessantly with a great return on investment; more detail on that in a future blog post. This, combined with a good RSS reader subscribed to numerous blogs, plus Google and Wikipedia, allows me to tap into a wide range of knowledge and learn many topics to various depths.

Even with all of this there is still a domain of knowledge that is inaccessible through these means: the printed pages of books. As nice as it might be to have access to Safari Online or an Apress or O'Reilly subscription, there exists a broad base of knowledge, much of which is found nowhere else, only in books. As with all media you have to be selective about which books you invest your time in, but the depth of the content you can glean from books on topics such as the software development lifecycle, Agile software development, BizTalk administration, expert Oracle database architecture, traditional project management, the practice of system and network administration, etc. is greater than you can find on MSDN, blogs, podcasts, and so on.

The problem for most of us is that we simply are not willing to invest the time in sitting and reading a book, regardless of whether we recognize the valuable knowledge it contains. We have fractured chunks of time at irregular intervals, which works for consuming a quick post but rarely presents the opportunity to pick up a book and read.

Audio books are one helpful way to try to address this problem. They give us access to some of the content out there, but unfortunately many of the deeper or more technical books are never recorded in audio form. Audio books also tend to lock you into a pace of consuming the content that is designed to appeal to the lowest common denominator. I prefer to consume audio content at at least twice the speed of the original recording, as I find I do not lose any fidelity and I consume twice as much in the same amount of time. For this reason I exclusively use DownCast to listen to podcasts, as it gives me the greatest flexibility in controlling playback speed while allowing me to consume podcasts without ever connecting to iTunes.

We are then faced with this problem: how do we get immediate, convenient, self-paced access to the content that can only be found in books without having to sit and read?

The answer is what I think is the least supported, under-marketed, and mostly unknown feature of some models of the Amazon Kindle: Text-to-Speech, also known as Read to Me.

This feature allows you to have the Kindle read aloud any book that the publisher has allowed to be text-to-speech enabled.

When browsing a Kindle book's product page there is a section towards the bottom that lists the Kindle-specific features of the book.

If you look at the comparison between different Kindle models on Amazon's website you will not find this feature listed.


This feature gives you access to an audio form of books which I guarantee you will never see published as official audio books. Exactly who is going to sit down and record an unabridged copy of Implementing Cisco Unified Communications Manager, Part 1 (CIPT1)? No one, that's who. Especially when you consider that there may be five people who would actually purchase such a recording if it did exist.

I can tell you this technology has transformed my life, and if you start listening to hard-core technical books while driving to and from work, it will change yours as well.

So what are you waiting for? Go out there and get yourself the best real-time audio book generator for technical books on the market, also known as the Amazon Kindle 3G (the one with the keyboard; P.S. the Touch model they discontinued had it for a while too).

Thanks for reading,

Chris

Q: How to remove all items that haven't been modified in X days using a scheduled task without any supporting script files

Short answer:

Create a scheduled task and put the code below into the Start a Program step after updating the number of days and the directory path:

powershell "Get-Childitem 'C:\Test Directory with old files' | Where LastWriteTime -le $(Get-Date).AddDays(-30) | Remove-Item"


Long Answer:

Open Task Scheduler and create a new basic task


Give it a name


Next > Next > Next until you get to the Start a Program action

Now enter the code below:

powershell "Get-Childitem 'C:\Test Directory with old files' | Where LastWriteTime -le $(Get-Date).AddDays(-30) | Remove-Item"


Next and then Yes


Finish


Done. Set whatever schedules, triggers, etc. you want. The key is finally having a working one-liner.

Best rack nuts I have ever used

Over the years I have installed and removed countless devices from various racks. Doing this quickly leads you to despise getting your hands cut up removing rack nuts that are too stiff, or stabbing your hand trying to remove them with a screwdriver because you never happen to have that special little tool lying around when you need it.

After dealing with those issues I worked to find the best rack nuts on the market and for the last few years we have exclusively used StarTech.com rack nuts like those listed here.

These rack nuts are so easy to work with that whenever we move anything we throw away whatever rack nuts were there and use these instead. They are pliable enough to be clipped in and out with your hands but strong enough to hold the heaviest UCS blade chassis in place without any issues.

Sometimes it's the little things in life that make things great, and these rack nuts are a joy to work with for anyone who has ever struggled with inferior ones before.

Restore database backup files in a folder to SQL server using PowerShell

Ever had a directory full of .bak files and needed them all restored to a SQL server? Ever tried to find the solution to that problem in PowerShell instead of some gnarly-looking VBScript on Spiceworks?

If you said yes to either of these then the code below is for you. I will try to document it more when/if I can, but the code does what it says and handles logical file names that are not the same as the database names, along with other common gotchas. I have tried to write it so that it requires little or no stateful information and determines everything it needs from the backup headers contained within the files and from the configuration parameters of the SQL instance itself.

function invoke-DatabaseRestore {
    param ([String]$SQLServer="(local)", $BackupPath, [String]$BackupFileFilter = "")
    #load assemblies
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
    [Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
    [Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
    
    gci $BackupPath -Filter $BackupFileFilter | select fullname | % {
        $backupFile = $_.FullName

        #we will query the database name from the backup header later
        $server = New-Object ( "Microsoft.SqlServer.Management.Smo.Server" ) $SQLServer
        $backupDevice = New-Object( "Microsoft.SqlServer.Management.Smo.BackupDeviceItem" ) ($backupFile, "File")
        $smoRestore = new-object( "Microsoft.SqlServer.Management.Smo.Restore" )
        $backupDevice | FL *   #dump the backup device details for visibility while the restore runs

        #Get default log and data file locations http://sqlblog.com/blogs/allen_white/archive/2009/02/19/finding-your-default-file-locations-in-smo.aspx
        $DataPath = if ($server.Settings.DefaultFile.Length -gt 0 ) { $server.Settings.DefaultFile } else { $server.Information.MasterDBPath }
        $LogPath = if ($server.Settings.DefaultLog.Length -gt 0 ) { $server.Settings.DefaultLog } else { $server.Information.MasterDBLogPath }

        #restore settings
        $smoRestore.NoRecovery = $false;
        $smoRestore.ReplaceDatabase = $true;
        $smoRestore.Action = "Database"
        $smoRestore.PercentCompleteNotification = 10;
        $smoRestore.Devices.Add($backupDevice)
 
        #get database name from backup file
        $smoRestoreDetails = $smoRestore.ReadBackupHeader($server)
 
        #display database name
        "Database Name from Backup Header : " +$smoRestoreDetails.Rows[0]["DatabaseName"]
 
        #give a new database name
        $smoRestore.Database = $smoRestoreDetails.Rows[0]["DatabaseName"]

        #Relocate each file in the restore to the default directory
        $smoRestoreFiles = $smoRestore.ReadFileList($server)

        foreach ($File in $smoRestoreFiles) {
            #Create relocate file object so that we can restore the database to a different path
            $smoRestoreFile = New-Object( "Microsoft.SqlServer.Management.Smo.RelocateFile" )
 
            #the logical file names should be the logical filename stored in the backup media
            $smoRestoreFile.LogicalFileName = $File.LogicalName

            $smoRestoreFile.PhysicalFileName = $( if($File.Type -eq "L") {$LogPath} else {$DataPath} ) + "\" + [System.IO.Path]::GetFileName($File.PhysicalName)
            $smoRestore.RelocateFiles.Add($smoRestoreFile)
        }
        #restore database
        $smoRestore.SqlRestore($server)
 
    }
}
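A call might look something like this (the backup folder below is just a placeholder for wherever your .bak files live):

# Example usage; D:\SQLBackups is a hypothetical path.
invoke-DatabaseRestore -SQLServer "(local)" -BackupPath "D:\SQLBackups" -BackupFileFilter "*.bak"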

Find all files with NTFS Alternate Data Streams using PowerShell

  1. Open PowerShell
  2. cd to the directory or the root of a drive where you want to recursively look for alternate data streams
  3. Run the following command (requires PowerShell V3)

gci -recurse | % { gi $_.FullName -stream * } | where stream -ne ':$Data'

To test that this works properly navigate to a random directory on your computer and run the following commands:

new-item test.txt -Type file
Add-Content test.txt "test content"
Add-Content test.txt -Stream HiddenStream "Hidden test content stored in the stream"
get-content .\test.txt
get-content test.txt -Stream HiddenStream

Now navigate to some folder above the directory where you ran these commands, run the command to find all files with alternate data streams, and you should see our test.txt file come up.
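If you want to clean up afterwards, PowerShell V3 can also remove just the hidden stream while leaving the file itself intact:

# Remove only the alternate data stream; the file and its primary content remain.
Remove-Item test.txt -Stream HiddenStream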

For a more complete description of what you can do with NTFS Alternate Data Streams in PowerShell V3 please check out this blog post.


SCVMM 2012 SP1 Error when adding clusters with decommissioned nodes

During our recent upgrade to System Center Virtual Machine Manager 2012 SP1 I hit an issue where I got an error when I tried to add two of the Hyper-V clusters in our environment back to SCVMM.


I confirmed that none of the reasons mentioned in the error message were preventing us from adding these hosts. After running ipconfig /flushdns on the SCVMM server and trying to add one of the clusters again, I noticed that the SCVMM server was querying DNS for the IP of a host that had been decommissioned but not yet evicted from the cluster.

This brought back memories of hitting the same issue in SCVMM 2008 R2, and I confirmed that both clusters I was trying to add had a single node that was decommissioned but had not yet been evicted.

After evicting the decommissioned nodes I was able to properly add the clusters to SCVMM.
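If you want to check a cluster for stale nodes and evict them from PowerShell rather than Failover Cluster Manager, a quick sketch using the FailoverClusters module (the cluster and node names below are hypothetical) would be:

# Sketch using the FailoverClusters module; "HVCluster1" and "DeadNode1" are hypothetical names.
Get-ClusterNode -Cluster "HVCluster1"                      # look for nodes stuck in a Down state
Remove-ClusterNode -Name "DeadNode1" -Cluster "HVCluster1" # evict the decommissioned node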

I am now blogging about this so that hopefully I and everyone else can remember that you will hit this issue in SCVMM 2008 R2, 2012, and 2012 SP1 if you try to add clusters to SCVMM that have nodes that are offline.

How to manage development against large databases

We use Dynamics NAV where I work and have a database that has grown to over 500 GB. Because of this, doing a database restore is a time-consuming operation that should be avoided whenever possible. Due to the nature of the application it does not work well to have multiple developers in the same database, which means we need a DB for each developer as well as for our test and stage environments.

The easiest way to accomplish this is through the wonders of virtualization and specifically differencing disks. The basic workflow is this:

  • Create a virtual machine using Hyper-V that has the base OS and SQL versions installed for the data you're working with and two drives, one for the OS and the other completely blank
  • Restore the database you need to provide to multiple developers to the secondary drive that does not contain an operating system
  • After the restore, detach the database and shutdown the virtual machine
  • Rename the second vhd with the data and log files on it to _parent.vhd
  • Set the parent vhd to be read only
  • Move the parent vhd to some location where you have the space to store this large restore and where you can provide read access to it via a UNC path like \\MyServer\Share\_parent.vhd
  • Open Hyper-V Manager and create a new differencing disk pointing to the parent vhd through the UNC path created earlier (or script it, as shown in the sketch after this list)
  • Now provide a copy of this vhd to every individual who needs to have their own copy of this data set.
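If you prefer to script the differencing disk step rather than clicking through Hyper-V Manager, a minimal sketch using the Hyper-V module's New-VHD cmdlet (the child vhd path below is just an example) would be:

# Minimal sketch; requires the Hyper-V PowerShell module (Windows 8 / Server 2012+).
# The child vhd path is an example; the parent path matches the share from the steps above.
New-VHD -Path "D:\VHDs\Developer1_child.vhd" -ParentPath "\\MyServer\Share\_parent.vhd" -Differencing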

That's it. If you have developers who need a local copy of the data accessible from their laptops, they can now simply attach the vhd using Disk Management in Windows 7 or 8 and use the data however they like. If you're providing a copy of the data to a staging environment that is a virtual machine, simply shut down the VM and attach a copy of the child vhd as a new disk.

This process is made even better if you have Windows Server 2012 as the file server where your parent vhd is stored and either Server 2012 or Windows 8 clients pulling the data, as SMB 3.0 makes the performance of this solution even better than it has been.

There are many variations on this theme that you can use (snapshots of source data exposed via iSCSI directly from your SAN; keeping the parent and child vhds on the same storage for better performance; keeping the parent and child vhds on the same disks, attaching the child vhds to the host file server OS, and creating separate shares of the data inside each child vhd so that remote SQL servers can attach the data and log files directly via UNC paths without any knowledge of the vhd system behind them; etc.), but by using differencing vhds you can make it far more manageable to deal with many copies of a large data set while still keeping cost low.


Clean up Active Directory Computers and Users with stock PowerShell

Here are two commands that will quickly help you delete computers and users from Active Directory that haven't logged on for over a month.

For computers use:

Get-ADComputer -properties LastLogonDate -filter * | where LastLogonDate -LT (get-date).AddDays(-30) | Remove-ADObject -Recursive

For users use:

Get-ADUser -properties LastLogonDate -filter * | where LastLogonDate -LT (get-date).AddDays(-30) | Remove-ADObject -Recursive

Both commands will prompt you before deleting each object so that you can sanity check every deletion before it occurs, and if you're unsure simply choose not to delete that object.
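If you would rather review the stale objects before deleting anything at all, you can drop the Remove-ADObject step and just report (this preview variant is my addition, not part of the cleanup itself):

# Preview only: list stale computers without deleting anything.
Get-ADComputer -Properties LastLogonDate -Filter * | Where LastLogonDate -LT (Get-Date).AddDays(-30) | Select Name, LastLogonDate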

It's quick and easy, and if you simply delete the things you're sure you can delete you will likely be leaps ahead of where you started.


Get-Member lies

Get-Member lies.

Let me show you.

Here are some arrays:

$Array = @()
$Array2 = 1,2
$Array3 = "Thing1","Thing2"

Here is what GM says about these arrays:

PS H:\> $Array | GM
GM : No object has been specified to the get-member cmdlet.
At line:1 char:10
+ $Array | GM
+          ~~
+ CategoryInfo          : CloseError: (:) [Get-Member], InvalidOperationException
+ FullyQualifiedErrorId : NoObjectInGetMember,Microsoft.PowerShell.Commands.GetMemberCommand

PS H:\> $Array2 | GM


TypeName: System.Int32

Name        MemberType Definition
----        ---------- ----------
CompareTo   Method     int CompareTo(System.Object value), int CompareTo(int value), int

…


PS H:\> $Array3 | GM


TypeName: System.String

Name             MemberType            Definition
----             ----------            ----------
Clone            Method                System.Object Clone(), System.Object ICloneable.Clone()
CompareTo        Method                int CompareTo(System.Object value), int CompareTo(string strB), int IComparab...

…

LIES, ALL LIES!

Here is how you force GM to tell you the truth: we use the comma (,). That's right, this small period that grew a tail is the truth serum for your filthy lying GM problems.

Let me demonstrate:

PS H:\> ,$Array|GM


TypeName: System.Object[]

…

PS H:\> ,$Array2|GM


TypeName: System.Object[]

…

PS H:\> ,$Array3|GM


TypeName: System.Object[]

…

Now we can clearly see the true type of the object and the methods that are available to us.

Thanks go out to Marco Shaw for providing this solution here.

Update 2013-02-12:

If you open PowerShell and run Get-Help about_Arrays, then navigate down to the heading GET THE MEMBERS OF AN ARRAY, you will find the following:

GET THE MEMBERS OF AN ARRAY

To get the properties and methods of an array, such as the Length
property and the SetValue method, use the InputObject parameter of the
Get-Member cmdlet.

When you pipe an array to Get-Member, Windows PowerShell sends the items
one at a time and Get-Member returns the type of each item
in the array (ignoring duplicates).

When you use the InputObject parameter, Get-Member returns the
members of the array.

For example, the following command gets the members of the array in the
$a variable.

Get-Member -InputObject $a

You can also get the members of an array by typing a comma (,) before
the value that is piped to the Get-Member cmdlet. The comma makes the
array the second item in an array of arrays. Windows PowerShell pipes
the arrays one at a time and Get-Member returns the members of the array.

,$a | Get-Member

,(1,2,3) | Get-Member

There you have it: the comma wraps our array in a new outer array, PowerShell unrolls that outer array when piping, and Get-Member therefore receives our original array as a single item, giving us the actual type and members of the array we are interested in.
