I have this Windows server installed which hosts a couple of websites, and is really not doing much.
Every now and then I need to log in for some server operation, only to find the typical Windows Updates Pending notification. Every time this happens I think to myself, hey, I’d like to get some notification that there are updates available, without having to log in.
So I finally dug around the internets; as it turned out, the info was all on StackOverflow.
The result is a tiny PowerShell script called update-mailer.ps1, which queries the Windows Update Service and sends an email accordingly. It can be found in my GitHub PowerShell repository.
To run this script as a Scheduled Task, create a .cmd batch file containing the command
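A minimal batch file for this purpose might look like the following sketch; the script path is an assumption, so adjust it to wherever update-mailer.ps1 resides:

```
powershell.exe -ExecutionPolicy Bypass -NoProfile -File "C:\Scripts\update-mailer.ps1"
```

Point the Scheduled Task's action at this .cmd file and set the trigger to the desired check interval.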
SpecFlow steps are public methods adorned using the [Given], [When], and [Then] attributes inside classes marked with the [Binding] attribute.
To load the assembly containing the compiled SpecFlow steps, I found it necessary to load all other assemblies inside the bin directory. Otherwise an error would occur in the assembly’s GetTypes() method stating the references to other assemblies could not be resolved.
along with all [Binding] classes and their methods with the attributes listed above.
The attributes are collected in an array, as their Regex property contains the regular expression that will be used to search the .feature files using PowerShell’s select-string cmdlet.
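The collection step can be sketched as follows; the bin path and the assembly name MySpecs.dll are assumptions, and the attribute types come from the TechTalk.SpecFlow namespace:

```powershell
# Sketch: load all assemblies in bin (to satisfy references), then collect
# the [Given]/[When]/[Then] attributes of methods in [Binding] classes.
Get-ChildItem .\bin\*.dll | ForEach-Object {
    [Reflection.Assembly]::LoadFrom($_.FullName) | Out-Null
}

$assembly = [Reflection.Assembly]::LoadFrom(".\bin\MySpecs.dll")   # assumed name
$attributes = @()
foreach ($type in $assembly.GetTypes()) {
    # only consider classes marked with [Binding]
    if ($type.GetCustomAttributes([TechTalk.SpecFlow.BindingAttribute], $true)) {
        foreach ($method in $type.GetMethods()) {
            $attributes += $method.GetCustomAttributes($true) |
                Where-Object { $_ -is [TechTalk.SpecFlow.GivenAttribute] -or
                               $_ -is [TechTalk.SpecFlow.WhenAttribute] -or
                               $_ -is [TechTalk.SpecFlow.ThenAttribute] }
        }
    }
}
```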
:Outer foreach($a in $attributes) {
    foreach($ff in $featurefiles) {
        $found = $featurefilecontents[$ff] | select-string -Pattern $a.Regex -CaseSensitive
        if ($found.Matches.Length -gt 0) {
            #write-host "found $($a.Regex) in file $ff"
            $foundattributes += $a
            continue Outer
        }
    }
    write-host "did not find reference for $($a.GetType().Name) $($a.Regex)"
    $notfoundattributes += $a
}
The script has been developed to analyze a project using SpecFlow 2.4.0, and is available on my PowerShell Snippets repository.
Microsoft provides a list of Latest updates for Microsoft SQL Server which gives you the latest Cumulative Update (CU) number for each Service Pack (SP) of each version of SQL Server since 2000.
But how do you check which version of SQL Server you have installed on your machines?
Sure, the internets are full of SQL scripts which retrieve this information (MS, SO, TN, etc), but those scripts require you to connect to every SQL Server instance and query it individually.
If you have direct access to the machine running SQL Server (such as your development machine), wouldn’t it be nice if you saw the list of installed SQL Server versions, whether the instances are running or stopped?
So I typed along in the PowerShell ISE, retrieving all executables named sqlservr.exe (the executable hosting SQL Server), retrieving their VersionInfo, and outputting the relevant information:
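The core of that step can be sketched like this; the search root is an assumption (SQL Server installs under Program Files by default):

```powershell
# Sketch: find all SQL Server engine executables and report their version info
Get-ChildItem "C:\Program Files\Microsoft SQL Server" -Recurse -Filter sqlservr.exe -ErrorAction SilentlyContinue |
    ForEach-Object { $_.VersionInfo } |
    Select-Object FileName, ProductVersion, FileVersion
```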
So, as a first step, I have the ProductVersion numbers.
As I found while researching this script, Microsoft also provides a list of product versions (“build versions”) indicating SQL Server version, Service Pack and Cumulative Update. The list can be downloaded as .xlsx from aka.ms/SQLServerbuilds.
So I have the installed product versions as a list of PS objects, and the build numbers in .xlsx format, so let's combine the two.
Fortunately, there is a PS library to read .xlsx files called ImportExcel (GitHub), and you install it by simply running
Install-Module ImportExcel -Scope CurrentUser
(you probably need to update PowerShell’s nuget, which is done semi-automatically in the process)
As it turned out, ImportExcel is PS code wrapping OfficeOpenXml.ExcelPackage, which I have dealt with in previous C# projects, so you do not have to have Excel installed when parsing the downloaded .xlsx.
The script uses Get-ExcelSheetInfo to query all the worksheets of the Excel file, and for each worksheet runs Import-Excel to retrieve the worksheet’s data into a hashtable indexed by the build number.
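That step might look like this sketch; $xlsxPath and the 'Build number' column name are assumptions based on the downloaded file:

```powershell
# Sketch: build a hashtable mapping build number -> build information row
$builds = @{}
foreach ($sheet in Get-ExcelSheetInfo $xlsxPath) {
    foreach ($row in (Import-Excel -Path $xlsxPath -WorksheetName $sheet.Name)) {
        if ($row.'Build number') { $builds[$row.'Build number'] = $row }
    }
}
```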
Finally, the original procedure is extended to lookup the ProductVersion number in the hashtable, and merge the Excel data with the original result:
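The lookup can be sketched as follows; $versions holds the objects collected from the sqlservr.exe files, and the Excel column names are assumptions:

```powershell
# Sketch: merge Excel build info into the collected version objects
$versions | ForEach-Object {
    $build = $builds[$_.ProductVersion]      # may be $null for unknown builds
    [PSCustomObject]@{
        FileName       = $_.FileName
        ProductVersion = $_.ProductVersion
        SqlVersion     = $build.'SQL Server version'   # assumed column name
    }
}
```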
Umbraco 7 keeps track of the installed or upgraded versions in the table umbracoMigration:
id  name     createDate               version
--  -------  -----------------------  -------
1   Umbraco  2017-05-11 15:45:57.410  7.6.1
2   Umbraco  2019-08-26 10:49:14.110  7.7.0
3   Umbraco  2019-08-26 13:56:57.727  7.15.2
web.config
Umbraco's upgrade mechanism makes use of the <appSettings> section in web.config to figure out whether a database upgrade is necessary. Depending on the version, the keys umbracoConfigurationStatus (up to V7) or Umbraco.Core.ConfigurationStatus (V8) are used:
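In web.config this looks like the following fragment; the version values are just examples:

```xml
<appSettings>
  <!-- Umbraco up to V7: -->
  <add key="umbracoConfigurationStatus" value="7.15.2" />
  <!-- Umbraco V8: -->
  <add key="Umbraco.Core.ConfigurationStatus" value="8.1.0" />
</appSettings>
```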
PowerShell can enumerate the information under these keys using the Get-ItemProperty commandlet. You can retrieve the PS properties of this data using the Get-Member commandlet.
If you develop web applications for IIS, or administrate IIS and file systems, you sooner or later end up with orphaned IIS applications which point to non-existing directories.
To get an overview of IIS applications, start up PowerShell in administrator mode (I prefer PowerShell ISE) and run
import-module webadministration
dir "IIS:\sites\Default Web Site"
To exclude any files that reside in IIS root, filter out the “file” entries:
import-module webadministration
dir "IIS:\sites\Default Web Site" | where { $_.NodeType.Trim() -ne "file" }
Finally, we test the PhysicalPath property of the resulting entries:
import-module webadministration
dir "IIS:\sites\Default Web Site" |
    where { $_.NodeType.Trim() -ne "file" -and
            ![System.IO.Directory]::Exists( $_.PhysicalPath ) }
This command lists all IIS web applications and virtual directories which point to non-existing directories. The result looks like this:
Type Name Physical Path
---- ---- -------------
application application1 D:\app\app1
application web2 D:\app\web2
There are various ways to query the Like count of Facebook pages.
If you just want the live stats of a given Facebook page, Quintly provides a live statistics page. Upon entering the FB page name (www.facebook.com/[page-name] becomes www.quintly.com/facebook-live-statistics/[page-name]), the page displays a live chart updating every couple of seconds.
But you can also query Facebook directly using the FQL API or the Graph API.
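A Graph API query for a page's like count can be sketched like this; the page name, API version, and access token are placeholders, and Facebook's API versions and permission requirements change over time:

```powershell
# Sketch: query a page's like count via the Graph API's fan_count field
$token = "YOUR_ACCESS_TOKEN"    # placeholder
$page  = Invoke-RestMethod "https://graph.facebook.com/v2.12/page-name?fields=fan_count&access_token=$token"
$page.fan_count
```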
To tidy up the applications on the IIS of my development PC, I wanted to find out which IIS application (and their virtual directories) point to which physical paths.
Starting out with this SO answer and some help from MSDN I quickly assembled this PS solution for the task:
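A sketch of such a solution, using the WebAdministration cmdlets (the tab-separated output format follows the description below; property access is an assumption based on the cmdlets' output objects):

```powershell
# Sketch: emit physical paths of sites, applications, and virtual directories
# as a tab-separated list
Import-Module WebAdministration

foreach ($site in Get-Website) {
    "site`t$($site.Name)`t$($site.PhysicalPath)"
    foreach ($app in Get-WebApplication -Site $site.Name) {
        "application`t$($app.Path)`t$($app.PhysicalPath)"
    }
    foreach ($vdir in Get-WebVirtualDirectory -Site $site.Name) {
        "vdir`t$($vdir.Path)`t$($vdir.PhysicalPath)"
    }
}
```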
This piece of code generates a tab-separated list of physical paths referenced by IIS sites, applications, and virtual directories. The application's physical path is contained in a virtual directory with Path="/".
Note that the script must be “run as administrator” from the PS prompt or the PowerShell ISE.
Then a production system was upgraded to SSRS 2012 (and thus VS SSDT 2010), and report deployment did not work anymore, as VS displayed the following error message when hitting Deploy:
The definition of the report ‘/myreport’ is invalid
and opening the report
[rsCompilerErrorInCode] There is an error on line 1 of custom code: [BC30560] "TimeZoneInfo" is ambiguous in the namespace "System"
or, in German
“TimeZoneInfo” ist im Namespace “System” nicht eindeutig.
So there’s no deployment for you, sorryyyyyyy.
A bit of google-fu brought up this question on Social and this issue on Connect, which means that MS has been sitting on this bug for half a year now, and the "workaround" seems to be to create your own SSRS assembly, and/or to use Reflection to access the TimeZoneInfo methods, as sketched here.
I did not want to give in so easily, so I tried and verified that the procedure described in Deploying SSRS Reports with PowerShell still works for SSRS 2012.
And a colleague found that you can still deploy to SSRS 2012 using BIDS 2008 if you set the project’s TargetServerVersion to “SQL Server 2008 R2 or later”.