Unsupported .Net SDK version error in Visual Studio Code extension

During the last couple of days, my VSCode installation greeted me with a strange error message on startup:

Installed .NET SDK version 6.0.100-preview.7.21379.14 is newer than the currently supported versions. Project build will not work. Please install .NET Core SDK version 3.1 and include a global.json in the project folder specifying the SDK version to use. More Information

Recently installed .Net SDK 6

VSCode is currently version 1.62.2 (it auto-updated today), and the extension version data is:
– Released on: 28.10.2021, 00:30:16
– Last updated: 28.10.2021, 00:32:34
– Identifier: ms-mssql.sql-database-projects-vscode

The “More Information” link points to the page “Select the .NET version to use”, which does not seem to apply to installed VSCode extensions.

After a bit of digging, I found a couple of GitHub issues in the azuredatastudio project, which the SQL Database Projects extension is part of.

Issue 16766 deals with the behavior I noticed: SQL projects don’t build when active dotnet version is 6.0 preview

The behavior is confirmed by one comment:

Unfortunately, this affects Visual Studio Code as well. Annoyingly, the warning pops up even if no database project is loaded.

I tried changing the extension setting “Sql Database Projects: Net Core SDK Location” (Full path to .NET Core SDK on the machine), but this did not get rid of the error message.

So I navigated to the install location of the VSCode extension, which can be found under
%userprofile%\.vscode\extensions\ms-mssql.sql-database-projects-vscode-0.13.0
(adjust version number) and tried to figure out how the error message is raised.

It turns out that the extension does not specifically query for the required dotnet version, but rather issues a “dotnet --version”, which returns the highest installed version number:

> dotnet --version
6.0.100-preview.7.21379.14

This version number is checked against the hard-coded upper limit of

t.maxSupportedNetCoreVersionCutoff = "6.0.0";
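
The comparison then amounts to something like the following. This is an illustrative sketch only, not the extension’s actual (minified) code:

// sketch: compare the reported SDK version against the hard-coded cutoff
const maxSupportedNetCoreVersionCutoff = "6.0.0";

function isSupported(installedVersion) {
    // strip a pre-release suffix such as "-preview.7.21379.14", then compare numerically
    const parse = v => v.split('-')[0].split('.').map(Number);
    const installed = parse(installedVersion);
    const cutoff = parse(maxSupportedNetCoreVersionCutoff);
    for (let i = 0; i < 3; i++) {
        if (installed[i] !== cutoff[i]) return installed[i] < cutoff[i];
    }
    return false; // reaching the cutoff itself counts as unsupported
}

isSupported("6.0.100-preview.7.21379.14"); // false - raises the warning
isSupported("3.1.414");                    // true

Under this logic, any cutoff above the installed 6.0.100 lets the check pass.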

So I opened the file extension.js and tried to change this value to a SemVer higher than “6.0.100”, such as “7.0.0”, and restarted VSCode. And … the error message was gone!

OK, so the extension was enabled and VSCode no longer showed the error on startup. But would the extension still actually work?

To find out, I created a new SQL Database project via Ctrl+Shift+P and “Database Projects: Create Project From Database”, then entered all the connection data to connect to a database. This resulted in a .sqlproj file and generated CREATE TABLE statements in one .sql file per table.

Sending Email Notifications for Windows Updates

I have this Windows server which hosts a couple of websites and otherwise is really not doing much.

Every now and then I need to log in for some server operation, only to find the typical Windows Updates Pending notification. Every time this happens I think to myself, hey, I’d like to get some notification that there are updates available, without having to log in.

So I finally dug around the internets – well, as it turned out, the info was all on StackOverflow.

The result is a tiny PowerShell script called update-mailer.ps1, which queries the Windows Update Service and sends an email accordingly. It can be found in my GitHub PowerShell repository.
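
The core of the script works along these lines. This is a minimal sketch, not the actual script; the SMTP server and addresses are placeholders:

# query the Windows Update Agent COM interface for pending updates
$session  = New-Object -ComObject Microsoft.Update.Session
$searcher = $session.CreateUpdateSearcher()
$pending  = $searcher.Search("IsInstalled=0 and IsHidden=0").Updates

if ($pending.Count -gt 0) {
    $body = ($pending | ForEach-Object { $_.Title }) -join "`r`n"
    # mail server, sender and recipient are placeholders - adjust to your environment
    Send-MailMessage -SmtpServer "mail.example.com" `
        -From "server@example.com" -To "admin@example.com" `
        -Subject "$($pending.Count) Windows updates pending" `
        -Body $body
}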

To run this script as a Scheduled Task, create a .cmd batch file containing the command

powershell.exe -ExecutionPolicy Bypass -File path\to\update-mailer.ps1

to bypass PowerShell’s restricted execution policy, and call this batch file from the scheduled task.

Concatenating a list of strings in T-SQL

Remember how clumsy it was to concatenate string identifiers of child records within a sub-SELECT, adding that artificial FOR XML PATH('')? In case you don’t remember, the pattern is sketched below.
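
A sketch with hypothetical Parent and Child tables:

-- the pre-SQL Server 2017 way: correlated sub-SELECT plus FOR XML PATH('') and STUFF()
select p.Id,
    STUFF((select ', ' + c.Name
           from Child c
           where c.ParentId = p.Id
           for xml path('')), 1, 2, '') as ChildNames   -- STUFF() removes the leading ', '
from Parent p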

Writing plain SQL again after a long time, I found the “new” function STRING_AGG (introduced in SQL Server 2017) to be a great simplification. I guess mainly because the sub-select can be inlined and the whole SELECT can be GROUP BY’d.

Case in point, I needed to find out which database tables have unique indexes defined, and which columns they indexed.

Here’s the magic:

select o.name, i.name,
    STRING_AGG(c.name, ', ') WITHIN GROUP (ORDER BY ic.index_column_id)
from sys.index_columns ic
inner join sys.indexes i 
    on ic.object_id = i.object_id and ic.index_id = i.index_id
inner join sys.objects o on i.object_id = o.object_id
inner join sys.columns c 
    on ic.object_id = c.object_id and ic.column_id = c.column_id
where o.is_ms_shipped = 0 
and o.name not like '%_history'
and i.is_primary_key = 0
and i.is_unique = 1
group by o.name, i.name
order by 1, 2

Constructing Type-Safe SQL Statements from ORM Classes

Suppose you have a database application, and the database tables are mapped onto C# classes as used by an ORM such as Entity Framework or NHibernate.

Suppose you need to construct an SQL statement manually, because your ORM does not support or implement (or interface to) a given SQL feature.

Of course, you can always write the SQL statement manually, and query data using EF’s Database.SqlQuery() or NHibernate’s CreateSQLQuery().

The problem I found is that as soon as the data model changes, these manually crafted SQL statements are bound to fail.

Let’s have a look at a simple SELECT statement involving a JOIN of two tables (of course ORMs can manage such a query themselves; this is just an illustration):

SELECT s.Name AS SupplierName, a.ZipCode AS AddressZipCode, 
       a.City AS AddressCity, a.Street AS AddressStreet
FROM  Supplier s
INNER JOIN Address a ON s.AddressId = a.Id

The statement includes table names, table aliases, table column names, and column aliases, and I want to construct the column names for the SQL statement from the ORM’s mapped classes.

Gladly, I already have the GetPropertyName() function to retrieve a class’s property names, so we can focus on enumerating them:

public static string EnumerateColumns<T>(
    string tablePrefix, 
    string columnPrefix, 
    params Expression<Func<T, object>>[] columns)
{
    return string.Join(", ", 
        columns.Select(c =>
            (tablePrefix != null ? (tablePrefix + ".") : "") +
            GetPropertyName(c) +
            (columnPrefix != null 
                ? (" AS " + columnPrefix + GetPropertyName(c)) 
                : "")));
}
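
In case you do not have such a function around, a minimal sketch of GetPropertyName() could look like this (not necessarily the original implementation):

public static string GetPropertyName<T>(Expression<Func<T, object>> expression)
{
    var body = expression.Body;
    // value-typed properties are wrapped in a Convert() node when boxed to object
    if (body is UnaryExpression unary)
        body = unary.Operand;
    return ((MemberExpression)body).Member.Name;
}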

So EnumerateColumns() handles optional table aliases and optional column name prefixes, and can be invoked for every joined table of our statement:

var sql = "SELECT "
    + EnumerateColumns<Supplier>("s", "Supplier", s => s.Id, s => s.Name)
    + ", "
    + EnumerateColumns<Address>("a", "Address", 
        a => a.ZipCode, a => a.City, a => a.Street)
+ @"
FROM " + nameof(Supplier) + @" s
INNER JOIN " + nameof(Address) + " a ON s." 
    + nameof(Supplier.AddressId) + " = a." + nameof(Address.Id);

Console.WriteLine(sql);

There is a little bit of cheating hidden in this code: it assumes that table names match their class names, and column names match their property names. If the names differ, you can mark up the class declaration with some attribute and query it using GetCustomAttributes(), as sketched below, or use EF’s pluralization service (if it is used).
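
For example, with the [Table] attribute from System.ComponentModel.DataAnnotations.Schema, such a lookup could be sketched like this:

// sketch: resolve the table name from a [Table] attribute, falling back to the class name
public static string GetTableName<T>()
{
    var attribute = typeof(T)
        .GetCustomAttributes(typeof(TableAttribute), inherit: false)
        .OfType<TableAttribute>()
        .FirstOrDefault();
    return attribute?.Name ?? typeof(T).Name;
}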

The full code for this article is available on my GitHub.

Updating TortoiseGit to use github’s Personal Access Token

I recently got github’s email on their changes to authentication tokens, but of course ignored it as long as I could.

We recently updated the format of our API authentication tokens, providing additional security benefits to all our customers.

So today was the day to update my local authentication token in TortoiseGit.

I headed over to github’s Personal Access Token page to generate a new token.

But it was not really clear how to add it as authentication for repositories existing on my local PC.

After some google-fu, I opened the TortoiseGit Settings dialog in the root directory of one of my repositories,

TortoiseGit credentials (right-click on repository directory)

changed the Credential helper to Advanced, and entered my github username.

The next TortoiseGit / Push… command opened an authentication dialog, where I pasted the github token into the password field.

After the push was completed, I could verify that the new token had been added in Windows under Control Panel / Accounts / Credential Manager / Windows Credentials.

Updated Windows credentials

Renaming files after their time stamp in Ubuntu

When downloading data files from certain web sites, the files usually either are already named after their creation timestamp, or end with sequential numbering (1), (2), … in their names.

To rename such numbered files, I found that the date -r command displays a file’s modification timestamp, which can be formatted with the +format option:

date -r somefilename.txt +%Y%m%d

To iterate over all downloaded files, I use

for f in file*name*pattern*; do …; done

Putting it all together, I came up with the one-liner

for f in pattern*; do mv "$f" "$(date -r "$f" +filename_%Y%m%d).csv"; done

(Quoting "$f" keeps the loop safe for file names containing spaces, which the numbered downloads typically do.)

Handling __ivy_ngcc_bak compiler errors

An Angular project I work on uses some custom libraries from a private repository. When making changes to the library, it is necessary to test locally, before publishing to the repository.

So how do you test your changes locally? I found it sufficient to copy the output of the ng-packagr script into the library’s directory under the project’s node_modules, and run ng build.

This changed when Angular Ivy came along as we made the switch to Angular 10.

Suddenly, calling ng build after copying the packagr files resulted in multiple warnings stating

WARNING in Unable to fully load D:/path/to/web/node_modules/weblibrary/lib/filename.d.ts for source-map flattening: Circular source file mapping dependency: D:/path/to/web/node_modules/weblibrary/lib/filename.d.ts.map -> D:/path/to/web/node_modules/weblibrary/lib/filename.d.ts.map

and an error message

ERROR in Tried to overwrite D:/path/to/web/node_modules/weblibrary/lib/filename.d.ts.__ivy_ngcc_bak with an ngcc back up file, which is disallowed.

Well, I checked, and indeed, the file existed. Let’s delete the *.__ivy_ngcc_bak files, and run ng build again. I also found it necessary to delete the library’s __ivy_ngcc__ directory in the target project.

Run ng build again, and only the warnings are output. Run ng build once more, and the warnings are gone.


As I prepared this post, I wondered whether I had missed an existing solution for this problem.

I found npx install-from, which presents itself as an alternative to npm link. I tried it, but unfortunately it stopped with an error message

D:\path\to\web>npx install-from D:\path\to\weblibrary\dist\weblibrary
npx: installed 5 in 3.016s
(node:14136) UnhandledPromiseRejectionWarning: Error: spawn npm ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:267:19)
    at onErrorNT (internal/child_process.js:469:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)
(node:14136) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag --unhandled-rejections=strict (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:14136) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

without any indication what might have gone wrong.

Note that if you run a module which is not installed locally, npx will download it every time from your configured repository. If you use a package more often, better install it locally running npm install -g package.


So I checked again, and found that npm install also supports installation from a folder, not only from a repository.

Running npm install D:/path/to/web/node_modules/weblibrary/package replaced the library’s <DIR> entry under node_modules with a <JUNCTION> entry (i.e. symlink) pointing to the path given as parameter:

Install the package in the directory as a symlink in the current project. Its dependencies will be installed before it’s linked.

In the package directory, installing the package adds a node_modules directory, and running ng build also updates the package.json file in the package directory. The application’s package.json entry for the package is updated from a version-specific reference to the library in the repository to a “file:…” reference to the package directory.


Now it became clear what install-from is trying to do:

  • run npm pack in the library directory to create a .tgz
  • run npm install from the .tgz in the application directory
  • it fails somewhere

As a workaround to recreate the functionality of install-from, I call npm pack package-directory from the library’s directory, which creates a library-version.tgz file, and then npm install from that .tgz in the application directory, as shown below.
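
Put together, the workaround looks like this (the version number in the .tgz name is a placeholder):

D:\path\to\weblibrary> npm pack dist\weblibrary
weblibrary-1.0.0.tgz

D:\path\to\web> npm install D:\path\to\weblibrary\weblibrary-1.0.0.tgz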


So I came up with 3 methods to update a library in an application:

  • Clean-up Ivy artifacts and inject ng-packagr result using xcopy
  • npm install from library’s ng-packagr directory
  • npm pack to .tgz and npm install from .tgz

Use awk to grep

I have gawk installed on my Windows 10 machine, but no grep, yet I still needed to quickly find text in some files.

So I came up with (read: “found on the internet”) this one-liner

@for /f "eol=: delims=" %F in ('dir /b /s [directory]') do @awk "/[search string]/ {print $0}" %F

Usually I use grepWin for such tasks, but only later realized that a simple Ctrl-C would copy the selected search result to the clipboard – grepWin’s context menu does not have a menu item for this function, even though it has been suggested in a related issue.

I also noticed that grepWin has a memory problem when searching huge files on a machine with little RAM. On the other hand, it searches using the Windows codepage rather than the DOS codepage that the cmd-based awk solution above uses.

Fixing the --startvm Error Message

After upgrading Ubuntu from 18.04 to 20.04, I noticed that my VM .desktop shortcut throws the error message

--startvm is an option for the VirtualBox VM runner (VirtualBoxVM) application, not the VirtualBox Manager.

Before the upgrade, it simply started the virtual machine referenced as the parameter value.

It seems that VirtualBox moved the --startvm parameter from the previous VirtualBox executable to VirtualBoxVM. More information and links can be found in this VirtualBox ticket.

The (easy) solution was to open the .desktop file in an editor, and change the line

Exec=/usr/lib/virtualbox/VirtualBox ....

to

Exec=/usr/lib/virtualbox/VirtualBoxVM ....

Finding unused SpecFlow step implementations with PowerShell

SpecFlow steps are public methods adorned using the [Given], [When], and [Then] attributes inside classes marked with the [Binding] attribute.

To load the assembly containing the compiled SpecFlow steps, I found it necessary to first load all other assemblies inside the bin directory. Otherwise, the assembly’s GetTypes() method would raise an error stating that the references to other assemblies could not be resolved.
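
Pre-loading all assemblies can be done with a simple loop (a sketch; the actual script is in the repository):

# load every assembly in the bin directory, so that GetTypes() can resolve all references
Get-ChildItem "$basedir\*.dll" | ForEach-Object {
    try { [System.Reflection.Assembly]::LoadFrom($_.FullName) | Out-Null }
    catch { }   # ignore assemblies that cannot be loaded
}

After that, the SpecFlow assembly itself is loaded and its types enumerated: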

$asm = [System.Reflection.Assembly]::LoadFrom("$basedir\$specdll", $null, 0)
$types = $asm.GetTypes()

Next, the SpecFlow attribute types are retrieved

$binding = [TechTalk.SpecFlow.BindingAttribute]
$given = [TechTalk.SpecFlow.GivenAttribute]
$when = [TechTalk.SpecFlow.WhenAttribute]
$then = [TechTalk.SpecFlow.ThenAttribute]

along with all [Binding] classes and their methods with the attributes listed above.

The attributes are collected in an array, as their Regex property contains the regular expression that will be used to search the .feature files using PowerShell’s select-string cmdlet.
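
Collecting the attributes might look roughly like this (again a sketch; see the repository for the actual script):

$attributes = @()
foreach ($type in $types | Where-Object { $_.GetCustomAttributes($binding, $true).Length -gt 0 }) {
    foreach ($method in $type.GetMethods()) {
        # a step method may carry several [Given]/[When]/[Then] attributes
        $attributes += $method.GetCustomAttributes($given, $true)
        $attributes += $method.GetCustomAttributes($when, $true)
        $attributes += $method.GetCustomAttributes($then, $true)
    }
}

Each collected attribute is then matched against the contents of all .feature files: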

:Outer foreach ($a in $attributes) {
    foreach ($ff in $featurefiles) {
        $found = $featurefilecontents[$ff] |
            select-string -Pattern $a.Regex -CaseSensitive
        if ($found.Matches.Length -gt 0) {
            #write-host "found $($a.Regex) in file $ff"
            $foundattributes += $a
            continue Outer
        }
    }
    write-host "did not find reference for $($a.GetType().Name) $($a.Regex)"
    $notfoundattributes += $a
}

The script has been developed to analyze a project using SpecFlow 2.4.0, and is available on my PowerShell Snippets repository.