Opening .url Files in Ubuntu

When browsing the web with Chrome for Android, I save URLs to my Nextcloud server by sharing them via the Nextcloud app. Each URL is then stored as a .url file that looks like this:

[InternetShortcut]
URL=https://devio.wordpress.com/

Today I noticed that those .url files cannot be opened on Ubuntu, i.e. a double-click won’t start a browser with the contained URL.

Instead, I get an error dialog:

Could not display “<HTML page title>.url”.

There is no application installed for “Internet shortcut” files.
Do you want to search for an application to open this file?

No     Yes

[Screenshot: “Could not display” error dialog]

When I click the Yes button, a toast message appears

[Screenshot: “mimetype required” toast message]

which you have to click before it disappears, and which finally opens the software installer:

[Screenshot: Ubuntu Software unable to find matching software]

Not good.

Surprisingly, Firefox does not register itself as an application to handle the .url file extension on Ubuntu, even though Firefox on Windows knows how to open these files.

More surprisingly, Ubuntu knows that .url files are “Internet shortcut” files, which have the associated MIME type application/x-mswinurl.

So I had to solve two problems:

  • Retrieve the URL stored in a .url file
  • Start Firefox with this URL using Ubuntu’s MIME type handling

Retrieving the URL stored in a .url file

As shown above, a .url file is simply a text file in .ini format. In its simplest form, it contains a section [InternetShortcut] with a single key, URL=; the key’s value is the URL to navigate to.

With a little help from askubuntu, I figured out the command to extract the URL value

grep -Po 'URL=\K[^ ]+' *.url
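Applied to a single file containing the example shortcut from above, the command prints just the URL (with multiple matching files, grep prefixes each match with its file name):

https://devio.wordpress.com/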

Using the result of the grep operation as the argument for firefox would look something like this:

firefox `grep -Po 'URL=\K[^ ]+' "$1"`

After a bit of digging, I found out how to manually add MIME type handlers in Ubuntu. Following those instructions, I created a file

/usr/share/applications/mswinurl.desktop

(you need sudo in this directory) with the following content (spoiler: don’t copy this yet!):

[Desktop Entry]
Name=Firefox Shortcut
GenericName=Firefox Shortcut

Type=Application
Exec=firefox `grep -Po 'URL=\K[^ ]+' %U`
TryExec=firefox
MimeType=application/x-mswinurl;
Icon=firefox

However, this did not work as intended: I got an error message complaining about the backtick `. So, since I cannot have shell operations in the .desktop file, let’s create a shell script

/usr/local/bin/runurl

and place the shell magic there:

#!/bin/sh
firefox `grep -Po 'URL=\K[^ ]+' "$1"` &

Don’t forget to make the script executable using

sudo chmod 755 /usr/local/bin/runurl

and reference the runurl script rather than Firefox in /usr/share/applications/mswinurl.desktop:

[Desktop Entry]
Name=Firefox Shortcut
GenericName=Firefox Shortcut

Type=Application
Exec=runurl %U
TryExec=firefox
MimeType=application/x-mswinurl;
Icon=firefox

After creating the file, run

sudo update-desktop-database

to register the new .desktop file.

Double-clicking a .url file now opens the URL in a new Firefox tab.

SMOscript 0.50

I added a couple of new options to my scripting tool SMOscript:

  • The new -td command-line switch generates two files for each table: a [table].create.sql file containing the CREATE TABLE statement, and a [table].details.sql file containing the CREATE statements for all table details (foreign keys, indexes, triggers), thus allowing a database schema to be recreated without referential integrity problems
  • The semantics of the -o (object name) switch has been extended to commands other than script
  • The new -ol command-line switch implements a filter on object names similar to T-SQL LIKE. The filter is applied to the object name and, if that does not match, to the combination “schema.objectname”.

The new command dd operates on data diagrams (see my previous post on SQL Server Data Diagrams):

  • Using dd without parameters will list all data diagrams stored in the database.
  • Using dd -o [diagram name] will list all tables contained in the data diagram, even tables that have been DROPped.
  • Using dd -o [diagram name] -dt lists the DROPped tables only.

The latest version of SMOscript is available for download on my download website.

Accessing MS Access Databases from SQL Server

I recently installed WinTV 8.5, and after scanning TV channels, I noticed that editing channel numbers was incredibly cumbersome.

According to the FAQ, it would involve a lot of scrolling, and 4 clicks per channel. I also noticed a couple of other peculiarities in the UI, such as the EPG viewer, which made me research where the program’s data is stored.

Actually it’s all in the FAQ: it’s an MS Access database called hcwChanDB_5.mdb residing in a directory set during installation.

To save your channel database go to:

C:\Users\Public\WinTV\Channel Database (Windows 7 and Vista)

C:\Documents and Settings\All Users\Shared Documents\WinTV\Channel Database (Windows XP)

Copy the hcwChanDB_5 file to a place for safe keeping. To replace saved Channels just copy HcwChanDB_5 back to Channel Database Dir.

When I double-click the .mdb file, a nice reminder shows up that I only have a Test edition of MS Access, and not the fully activated Professional edition. Essentially this means that I can view the data, but cannot change it.

Time to start up SSMS, link the Access database and browse the data from SSMS. … which turned out to be not so easy.

Linked Server to MS Access via ODBC System DSN

WinTV creates an ODBC System DSN called HCW_mdb, and you can easily create a linked server in SSMS referencing this DSN.
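Scripting the linked server instead of clicking through SSMS might look like this sketch (the linked server name hcw is the one used in the queries below):

EXEC sp_addlinkedserver
    @server = N'hcw',
    @srvproduct = N'Access',
    @provider = N'MSDASQL',
    @datasrc = N'HCW_mdb'

Running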

exec sp_catalogs 'hcw'

on the linked server returns the path names of the MS Access databases, but using them as catalog names in a SELECT * FROM statement only raises an error message

OLE DB provider “MSDASQL” for linked server “hcw” returned message “[Microsoft][ODBC Microsoft Access Driver] Syntax error (missing operator) in query expression ‘`Tbl1002`.`id` `Col1005`’.”.
Msg 7321, Level 16, State 2, Line 4
An error occurred while preparing the query “SELECT `Tbl1002`.`id` `Col1005`,`Tbl1002`.`service_id` `Col1006`,`Tbl1002`.`preferred_name` `Col1007`,`Tbl1002`.`preferred_number` `Col1008`,`Tbl1002`.`description` `Col1009`,`Tbl1002`.`audio_lang` `Col1010`,`Tbl1002`.`subtitle_lang` `Col1011`,`Tbl1002`.`audio_mode` `Col1012`,`Tbl1002`.`irblaster_data` `Col1013`,`Tbl1002`.`channel_type` `Col1014`,`Tbl1002`.`language` `Col1015`,`Tbl1002`.`genre` `Col1016`,`Tbl1002`.`duplicate_channel_id` `Col1017`,`Tbl1002`.`duplicate_priority` `Col1018`,`Tbl1002`.`blocked_flag` `Col1019`,`Tbl1002`.`last_used` `Col1020`,`Tbl1002`.`epg_source` `Col1021`,`Tbl1002`.`preferred_minor` `Col1003` FROM `D:\data\WinTV\Channel Database\hcwChanDB_5`.`hcwChannels` `Tbl1002`” for execution against OLE DB provider “MSDASQL” for linked server “hcw”.

However, you can query the database using OPENQUERY()

select * from openquery(hcw, 'select * from hcwChannels')

where hcwChannels is one of the tables in the MS Access database.

Linked Server to MS Access via OLEDB provider

I kept searching, and fortunately found a forum post on social.MSDN in which the last comment provided a hint to another workable solution:

Note: The same issue is present using Sql Server 2012 RC0, and still forcing the use of the deprecated MSDASQL provider (ms ole db provider for odbc drivers) as a workaround.

Update: besides using msdasql, it works, in some cases, to do the following before creating the linked server:
right-click and select “properties” for “Microsoft.ACE.OLEDB.12.0” under Server objects, Linked servers, Providers in SSMS.  Check “nested queries” and “allow inprocess.”   Then create the linked server for access accdb using the ACE.OLEDB.12.0 provider, with the access filename under data source.

So I checked the OLEDB provider settings and changed them accordingly. Then I created a linked server using Microsoft.ACE.OLEDB.12.0 referencing the hcwChanDB_5.mdb database.
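For reference, a sketch of creating this linked server in T-SQL; the name hcwole is the one used below, and the .mdb path must be adjusted to the directory set during your installation:

EXEC sp_addlinkedserver
    @server = N'hcwole',
    @srvproduct = N'Access',
    @provider = N'Microsoft.ACE.OLEDB.12.0',
    @datasrc = N'C:\Users\Public\WinTV\Channel Database\hcwChanDB_5.mdb'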

While I could not enumerate the catalogs of the new linked server, the database could now be queried using standard SELECT statements such as

select * from hcwole...hcwChannels

I could now create a SYNONYM for each table inside the MS Access database.
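For example, a synonym for the hcwChannels table, using the linked server name from above:

CREATE SYNONYM dbo.hcwChannels FOR hcwole...hcwChannels

SELECT * FROM dbo.hcwChannels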

See my WinTV repository on GitHub for the SQL files.

 

Anonymizing Personal Data

Databases used to collect all the data ever entered, and running out of disk space was solved by adding some new storage device.

Then came GDPR, and companies storing personal data (or personally identifiable data) suddenly had to consider Data Deletion Policies and Data Retention Policies.

The easiest way is of course to delete the records storing obsolete personal data. But sometimes (read: almost always) it is not possible to delete the record, because the mere existence of a record may be important for historical or statistical reasons (e.g. the query “number of purchases per time interval and region” would almost certainly refer to a person and their location).

So it’s more realistic that certain data of a person is kept, some is erased, and the personal identifiers (first name, last name) are simply anonymized or pseudonymized.

What could be easier than replacing the name fields with an arbitrary combination of letters?

To make the pseudonyms more beautiful, it would be nice to consider a typical syllable structure; the most basic would be C-V-C (consonant-vowel-consonant), or any extension thereof: C-V-C-V-C, C-V-C-C-V, etc.

So we define the letters for consonants and vowels as

DECLARE @cons NVARCHAR(100) SET @cons = 'bcdfghjklmnpqrstvwxyz'
DECLARE @vow NVARCHAR(100) SET @vow = 'aeiouäöüaeiouaeiou'

and select a random consonant or vowel using the expressions

SUBSTRING(@cons, (ABS(CHECKSUM(NEWID())) % LEN(@cons))+1, 1)
SUBSTRING(@vow, (ABS(CHECKSUM(NEWID())) % LEN(@vow))+1, 1)
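Putting the pieces together, a minimal sketch that generates one random C-V-C syllable:

DECLARE @cons NVARCHAR(100) SET @cons = 'bcdfghjklmnpqrstvwxyz'
DECLARE @vow NVARCHAR(100) SET @vow = 'aeiouäöüaeiouaeiou'

-- concatenate a random consonant, vowel, and consonant
SELECT SUBSTRING(@cons, (ABS(CHECKSUM(NEWID())) % LEN(@cons))+1, 1)
    + SUBSTRING(@vow, (ABS(CHECKSUM(NEWID())) % LEN(@vow))+1, 1)
    + SUBSTRING(@cons, (ABS(CHECKSUM(NEWID())) % LEN(@cons))+1, 1)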

We use the expression ABS(CHECKSUM(NEWID())) to generate random numbers rather than RAND(), because it is evaluated for every record of a multi-record statement, whereas RAND() is evaluated only once per statement. To illustrate the difference, compare the result of the query

SELECT RAND(), ABS(CHECKSUM(NEWID())) FROM sys.objects

which returns the same RAND() value in every row, but a different ABS(CHECKSUM(NEWID())) value for each row.

Note that by repeating a letter inside either string you can change the relative frequency of that letter in the resulting names.

By prepending ‘Zy’ to the generated string, we make sure that the generated names are easily recognized and occur at the end of sorted lists.

Replacing a person’s name with a generated pseudonym may be as easy as this code:

DECLARE @cons NVARCHAR(100) SET @cons = 'bcdfghjklmnpqrstvwxyz'
DECLARE @vow NVARCHAR(100) SET @vow = 'aeiouäöüaeiouaeiou'
DECLARE @cons1 NVARCHAR(100) SET @cons1 = 'bcdfghjklmnpqrstvw'
DECLARE @vow1 NVARCHAR(100) SET @vow1 = 'aeiou'

UPDATE dbo.PersonTable
SET LastName = 'Zy' 
    + SUBSTRING(@cons, (ABS(CHECKSUM(NewId())) % LEN(@cons))+1, 1)
    + SUBSTRING(@vow, (ABS(CHECKSUM(NewId())) % LEN(@vow))+1, 1)
    + SUBSTRING(@cons, (ABS(CHECKSUM(NewId())) % LEN(@cons))+1, 1)
    + SUBSTRING(@cons, (ABS(CHECKSUM(NewId())) % LEN(@cons))+1, 1)
    + SUBSTRING(@vow, (ABS(CHECKSUM(NewId())) % LEN(@vow))+1, 1)
    + SUBSTRING(@cons, (ABS(CHECKSUM(NewId())) % LEN(@cons))+1, 1),
    FirstName = 'Zy'
    + SUBSTRING(@cons1, (ABS(CHECKSUM(NewId())) % LEN(@cons1))+1, 1)
    + SUBSTRING(@vow1, (ABS(CHECKSUM(NewId())) % LEN(@vow1))+1, 1)
    + SUBSTRING(@cons1, (ABS(CHECKSUM(NewId())) % LEN(@cons1))+1, 1)
WHERE [AnonymizeCondition] = 1

 

AnonymizeCondition stands for the condition determining that a personal record needs to be anonymized and has not yet been anonymized, as sketched below.
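For illustration, a hypothetical condition (the columns Id and LastActivity are made up for this sketch) selecting records that have been inactive for two years and still carry a real name:

-- hypothetical retention rule: inactive for two years,
-- and not yet anonymized (generated names start with 'Zy')
SELECT Id, FirstName, LastName
FROM dbo.PersonTable
WHERE LastActivity < DATEADD(year, -2, GETDATE())
    AND LastName NOT LIKE 'Zy%'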

See the code in my GitHub repository with T-SQL snippets.

Tracking down ScrewTurn Wiki

Back in 2012, the world was shocked by news that ScrewTurn Wiki would not be developed any further.

The universe of .Net-based open-source wikis was small then, and it is even smaller today.

Time to find out what happened to the original STW source code. I tracked down a couple of versions of the original STW3, as well as its successors.

Compilation and initial startup showed the following results:

 

Version    .Net Version  VS Version  License  compiled                run test
3.0.5.610  3.5           12 (2013)   GPL V2   VS 2013                 ok
3.0.5.629  3.5           10 (2010)   GPL V2   VS 2013                 ok
3.0.6.630  4.5           12 (2013)   GPL V2   VS 2013                 fails to load sql page provider
3.1.0.0    4.7.2         15 (2017)   GPL V2   VS 2017                 fails to load sql provider
4.0.5.145  4.0           10 (2010)   GPL V2   VS 2017 with Azure SDK  ok
6.0.0.0    4.6           14 (2015)   GPL V3   VS 2017 with Azure SDK  ok (run as site)

Comments:

  • VS Version is the version of Visual Studio the original code was developed with, while compiled shows the Visual Studio version I used to compile the code.
  • Versions 3.0.6.630 and 3.1.0.0 compiled and ran, but did not load the SQL Providers.
  • Versions 4 and above reference Azure libraries. Since the Azure SDK from Visual Studio 2010 is not available anymore, I used Visual Studio 2017 with its Azure SDK.
  • Version 6 is an ASP.Net MVC application. All previous versions are ASP.Net.
  • Version 6 requires ScrewTurn Wiki to run as an IIS site due to lack of application-relative paths in code and markup. All previous versions also run as IIS applications inside an IIS site.

The great Database Diagram mystery

You may (or may not) know that the Database Diagrammer of SSMS stores the created diagrams in a table called dbo.sysdiagrams.

This table has always been a mystery to me, especially the column “definition”: it contains binary data, and no documentation can be found about its format.
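The raw rows can be inspected easily; DATALENGTH() at least tells us the size of the binary definition (the table only exists once a diagram has been saved in the database):

SELECT diagram_id, name, version, DATALENGTH(definition) AS definition_bytes
FROM dbo.sysdiagrams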

So when I tried to figure out what all the bytes meant, and my standard editor plus hex viewer plug-in (read: Notepad++) could not help me any further, I searched and found FlexHEX which promised to provide the functionality to annotate and structure binary data. Somehow this did not work out as expected – I could not figure out how to create and edit “structures” – but as I clicked around, I hit the menu “Open OLE Compound File…” as a last resort. What could go wrong.

[Screenshot: FlexHEX “Open OLE Compound File…”]

To my great surprise, this command actually opened the file and displayed a file structure, with each of the sections containing some data.

As it turned out, the section DSREF-SCHEMA-CONTENTS was the one I was looking for, as it contains, as I later found out, all the textual information in the diagram:

  • (original) diagram name
  • connection string
  • list of all tables in the diagram (as of last save)

[Screenshot: FlexHEX displaying the diagram data]

The diagram definition is stored as a Microsoft Compound File (also known as Compound File Binary Format or Compound Document File), and the easiest way to process it in C# is the NuGet package OpenMcdf (source on GitHub).

Writing a small tool, I verified that the sections from the binary data really have those names, and are not artifacts of the hex editor.

Once I got hold of the data contained in section DSREF-SCHEMA-CONTENTS, making sense of its structure was quite straightforward.

SQL Server Sysobjects Types (Version 2019)

MS SQL Server stores the information about the objects (tables, views, functions, etc.) contained in a database in a table, accessible by selecting from the sys.objects view. The type of each object is given by the column “type”.
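For example, to see which object types occur in a given database:

SELECT o.type, o.type_desc, COUNT(*) AS objects
FROM sys.objects o
GROUP BY o.type, o.type_desc
ORDER BY o.type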

I collected this cross-version information from Microsoft Docs online, but fixed a couple of historical mistakes there.

The following table lists the values of sys.objects.type; an x marks the versions in which the type exists:

type  Object type                            2008  2012  2016  2019
AF    Aggregate function (CLR)                x     x     x     x
C     CHECK constraint                        x     x     x     x
D     DEFAULT (constraint or stand-alone)     x     x     x     x
EC    Edge constraint                                           x
ET    External Table                                      x     x
F     FOREIGN KEY constraint                  x     x     x     x
FN    SQL scalar function                     x     x     x     x
FS    Assembly (CLR) scalar function          x     x     x     x
FT    Assembly (CLR) table-valued function    x     x     x     x
IF    SQL inline table-valued function        x     x     x     x
IT    Internal table                          x     x     x     x
P     SQL stored procedure                    x     x     x     x
PC    Assembly (CLR) stored procedure         x     x     x     x
PG    Plan guide                              x     x     x     x
PK    PRIMARY KEY constraint                  x     x     x     x
R     Rule (old-style, stand-alone)           x     x     x     x
RF    Replication-filter-procedure            x     x     x     x
S     System base table                       x     x     x     x
SN    Synonym                                 x     x     x     x
SO    Sequence object                               x     x     x
SQ    Service queue                           x     x     x     x
TA    Assembly (CLR) DML trigger              x     x     x     x
TF    SQL table-valued-function               x     x     x     x
TR    SQL DML trigger                         x     x     x     x
TT    Table type                              x     x     x     x
U     Table (user-defined)                    x     x     x     x
UQ    UNIQUE constraint                       x     x     x     x
V     View                                    x     x     x     x
X     Extended stored procedure               x     x     x     x

The sys.objects.type_desc column contains the description corresponding to the type:

type sys.objects.type_desc
AF AGGREGATE_FUNCTION
C CHECK_CONSTRAINT
D DEFAULT_CONSTRAINT
EC
ET
F FOREIGN_KEY_CONSTRAINT
FN SQL_SCALAR_FUNCTION
FS CLR_SCALAR_FUNCTION
FT CLR_TABLE_VALUED_FUNCTION
IF SQL_INLINE_TABLE_VALUED_FUNCTION
IT INTERNAL_TABLE
P SQL_STORED_PROCEDURE
PC CLR_STORED_PROCEDURE
PG PLAN_GUIDE
PK PRIMARY_KEY_CONSTRAINT
R RULE
RF REPLICATION_FILTER_PROCEDURE
S SYSTEM_TABLE
SN SYNONYM
SO SEQUENCE_OBJECT
SQ SERVICE_QUEUE
TA CLR_TRIGGER
TF SQL_TABLE_VALUED_FUNCTION
TR SQL_TRIGGER
TT TYPE_TABLE
U USER_TABLE
UQ UNIQUE_CONSTRAINT
V VIEW
X EXTENDED_STORED_PROCEDURE

My previous post on this topic only covered SQL Server up to version 2008.

Length of UTF-8 VARCHAR in SQL Server

Foreword

For as long as I can remember, a VARCHAR (or CHAR) was always defined as “1 character equals 1 byte”. Different character sets (code pages) were implemented as COLLATIONs, so that you had basic database support for internationalization.

Then came Unicode, and we got NVARCHAR strings (or NCHAR), where the rule was “1 character equals 2 bytes”, and we could store any text from around the world without bothering with code pages, encodings, etc. The .Net framework brought us the string class with similar features and the world was beautiful.

Then, in 2001, came Unicode 3.1 and needed more space:

For the first time, characters are encoded beyond the original 16-bit codespace or Basic Multilingual Plane (BMP or Plane 0). These new characters, encoded at code positions of U+10000 or higher, are synchronized with the forthcoming standard ISO/IEC 10646-2. For further information, see Article IX, Relation to 10646. Unicode 3.1 and 10646-2 define three new supplementary planes.

These additional planes were immediately supported in SQL Server 2012. From then on, using an *_SC collation, NVARCHARs could be 2 or 4 bytes per character.
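A quick sketch illustrating this behavior: the supplementary character U+2000B is stored as a surrogate pair of 4 bytes, but counted as a single character by LEN() under an _SC collation:

SELECT LEN(N'𠀋' COLLATE Latin1_General_100_CI_AI_KS_SC) AS [characters],  -- 1
    DATALENGTH(N'𠀋') AS [bytes]                                           -- 4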

In C#, the StringInfo class handles supplementary planes, but it seems, they are still a bit behind:

Starting with the .NET Framework 4.6.2, character classification is based on The Unicode Standard, Version 8.0.0. For the .NET Framework 4 through the .NET Framework 4.6.1, it is based on The Unicode Standard, Version 6.3.0. In .NET Core, it is based on The Unicode Standard, Version 8.0.0.

(For the record, the current Unicode version is 12.1, and 13.0 is going to be released soon)

UTF-8 Collations

So now SQL Server 2019 supports UTF-8-enabled collations.

A question on SO quoted the documentation as

A common misconception is to think that CHAR(n) and VARCHAR(n), the n defines the number of characters. But in CHAR(n) and VARCHAR(n) the n defines the string length in bytes (0-8,000). n never defines numbers of characters that can be stored

(emphasis mine) which confused me a little bit, and the quote continues

The misconception happens because when using single-byte encoding, the storage size of CHAR and VARCHAR is n bytes and the number of characters is also n.

(emphasis mine).

This got me investigating, and I had a closer look into the issue. I created a UTF8-enabled database with a table containing all kinds of N/VARCHAR columns

CREATE DATABASE [test-sc] COLLATE Latin1_General_100_CI_AI_KS_SC_UTF8

CREATE TABLE [dbo].[UTF8Test](
  [Id] [int] IDENTITY(1,1) NOT NULL,
  [VarcharText] [varchar](50) COLLATE Latin1_General_100_CI_AI NULL,
  [VarcharTextSC] [varchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC NULL,
  [VarcharUTF8] [varchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC_UTF8 NULL,
  [NVarcharText] [nvarchar](50) COLLATE Latin1_General_100_CI_AI_KS NULL,
  [NVarcharTextSC] [nvarchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC NULL,
  [NVarcharUTF8] [nvarchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC_UTF8 NULL
)

I inserted test data from various Unicode ranges

INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
    VALUES ('a','a','a','a','a','a')
INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
    VALUES ('ö','ö','ö',N'ö',N'ö',N'ö')
-- U+56D7
INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
    VALUES (N'囗',N'囗',N'囗',N'囗',N'囗',N'囗')
-- U+2000B
INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
    VALUES (N'𠀋',N'𠀋',N'𠀋',N'𠀋',N'𠀋',N'𠀋')

then selected the lengths and data lengths of each text field

SELECT TOP (1000) [Id]
    ,[VarcharText],[VarcharTextSC],[VarcharUTF8]
    ,[NVarcharText],[NVarcharTextSC],[NVarcharUTF8]
FROM [test-sc].[dbo].[UTF8Test]
SELECT TOP (1000) [Id]
    ,LEN([VarcharText]) VT,LEN([VarcharTextSC]) VTSC
    ,LEN([VarcharUTF8]) VU
    ,LEN([NVarcharText]) NVT,LEN([NVarcharTextSC]) NVTSC
    ,LEN([NVarcharUTF8]) NVU
FROM [test-sc].[dbo].[UTF8Test]
SELECT TOP (1000) [Id]
    ,DATALENGTH([VarcharText]) VT,DATALENGTH([VarcharTextSC]) VTSC
    ,DATALENGTH([VarcharUTF8]) VU
    ,DATALENGTH([NVarcharText]) NVT,DATALENGTH([NVarcharTextSC]) NVTSC
    ,DATALENGTH([NVarcharUTF8]) NVU
FROM [test-sc].[dbo].[UTF8Test]

[Screenshot: LEN() and DATALENGTH() query results]

I was surprised to find that the old mantra “a VARCHAR only stores single-byte characters” needs to be revised when using UTF8 collations.

Table data only

Note that only table columns are associated with collations, but not T-SQL variables, as you cannot declare a collation on a variable:

DECLARE @VarcharText VARCHAR(50), @NVarcharText NVARCHAR(50)

SELECT @VarcharText = [VarcharText],@NVarcharText = [NVarcharText]
FROM [test-sc].[dbo].[UTF8Test]
WHERE [Id] = 4
SELECT @VarcharText, Len(@VarcharText), DATALENGTH(@VarcharText)
    , @NVarcharText, Len(@NVarcharText), DATALENGTH(@NVarcharText)

SELECT @VarcharText = [VarcharTextSC], @NVarcharText = [NVarcharTextSC]
FROM [test-sc].[dbo].[UTF8Test]
WHERE [Id] = 4
SELECT @VarcharText, Len(@VarcharText), DATALENGTH(@VarcharText)
    , @NVarcharText, Len(@NVarcharText), DATALENGTH(@NVarcharText)

SELECT @VarcharText = [VarcharUTF8], @NVarcharText = [NVarcharUTF8]
FROM [test-sc].[dbo].[UTF8Test]
WHERE [Id] = 4
SELECT @VarcharText, Len(@VarcharText), DATALENGTH(@VarcharText)
    , @NVarcharText, Len(@NVarcharText), DATALENGTH(@NVarcharText)

 

[Screenshot: variable LEN() and DATALENGTH() results]

 

Retrieving Active GitHub Forks

The problem with GitHub repositories is that, whenever there are forks of a given repository, it is not immediately obvious which forks are active, containing newer commits, and which are just-for-fun forks that should have been Starred rather than Forked.

Of course, you can always navigate to that forked repository, and check for “commits ahead” and “commits behind”, but that’s really painful.

Apparently there are others who think such a feature would be useful, but the functionality is still not built-in. Both tickets link to helper applications or scripts that might perform the task, though.

So I came across Active GitHub Forks, which lists Stars, Forks and Last Push date for each forked repository of a given base repository. (There are also other solutions, such as this Active GitHub Forks page.)

So I forked this repository and had a little look at the GitHub API. I learned that you can query the forks of a given repository, sorted by stars, newest, or oldest, but the returned list of forks does not contain the relevant attributes.

Instead, you have to query each of the forked repositories separately to retrieve the attributes “total_commits”, “ahead_by”, and “behind_by”. And each separate query counts against GitHub’s rate limit. (No wonder, then, that some of the tools screen-scrape rather than query the API.)

Visit my version of Active GitHub Forks to query the forked repositories individually.

[Screenshot: Active GitHub Forks]