Category : Scripting

Gradle Generate Release Notes from Git

I thought my ‘Gradle Generate Release Notes’ script might be useful for others, so here you go:

If you are using the built-in Gradle tooling in Android Studio and you want to automagically generate release notes (or a list of commit messages) from Git, then this little script might help (whilst it is pretty specific to Android Studio, it could easily be modified for any Gradle build).


I got the inspiration from this post over at Coders Kitchen – but it seemed a little too complex for my needs, so I cut it right back to just text-based output, with just the dates and commit messages between the tags. More than sufficient for my needs.
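At its heart the script just asks Git for the commits between two tags. Here is a quick sketch of the underlying command, using a throwaway repo and an example tag name (v1.0) – the format string is the standard git log one:

```shell
# demo: a throwaway repo with a release tag and one commit after it
cd "$(mktemp -d)"
git init -q notes-demo && cd notes-demo
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "initial import"
git tag v1.0
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "fix: crash on rotate"

# dates and commit messages since the last release tag
git log --date=short --pretty=format:"%ad %s" v1.0..HEAD
```

From Gradle you would run the `git log` line via an `exec` task and write its output to your release notes file.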

PowerMeter from PowerShell

I was trying to get my Home Energy Monitor application working with Google PowerMeter this evening. To get things moving quickly I decided to prototype in PowerShell (as you get the full sugary goodness of the .NET Framework for free). Here are the details on accessing PowerMeter from PowerShell…

Although it is called Google PowerMeter, it is simply a service that records variables (of either cumulative or instantaneous values).

Firstly you need to register for an account, and it is not obvious how you actually get a Google PowerMeter account if you don’t have one of the supported devices or a contract with one of the partner utility companies. The easy way is to put together a URL that basically requests a new account. The format of the URL is:

All the details are on this page. I have a CurrentCost Envi, so my URL became:

Note I’m using dvars at the end instead of cvars – dvars are for durational measurements and cvars are for instantaneous measurements – you need to get these right or your uploads will fail. The dvars=1 means I want only one variable (energy); I could have opted for more (dvars=2, dvars=3, etc.), but one will do for now.

When you’ve created your URL, simply browse to it. Google will authenticate you with your usual Google account and then ask you to give a friendly name to the variable(s) you created. When complete you’ll be presented with an activation code. You can get this activation code again by browsing to your settings page in Google PowerMeter. From this activation code you need 3 pieces of data as highlighted below:


The first is your ‘auth token’, the second is your ‘user id’ and the third is your ‘device id’.

Now for the PowerShell script. It is fairly simple: it creates an XML string with the start date of the reading, the duration (in seconds) and the value, and then uploads this to Google PowerMeter. It needs some headers added first to make sure you’re sending the correct Content-Type and that you are authorized…

$user = "YOUR USER ID"
$device = "YOUR DEVICE ID"
$auth = "YOUR AUTH TOKEN"
$variable = "d1"
$url = ""   # the PowerMeter event feed URL (stripped when this post was published)
$var = "$user/$user/variable/$device.$variable"

$start = [System.Xml.XmlConvert]::ToString([System.DateTime]::Now)
$duration = 1
$energy = 9999

# NOTE: the namespace / scheme URLs and the entry structure below are reconstructed
# from the GData / PowerMeter documentation; the originals were stripped from this post
$xmlData = @"
<feed xmlns=`"http://www.w3.org/2005/Atom`"
      xmlns:meter=`"http://schemas.google.com/meter/2008`">
    <entry>
        <category scheme=`"http://schemas.google.com/g/2005#kind`"
                  term=`"http://schemas.google.com/meter/2008#durMeasurement`"/>
        <meter:subject>{0}</meter:subject>
        <meter:startTime uncertainty=`"1.0`">{1}</meter:startTime>
        <meter:duration uncertainty=`"1.0`">{2}</meter:duration>
        <meter:quantity uncertainty=`"0.001`" unit=`"kW h`">{3}</meter:quantity>
    </entry>
</feed>
"@
$rdgData = [string]::Format($xmlData, $var, $start, $duration, $energy)

$wc = New-Object System.Net.WebClient
$whc = New-Object System.Net.WebHeaderCollection
$whc.Add("Content-Type", "application/atom+xml")
$whc.Add("Authorization", "AuthSub token=`"$auth`"")
$wc.Headers = $whc

$response = $wc.UploadString($url, "POST", $rdgData)

Now you have the XML response in the $response variable. To check it you can simply evaluate ([xml]$response).feed.entry.status.code – you’re looking for a 201 (‘Created’).
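That check can be wrapped in a couple of lines if you want a friendly pass/fail message (the status expression is the one above; the messages are mine):

```
# 201 ('Created') means the reading was accepted
$status = ([xml]$response).feed.entry.status.code
if ($status -eq 201) { "Upload OK" } else { "Upload failed with status $status" }
```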
You should now have a measurement lodged with Google PowerMeter! Enjoy…

GEO 51.4043388366699:-1.2875679731369

Access ODBC Connection Strings

I was working on an old (classic) ASP page the other day. It was pulling data from an Access database file and using an ODBC driver to get the connection.

It was working fine on a Windows 2003 server, but when I pulled the file into a local website on my Windows 7 machine (with the Office 2010 beta) it kept failing at the ODBC layer. The reported error message was:

Microsoft OLE DB Provider for ODBC Drivers error ‘80004005’
[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified

Looks like the driver specified in my connection string couldn’t be found. I was using the following :

    objConn.Open "DRIVER={Microsoft Access Driver (*.mdb)}; DBQ=c:\inetpub\wwwroot\pstdiscovery.mdb;"

This all looked correct, and checking the excellent “” website they were saying the same thing – strange. It then struck me that I’m using Windows 7 and Office 2010, either of which could have changed the ODBC driver or installed a new one. Checking the “Data Sources (ODBC)” tool, I saw that the driver also works with .accdb files, so I’m guessing this is an updated driver.

Changing the connection string (adding the *.accdb) was the next step.

objConn.Open "DRIVER={Microsoft Access Driver (*.mdb, *.accdb)}; DBQ=c:\inetpub\wwwroot\pstdiscovery.mdb;"

Testing with this new connection string worked fine – problem solved….
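As an aside: on a machine with the Office 2010 Access Database Engine installed, you could sidestep the old ODBC driver entirely and use the ACE OLE DB provider instead – a sketch, reusing the same database path as above:

```
objConn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\inetpub\wwwroot\pstdiscovery.mdb;"
```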




Google Results Ranking

Disclaimer: Screen-scraping results like this probably contravenes Google’s Terms of Use (or something) and I do not advocate that you do it – this is purely hypothetical; if I did want to do it, this is how I would go about it  😉

Further Disclaimer: The results page format could change at any time and may well break this script; if that happens you are on your own (Firebug and some modified regex should help you out).


So, if you wanted to get the Google ranking of a bunch of domains when searching for a particular term, you could use one of the many SEO page ranking test sites that are available. But these are a pain, inasmuch as they require you to enter the search term and the domain name you are looking for, and they give you back the ranking (what position in the results the domain name comes in at). That is fine for individual searches (like finding what position comes up if I search on ‘Ken Hughes’), but not very good for doing a comparison of multiple domains against the search term.

I looked at using Google’s Search API to get this info, but unfortunately it only returns 4 or 8 results (it is mainly designed to present some brief results in a box on your website); what I needed was to look at a lot more results (up to 500)…

Back to my trusty friend – PowerShell…

I create a web client, have it download the first X (500) results for the search term, load the link URL and its position into a hashtable, and then look up the hashtable to find the rank position of each of the domain names I am looking for.
It was actually pretty easy; the only difficult part was getting the regex(es) correct – regex is evil, as evil as Perl…

Here is the script code :

  $domainNames = "", "", "", ""
  $maxResults = 100
  $searchTerm = "search"

  # NOTE: the backslashes in these patterns were stripped when the post was published;
  # restored here
  $urlPattern = "<\s*a\s*[^>]*?href\s*=\s*[`"']*([^`"'>]+)[^>]*?>"
  $hitPattern = "<\s*(h3)\sclass=r>(.*?)</\1>"

  $wc = new-object "System.Net.WebClient"
  $urlRegex = New-Object System.Text.RegularExpressions.Regex $urlPattern
  $hitRegex = New-Object System.Text.RegularExpressions.Regex $hitPattern
  $urls = @{}

  $resultsIndex = 0
  $count = 1
  while ($resultsIndex -lt $maxResults)
  {
    # the search URL was stripped from the original post; this is the standard one
    $inputText = $wc.DownloadString("http://www.google.com/search?q=$searchTerm&start=$resultsIndex")

    "Parsing : " + $resultsIndex

    $index = 0
    while ($index -lt $inputText.Length)
    {
      $match = $hitRegex.Match($inputText, $index)
      if ($match.Success -and $match.Length -gt 0)
      {
        $urlMatch = $urlRegex.Match($match.Value.ToString())
        if (($urlMatch.Success) -and ($urlMatch.Length -gt 0))
        {
          $newKey = $urlMatch.Groups[1].Value.ToString()
          if (-not $urls.ContainsKey($newKey))
          {
            $urls.Add($newKey, $count)
          }
          $count++
        }
        $index = $match.Index + $match.Length
      }
      else
      {
        $index = $inputText.Length
      }
    }
    $resultsIndex += 10
  }

  foreach ($domain in $domainNames)
  {
    $maxPos = -1
    foreach ($key in $urls.Keys)
    {
      # only consider results whose URL mentions this domain
      if ($key.Contains($domain))
      {
        $pos = [int] $urls[$key]
        if (($pos -lt $maxPos) -or ($maxPos -eq -1))
        {
          $maxPos = $pos
        }
      }
    }
    if ($maxPos -eq -1)
    {
      $domain + " : Not Found"
    }
    else
    {
      $domain + " : Found at result #" + $maxPos
    }
  }

Drop me a line in the comments if you find it useful…


Replace in Files for PowerShell

A while back I restructured my website so that this blog no longer started at the root, instead starting from /blog. This was so that I could introduce some other web apps and have a subfolder for projects etc.

One of the pains of this restructure was modifying all the links – I thought I had caught all of this with a Redirector HttpModule, but recently realised that for some reason I had not caught images embedded in the posts themselves.
Also it was becoming a pain having to remember to include the HttpModule in my web.config every time I upgraded my blog (dasBlog).

I wanted it fixed properly this time, so grabbed a copy of all the XML files in my ‘content’ folder, copied them to a local folder and cracked open PowerShell…

I wanted every instance of changed to – not difficult, but this would also change valid urls such as to (note the /blog/blog in the url)

So I got everything I needed done with two ‘one liners’ in PowerShell…

dir | %{ $a = get-content $_ ; $a = $a -replace ("", "") ; set-content $_ $a }


dir | %{ $a = get-content $_ ; $a = $a -replace ("", "") ; set-content $_ $a }
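The actual URLs were stripped when this post was published, so here is a hypothetical sketch of the same two-pass idea (the example.com paths are mine, not the originals): first prefix every link with /blog, then collapse any already-correct links that ended up with a doubled prefix:

```
# 1. prefix all post links (hypothetical paths)
dir | %{ $a = get-content $_ ; $a = $a -replace ("http://example.com/", "http://example.com/blog/") ; set-content $_ $a }
# 2. repair links that were already correct and got double-prefixed
dir | %{ $a = get-content $_ ; $a = $a -replace ("http://example.com/blog/blog/", "http://example.com/blog/") ; set-content $_ $a }
```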

All fixed…



A couple of handy scripts

I have been updating some of my ‘magicwords’ for SlickRun recently. This is a great tool for getting focus on a particular task. Instead of having to mess about opening folders, Word documents and web sites all in preparation for a task, you can enter one ‘magicword’ and have it do all that work for you.

For example when we (C2C) release a new hotfix the process requires :

  • Review of the technical notes / fix details (from a database report)
  • Grab all the relevant files into a .zip package (I really should have this section automated)
  • Update the ‘versions’ xml file that our app checks so that end users get notified of the fix availability
  • Post the zip file containing the hotfix to our support website.

(in fact I really should automate ALL of this)

Anyway, there were a couple of things that I had wanted to do to make SlickRun a touch better at getting this environment set up for me…

The first was to minimize all current windows (before opening the set of new ones)
The second was to automatically post form data to a website.

Both of these required a little scripting….


' Minimize all windows to the taskbar
' Ken Hughes
' 23 Jan 2008

Set objShell = CreateObject("Shell.Application")
objShell.MinimizeAll
Set objShell = Nothing

Just run the script for the results….



' HTTP POST script
' Post form data to a url
' Ken Hughes
' 23rd Jan 2008

' Check cmd line args
If (WScript.Arguments.Count <> 2) Then
    ' none - show usage
    Wscript.echo ""
    Wscript.echo "USAGE: httppost.vbs url ""data"""
Else
    ' got them - so post the data
    sURL = Wscript.Arguments(0)
    sFormData = Wscript.Arguments(1)

    Dim objIE
    Set objIE = CreateObject("InternetExplorer.Application")
    objIE.Visible = True
    objIE.Navigate sURL, , , sFormData, "Content-Type: application/x-www-form-urlencoded;"
End If

Run the script with the URL and the post data as command line parameters – for example: httppost.vbs url "field1=value1&field2=value2"


Todo.txt scripts

I’ve been spending a bit of time at Lifehacker recently; there are some pretty good tips over there (it’s where I found PointUI for Windows Mobile 6).

One of the posts I came across was about the todo.txt command line tool, which caught my eye as I always have a todo.txt hanging around on my desktop, or often a ‘todo list’ email languishing in my inbox.

Baulking at the idea of installing Cygwin to get a bash shell and getting to grips with ‘another’ scripting tool / language, I decided to port it to plain old VBScript.

The first thing was to set the system’s default scripting host to cscript instead of wscript (wscript directs all input/output to Windows instead of the command line). To do this simply open a command prompt and enter:

cscript //h:cscript //s                           (NOTE: to change back simply use cscript //h:wscript //s)

Then there is a whole bunch of code around parsing the command line arguments and manipulating text files – simple stuff really. There are two aspects not implemented in this version:

  • Output colouring (the standard Windows command line does not support this)
  • The ‘list’ output does not sort the entries alphabetically (I may get around to this later…)

It follows most of the features of the original (see the details here); in the example displayed, todo.vbs has been shortened to t.vbs, and you can optionally shorten the ‘actions’ (list becomes l, add becomes a, replace becomes rep, append becomes app, prioritize becomes pri, archive becomes arc and do becomes d – it’s all clearly visible from the script source).

The script source is attached to this post, feel free to use / modify / ping me with questions…

t.vbs (5.24 KB)


New home backup regime

The king is dead, long live the king.

Over the Christmas break the license for my beta of Windows Home Server ran out, so I needed an alternative backup / storage solution. I briefly considered Linux with some iSCSI software, Windows with DFS or FRS, or indeed forking out some of my shekels for a folder sync application.

The requirements were as follows:-

  • NTFS, for large file support (12 GB in some cases).
  • Easy duplication of the data (including hierarchy) across multiple drives.
  • UNC pathname support, so I could ‘rehome’ my docs, music, photos etc. to it.

In the end I opted for a fairly simple solution :-

  • A Windows machine with a drive for the OS and two additional data drives.
  • One of the additional drives would be the primary where folders are ‘rehomed’ to and all data is stored.
  • A batch file would fire off ‘Robocopy’ (free in the Windows Resource Kit) to mirror this primary data drive to the secondary data drive.
  • Another batch file would fire off ‘Robocopy’ for copying to external USB drives.
  • Batch files would be scheduled using the AT command line tool and would email results files using the free Blat! command line tool.
  • The primary data drive would also be backed up to my ‘iDrive Pro’ account (online 150 Gb storage facility for $50 / year).
  • Of course, photos are also backed up to my Flickr Pro account (unlimited online storage of images for $25 / year).
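For reference, the mirroring batch file boils down to a single Robocopy call – a sketch, with hypothetical drive letters and log path:

```
REM Mirror the primary data drive to the secondary (paths are examples)
robocopy D:\Data E:\Data /MIR /R:2 /W:5 /LOG:C:\Logs\mirror.log
```

/MIR mirrors the tree (including deletions), /R and /W keep retry delays short, and the log file is what gets emailed out by Blat!.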

PowerShell Plus

One of my colleagues switched me on to PowerShell Plus and I’m loving it.

Code editor, snippets, values of variables, logging tools and much more, including a really neat feature called ‘MiniMode’ (see the toolbar icon at the extreme right in the image).

This ‘MiniMode’ closes all toolbars/toolwindows except the main console but also makes the console window transparent (user configurable level of transparency). This mode is real easy to work with…


There is a free single user license for non commercial use.

I encourage you to try it out.
