Authenticating IoT

I was re-entering my password into our NowTV box in the bedroom when it occurred to me: authentication sucks on the Internet of Things (IoT). The problem is you have a simple device with minimal extras. On the NowTV you have a basic remote with little more than arrow keys and a select button.

Can you imagine entering a password with a length of over 20 that’s a mixture of numbers, special characters and both upper and lower case characters? Now imagine changing that password. Regularly.

If you have to press the arrows 5 times per character, that’s over 100 presses! That’s insane!

So, what’s the solution? Well, I think the technology already exists, and PayPal has already patented it: QR codes. I’m not sure PayPal had thought about using it for IoT; I suspect they only thought about using it as a way of paying. So you have a QR code on the door of a club, you scan it via the PayPal app, pay, then get your tickets sent to your Wallet. Or you scan the code on the receipt to pay the bill.

For IoT, the device would generate an encryption key, which would be regenerated whenever the device is wiped, for example when it is resold. The device would then display a QR code, via a small e-ink display or similar, that would allow pairing between the device and a user account – via an internet connection to a web service. Unpairing the device from the user account would revoke the encryption key, requiring the device to generate a new key (and a new QR code). Wiping the device, however, would destroy the encryption key without revoking it, which would leave some housekeeping to do. The device could try to unpair first, but it mustn’t depend on an internet connection to a web service in order to work: if the hard reset button is pressed, it must destroy the encryption key regardless of whether the unpairing fails. It must force this.
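The key lifecycle above can be sketched in a few lines. This is a minimal sketch, assuming a toy in-memory web service; all class and method names here are hypothetical, not any real PayPal or IoT API:

```python
import secrets


class PairingService:
    """Stand-in for the web service (hypothetical)."""

    def __init__(self):
        self.pairings = {}  # device key -> user account

    def pair(self, key, account):
        self.pairings[key] = account

    def revoke(self, key):
        self.pairings.pop(key, None)


class Device:
    def __init__(self, service):
        self.service = service
        self.key = secrets.token_hex(32)  # generated on first boot

    def qr_payload(self):
        # What the e-ink display would encode as a QR code
        return {"device_key": self.key}

    def hard_reset(self, online=True):
        # Try to unpair first, but never depend on connectivity:
        # the key is destroyed and regenerated regardless.
        if online:
            self.service.revoke(self.key)
        self.key = secrets.token_hex(32)  # old QR code is now useless
```

The important property is in `hard_reset`: a reset always rotates the key, and if the device happened to be offline the server is simply left holding an orphaned pairing to clean up later.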

It’ll be interesting to see if PayPal expands the authentication business beyond just payments in the future.

Where has SIMS Bulk Import gone?

Cut-down version: I’m not working with SIMS anymore, and I can’t pass on my work without the new owner getting a large bill from Capita for using the business objects.

Years and years ago I was working on a SIMS help desk at a Local Authority (LA). A school logged a call asking about importing email addresses. Like many schools, they had just purchased a messaging service for sending emails and texts; the problem was they needed the email addresses in SIMS .net, and as you can imagine, the idea of manually retyping 1,000+ of them was rather daunting. So, after putting a call in to the Capita Partner Team asking for the API documentation, I spent the evening building a simple import process. The next day I phoned the school to give them the good news and we imported their email addresses. A few days later a few more schools asked the same question, and I decided to carry on spending my evenings expanding and refining the tool (it imports email addresses, telephone numbers and user-defined fields from XML, CSV and Excel spreadsheets). I’ve always developed it in my own time, I’ve never claimed overtime, and I’ve never charged a penny for using it.

SIMS Bulk Import uses the SIMS .net business objects (API) – the same SIMS .net DLLs, the exact same .NET libraries, that the SIMS .net client itself uses.

At the time we were in a partnership with BT, who had a procedure for raising new business ideas. The process involved working out what type of change it was – in this case, an efficiency saving. Put simply, that means you can’t charge more than what you save – in this case, labour.

Capita charge the school. Then they charge the partner for write access, which the partner passes on to the school, so schools end up paying twice for support on the same thing.

When you take into account the Capita charge, the costs of handling money (invoicing, collecting, chasing and so on), the help desk costs, and the fact that people expect a certain standard – meaning you’d have to invest a lot more in development, including documentation – it just doesn’t make sense. It’s actually more cost-effective to hire a temp to manually key in all those details!! Nuts!!

So at this point I basically decided I’d give it away. I really didn’t want to see my hard work go to waste, and I managed to wangle it into the public domain without a massive Capita bill landing on my desk!! It’s been in the wild for many, many years (with Capita’s knowledge) with a grand total of ZERO corrupt SIMS databases. I find that quite an achievement. Don’t get me wrong, SIMS Bulk Import has failed a number of times, but it’s never left your SIMS system in a worse state (unless you’ve done something stupid like successfully importing the same email address to every pupil in SIMS!)

A few years later I switched teams and stopped working with SIMS .net. SIMS Bulk Import had been stable for a while and I’d had a few commits from individuals. I’m now at the stage where I’m leaving the LA to go and work somewhere else, and my new employer is unlikely to have a SIMS .net licence, let alone API access, so I needed to find a new owner for SIMS Bulk Import. Anyone who’s talked to me will have heard me describe SIMS Bulk Import as the poor man’s Salamander Active Directory – Salamander is, simply put, the next logical step if you work out how you’d improve SIMS Bulk Import; it’s what I would have done to turn SIMS Bulk Import into a commercial product. Luckily Richard agreed to take it on and even helped me recover some of the costs of SIMS Bulk Import. Before you shoot off to SalamanderSoft to download it, let me save you the disappointment: Capita has said they would charge a licence fee for each school using it, i.e. it wouldn’t be free. At this point I guess you can see where I’m going with this? SIMS Bulk Import isn’t worth paying for, and Richard already has the expanded version (that IS worth paying for).

So in short, your options are:

  1. Accept that you can’t bulk import anymore and get typing or copy-and-pasting \ hire a temp
  2. Look at automating your processes and buy Salamander Active Directory
  3. Wait for Capita to come out with their own product – I suspect they will, and charge for it. They do have a limited SQL script that injects the records directly into the database, ironically bypassing the business objects (but hey, they support it, so it’s all good, right?)
  4. Switch to an MIS supplier that doesn’t charge for partner access \ gives you bulk import routines (Eduware Network has a list of MIS suppliers)

Option 5 is to carry on using it. I’m sure even with me saying no, don’t do that (and I’m sure Capita will agree), someone will. So, a few comments.

Make sure you have the latest (or should that be last?) version – it’s 2.5.0.

It should be digitally signed – think of it as SSL for your applications. If you right-click on SIMSBulkImport.exe or the .msi installer and open Properties, you should see an extra tab – Digital Signatures – showing it’s signed by me: Open Source Developer, Matt40k. (You can also check from PowerShell with Get-AuthenticodeSignature.) If your copy doesn’t have the signature, it’s possible code has been injected and it is unsafe.

[Screenshot: the Digital Signatures tab on the file’s properties]

You should be OK, as it uses whatever version of the SIMS API you have installed, so one day it’ll just break – and by break I mean it won’t let you log in, or every import will fail. (If it ever fails in the sense of corrupting the SIMS database, then something terrible has happened with the Capita API, but I digress.)

A few people have forked my code, for whatever reason. I would just point out that the most up-to-date fork is 80 commits behind mine. That’s a fair amount of work that’s missing from those forks.

Anyway, hopefully you’ve found it useful while it lasted.

Build problem and duplicate releases

Yesterday I went to investigate a problem someone had reported, only to discover a butt load of releases.


As you can see, I had more releases than commits. Builds are triggered by commits, so that alone didn’t explain it.

I tried switching it up so it only builds when you put [build] in the commit message – which meant committing the yaml file, as the feature isn’t available via the web interface (it does make sense to have this under version control anyway). However, it continued to loop.
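If I remember the AppVeyor syntax right, that commit-message filter ends up in the committed appveyor.yml as something like this (treat the exact keys as my recollection, not gospel):

```yaml
# appveyor.yml - only build when the commit message asks for it
only_commits:
  message: /\[build\]/
```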

Looking at the build history revealed it was looping on every commit.

An email off to support resulted in a speedy response.

Add “skip_tags: true” to your appveyor.yml.
-Feodor Fitsner, AppVeyor

Bingo. It was looping because I create a GitHub release for each build and tag the release. The tag was then triggering a build, which created a release that then got tagged, which triggered a build…
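In appveyor.yml terms, the one-line fix from support looks like this:

```yaml
# appveyor.yml - don't trigger a build when a tag is pushed
skip_tags: true
```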

I’m grateful I’m not paying per build! Hopefully this will serve as a warning for others.

I’ve started to remove the releases, but I hit the API limit last night. Hopefully I’ll clean up my PowerShell script to deal with it; failing that, I may have to rely on the kindness of a GitHub human.
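For anyone in the same hole, the selection logic is simple enough to sketch. This is a hypothetical Python helper, not my actual script; the deletion itself is one DELETE per release against the GitHub API, which is exactly what eats the rate limit:

```python
def releases_to_delete(releases, keep=2):
    """Pick which releases to remove: every pre-release goes,
    and only the newest `keep` proper releases survive.
    `releases` is newest-first, as the GitHub API returns them."""
    doomed = [r for r in releases if r.get("prerelease")]
    proper = [r for r in releases if not r.get("prerelease")]
    doomed.extend(proper[keep:])
    return doomed


# Each entry mirrors a couple of fields from the GitHub releases API
sample = [
    {"id": 5, "tag_name": "v2.5.0", "prerelease": False},
    {"id": 4, "tag_name": "v2.4.9-build12", "prerelease": True},
    {"id": 3, "tag_name": "v2.4.9-build11", "prerelease": True},
    {"id": 2, "tag_name": "v2.4.0", "prerelease": False},
    {"id": 1, "tag_name": "v2.3.0", "prerelease": False},
]
```

Running that over the sample leaves exactly two proper releases standing, which matches where the repository ended up.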

“I have always depended on the kindness of strangers.”

– Blanche DuBois

Update

I’ve managed to remove the duff tags as well, and SIMS Bulk Import is down to two releases on GitHub. I was slightly heavy-handed when it came to deleting. Thankfully all the duff releases were pre-releases, so the problem wasn’t as bad as it could have been.

Below is the PowerShell script I quickly knocked together. The tags I did over lunch at work, hence the proxy bit – note that isn’t our actual proxy address 😉

Showing SIMS Bulk Import some love

So today I managed to move SIMS Bulk Import over to AppVeyor. So what does this mean? Well, for starters it means I don’t have to worry about going over my Azure credit limit each month! AppVeyor has excellent support for GitHub, so each commit is automatically built, tested, and (when I enable it) a new release created.

The next release will include:

  • Pupil username creation – you can blame\thank Andrew Mulholland for this; I saw his project PiNet and thought we could make this better.
  • PowerShell module – you can blame\thank Ryan Yates for making me want to do this one.
  • Log submission – I’ve finally started pulling this together. I’ve desensitised the log file by basically creating two: one for log submissions and another for local debugging. The main issue is testing, both the web service and the application.

Clever SQL Jobs?

Just thinking out loud (because it’s good to get feedback).

Just to expand on my tweet: I’ve got a lot of SSIS packages for loading and building my BI warehouse (SSAS cubes). Currently I have a SQL job with multiple steps.

For example

  1. Load data from Source system into Staging
  2. Build ODS intermediate tables
  3. Build Warehouse tables

And this works for a DataMart that is a denormalized copy of a single source system, but when the source system, or the intermediate tables, are used multiple times, you don’t want to run them multiple times. Equally, I don’t want to figure this out by hand each time we add a new package.

I basically want a controller task that manages it.

We have an automated build\deploy process, so you commit the (SSDT\SSIS\SSAS) packages and it deploys them and creates a SQL job for each SSIS package. We could add a step to trigger the controller to update the schedules based on the change.

Now we would need to define the dependencies – basically, we need to build a hierarchy. Technically I could read the SSIS packages, parse the select statements and create some logic to work it out automatically, but for now it’ll be a static table (or tables).
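Whichever implementation wins, the controller’s core job is a topological sort over that static dependency table: run everything in prerequisite order, and run shared steps exactly once. A minimal sketch (the step names are made up for illustration):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical static dependency table: step -> prerequisite steps
deps = {
    "Load_Staging_SourceA": set(),
    "Build_ODS": {"Load_Staging_SourceA"},
    "Build_Mart_Finance": {"Build_ODS"},
    "Build_Mart_HR": {"Build_ODS"},  # shares the ODS build
}

# One run order in which every shared step appears exactly once
order = list(TopologicalSorter(deps).static_order())
```

Each name would map to a SQL job step (or an SSIS package execution); the point is that Build_ODS feeds two marts but only appears once in the schedule.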

So: do I create a SQL stored procedure that creates SQL job(s) with multiple steps, or an SSIS package using something like BIML?

Technical Analysis on Schools websites in England

So in my last blog post I asked: are you ready for IPv6? The post came about when I was looking at Schools MIS data, which Graham, Joshua and I have been looking at to see who the big movers and shakers are in the schools administration software (MIS) arena. The data records which software supplier each school used to submit the School Census (in England), and is requested under the Freedom of Information Act (FOI) from the Department for Education (DfE) – saving having to FOI every individual school. To enhance this data I was joining it onto the general schools data that can be extracted from EduBase. Looking at the data, I noticed that the website addresses listed in the extract were of extremely poor quality. A number even had email addresses listed!
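To give a flavour of the clean-up needed, here’s the sort of check I mean – a hypothetical helper, not my actual script:

```python
import re


def clean_website(value):
    """Salvage a usable URL from a messy EduBase 'website' field,
    or return None if it can't be one (e.g. an email address)."""
    if not value:
        return None
    v = value.strip()
    if "@" in v:  # an email address entered where a URL should be
        return None
    if not re.match(r"^https?://", v, re.IGNORECASE):
        v = "http://" + v
    return v
```

Only once the field looks like a URL is it worth doing any DNS work against it.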

This got me thinking: are schools ready for IPv6? If Sky are running out of IPv4 addresses and offer IPv6-only connections at a lower price, how many parents are going to jump on the deal, only to find out later that they can’t access their child’s school website? After a bit of scripting and a support call to Mythic-Beasts to enable a JSON response that I could automate, I had the results. It wasn’t good.

Still, no-one’s ready for IPv6, are they? Surely they’ll be ready in time. Won’t they?

We can only judge the future from what we have suffered in the past

– Themistocles, 300: Rise of an Empire

To this effect, I’ve gathered data to look at:

  • Domain registration correctness
  • Content management systems
  • Document type definition
  • Raw HTML homepage size
  • Google Analytics
  • IPv6 readiness

At the moment I’m still creating the presentation detailing my findings, but you can download the raw data from: https://github.com/matt40k/SchoolsWebsites-England

Slow progress

I’m currently getting bogged down with other projects, and unfortunately my final sprint on SIMS Bulk Import has ground to a halt. On the plus side, last month I managed to squash a few more bugs and get a code signing certificate. This means all future releases will be signed as coming from me, which is great news – it adds a layer of confidence that the code you run is unaltered by a third party. I’ve also started reviewing my other projects, ensuring the code on the public repositories is up to date; I’ve also switched over to Git and copied them all to GitHub. A bit more on that later.

Part of the updated process was to use a Continuous Integration server – specifically TeamCity. Simply put, this allowed the builds to occur on a dedicated box, which made the whole process quicker and easier – it also added confidence, as it ensured that everything required was committed to the repository. The only downside is the cost – although renting a VPS from OVH is cheap, it still isn’t cost-effective given the limited amount of time I actually use it. I’m therefore looking at moving it to Azure, as you only pay for provisioned resources and you can de-provision servers and pay only for the storage. Alas, this means more messing about setting up servers.

TeamCity AssemblyInfo Patcher not working

Tonight I was trying to get TeamCity to auto-update the version number for my SIMS Bulk Import application; however, the simple AssemblyInfo Patcher just failed to work. No errors. Nothing.

Then I read the manual, again…

Note that this feature will work only for “standard” projects, i.e. created by means of the Visual Studio wizard, so that all the AssemblyInfo files and content have a standard location.

So I created a new project, which placed AssemblyInfo.cs into the Properties folder – mine was in the main directory. I moved it into the subfolder and bang, it works. Awesome.


WIX installer why you no MSBUILD no more?

Last night I chopped up my WIX installer project – I basically split the actual installer parts into one file (CoreMsi.wxs), the versions into a variable file (Version.wxi), and the file list into a separate fragment file (SimsBulkImport.wxs). This worked.

Great, I thought – a simple PowerShell script to generate the file list, and I can even grab the compiled bin from the main repository as an artifact dependency – got to love TeamCity.

The problem was, it failed to build.

error CNDL0104: Not a valid source file; detail: ‘.’, hexadecimal value 0x00, is an invalid character. Line 2, position 1.

A bit of trial and error later, I figured it out. It was my old friend Mr Encoding. After adding

-Encoding “UTF8” 

to the end of the Out-File command, it builds again.

SIMS Bulk Import

Overview

If you’re in the UK, the chances are your local school is using SIMS .net. Just check out the stats on their site: 22,000 schools taking 2,500,000 children’s attendance, every day. That’s impressive.

So what is SIMS .net? Well, it’s an MIS – but what is an MIS? In the simplest terms, it’s a database that holds the school’s records, for both students and staff. So it makes sense that you’re going to want to interface with it, as it’s your core when it comes to data sources.

Now, extracting data is pretty straightforward. Capita have created a custom reporting engine that allows simple report creation; reports can then be scheduled to produce exports of data. The problem is getting data back into SIMS .net. Take identity management, for example: it’s easy enough to export a list of students and write a PowerShell script to generate user accounts, but wouldn’t it be good to get those usernames added back into your core data source? Or what about adding in new students’ home email addresses and telephone numbers? Well, it isn’t that straightforward to bulk import. Although Capita provide an API, it isn’t as friendly as a RESTful web service, and it certainly requires a programming background to understand, which rules out many schools being able to use it.
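To make the gap concrete: the input side of a bulk import is trivial – a file of identifiers and values. The column names below are illustrative, not a real SIMS schema:

```python
import csv
import io

# The kind of file a school can easily export from their messaging
# service or spreadsheet; "AdmissionNumber" and "EmailAddress" are
# made-up column names for this sketch.
sample = """AdmissionNumber,EmailAddress
001234,jo.bloggs@example.com
001235,sam.smith@example.com
"""

rows = list(csv.DictReader(io.StringIO(sample)))
# The hard part - matching each row to a pupil and writing it back
# through the Capita business objects - is what needs the API.
```

Parsing the file is a few lines in any language; the write-back is the part that locks schools out.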

Back in May 2012 I was working on a SIMS support desk doing technical support, and one of our customers asked if he could bulk import email addresses back into SIMS .net. This resulted in me asking Capita for documentation regarding the API, which they provided at no cost, along with a few snippets of example code. I then spent the following nights coding away at home on what has become SIMS Bulk Import.

About

So what are SIMS Bulk Import’s features?

  • It’s free – Thank you Phil Neal for waiving the licensing costs 🙂
  • It uses the SIMS .net Business Objects (no large Capita charges for corrupting your SIMS .net database!)
  • Imports from CSV, Excel spreadsheets and XML files
  • Matches the file fields with SIMS .net fields

The future

I’ve since moved away from SIMS support and have been trying to find a suitable new home for SIMS Bulk Import. Whether that’ll be Capita writing a replacement, or someone else taking up the project, isn’t clear. For now I’ll continue to support it. As part of this I’ve moved the source over to GitHub; the releases will still be via CodePlex. I’ve also created a separate organisation on GitHub – SIMSBulkImport – which effectively “owns” the code. I fork the code from SIMSBulkImport into my personal repository, and do a pull request to merge the code back into the main SIMSBulkImport repository. This will help should the “owner” of SIMS Bulk Import change in the future.

I’ve also created a number of sub-projects – the SIMS interface library will be moved from Unfuddle onto GitHub, and the installer element will become a separate repository. I’m looking at setting up TeamCity to automate the build process so building and creating releases is a lot more streamlined and less time-consuming. Also a simple website giving easy advice for getting started.

Another change is a web API – this allows checking for newer versions and secure uploading of log files to the developers (OK, me). Mythic-Beasts have kindly donated a server hosted in Cambridge, UK to host this. It’s written in PHP using the Laravel framework; I’ve started writing it based on some of Bobby Allen’s original code.

Going forward I would like some (official) recommendation from schools and support teams, ideally I’d like to put together some sort of list or program detailing where you can get support from, for example:

Your Local SIMS Team supports SIMS Bulk Import!

Your Local SIMS Team is a SIMS Bulk Import silver partner!

It’s just difficult making a business case to people when your product is free. The main reason I want to do this is that I’m finding the odd problem where someone needs to remote in, which is difficult without some sort of support agreement and Ts and Cs – most of the problems are actually SIMS problems, like the SIMS client not being installed correctly, which their SIMS support team could resolve. I appreciate working in the dark isn’t easy and you can’t know about every SIMS third-party product, so you’re going to get an element of being sent from pillar to post. Ideally I need a main backer to act as data controller with regards to the log files; at the moment I’m investigating whether I could get around it by limiting the data it can spit out, but that’s going to limit the usefulness of the logs. It’d also be good to get the program digitally signed.

If anyone is willing to help out, feel free to drop me an email – matt [at] matt40k [dot] uk