SSRS 2016 by default

For the past 4 months I’ve been using Visual Studio 2015 rather than Visual Studio 2013, and yesterday I hit a bug with SSRS (shows how much I love SSRS).

The bug appeared when it came to deployment. The error deploying the SSRS report was:

Exception calling “CreateCatalogItem” with “7” argument(s): “The definition of this report is not valid or supported
by this version of Reporting Services. The report definition may have been created with a later version of Reporting
Services, or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: The
report definition has an invalid target namespace
‘http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition’ which cannot be upgraded.

The report that errored on deployment was the one I had just changed. Looking at the code diff in Visual Studio Team Services, you can see it’s added in some new 2016 goodness. I checked and double-checked my settings: Visual Studio 2015 wants to give you a 2016 file. If you manually deploy from within Visual Studio, it works. Checking out the bin\debug folder reveals that it’s using the 2016 version to build a legacy version, which makes sense, as the 2016 schema is the old one (2008R2) with extra goodness slapped in – like being able to define parameter location.
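A quick way to check which schema a built RDL file actually targets (the path and report name are illustrative):

# Prints the target namespace of a report definition - the new schema ends
# in /2016/01/reportdefinition; legacy reports use an older namespace.
([xml](Get-Content ".\bin\Debug\MyReport.rdl")).DocumentElement.NamespaceURI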

So, we now have to build our SSRS reports. To do so, I’ve created a PowerShell script that builds them using devenv.exe with the parameters for a silent build. This worked great, only it took over 50 minutes to run. The main lag was that every time it loaded the solution it builds\checks\something with the database projects, and it takes an age. The fix was to create a dedicated Report solution file (Report.sln). I did this with a PowerShell script that copies the main solution file – which included all the SSDT (database) projects, Integration (SSIS) packages and analysis models as well as the SSRS reports – then removes all projects but the SSRS ones. I did it via PowerShell just because I don’t like extra manual steps. It’s a pain: when you add a new SSRS project you’d have to add it to Report.sln too, which you’d probably forget, then waste time debugging a silly mistake.
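A minimal sketch of the silent build step, assuming a default Visual Studio 2015 install path and a Report.sln in the working directory:

# The .com variant blocks until the build finishes and honours /Out,
# which makes it friendlier for unattended builds than devenv.exe.
$devenv = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\devenv.com"
& $devenv "Report.sln" /Build "Release" /Out "build.log"
if ($LASTEXITCODE -ne 0) { throw "SSRS build failed - see build.log" }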

Creating the Report.sln reduced the build time to just over 2 minutes (from over 50!), so it was worth doing.

Comments

One of the annoying things about working with multiple languages is that they each have their differences – obviously, otherwise there would only be one language! Although there are good reasons for these differences, they still have elements that are a pain. One of the most annoying, other than “do you require a semicolon at the end of the statement?”, is comments. Comments aren’t executed or compiled, but they help developers read code. Remember, code is designed to be a gateway between machines and humans; if you write something complex, you’ll more than likely need to include more comments explaining what it does, or look at refactoring so other developers can continue your work.

The main options are:

Double dash (--)

-- Single line comment

Single hash (#)

# Single line comment

Double forward-slash (//)

// string domain = "matt40k.uk";

All of these are single-line comments: each line needs its own comment marker, and the comment continues until the end of the line \ carriage return.

You’ve also got:

Forward-slash and asterisk (/*)

/* Multi-line
comment */

Forward-slash and asterisk (/*) marks the start of the comment, and the comment continues until the closing asterisk and forward-slash (*/). The code must close any open forward-slash and asterisk (/*) comments in order to compile.

With forward-slash and asterisk, you can comment in-line, for example:

Select * from /* Change the code to the right */ sys.tables

This is valid T-SQL that will return all the columns of metadata about the tables in the selected database.

You can also use this across multiple lines, for example:

select
    TableSchema = schema_name(schema_id)
    ,TableName = name
    /*
    You could also add
    other tables
    */
from
    sys.tables

Power Query Formula (M) Language

Double dash (--): No
Single hash (#): Yes
Double forward-slash (//): No
Forward-slash and asterisk (/*): Yes

Transact-SQL (T-SQL)

Double dash (--): Yes
Single hash (#): No
Double forward-slash (//): No
Forward-slash and asterisk (/*): Yes

C sharp (C#)

Double dash (--): No
Single hash (#): No
Double forward-slash (//): Yes
Forward-slash and asterisk (/*): Yes

Fun fact, my most commonly used comment?

To-Do

Build problem and duplicate releases

Yesterday I went to investigate a problem someone had reported, only to discover a butt load of releases.


As you can see, I have more releases than commits. A build is triggered by each commit, but that doesn’t explain why there were so many.

I tried switching it up so it only builds when you put [build] in the commit message – which meant having to commit the YAML file, as the feature isn’t available via the web interface (it does make sense to have this under version control anyway). However, it continued to loop.

Looking at the build history revealed it was looping, per commit.

An email off to support resulted in a speedy response:

Add “skip_tags: true” to your appveyor.yml.
-Feodor Fitsner, AppVeyor

Bingo. It was looping because I create a GitHub release for each build and tag the release – which was then causing a build, which created a release that then got tagged, which created a release…

I’m grateful I’m not paying per build! Hopefully this will serve as a warning for others.

I’ve started to remove the releases, but I hit the API limit last night. Hopefully I’ll clean up my PowerShell script to deal with it; failing that, I may have to rely on the kindness of a GitHub human.

“I have always depended on the kindness of strangers.”

– Blanche DuBois

Update

I’ve managed to remove the duff tags as well, and SIMS Bulk Import is down to two releases on GitHub. I was slightly heavy-handed when it came to deleting. Thankfully all the duff releases were pre-release, so the problem wasn’t as bad as it could have been.

Below is the PowerShell script I quickly knocked together. The tags I did over lunch at work, thus the proxy bit – note that isn’t our actual proxy address 😉
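The script itself was embedded as a gist; in outline it did something like this (the token variable, repo name and proxy address are all illustrative):

# Deletes pre-release GitHub releases and their tags via the REST API.
$headers = @{ Authorization = "token $env:GITHUB_TOKEN" }
$proxy   = "http://proxy.example.local:8080"   # not our real proxy ;)
$repoApi = "https://api.github.com/repos/matt40k/SimsBulkImport"

$releases = Invoke-RestMethod "$repoApi/releases?per_page=100" -Headers $headers -Proxy $proxy
foreach ($release in $releases | Where-Object { $_.prerelease }) {
    # Delete the release first, then the tag that pointed at it
    Invoke-RestMethod -Method Delete $release.url -Headers $headers -Proxy $proxy
    Invoke-RestMethod -Method Delete "$repoApi/git/refs/tags/$($release.tag_name)" -Headers $headers -Proxy $proxy
}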

Showing SIMS Bulk Import some love

So today I managed to move SIMS Bulk Import over to AppVeyor. So what does this mean? Well, it means I don’t have to worry about going over my Azure credit limit each month, for starters! AppVeyor has excellent support for GitHub, so each commit is automatically built, tested, and (when I enable it) a new release created.

The next release will include:

  • Pupil username creation – you can blame\thank Andrew Mulholland for this; I saw his project PiNet and thought we can make this better.
  • PowerShell module – you can blame\thank Ryan Yates for making me want to do this one.
  • Log submission – I’ve finally started pulling this together. I’ve desensitised the log file by basically creating two: one for log submissions and the other for local debugging. The main issue is testing, both the web service and the application.

Automated SQL Server BI deployments with OctopusDeploy

Today, my pull request for the SSAS deployment script was merged into the Octopus Deploy Community Library.

So what is Octopus Deploy and why should I care?

Octopus is a friendly deployment automation tool for .NET developers.

But don’t be fooled. Octopus Deploy isn’t just limited to .NET developers deploying hip new cloud-based applications; it can do much, much more. I currently use Octopus to deploy Jasper reports to our 6 HR environments, and I also use it to deploy our HR interfaces, which are a series of SSIS packages and SQL objects on Microsoft SQL Server 2008. I use DbUp, an open source library, for database deployment, and a custom C# application for SSIS package deployment. Today we surpassed 1,000 report deployments to production, and we’ve also deployed over 270 SSIS package changes in about a year.

So when it came to upgrading our BI platform from SQL 2008 to SQL 2014, one of the key things I wanted was deployment automation. The SQL 2008 platform required manual deployments, which often led to mistakes and ended up writing off the entire day, per deployment. Unfortunately, my current process was pretty basic. Database deployments are idempotent: drop any objects and recreate them, every time. This is fine for interfaces where tables only hold in-transit data, but for BI, the idea of dropping a staging table with 140 million rows that takes over 4 hours to load doesn’t make me want to do any deployments. Luckily, the problem is already solved: SSDT. And there is already a PowerShell step template on the Octopus Deploy Community Library.

Moving to SQL 2014 also allowed me to use the new ISPAC format; again, there is already a PowerShell step template on the Octopus Deploy Community Library. There is even a PowerShell step template for SSRS.

The only thing missing was SSAS. After watching Chris Webb’s video tutorial – Cube Deployment, Processing and Admin on Project Botticelli – I decided I had to use Microsoft.AnalysisServices.Deployment.exe. After a bit of scripting and testing, I managed to write a PowerShell script that updates the XML config files for the deployment – it sets the ProcessingOption to ‘DoNotProcess’ and updates the data source, i.e. where the cube will refresh its data from. The script isn’t perfect. For starters, what if you have more than one data source? Also, what if you’re not using SQL Server 2014? Still, the great thing about open source is that others can update it. Anyone can improve it; it’s not reliant on me having free time. So hopefully by the time we move to SQL 2016, someone will have already updated it to work with SQL 2016.
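A hedged sketch of the core idea – not the exact step template from the Community Library; the folder, model name (MyCube) and SQL 2014 install path are all assumptions:

# Flip the deployment options to deploy-only, then run the deployment
# wizard silently.
$deployDir   = "C:\Deploy\MyCube"
$optionsFile = Join-Path $deployDir "MyCube.deploymentoptions"

$options = [xml](Get-Content $optionsFile)
$options.DeploymentOptions.ProcessingOption = "DoNotProcess"
$options.Save($optionsFile)

# /s runs Microsoft.AnalysisServices.Deployment.exe silently with a log file
& "C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe" `
    (Join-Path $deployDir "MyCube.asdatabase") /s:"$deployDir\deploy.log"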

In a future post I’m going to blog about Octopus Deploy in a bit more detail, and how I’ve set up my SQL Server BI deployment process (in a lot of detail). I’m hoping to try using Microsoft Visual Studio Team Services to build the Octopus packages. Currently I use an on-prem TeamCity, which is excellent, it’s just… I don’t like managing extra servers when I can use SaaS. I like development over administration.

I’ll leave you with a screenshot of my deployment screen in Octopus Deploy – and yes, that’s one button press to get my changes into UAT.

[Screenshot: octopusbi – the Octopus Deploy BI deployment screen]

WIX installer why you no MSBUILD no more?

Last night I chopped up my WiX installer project – I basically split the actual installer parts into one file – CoreMsi.wxs, the versions into a variable file – Version.wxi, and the file list into a separate fragment file – SimsBulkImport.wxs. This worked.

Great, I thought: a simple PowerShell script to generate the file list, and I can even grab the compiled bin from the main repository as an artifact dependency – got to love TeamCity.

Problem was, it failed to build.

error CNDL0104: Not a valid source file; detail: ‘.’, hexadecimal value 0x00, is an invalid character. Line 2, position 1.

A bit of trial and error later, I figured it out. It was my old friend Mr Encoding. After adding

-Encoding "UTF8"

to the end of the Out-File call, it builds again.
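A hedged example of the fix (the variable and file name are illustrative):

# Out-File defaults to UTF-16 ("Unicode") in Windows PowerShell, which
# candle.exe rejects with CNDL0104; forcing UTF8 keeps WiX happy.
$fragment | Out-File "SimsBulkImport.wxs" -Encoding "UTF8"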

PowerShell, DigitalOcean and WordPress

Back in May (2014) I decided to start up a blog, again. I decided to keep it simple and just get an off-the-shelf hosting package rather than going to extremes like I normally do – configuring a VPS or even a dedicated server! I ended up going with CS New Media – just their lite Linux package, which I signed up to for 3 months. I was tempted to stick with what I know and go with Mythic-Beasts like I normally would, but I like to tinker and try new things, so I went with CS New Media after a failed attempt to spend my dosh with NameCheap. When it came to entering my card details, it just failed to connect to “their” backend, and their response was “well, it works for me, try another browser” – by the time I phoned them I had already tried different browsers, internet connections and computers, so at that point I just walked away.

Anyway, fast forward two months and I was getting itchy to move away from CS New Media – nothing wrong with them: a bit of downtime, but nothing major and it was all at stupid-o-clock; performance was OK, nothing special; and support – well, I never used it, so I can’t comment. Using Plesk was a blast from the past – it was the first web control panel I ever used, so it was nice to see how it’s improved.

Problem was, where do I go? Well, Mythic-Beasts had fitted SSDs into their web servers since I last hosted with them – tempting. Then DigitalOcean did it. They announced a new data centre. In London. With IPv6. And they have that sexy API. Ooo, and I have credit with them for some reason or other – happy days. I decided to take it slow and script the whole process. I’ve been playing with PowerShell lately, so I created a PowerShell script that creates the DNS records and builds the droplet. I’ve even managed to get it to convert the PuTTY public SSH key into OpenSSH format so DigitalOcean will accept it and I can use SSH keys to connect to the server. I was planning on scripting the build process of installing the LEMP stack, but I lost interest, to be honest, and enough already exist.
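For flavour, a hedged sketch of the droplet-creation call against the DigitalOcean v2 API (all values are illustrative, and $env:DO_TOKEN is assumed to hold a personal access token):

# Create a small IPv6-enabled droplet in the new London data centre
$headers = @{ Authorization = "Bearer $env:DO_TOKEN"; "Content-Type" = "application/json" }
$body = @{
    name   = "blog01"            # hypothetical droplet name
    region = "lon1"
    size   = "512mb"
    image  = "ubuntu-14-04-x64"
    ipv6   = $true
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "https://api.digitalocean.com/v2/droplets" -Headers $headers -Body $body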

Net result: it’s got my PowerShell knowledge up – all the code’s in my GitHub repository. One of the off-shoot projects was a PowerShell function to generate secure passwords. The idea was that my DigitalOcean PowerShell script would generate passwords for your newly created droplet – the MySQL root password, for example – store them as, say, CSV on your local machine, and only pass the password as an argument to the stack install script over SSH, so it wouldn’t get stored on the remote server’s discs.
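The real function lives in the repository; a minimal sketch of the idea (character set and length are illustrative):

# Builds a random password from printable ASCII (excluding space).
# Get-Random -Count picks without repeats, so keep $Length under 94.
function New-SecurePassword {
    param([int]$Length = 24)
    $chars = 33..126 | ForEach-Object { [char]$_ }
    -join ($chars | Get-Random -Count $Length)
}

New-SecurePassword -Length 32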

The term ‘Invoke-WebRequest’ is not recognized

I’ve currently been working on a PowerShell script to build a virtual server using DigitalOcean’s excellent RESTful API. So far this has gone extremely well, and I have been impressed with both PowerShell and DigitalOcean’s API. Today, however, I tried running it on another machine (an important part of development), only to find it doesn’t work. The first error that appears is:

The term ‘Invoke-WebRequest’ is not recognized as the name of a cmdlet, function, script file, or operable progr
am. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:18
+ Invoke-WebRequest <<<<
+ CategoryInfo          : ObjectNotFound: (Invoke-WebRequest:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException

In simple terms, it’s saying it can’t find the ‘Invoke-WebRequest’ cmdlet. This should be a built-in cmdlet in PowerShell, so off I went to TechNet to find the details of the cmdlet:

http://technet.microsoft.com/en-us/library/hh849901.aspx

The key bit of information was this line:

This cmdlet was introduced in Windows PowerShell 3.0.

Another way to check is at the top of such documents on TechNet: it defaults to the most recent version of (in this case) the cmdlet and, next to it, gives you the option to select an older version. Clicking it reveals the previous versions, and it does indeed confirm the statement above – the cmdlet was not present in any version of PowerShell prior to 3.0.


Next question: is PowerShell 4.0 installed? Well, for Windows 8.1 and Windows Server 2012 R2 it’s built-in; you can check by running (in PowerShell):

$PSVersionTable.PSVersion

Sure enough, the other machine (which is Windows 7 Enterprise x64) returns:

Major  Minor  Build  Revision
-----  -----  -----  --------
2      0      -1     -1

For help installing PowerShell 4.0 go to: http://social.technet.microsoft.com/wiki/contents/articles/21016.how-to-install-windows-powershell-4-0.aspx
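One way to fail fast on older hosts is to declare the requirement at the top of the script; a minimal sketch (the URL is just my own domain):

#requires -Version 3.0
# On a PowerShell 2.0 host the script now stops immediately with a clear
# version error instead of failing at the first Invoke-WebRequest call.
$response = Invoke-WebRequest -Uri "http://matt40k.uk" -UseBasicParsing
$response.StatusCode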