SSRS support for both US Letter and UK A4

One of those annoying things about the US is that they use different paper sizes to us Brits.

US Letter is 215.9 by 279.4 mm (8.5 by 11.0 inches).
The UK equivalent, A4, is 210 by 297 mm (8.27 by 11.69 inches).

One of our customers is based in the UK but has remote sales offices in the US, so their SSRS reports are set to A4, and when they were printed in the US offices the footer was cut off. The fix? Simple. Change the page height to 279.4 mm, the shorter of the two.

Assuming you have a footer, it will now sit 17.6 mm higher up the A4 page, which hopefully won’t be too terrible. It will now be American-proof.

Date Format

I’ve had a problem recently when it came to formatting a datetime as just a date: FormatDateTime didn’t work.

The fix was to change it to Format.

Oddly, despite the parameter being a datetime, I found I still had to cast it as a date.

FormatDateTime was introduced in Visual Studio 2008. I’m not 100% sure why this didn’t work.
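For reference, the working expression looked roughly like this – Parameters!DateIn is a made-up parameter name for illustration:

```vb
' FormatDateTime refused to play ball:
' =FormatDateTime(Parameters!DateIn.Value, DateFormat.ShortDate)

' Casting to Date first and using Format worked:
=Format(CDate(Parameters!DateIn.Value), "dd/MM/yyyy")
```

The second argument to Format is a standard .NET format string, so you can use whatever date pattern your report needs.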

TextBoxImpl

Another error I hit:

Warning 1
[rsRuntimeErrorInExpression]
The Value expression for the textrun ‘Textbox28.Paragraphs[0].TextRuns[0]’ contains an error: Overload resolution failed because no Public ‘/’ can be called with these arguments:
‘Public Shared Operator /(d1 As Decimal, d2 As Decimal) As Decimal’:
Argument matching parameter ‘d2’ cannot convert from ‘TextBoxImpl’ to ‘Decimal’.
C:\Projects\Reports\1. Report.rdl

Thankfully Google found Qiuyun’s answer. I was missing the .Value at the end.

So (wrong)

=ReportItems!Textbox1

Fixed (working)

=ReportItems!Textbox1.Value

SSRS copy and paste fail

In case you haven’t heard, I’ve started a new job, and one of my first tasks was to speed up an SSRS report. One of the first issues I stumbled across was this:

An error occurred during local report processing.

The definition of the report ‘/1. Report’ is invalid.

The Value expression for the textrun ‘Tx412.Paragraphs[0].TextRuns[0]’ contains an error: [BC30456] ‘RdlObjectModel’ is not a member of ‘ReportingServices’

Looking into the issue, it appears the expressions were expanded when they got copied – basically, each function call was made fully qualified.

So

CInt

becomes:

Microsoft.ReportingServices.RdlObjectModel.ExpressionParser.VBFunctions.CInt

The fix was to just edit the code and do a find and replace, replacing the added reference with nothing. Simple.
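If it’s crept into more than one report, a rough PowerShell sketch like this can strip the qualifier from every .rdl in one go (the folder path is made up – point it at your own reports):

```powershell
# Strip the fully qualified prefix that the copy-paste added back out of every report
Get-ChildItem -Path .\Reports -Filter *.rdl -Recurse | ForEach-Object {
    (Get-Content $_.FullName -Raw) `
        -replace 'Microsoft\.ReportingServices\.RdlObjectModel\.ExpressionParser\.VBFunctions\.', '' |
        Set-Content $_.FullName
}
```

Worth running on a source-controlled copy first so you can diff the result before trusting it.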

SSRS 2016 by default

For the past 4 months I’ve been using Visual Studio 2015 rather than Visual Studio 2013, and yesterday I hit a bug with SSRS (shows how much I love SSRS).

The bug appeared when it came to deploying an SSRS report. The error was:

Exception calling “CreateCatalogItem” with “7” argument(s): “The definition of this report is not valid or supported
by this version of Reporting Services. The report definition may have been created with a later version of Reporting
Services, or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: The
report definition has an invalid target namespace
‘http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition’ which cannot be upgraded.

The report that errored on deployment was the one I had just changed. Looking at the code diff in Visual Studio Team Services, you can see it’s added in some new 2016 goodness. I checked and double-checked my settings: Visual Studio 2015 wants to give you a 2016 file. If you manually deploy from within Visual Studio it works. Checking out the bin\debug folder reveals that it’s using the 2016 version to build a legacy version, which makes sense, as the 2016 format is the old one (2008R2) with extra goodness slapped in – like being able to define parameter location.

So, we now have to build our SSRS reports. To do so, I’ve created a PowerShell script that builds them using devenv.exe with the parameters for a silent build. This worked great, only it took over 50 mins to run. The main lag was that every time it loaded the solution it builds\checks\something with the database projects, and it takes an age. The fix was to create a dedicated report solution file (Report.sln). I did this with a PowerShell script that copies the main solution file – which includes all the SSDT (database) projects, Integration Services (SSIS) packages and analysis models as well as the SSRS reports – and then removes every project except the SSRS ones. I did it via PowerShell simply because I don’t like extra manual steps: otherwise, whenever you add a new SSRS project you’d have to remember to add it to Report.sln, which you’d probably forget, and then waste time debugging a silly mistake.

Creating the Report.sln reduced the build time to just over 2 mins (from over 50 mins!), so it was worth doing.
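The build step itself boils down to something like the sketch below – the devenv path and configuration name are assumptions for illustration, and devenv.com (the console variant) is used so the build output can be captured:

```powershell
# Hypothetical path - adjust for your Visual Studio version and install location
$devenv = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\devenv.com"

# Silent build of the report-only solution
& $devenv "Report.sln" /Build "Release"
if ($LASTEXITCODE -ne 0) { throw "SSRS build failed - check the devenv output" }
```

The built .rdl files then land in each project’s bin\Release folder, ready to be packaged for deployment.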

Another year, another SQL Saturday

Unfortunately this year I’ve not been able to make any of the pre-cons; however, the (free) community day looks epic – the hardest part is going to be selecting which session to attend, with the final slot looking the worst. Aghhh!!! I’m going to have to roll a die or something!

CDC – Change Data Capture in Detail
Uwe Ricken

What’s new in the SQL Server 2016 for a BI Professional
Prathy Kamasani

Exploring the Azure SQL Database
Murilo Miranda

All things time-related
Kennie Pontoppidan

Data-driven personal decisions with PowerBI and Machine Learning
Ruben Pertusa Lopez

One session really piqued my inner geek and had me looking at getting a Microsoft Band – only to find out they are running the stocks low at the moment (they released the first and second ones in October, so it looks like the 3 will be out soon).

The (Consumer) Internet of Things on your arm – MS Band & PowerBI connected

The Internet of Things (IOT) gets more and more attraction - not only on the business but also on the customer side. Connected fridges, cars and smart watches - always and everywhere connected! In this session Wolfgang will show you some possibilities of the Microsoft Band 2 SDK: how-to connect and read sensor data out of this device. But what should be done with that data? 
Power BI seems to be an ideal candidate for analyzing and presenting those kind of data. The different types of real-time analytics (Stream Analytics, Power BI API, ..) will be presented and their pros and cons will be envisioned. 
The challenge: Let's prepare a real-time dashboard of Band2 data in Power BI in 60 minutes!

Wolfgang Strasser

It’s also good to see a small local company getting heavily involved :p

And not forgetting the Micro:bit sessions:

Imagine If – Train the Trainer Micro:bit session

Want to help inspire young people and deliver a Micro:bit workshop?

If you are a teacher, parent or anyone interested in inspiring the next generation of technical experts; then come along to the Micro:bit session to get you started on how you could deliver a micro:bit workshop for either Primary or Secondary school children to code and create. 

In this session you will be taken through a workshop setup and approach (introduction, 3 coding tasks and a quiz) as well as getting hands on with the BBC micro:bit

This session requires external registration via http://microbit_opt1.eventbrite.co.uk

Amy Nicholson

MDSCHEMA_CUBES DMV not returning all cubes

I had previously created an SSIS package with a simple Process Full on the SSAS multidimensional database; however, as the project has progressed this hasn’t been ideal – it’s basically all or nothing. In order to reduce the damage a failure can cause, I’ve set up a Process Full for each dimension and cube, so each is processed independently of the others. I’ve used data from the Analysis Services Dynamic Management Views (DMVs for short), which I’d already used for the documentation, to feed the foreach loop.

However, there is a major problem with the DMVs: they only list processed cubes (and their dimensions). So how do you get a list of unprocessed cubes (and their dimensions)?

The answer? Use Analysis Management Objects (AMO). This returns the same list as SQL Server Management Studio (SSMS), rather than just the processed ones, like the DMV or Excel lists do. I’m currently trying out a few ideas to find the best solution – such as a script component (source) or a stored procedure. As the saying goes, it’s production ready, but it’s not GitHub ready – not yet anyway.
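As a rough PowerShell sketch of the AMO approach (the server and database names are placeholders), connecting via AMO lists every cube and dimension regardless of its processed state:

```powershell
# Load the AMO assembly (installed with the SQL Server client tools)
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("Data Source=localhost")

$db = $server.Databases.FindByName("MyOlapDb")

# Unlike the DMVs, this lists everything - processed or not - with its current state
foreach ($cube in $db.Cubes)      { "{0} ({1})" -f $cube.Name, $cube.State }
foreach ($dim  in $db.Dimensions) { "{0} ({1})" -f $dim.Name,  $dim.State }

$server.Disconnect()
```

The State property tells you whether each object is Processed, Unprocessed or PartiallyProcessed, which is exactly the information the DMVs can’t give you.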

Thanks again to Chris Webb for pointing me in the right direction.


Automated SQL Server BI deployments with OctopusDeploy

Today, my pull request for the SSAS deployment script was merged into the Octopus Deploy Community Library.

So what is Octopus Deploy and why should I care?

Octopus is a friendly deployment automation tool for .NET developers.

But don’t be fooled: Octopus Deploy isn’t just for .NET developers deploying hip new cloud-based applications; it can do much, much more. I currently use Octopus to deploy Jasper reports to our 6 HR environments, and I also use it to deploy our HR interfaces, which are a series of SSIS packages and SQL objects on Microsoft SQL Server 2008. I use DbUp, an open source library, for database deployment and a custom C# application for SSIS package deployment. Today we surpassed 1,000 report deployments to production, and we’ve also deployed over 270 SSIS package changes in about a year.

So when it came to upgrading our BI platform from SQL 2008 to SQL 2014, one of the key things I wanted was deployment automation. The SQL 2008 platform required manual deployments, which often led to mistakes and ended up writing the entire day off, per deployment. Unfortunately, my existing process was pretty basic. Database deployments were idempotent: drop any objects and recreate them, every time. This is fine for interfaces where tables only hold in-transit data, but for BI, the idea of dropping a staging table with 140 million rows that takes over 4 hours to load doesn’t make me want to do any deployments. Luckily, the problem is already solved: SSDT. And there is already a PowerShell step template on the Octopus Deploy Community Library.

Moving to SQL 2014 also allowed me to use the new ISPAC format; again, there is already a PowerShell step template on the Octopus Deploy Community Library. There is even a PowerShell step template for SSRS.

The only thing missing was SSAS. After watching Chris Webb’s video tutorial – Cube Deployment, Processing and Admin on Project Botticelli – I decided to use Microsoft.AnalysisServices.Deployment.exe. After a bit of scripting and testing, I managed to write a PowerShell script that updates the XML config files for the deployment: it sets the ProcessingOption to ‘DoNotProcess’ and updates the data source – where the cube will refresh its data from. The script isn’t perfect. For starters, what if you have more than one data source? And what if you’re not using SQL Server 2014? Still, the great thing about open source is that others can improve it; it’s not reliant on me having free time. So hopefully by the time we move to SQL 2016 someone will have already updated it to work with SQL 2016.
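The gist of the script is something like the following sketch – the file names, XML element paths and connection string are illustrative and should be checked against the files your own SSAS build produces alongside the .asdatabase:

```powershell
# Stop the deployment from processing the cube - that's scheduled separately
$options = [xml](Get-Content "MyCube.deploymentoptions")
$options.DeploymentOptions.ProcessingOption = "DoNotProcess"
$options.Save("$PWD\MyCube.deploymentoptions")

# Point the (first and, in my case, only) data source at the target environment
$config = [xml](Get-Content "MyCube.configsettings")
$config.ConfigurationSettings.Database.DataSources.DataSource.ConnectionString =
    "Provider=SQLNCLI11;Data Source=TargetServer;Integrated Security=SSPI;Initial Catalog=DW"
$config.Save("$PWD\MyCube.configsettings")

# /s runs the deployment wizard silently, writing its output to a log file
& "Microsoft.AnalysisServices.Deployment.exe" "MyCube.asdatabase" /s:"deploy.log"
```

The single hard-coded data source is exactly the limitation mentioned above – handling multiple data sources would mean looping over the DataSource elements instead.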

In a future post I’m going to blog about Octopus Deploy in a bit more detail and how I’ve set up my SQL Server BI deployment process (in a lot of detail). I’m hoping to try using Microsoft Visual Studio Team Services to build the Octopus packages. Currently I use an on-prem TeamCity server, which is excellent, it’s just… I don’t like managing extra servers when I can use SaaS. I prefer development over administration.

I’ll leave you with a screenshot of my deployment screen in Octopus Deploy – and yes, that’s one button press to get my changes into UAT.

[Screenshot: the Octopus Deploy BI deployment screen]

Change text direction – Visual Studio 2013 / SQL Server 2014

I’ve spent a while today trying to change the text direction (orientation) of a header on an SSRS report. The option is rather oddly placed in Visual Studio: within the properties (for the text), under Localization > Writing Mode, change from Default to either Rotate270 or Vertical. I was hoping it would be within the Font options. Guess that would be too easy 😉