This week I’ve been working on another new cube, which includes a time-spent measure stored in minutes. To pretty it up, it needed to be displayed in days, hours and minutes.
Here is a simplified version of the final output, along with the (hopefully easier to understand) MDX code.
Adding the days option isn’t much more work – it does, however, make the code look a lot more angry. You also have the issue of what a day actually is – is it 24 hours? Is it the working hours, or is it less still: the realistic billable time after you deduct admin time?
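For anyone who wants the gist without the MDX, the underlying arithmetic is just integer division and modulo. Here’s a minimal sketch in Python, assuming an 8-hour working day (the function and its parameters are illustrative, not taken from the cube):

```python
def pretty_minutes(total_minutes, minutes_per_day=8 * 60):
    """Break a raw minute count into days, hours and minutes.

    minutes_per_day is an assumption: here a "day" is an 8-hour
    working day, but it could equally be 24 hours or billable time.
    """
    days, remainder = divmod(total_minutes, minutes_per_day)
    hours, minutes = divmod(remainder, 60)
    return f"{days}d {hours}h {minutes}m"

print(pretty_minutes(1000))  # 1000 minutes at 8h/day -> 2d 0h 40m
```

Changing the definition of a day is then just a matter of changing one parameter, which is exactly why it’s worth pinning down what the business means by "day" before writing the MDX.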
Unfortunately this year I’ve not been able to make any of the pre-cons; however, the (free) community day looks epic – the hardest part is going to be selecting which session to attend, with the final slot being the toughest choice. Aghhh!!! I’m going to have to roll a dice or something!
One session really piqued my inner geek and had me looking at getting a Microsoft Band – only to find out stocks are running low at the moment (the first and second were both released in October, so it looks like the 3 will be out soon).
The (Consumer) Internet of Things on your arm – MS Band & PowerBI connected
The Internet of Things (IoT) is gaining more and more traction – not only on the business side but also on the consumer side. Connected fridges, cars and smart watches – always and everywhere connected! In this session Wolfgang will show you some possibilities of the Microsoft Band 2 SDK: how to connect to the device and read sensor data from it. But what should be done with that data?
Power BI seems to be an ideal candidate for analysing and presenting that kind of data. The different types of real-time analytics (Stream Analytics, Power BI API, …) will be presented and their pros and cons weighed up.
The challenge: let's prepare a real-time dashboard of Band 2 data in Power BI in 60 minutes!
Want to help inspire young people and deliver a Micro:bit workshop?
If you are a teacher, parent or anyone else interested in inspiring the next generation of technical experts, come along to the micro:bit session to learn how you could deliver a micro:bit workshop for primary or secondary school children to code and create.
In this session you will be taken through a workshop setup and approach (an introduction, 3 coding tasks and a quiz), as well as getting hands-on with the BBC micro:bit.
This session requires external registration via http://microbit_opt1.eventbrite.co.uk
Octopus is a friendly deployment automation tool for .NET developers.
But don’t be fooled. Octopus Deploy isn’t just for .NET developers deploying hip new cloud-based applications; it can do much, much more. I currently use Octopus to deploy Jasper reports to our 6 HR environments. I also use it to deploy our HR interfaces, a series of SSIS packages and SQL objects running on Microsoft SQL Server 2008. I use DbUp, an open source library, for database deployment and a custom C# application for SSIS package deployment. Today we surpassed 1,000 report deployments to production, and we’ve also deployed over 270 SSIS package changes in about a year.
So when it came to upgrading our BI platform from SQL 2008 to SQL 2014, one of the key things I wanted was deployment automation. The SQL 2008 platform required manual deployments, which often led to mistakes and ended up writing off an entire day per deployment. Unfortunately, my existing process was pretty basic. Database deployments were idempotent: drop any existing objects and recreate them, every time. This is fine for interfaces, where tables only hold in-transit data, but for BI, the idea of dropping a staging table with 140 million rows that takes over 4 hours to load doesn’t make me want to do any deployments. Luckily, the problem is already solved: SSDT. And there is already a PowerShell step template in the Octopus Deploy Community Library.
The only thing missing was SSAS. After watching Chris Webb’s video tutorial – Cube Deployment, Processing and Admin on Project Botticelli – I decided I had to use Microsoft.AnalysisServices.Deployment.exe. After a bit of scripting and testing, I managed to write a PowerShell script that updates the XML config files for the deployment – it sets the ProcessingOption to ‘DoNotProcess’ and updates the data source that the cube will refresh its data from. The script isn’t perfect. For starters, what if you have more than one data source? And what if you’re not using SQL Server 2014? Still, the great thing about open source is that others can update it. Anyone can improve it; it’s not reliant on me having free time. So hopefully by the time we move to SQL 2016 someone will have already updated it to work with SQL 2016.
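To illustrate the config-rewriting idea (the actual step template is a PowerShell script, and the element names in the wizard-generated files should be checked against your own output – treat them as assumptions), the edit boils down to something like this:

```python
import xml.etree.ElementTree as ET

def set_processing_option(path, value="DoNotProcess"):
    """Rewrite the ProcessingOption element in an SSAS deployment
    options file so the deployment itself doesn't trigger processing.

    The ProcessingOption element name is assumed from the files the
    deployment wizard generates; verify it against your own files.
    """
    tree = ET.parse(path)
    # Match the element with or without an XML namespace prefix.
    for elem in tree.getroot().iter():
        if elem.tag.split("}")[-1] == "ProcessingOption":
            elem.text = value
    tree.write(path)
```

The same find-and-rewrite pattern covers pointing the data source at the right server before each environment’s deployment.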
In a future post I’m going to blog about Octopus Deploy in a bit more detail and how I’ve set up my SQL Server BI deployment process (in a lot of detail). I’m hoping to try using Microsoft Visual Studio Team Services to build the Octopus packages. Currently I use an on-prem TeamCity server, which is excellent, it’s just… I don’t like managing extra servers when I can use SaaS. I prefer development over administration.
I’ll leave you with a screenshot of my deployment screen in Octopus Deploy – and yes, that’s one button press to get my changes into UAT.
One of the problems we have is documentation and how to keep it up-to-date. We tend to deliver a new batch of reports as part of a CR (Change Request), then make changes as part of BAU (Business As Usual) unless it’s a particularly complex or time-consuming change. The problem is that BAU changes often happen rapidly, within a short time frame, and time isn’t always allocated to updating documentation.
Over time a small, simple change can evolve into a major one, and the documentation can drift significantly out of sync. Documentation is an important part of the communication between developer, analyst and end-user; without it, misunderstandings and confusion occur, and errors and mistakes happen.
Whilst investigating how others document their work, I rediscovered the much-unloved extended property MS_Description. As you might have guessed, it’s used to enter, yes, you guessed it, a description of tables, views, stored procedures and other SQL objects. Speaking of naming – it’s also important to name things correctly. A table named dbo.Table1 helps no-one. I tend to prefix any temporary tables with MS_TMP – this quickly identifies me as the owner and flags that they can most likely be destroyed. Dimensions are prefixed with Dim and facts are, yes, you’ve guessed it again, prefixed with Fact.
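For the record, MS_Description is attached with sys.sp_addextendedproperty. As a quick illustration, here’s a small Python helper that generates the T-SQL for a table (the schema and table names below are made up for the example):

```python
def describe_table_sql(schema, table, description):
    """Build the T-SQL that attaches an MS_Description extended
    property to a table via sys.sp_addextendedproperty."""
    return (
        "EXEC sys.sp_addextendedproperty "
        "@name = N'MS_Description', "
        f"@value = N'{description}', "
        "@level0type = N'SCHEMA', "
        f"@level0name = N'{schema}', "
        "@level1type = N'TABLE', "
        f"@level1name = N'{table}';"
    )

# DimDate is a hypothetical example table.
print(describe_table_sql("dbo", "DimDate", "One row per calendar date"))
```

Views and stored procedures work the same way – you just swap the @level1type parameter.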
I have a T-SQL script that pulls this data into a table on the staging database as part of an SSIS package. This is configured as a SQL job, but it is only executed post-deployment – we use TeamCity to build the project and then Octopus Deploy for deploying to the various environments.
I have, however, modified it to follow the standard ETL process: the data is extracted from the DMVs into staging tables and, like the SQL objects, this only happens post-deployment via a SQL job kicked off by an Octopus Deploy task. I didn’t like the idea of dynamically querying the data every time, and as we use continuous integration/deployment I don’t need to worry about the data being out of date.