Pushing data thru Azure Stream Analytics to Power BI – part 3

After the first two parts (1, 2) it is time to look at the code that pushes data to the event hub, processes the data from the event hub and pushes it to the Power BI API. My initial idea was to push Microsoft Band sensor data to Power BI, so for that purpose I use a HeartBeat class as the records to be pushed.

Basically we need to code the following steps: sending data to the event hub, reading data from the event hub and pushing the data to the Power BI API.

Sending data to event hub

Before we can send data to an Azure event hub we first need the connection information. In the Event Hub there is an option at the bottom, Connection Information, to retrieve it. This pops up a screen with the connection strings for the different defined policies.

image

To access the event hub in C# we need to create an EventHubClient with the connection string and event hub name as parameters. After initializing the object we can call the SendAsync method to send an event to the event hub. In this case we create a random heartbeat and serialize an initialized HeartBeat class to JSON, as that is the input and output format of the configured Stream Analytics job. To start sending events we can define a Task and call that task via the Wait() method.

static async Task SendingRandomMessages()
{
  var eventHubClient = EventHubClient.CreateFromConnectionString(eventHubConnectionString, eventHubName);
  while (true)
  {
    try
    {
      List<HeartBeat> heartbeats = new List<HeartBeat>
      {
        new HeartBeat{EventDate = DateTime.Now, HeartBeatValue = new Random().Next(80, 150)},
      };
      string message = heartbeats.ToJson(JavaScriptConverter<HeartBeat>.GetSerializer());

      await eventHubClient.SendAsync(new EventData(Encoding.UTF8.GetBytes(message)));

      Console.WriteLine("{0} > Sending message: {1}", DateTime.Now.ToString(), message);
    }
    catch (Exception exception)
    {
      Console.ForegroundColor = ConsoleColor.Red;
      Console.WriteLine("{0} &gt; Exception: {1}",
      DateTime.Now.ToString(), exception.Message);
      Console.ResetColor();
    }
    await Task.Delay(200);
  }
}
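
The sending loop above uses an eventHubConnectionString and eventHubName; a minimal sketch of how they could be defined and how the task could be started from a console app (the values are placeholders for the information retrieved from the portal):

static string eventHubConnectionString = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>";
static string eventHubName = "<sender event hub name>";

static void Main(string[] args)
{
  // Start the sending loop and block the console app on it.
  SendingRandomMessages().Wait();
}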

Reading data from event hub

First we need some basic configuration settings. Besides the event hub policy, retrieved the same way as for the sending event hub, we need the name and access key of the corresponding storage account to create a storage connection string.

With that information we can register an EventProcessorHost and start the receiver by calling the RegisterEventProcessorAsync method with an IEventProcessor implementation as type parameter.
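
Before that registration call, the EventProcessorHost could be constructed roughly as follows; this is a minimal sketch with placeholder names, using the same constructor that is used later on with the factory:

string storageConnectionString = string.Format(
  "DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}",
  "<storage account name>", "<storage account key>");

string eventProcessorHostName = Guid.NewGuid().ToString();
EventProcessorHost host = new EventProcessorHost(
  eventProcessorHostName,
  "<receiver event hub name>",
  EventHubConsumerGroup.DefaultGroupName,
  "<receiver event hub connection string>",
  storageConnectionString);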

host.RegisterEventProcessorAsync<PowerBIEventProcessing>().Wait();

To read the events from the event hub, the IEventProcessor class needs to implement the IEventProcessor.ProcessEventsAsync method to process the received messages.

async Task IEventProcessor.ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
{
  foreach (EventData eventData in messages)
  {
    //Process Received Message
  }
  if (this.checkpointStopWatch.Elapsed > TimeSpan.FromSeconds(10))
  {
    await context.CheckpointAsync();
    this.checkpointStopWatch.Restart();
  }
}
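
The snippet above relies on a checkpointStopWatch field. A minimal sketch of the other IEventProcessor members that could initialize it (my own illustration, not the code from the attached sample):

// Stopwatch comes from System.Diagnostics.
private Stopwatch checkpointStopWatch;

Task IEventProcessor.OpenAsync(PartitionContext context)
{
  // Start the stopwatch used above to checkpoint at most once every 10 seconds.
  this.checkpointStopWatch = new Stopwatch();
  this.checkpointStopWatch.Start();
  return Task.FromResult<object>(null);
}

async Task IEventProcessor.CloseAsync(PartitionContext context, CloseReason reason)
{
  // Write a final checkpoint when the processor shuts down gracefully.
  if (reason == CloseReason.Shutdown)
  {
    await context.CheckpointAsync();
  }
}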

Pushing data to Power BI API

I started by downloading the Power BI API sample for .NET from here. This gives a basic idea of how to communicate with the API. If you have not played with the API yet, please take a look at the sample and the way the API works, especially how the authorization and authentication are handled (https://msdn.microsoft.com/library/dn877544).

The biggest problem with the sample is that generic code and application-specific code are mixed. For the sample itself that is not a problem, but to reuse the code in my application I had to rewrite a big part of it. I converted the class with static methods into a generic class whose methods contain no application-specific logic, like table structures; all that information has to be set when the class is initialized. I also changed how the AccessToken is used: in my code the token can be retrieved after the class is initialized and be set again when it is reinitialized. This last part is important, because when we use it in combination with an Event Hub, multiple instances are created due to the number of partitions, and eliminating the need to reauthorize the application over and over again helps.
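
To give an idea of the shape of the refactored class, here is a rough outline; the class and member names are my own illustration and the actual implementation is in the attached code at the end of this post:

public class PowerBIHelper
{
  private readonly string clientID;
  private readonly string dataSetName;
  private readonly string tableName;
  private readonly string jsonTableSchema;
  private string dataSetID;

  // The token can be read after the first sign-in and handed to new instances,
  // so they do not have to reauthorize the application again.
  public string AccessToken { get; set; }

  public PowerBIHelper(string clientID, string dataSetName, string tableName,
    string jsonTableSchema, string accessToken = null, string dataSetID = null)
  {
    this.clientID = clientID;
    this.dataSetName = dataSetName;
    this.tableName = tableName;
    this.jsonTableSchema = jsonTableSchema;
    this.AccessToken = accessToken;
    this.dataSetID = dataSetID;
  }

  // Generic calls against the Power BI API would live here, for example:
  // public Task EnsureDataSetAsync() { ... }
  // public Task AddRowsAsync(string jsonRows) { ... }
}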

Additional changes to reading events

After changing the code to be usable in combination with an Event Hub, the event reading code needs to change as well, so that the PowerBIHelper class can be initialized with the correct AccessToken and an extra authorization is avoided. The problem with the previous way of initializing the event hub processing class is that it is instantiated under the hood when it is registered.

If we want to instantiate the helper class ourselves we need to use an IEventProcessorFactory, which instantiates the IEventProcessor classes that process the events. After initializing the factory with the preset parameters needed for the Power BI helper class, we can register the factory with an EventProcessorHost instance. When the factory is registered, CreateEventProcessor is called once per partition and returns a custom initialized IEventProcessor class.

host = new EventProcessorHost(eventProcessorHostName, EventHubName, EventHubConsumerGroup.DefaultGroupName, EventHubConnectionString, storageConnectionString);
PowerBIEventProcessingFactory factory = new PowerBIEventProcessingFactory(ClientID, DataSetName, TableName, JsonTableSchema, Token, DataSetID);
await host.RegisterEventProcessorFactoryAsync(factory);

// Inside the PowerBIEventProcessingFactory class (which implements IEventProcessorFactory):
IEventProcessor IEventProcessorFactory.CreateEventProcessor(PartitionContext context)
{
  return new PowerBIEventProcessing(clientID, dataSetName, tableName, jsonTableSchema, token, dataSetID);
}

After registering the event processing part, the actual processing of the events writes the collected JSON to Power BI. The JSON returned by the event hub is the original HeartBeat class information (because Stream Analytics doesn't alter the column structure) wrapped in an event hub envelope. We need to deserialize that JSON to a custom object to get to the included HeartBeat records.

public class EventMessage
{
  public List<HeartBeat> rows { get; set; }
  public DateTime EventProcessedUtcTime { get; set; }
  public int PartitionId { get; set; }
  public DateTime EventEnqueuedUtcTime { get; set; }
}
List<EventMessage> eventMessages = JsonConvert.DeserializeObject<List<EventMessage>>(Encoding.UTF8.GetString(eventData.GetBytes()));

In the snippet above, eventData is one of the messages returned by the event hub. After we have deserialized it to a known object we can read all the rows and store them in Power BI via the API.
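
As an illustration, pushing the deserialized rows could then look roughly like this; powerBIHelper and AddRowsAsync are hypothetical names for the refactored helper and its row-push method, not the exact code from the attachment:

foreach (EventMessage eventMessage in eventMessages)
{
  // Re-serialize only the HeartBeat rows and push them to the Power BI table.
  string jsonRows = JsonConvert.SerializeObject(eventMessage.rows);
  await powerBIHelper.AddRowsAsync(jsonRows);
}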

See the attached code for more details: the code.

Next Steps

  1. Introduction: https://azurebi.wordpress.com/2015/03/30/pushing-data-thru-azure-stream-analytics-to-power-bi-part-1/
  2. Configuring Azure services: https://azurebi.wordpress.com/2015/04/02/pushing-data-thru-azure-stream-analytics-to-power-bi-part-2/
  3. Sending, reading and processing the events: https://azurebi.wordpress.com/2015/04/14/pushing-data-thru-azure-stream-analytics-to-power-bi-part-3

-JP


Pushing data thru Stream Analytics to Power BI – part 2

In part 1 I provided an overview of the solution. Now it is time to configure the Azure Service Bus, Event Hubs and Stream Analytics to be able to send and receive messages. I am still using the 'old' Azure Management Portal, as I cannot work with the new portal: I always get lost in it.

The steps that need to be done:

Configure Azure Service Bus

Create a new namespace

To use the Azure Service Bus and its Event Hubs in combination with Stream Analytics we first need to create a new namespace by clicking Create a new namespace on the Service Bus page in the Azure Portal. One important thing is that it needs to be on the Standard messaging tier, because we need to define a separate Consumer Group: Stream Analytics cannot connect to the $Default group.

image

Create a sender event hub

After we have created the namespace, the Event Hubs can be created. This can easily be done using the standard way of creating objects in the Azure Management Portal.

image

By default an event hub has 16 partitions and a retention of 1 day. There is no need to change this in this case, but in a real life scenario these values can be altered to satisfy the business need.

Define access policy

To access the event hub externally, access policies need to be created. The direction of the policies is based on the programs/apps that use the event hub. By clicking the arrow left of the event hub name we can configure the event hub. In the Configure tab we need to add a policy and set the permission to Send.

image

Create a custom consumer group

To use the sender event hub (PowerBIEventHub) with Stream Analytics we need to create a new Consumer Group. This can be done via the Consumer Groups tab next to the Configure tab.

image

Create a receiver event hub

The receiver event hub (PowerBIReceiver) can be created the same way as the sender event hub. Only this one doesn't need a custom consumer group, and the permission of the access policy needs to be set to Listen.

image

Configure Azure Stream Analytics

Now that the event hubs are set, Stream Analytics can be created and configured. This is pretty simple. Stream Analytics needs a storage account, but this can be created while creating the Stream Analytics job.

image

Stream Analytics is basically a pipeline with an input and an output; in between, a query defines which records from the input flow to the output. In our case we link the event hubs as input and output of this stream via the Input and Output tabs.

image

Connect the sender event hub as input

In the Input tab we can easily add an input and a wizard pops up. In the first step choose Data Stream.

image

In step 2 choose Event Hub, as we are connecting our event hub to Stream Analytics.

image

In step 3 we need to provide all the input event hub information. In this step we need the consumer group we created before; the wizard will not accept the $Default group, as special characters are not accepted. The Input Alias is needed when defining the query; it is the 'table' the stream query selects information from.

image

And in the last step (4) choose JSON and UTF8, as that is the format in which the information is sent to the sender event hub.

image

Connect the receiver event hub as output

Adding an output is almost the same as adding an input: after clicking the add output option a wizard pops up. In the first step we select Event Hub, as this is the only option we can easily use to process the data directly. (Note: in the UserVoice suggestions there is already a request to add an API output, with status 'Under Review'…)

image

And in the final step, number 2, we provide the receiver event hub information. There is no need to configure the advanced settings in this case.

image

Define the event processing query

Now that we have an input and an output, we need to define a query that retrieves the input information and adds it to the output. This is done in the Query tab. In my case I use a very simple query that just passes all the input unfiltered to the output (something like SELECT * INTO [OutputAlias] FROM [InputAlias], with the aliases as placeholders). The query has a SQL-like syntax and uses two 'tables', namely the input and output aliases provided in the wizards.

image

After we have created the query we can test it by uploading a JSON file and looking at the output. If everything is set up correctly we can start the stream by clicking the play button, and the Azure services are set up, working correctly and ready to be used by the sending and processing apps.

Next Steps

  1. Introduction: https://azurebi.wordpress.com/2015/03/30/pushing-data-thru-azure-stream-analytics-to-power-bi-part-1/
  2. Configuring Azure services: https://azurebi.wordpress.com/2015/04/02/pushing-data-thru-azure-stream-analytics-to-power-bi-part-2/
  3. Sending, reading and processing the events: https://azurebi.wordpress.com/2015/04/14/pushing-data-thru-azure-stream-analytics-to-power-bi-part-3

-JP


Pushing data thru Azure Stream Analytics to Power BI – part 1

Since the beginning of this year a new version of Power BI has been released in preview, and since mid-March this version is available worldwide. One new feature of this preview is that there is now an API to push data programmatically to Power BI. While pushing this data, the dashboards based on the dataset are updated in real time, which gives a ton of opportunities.

Azure Stream Analytics

One of the advantages of using Stream Analytics between the sender and the Power BI API is that the throughput of the API is limited, and the effective throughput can be increased by using Stream Analytics. Another advantage is that Stream Analytics can be used to filter the events, and we can use multiple sender apps that send events to the event hub and are processed by a single processor app; both reduce the number of API calls.

With the current version of Stream Analytics I have to use a processor app to consume the output of the stream and call the Power BI API, because the only usable output is via an event hub. Maybe a future release will offer an option to send the output directly to an API. In my case I will use the processor app to do some extra aggregation to reduce the number of calls to the Power BI API.

App Concept

During the last SQLBits, Kasper de Jonge (PM of Power BI) showed the possibility of using Azure Stream Analytics. The developer site of Power BI also mentions this possibility, but as 'coming soon'. I couldn't wait, tried it myself and got it working. In this series of blog posts I will provide a guide to achieving this yourself.

PowerBIStreamAnalyticsOverView

The idea is quite simple. There are two apps, a sender (Power BI App) and a processor (Power BI Process App), and a Stream Analytics pipeline connected to Azure Event Hubs as input and output. The last step is that the processor app feeds the Power BI dashboard via the API.

What do we need?

First we need an active Azure subscription and access to the Stream Analytics preview; you can apply via here. We also need an active Power BI subscription which is linked to the Azure Active Directory of your subscription. See this link for more information.

Next Steps

  1. Introduction: https://azurebi.wordpress.com/2015/03/30/pushing-data-thru-azure-stream-analytics-to-power-bi-part-1/
  2. Configuring Azure services: https://azurebi.wordpress.com/2015/04/02/pushing-data-thru-azure-stream-analytics-to-power-bi-part-2/
  3. Sending, reading and processing the events: https://azurebi.wordpress.com/2015/04/14/pushing-data-thru-azure-stream-analytics-to-power-bi-part-3

-JP


Multiple hierarchies and xTDs

Last week I was challenged by a co-worker: 'create a way that a tabular cube can serve a YTD value for different time hierarchies.' Because I work at a retailer at this moment, there is a fiscal calendar and a 454 calendar, which of course don't align in their begin and end dates.

While I was writing this post, Marco Russo wrote a great post on how to create different xTD values. Please read it: http://www.sqlbi.com/articles/week-based-time-intelligence-in-dax/ (Note to myself: do not postpone the writing of your posts)

This has one advantage for me: I don't have to explain the logic behind xTDs that are not based on a standard calendar. Lucky me :-).

The only challenge left is the one measure with different hierarchies, but that can easily be solved by using two Date tables in PowerPivot: one table containing the standard calendar and one containing the year-week data. Both get a hierarchy and are linked to the same fact table using the same date key.

Diagram view

In the fact table two measures are defined, YTDMonth and YTDWeek, each using a different date table. They are combined in a measure with an IF() statement that shows one of the two YTD values.

YTD:=IF(HASONEVALUE('Calendar Week'[Year]), [YTDWeek],[YTDMonth])

Basically it is the same situation as with Analysis Services 2000: one hierarchy per dimension. And as long as DAX is not aware of the hierarchy that is selected, or of scoping, this design paradigm should be used to solve this problem. Or stick with a value per xTD.

-JP

Test file: Multiple hierarchies with one YTD measure


Adding #Trendlines to #PowerPivot for #PowerView

It started during a presentation by Cathy Dumas at the latest SQLBits, where she compared the charting capabilities of Excel, Power View, PerformancePoint and Reporting Services. One of those capabilities was trendlines, which are missing in V1 of Power View. She suggested adding a trendline as a measure, but didn't succeed, so she raised a challenge. And here I am, taking up that challenge to add a trendline to a PowerPivot model to be used in Power View.

Trendline theory

The theory behind a linear trendline is minimizing the least squares of the given set. The Wikipedia page gives a lot of theory, and I must admit that the Dutch version has, besides the theory, also a simple way of solving the equations.

Given a trendline with the following equation:

y = a*x + b

Then a and b can be solved by using the following equations:

a = (n*E(x*y) - E(x)*E(y)) / (n*E(x2) - E(x)*E(x))
b = (1/n)*E(y) - (1/n)*a*E(x)

Example

For the following dataset:

x y
1 10
2 1
3 2
4 7

The following base values can be calculated and are input for a and b:

n 4
E(x) 10
E(y) 20
E(x*y) 46
E(x2) 30

So a and b are now:

a = –0.8
b = 7

And the trendline is now y = –0.8x + 7. With this formula it is possible to calculate a trendline value for each x of the dataset, which gives a nice trendline as an extra line chart.

Plotting this trendline together with the original data and the built-in trendline in Excel, the two lines overlap each other:

image

PowerPivot

To convert this theory to a real-life scenario I use the AdventureWorks database with the FactResellerSales fact table, the DimProduct tables and the DimDate table. With this dataset I ran into some challenges. The first challenge is the series – I use a combination of years and quarters, like 20051, 20052, … – which is not linear and continuous. Also, the data is not at the lowest grain of those series values.

The first step is to convert the series into a series usable for minimizing the least squares. The next step is adding all the needed measures to the FactResellerSales table to calculate the correct a and b.

Converting the Series

I created a nice long date series by adding a calculated column YearQuarter to DimDate:

=[CalendarYear] & [CalendarQuarter]

Next I added this calculated column to the FactResellerSales table by using a calculated column RelYearQuarter:

=VALUE(RELATED(DimDate[YearQuarter]))

The problem is that this series is not nicely formed as needed for a correct calculation, so a conversion is needed. The current version of PowerPivot has the RANKX function, which can be used to rank our series. The default order of RANKX is descending, but an ascending order is needed. Also, we need dense ties to get a continuous series. This gives the following calculation for the column RankYQ:

=RANKX(ALL(FactResellerSales), [RelYearQuarter], [RelYearQuarter], 1, Dense)

Calculating the Base Values

Most of these values need to be calculated against the full selected set. Therefore most of the calculations include ALLSELECTED(FactResellerSales) to make sure that the measure is calculated against the dataset selected when creating the PivotTable.

n : NrSeries

To calculate the number of series values, a maximum of the new RankYQ column should be enough. The problem is that RankYQ is calculated against the complete dataset, ignoring all PivotTable filters. So when the first series value is not selected in the PivotTable, RankYQ remains the same and is not recalculated. One solution is to convert the column to a measure, but this causes a huge performance impact if a maximum is calculated. The best way is to correct the maximum of RankYQ by subtracting the minimum RankYQ value and adding 1 for the right offset. In this case the total number of series values – NrSeries – gives the correct value for a contiguous selection of the series:

=MAXX(ALLSELECTED(FactResellerSales), [RankYQ])-MINX(ALLSELECTED(FactResellerSales), [RankYQ])+1

E(x) : SumSeries

To calculate the total sum of the series values, the best way is to revert to basic mathematics. What we need for our sum of the series, for a given n, is:

E(x) = 1 + 2 + … + n

Which can be calculated for that same n as:

E(x) = 0.5 * n * (n + 1)

So the measure SumSeries can be calculated as follows

=0.5*[NrSeries]*([NrSeries]+1)

E(x2) : SumSeries2

With the same mathematical knowledge we can solve the sum of our series values squared:

E(x2) = 1² + 2² + … + n² = (1/6) * n * (n + 1) * (2*n + 1)

The measure SumSeries2 can be calculated as follows

=(1/6)*[NrSeries]*([NrSeries]+1)*((2*[NrSeries])+1)

E(y) : SumAmount

This is the easiest calculation we need, as it is the standard way to calculate the total of the selected values, here named SumAmount:

=SUMX(ALLSELECTED(FactResellerSales), [SalesAmount])

E(x*y) : SumRankSales

To calculate this measure the same trick is needed as with the NrSeries measure: the selection of the series may not start at the first value, so it is not possible to simply multiply the RankYQ value by the SalesAmount. For this value the minimal selected RankYQ value also needs to be subtracted from the current RankYQ value and corrected with 1. So the SumRankSales measure is calculated as follows:

=SUMX(ALLSELECTED(FactResellerSales), ([RankYQ]-MINX(ALLSELECTED(FactResellerSales), [RankYQ])+1)*[SalesAmount])

a and b

With all the base values in place a and b can be calculated as follows. In the calculation of a there is a check for division by zero (0).

=IF((([NrSeries]*[SumSeries2])-([SumSeries]*[SumSeries]))=0, BLANK(), (([NrSeries]*[SumRankSales])-([SumSeries]*[SumAmount]))/(([NrSeries]*[SumSeries2])-([SumSeries]*[SumSeries])))

=((1/[NrSeries])*[SumAmount])-((1/[NrSeries])*[a]*[SumSeries])

SalesAmount Trend

With a and b now correctly calculated we can define a measure – SalesAmount Trend – that gives the trendline usable in Power View. An IF statement is included to make sure the trendline is only calculated when an original SalesAmount value is available. The current series value is calculated using the current RankYQ value minus the minimal selected RankYQ value (plus a correction of 1 to complete the calculation).

=IF(ISBLANK([Sum of SalesAmount]), BLANK(), [a]*(MAX([RankYQ])-MINX(ALLSELECTED(FactResellerSales), [RankYQ])+1)+[b])

This trend value can now be used to validate with Excel if it shows the correct trendline.

image

Limitations

Of course there are some limitations to this approach, which wouldn't be there if Power View supported trendlines natively:

  • The series values need to form a contiguous series, so no gaps. In this case the trendline will be off if 2006 is deselected while 2005 and 2007 are selected.
  • The trendline is only valid for the given series and values.
  • Only a linear trendline is calculated at this moment. For exponential and logarithmic trendlines the calculation is almost the same, but maybe I will blog about that some other time.
  • In Power View it is not possible to combine a bar chart with a line chart, so if you need to show your data as bars, this trendline will also show as a bar chart.

Round-up

With all the intermediate measures in the model, some hiding is needed before it can really be used in Power View. But most of that is needed anyway, because of the different natures of PowerPivot and Power View.

There is probably a better and faster way to determine the total distinct series values. If anyone has a good suggestion: please let me know.

Unfortunately I don't have a working Power View installation, so I cannot show the real result. But you can try it yourself by downloading the sample Excel sheet from here.

Enjoy trendlining

-JP


The beauty of the #SSAS DefaultMember

image

The DefaultMember property for a dimension is known by every Microsoft BI professional. This property sets the default member of a dimension attribute, so that member is selected by default.

The dimension DefaultMember can be declared in BIDS or by using an Alter Cube statement as part of the cube calculations. While the first is static, the latter can be used to dynamically set a DefaultMember. The DefaultMember can also be set by defining a role, which is very useful for dynamic role security.

image

But the dimension DefaultMember property isn't the only one. Almost every object in Analysis Services has a DefaultMember property, but in this blog post I only look at the DefaultMember of the Measures 'dimension'.

Determination of the DefaultMember

The big question is: how is the DefaultMember property determined when it is not set? If we take a look at the BOL there is a remark for the DefaultMember property (although the article refers to Analysis Services 2005, the behavior hasn't changed):

If DefaultMember is not specified, or is set to an empty string, then Microsoft SQL Server 2005 Analysis Services (SSAS) chooses a member to use as the default member.

So what is the default behavior of Analysis Services when choosing the DefaultMember? It chooses the first member it can find. So if we query the DefaultMember of a hierarchy, the All level is returned.

image

And if you query an attribute with the IsAggregatable property set to false, the FirstChild is returned, like the Scenario attribute in the AdventureWorks cube.

image

But there is one very important DefaultMember that helps the overall experience of Analysis Services: the DefaultMember of the measures.

DefaultMember of the Measures

What will a query return without a measure added? In this case the query engine of Analysis Services adds the DefaultMember of the Measures 'dimension': the DefaultMeasure. If we look at the AdventureWorks cube, this value is set to the Reseller Sales Amount.

imageimage

When not set, Analysis Services will take the first measure from the first Measure Group, in this case the Internet Sales Amount. You can manually remove the DefaultMeasure without processing the AdventureWorks cube.

imageimage 

But how can this behavior impact the overall experience? When we select a dimension attribute on the rows or columns in Excel, the following query is fired against Analysis Services:

image

This is a query without a measure included, so Analysis Services adds the DefaultMember (DefaultMeasure) to the query as described above.

So this can result in a performance delay when the DefaultMeasure is part of a Measure Group with a lot of partitions and data. It can then take a long time before the members of the dimension attribute are shown in Excel.

A simple way to solve this is to either set the DefaultMeasure to a measure from a small Measure Group, or move that Measure Group up to the top of the Measure Groups. In that case the measure-less query will only hit the partitions of that small Measure Group.

image

-JP (@jppp)


My PerformancePoint experiences

Finally I made some time to write about my experiences with PerformancePoint 2010, focused on my dashboard for the iPad and the new SP1 features.

Last April there was a business request to enable the existing PerformancePoint dashboard for viewing on an iPad. The first step, besides getting an iPad with the correct WiFi, 3G and VPN connections, was trying to open the existing dashboard in the iPad Safari browser.

This immediately gave the first challenge: with the iOS 4.3 update the JavaScript engine changed significantly. This resulted in the dashboard either loading or not loading: on a not-yet-updated iPad running iOS 4.2 the dashboard showed, and on an iPad with iOS 4.3 it did not!

The next challenge was that the original dashboard used a filter shown as a treeview. This is a rich HTML/JavaScript object and failed to behave correctly on the iPad. When a filter uses a normal dropdown box, a core feature of any web browser, Safari shows it as a finger-touch-friendly control. So the first choice was to abandon the multiselect and use a dropdown as the filter. This works fine, but with more than 300 options the user-friendliness isn't optimal.

At that moment we were running the RTM version of SharePoint, so the only dynamic option for filters was the visibility, not the initial content. And there were two filters based on the same dimension which controlled the content of the scorecards and reports. So how to make the dropdown box as small as possible, say one region, and respect the original behavior? Luckily I could use an extra filter to 'pre-filter' the main filter, but that filter could only set the visibility of the main filter. In the end I solved it the following way:

First I added an extra dimension, which was just an extra copy of the original dimension used in the main filter. After that I based the second filter on a SharePoint list, mainly with members of that extra dimension; only the All level of that filter came from the original dimension, hence the SharePoint list. Then I created a corresponding filter for every child level, with the visibility linked. The last part was to attach all filters, in my case around 30, to all parts of the dashboard.

Now, with the new functionality of dynamic filters in SP1 for PerformancePoint, the content of a filter can be based on another filter. In my case I can redesign my dashboard by using just two filters for the situation above: just link the filters together and let PerformancePoint filter the content for you. How easy can it be? One important thing for everyone who is upgrading from RTM to SP1: redefine your filter by simply stepping through the wizard, or else you cannot link the filters. It looks like the PerformancePoint items are not updated automatically.

By the way, I haven't tested iOS 4.3 with PerformancePoint 2010 SP1, so I don't know if the dashboards show on an iPad at this moment. Although PerformancePoint works with the Safari browser (http://office.microsoft.com/en-us/dashboard-designer-help/using-performancepoint-services-with-multiple-web-browsers-HA102237261.aspx), the iOS Safari browser is a different browser than the desktop version and, of course, works differently. So it is a good choice of Microsoft to use the same browser engine for their phone devices as for the desktop: no worries that some JavaScript will not work on different devices.

-JP
