Sunday, December 02, 2018

Formatting a Web API query for getting VoC Survey questions

I am working with a Flow to do some text analysis and sentiment analysis on Voice of the Customer responses.

The trick, since the payment model is per run, is to trigger it per Survey Response and not per Question Response. Hence the logic has to loop through all the Question Responses.

The way you create a filter in Flow for the query is to use OData filters. However, I found that these were a casing nightmare, which those of you who have worked more with them have probably also noticed.

After troubleshooting a lot with different queries in the browser, I finally found that the following actually worked. Note that you will have to change the GUID to your own.

https://dev-dtn.api.crm4.dynamics.com/api/data/v9.1/msdyn_questionresponses?$select=msdyn_name,msdyn_SurveyResponseId&$filter=msdyn_SurveyResponseId/msdyn_surveyresponseid%20eq%20460279E7-2AF2-E811-A97F-000D3AB0C08C

The tricky part, as you can see, is that the first part of the lookup attribute is defined in camel case, while the one in the related entity (Survey Response) is in lower case.

The part you need in the Flow is the last part, but it is useful to test it directly in the browser to make sure you get the syntax correct.
It's the filter part of the query that you enter into the "Filter query" field, and make sure to make it dynamic. :)
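
For reference, the filter expression on its own (decoded from the URL above, and with the GUID swapped for your own or for dynamic content in the Flow) looks like this:

msdyn_SurveyResponseId/msdyn_surveyresponseid eq 460279e7-2af2-e811-a97f-000d3ab0c08c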

And this is how it looks in Dynamics 365 CE if you check out the fields. I like to look at them in the list view as I can see the schema name there, which isn't visible in the Form.

The msdyn_SurveyResponseId lookup from the Question Response entity. As you can see, the query above seems to be using the schema name.

This is the primary key field of Survey Response. Do note the subtle difference between the columns: Id is spelled with a capital "I" in the schema name. Based on the query above, it hence seems to be the Name (all lower case) that is used to indicate this field.

Hence based on the above, the supposition would be that the syntax is <Schemaname of the lookup>/<name of field in target entity>


I then did a query against Business Unit and was very surprised to find that it was rather inconsistent and looked like this:

https://dev-dtn.api.crm4.dynamics.com/api/data/v9.1/businessunits?$select=cntso_organizationbaseurl&$filter=parentbusinessunitid/businessunitid%20eq%20null

With just the filter part, that would be:

parentbusinessunitid/businessunitid eq null 

Let's have a look at the fields in Dynamics:


The Parent Business Unit Lookup in Business Unit (Self Referential). Note that the Schema name is Pascal Case.

The key field, businessunitid, in Business Unit


And as you can see, if we were to follow the syntax set by the example above, this should be:

ParentBusinessUnitId/businessunitid

However, that didn't seem to work, and as a pragmatist I have to conclude, somewhat sadly, that this doesn't seem to be very consistent.

My recommendation is hence when working with this:

  • Do not take any casing for granted
  • <Schemaname of the lookup>/<name of field in target entity> is probably correct for most custom fields/entities.
  • Many older entities and fields, like the Business Unit shown above, have been around since CRM 1.0, or at least 3.0 if I remember correctly, and hence the syntax might be different.
  • Test your queries directly in the browser like I have shown above (a small PowerShell sketch for doing the same from a script follows below).
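
If you prefer to test from a script rather than the browser, the sketch below shows roughly how the same query could be run from PowerShell. This is my own addition, not part of the original walkthrough, and it assumes you already have a valid OAuth bearer token for the Web API in $token (for example obtained through an Azure AD app registration) and that the instance URL matches the one above.

# Rough sketch - test the OData filter against the Web API from PowerShell.
# $token is assumed to already hold a valid OAuth bearer token for the instance.
$baseUrl = 'https://dev-dtn.api.crm4.dynamics.com/api/data/v9.1'
$select  = 'msdyn_name,msdyn_SurveyResponseId'
$filter  = 'msdyn_SurveyResponseId/msdyn_surveyresponseid eq 460279e7-2af2-e811-a97f-000d3ab0c08c'

$headers = @{
    Authorization      = "Bearer $token"
    Accept             = 'application/json'
    'OData-MaxVersion' = '4.0'
    'OData-Version'    = '4.0'
}

$result = Invoke-RestMethod -Uri "$baseUrl/msdyn_questionresponses?`$select=$select&`$filter=$filter" -Headers $headers
$result.value | Select-Object msdyn_name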

Good luck with your Flows.

And if you know Swedish, make sure you check out my colleague Martin Burman's article on Flow as well. Not sure how well it turns out in translation. https://www.crmkonsulterna.se/flow-i-medvind/

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Wednesday, November 21, 2018

Quickfind bug in 9.1.0.643

Tonight Microsoft rolled out an update to Dynamics 365 that seemed to have had a few issues. Most notably, if you have any lookup fields in a Quick Find view's "Find" columns, it will break. Most of the time, not always. Microsoft knows about this, and there are angry threads talking about it, like this one: https://community.dynamics.com/crm/f/117/t/301925?pi61802=3#responses

and you can of course create your own support ticket with Microsoft at https://admin.powerplatform.microsoft.com/

The temporary solution to get this working, or at least the essential parts for your system, is to remove the lookup fields from your find columns for the entities that are breaking. This will of course have the effect that no searching can be done on those fields, but you can switch them on again later.

We have also done some preliminary tests and it seems like the UCI (Unified Interface) is not affected by this. So making a quick UCI App could also be a good, fast fix, especially for the most critical user groups.

Note also that some lookups may break, as they use the quick find logic to search the related entity when you are inputting data. Hence, if this happens, you might have to do an interim fix there too.

The error is a "SQL Error" and if you download the logfile, there are two different error messages that I have seen/heard of: (I changed the fieldname to "contactidName")

Unhandled Exception: System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=9.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]]: System.Xml.XmlException: Microsoft.Crm.CrmException: Sql error: 'Invalid column name 'contactidName'.'
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Application.Components.UI.Grid.DataGrid.RenderInnerHtml(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetResetResponseHtml(AppGrid appGrid, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Reset(String gridXml, String id, StringBuilder sbXml, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context) ---> Microsoft.Crm.CrmException: Sql error: 'Invalid column name 'kuoni_BookingIdName'.'
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Application.Components.UI.Grid.DataGrid.RenderInnerHtml(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetResetResponseHtml(AppGrid appGrid, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Reset(String gridXml, String id, StringBuilder sbXml, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   --- End of inner exception stack trace ---
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep step)
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously): Microsoft Dynamics CRM has experienced an error. Reference number for administrators or support: #BDDD78E5Detail: 
<OrganizationServiceFault xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/xrm/2011/Contracts">
  <ActivityId>64efc536-e25e-4e36-a163-dbe707b07302</ActivityId>
  <ErrorCode>-2147220970</ErrorCode>
  <ErrorDetails xmlns:d2p1="http://schemas.datacontract.org/2004/07/System.Collections.Generic" />
  <Message>System.Xml.XmlException: Microsoft.Crm.CrmException: Sql error: 'Invalid column name 'contactidName'.'
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Application.Components.UI.Grid.DataGrid.RenderInnerHtml(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetResetResponseHtml(AppGrid appGrid, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Reset(String gridXml, String id, StringBuilder sbXml, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context) ---&gt; Microsoft.Crm.CrmException: Sql error: 'Invalid column name 'contactidName'.'
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Application.Components.UI.Grid.DataGrid.RenderInnerHtml(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetResetResponseHtml(AppGrid appGrid, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Reset(String gridXml, String id, StringBuilder sbXml, StringBuilder sbHtml)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   --- End of inner exception stack trace ---
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep step)
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean&amp; completedSynchronously): Microsoft Dynamics CRM has experienced an error. Reference number for administrators or support: #BDDD78E5</Message>
  <Timestamp>2018-11-21T09:41:26.798591Z</Timestamp>
  <ExceptionRetriable>false</ExceptionRetriable>
  <ExceptionSource i:nil="true" />
  <InnerFault>
    <ActivityId>64efc536-e25e-4e36-a163-dbe707b07302</ActivityId>
    <ErrorCode>-2147204784</ErrorCode>
    <ErrorDetails xmlns:d3p1="http://schemas.datacontract.org/2004/07/System.Collections.Generic" />
    <Message>Sql error: 'Invalid column name 'contactidName'.'</Message>
    <Timestamp>2018-11-21T09:41:26.798591Z</Timestamp>
    <ExceptionRetriable>false</ExceptionRetriable>
    <ExceptionSource i:nil="true" />
    <InnerFault i:nil="true" />
    <OriginalException i:nil="true" />
    <TraceText i:nil="true" />
  </InnerFault>
  <OriginalException i:nil="true" />
  <TraceText i:nil="true" />
</OrganizationServiceFault>

Second error:

Unhandled Exception: System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=9.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]]: System.Xml.XmlException: Microsoft.Crm.CrmException: A quick find filter cannot have any child filters
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetRefreshResponseHtml(IGridUIProvider uiProvider, StringBuilder sbTemp)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Refresh(String gridXml, StringBuilder sbXml, StringBuilder sbHtml, Boolean returnJsonData)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context) ---> Microsoft.Crm.CrmException: A quick find filter cannot have any child filters
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetRefreshResponseHtml(IGridUIProvider uiProvider, StringBuilder sbTemp)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Refresh(String gridXml, StringBuilder sbXml, StringBuilder sbHtml, Boolean returnJsonData)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   --- End of inner exception stack trace ---
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep step)
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously): Microsoft Dynamics CRM has experienced an error. Reference number for administrators or support: #48A7E659Detail: 
<OrganizationServiceFault xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/xrm/2011/Contracts">
  <ActivityId>fa29912a-47b8-42cf-8e7b-fe3e0c13aecc</ActivityId>
  <ErrorCode>-2147220970</ErrorCode>
  <ErrorDetails xmlns:d2p1="http://schemas.datacontract.org/2004/07/System.Collections.Generic" />
  <Message>System.Xml.XmlException: Microsoft.Crm.CrmException: A quick find filter cannot have any child filters
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetRefreshResponseHtml(IGridUIProvider uiProvider, StringBuilder sbTemp)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Refresh(String gridXml, StringBuilder sbXml, StringBuilder sbHtml, Boolean returnJsonData)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context) ---&gt; Microsoft.Crm.CrmException: A quick find filter cannot have any child filters
   at Microsoft.Crm.Application.Platform.ServiceCommands.PlatformCommand.XrmExecuteInternal()
   at Microsoft.Crm.Application.Platform.ServiceCommands.RetrieveMultipleCommand.Execute()
   at Microsoft.Crm.ApplicationQuery.RetrieveMultipleCommand.RetrieveData()
   at Microsoft.Crm.ApplicationQuery.ExecuteQuery()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.GetData(QueryBuilder queryBuilder)
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadQueryData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderQueryBuilder.LoadData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareGridData()
   at Microsoft.Crm.Application.Platform.Grid.GridDataProviderBase.PrepareData()
   at Microsoft.Crm.Application.Controls.GridUIProvider.Render(HtmlTextWriter output)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.GetRefreshResponseHtml(IGridUIProvider uiProvider, StringBuilder sbTemp)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.Refresh(String gridXml, StringBuilder sbXml, StringBuilder sbHtml, Boolean returnJsonData)
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   --- End of inner exception stack trace ---
   at Microsoft.Crm.Core.Application.WebServices.AppGridWebServiceHandler.ProcessRequestInternal(HttpContext context)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep step)
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean&amp; completedSynchronously): Microsoft Dynamics CRM has experienced an error. Reference number for administrators or support: #48A7E659</Message>
  <Timestamp>2018-11-21T10:28:58.4053402Z</Timestamp>
  <ExceptionRetriable>false</ExceptionRetriable>
  <ExceptionSource i:nil="true" />
  <InnerFault>
    <ActivityId>fa29912a-47b8-42cf-8e7b-fe3e0c13aecc</ActivityId>
    <ErrorCode>-2147217118</ErrorCode>
    <ErrorDetails xmlns:d3p1="http://schemas.datacontract.org/2004/07/System.Collections.Generic" />
    <Message>A quick find filter cannot have any child filters</Message>
    <Timestamp>2018-11-21T10:28:58.4053402Z</Timestamp>
    <ExceptionRetriable>false</ExceptionRetriable>
    <ExceptionSource i:nil="true" />
    <InnerFault i:nil="true" />
    <OriginalException i:nil="true" />
    <TraceText i:nil="true" />
  </InnerFault>
  <OriginalException i:nil="true" />
  <TraceText i:nil="true" />
</OrganizationServiceFault>



Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Monday, October 22, 2018

Dynamics 365 Saturday in Stockholm - will be a blast!

For the first time ever, nine Business Solutions MVPs will converge on Stockholm to share their knowledge at the amazing Dynamics 365 Saturday event, which will be held on the 10th of November 2018 at the Microsoft office in Kista, just outside Stockholm. As usual, the Dynamics 365 Saturday events are free.

As the main organizer (my company CRM-Konsulterna and MVP Jonas Rapp will also help out), it is of course very satisfying to have so many talented people sign up to speak. A lot is happening - version 9.1 was just rolled out today in EMEA, if you didn't notice - hence there is a lot to talk about.

There will be three tracks:

  • Application - sessions on configuration and usage of Dynamics 365
  • Development - sessions on development and configuration that could be viewed as programming, like Flow, Logic Apps, PowerApps/Canvas Apps etc.
  • Business/Project Management - sessions on how to best run projects, businesses and your career, and other softer issues, all related to Dynamics 365.
As the host, I really hope you are able to secure a seat. We are limiting the number to 150 and are getting signups by the hour, so be sure to book yours now.

Read more and sign up here: http://365saturday.com/dynamics/stockholm-2018/ 


Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Wednesday, September 05, 2018

Hybrid NTLM Server Side Sync and Exchange 2013 Cert secrets

Server Side Sync is a technology for connecting Dynamics 365 CE to an Exchange server. When connecting an online Dynamics 365 to an on-prem Exchange there are some requirements that need to be met. These can be found here: https://technet.microsoft.com/sv-se/library/mt622059.aspx

Piping data to and from Exchange and Dynamics
By Quartl [CC BY-SA 3.0], from Wikimedia Commons
However, I just had a meeting with Microsoft and, based on the documentation version shown as of 2018-09-05, they have now added some new features that they haven't had time to get into the documentation yet.

One of the most interesting parts of the integration is that it requires Basic Authentication for EWS (Exchange Web Services). Of the three types of authentication available - Kerberos, NTLM and Basic - Basic Authentication is, as the name might hint, the least secure. Hence it is also not very well liked by many Exchange admins and may be a blocker for enabling Server Side Sync in Dynamics 365.

In the meeting I just had with Microsoft, they mentioned that they now support NTLM as well! That is great news as that will enable more organizations to enable Server Side Sync.

There is still a requirement to use a user with Application Impersonation rights, which might be an issue as that can be viewed as granting too high rights within the Exchange server. For this there is currently no good alternative solution. I guess making sure that the Dynamics admins are trustworthy, and knowing that the password is encrypted in Dynamics, might ease some of that. But if the impersonation user is compromised, then a haxxor with the right tools or dev skills could compromise the entire Exchange server.

Microsoft also mentioned another common issue that can arise with the Outlook App when using Server Side Sync and a hybrid connection to an on-prem Exchange 2013. It will show a quick alert saying "Can't connect to Exchange" but will still be able to load all the Dynamics parts.

This might be caused by the fact that Exchange 2013, according to Microsoft, doesn't automatically create a self-signed certificate that it can use for this communication. Hence this has to be done manually.

This can be fixed by first creating a self-signed certificate and then modifying the authorization configuration using the instructions found here. Lastly, publish the certificate. It is also a good idea to check that the certificate is still valid and hasn't expired.
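
For reference, below is a rough sketch of what those steps can look like in the Exchange Management Shell. This is my own outline rather than Microsoft's official instruction - treat the parameter values (especially the domain name) as placeholders and follow the official documentation for your environment.

# Sketch only - run in the Exchange Management Shell on the Exchange 2013 server.
# 1. Create a new self-signed auth certificate (the domain name is a placeholder).
$cert = New-ExchangeCertificate -KeySize 2048 -PrivateKeyExportable $true `
    -SubjectName 'cn=Microsoft Exchange Server Auth Certificate' `
    -FriendlyName 'Microsoft Exchange Server Auth Certificate' `
    -DomainName '*.yourdomain.com'

# 2. Point the authorization configuration at the new certificate.
Set-AuthConfig -NewCertificateThumbprint $cert.Thumbprint -NewCertificateEffectiveDate (Get-Date)

# 3. Publish the certificate.
Set-AuthConfig -PublishCertificate

# 4. Check that it is valid and hasn't expired.
Get-ExchangeCertificate -Thumbprint $cert.Thumbprint | Format-List Subject, NotBefore, NotAfter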

I will see if I can create a more detailed instruction on this later.

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Saturday, June 16, 2018

Embedded PowerApp in Dynamics 365 CE

PowerApps, or Canvas Apps as the new name for them is, is a very fast low/no-code way of creating applications that can fill in the blanks where Dynamics 365 CE sometimes doesn't quite fit. This can for instance be when there is a mapping entity, a complex calendar functionality, or even integrations with a camera or other applications directly. These things can be done, and some are even supported within the new UI/UCI if you are running it through the tablet/mobile client. However, there is currently no easy way that I have found to just add a PowerApp/Canvas App (I will just call it PowerApp below to make things easier) to Dynamics.

After a chat with my friend, ex-MVP and now Microsoft Global Black Belt Carsten Groth (isn't that a cool name for a team, by the way?), he told me that it is possible to embed PowerApps. So I googled a bit and found that no one seemed to have blogged about this (if you have, please tell me in the comments) except concerning other applications like SharePoint. I tried it out and found that it wasn't that hard at all to embed a PowerApp into Dynamics 365.

The first thing we have to start with is creating our PowerApp. You can get to this place by browsing to https://web.powerapps.com or clicking the waffle in Office 365 and then selecting PowerApps. In my case, I used the Dynamics 365 template, pointed it to the Case entity and more or less pressed Next-next-finish. I did replace the field that was shown in the first screen from the GUID of the case to the field "title", as that makes a little bit more sense to look at.


So what I was looking for when creating the PowerApp was the AppId, in this case:
"904e3ac1-1e26-4b4b-a384-971485f6709c"

What we then need is the embedding syntax for PowerApps which is:
<iframe src="https://web.powerapps.com/webplayer/iframeapp?source=iframe&screenColor=rgba(104,101,171,1)&appId=/providers/Microsoft.PowerApps/apps/[AppId]" />

You need to replace [AppId] with your AppId. Also, if you do not specify it, the iframe will typically render rather small, so I would recommend either setting the size directly with width and height attributes or using CSS, the latter being more kosher.

I tried setting an Iframe directly with this in Dynamics 365, but that didn't seem to work. All I got was a spinner like this:
Spinner which never stops when trying to add the PowerApp directly
Not sure why this is.

So I circumvented it and created a very small HTML page like this:

<html>
<body>
<iframe width="1024" height="768" src="https://web.powerapps.com/webplayer/iframeapp?source=iframe&screenColor=rgba(104,101,171,1)&appId=/providers/Microsoft.PowerApps/apps/904e3ac1-1e26-4b4b-a384-971485f6709c"></iframe>
</body>
</html>

This time, when trying the TestPowerApp.html-page from my local drive, the spinner was quickly replaced by this and then by the App

Then I uploaded this file as a webresource to Dynamics.

Create a webresource with the simple html-page pointing to the PowerApp

After uploading it, in the WebResource screen, you can test the page by clicking on the URL. It should show a webpage with the PowerApp.

The PowerApp running as a webresource in Dynamics 365
Now you can use this web resource wherever you want. Like directly in a sitemap:

Added PowerApp webresource to the Marketing Area as a Subarea called "Power App Cases"

It should be possible to send parameters to the PowerApp as well, making it context aware, so that we can create PowerApps that for instance show up within a case and show specific data on that case after you have selected it in Dynamics 365. However, if you look at the iframe syntax you will see that the AppId and so on are already query string parameters, and I need to research this a bit more to figure it out. I think that the HTML web resource probably has to shuffle the query string parameters from Dynamics over to PowerApps. I will leave this hanging and hope that someone else in the community catches this ball! Otherwise I will try to figure it out later.

As CDS (Common Data Service) and Model-driven Apps are the same thing as Dynamics 365, just with less sugar on top, this technique can be used there as well.

If you find anything unclear or have any questions, please leave a comment (moderated).

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Wednesday, June 06, 2018

Setting up Data Export Service without PowerShell Script

Setting up the Dynamics 365 Data Export Service requires an Azure Key Vault, which is typically set up using a PowerShell script that can be found in the Data Export Service setup wizard. However, if you run into issues setting this up, it might be easier to do it directly in Azure, minimizing the scripted steps. This was a tip that my friend and Business Solutions MVP Scott Durow recommended. He mentions it in his very instructive video, but doesn't actually show how, so I thought I'd detail how I made it work.

First some background. The reason I even started investigating how to do this manually was that I ran into problems when running the PowerShell script supplied by Microsoft in the wizard.
Press the "i" icon to get a window containing the PowerShell Script that Microsoft recommends for setting up the Key Vault.
When running the PowerShell script, both as myself (not a global admin) and by asking a global admin to do it, it failed in the latter parts. The key vault was created but some of the access policies seemed to be missing, and it just didn't work. My user's rights in Azure were Contributor on the resource group, and it was a bit interesting because the global admin and I got different error messages. But when I finally managed to create the key vault manually, I could do it all with my user, so it didn't seem like I was missing any rights.

The first step is to make sure you have all your data straight. The PowerShell script is good for this. Check out Scott's clip if you want to know how to find the different strings; he shows it very clearly.

 Just copied from the PS-Script:
$subscriptionId = '<subscription ID>'
$keyvaultName = 'MyVault'
$secretName = 'MySecretName'
$location = 'North Europe'
$connectionString = 'Server=tcp:<db-name>,1433;Initial Catalog=<catalog>;Persist Security Info=False;User ID={your_username};Password={your_password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;'
$organizationIdList = '<DYN365GUID>'
$tenantId = '<AZURE TENANT ID>'

The parts in angle brackets (and the username/password placeholders) have to be replaced by your own settings. I will use these variables as something to refer back to further on in this article.

Search for Key Vault and add the "Key vault", the top one in this picture

Then we have to set it up. This is not so tricky if you have worked with Azure before. Consider whether you want to work in an existing resource group or create a new one. Typically you need to have Azure SQL running as well, so it might be good to keep everything together to be able to see the costs and control who has access, which is why a resource group might be a good idea. But that should hence already exist; if not, you can create it. I would recommend keeping the Azure SQL and the Key Vault in the same resource group. I am not sure if it actually works across different resource groups - it probably does, but I haven't tested it.

Creating the key vault - in this case I am creating a new resource group, normally it would already exist
Azure will add you as the default principal with access to the key vault. We will add Data Export Service to this later. For now, just create it.
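
If you would rather script just this single step, a minimal sketch using the Az PowerShell module (my addition - it assumes you are already logged in with Connect-AzAccount and have selected the right subscription and resource group) would be:

# Sketch only - creates the key vault in an existing resource group.
New-AzKeyVault -Name $keyvaultName -ResourceGroupName '<your resource group>' -Location $location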

Now we need to open the Key vault and select the "Secrets" section in the menu on the left hand side and press the button:

"+ Generate/Import" 

Then you have to enter your secret name ($secretName) and the connection string ($connectionString) as the value.
Creating a secret - $secretName in Name and $connectionString in Value

Press "Create".

You should now return to the previous screen and see a row for your secret.
Select it.

It should open the settings panel for the secret. Press the "Tags" part, which is located in the middle, and add a tag that has the organization id list ($organizationIdList) as the key and the tenant id ($tenantId) as the value. I have blurred them out below as they are rather private.

Adding a tag with OrgIdList and tenantId to a Secret
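
If you prefer to script the secret and the tag instead of using the portal, a rough Az module equivalent of the two steps above (using the variable names from the PS script earlier) would be something like:

# Sketch only - creates the secret and tags it with the organization id list / tenant id pair.
$secretValue = ConvertTo-SecureString -String $connectionString -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName $keyvaultName -Name $secretName -SecretValue $secretValue `
    -Tag @{ $organizationIdList = $tenantId }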

You then need to go back to the Key Vault and click on the "Access policies" menu item. You should then see yourself as a principal, as this was set when we created the key vault. We now need to add the Data Export Service as a valid principal with read access rights.

So click "Add", click "Select Principal" and search for "b861dbcc-a7ef-4219-a005-0e4de4ea7dcf" which is the ID for Data Export Service. It should show up like this:

It needs to have "Secret Management Operations - GET" permissions and nothing else.
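
Scripted, the same access policy would look roughly like this (again a sketch using the Az module):

# Sketch only - grants the Data Export Service principal GET on secrets and nothing else.
Set-AzKeyVaultAccessPolicy -VaultName $keyvaultName `
    -ServicePrincipalName 'b861dbcc-a7ef-4219-a005-0e4de4ea7dcf' `
    -PermissionsToSecrets get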

Now, go back to the Secret and copy the URI to the Secret.

Getting the URI for the Key Vault Secret
Paste it into the Data Export Service Wizard field for Key Vault.

Fill in the other information and press validate. Hopefully it will work out well!

Some issues

Being too cheap with the Azure SQL level
If you don't go for an Azure SQL P1 and choose a lower tier, you might get this warning:
We tried an S0 for our dev environment and tried to sync a couple of million records, and that just didn't work - we got tons of errors. We upgraded the Azure SQL to an S2 and then at least we didn't get any errors. We are planning for P1s in UAT and production.

Might have to set activation date on secret
It seems that you might have to set an activation date on the secret. I am not sure why this is - the PS script doesn't seem to do it - but it is not very hard.
Added activation date on the secret from June 4th

Using Database schema that is not created
The default database schema in the Data Export Service wizard is "dbo". If you change this to something else, like "crm", and you haven't created that schema in the database, you will get an error. It is simple to fix: you just have to go into the database and create the schema. To create the schema "crm", open a query and run:
CREATE SCHEMA crm

For more information on how to create schemas, check this site: https://docs.microsoft.com/en-us/sql/t-sql/statements/create-schema-transact-sql?view=sql-server-2017

Once the schema has been created, there should be no problem using it, as long as the user has permissions to use it.


I hope this works for you. If you have any questions, don't hesitate to leave a comment.

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Thursday, May 24, 2018

Anonymizing data in UAT/Dev - GDPR

On the eve of GDPR, what could be more fitting than a post on GDPR? I think everyone is probably deadly tired of all the consent emails by now, and I think they will even have reached our friends in the US and Asia.

This article relates to legal matters around GDPR, is based on my personal interpretations and is not to be viewed as legal advice.

One interesting thing that has to be considered in relation to GDPR is how to handle personal information in non-production environments/instances. Microsoft has made it painfully easy with the instance manager to copy the production instance, but do you really have the right to use your customers' personal information in a development, UAT or staging environment? Do you have legal support for that? I think that would be a very hard argument to make. Have you gotten your customers' explicit consent to use their personal data for that purpose? Probably not. Hence, if you are planning on keeping the instance/environment for more than 30 days, you will need to remove all personal information from it to stay within the boundaries of GDPR. I have found that using SSIS with KingswaySoft and the Anonymization component in the Productivity Toolkit is very useful, and I will in this rather lengthy article describe how I have used it to set up an anonymization script.

First of all you need to download SSDT and the KingswaySoft Dynamics 365 and Productivity packs.

Then start a new Business Intelligence -> Integration Services project.


Then start by right-clicking in the empty field at the bottom where it says "Connection Managers" and choose "New Connection..."


In the dialog that shows up choose "DynamicsCRM"

You should now see a dialog showing the connection settings to Dynamics.

Connection settings for your instance
Choose the right settings for your instance. Test your connection at the bottom when you are done to make sure it works. Make extra sure you are not connecting to your production environment - you wouldn't want to anonymize that!

When this is done, it is time to make your first Data Flow Task. Work in SSIS is divided into two parts, Control Flow and Data Flow. The Control Flow is the orchestration, which tells SSIS in which order everything is to be run. If you want things to run in parallel, just have two boxes next to each other; if you want one Data Flow to run before the other, drag the arrow from the first to the second. It is also possible to have entire "Sequence Containers" which can hold several components and make sure they execute before moving to the next stage.

Let's start by dragging one new Data Flow from the Toolbox on the left hand side to the Control Flow work pane. Then double click it. This will open it up.

You will now see that the tab at the top has changed to "Data Flow" from the previous "Control Flow". In the "Data Flow" view you will also have a different set of toolbox components available.

In the "Data Flow" you will control a single data flow. For instance the anonymization of Contact.

Start by dragging the Dynamics CRM Source from the toolbox (on the left) to the workspace (the big pane in the middle). Then double click it. Before looking at the details of the source component: I like using FetchXML when building queries, and of course the best way to build FetchXML queries is using FetchXMLBuilder in XrmToolBox (thanks Jonas Rapp and Tanguy Touzard for all your work!). But if that is too much heavy lifting (it really isn't), the easiest way to get a FetchXML query is to make an Advanced Find query and export it with the "Download FetchXML" button at the top right-hand side of the ribbon in the Advanced Find query builder. So let's say we have decided that the following fields in Contact are personal information and need to be anonymized:

The column editor in Advanced Find - don't use composite fields like "fullname" or Address1
Downloading the FetchXML and setting up the Source component in SSDT (Visual Studio) will make it look like this:

I have set the "Connection Manager" to "Target" as we are using the same source and target (reading and writing to the same system.

I am leaving the batch size at 2000, which seems to work well. Don't reduce it too much - remember the API limit of 60 000 calls per 5 minute period.

If you would like to test it first, you can set "Max rows returned" to, for instance, "10" and run it to see that it isn't going crazy.

I prefer FetchXML as the source type - remember that a FetchXML query can return data from several entities, which can make things a lot easier than trying to match the data with lookups in SSIS.

I always try to read all data in UTC and write it in UTC, which in most cases makes it correct. But make sure you understand how time zones work if you need to fiddle with this.

Also, don't include more columns than you need; it will just make your script slow. After adding or changing the FetchXML, the component will try to parse it and read the metadata from Dyn365/CRM, so there might be a slight delay. You can check what data this component will output by clicking on "Columns" on the left-hand side.

When done, press "OK" to go back to the "Data Flow" pane.

Now add a "Data anonymization" component and drag the arrow from the Source component to the Anonymizer. Then double click the anoymizer.

You should see something like this, where I have set anonymization settings for the different columns. By default it will say "Ignore" on all columns.


Try out the different anonymization types. Some are more generic than others. When done, click OK and go back to the data flow pane. Add a Dynamics CRM Destination and drag the blue arrow from the anonymizer (blue arrows are the normal data output, red arrows are error output) to the Dynamics CRM Destination component. Then double click it. The view in the data flow should look something like the picture below.

Dyn365 Source -> anonymizer -> writing to Dyn365
Dyn365 Destination component - set values where the arrows are

When setting up the destination component, there are a few things to consider:

  • In this case we are always doing updates - hence set the action to update. It is faster than upsert.
  • You have to set the Destination Entity.
  • If you write data to Dyn365 with high latency, no batching and no threads, you will be able to update at about 2-3 records per second. With low latency, correct batch settings and multithreading, I have been able to get up to 300 records per second - very dependent on the entity. Hold the mouse over the blue "i" just after "Enable Multithread Writing" for some deep-end tips from the scholars at KingswaySoft.
  • Error handling is recommended to be directed at a file or some other output where you can monitor it. If you do nothing about it and you get an error, it will break the flow and stop. You can control the error handling of the destination component by clicking on the "Error Handling" tab on the left-hand side. Remember that you will get all types of exceptions thrown by Dyn365 here as well, like missing rights, deactivated records etc.
A common problem that needs to be handled is that some records are deactivated, and deactivated records cannot be changed; they have to be opened (reactivated) first, then changed, then re-closed.

This is an example of a dataflow handling this:
Data flow which splits the deactivated records to the left, adds two special columns to reactivate them, writes an update with only the statecode & statusreason to the record and then merges the two data streams and writes the original values, which recloses the ones that were closed from the beginning


This is how the conditional split is defined - if statecode is 0, send the records to the output called "open", otherwise send them to the output called "closed".

Then you can test run your data flow by right-clicking the data flow pane and pressing "Execute Task".
And when you have assembled a larger control flow - like for instance this:
A control flow with several data flows being disabled - all in sequence.
you can execute the entire flow by clicking the green plus sign in the ribbon.



Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se