FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate

Getting the count of records or an average is one of the most commonly used aggregate operations in programming. FetchXML queries facilitate aggregation too. So here’s how you can utilize it in a Flow using the Common Data Service (Current Environment) connector [because I’m still waiting for it to be renamed to the Dataverse connector 😊]

To learn more about FetchXML aggregation, here’s the Microsoft Docs link – https://docs.microsoft.com/en-gb/powerapps/developer/data-platform/use-fetchxml-aggregation?WT.mc_id=DX-MVP-5003911

Scenario

Let’s assume you want to get the count of records using the Common Data Service (Current Environment) connector. Here’s how you can do that using the aggregate functions provided by FetchXML.

Disclaimer: This is, of course, not the only way to get aggregate values; you can also implement custom logic after you’ve retrieved all the data.

  1. Here’s the FetchXML I’ll be using to retrieve all the Accounts from the Dataverse environment.
    I’m using the List Rows action from the connector, which is this


    And this is the query which I generated from the Advanced Find in D365 CE


  2. Below are the changes I must make to work with aggregates in FetchXML.
    In the <fetch> node, I’ll set aggregate=”true”
    And on the columns I’m applying the aggregate function to, I’ll add aggregate=”[Aggregate]” alias=”[AliasName]”

  3. Now, this query alone won’t run and you’ll get the below error –
    An attribute can not be requested when an aggregate operation has been specified and its neither groupby not aggregate. NodeXml: [FirstAttributeInQuery]

  4. The reason is that, since you are using aggregation, only the columns on which an aggregate (or group by) is applied can be present. Hence, you’ll need to remove the other attributes which don’t have an aggregate applied to them.
  5. And the workable FetchXML will now look like this (a sample sketch follows this list)

  6. When you run this, the results you get will contain the aggregate value.

  7. Observe the same below
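For reference, here’s a minimal sketch of the two versions of such a query – the attribute and alias names are my own illustrations, not the exact ones from the screenshots above.

    Plain query (before aggregation):
    <fetch>
      <entity name="account">
        <attribute name="name" />
        <attribute name="accountid" />
      </entity>
    </fetch>

    Aggregate query (only aggregated/grouped columns remain):
    <fetch aggregate="true">
      <entity name="account">
        <attribute name="accountid" alias="accountcount" aggregate="count" />
      </entity>
    </fetch>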

Parse JSON to read the aggregate

Now that you’ve got the aggregated results, you can do an extra step to read the value. There are several ways to do this, but here’s a quick example of how I did it –

  1. Declare a variable. It must be outside of a For Each at all times.

  2. And in the For Each, because I’m selecting the array inside the value attribute of the FetchXML results, I can use that sample data to generate the schema for a Parse JSON step and use it. The loop will run only once anyway.

  3. And I’ll set the variable below

  4. And here’s the final result once you run it (a simplified sketch of the output follows below). Your scenario of usage may vary.
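To give an idea of what comes back, here’s a simplified sketch of the aggregate output and of reading it. The alias accountcount and the loop name are assumptions carried over from the earlier sketch, and the real payload also carries additional OData annotation attributes which I’ve trimmed.

    {
      "value": [
        { "accountcount": 105 }
      ]
    }

Inside the For Each, instead of (or in addition to) Parse JSON, an expression along these lines could be used to set the variable:

    items('Apply_to_each')?['accountcount']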

Hope this was useful!

Here’s a YouTube video I made to summarize this example –

Here are some more Power Automate / Flow posts you might want to check –

  1. Invalid type. Expected Integer but got Number error in Parse JSON – Error at runtime after generating Schema | Power Automate
  2. Secure Input/Output in Power Automate Run History
  3. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  4. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  5. Trigger Conditions not working in a Cloud Flow? Here’s Why | Power Automate Quick Tip
  6. Read OptionSet Labels from CDS/Dataverse Triggers or Action Steps in a Flow | Power Automate
  7. Asynchronous HTTP Response from a Flow | Power Automate
  8. FormatDateTime function in a Flow | Power Automate
  9. Tag a User in a Microsoft Teams post made using Power Automate
  10. Converting JSON to XML and XML to JSON in a Flow | Power Automate
  11. Office 365 Outlook connector in Cloud Flows showing Invalid Connection error | Power Automate
  12. Create a Team, add Members in Microsoft Teams upon Project and Team Members creation in PSA / Project Operations | Power Automate

Thank you!!


Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector

For newbies using the Common Data Service (Current Environment) connector, it might be a little puzzling to find all the records and the other supporting output data while parsing the output of a List Rows action.

Here’s my post summarizing what appears in each output and when, so that you can decide which one to use!

Before we begin, please note the connector icon at the time of writing this post – the tooltip, if you hover over it, reads “Common Data Service (Current Environment)”.

List Rows

So, here’s what my List Rows action looks like

  1. I’m retrieving 10 Accounts in this example

  2. Now, I’m adding 4 Parse JSON steps to hold the different outputs from the Dynamic Content of the List Rows action.
  3. I’ll rename the first Parse JSON as Value and add Value from the Dynamic Content from List Rows output


  4. Second, I’ll rename the Parse JSON to Body and add Body from the Dynamic Content to it.

  5. Third, the Parse JSON is renamed to Item and I’ll select body/value – Item from the Dynamic Content.


    And as soon as I do that, the block gets converted to a For Each, because Item refers to each individual record in the list.

  6. Finally, in my fourth Parse JSON block, I’ve renamed it to Outputs of List Rows and added the outputs of the List Rows step itself using the outputs() function (an expression sketch follows this list) – More on outputs() here – Using outputs() function and JSON Parse to read data from missing dynamic value in a Flow | Power Automate


    Now, let’s Run the Flow and see what results we get!!
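As referenced in step 6, the expression behind that fourth Parse JSON is just outputs() pointed at the List Rows step. Assuming the action is literally named “List rows” (spaces become underscores in expressions), a sketch of it would be:

    outputs('List_rows')

And to reach the records from there, something like outputs('List_rows')?['body/value'] can be used.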

Value / Body / Item

Now, let’s look at the output JSON data from each of these blocks and see what we get –

  1. Value. I used a JSON beautifier to format and look at the JSON data, and here’s what it looks like.
    The Body of this Parse JSON holds the array of all the records directly.


  2. Next, Body. Body has similar data to Value but with some additional supporting attributes.
    Here, the array of all the records sits under a Value attribute, instead of appearing directly under Body as in the Value block above. I know it’s a little puzzling 😊


  3. Then, Item. It’s simply the JSON of a single record itself. Hence, it sits inside a For Each loop


  4. And finally, the outputs() of the List Rows action in its entirety – it has Body, Headers and other attributes.


    But note that Body also appears as another attribute inside the main body, sitting next to Headers and StatusCode (a condensed sketch of all these shapes follows below)
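Condensing the above, here’s a rough, hand-trimmed sketch of the four shapes side by side – the action name in outputs() is an assumption and the record contents are abbreviated:

    Value (body of the Value Parse JSON):
      [ { ...record 1... }, { ...record 2... }, ... ]

    Body:
      { "@odata.context": "...", "value": [ { ...record 1... }, { ...record 2... }, ... ] }

    Item (one iteration of the For Each):
      { ...a single record... }

    outputs('List_rows'):
      { "statusCode": 200, "headers": { ... }, "body": { "@odata.context": "...", "value": [ ... ] } }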

I’ve also embedded my YouTube video explaining the same –

Here are some Power Automate / Flow posts which you might find worth checking out –

  1. Invalid type. Expected Integer but got Number error in Parse JSON – Error at runtime after generating Schema | Power Automate
  2. Asynchronous HTTP Response from a Flow | Power Automate
  3. FormatDateTime function in a Flow | Power Automate
  4. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  5. Office 365 Outlook connector in Cloud Flows showing Invalid Connection error | Power Automate
  6. Using outputs() function and JSON Parse to read data from missing dynamic value in a Flow | Power Automate
  7. Formatting Approvals’ Details in Cloud Flows | Power Automate
  8. Trigger Conditions not working in a Cloud Flow? Here’s Why | Power Automate Quick Tip
  9. InvalidWorkflowTriggerName or InvalidWorkflowRunActionName error in saving Cloud Flows | Power Automate Quick Tip
  10. Using triggerBody() / triggerOutput() to read CDS trigger metadata attributes in a Flow | Power Automate

Thank you!!

Invalid type. Expected Integer but got Number error in Parse JSON – Error at runtime after generating Schema | Power Automate

Are you using the Parse JSON action in Power Automate, comfortably generating the schema from Trigger Outputs (or any output, for that matter) to get the Dynamic Content, but still ending up with the below error at runtime even though the schema was generated from actual data?

Let’s look at why this happens.

Scenario / Issue

Now, below are the usual steps you follow to generate the schema –

  1. Let’s assume you are reading from the body of the CDS trigger. Again, it could be anything. And you are using Parse JSON and generating schema as below –


  2. And the schema is generated as below –

  3. I’ll point out an exception here. The above schema was generated from sample data in which the decimal field had the value 0.0

  4. Hence, when you generated the schema from that data, the field’s type was set to “integer” (see the sketch after this list)



  5. So, when 0.0 passes through the parse, it’s successful because it is interpreted as 0 and not 0.0
  6. Now, when something like 2.5 passes through it, it gives the below error.
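For illustration, assuming the field is a decimal called revenue (a made-up name, not the one from the screenshots), the generated fragment of the schema would read roughly like this:

    "revenue": {
        "type": "integer"
    }

With this, 0.0 parses fine (it is read as 0), but 2.5 fails the schema check.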



Solution

  1. Since you can’t always be sure whether the data you are parsing will be a decimal or a whole number, simply change the type in the Parse JSON schema to “number” (as sketched after this list)

  2. Now, if you resubmit the same run again, it will pass through, provided there are no other issues.
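Continuing the illustrative revenue example from above, the corrected fragment would simply be:

    "revenue": {
        "type": "number"
    }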

Hope this helps!!

Here are some more Power Automate / Flow posts you might want to check –

  1. Asynchronous HTTP Response from a Flow | Power Automate
  2. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  3. Tag a User in a Microsoft Teams post made using Power Automate
  4. Converting JSON to XML and XML to JSON in a Flow | Power Automate
  5. Office 365 Outlook connector in Cloud Flows showing Invalid Connection error | Power Automate
  6. FormatDateTime function in a Flow | Power Automate
  7. Formatting Approvals’ Details in Cloud Flows | Power Automate
  8. Trigger Conditions not working in a Cloud Flow? Here’s Why | Power Automate Quick Tip
  9. Create a Team, add Members in Microsoft Teams upon Project and Team Members creation in PSA / Project Operations | Power Automate
  10. Read OptionSet Labels from CDS/Dataverse Triggers or Action Steps in a Flow | Power Automate
  11. Using outputs() function and JSON Parse to read data from missing dynamic value in a Flow | Power Automate
  12. Adaptive Cards for Outlook Actionable Messages using Power Automate | Power Platform

Thank you!!

Asynchronous HTTP Response from a Flow | Power Automate

By default, whenever you submit an HTTP request to a Flow, your application will wait until the request is completed. In other words, the HTTP Response in a Cloud Flow is synchronous by default.

So, let’s see how you can make it asynchronous and how you can later retrieve the status of the request from the Flow. If you make the HTTP Response asynchronous, the calling application will receive a 202 Accepted in response.

HTTP Request by Default

Let us see how the HTTP Request and Response is structured by default.

  1. Here’s the HTTP Request trigger which will receive the request and process the data further.

  2. And let’s assume you have some lengthy process involved in your Flow which pushes the average execution time up significantly. To mimic that, I’ve simply introduced a delay of 3 minutes to demonstrate the async example.

    In case you want to review how to pause a Workflow using Delay and Delay Until, you can check this post – Pause a Flow using Delay and Delay Until | Power Automate
  3. And finally, assuming my execution of the Flow has completed and I’m supposed to send the Response back to the HTTP Request calling application, I’ll use the Response Action from the Request connector
  4. And my Response will look something like this, letting the caller application know that the processing has completed successfully (a sample body is sketched after this list).


    Now let’s see how the calling application will behave in this case, assuming we haven’t changed any settings yet.

    The Flow looks like this just to give you a visual perspective of the implementation.

  5. Now, once I send a request from Postman, Postman itself will keep waiting for a response until the execution of the Flow is completed.


  6. This is because the Flow is still being processed and is yet to reach the Response block in the Flow.
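As referenced in step 4, here is a minimal sketch of what such a Response body could contain – the attribute names are entirely my own and not prescribed by the connector:

    {
      "status": "Completed",
      "message": "Your request has been processed successfully."
    }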

Asynchronous Response

We will now turn on the Asynchronous Response of the HTTP Response Action –

  1. Go to Settings on the Response action step.


  2. In Settings, turn the Asynchronous Response toggle On and save it.

  3. As mentioned in the description of the Asynchronous Response setting, the caller Application will immediately get a 202 Accepted code upon sending the HTTP Request.

  4. Notice the headers – you’ll see a Location header item (sketched after this list)

  5. Capture this URL in your calling application as you can use it to check the status of the Request later on once the operation is completed on the Flow.
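As mentioned in step 4, the interesting part of that 202 response is the Location header. Roughly, the response headers look like the below – the actual URL is a long, signed, environment-specific address, so treat this only as a placeholder:

    HTTP/1.1 202 Accepted
    Location: https://<your-region>.logic.azure.com/workflows/<workflow-id>/runs/<run-id>?api-version=...&sig=...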

Checking Request Status

Since we have the Location information which was passed in the response headers previously, we can use that URL to check the response status separately by making another call to it.

  1. If the Status is still Running, you’ll get this response. It will imply that the Flow is still Running and the Response is not available yet.


  2. Else, you’ll get the actual response which you would have received had the Flow’s HTTP Response been left synchronous (Asynchronous Response = OFF)


Hope this was helpful!

I’ve added a YouTube video covering the same, which you can check out as well –

Here are some more Power Automate / Flow posts you might want to check –

  1. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  2. Call HTTP Request from a Canvas Power App using Flow and get back Response | Power Automate
  3. Accept HTTP Requests in a Flow and send Response back | Power Automate
  4. Setting Retry Policy for an HTTP request in a Flow | Power Automate
  5. Make HTTP request from Flow in Power Automate
  6. Connecting XrmToolBox to an MFA enabled Dynamics 365 environment | Azure AD
  7. Visualize Adaptive Card for Teams user action within a Cloud Flow | Experimental Feature
  8. FormatDateTime function in a Flow | Power Automate
  9. Form Access Checker in new Power Apps Form Designer | Model-Driven Apps in Dynamics 365
  10. Trigger Conditions not working in a Cloud Flow? Here’s Why | Power Automate Quick Tip
  11. Setting Lookup in a Flow CDS Connector: Classic vs. Current Environment connector | Power Automate Quick Tip
  12. Make On-Demand Flow to show up in Dynamics 365 | Power Automate
  13. Adaptive Cards for Outlook Actionable Messages using Power Automate | Power Platform
  14. Run As context in CDS (Current Environment) Flow Trigger | Power Automate

Thank you!!

Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate

In a Cloud Flow, if you are using an HTTP Request Trigger that accepts HTTP Requests – you have the option to validate the Incoming data based on the Schema of the JSON.

Scenario

Let’s assume this is the data which will be passed to the HTTP Request Flow –
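The original sample isn’t reproduced here, so purely as a hypothetical stand-in, assume a payload along these lines:

    {
        "CustomerName": "Contoso",
        "Email": "info@contoso.com",
        "Quantity": 5
    }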



HTTP Request & Schema

Let’s look at the HTTP Request Trigger itself in a Cloud Flow –

  1. First, let’s initiate the HTTP Request Trigger here and make sure we generate the Schema from the same data we will be passing. I also have a separate post on the HTTP Request itself which you can check here – Accept HTTP Requests in a Flow and send Response back | Power Automate
     
  2. Here, we can paste the sample data which we saw in the scenario above.

  3. Once you click Done, the schema will be generated from the sample data you’ve just entered (a sketch of such a schema follows this list). Additionally, if needed, you can also specify the method you are looking to implement – POST in this case


  4. Further on, I’m doing some operations internally and finally, I’ll be sending a response to the caller using the HTTP Response Action in HTTP connector in a Cloud Flow

  5. And here is what I’m sending back in Response if my Flow happens to work as expected.

  6. So, the end of my Flow looks something like this. (Depending on what you are trying to do in your Flow)
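As mentioned in step 3, the designer generates a JSON schema from the sample payload. For the hypothetical payload shown earlier, the generated schema would look roughly like this (property names are illustrative):

    {
        "type": "object",
        "properties": {
            "CustomerName": { "type": "string" },
            "Email": { "type": "string" },
            "Quantity": { "type": "integer" }
        }
    }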



     

Turning Validation On

Now, coming back to the HTTP Request trigger itself –

  1. Go to Settings on the HTTP Trigger itself –


  2. Now, look for Schema Validation option and turn it on.


Now, let’s consider the scenario and test it using Postman.

Testing with Postman – Validation On

Let’s use Postman to send data to the Flow – including incorrect data which doesn’t comply with the schema – and see how we receive the 400 error message in Postman.

  1. Now, when I send the correct data as expected by the schema with validation turned on, I’ll get a success message once the Flow has finished running, along with a 200 code


  2. Now, if we try to send incorrect data, it will not be accepted. So first, I’ll add a new attribute which isn’t present in the schema, in order for the validation to fail and give us the expected validation error.
    See the data which I’m sending – to which I get the 400 Bad Request error

  3. If you get back a 400 Bad Request, the Flow will not register it as a Flow Run. Hence, you won’t know if the Flow was hit or not.
  4. And in case the validation is turned OFF, you’ll get the response as expected from the Flow, provided the Flow completes its execution successfully.

Hope this post helped!

Here are some more Power Automate / Flow posts you might want to check –

  1. Tag a User in a Microsoft Teams post made using Power Automate
  2. Converting JSON to XML and XML to JSON in a Flow | Power Automate
  3. Office 365 Outlook connector in Cloud Flows showing Invalid Connection error | Power Automate
  4. FormatDateTime function in a Flow | Power Automate
  5. Formatting Approvals’ Details in Cloud Flows | Power Automate
  6. Trigger Conditions not working in a Cloud Flow? Here’s Why | Power Automate Quick Tip
  7. Read OptionSet Labels from CDS/Dataverse Triggers or Action Steps in a Flow | Power Automate
  8. InvalidWorkflowTriggerName or InvalidWorkflowRunActionName error in saving Cloud Flows | Power Automate Quick Tip
  9. Setting Lookup in a Flow CDS Connector: Classic vs. Current Environment connector | Power Automate Quick Tip
  10. Using outputs() function and JSON Parse to read data from missing dynamic value in a Flow | Power Automate
  11. Using triggerBody() / triggerOutput() to read CDS trigger metadata attributes in a Flow | Power Automate
  12. Secure Input/Output in Power Automate Run History


Tag a User in a Microsoft Teams post made using Power Automate

As posting to Microsoft Teams channels from Power Automate becomes more of a norm, tagging a user is one of the most frequent and obvious asks for a Power Automate / Flow developer.

Here’s how you can tag an Office 365 user in a Teams channel post.

Let’s look at these straight-forward steps! It’s easy.

Getting a Mention Token

First, let’s create a Mention token. Let’s see how –

  1. In Microsoft Teams connector in Power Automate, look for Get @mention token for a user (preview)

  2. This action step asks for a User Principal Name (if you are using Azure Active Directory) or the User ID of the user who needs to be tagged/mentioned.

  3. Let’s enter a User Principal Name or User ID here. I’m hardcoding this for visibility; your scenario may vary.


Use Token in a Post to Teams

There are different ways to Post a message to Teams. Let’s see the options.

  1. Below are some options to post a message to Teams

  2. Let’s go with Post a message (v3) (preview) for this example to post a message to Teams.

  3. Finally, here’s how your step would end up looking with the mention token added to the message to be posted – in this example, to the Open Board Meetings team’s General channel.

Post in Teams

So here’s how the post will look once the Flow is run and the message is posted to Teams

Hope this was helpful! Here are some more Power Automate / Flow posts you might want to check out –

  1. Converting JSON to XML and XML to JSON in a Flow | Power Automate
  2. Visualize Adaptive Card for Teams user action within a Cloud Flow | Experimental Feature
  3. Create a Team, add Members in Microsoft Teams upon Project and Team Members creation in PSA / Project Operations | Power Automate
  4. Task Completion reminder using Flow Bot in Microsoft Teams | Power Automate
  5. Office 365 Outlook connector in Cloud Flows showing Invalid Connection error | Power Automate
  6. Import multiple Users in Office 365
  7. Read OptionSet Labels from CDS/Dataverse Triggers or Action Steps in a Flow | Power Automate
  8. Turn Teams On / Off at Org Level, provisioning users | M365 Admin Center Tip
  9. FormatDateTime function in a Flow | Power Automate
  10. Approval Process using Power Automate

Thank you!

Rich Text Control for Canvas and Model-Driven App | Quick Tip

In one of my previous posts, I highlighted that you can enable a Rich-Text control for a Multiple Lines of Text type of field. Here it is again – Use Rich-Text Control for Multiple Lines of Text in Dynamics 365 CE | Quick Tip

In the above post, I showed that we can change the control on a Model-Driven app’s text field to Rich Text and make it appear as below

Now, reading the same field in a Canvas App will make it appear as below

Now, let’s see how we can overcome this and read the correct Rich-Text Control formatting in the Canvas App’s Form as well.

Canvas App Form

Let’s look at the changes you’ll need to make to the Gallery control in order to render the Rich-Text formatting as seen in the Model-Driven App

  1. Select the Gallery itself and expand the Fields section and locate the field which is a Rich-Text control in the Model-Driven app itself.

  2. Now, change the Control type and select the Edit rich text control from the options.


  3. Once this is selected, the Control will be updated to read formatting as in the Model-Driven app itself.

Hope this was helpful. Here are some more Power Apps / Power Platform posts you might want to check –

  1. Setting Correct Default Mode for Forms in a Canvas App | [Quick Tip]
  2. Rating Control to represent data from Dataverse in a Canvas Power App | Power Platform
  3. Clear a field value & Reset Form in a Canvas Power App [Quick Tip]
  4. Get Dynamics 365 field metadata in a Canvas App using DataSourceInfo function | Common Data Service
  5. Debug Published Canvas Power App with other users using Monitor | Power Platform
  6. Download a File from a Canvas Power App using a button | Power Platform
  7. AddColumns() function to dynamically add columns to a Data table in Canvas Power App | SharePoint List
  8. Implement real-time search in Gallery of CDS records in a Canvas Power App | Power Platform
  9. Implement character length validation in a Canvas Power App | Power Platform
  10. Implementing Exit app, Logout and Confirm Exit features in a Canvas Power App

Thank you!!

Converting JSON to XML and XML to JSON in a Flow | Power Automate

In this very simple post, let’s look at how you can convert JSON to XML and XML back to JSON while working in Power Automate.

First, let’s look at converting JSON to XML and then, XML to JSON from the same result of the first conversion.

JSON to XML

Let’s look at an example where we have a sample JSON which we will convert to XML in Power Automate using the xml() function, and then we’ll reverse the operation using the json() function in Power Automate itself.

  1. So, starting off with JSON data, you’ll need string-based JSON data. I’ll store it in a variable which looks like below.

  2. If I format the same data in JSON formatter online, it’ll look like this –


  3. Next, we can use the formula xml(<jsonData>) in the expression editor and use it as below (a sketch follows this list)

    Now, since I’m storing my JSON data as a string, I first convert it to JSON by using the json() function inside the xml() function.
  4. The result of the same is as below

  5. And if we take it to an XML formatter, it’ll look like below
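Putting step 3 together as a minimal sketch – assuming the string variable is named jsonData (my name, not necessarily the one used in the screenshots):

    Expression:
        xml(json(variables('jsonData')))

    Sample value of jsonData:
        { "Customer": { "Name": "Contoso", "City": "Mumbai" } }

    Resulting XML:
        <Customer><Name>Contoso</Name><City>Mumbai</City></Customer>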
Let’s look at what won’t work
  1. You cannot use an array, and there can be only 1 root element. Hence, the below won’t work –
    You cannot have an array of JSON elements, which looks like below –


    It will result in the below error saying, “The template language function ‘xml’ parameter is not valid. The provided value cannot be converted to XML:’Data at the root level is invalid…


  2. Also, even when it is not an array, if there are multiple attributes at the root level itself, it won’t work either. Something like below –

    Or, if we format and look at it, Name and Newsletters are both at the same root level –

    Which will result in the below error ‘The template language function ‘xml’ parameter is not valid. The provided value cannot be converted to XML: ‘JSON root object has multiple properties. The root object must have a single property in order to create a valid XML document. Consider specifying a DeserializeRootElementName. Path ‘Newsletters’.’

XML to JSON

Similarly, let’s see how we can inverse the conversion now from XML back to JSON –

  1. In this post, we are taking the same XML result which we first converted from JSON and converting it back to JSON again. But you can start fresh or take the source from elsewhere, of course.
    The formula to convert from XML to JSON is as below (a sketch also follows this list)

    Like in the previous step, the XML here is a string captured as the result of the previous step, and we need to convert it to XML first in order to convert it to JSON.

  2. The result is as follows –

  3. Also, as in the previous JSON to XML conversion, a root-level node has to be present. Else, you’ll see the following error in case you don’t have a root for the XML.


    And it will result in the below error
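And the reverse direction as a sketch – assuming the XML string from the previous conversion is stored in a variable named xmlData (again, an assumed name):

    Expression:
        json(xml(variables('xmlData')))

    Resulting JSON:
        { "Customer": { "Name": "Contoso", "City": "Mumbai" } }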

Official Microsoft Links for the above functions are –

  1. xml() – https://docs.microsoft.com/en-us/azure/logic-apps/workflow-definition-language-functions-reference#xml?WT.mc_id=DX-MVP-5003911
  2. json() – https://docs.microsoft.com/en-us/azure/logic-apps/workflow-definition-language-functions-reference#json?WT.mc_id=DX-MVP-5003911

Hope this was helpful! Here are some more Power Automate / Flow posts you might want to check –

  1. Office 365 Outlook connector in Cloud Flows showing Invalid Connection error | Power Automate
  2. Filter records in a View owned by a Team you are a member of | Dynamics 365 CRM
  3. FormatDateTime function in a Flow | Power Automate
  4. Formatting Approvals’ Details in Cloud Flows | Power Automate
  5. InvalidWorkflowTriggerName or InvalidWorkflowRunActionName error in saving Cloud Flows | Power Automate Quick Tip
  6. Read OptionSet Labels from CDS/Dataverse Triggers or Action Steps in a Flow | Power Automate
  7. Using outputs() function and JSON Parse to read data from missing dynamic value in a Flow | Power Automate
  8. Trigger Conditions not working in a Cloud Flow? Here’s Why | Power Automate Quick Tip
  9. Make On-Demand Flow to show up in Dynamics 365 | Power Automate
  10. Run As context in CDS (Current Environment) Flow Trigger | Power Automate

Thanks!!

Filter records in a View owned by a Team you are a member of | Dynamics 365 CRM

In Dynamics 365, “My” views show records owned by the system users themselves. Let’s look at how you can have views that let you filter records based on an Owner Team which you are a part of.

Scenario

Let’s assume the below scenario for the Contacts entity. All users have Contacts owned by them, and the out-of-the-box “My” views filter only by the Owner field.

  1. The default My Active Contacts view will show you the Contacts you are the Owner of. (Similarly, this could apply to any other entity as well.)
    Example: Amit is logged in and he sees his Contacts under My Active Contacts




  2. Now, Amit is also part of a Sales Team which is an Owner Team in Dynamics 365 CRM/CE


  3. And, there is a Contact which is assigned to the Sales Team itself and not an individual user.


    I’ll just expand the header and show you the Owner

  4. So, we’ll make this Contact also appear for Amit in his new My Team(s) Contacts view. You can call your view something else as well.
    So let’s see how we can do this.

Create a new View – Edit Filter Criteria

In order to make the Owner Team’s records available, we’ll create a new view and edit its Filter Criteria. Let’s see how –

  1. I’ve created a new view called My Team(s) Contacts, and I’ll start by editing the criteria itself.



  2. Start by selecting Owning Team (Team) under the Related section of the fields selection drop-down.



    Under that, open the drop down to expand its related records.


  3. Under Owning Team (Team), look for Users. Notice that it doesn’t have any entity name mentioned in brackets like the other fields in the list – meaning it is the sub-grid, i.e. the child record list, under the Team record.

  4. Once you select Users, expand the dropdown under Users, which will list the fields of the System User record itself.


  5. In this list, select User itself. This is the GUID of the User record. Pro Tip: Any field with the name of the entity itself is a GUID/Primary Key of the entity record itself.

  6. And in this last selection, you’ll notice that the Current User is already selected for you.

  7. That’s it. Save your criteria and publish the changes (a rough FetchXML equivalent of this criteria is sketched after this list).
    Let’s see the results.
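Behind the scenes, the criteria built above translates into FetchXML along these lines. This is only an approximation for the Contact entity – the designer handles the team-membership intersect for you, so the exact link-entity structure it produces may differ:

    <fetch>
      <entity name="contact">
        <attribute name="fullname" />
        <link-entity name="team" from="teamid" to="owningteam">
          <link-entity name="teammembership" from="teamid" to="teamid" intersect="true">
            <link-entity name="systemuser" from="systemuserid" to="systemuserid">
              <filter>
                <condition attribute="systemuserid" operator="eq-userid" />
              </filter>
            </link-entity>
          </link-entity>
        </link-entity>
      </entity>
    </fetch>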

Result

Now, if you navigate to the new view you created with the selected criteria, you’ll see the records Owned by Teams which the logged in user is a part of –

Caveat – Clubbing into 1 view (Owner + Owning Team’s Member) is not possible

Let me point out a caveat before we proceed – in case you are wondering whether we can do this in the same view by adding more ‘Related’ entity criteria, which would look like the below – it won’t work! 😦

Because the Filter Criteria will not let you select these rows (in any order) and group them into an OR group.
By default, this is an AND group, i.e. a field criterion and a related-entity criterion can’t be grouped together with OR.
If you set the above criteria in any order, it’ll end up returning 0 results.


Hope this was useful!

Here are some more Dynamics 365 related posts you might want to check –

  1. Duration field in Dynamics 365 converts Hours value to Days in Dynamics 365 | [Flow Workaround to convert in Hours and Mins]
  2. Import lookup referencing records together in Dynamics 365 CRM | [Linking related entity data during Excel Import]
  3. Show custom ribbon button based on Security Role of the logged in User in Dynamics 365 | Ribbon Workbench in XrmToolbox
  4. Connecting XrmToolBox to an MFA enabled Dynamics 365 environment | Azure AD
  5. Form Access Checker in new Power Apps Form Designer | Model-Driven Apps in Dynamics 365
  6. Use Rich-Text Control for Multiple Lines of Text in Dynamics 365 CE | Quick Tip
  7. Ribbon button visibility based on a field value in Dynamics 365 | Ribbon Workbench
  8. Make On-Demand Flow to show up in Dynamics 365 | Power Automate
  9. Find deprecated JS code used in your Dynamics 365 environment | Dynamics 365 v9 JS Validator tool | XrmToolBox
  10. Remove ‘This Email has been blocked due to potentially harmful content.’ message in Dynamics 365 Emails | OrgDbSettings utility

Thank you!