Update SVG Icon to Custom Entity in Sitemap | Model Driven Apps

If you are used to updating entity icons in the classic UI, here's what you need to do to update the SVG icon of a custom entity you just created, using the new Power Apps Maker portal.

Say the custom entity below comes with its default icon, and you want to replace it with a custom SVG icon.

Adding SVG Icon to Custom Entity

Given that you have the appropriate access to be able to customize the system, follow the steps below –

  1. In your solution, you have the table as well as the SVG icon – the web resource you just created and uploaded with the image you want to set as the icon.

  2. Now, select the Table you want to set the SVG icon to, and click on Properties.

  3. On the right hand pane, expand the Advanced area and look for the Choose table image field.


  4. Then, start typing the Display Name of the SVG icon which you wish to set to this Entity.


    Click Save if no other changes are to be done.

  5. Once Saved, click on Publish.

  6. Now, when you refresh the App where the custom entity is listed in the Sitemap, you’ll see the icon updated.

Hope this was useful!

Thank you!

Preferred Solution in Dataverse | Power Platform Admin Center

If you're familiar with the classic way of customizing Dynamics 365 CRM, you know that by default everything goes into the Default Solution. Components end up lost there, with no clear record of who created what, and where.

Hence, the Preferred Solution feature is a great way to automatically gather all components created outside of a solution into a single solution, ensuring accountability.

Let's see how this works with the help of this simple post!

Mark a Preferred Solution

Given you have appropriate rights like System Administrator or System Customizer, you can go to the Maker Portal (https://make.powerapps.com/), and follow the steps below –

  1. In the Power Apps Maker Portal, when you navigate to Solutions, you'll see a message saying 'Set your preferred solution', and on the right-hand side it shows that the Common Data Services Default Solution is currently set as preferred [you'll know this solution from the Customizations option in the classic UI].

    And at the top, you'll see a button to Set preferred solution.


  2. Now, when you select to set preferred solution, you’ll see all the unmanaged solutions you have.
    Select the one you want to mark as Preferred for anything not directly added to a solution.

  3. Then, you’ll see that Preferred Solution label has been applied for that Solution.

  4. Now, even if you add something directly from other areas – for example, adding a field from Tables – it'll end up with the prefix of the Preferred Solution's publisher.


  5. In this example, it's the added Field 2. The prefix set for the Preferred Solution's publisher was "cf301".

  6. And when you open the Preferred Solution itself, the component you created outside the solution will be added to the Preferred Solution automatically.


  7. This way, customizations no longer get lost in the Default Solution, and all the components created outside of a solution are gathered in one place when you want to investigate your environment!

Hope this was useful!

Thank you!

Use Monitor to debug Model-driven apps remotely | Power Platform

Monitor is a feature that comes in super handy when end users report an issue and it's impractical to ask them to capture and send logs from their browser.

Let's see how this works through this simple blog post!

Capture events from Monitor in Model Driven Apps

Here's how you can use Monitor in model-driven apps to capture issues an end user is facing –

  1. You can go to Power Apps Maker Portal (https://make.powerapps.com/) and make sure you are switched to the intended environment.
  2. Then, select Apps in the left-hand pane to list all the apps. Select the model-driven app you want to enable Monitor for; then, from the Details flyout menu, click on Monitor.

  3. Once you click on Monitor, it opens the Monitor application itself, where all the logs from your session will be captured. You'll also notice there's a Play model-driven app button to enter Debug mode.


  4. It opens the Model-driven app in a new tab and asks you to confirm if you want to join the debug session.

  5. Once you click Join, the app runs in debug mode. Switch to the Monitor tab and you'll notice it has started capturing logs based on your operations in the model-driven app session running in parallel.


  6. As you work in the model-driven app, it'll keep capturing the traffic, much like the Network tab in the browser's Dev Tools.

  7. Now I deliberately added an erroneous code in my custom JS so that I could capture an exception in the monitor.

  8. And if you look at the monitor, you’ll see that this has been captured.

  9. And this is the faulty script I entered – it tries to retrieve the value of an attribute that doesn't exist on the form (without first null-checking whether the attribute exists), so it throws an error.

  10. However, the best use case is asking end users to join your session. Let's see how to achieve this in the next section.

Invite Users to your Debug session

In the Model-driven apps monitor, here’s how you can invite other users to join your session –

  1. In the Monitor, you’ll see Invite or Connect to a User. For this example, I’ll choose Connect user option.

  2. Then, I can simply search for the User whom I want to generate a join link for.

  3. Now, once this user is added, you’ll see a copy link option to copy the link and pass it on to the user who needs to join.

  4. Once the end user has this link, they can join the session, and they'll see this message in their Dynamics model-driven app.

  5. And similarly, once they start reproducing the issue, you can start capturing the traffic on your end.


To fully understand the capabilities of Monitor for model-driven apps, here's Microsoft's official documentation – https://learn.microsoft.com/en-us/power-apps/maker/monitor-collaborative-debugging?WT.mc_id=DX-MVP-5003911

Hope this was useful!

Thank you!

Why Environment Variables don’t appear in Flows? | [Quick Tip]

Are you new to working with Environment Variables, looking to use them in your Flows, but not seeing them there?

Here’s why!

Flows Outside Solutions

If you are creating Flows from the My Flows section, let’s see if you can access Environment Variables or not –

  1. If you create your Flows from My Flows, as shown below –

  2. You won't be able to access Environment Variables in that Flow.

Flows in Solutions [Default and other Solutions]

So, you'll need to have your Flows inside a solution – even if you create a Flow in the Default Solution, you'll be able to access Environment Variables from another solution –

  1. If you are in a Default Solution as shown below and you create a Flow there, you’ll be able to access Environment Variables.


  2. And when you create a Flow there, the Environment Variables will show up as expected.

Hope this was useful!

Thank you!

Pre-Export Step Required setting in Deployment Pipeline | Power Platform Pipelines

Now that you've already set up your basic Power Platform Pipeline and are looking to extend it for more advanced operations, this post is for you!
In case you are still looking to set up your Power Platform Pipeline first, you can check the blog series this very post is part of – Power Platform Pipelines | Blog Series

What is Pre-Export Step Required Setting?

This setting lets you run a trigger before the Export operation from the Development Environment is initiated to run the pipeline – it is only available for the first stage in the pipeline.

It's provided so that you can run external operations before the solution is taken through the pipeline for deployment.

A typical use case: you want to first seek approval from an Admin before the Solution is deployed to Production (or rather, sent through the pipeline for deployment). Once approved, the pipeline should automatically proceed through the rest of the deployment stages.

Pre-Export Step Required

While setting up your Pipeline, in case you were wondering what Pre-Export Step Required setting was, see below –

  1. Once you mark this field as checked/Required, save the record and it’ll appear like this on the record.

  2. What this does is raise the trigger action 'OnDeploymentRequested', which a Power Automate Flow can listen for.

  3. Once a Flow is triggered based on this action, you can run your custom logic, which must complete successfully before the deployment is carried forward.
    In this example, I'm setting up a simple Approval process so that the Admin is aware of, and approves, all deployment requests.

  4. Now, once the approval response is received, check the outcome of the request; if it's Approved, run a Perform an unbound action step to invoke the action 'UpdatePreExportStepStatus'.
    You'll need to pass the StageRunId – you'll get this from the trigger's outputs in the Flow's dynamic content.
    Then, set the status to 20 – this means Approved.
    For rejection, the status to set is 30. (A code sketch of this call appears after these steps.)

  5. Now, once this Flow is in place, every time a Pipeline is Run to deploy the solution, it’ll first wait for the Approval process to complete and the pipeline itself will show the below message.

  6. This status can also be seen under Deployment Stages in the 'Deployment Pipeline Configuration' app.

  7. The Admin, on the other hand, will receive a Power Automate Approval like this (based on whatever you have configured). It is received both in Teams Approvals and in Power Automate.

  8. While approving, the Approver can also enter some notes.

  9. The pipeline will then proceed to deploy to production.

  10. The progress is also reflected in the Pipelines UI.

  11. Once deployed, you’ll see that this is completed Successfully if there are no issues.

  12. You can also see the History. The End Time will represent when it was completed as opposed to Start Time representing when the Deployment Request was initiated.

  13. And also in the ‘Deployment Pipeline Configuration‘ app.



Here's the official Microsoft documentation on gated extensions like this in Power Platform Pipelines – https://learn.microsoft.com/en-us/power-platform/alm/extend-pipelines#gated-extensions-available?WT.mc_id=DX-MVP-5003911

Hope this was useful!

Thank you!

Run a Power Platform Pipeline

Have you set up your first Power Platform Pipeline and are looking to test it out? This post is for you.

If you haven't configured your Power Platform Pipelines yet, refer to this post first – Setup Power Platform Pipelines

Now that you have your basic Power Platform Pipeline set in place, let’s run a created Pipeline!

Run Power Platform Pipeline

Here’s what you need to do in order to Run your pipeline –

  1. Go to the Dev environment your pipeline starts from (i.e. the first environment from which all the customization/configuration should move over).
    Go to the Solution which you want to run through the pipeline.
    For the simplicity of this example, this Solution has just 1 custom column on the Account table.

  2. Now, click on Pipelines and look for the Deployed Pipeline which is ready to be used.

  3. The stages you set up in the blog post – Setup Power Platform Pipelines – will appear here.
    Verify the environment details shown, and click Deploy here once you are sure.

  4. Now, once you click on Deploy here, you'll be given the option to choose when you want to deploy – now or later.

  5. For this example, I’m choosing Now instead of scheduling it for later. Then, I click Next and it’ll go into Validating Stage.

  6. Once the validation passes, you'll also get AI-generated notes if you are in the US region (at the time of writing this post). Then, click Deploy once everything looks good.

  7. Once this is in progress in the background, you’ll see that the pipeline is deploying your solution.

  8. Once this is completed, you’ll see that this is deployed successfully.


  9. And the solution will show up as successfully deployed to the Target environment, like so, in the Managed solutions section.

Hope this short tutorial was helpful!

Thank you!

Setup Power Platform Pipelines

Given that you need to set up Power Platform Pipelines, here's a post for you!
It will walk you through how to set up Power Platform Pipelines.

Pre-Requisites

Here's what you need to set up in order to enable Power Platform Pipelines –

  1. You need to enable Managed Environments for the environments which need to participate in Power Platform Pipelines. Here’s a post on Managed Environment which I’ve written in the past – Enable Managed Environments in Power Platform Admin Center

    Once all participating environments have Managed Environments enabled, select the environment which is supposed to be the "Host" environment where all the pipeline master data will live, and then go to its Dynamics 365 apps section under Resources to install Power Platform Pipelines into that environment.

  2. Once you are in, click on Install app and then search for Power Platform Pipelines.

  3. Confirm that you are about to install this Solution.

  4. Once installed, go to the Power Apps Maker Portal (https://make.powerapps.com/) and select the Host environment in which you installed Power Platform Pipelines.
    Then go to Apps and you'll see the Deployment Pipeline Configuration app. Play that app!



    Let’s see how you can set the environments up first!

Setting up Environments

Here's how you can set up your environments in the Deployment Pipeline Configuration app –

  1. Once you are in the Deployment Pipeline Configuration App, go to Environments and create a New record.

  2. Then, enter all the details. Also, mention if the Environment type is Development Environment or Target Environment.

  3. Once you save the record, the configuration will be validated.


  4. In case you are wondering how to find the Environment ID – in the Power Platform Admin Center (https://admin.powerplatform.microsoft.com/environments), select the environment and you'll see the details as below –

  5. Once all the environments are set up here, this is how it should look –


Configure Deployment Pipelines

Now that your environments are set, let’s also configure the Deployment Pipelines –

  1. Go to Pipelines and create a New record.

  2. Now, fill in all the relevant information and save the record.

  3. Now, link your Managed Environments in the Linked Deployment Environment grid below. Then click on the Add Existing Environments button.

  4. Once you add them, they'll appear like this while you select them in the lookup. Then click Add.

  5. Once the Development Environments are added, go ahead and create the new Pipeline Stages too.

  6. In the new Deployment Stage, I’ll simply tag the Production Environment and save the record to keep this example simple.

    At this point, your Pipeline is all set to Run.

    Shortly, I’ll share another post on how you can Run a Pipeline in Power Platform!

Hope this was useful!

Thank you!

Pass Entity parameter from Power Automate Flow to an Action

Let's say you want to run a Power Automate Flow on a set of Dataverse records, and those records need to be referenced in your C# plugins.

The next step is to create an Action for this and register it in the Plugin Registration Tool.

In case you are new to plugins in CRM, you can refer to this series – Plugins Development in Dynamics 365 CRM for Beginners | [Blog Series]

Bound Action in CRM

Let's suppose you already know how to create Actions in Dynamics 365 CRM – the concept has been around for many years – and I'm also assuming you know how to profile a plugin (one registered on an Action in CRM) –

  1. When you open the Action, let's suppose you are passing an Account as well as a Contact Entity parameter to the Action itself.
    Notice that the Action is already registered as a Bound Action on the Account entity.


  2. Also, I'm assuming that you have Activated this Action and then registered a step on it for your plugin in the Plugin Registration Tool.


    Now, let's see how we can pass the parameters to the plugin itself from the Flow.

Power Automate Flow – Bound Action

You must’ve used the Dataverse connector a lot, so here’s how you can call the Bound Action and pass the Entity parameters to the Action itself –

  1. You'll need to use the Perform a bound action action in Power Automate, which is offered by the Dataverse connector.

  2. Then, select the Table on which you registered the Action in CRM; the Action's backend name will then appear for selection. Then, pass the primary key of the record.

  3. Now, since there are 2 Entity parameters to be passed to the Action, you'll see all the fields from both of those Entities listed here!

  4. For each of these Entity parameters, look for the primary key field of the respective table and pass the GUID of the record you presumably already have.

  5. But wait! There's a limitation here: the selected step can only handle 1024 properties, so in practice only one Entity parameter will fit.
    You'll get the below error when trying to set both of the Entity parameters I'm passing to the Action above.
    The error says "The dynamic schema response from API 'commondataserviceforapps' operation 'GetMetadataForBoundActionInput' is too large, only schemas with at most 1024 properties are supported."

  6. Hence, I'll go back and remove one Entity parameter just to make this example work!

  7. And then, I’ll simply profile the Plugin Action to see what is passed in the InputParameters.

Plugin Context

Now, given that you have already profiled the plugin and attached the profiler in the Plugin Registration Tool, let's examine the context's InputParameters –

  1. The InputParameters collection in the plugin's execution context will contain the following parameters (a sketch of reading them follows below):
    Target [EntityReference – note that for a plugin step registered on a CRUD operation, Target is an Entity in the plugin context instead]
    AccountRecord [Entity]
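
Here's a minimal sketch of reading these parameters inside the plugin. 'AccountRecord' is simply the name given to the Entity parameter on the Action in this example – use whatever name you defined on your Action.

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class ReadActionParametersPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        // For a plugin step on a Custom Action bound to Account, Target is an EntityReference.
        // (On a CRUD message like Create/Update, Target would be an Entity instead.)
        var target = (EntityReference)context.InputParameters["Target"];
        tracing.Trace("Bound to {0} with id {1}", target.LogicalName, target.Id);

        // 'AccountRecord' is the Entity parameter defined on the Action in this example.
        if (context.InputParameters.Contains("AccountRecord") &&
            context.InputParameters["AccountRecord"] is Entity accountRecord)
        {
            tracing.Trace("Received {0} with {1} attributes.",
                accountRecord.LogicalName, accountRecord.Attributes.Count);
        }
    }
}
```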

Considerations

Here are some considerations if you want to make design decisions for your implementation –

  1. You cannot pass 2 or more Entity parameters, as the Perform a bound action step itself is limited to 1024 properties at the Power Automate level.
  2. Pass an Entity other than the one the Action is registered on, since you already get the registered Entity as an EntityReference in the "Target" parameter.

Hope this was useful!

Thank you!

Risk Assessments on Projects using Copilot in Project Operations

By now, you must have enabled Copilot for Project Operations [if not, you can refer to this post to learn how – Enable Copilot for Project Operations]. Let's look at how Copilot can assess risks on a Project.

Note: Please note that this is a Preview feature at the time of writing this post and hence, not recommended for Production usage.

Enable Copilot For Project Operations

First, you need to ensure that Copilot is enabled for your Project Operations environment – Enable Copilot for Project Operations

Once this is enabled, you'll see a Copilot button on the ribbon of the Project. Let's look at it in the section below.

Risk Assessment

Now that you have Copilot enabled for Project Operations, you can run a Risk Assessment for a Project in Project Operations –

  1. When you navigate to a Project, you’ll see Copilot appear on the Ribbon given that you have enabled it for your Project Operations instance.

  2. Once you select Risk Assessment, it runs in the background and takes a few minutes before Copilot populates the Risks for you.



  3. Once this is generated in a few moments, it appears in the Risks tab on the Project itself.


  4. Additionally, you are free to use this data in reporting as well.

Here’s Microsoft’s Full Documentation on how Risk Assessments work in Project Operations’ Copilot (Preview) – https://learn.microsoft.com/en-us/dynamics365/release-plan/2023wave1/finance-operations/dynamics365-project-operations/assess-issues-risks-project-using-project-manager-copilot?WT.mc_id=DX-MVP-5003911

Hope this was useful!

Thank you!

Enable Copilot for Dynamics 365 Sales environment

Copilot for Dynamics 365 Sales is in Preview at the time of writing this post. Hence, I'll start by showing how you can turn Copilot for Dynamics 365 Sales on for your environment.

It is recommended to do this in your Sandbox instances first.

Enable Copilot for Dynamics 365 Sales

Given that you have the correct licenses setup and you are a System Administrator, you can follow the below steps in order to enable Copilot for Dynamics 365 Sales –

  1. Make sure you are in the Sales Hub app.

  2. And then go to the App Settings

  3. Here, you'll see Copilot as an option in the Sitemap; select it.

  4. Note that all the Settings are turned off by default.

  5. The first thing you can do is turn Auditing on. It'll take a while to save the changes in the background.

  6. Then, you can turn on the other features which are in Preview, and enable Copilot for the apps of your choice from the published apps in your environment.

  7. Here are the Preview features which are listed under the See what’s in preview link in the above screenshot – https://learn.microsoft.com/en-gb/dynamics365/sales/view-copy-email-summary
  8. Once the changes are saved, it'll look like this in the Published state – the selections will remain and the Publish button will be disabled.

  9. At this stage, Copilot has been enabled for the selected apps in your environment.
    Now, you can move on to the other options, like the Opportunities and Leads tabs on the settings page.
    The Summary section for these entities lets you choose which fields are included in the summary information Copilot generates.
    The Recent Changes section lets you choose which fields are included for tracking changes.


At this point, you are all set with configuring Copilot in your Dynamics 365 Sales environment. Next, I'll write about how to use the features we enabled in this post, and I'll share the link to the upcoming post here.

Hope this helps!

Thank you!

Position Hierarchy Settings in Dynamics 365 CE

In this post, you'll learn how to configure Position Hierarchy for a Dynamics 365 CE environment –



Let's first look at the scenario, and then at how we can configure the hierarchy so that users in each Position see only the intended data.

Scenario

Let's consider the below scenario of who reports to whom in the org CFT300, based on the below Positions –

In the above example,

  1. A Salesperson should see only their own records.
  2. The VP of Sales should see their own records and those of the Salesperson position too.
  3. The Executive Director should see their own records and only those of the VP of Sales, but not of the Salesperson position.


Position Hierarchy

Given that you already know how to navigate to Hierarchy Settings in the Power Platform Admin Center, refer to the below to understand how to configure it for the above scenario –

  1. Once you are in the Hierarchy Settings in the Environment’s Settings area in Power Platform Admin Center –


  2. Now, select Enable Position hierarchy Model and click on Save to apply the Position Hierarchy Model to your environment. Once saved, you'll see it as below.

  3. The Depth defines how many levels down the Position hierarchy a user can access other users' records, in a top-down approach.

  4. Let’s click on Configure in order to start setting up the Positions in the Org.

  5. Now, based on the diagram above, I'll create the Position hierarchy on this page.

  6. Based on that, I've created the below Positions in a hierarchy.


    And the tree looks like this –

  7. Next, assign these Positions to the different users in the Power Platform Admin Center. If you go to Users and select any of the users, you'll see the Change Position button on the ribbon.

  8. Then, find the Position you created that you want to assign to the user. Select it and save on the pane.

  9. Complete the process for all the users who need one of the Positions you created.


    So based on this, Jack Green will be the Executive Director and will be able to access Amit Prajapati’s records and not Vidit Gholam’s or Ethan Rebello’s records.

    Also, the selected Tables are the ones to which the Position Hierarchy should apply.


    Now, based on the above setup and the Scenario provided, let’s look at how the records will be visible to the Users in the hierarchy.

Dynamics CRM Records access based on the Position Hierarchy Security –

  1. Let's start reviewing from the bottom of the hierarchy. Vidit and Ethan will both see only their own record in the Active Accounts view, and no one else's, based on the Hierarchy Settings.
    Also, note that the Read privilege for all the users in their Security Roles is set to "User" and not "Organization".

    Ethan Rebello


    Vidit Gholam

  2. We move 1 level up to Amit Prajapati – he’ll see his own record and also Vidit’s and Ethan’s records in Active Accounts view.

  3. And Jack Green can access his own record; as the Executive Director he can see the VP of Sales position's records, so he'll see only Amit's Account records and not Ethan's or Vidit's.


    This will change if we increase the Depth to 2, 3 and onward, based on the hierarchy structure. For example, with a Depth of 2, Jack Green would also see Vidit's and Ethan's records, since the Salesperson position is two levels below Executive Director.

Hope this was useful!

Thank you!

Manager Hierarchy Settings in Dynamics 365 CE

In this post, you'll learn how to configure Manager Hierarchy for a Dynamics 365 CE environment –


Let's first look at the scenario, and then at how we can configure the hierarchy so that Managers see only the intended data.

Scenario

Let’s consider the below scenario on who reports to whom in the Org CFT300 used in this example –

In the above example,

  1. The Manager of Vidit Gholam and Ethan Rebello is Amit Prajapati.
  2. The Manager of Amit Prajapati is Jack Green.

And this structure looks as below in Dynamics 365 CRM environment –

Note: Please note that in order for Hierarchy Settings to work correctly, the Read privilege on the intended entity must be set to the "User" level. If it is set to "Organization", the user will be able to access everyone's records anyway, despite the Hierarchy Security settings in place.

Manager Hierarchy

Given that you already know how to navigate to Hierarchy Settings in the Power Platform Admin Center, refer to the below to understand how to configure it for the above scenario –

  1. Once you are in the Hierarchy Settings in the Environment’s Settings area in Power Platform Admin Center –

  2. Now, you can select Enable Manager hierarchy Model and click on Save to apply the Hierarchy Model access to your environment.

  3. The Depth defines how many levels down the reporting chain a Manager can access their users' records, in a top-down approach.
  4. You'll also need to set the users' Manager in the Power Platform Admin Center. If you go to Users and select any one of them, you'll see the Change Manager button on the ribbon.

  5. Then, search for the user who should be the Manager of the user you are editing.


    So based on this, Jack Green will only be able to access Amit Prajapati’s records and not Vidit Gholam’s or Ethan Rebello’s records.

    Also, the selected Tables are the ones to which the Hierarchy Security should apply.


    Now, based on the above setup and the Scenario provided, let’s look at how the records will be visible to the users in the hierarchy.


Dynamics CRM Records access based on Manager Hierarchy Security –

  1. Let's start reviewing from the bottom of the hierarchy. Vidit and Ethan will both see only their own record in the Active Accounts view, and no one else's, based on the Hierarchy Settings.
    Also, note that the Read privilege for all the users in their Security Roles is set to the "User" and not the "Organization" level.

    Ethan Rebello


    Vidit Gholam –

  2. We move 1 level up to Amit Prajapati – he’ll see his own record and also Vidit’s and Ethan’s records in Active Accounts view.


  3. And Jack Green can access his own record; as he's the Manager of Amit Prajapati, he'll only see Amit's Account records and not Ethan's or Vidit's.


    This will change if we increase the Depth to 2, 3 and onward, based on the hierarchy structure. For example, with a Depth of 2, Jack Green would also see Vidit's and Ethan's records, since they report to Amit, who reports to Jack.

Hope this was useful!

Thank you!

Time and Materials Billing Backlog table in Project Operations

Ever wondered what the Time and Materials Billing Backlog table you see in the Project Operations sitemap is, and what records actually show up in it?

Time and Materials Billing Backlog

Here's the purpose of the Time and Materials Billing Backlog entity –

  1. Say you have 2 Time Entries submitted for approval to the Project Manager / Account Manager.

  2. When the Time Entries are Submitted, the Approver receives them for Approval. The Project Approver can then Approve the Time Entries.

  3. Once they approve the Time Entries, these are turned into Actuals (msdyn_actual) in Project Operations.
    The Actuals are then supposed to be marked as Ready to Invoice.

  4. So, when the Actuals are created, and even once they are marked Ready to Invoice, they appear in the Time and Materials Billing Backlog view.
    This 'table' is in fact the Actuals table itself! 😊 (There's no separate Time and Materials Billing Backlog table in Dataverse – see the query sketch after this list.)

  5. In fact, Accounting Managers or Project Managers can mark the Actuals as Ready to Invoice right from the Time and Materials Billing Backlog view itself.

  6. Now, when the Invoice is created for the ready Actuals, they are added to Invoice Lines.

  7. And then, they are removed from the Time and Materials Billing Backlog view.
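
Since the view is backed by the Actuals (msdyn_actual) table, here's a minimal sketch of querying those rows with the Dataverse SDK. The billing status filter is left as a commented-out placeholder because the exact column name and option value for 'Ready to invoice' should be confirmed against your environment's metadata.

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class BillingBacklogQuery
{
    // Retrieves Actuals rows – the same records surfaced by the
    // Time and Materials Billing Backlog view in the Project Operations sitemap.
    public static EntityCollection GetBacklogActuals(IOrganizationService service)
    {
        var query = new QueryExpression("msdyn_actual")
        {
            // Retrieving all columns for simplicity; narrow this down in real code.
            ColumnSet = new ColumnSet(true)
        };

        // Illustrative filter on billing status – verify the column name and the
        // option value for 'Ready to invoice' in your environment's metadata:
        // query.Criteria.AddCondition("msdyn_billingstatus", ConditionOperator.Equal, readyToInvoiceValue);

        return service.RetrieveMultiple(query);
    }
}
```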

Hope this was useful!

Thank you!

Create Project Plan using Copilot in Project Operations

By now, you must have enabled Copilot for Project Operations [if not, you can refer to this post to learn how – Enable Copilot for Project Operations]. Let's look at how Copilot can create a Project Plan for you.

Note: Please note that this is a Preview feature at the time of writing this post and hence, not recommended for Production usage.

Enable Copilot For Project Operations

First, you need to ensure that Copilot is enabled for your Project Operations environment – Enable Copilot for Project Operations

Once this is enabled, you'll see a Copilot button on the ribbon of the Project. Let's look at it in the section below.

Task Plan

Now that you have Copilot enabled for Project Operations, you can create a Task Plan for a Project in Project Operations –

  1. Before you try to create a Task Plan, note that Copilot in Project Operations looks at data on the Project record – such as the Project Description, Start Date, End Date etc. – as reference points to understand what type of Tasks should be created.


    The sample Description I put on this Project is – "This project will be a 3-month implementation for Microsoft Dynamics 365 Project Operations with integration to F&O system. This project will involve developers to design, architect, develop code, test and deploy the same."

  2. Once the Project details are in place and the Project doesn't have a Project Plan yet, it should look something like this, with the Copilot button visible on the ribbon.

  3. If you expand the menu, you'll see an option called Task Plan. Click on it, and it'll begin processing and show a loading screen while it works in the background.


    And the loading screen will keep showing messages like Computing, Collecting, analysing etc.

  4. In a few moments, it’ll process successfully and you’ll see Tasks being generated in the Tasks pane on the Project. Note that it’ll only segment into tasks and sub-tasks based on the Description and Start/End Dates – it won’t assign anyone or estimate any hours.

Here's the Microsoft documentation on how the Task Plan feature works. Please note that this is in Preview at the time of writing this post – https://learn.microsoft.com/en-us/dynamics365/release-plan/2023wave1/finance-operations/dynamics365-project-operations/generate-project-plans-using-project-manager-copilot?WT.mc_id=DX-MVP-5003911

Hope this post was useful!

Thank you!

Enable OneNote Integration for Dynamics 365 Sales environment

Since collaboration tools like Teams and SharePoint are the norm, so is OneNote. Here's how you can enable OneNote integration in your Dynamics 365 CRM instance –

Settings

Given that you are a Dynamics 365 Administrator, navigate to the Power Platform Admin Center for your environment –

  1. Go to Settings for the environment you want to enable OneNote integration on.

  2. Once you are in Settings, you can expand Integration section and then click on Document management settings.

  3. Then, you’ll be redirected to the classic UI and there, you can select Enable OneNote Integration (provided that SharePoint Based Document Integration is already enabled).

  4. It'll open up this dialog box showing the tables you can enable. Choose the ones you want to enable OneNote integration for and click Submit. The dialog box will then close.


  5. Also, double-check on the entity level if OneNote Integration is turned On or not.

  6. Once this is done, you can refresh the record on which the OneNote Integration is enabled and click on + on the Timeline to show the Activity options. You’ll see OneNote appear as well.

  7. Once you click on it, you'll be redirected to the SharePoint location where the OneNote notebook is created.

  8. Now, when you come back to the record, click on the Related section and go to Documents – you'll see the OneNote notebook appear there on the record.



Thank you!

Setting up Project Operations Lite | Blog Series

I felt I should document the steps to set up a Project Operations Lite trial environment for everyone who's trying to spin up a Project Operations instance and get the basic setup done, as of January 2024.

Hopefully, this blog series will help you quickly access and understand the info you need to set up your Project Operations Lite trial environment –

Provisioning a Project Operations Trial

Starting a Project Operations Lite Trial Environment – Describes how you can provision a new trial for Project Operations Lite.

Setting up Master Data in Project Operations Lite

Setting Up Default Organizational Unit, Cost and Sales Price List in Project Operations Lite – Find and rename the default Organizational Unit, attach a Cost Price List to the default Org Unit, and create a Sales Price List (Roles and Prices are added in the following post).
Resource Roles and adding Role Prices to Cost and Sales Price List in Project Operations – Review and create Resource Roles, and add Resource Roles to the Sales and Cost Price Lists.

Billing / Invoicing

Time and Materials Billing Backlog table in Project Operations – Explains what the Time and Materials Billing Backlog table in Dataverse in Project Operations is used for.

Copilot for Project Operations

Enable Copilot for Project Operations – Enable / disable Copilot as a feature for Project Operations.
Create Project Plan using Copilot in Project Operations – Create a Project Plan for your Project using Copilot, based on the Description and other details on the Project record.

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Setting Up Default Organizational Unit, Cost and Sales Price List in Project Operations Lite

One of the first things to do when starting a new Project Operations Lite environment is to set up its organizational data – and within that, Organizational Units, Price Lists and Roles are among the first things to configure –

Let's see how you can get started!

Organizational Unit

One of the first things is to find the Default Org unit for your Project Operations Lite environment –

  1. Ensure that in your Project Operations environment, you are in Project Operations app. On the SiteMap, switch to the Projects area and then, Settings sub-area.

  2. Now, locate Parameters; there should be one record in Parameters which is mapped to the Default Organizational Unit. Select it (or click on the Default Organizational Unit lookup itself to go directly to the Organizational Unit [also referred to as the Contracting Unit in further blogs]).

  3. Once in the Org Unit record, update the name to something relevant for your use case.

  4. I've renamed this to a more readable name for my example. Now it should be easy to identify this Org Unit when referencing it further.

Cost Price List in Organizational Unit

Now that the Org Unit is found and renamed correctly, let’s add a Cost Price List first –

  1. In your Org Unit record, click on Add Existing Price List.

  2. Then, click on the Look for records lookup field and then on the New Record button.

  3. Select Price Lists as the record type and it should be the only option there.

  4. Then, it'll open the Price List creation form in a window and you can start filling in the details. Make sure the Context selected is Cost.
    Also, set appropriate Effective Start and End Dates – calculations will only succeed if the transaction dates being processed fall within the Price List's dates.

    Save and Close the record once completed.

  5. The saved record will then show as selected in the lookup field, and you can click the Add button at the bottom.

  6. Once added, the sub-grid on the Org Unit will take a while to refresh and then show you the tagged Cost Price List correctly.

Now, you have tagged a Cost Price List successfully. Let’s quickly add a Sales Price List.

Sales Price List

Now that you have created and tagged a Cost Price List in the previous section, let’s quickly create a Sales Price List shell so that you have a corresponding Sales Price List to look at as well –

  1. Navigate to the Sales area and then select the Sales sub-area in the Sitemap.

  2. Here, click on the Price Lists table and then click to add a new Price List record.

  3. Note that the Context of the record should be Sales this time, since we want to create a Sales Price List. Save & Close once you've filled in the relevant information.

  4. Now, you can see that the Sales and Cost Price Lists are ready in your environment.

The next post in this series covers adding Resource Roles, and then adding Role Prices to the Sales and Cost Price Lists.

In later parts of this series, we’ll continue to add more to the Price Lists, Roles etc.

Hope this post was useful!

Thank you!

Starting a Project Operations Lite Trial Environment

One of the first things you want to get done is to spin up a Project Operations Trial environment. Here’s how you do it!
Note that no Credit Card is required for a Project Operations trial environment.

Project Operations Trial Environment

Here’s how you start a Project Operations trial environment –

  1. Navigate to https://trials.dynamics.com/, which will redirect to https://dynamics.microsoft.com/en-us/dynamics-365-free-trial/, and you'll see a page something like this –

  2. Scroll down on the page till you see the option for Project Operations. Click on Try for free.

  3. Once you click on Try for free, you can enter a new email on which you want to start a trial. I’ve come up with a sample email which I haven’t used before. I prefer not to enter an email/username which I already use.

  4. Click Next and then you can select Setup my account.

  5. Then, you can validate/enter the details which would be something like this and click Next.

  6. Then, choose if you want to receive an OTP via Text or a Call. Text is my preferred way so I selected that and clicked on Send verification code.

  7. I received an OTP which I entered as asked on the next pane and clicked Verify.

  8. Further, I’ll be shown the login details of the trial environment I want to create. Once this is done and I save the information, I click on Next.

  9. Then, you’ll be shown the details of the environment you’ll be provisioning. Save this info before proceeding.
    Next, click on Start using Dynamics 365 Project Operations (CE) – Preview Trial. It’ll redirect you to the questionnaire page on a new tab where you’ll need to answer the questions before choosing the Deployment Type.

Deployment Type Questionnaire

Continuing from step #9 in the section above, you'll be taken to a questionnaire wizard in order to provision a Project Operations trial –

  1. Yes to managing Opportunities as they move through the process.

  2. Yes to requiring advanced or extensible resource management.

  3. Yes to requiring a workflow for approval of Time and Expense.

  4. No to advanced Expense management.

  5. No to Non-Stocked Materials

  6. No to Stocked materials as well.

  7. Select the version here. Even if you had answered the questions differently, you would still get to choose a version other than the recommended one. The answers above result in a recommendation of the Project Operations Lite deployment type.
    Click Begin Setup once Lite Deployment is selected.

  8. Setup will start provisioning once you click on Start.

  9. It'll take a while, and it'll mention that it will redirect you to the Power Platform Admin Center.

  10. Once you are redirected to the Power Platform Admin Center, you'll see that the environment is provisioning. This will take a few minutes to complete.

Set Name and URL for your environment

Provisioning a Project Operations trial sets a predefined URL and name, which you should update to something more readable –

  1. Open the environment once the link is clickable and the State is set to Ready.
  2. Then, click on the Edit button

  3. Once you click Edit, you'll get to update the Name and the URL. Set them to something relevant to the purpose of the environment.

  4. Once you click on Save, you’ll see this loading page which will take a while to complete and finally, your Project Operations Lite environment setup will be ready!

Here’s Microsoft’s Documentation detailing which all Deployment Types have different Project Operations features available – https://learn.microsoft.com/en-us/dynamics365/project-operations/environment/determine-deployment-type?WT.mc_id=DX-MVP-5003911

Hope this post was useful!


Difference between Project Operations Deployment Types

As you may know, there are the following 3 types of Project Operations deployments –

  1. Project Operations Lite (Deal to Proforma Invoicing).
  2. Project Operations for resource/non-stocked scenarios.
  3. Project Operations for production/stocked scenarios.

To keep the comparison understandable and simple, let’s review the below in a short summary.

Comparison

Project Operations Lite: the Finance & Operations module is not set up; no Dual Write integration; invoicing is available up to Pro-forma Invoicing; basic Expense.

Project Operations for Resource/Non-stocked: Finance & Operations setup is required; Dual Write integration is required; Pro-forma Invoicing in Project Operations and customer-facing Invoicing in F&O; basic and full Expense with Receipt OCR.

Project Operations for Production/Stocked: Finance & Operations setup is required; Dual Write integration is required; full Invoicing in F&O; full Expense with Receipt OCR.

Hope this quick summary was useful. I’ll continue adding more to this article over time.

Here’s official Microsoft Documentation for full deployment guidance with questionnaire –

Thank you!

Microsoft Loop is now in GA for Microsoft 365 work accounts

Microsoft Loop is now Generally Available for work accounts in Microsoft 365. Here’s an exciting new product that will make collaboration fun and productive!

Here’s Microsoft’s Announcement article on the same – https://techcommunity.microsoft.com/t5/microsoft-365-blog/microsoft-loop-built-for-the-new-way-of-work-generally-available/ba-p/3982247?WT.mc_id=DX-MVP-5003911

But let me summarize my first impressions!

Accessing Loop in Microsoft 365

Given that you have the correct access for Loop services in M365, here’s how you can access and use Loop in your Organization –

  1. Go to https://loop.microsoft.com/. You’ll be taken to the Welcome page which looks like this.
    You can click on the Loop logo on the top-left to see all the Workspaces you have access to.

  2. You can start off by creating your own Workspaces.

  3. Name your Workspace based on the purpose or the project.

  4. And your Workspace will be ready for you to create Pages and Links in.
    If you click on the + icon, you'll be able to create new sections like Pages and Links.

  5. Within the Pages themselves, you can start adding Loop components and generating content as below –
    Type a slash (/) to insert a Loop component.

  6. And you can tag people/content using @

  7. In the Tag menu itself, if you scroll to the very bottom you’ll be able to change the Editor Settings.


    And this is what the Editor Settings menu looks like, where you can change the settings to suit your needs.

You can explore further what else Loop can do! I believe more features are coming soon!

Ensure Loop is available for the Organization

By default, Loop is enabled for your organization. However, you can double-check as below –

  1. Given that you have Global Admin access to your tenant, you can go to the Org Settings.

  2. And you can see that workspace access has been given to all the users.

Licensing

Having one of these Microsoft 365 licenses will let you use advanced features of Microsoft Loop –

  1. Microsoft 365 Business Standard
  2. Microsoft 365 Business Premium
  3. Microsoft 365 E3
  4. Microsoft 365 E5


Detailed Documentation here on the Licensing requirements – https://support.microsoft.com/en-us/office/loop-access-via-microsoft-365-subscriptions-92915461-4b14-49a4-9cd4-d1c259292afa?WT.mc_id=DX-MVP-5003911

In case you don’t have the correct required licenses, you’ll still be able to access Loop but won’t be able to create Workspaces or use any advanced features. Typically, you’ll see a screen like this –

Hope this helps!


Thank you!

Power Automate Cloud Flows designer using Copilot | Now in GA

Microsoft rolled out the Copilot-enabled Power Automate cloud flows designer on 8th Nov 2023; here's a look at how you can use it in your scenarios while designing cloud flows!

Copilot in Power Automate

As you might have noticed by now (based on the currently supported region you are in), Copilot is enabled in Power Automate Flow Designer directly!

  1. You'll notice that the designer's look and feel is newer and more refined than the previous UI. Of course, I'll need time to get familiar with this in the coming days. 😊
    But I'll share with you what I've learnt so far.

  2. Next, when you click on a step, its properties open in a pane on the left-hand side, so you no longer get the drop-down menu that pushed your action off the top of the screen and forced you to scroll.

  3. Once the property pane appears, you can select the different types of triggers available from the Runtime menu, which was previously a tab within the action selection dialog box.

  4. On the right hand side, you can see the Copilot button to show and hide the Copilot pane where you can type in your Commands.

  5. For example, I can type a query in natural language to retrieve records from Dataverse. Here's how it looks.
    My request is then turned into an appropriate trigger retrieving the correct information I was looking for.

  6. And to check what was retrieved, I can click on this trigger to reveal the properties and verify or change them if I need to.

  7. In case there's something Copilot can't understand, no action will be taken on the Flow itself.

  8. Then, you can simply click on the thumbs down icon and submit your feedback.

  9. I faced an issue while submitting feedback, but I think I might be missing something, or this is still being fixed.


  10. Further, here's how Add an action works – it simply reveals a pane on the right-hand side, which was previously a flyout menu.


Overall, the visual improvements help in following the structural flow of logic better, and I'm looking forward to more updates on this in the coming days, weeks and months!


For now, you can always go back to the classic designer by clicking on the ellipses menu and selecting the option to switch back.



Here’s a link to the Microsoft post on the announcement of this feature – https://learn.microsoft.com/en-gb/power-platform/release-plan/2023wave2/power-automate/use-power-automate-cloud-flows-designer-copilot?WT.mc_id=DX-MVP-5003911

Hope this helps!

Thank you!

Read Excel File from SharePoint Online and create Records in Dataverse | Power Automate Flow

One of the most common scenarios is to pick up an Excel spreadsheet from a SharePoint document location and create records in Dataverse.

There are several ways to do this, but one of the most common approaches is a Power Automate Flow that uses the Excel Online and SharePoint Online connectors to perform this operation!

Scenario

Here's the scenario, which you can expand on and adapt to your use case –

  1. There's a file in a SharePoint document location called AccountImport.

  2. This file has some Account information that needs to be inserted in Dataverse
    Here’s the Excel content which has a Table in it.


  3. And this data needs to be Inserted in Dataverse [You can either Automate this Flow, Make it On Demand — based on whatever is suitable for you]


    Let's create an On-Demand Flow to pick up this file and then insert the rows into Dataverse.

Power Automate Flow

Here's the Power Automate Flow which we'll create. For the sake of this example, and to keep it simple, we'll create an On-Demand Flow –

  1. Create an On-Demand Flow in Power Automate [you could also choose to run the Flow when a SharePoint file is created or changed, or even run the Flow once every day — depending on what best suits your case].
    Then, look for the action called List rows present in a table from the Excel Online connector.

  2. In this action, select the SharePoint site where the file resides in the document location.
    Then, select the Document Library where the documents reside – that's typically where you'd want to place your documents.
    Finally, select the file itself by navigating via the folder icon on the File property, as shown below.


    And then select the File once you find it.

  3. Once you have selected the file, the Table itself will be available to pick from the Table property. Make sure the Excel data has been formatted as a Table.

  4. Now, once all the properties are set on the List rows present in a table action, add a For each loop to the Flow.
    In its input, pass value (the list of items) from the List rows present in a table action we just configured above.
  5. Once this is set, select the Add a new row action from the Dataverse connector in order to create the records sequentially in Dataverse.

  6. Here, map the columns to the fields in Dataverse. First, select the Table in Dataverse you want to insert these records into.
    Then, select the fields from Excel, which the Excel connector has already separated out for you.

  7. Once you have mapped all the fields from Excel to the Dataverse action,
    save and test the Flow.
    This will create the records in Dataverse (Dynamics CRM).


    And it’ll Run in a few moments and succeed if everything goes right.



Dataverse Records

Now, let’s see the Flow in action –

  1. Because this is an On-Demand Flow, you can run it whenever you want. When it runs successfully, the records will be created in Dataverse as shown below.


Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Dataverse Low-Code Plugins | Dataverse Accelerator | [Preview]

So by now, you must’ve come across Dataverse Low Code Plugins quite a lot if you’re following Dynamics 365 Wave Updates.

Here’s a post that demystifies and summarizes what Low Code Plugins are all about and how you can start implementing the same.

Note: This feature is in Preview at the time of writing this post. Hence, it is not recommended for Production use.

What are Dataverse Low-Code Plugins?

Here’s what Low-Code plugins are about in Dataverse –

  1. Low Code plugins let you write server-side business logic without having to write custom .NET code and register assemblies manually in Dataverse.
  2. Currently, this is in Preview and once this is out of Preview, a lot of features are expected to be released for Low Code plugins.
  3. You can create 2 types of Low Code Plugins –
    • Instant [On Demand]
    • Automated [Based on a Dataverse event]

Let’s see how you can get started in using Low Code Plugins in Dataverse.

Getting Started

Here’s how you can get started on Low-Code Plugins in Power Platform.

  1. Ensure that you have the Dataverse Accelerator App installed in your environment. From 1st Oct 2023, all new environments come with the Dataverse Accelerator App installed.

  2. Now, you can run the Dataverse Accelerator in the Power Platform Maker Portal [https://make.powerapps.com/environments/] –

  3. Usually you’ll see a Play button when you hover over the Dataverse Accelerator App listing, but alternatively, you can also click on the three dots (ellipses) and choose Play.

  4. You can even open the Dataverse Accelerator as an App in your Dynamics 365 CRM Sales environment, just like other Model-Driven Apps.

  5. When the App opens in a new tab in your browser, you’ll see that it looks similar to the Power Apps Maker Portal. You’ll get an option to create a new Plugin –
    You can create 2 types of Plugins – Instant and Automated.


    Or even using the large buttons on the Home Screen

  6. In this example, I’ll create an Automated plugin that will perform some action when an Account is updated.
    Example: Simply put, when any field is updated on the record, the Description field will be updated with the value “This is my first Low Code Plugin“.
    So, when I click on Create automated plug-in, I get to define the Name of the Low Code Plugin and details like when it should be triggered.

  7. Once you save this, your Plugin will appear in the Automated plugins list in Dataverse Accelerator.


  8. Now, once you edit the record, notice that the field is blank –

  9. And when you Save the record, the Description field will be updated with the value as mentioned in the Formula of the Plugin.


Low Code Plugin Power Fx [Preview]

Here’s where you can start learning Power Fx for Low Code plugins. Please note that a lot of formulas are not yet supported as of the time of writing this post – https://learn.microsoft.com/en-us/power-apps/maker/data-platform/low-code-plug-ins-powerfx?WT.mc_id=DX-MVP-5003911

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Enable App Auto-Updates in Power Platform Admin Center | [Preview]

Sometimes, Apps need to be up to date in order for some features to run effectively. Power Platform Admin Center now allows you to select Third-Party Publishers for an Environment to allow automatic App Updates in your defined Maintenance Window slots.

Enable Auto-Updates

Here’s how you turn on Auto-Updates for certain Publishers in your environment –

  1. Go to the Environment itself in Power Platform Admin Center, given that you have the correct Admin access. [https://admin.powerplatform.microsoft.com/environments]

  2. Then, look for Updates and then App Update Settings

  3. Then, turn On Select publishers from which you want to receive app updates to this environment.

  4. Finally, select the Publisher which you want to receive Automatic App Updates on.

  5. And once you select, you’ll see the selection as below to show which apps are allowed to auto-update in your Maintenance Window Hours.

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Validate Email address on Email field in Model-Driven Apps | [Preview]

As part of various implementations, you’ll often need to ensure that field validation is in place.

Note: This feature is still in Preview at the time of writing this post.

Enable Data Validation in Power Platform Admin Center

Here’s how you can enable Data Validation for Email fields in Power Platform Admin Center

  1. Once you have the correct rights in Power Platform Admin Center (https://admin.powerplatform.microsoft.com/environments), select the Environment you want to enable Data Validation for.

  2. Now, expand the Product section and go to Features as shown below.

  3. When you scroll to the very bottom, you’ll find the Data Validation option turned off by default.

  4. Turn it on and Save the Settings.

  5. Also, in the App Designer for Model-Driven Apps, click on Settings in the Command bar and then go to Upcoming features in the Settings list.
    You’ll find that the Enable Smart Email Address Validation Control feature is still set to No.

  6. Turn this feature on, Save the settings and Publish the App again before running it.



Validation in D365 Sales Example

Let’s look at the below example –

  1. When you have enabled the Email address validation, you can try to enter the below data, which is invalid on purpose, and see the result.
    When you enter an address, it is validated on the fly.


  2. It’ll show that the domain is known but disposable, for example.

What Validation is considered?

Below are the validations that are carried out (a small syntax-only illustration follows the list) –

  1. Incorrect Syntax – e.g., a malformed address where the Username and Email Domain don’t fit together as a valid address.
  2. Disposable Domain – Temporary Email domains.
  3. Test or spam emails – You can turn this feature off if you are testing in a Sandbox and enable it for Production to prevent spam emails from being entered.
  4. Expired Email Addresses – Disabled emails which can’t send or receive Emails.
  5. Emails that bounce back – e.g., disabled addresses that are known to bounce.
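
For comparison, here is a minimal, purely illustrative TypeScript sketch of the kind of syntax-only check the first category covers. The platform’s validation goes much further (disposable domains, bounce history, spam/test patterns), which a simple script like this cannot replicate.

```typescript
// Naive, syntax-only email check - illustrative only.
// The Smart Email Address Validation feature performs much richer,
// server-side checks that cannot be reproduced with a regex.
const BASIC_EMAIL_PATTERN = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function looksLikeEmail(address: string): boolean {
  return BASIC_EMAIL_PATTERN.test(address.trim());
}

console.log(looksLikeEmail("someone@contoso.com")); // true
console.log(looksLikeEmail("someone@@contoso"));    // false - malformed syntax
```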

Here’s Microsoft official documentation for this feature – https://learn.microsoft.com/en-us/power-apps/maker/data-platform/data-validation-email-column?WT.mc_id=DX-MVP-5003911

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Install correct .NET Framework for Dynamics 365 CRM Plugin Assembly | Plugin Registration Tool

Here’s how to solve the below error if you see it while updating/registering your Plugin Assembly in the Plugin Registration Tool.

Here’s how to go about solving the same!

Error while Updating Plugin Assembly

In case you see the above error in the Plugin Registration Tool, here’s how you can upgrade your Project in Visual Studio to compile using the correct .NET Framework version as suggested in the error –

  1. Go to Properties of the Project and look for Application tab

  2. Now, drop down on the Target Framework and look for the version asked for in the error. At times, it may not be installed on your machine (for example, if you recently changed systems), so you need to download it.

    Select “Install other frameworks…”

  3. You’ll be redirected to this page (unless there’s a change in the future) – https://dotnet.microsoft.com/en-us/download/visual-studio-sdks?cid=getdotnetsdk
    And you can find the right Framework once you look for it. In this case, I need 4.7.1 version.

  4. Save the file and install it once downloaded.

  5. Once you begin to install, it’ll go about the same way as any other installer.


    And you’ll see this will be installed quickly [within a few moments].

  6. Once done, close and restart Visual Studio and then check the Application tab in the Project’s Properties again.
    You’ll see the installed version appear there.

  7. And once you select it, it’ll ask for confirmation before applying this framework.


  8. Click Yes and compile the Assembly again. Once done, you’ll be able to register your plugin on the Plugin Registration Tool successfully.

Hope this helps!

Thank you!

Profitability Analysis on a Project Service Quote | Project Operations

So, while working on Project Operations Quotations, you see this below tab called Profitability Analysis.

And below are the fields that you see –


In case you are trying to find out what each of these fields means and how they are calculated – this post is for you!

Chargeable Calculations

Now, let’s look at how the Profitability Analysis fields are calculated –

  1. Let’s open the Quote Lines and see what we have. We have 1 Quote Line which has a budget of $50K. Hence, you saw Total Revenue show as $50K at the beginning of this post.

  2. When you start adding Quote Line Details in the Quote Line, for example some services as below –
    I’ve added a Chargeable Quote Line Detail of 40 hours of Consulting at $200/hour for a Consulting Lead role.


    This will show up on the Profitability Analysis

  3. Now, let’s look at how this is calculated.
    Total Chargeable Costs come from the Cost Price List of the Contracting Unit this Quote/Opportunity is in.
    So, 40 × $150 = $6,000.


    Here’s the Org Unit on the Quote (that came in from the Opportunity)


    And it has a Cost Price List associated with it which has Role Prices set for each Role.


    Which has a Cost Price of the Consulting Lead Role set to $150.

  4. Similarly, the Total Revenue now reflects the Equivalent Sales Price with respect to the Role in the Org Unit. The Sales Price is associated with the Quote in the Project Price List tab as shown below. Also, please be mindful of the Start Date and End Date of the Price Lists added so that Prices are picked from the right Price List.


    And the Sales Price for the Consulting Lead for the current Org Unit is set to $200.


    Hence, the calculation:
    $200 × 40 = $8,000.
  5. This then calculates the Gross Margin as a positive 25% (($8,000 - $6,000) / $8,000), meaning the Project is profitable.


Non-Chargeable Calculations

Now, let’s look at how non-chargeable calculations are considered –

  1. Let’s say there’s an Account Manager role added to the Cost Price list as shown below-

  2. And in the Quote Line, the Account Manager is considered as a Non-Chargeable Role for this Quote.

  3. Now, if there’s a Quote Line Detail added for the Account Manager Role in this Quote Line – it’ll be added to Non-Chargeable Costs. I’ve added 2 hours of effort for this Quote Line Detail entry.

  4. This will reflect the Total Non-Chargeable Cost as $85 × 2 = $170.
    And the Adjusted Gross Margin will show 22.88% instead of 25%. The Adjusted Gross Margin shows the adjustment considering the Non-Chargeable components in this Quote – see the worked calculation below.
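
To make the arithmetic explicit, here is the same calculation expressed as a small TypeScript snippet, using exactly the rates and hours from this example.

```typescript
// Worked example using the figures from this post.
const chargeableHours = 40;
const salesRate = 200;            // Consulting Lead sales price per hour
const costRate = 150;             // Consulting Lead cost price per hour
const nonChargeableHours = 2;
const nonChargeableCostRate = 85; // Account Manager cost price per hour

const totalRevenue = chargeableHours * salesRate;                          // 40 x 200 = 8000
const totalChargeableCost = chargeableHours * costRate;                    // 40 x 150 = 6000
const totalNonChargeableCost = nonChargeableHours * nonChargeableCostRate; // 2 x 85 = 170

// Gross Margin considers only chargeable costs: (8000 - 6000) / 8000 = 25%
const grossMargin = (totalRevenue - totalChargeableCost) / totalRevenue;

// Adjusted Gross Margin also subtracts non-chargeable costs:
// (8000 - 6000 - 170) / 8000 = 0.228750, i.e. ~22.88%
const adjustedGrossMargin =
  (totalRevenue - totalChargeableCost - totalNonChargeableCost) / totalRevenue;

console.log(`${(grossMargin * 100).toFixed(2)}%`);         // 25.00%
console.log(`${(adjustedGrossMargin * 100).toFixed(2)}%`); // ~22.88%
```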

Hope this helps!

Thank you!

App Access Checker in Power Platform Admin Center

Here’s how you can find out whether users have the correct access to a Model-Driven App assigned or not. This becomes the first line of troubleshooting when Users say that they are not able to access an App.

Let’s see how to use this!

In order to check a User’s access to a certain app, make sure you are a Dynamics 365 Administrator and have access to the environment in the Power Platform Admin Center (https://admin.powerplatform.microsoft.com/environments).

App Access Checker

Here’s how you can check in the Power Platform Admin Center whether a user has the correct access to the Published Apps in Power Platform.

  1. Enter Settings by first selecting the desired environment where you want to check a User’s App Access.


  2. Then, look for Users.

  3. Alternatively, if you had clicked on the Environment name and entered the Properties, you’ll also find a Users link directly.



  4. Then, you’ll see the option for ‘app access checker’.

  5. Now, you’ll see a simple textbox which asks for the User ID of the user in the org whose access you want to check.
    If you notice the URL, you can also bookmark this shortcut for your org:
    https://<orgName>.crm<regionNumber>.dynamics.com/WebResources/msdyn_AppAccessChecker.html

  6. Once you enter and check, you’ll be able to see the status of the results.

  7. And you’ll see the details per app.

Hope this helps!

Thank you!

Enable Modern Controls for Canvas Apps in Power Platform

Modern Controls have recently been announced; here’s how you can check them out.

Enable Modern Controls in Canvas Studio

By default, they are turned off; you need to explicitly enable them for your Canvas App.

  1. Modern Controls are not enabled by default as they are still in Preview as of the time of writing this post.
    Hence, you can see the Classic Insert menu as below.

  2. Open Settings in your Canvas App Studio editor

  3. Now, look for the Experimental features and check the Preview section, then turn the option on (you can also search ‘modern controls’ using the Search in this window to find it quickly).

  4. Now, you can Save the app and do a full-page refresh if required. You’ll now see Modern Controls alongside the Classic ones in your Power Apps Editor.

Now, you can play along with these and test them out!

Microsoft Documentation on Modern Controls: Overview of modern controls in canvas apps (preview)

Thank you!

Configuring Power Apps Assets for Internal Documentation and help for Makers | Power Platform

As your organization grows, more developers join and look for help with some of the best practices, or advice on how to get Apps developed the right way.

Power Apps Assets comes to the rescue for finding important touch-points and sources of information within the organization.

Configure Power Apps Assets at the Org Level

Here’s how you can configure the Power Apps Assets at the Org level first in order to make them available for Makers in the environment –

  1. You can navigate to Power Platform Admin Center (https://admin.powerplatform.microsoft.com/) and expand Resources to find Power Apps assets

  2. Here, you’ll find that you can add links to Internal Resources and Advisors for the Power Apps.
    I can add Documentation, a Teams Group and a Yammer Community for Power Apps.
    Documentation – Ideally, you can add a link to a place where Makers should be able to find links to the best practices followed at the Org.

    Teams Group – You can have a Teams channel dedicated to helping Makers for any technical queries for new Makers.

    Yammer Community – You can make an internal community of Yammer and add link here so that Makers can reach out to Community for any help.

  3. Given that you know how to get links to these resources like getting the link to a Teams Group, SharePoint Repository and link to a Yammer Community, you can add these to the Power Apps Assets and save the Settings.

  4. Now, let’s add Advisors to the Advisors section, as shown below. The process is the same as adding users to any area in Power Platform, based on their Name/Email.

  5. With this, you have set up the Power Apps Assets. Now, Makers are ready to utilize them.
    Here’s a post on how to enable Users for Power Apps Advisors: Enable yourself as a Power Apps Advisor

  6. Once a team member has made themselves available as an Advisor, they’ll show up in the list.

Utilizing Power Apps Assets

Now that we have set the Power Apps Assets in the Power Platform Admin Center, we are ready to utilize them. Given the scenario where a Maker is in Canvas Apps Studio building an App, here’s how they can utilize these resources –

  1. Example, Priyesh is logged in to Canvas Apps studio and needs help in starting the development process.

  2. Now, you’ll see a Power Virtual Agent in the Canvas Apps Studio; click on it to be able to access Power Apps assets.

  3. When you click on this button, you can enter some initial search term like Internal Resources. The bot will try to find anything matching your keywords.
    When it doesn’t find anything, you can choose not to rephrase and let the bot give you other options to explore.

  4. Now, the bot will further ask you if you want to explore anything within the organization itself or not. Choose yes.

  5. When you click on yes, you’ll see the Power Apps assets show up.
    Also, an option to connect with an Advisor will be available.

  6. When you click on the links listed as Assets, they’ll open up the areas which you entered in Power Apps assets.

  7. Further, when you click on Advisor, you’ll see which Advisors are available to help you out.
    I had enabled the CRM Admin user as an Advisor.
    So, the logged-in user can choose to seek help from CRM Admin by sending them a message, based on the preferences set by the Advisor when they made themselves available for advisory.

  8. Now, when you click on Send Message (given that the preference to reach out to the Advisor was set as Teams), a Teams message window will be opened for you to send your message to them.
    You can send them a Teams message like you would for any other user.

And that’s how Power Apps assets could help you out.

Hope this helps!

Thank you!

Create Dataverse Virtual Table from SQL in Azure | Power Platform

Now, you can create a Dataverse virtual table by referencing a SQL Table. This is an easy way to bring in schema from your SQL Table in use and make it into a Dataverse Table.

Let’s see how.

Create SQL Connection Reference

First, let’s create an SQL Connection reference in Power Platform –

  1. Go to Connections, create a new Connection.

  2. Now, you can select the authentication type as SQL Server Authentication. You can choose based on how your setup is, but if you want to quickly test this feature, you can follow this process.

  3. Then, you need to fill in the Server Name and Database Name.
    You can get the Server Name from here in the Azure Portal. You’ll find a Server name in the SQL Server details.

  4. And the Database Name can be found under the Server itself, as shown below.

  5. And fill out the information as below.

  6. Further, if you scroll, you’ll need the Server Admin name, which you’ll find on the Server itself.

  7. And fill the same in the fields below in the connection dialog box. Finally, click Create.

  8. Connection is now created.

Create Table

Here’s how you can create a virtual Table in Power Apps Maker (https://make.powerapps.com/) –

  1. Once you are in Power Apps Maker, go to Tables and you can drop down from Create Table menu.

  2. You’ll see an option to Create by connecting to an external source.

  3. At the time of writing this blog, you have 2 options – one is SQL and the other is SharePoint. We’ll select SQL for this example. Both use the connection of the account you are logged in with.
    You can select the connection we created in the steps above.

  4. In the next step, you’ll see the Tables from SQL. In this example, our table is student.

  5. When you click Next, you’ll see the columns from SQL

  6. Once everything looks good, you can then click Next. Finally, you’ll see the summary of the Dataverse virtual table to be created.

  7. And finally, the Table will read the data from SQL and display it here. The data will not be synced back to SQL. (A small read sketch follows below.)

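To illustrate the “reads like any other table” behaviour, here is a minimal sketch of reading rows from the resulting virtual table with the client-side Web API from a model-driven app script. The logical name (new_student) and column name (new_name) are assumptions for this example; use the schema names generated in your own environment.

```typescript
// Minimal sketch: read rows from the SQL-backed virtual table like any other
// Dataverse table. Logical/column names below are assumed for illustration.
declare const Xrm: any; // provided by the model-driven app runtime

async function listStudents(): Promise<void> {
  const result = await Xrm.WebApi.retrieveMultipleRecords(
    "new_student",
    "?$select=new_name&$top=10"
  );
  for (const row of result.entities) {
    console.log(row["new_name"]); // each row is served from the SQL table at read time
  }
}
```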

In case you also want to know how to convert a SharePoint list to a Dataverse table, you can refer to this post – Create Dataverse Virtual Table from SharePoint List | Power Platform

Hope this helps!

Thank you!

Create Dataverse Virtual Table from SharePoint List | Power Platform

Now, you can create a Dataverse virtual table by referencing a SharePoint List. This is an easy way to bring in schema from your SharePoint List in use and make it into a Dataverse Table.

Good news is that this also syncs back to the SharePoint List! Let’s see how.

SharePoint List

Let’s say you have a SharePoint List which you want to convert into a Dataverse Table in your Power Platform –

Let’s do this using a Virtual Table, which can now connect to SharePoint and SQL. Please note that at the time of writing this post, connecting to external data is still in Preview.

Create Table

Here’s how you can create a virtual Table in Power Apps Maker (https://make.powerapps.com) –

  1. Once you are in Power Apps Maker, go to Tables and you can drop down from Create Table menu.

  2. You’ll see an option to Create by connecting to an external source.

  3. At the time of writing this blog, you have 2 options – one is SQL and the other is SharePoint. We’ll select SharePoint for this example. Both use the connection of the account you are logged in with.

  4. Once you select SharePoint, you’ll either be asked to select one of the recently used Sites, or, if you know the URL of the Site in which your List resides, you can enter the same.
    Note below that I’m entering the URL up to the name of the sub-site in question.

  5. So, you can either select a recently used Site or just paste the URL of the Site which has your List in it.


    OR


  6. Once you are in the desired Site, you’ll see the List under that Site.

  7. Once I select the List, I’ll be asked if I need to change any schema name from the ones identified by the wizard.

  8. Or if everything looks OK, you can simply click on Next. You’ll be given a summary of what Table will be created from your SharePoint list.

  9. It takes a few moments to create this Virtual Table for you with the data from the SharePoint List.
    Once ready, it’ll appear as below with the data and you can start adding your data as well.

  10. Also, the data you add through the Virtual Table is sent back to the SharePoint List.

  11. This entity / table will be listed like any other Table in the Power Apps Maker Portal, with the type Virtual.


Hope this helps!

Thank you!

Environment Assignment settings in Power Platform Admin Center

Environment assignment, i.e. who should be able to create which types of Dynamics 365 CRM / CE / Dataverse environments, can now be easily controlled.

You can either let everyone create a certain Type of environment or only enable Admin groups to create environments in Power Platform Admin Center! Let’s see how.

Restricting Users from Creating Environments

You can now restrict which types of Environments each type of audience in your tenant is allowed to create –

  1. If you are one of the Global Admins or Power Platform Admins, you have access to the Power Platform Admin Center settings (https://admin.powerplatform.microsoft.com/tenantsettings).
    And you’ll see the below types of Environment Assignments available for you to tweak –
    Developer environment
    Production environment
    Trial environment
    Add-on capacity


  2. And in this example, let’s review and modify the settings for Production-type instances and who should be able to create these environments.
    Every type of assignment will have the below 2 options –
    Everyone
    Only specific admins (Global Admins, Power Platform Service Admins & Delegated Admins)
    Reference Link: https://learn.microsoft.com/en-gb/power-platform/admin/control-environment-creation?WT.mc_id=ppac_inproduct_settings?WT.mc_id=DX-MVP-5003911


  3. Once your preferences are set, just click on Save and the settings will be applied. Let’s see in the next section how this works.

Restriction Imposed

Here’s how a User will be restricted when they try to create the Types of environments when they are not part of the Admin Groups –

  1. Every user will still get an option to create an environment using the Create button and select a Type. Suppose a User selects a Type, in this case Production.

  2. And when they click Next, and enter further details in order to create the environment, they’ll see this error message.

  3. And to validate, you can look under Users in the Microsoft 365 Admin Center to see which Roles they have been assigned.

Hope this helps!

Exchange Online Mailbox License Error | Exception Missing Exchange License

In case you are setting up your M365 and setting up Exchange Online in the process, your Email URL is: https://outlook.office365.com/mail/

But, you see the below error which reads as

err: Microsoft.Exchange.Clients.Owa2.Server.Core.OwaUserHasNoMailboxAndNoLicenseAssignedException


No License Exception

Here’s why you see the Exception.

  1. If you open the User record in the Microsoft 365 Admin Portal, you’ll see that Microsoft Exchange Service is not seen in the list of Services.

  2. Now, let’s procure a trial for Exchange Online or any of the E3 / E5 plans which you plan to purchase based on your requirements. They have Exchange Online services available which will enable your Exchange Online mailbox.
    In this example, I’ll start a trial since I don’t want to purchase an Enterprise license just for the sake of this demonstration.


  3. Now, once I’ve started this trial (and if you see the comparison chart above, you could even opt for other licenses that offer Exchange Online – say, Office 365 E3),
    I can see that this license is available for me to assign to the user.


  4. And when I select the above license, I will see the Service available in the list as well.

  5. Save your licensing changes / preferences and let 5-10 minutes pass for the services to be applied to the User.
    And once you reload/refresh the Outlook web app, your mailbox will be available or retained (if it had expired on an existing license).

Hope this helps!

See which form is displayed in Dynamics 365 CRM | [Quick Tip]

At times, it is not clear which form is displayed when looking at a record.

It may seem like the Account form is displayed based on what the label says, but things could be different.

And you assume it is this form that is being displayed.

But this is not always the case. So, let me explain!

See Form Name

Here’s how you can see the true form name –

  1. Hover on the tab where Dynamics 365 CRM is opened.
    You’ll see the actual form that is being displayed, even if the label under the record’s Name is shown as “Account” – in this case, that’s the entity name.

  2. Here, the form name displayed is “Account for Interactive Experience”. (A small script-based check follows below.)
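
If you prefer to confirm this from script rather than the browser tab title, a form’s OnLoad handler can read the currently displayed form through the Client API form selector. Here is a minimal sketch, assuming you register onFormLoad as the form’s OnLoad handler and pass the execution context as the first parameter.

```typescript
// Minimal sketch: log the name and id of the form actually being displayed.
// Register onFormLoad as the form OnLoad handler with "Pass execution context".
function onFormLoad(executionContext: any): void {
  const formContext = executionContext.getFormContext();
  const currentForm = formContext.ui.formSelector.getCurrentItem();
  if (currentForm) {
    // e.g. "Account for Interactive Experience"
    console.log(`Displayed form: ${currentForm.getLabel()} (${currentForm.getId()})`);
  }
}
```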

When does this happen?

  1. This happens if no other forms in the App you are using have been enabled for your Security Role.
    In that case, there’s no other form left for you to see.

Fallback Form?

  1. The fallback form takes effect only if no other form is enabled for your security role. In my example, the form “Account for Interactive Experience” was in fact enabled for my Security Role and hence, I didn’t see the default “Account” form even though it was ranked higher in the Form Sequence.

Hope this helps!

PSA to PO Upgrade Errors | Resolve by reviewing Upgrade Logs

If you and your Org are preparing to upgrade from PSA v3 to Project Operations (given that you are covered in terms of the licensing you need), the first step is to try upgrading your environments to PO.

The upgrade operation needs to ensure you have clean data in your environment before you can upgrade. Hence, failing validations will cause the PSA upgrade on the selected environment to fail.

Let’s see what needs to be done to identify and rectify the failures.

Upgrade Failure

Typically, you’ll see the below failure if you are trying to upgrade from PSA to PO in the Power Platform Admin Center –

  1. You’ll see that the installation has failed.

  2. Now you can go directly into Project Service.

Error Logs in Project Service

Now, in Project Service, you can do the below –

  1. You can see the Upgrade Logs and sort the Started field by Descending to make the latest one appear on top.
    You’ll notice a Failure status entry.


  2. When you open it up, it’ll have the details. You’ll see the upgrade entry as shown below; you need to open it.

  3. Once you open this Upgrade Version record as shown above, you’ll need to sort the Steps as shown below to show all the Failure status records first.

  4. Once you open one of the failed ones, you’ll be able to see why the upgrade failed on the Upgrade Step record.


    These error messages are self-explanatory, and you should be able to take corrective action by going through them and working to resolve them, given you have functional knowledge of Dynamics 365 PSA/CRM.
  5. As you resolve the issues you find, you should eventually be able to upgrade to Project Operations successfully once all data-dependent validation issues are resolved.

Hope this helps!

Get the New Teams client for your organization | Teams Admin Center

Here’s how you can enable Users to choose to update to the new Teams if you are a Teams Administrator in M365 Admin Center

So, since you are using the current (or old) Teams version and the option to enable the new Teams is not available to you, you would see no option to update in the title bar of the Teams app itself.

Let’s see how we can enable the new Teams for Users organization-wide.

Enable Org-Wide New Teams Update

If you are the M365 Admin / Teams Administrator in your organization, here’s how you can enable the new Teams switch for the users –

  1. Look for Teams in the Admin Centers area.

  2. Once in Teams Admin Center, you can look for Teams Update Policy section.

  3. Given that you don’t want to create a new policy but want to apply this setting Org-wide, you can open the Global policy which is already present by default.
    Once you open this policy, you’ll find the Use new Teams client option; choose the Users can choose option.

  4. Once you’re sure, click on Apply. And you’ll be asked to Confirm.



  5. Once you confirm, that’s it.

New Teams

Now, here’s how you ensure you are getting the new Teams.

  1. Once you know the Admin has enabled the new Teams for your organization, sign out from Teams.


  2. Now, when you log in again, you can see the button appear for you.

  3. Once you click the switch, you’ll get the Get it now button.

  4. If you click on See the full list, here’s the Documentation for the same – https://adoption.microsoft.com/en-us/new-microsoft-teams/
  5. You’ll see the new Teams app show up.

  6. Please note that if you are also part of other organizations, your other orgs will also show up in the same new Teams client.

Hope this helps!

Perform a changeset request in Dataverse connector in Power Automate

You must’ve noticed Perform a changeset request in the Dataverse connector in Power Automate.

The purpose of this action is to perform a batch of available Dataverse connector actions so that either all of them succeed or the batch performed inside the changeset request is rolled back.

Here’s what it does.

Perform a changeset request

Here’s how to use the connector action

  1. Select Perform a changeset request in the Dataverse connector Actions list.

  2. Since this works as a batch of operations that either all succeed or are all rolled back, it behaves like a Scope, but only for Dataverse actions.

  3. You have the below Actions available to perform.

  4. Now, let’s design an example changeset batch here.
    First, I’ll create an Account and then a Contact.

  5. When this Flow runs, if the Contact creation fails, for example, the Account creation will be rolled back too – unlike when these steps are placed outside of the Perform a changeset request action.



  6. Please note that the outputs of the changeset request itself, or of the steps within it, cannot be captured or referred to in Dynamic Content. (A rough sketch of the underlying batch request follows below.)
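
For context, this action maps conceptually to an OData changeset inside the Dataverse Web API’s $batch endpoint: everything inside the changeset succeeds or fails as one transaction. Below is a rough, hand-built TypeScript sketch of such a request; the org URL, token, and payloads are placeholders, and in practice you would normally let the connector (or a client library) build this for you rather than assembling the multipart body by hand.

```typescript
// Rough sketch of an OData $batch request containing a single changeset.
// Requests inside the changeset are applied atomically: if the contact
// creation fails, the account creation is rolled back as well.
const ORG_URL = "https://yourorg.crm.dynamics.com"; // placeholder
const TOKEN = "<bearer token>";                     // placeholder

async function createAccountAndContactAtomically(): Promise<void> {
  const batchId = "batch_001";
  const changesetId = "changeset_001";

  const body = [
    `--${batchId}`,
    `Content-Type: multipart/mixed;boundary=${changesetId}`,
    "",
    `--${changesetId}`,
    "Content-Type: application/http",
    "Content-Transfer-Encoding: binary",
    "Content-ID: 1",
    "",
    `POST ${ORG_URL}/api/data/v9.2/accounts HTTP/1.1`,
    "Content-Type: application/json",
    "",
    JSON.stringify({ name: "Sample Account" }),
    `--${changesetId}`,
    "Content-Type: application/http",
    "Content-Transfer-Encoding: binary",
    "Content-ID: 2",
    "",
    `POST ${ORG_URL}/api/data/v9.2/contacts HTTP/1.1`,
    "Content-Type: application/json",
    "",
    JSON.stringify({ lastname: "Sample Contact" }),
    `--${changesetId}--`,
    `--${batchId}--`,
    "",
  ].join("\r\n");

  const response = await fetch(`${ORG_URL}/api/data/v9.2/$batch`, {
    method: "POST",
    headers: {
      "Content-Type": `multipart/mixed;boundary=${batchId}`,
      Authorization: `Bearer ${TOKEN}`,
      "OData-Version": "4.0",
    },
    body,
  });

  if (!response.ok) {
    // The whole changeset failed; neither record was created.
    throw new Error(`Changeset failed with status ${response.status}`);
  }
}
```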

Hope this helps!

Migrate Flow to latest Microsoft Dataverse connector

Perhaps you are using the legacy Dataverse connector, which has the gray logo and looks like the below.

You can run Flow checker to identify if Power Automate can help you migrate the Flow to utilize the latest connector.

Flow checker message

Here’s what the old Flow looks like –

  1. You would see the below message once Flow Checker suggests changes –

  2. Once you click on the “Open the migration assistant” link in the suggestion, it’ll pop up a window to ask if you are ready to allow migrating the Flow to the latest Dataverse connector.

  3. Once you click on Migrate, it’ll start the migration process and based on how lengthy your flow is – in a few moments your new Flow will be ready.

  4. So, once the migration is completed, you’ll see a message like this.

  5. Once you click on Open the new flow, you’ll notice that the new Flow now has (Migrated) appended to its name.

  6. And once you click on Edit, you’ll see that the Flow step where the old Dataverse connector was used has been replaced by the new connector.

Hope this helps!

Resolve Project Operation errors | PSS Error Logs

In case you are new to Project Operations or have recently upgraded from Project Service Automation to Project Operations, and if you are running into some issues, here’s how to identify them

Error

Here’s an example error –

  1. Let’s say you are trying to add Tasks to the Schedule in Project Operations, and you see the below error in a few moments.

  2. Now, if you look at the above error in red, it won’t tell you exactly what the issue is. Here’s how you find it – check the next section.

PSS Error Log

Here’s how you can get to where the error is –

  1. You can look for the issue in the below section –
    Go to Settings in the Project Operations App.

  2. Then, go to the PSS error logs and you’ll find an entry there.

  3. And once you open it up, you’ll find the issue there. There could be different issues based on what operations you are trying to perform in Project Operations.


Now, based on your knowledge of Error Resolution in Dynamics 365 CRM application, you can work towards solving your errors.

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Identify deprecated JS code in your Power Platform solution using Solution Checker

Recently, you must’ve received a notification from Microsoft saying that OData v2.0 is now deprecated. See this link: https://powerapps.microsoft.com/en-us/blog/odata-v2-0-service-removal-date-announcement/

And if you are wondering how to find what has been used, you can use Solution Checker to identify deprecated code, which should show up like below.

Let’s see how we can use Solution Checker!

Running Solution Checker

Here’s how you can run Solution Checker in Power Apps Maker Portal (https://make.powerapps.com/) and see the results –

  1. Given that your solution already contains the JS Web Resources which you want to run Solution Checker on, you can select the Solution and expand Solution Checker to click Run as shown below.

  2. It takes a few moments for Solution Checker to run on the solution. You’ll see the spinner as shown below.

  3. Once completed, you can expand the Solution checker and click on View Results.

  4. Once you click on View Results, you’ll see the list of detected anomalies in your JS Web Resources.
    Apart from the suggested JS best practices, you can see the Category Upgrade Readiness to identify the deprecated code being used in your solutions.

  5. Once you click on the Reference link in each of these results, it’ll open up a Pane on the right hand side to show what the issue is

  6. And when you click on Get the complete guide, here’s the link of the same (https://learn.microsoft.com/en-gb/power-apps/maker/data-platform/powerapps-checker/rules/web/avoid-2011-api?WT.mc_id=DX-MVP-5003911) which will give you the below details in Microsoft Learn Documentation –

  7. Now, if you look at the code that is being referenced in the Solution Checker result, you’ll see the old code (marked with a red box) being used.
    Instead, you should use the new code (marked with a green box) to carry out the newer version of the functionality – a simplified before/after sketch follows this list.

  8. See the next section in this blog to get the details on the deprecation in client side scripting for Power Platform / Dynamics 365 CRM.
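
To make the before/after concrete, here is a simplified sketch of the kind of change the Upgrade Readiness results point at: a web resource calling the deprecated 2011 OData endpoint versus the same read done with Xrm.WebApi. The entity, column, and function names are illustrative only; the flagged code in your own solution will look different.

```typescript
// Deprecated pattern: calling the 2011 OData endpoint from a web resource.
// Solution Checker flags usages like this under Upgrade Readiness.
function getAccountNameLegacy(accountId: string): void {
  const request = new XMLHttpRequest();
  request.open(
    "GET",
    `/XRMServices/2011/OrganizationData.svc/AccountSet(guid'${accountId}')?$select=Name`,
    true
  );
  request.setRequestHeader("Accept", "application/json");
  request.onload = () => console.log(JSON.parse(request.responseText));
  request.send();
}

// Recommended pattern: use the Xrm.WebApi client API instead.
declare const Xrm: any; // provided by the model-driven app runtime

async function getAccountName(accountId: string): Promise<void> {
  const account = await Xrm.WebApi.retrieveRecord("account", accountId, "?$select=name");
  console.log(account.name);
}
```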

Important Deprecations for Power Apps & Power Automate

Here are the Deprecated APIs for Power Platform / Dynamics 365 CRM Client Scripting: https://learn.microsoft.com/en-us/power-platform/important-changes-coming#some-client-apis-are-deprecated?WT.mc_id=DX-MVP-5003911

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Power Apps Developer Plan environments | Power Platform

Developers can now have Environments of their own to test and review Power Apps / Power Automate etc. Here’s how you can get yours!

Learn More about Power Apps Developer: https://learn.microsoft.com/en-us/power-apps/maker/developer-plan?WT.mc_id=DX-MVP-5003911

If you are looking to Sign Up for the Developer Plan, you can use this Link: https://powerapps.microsoft.com/en-us/developerplan/?WT.mc_id=DX-MVP-5003911

Create Developer Environment

Given that you are in the Power Platform Admin Center, you can create a new Environment like so –

  1. Go to Power Platform Admin Center and then Environments (https://admin.powerplatform.microsoft.com/environments). Click on + New to create a new Environment.



    And then you can select the Type

  2. Now, you can check what URL you want to provide and then click on Finish.

  3. Then, your environment will be initiated for creation like any other Environment. Notice the type is Developer.

  4. Once created, if you go into the Environment, you can click Edit to review the Settings for this Environment.

  5. And you can see that the Security Group cannot be added to this Environment.

User’s Environment

  1. If a User wants to create their own environment under the Power App Developer Plan (https://powerapps.microsoft.com/en-us/developerplan/?WT.mc_id=DX-MVP-5003911), they can go to the Homepage of the Power Apps Developer plan and click on Existing User? Add a dev environment >

  2. Then, you can enter your credentials and you’ll see this page.

  3. Once this is provisioned, you’ll be taken to your Environment and it’ll look like below –



  4. And in the Power Platform Admin Center, the Admins can see that the Environment has been created of Type Developer by SYSTEM.

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Weekly Digest for usage insights | Power Platform Admin Center

One of the features in Power Platform Admin Center is that now you can receive Weekly Updates in the form of a Newsletter for the Managed Environments.

Pre-Requisites

Below are the Pre-requisites for enabling Weekly Digest

  1. Tenant-Level Analytics must be enabled in your Power Platform Tenant. Here’s a post to see how you can enable Tenant-Level Analytics – Tenant-Level Analytics in Power Platform Admin Center | For Power Apps and Power Automate
  2. Only Managed Environments Updates are available – Here’s how you can learn more about Managed Environments – Enable Managed Environments in Power Platform Admin Center
  3. Only those Managed Environments which have been enabled for Weekly Digest will be considered. While enabling Managed Environments in step #2 above, ensure this checkbox is marked in order to include the Managed Environment in the Weekly Digest –

Enable Weekly Digest

Given the above Pre-requisites are met, here’s how you can enable Weekly Digest –

  1. Navigate to Tenant Settings in the Power Platform Admin Center (https://admin.powerplatform.microsoft.com/tenantsettings) – Look for Weekly digest. Notice that it is marked with a green icon which indicates that it will only be applicable for Managed Environments.

  2. On the right-hand side, you’ll see a place to enter the email addresses (separated by semicolons) of those who should receive the Weekly Digest emails.
    Only Power Platform Administrators and Dynamics 365 Administrators will receive these updates.

  3. Once you’ve entered the email addresses, you can save them and you’ll see the below message upon confirmation.

  4. That’s it!
  5. And then on a Monday, I saw this in my mailbox!

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Power Platform self-service analytics Data Export to Data Lake [Preview] | Power Platform Admin Center

Now, you can also export the Analytics data to Azure Data Lake in order to further extend it and derive rich data analytics!
At the time of writing this post, this feature is in Preview (as you’ll also see from the screenshots below).

This is a great feature: you can extract this data into Data Lake and then further enrich it and build rich Power BI reporting based on your use case.

In case you are new to understanding Azure Data Lake and pricing, you can review this – https://azure.microsoft.com/en-gb/solutions/data-lake/?WT.mc_id=DX-MVP-5003911

Data Export (Preview)

In Power Platform Admin Center (https://admin.powerplatform.microsoft.com/), here’s how you can setup Data Export –

  1. Navigate to Data Export in Power Platform Admin Center given that you have appropriate rights –


  2. Then, you’ll get to choose between Power Apps or Power Automate data to be exported to Data Lake.

  3. In this example, I’ll choose Power Automate. As I select Power Automate, you’ll see that Tenant-Level Analytics are required and hence, already considered as Yes. If not, you’ll need to Enable Tenant-Level Analytics while doing this step – Here’s another post on how to Enable and Use Tenant Level Analytics –


  4. Now, in the next section, you’ll need to choose the Subscription.


  5. Further, select the Resource Group and eventually, the Storage Account as well.


    And Storage Account is selected as well.

  6. Once everything looks good, you can click on Create.

  7. In a few moments, this will appear in the Data Lake section of the Data Export. It will take up to 24 hours for the data to first start showing in Data Lake.

  8. Once this is completed after about 24 hours, you’ll see the status of the Data Lake data package changed to connected.


Data Export to Data Lake

Let’s look at the Azure Storage Explorer to connect to our Data Lake and see the Power Platform data – In case you are looking to install Azure Storage Explorer, here’s a post – Microsoft Azure Storage Explorer | Getting Started

  1. Once authenticated to the correct environment in Azure Storage Explorer, here’s what you would see in the ADLS Gen 2 (In case you want to create ADLS Gen 2 storage account, you can review this post – Create ADLS Gen 2 Storage Account for Azure Data Lake)
    You’ll see powerplatform folder show up.

  2. If you open this folder, since we had chosen Power Automate, its folder will have been created.

  3. And let’s go into the Flows folder to see the data. You’ll find JSON files there. You can double-click one to open it, and it’ll open in whatever editor you have installed.

  4. In this case, I had VS Code, so here’s what the Flow data looks like –

  5. Likewise, you can dig deeper in this data and use this further for your reporting!

Here’s Microsoft Learn Document on the same – https://learn.microsoft.com/en-us/power-platform/admin/self-service-analytics?WT.mc_id=DX-MVP-5003911

Here’s Microsoft Learn Docs for Tenant-Level Analytics – https://learn.microsoft.com/en-gb/power-platform/admin/tenant-level-analytics#how-do-i-enable-tenant-level-analytics?WT.mc_id=DX-MVP-5003911

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Tenant-Level Analytics in Power Platform Admin Center | For Power Apps and Power Automate

As your organization and tenant usage grows, it’s difficult to keep track of Adoption. To tackle this, Tenant-Level Analytics have been introduced in Power Platform Admin Center (https://admin.powerplatform.microsoft.com/).

Let’s see how we can turn this on for your tenant!

Enable Tenant-Level Analytics

Given that you have appropriate permissions in Power Platform Admin Center, here’s how you can enable Tenant-Level Analytics –

  1. In Power Platform Admin Center, look for the Settings area – under this, you’ll see an option called Analytics.

  2. Once you select Analytics, you’ll see a simple switch on the right-hand side to turn it On. It could be Off by default.

  3. Once you turn it On and Save, you’ll also see a confirmation message.


    And it’ll show this message once applied.

  4. Now, do a complete browser refresh.


Reading Tenant-Level Analytics

Here’s how you can review tenant-level analytics once enabled for your Tenant via the Power Platform Admin Center –

  1. When Tenant-Level Analytics are disabled, you won’t find anything in the top-right corner of the Analytics area under Power Automate or Power Apps.

    Tenant-Level Analytics: OFF (for both, Power Automate and Power Apps)


    Tenant-Level Analytics: ON


  2. You can drop down on the menu and find Tenant-Level Analytics.

  3. Once you choose Tenant-level analytics, you’ll see the below report show up – this is the same for Power Automate as well as Power Apps.
    Please note that it takes 24 to 48 hours for the metrics to start showing from the previous day.


  4. And approximately 48 hours later, I can see this data now showing up.

Here’s Microsoft Learn Docs for Tenant-Level Analytics – https://learn.microsoft.com/en-gb/power-platform/admin/tenant-level-analytics#how-do-i-enable-tenant-level-analytics?WT.mc_id=DX-MVP-5003911

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Microsoft Azure Storage Explorer | Getting Started

If you are wondering how to get and set up the Microsoft Azure Storage Explorer – here’s this post!

Azure Storage Explorer

Here’s how you can download and set up Microsoft Azure Storage Explorer.

  1. Search for Azure Storage Explorer and you’ll see something like the below –

  2. Once you open the azure.microsoft.com link, you’ll see the below –
    Use the drop-down to select Windows. The setup file will then be prompted to save in your browser (or download directly, based on your browser settings).

  3. And you’ll see that it has been downloaded to your machine.

  4. Now, click on the setup file and let it start. Accept the Terms if everything looks OK to you, then click on Install.


  5. It’ll then ask you where to install and what the shortcut should be called on the system. Standard stuff.



  6. Then, installation will begin.


  7. Then, open it up when finished.

  8. Now, this will come up. You are now ready to sign in!

Sign In with Azure in Microsoft Storage Explorer

Now, picking up from the step above, here’s how you sign in –

  1. Click on Sign in with Azure (unless you are trying one of the other options in this wizard).

  2. If you sign in to the standard Azure environment, you can simply select Azure and click Next.


  3. Then, you’ll be asked to authenticate. Enter your credentials and authenticate like you would for any Microsoft Account.

  4. Once successful, you’ll see this and you can close the window.

  5. Now, if you open the App, it’ll detect your Azure Subscription if you have one.
    If it looks correct, you can simply click on Open Explorer.

  6. Once opened, you can expand on the Subscription and see all your Storage Accounts.

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Smart Buttons in Ribbon Workbench | XrmToolBox
  2. Hide options from OptionSet using JavaScript in Dynamics 365 CRM
  3. Select the item based on a key value using Filter Array in Power Automate
  4. Select values from an array using Select action in a Power Automate Flow
  5. Blocking Attachment Extensions in Dynamics 365 CRM
  6. Upgrade Dataverse for Teams Environment to Dataverse Environment
  7. Showing Sandbox or Non Production Apps in Power App mobile app
  8. Create a Power Apps Per User Plan Trial | Dataverse environment
  9. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  10. Co-presence in Power Automate | Multiple users working on a Flow
  11. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  12. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  13. Call a Flow from Canvas Power App and get back response | Power Platform
  14. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  15. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  16. Asynchronous HTTP Response from a Flow | Power Automate
  17. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  18. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Tracked Properties in Power Automate Flow Step

If you are new to Power Automate, you may be wondering what Tracked Properties are and how they work.

Tracked Properties are data properties which are hidden away from the Input/Output sections of the Flow and which you can explicitly retrieve in a Flow Run.

Here’s a post to explain the same!

Tracked Properties

Here’s what Tracked Properties are –

  1. If you look at the Settings section of different Actions, you’ll see Tracked Properties.

  2. And you’ll see Tracked Properties at the bottom once all Action specific Settings are listed.

  3. Here, you can create and store your own properties and their values. A value can also come from the results of preceding steps or from expressions.
    See the example below – and there’s a rough Peek-code sketch right after this list.

  4. Once you create these Properties, here’s how you can retrieve the same.
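
Before moving on to retrieval, here’s a minimal, illustrative sketch of what a step with Tracked Properties can look like when you use Peek code on the action. The step names (Compose_Record, Get_Record) and property names (RecordName, RecordId) are made-up examples for illustration only, and the exact JSON varies by action type –

    {
        "type": "Compose",
        "inputs": "@outputs('Get_Record')?['body']",
        "trackedProperties": {
            "RecordName": "@{outputs('Get_Record')?['body/name']}",
            "RecordId": "@{outputs('Get_Record')?['body/accountid']}"
        }
    }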

Retrieving Tracked Properties

Here’s how you can retrieve Tracked Properties –

  1. You need to use the actions() function in Power Automate expressions to read the Tracked Properties of a certain step.
    Hence, the syntax is actions('<stepname>')?['TrackedProperties'] – see the sketch after this list.

  2. You can store it in an Object variable and see the result as below

  3. Or, if you want to retrieve only a specific property, you can reference it in the expression itself.


  4. And it’ll show up like this (in a variable of whichever type stores it, e.g. Integer / String).
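
For illustration, continuing with the made-up step name Compose_Record and tracked property RecordName from the sketch earlier (replace these with your own step and property names), the expressions look like this –

    All Tracked Properties of a step, returned as an object:
    actions('Compose_Record')?['TrackedProperties']

    A single Tracked Property from that step:
    actions('Compose_Record')?['TrackedProperties']?['RecordName']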

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Smart Buttons in Ribbon Workbench | XrmToolBox
  2. Hide options from OptionSet using JavaScript in Dynamics 365 CRM
  3. Select the item based on a key value using Filter Array in Power Automate
  4. Select values from an array using Select action in a Power Automate Flow
  5. Blocking Attachment Extensions in Dynamics 365 CRM
  6. Upgrade Dataverse for Teams Environment to Dataverse Environment
  7. Showing Sandbox or Non Production Apps in Power App mobile app
  8. Create a Power Apps Per User Plan Trial | Dataverse environment
  9. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  10. Co-presence in Power Automate | Multiple users working on a Flow
  11. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  12. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  13. Call a Flow from Canvas Power App and get back response | Power Platform
  14. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  15. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  16. Asynchronous HTTP Response from a Flow | Power Automate
  17. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  18. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Smart Buttons in Ribbon Workbench | XrmToolBox

You must’ve definitely used Scott Durow’s Ribbon Workbench, which is one of the most popular tools in the XrmToolBox. Here’s what Smart Buttons can do to further extend the capabilities of the Ribbon!



Let’s see how you can install Smart Buttons if you haven’t already installed the solution in your environment, and I’ll also summarize how each of these works! Hope this post captures the bare minimum well.

Installing Smart Buttons

Here’s how you can install the Smart Buttons solution to make it appear in your Ribbon Workbench!
Link: Install Smart Buttons for Ribbon Workbench | XrmToolBox

Smart Button Posts

  1. Run Report – https://d365demystified.com/2023/01/17/run-report-using-smart-button-in-ribbon-workbench-xrmtoolbox/
  2. Run Workflow – https://d365demystified.com/2023/01/17/run-workflow-smart-button-in-ribbon-workbench-xrmtoolbox/
  3. Run Webhook – https://d365demystified.com/2023/01/17/run-webhook-smart-button-in-ribbon-workbench-xrmtoolbox/
  4. Quick JS – https://d365demystified.com/2023/01/17/run-js-snippet-using-smart-button-in-ribbon-workbench-xrmtoolbox/
  5. Open Dialog – https://d365demystified.com/2023/01/17/open-dialog-using-smart-button-in-ribbon-workbench-xrmtoolbox/

Hope this helps!

Here are some Power Automate posts you want to check out –

  1. Select the item based on a key value using Filter Array in Power Automate
  2. Select values from an array using Select action in a Power Automate Flow
  3. Blocking Attachment Extensions in Dynamics 365 CRM
  4. Upgrade Dataverse for Teams Environment to Dataverse Environment
  5. Showing Sandbox or Non Production Apps in Power App mobile app
  6. Create a Power Apps Per User Plan Trial | Dataverse environment
  7. Install On-Premise Gateway from Power Automate or Power Apps | Power Platform
  8. Co-presence in Power Automate | Multiple users working on a Flow
  9. Search Rows (preview) Action in Dataverse connector in a Flow | Power Automate
  10. Suppress Workflow Header Information while sending back HTTP Response in a Flow | Power Automate
  11. Call a Flow from Canvas Power App and get back response | Power Platform
  12. FetchXML Aggregation in a Flow using CDS (Current Environment) connector | Power Automate
  13. Parsing Outputs of a List Rows action using Parse JSON in a Flow | Common Data Service (CE) connector
  14. Asynchronous HTTP Response from a Flow | Power Automate
  15. Validate JSON Schema for HTTP Request trigger in a Flow and send Response | Power Automate
  16. Converting JSON to XML and XML to JSON in a Flow | Power Automate

Thank you!

Button to send Email based on Templates in Dynamics 365 CE

Great blog below by Vidit to demonstrate how you can use ribbon buttons in Dynamics 365 Sales to send emails based on Templates.

Power Platform Pipelines | Blog Series

Here’s a blog series to get you up to speed on Power Platform Pipelines!

Setting up and Running Power Platform Pipelines

Here is what you need to get done in order to setup Power Platform Pipelines –

  1. Setup Power Platform Pipelines
  2. Run a Power Platform Pipeline

Advanced Settings

  Scenario: Once a request for deployment is submitted.
  Blog: Pre-Export Step Required setting in Deployment Pipeline | Power Platform Pipelines

Here’s the official Microsoft documentation on Power Platform Pipelines – https://learn.microsoft.com/en-us/power-platform/alm/pipelines?WT.mc_id=DX-MVP-5003911

Hope this was useful!

Thank you!