Salesforce: Check if Record Has Been Manually Edited

We recently had a requirement in Salesforce that was a little unique. An ongoing automated process updates the Contact associated with a record, but they didn't want the automation to overwrite that field if a human had updated it. Makes sense: if someone takes the time to manually associate a Contact with a record, it's probably better information than what the automatic matching algorithm produces.

As far as I can tell, there's no way at runtime to know whether an edit came from the UI or from code, so that's out.

Here's how we made it happen. The idea is a "last updated by code" checkbox that gets set by a workflow. But for the workflow to know whether the change came from code, you need a second checkbox that is only ever set in Apex.

First, make two checkboxes on your record:
Contact_Change_Trailing_Lock__c
Contact_Last_Changed_By_Code__c

Set the latter to true by default when a new record is created.

When your code updates the record, set Contact_Change_Trailing_Lock__c = true.

Then, make two workflows.
if(and(Contact_Change_Trailing_Lock__c=false,ischanged(Contact__c)),true,false)
-> Field Update: Contact_Last_Changed_By_Code__c = false

And
if(and(ischanged(Contact_Change_Trailing_Lock__c),Contact_Change_Trailing_Lock__c=true),true,false)
-> Field Update: Contact_Last_Changed_By_Code__c = true
-> Field Update: Contact_Change_Trailing_Lock__c = false

This lets you use the Contact_Last_Changed_By_Code__c checkbox downstream in your logic, so you know not to overwrite the manual edit.
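
For reference, the code side of this ends up looking something like the sketch below. The object name (My_Object__c) and the matchedContacts map are placeholders for whatever your matching process actually works with; it's a rough sketch, not the production code.

// matchedContacts: record Id -> matched Contact Id, produced by the matching algorithm (placeholder)
Map<Id, Id> matchedContacts = new Map<Id, Id>();

List<My_Object__c> toUpdate = new List<My_Object__c>();
for (My_Object__c rec : [SELECT Id, Contact__c, Contact_Last_Changed_By_Code__c
                         FROM My_Object__c
                         WHERE Id IN :matchedContacts.keySet()]) {
    if (!rec.Contact_Last_Changed_By_Code__c) {
        continue; // a human made this association, so leave it alone
    }
    rec.Contact__c = matchedContacts.get(rec.Id);
    rec.Contact_Change_Trailing_Lock__c = true; // tells the workflow this change came from code
    toUpdate.add(rec);
}
update toUpdate;

The second workflow then sees the lock flip to true, sets Contact_Last_Changed_By_Code__c back to true, and clears the lock, ready for the next cycle.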

MavensMate: TLS 1.0 Deprecation Error (UNSUPPORTED_CLIENT)

If you use MavensMate for Sublime Text (on Mac) for your Salesforce development, you might have encountered an issue after your org was updated to Summer '16. Apparently Salesforce deprecated TLS 1.0, and MavensMate versions below 7.0 can't connect once it's turned off, hence the UNSUPPORTED_CLIENT error.

There is a lot of information out there but it’s all over the place. Also the MavensMate community has a maddening tendency to assume everyone’s an uber-developer and understands all the shortcuts and terminology.

Here’s how to nix the problem.

  1. Download and install the MavensMate.app from GitHub.
  2. In Sublime Text, edit your Package Control settings for your user. You can’t edit the global ones.
    [Screenshot: opening the Package Control user settings in Sublime Text]
  3. Add the following to your Package Control user settings:

     "install_prereleases":
     [
         "MavensMate"
     ],

     Mine looked like this when I was done (see the sketch after this list if the screenshot doesn't load).
     [Screenshot: Package Control user settings with the install_prereleases entry added]

  4. Quit and restart Sublime Text.
  5. In Sublime, use COMMAND+SHIFT+P and type "Package Control". Run "Upgrade Package".
  6. Now, in the MavensMate app you installed, go to Settings in the hamburger menu at the top left. Then add the path to your source files – in my case, /users/bhatcher/documents/source.
     [Screenshot: the MavensMate app Settings screen with the source path added]
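
For reference, the Package Control user settings file from step 3 is just JSON. A minimal file with only the new key would look like the sketch below; yours will already have other entries, and they stay as they are.

{
    "install_prereleases":
    [
        "MavensMate"
    ]
}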

 

May the force be with you. Send me a tweet – @BobHatcher – if you have any questions.

Salesforce: Who Has Permission? (And what Profiles/Permission Sets Grant It?)

“Can you run me a report to see who has edit permissions on Cases?”

It's the kind of question that makes you cringe: it sounds simple, but it's not remotely easy to answer, because the config screens are a disaster and the permission could be granted by any number of Profiles or Permission Sets.

Here’s how to find out. The following query will give you a quick list. (H/T Adam Torman)

SELECT Assignee.Name
FROM PermissionSetAssignment
WHERE PermissionSetId
IN (SELECT ParentId
FROM ObjectPermissions
WHERE SObjectType = 'My_Object__c' AND
(PermissionsCreate = true OR PermissionsEdit = true))
and Assignee.isActive = true

You can run this in the Developer Console, under the "Query Editor" tab.

Note that you have to change “My_Object__c” to your object name, and this filters by active users and Create or Edit permissions. You can run this using Workbench, dump it to Excel and deduplicate it. Problem solved.

Great. So What Profiles/Permission Sets Grant Access?

So the next question is going to be, OK, what profiles or permission sets grant that access?

This gets a little tricky because Profiles actually use Permission Sets under the hood. So when you use the above query, you are looking at Permission Set Assignments related to Users, including those assigned via a Profile. This is a little confusing. It’s driven by the isOwnedByProfile flag on PermissionSet – if the flag is true, it’s a hidden Permission Set that underpins a Profile.

So basically you need to do this twice: first, find the Profiles that include an underlying Permission Set (isOwnedByProfile = true). Then find the regular Permission Sets.

You can run the following Apex in the Anonymous Window. (If you’re not technical, don’t be scared, it’s just Your Name -> Developer Console -> Debug -> Execute Anonymous.)


List<PermissionSetAssignment> pa = [SELECT Assignee.ProfileId, PermissionSet.ProfileId, Assignee.Profile.Name
FROM PermissionSetAssignment
WHERE PermissionSetId
IN (SELECT ParentId
FROM ObjectPermissions
WHERE SObjectType = 'My_Object__c' AND
(PermissionsCreate = true OR PermissionsEdit = true))
and PermissionSet.isOwnedByProfile = true];
    
Set <Id> paIds = new Set<Id>();
for (PermissionSetAssignment thisPa : pa)
    paIds.add(thisPa.Assignee.ProfileId);

List <Profile> profiles = [Select Name, Description from Profile where Id in :paIds];
for (Profile thisProfile : profiles)
    System.debug('Profile: ' + thisProfile.Name);

This will (inelegantly) output all the Profiles that grant the permission you're looking for. (To see the output, go to the Logs tab in the Dev Console, open the topmost log, and check the box that says "Debug Only.")

Finally, you can check the regular Permission Sets that contain the privilege by using this query:

SELECT PermissionSet.Name
FROM PermissionSetAssignment
WHERE PermissionSetId
IN (SELECT ParentId
FROM ObjectPermissions
WHERE SObjectType = 'My_Object__c' AND
(PermissionsCreate = true OR PermissionsEdit = true))
and Assignee.isActive = true
and PermissionSet.isOwnedByProfile = false

Dump to Excel and dedup.

It’s an ugly solution but it should be enough to answer the question when it’s asked.

Salesforce vs Dynamics CRM: Security Model

To follow up on my popular Dynamics vs Salesforce: The War From the Trenches, I thought I’d dig a bit deeper into the security models. The models are considerably different and have their own strengths and weaknesses.

When deciding between Salesforce and Microsoft, the security model is perhaps the most important difference. When implementing CRM, who can see what is an enormous time suck and causes a lot of trouble. It’s the bad side of the 80-20 rule – this is the thing that’s 20% of your project but will consume 80% of your time.

Dynamics – Simple & Consistent, But Susceptible to Exceptions

In Dynamics, you have security roles and a hierarchy governed by something called Business Units. This means you can implement a clean org-chart security model very easily. The East Region VP sees everything in the New York and Atlanta branches, but nothing in San Francisco. And Johnny Sales Rep in New York can see only his own stuff, not the guy in the next cube's.

The model starts and ends with ownership of a given record. It’s pretty straightforward: Johnny Sales Rep can edit Opportunities he owns but only view Opportunities he doesn’t, within his level in the hierarchy or below him.

To manage this you have security roles; a user can have many of them, and they are only additive. So if Johnny Sales Rep has Role A that allows him to edit a record, and Role B that wouldn't allow him to edit it, he can edit it. Usually you end up with a nice layered approach – a base role, then a VP role that adds some capabilities, for example.

[Image: Dynamics Security Role Example]

Dynamics’ security model has recently been improved to include a managerial/organizational hierarchy that reduces the need for Business Units.

The major weakness of this model is that it is focused on what’s at and below a given user in the tree, which makes it extraordinarily hard to operate cross-functionally. Say your customer GlobalCo is based in New York and is owned by Joe Sales Rep, so it rolls up to the North America unit. But GlobalCo has a branch in Johannesburg owned by Sally SuperSales, so that branch rolls up to your EMEA unit. The fact that they are in different branches makes it very hard for Sally to see what’s going on in New York. And if you have a strategic account manager that needs to see all branches of GlobalCo? Forget it. You’re left with workflows implementing automatic sharing, which isn’t reportable or traceable in any meaningful way, or even achievable using out-of-the-box tools. Addressing this stuff will consume your project budget.

This diagram illustrates what I mean:

[Diagram: it's easy to create visibility within or below a branch of the hierarchy (the shaded area), but crossing branches (the red line) is a disaster]

The good news is that sharing is applicable to almost anything – you can share individual Dashboards, Views, Reports, etc., in addition to records.

There is a Team concept that is marginally helpful here, but seems to always have one limitation that prevents you from using it.

This model is very clear-cut and it applies everywhere. Run a report, it’s relative to what you’re allowed to see, period.

Salesforce – Complicated, But More Crossfunctional

Salesforce’s model is conceptually complex but in the end is vastly more practical.

First, you need to decide if everyone should see everything. If so, you’re done. Otherwise, it includes three main components:

  • Profiles, of which everyone has exactly one. This governs the base of what you can see. By default, everyone has permission to do whatever they want to records they own. If you want to prevent deleting Accounts, you need to take that away in the Profile.
  • Permission Sets – like a Profile but users can have more than one; you can use this to grant permission in individual cases.
  • Roles – this is a hierarchical role. Marketing Specialist rolls up to Marketing Director rolls up to Marketing VP.

On top of this you have automatic sharing. In my mind, this is the #1 advantage Salesforce has over Microsoft. You can do stuff like grant a Sales Operations team control over certain opportunities.

[Image: Autoshare to Sales Ops Example]

You could think of other examples and applications – like maybe, grant an arbitrary group of product managers read-only access to any Opportunities lost to a certain competitor over features.
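
Sharing rules handle the declarative cases; if the criteria get too exotic, you can also create the shares in Apex. Here's a minimal sketch of programmatic sharing; the Sales_Ops public group and the Stage filter are made-up examples, not anything from a real org:

// Hypothetical: give the Sales_Ops public group edit access to late-stage Opportunities.
Group salesOps = [SELECT Id FROM Group WHERE DeveloperName = 'Sales_Ops' AND Type = 'Regular' LIMIT 1];

List<OpportunityShare> shares = new List<OpportunityShare>();
for (Opportunity opp : [SELECT Id FROM Opportunity WHERE StageName = 'Negotiation/Review']) {
    shares.add(new OpportunityShare(
        OpportunityId = opp.Id,
        UserOrGroupId = salesOps.Id,
        OpportunityAccessLevel = 'Edit' // RowCause defaults to Manual for standard objects
    ));
}
insert shares;

The share rows are queryable afterward (OpportunityShare is just another object), which already makes this more traceable than the workflow-share workaround on the Dynamics side.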

But this is messy as well. You have to know and keep track of:

  • When there’s a conflict between a Profile and a Permission Set, the highest permission wins.
  • When there’s a conflict between Sharing and a Profile, the lowest permission wins.
  • And so on.

There are no consistent, global rules, like how Microsoft's security roles are always additive. For example, you could create a Dashboard that shows company-wide data and publish it to the whole company. You may or may not want the ability to do this. Or, Dashboards are only editable by the person who created them.

Also, SFDC formula fields, workflow and custom (Apex) code ignore field-level security. Microsoft always runs in the context of a given user and applies their security settings.

Like I said in my first blog – everything in Salesforce has an exception.

Feedback? Leave a comment or find me on Twitter: @BobHatcher.

Dynamics vs Salesforce: The War From the Trenches

I’ve been pretty deep in Salesforce.com (SFDC) the last few weeks. There are a lot of articles out there that compare the two solutions at a CIO level, so I thought I’d take a moment to outline some of the more nitty-gritty pros and cons.

Here are some of the key differences in my mind.

Where Salesforce Wins

  • Visualforce makes for a much more customizable interface. You're not limited to dropping fields on a form – you can make a custom webpage part of your process, and add custom buttons too. So what you see on the screen is not limited to drag-and-drop fields.
  • Salesforce’s implementation of Apex unit testing is outstanding. You can create test classes and you must run them before you can promote code into an instance. It’s brilliant and makes you a lot more confident deploying code.
  • Salesforce has a lot of native functionality that Microsoft still hasn't gotten to. Lead assignment rules, approval processes, autonumbering, rich text areas, dependent picklists, and multiselect picklists come to mind. Formula fields in SFDC make life a lot easier too.
  • Login as. OMG, login as. Microsoft, if you can hear me, you must allow admins to log in as other users and see what they're seeing.
  • Salesforce’s autosharing rules by role are very helpful. It is very hard and time-consuming to share across the organizational hierarchy in Dynamics. If you have crossfunctional roles like strategic account managers, this is huge. See my follow-up post that addresses this in detail.

Where Microsoft Wins

  • Salesforce has no equivalent to Advanced Find and how I miss it so.
  • Dynamics' workflow concept is much more robust, and the equivalent things in Salesforce are kind of scattershot. SFDC's workflow rules only do a small set of things. For example, if you want to copy a Lookup value from a parent record onto a child record, it's a 60-second workflow in Dynamics. In Salesforce, it's code. A lot of code (a sketch of what you end up writing follows this list). (And I needed to copy the value because Salesforce email templates don't let you access related records as recipients.)
    • Edit: Salesforce has a tool called Process Builder, which is a visual workflow tool that on the surface is like Microsoft’s. However, it’s been out for a while now and has very low adoption. As I understand it, this is due to weak “bulkification” – the need in Salesforce to do everything in a batch process (i.e., update 100 records with one statement rather than execute 100 statements). Bulkification is a persistent concern with everything in Salesforce. So the processes tend to fail when performing high volume processing such as imports.
  • I miss Microsoft’s idiot-proof data import function.
  • It feels like everything in Salesforce is an exception. Want to require a field at the database level? No problem.. unless it’s a picklist. Want to map a field to a child record? No problem.. unless it’s a Lookup. Role hierarchy can be disabled but only for custom objects/entities. Dynamics is much more of a UI on top of a relational database, which means it’s a lot less restrictive.
  • Microsoft has a more complete vision for first-in, first-out queues. There is an intersection entity that enables users to view and claim work. Salesforce’s Queue is essentially a Dynamics Team – basically the record is owned by more than one person.
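
For what it's worth, here is roughly what that parent-to-child Lookup copy from the workflow bullet above looks like in Apex. It's a minimal sketch with made-up object and field names (Child__c, a Parent__c lookup, and a Region__c lookup being copied down), not production code:

trigger CopyParentRegion on Child__c (before insert, before update) {
    // Collect the parent Ids so the query runs once for the whole batch (bulkification, again).
    Set<Id> parentIds = new Set<Id>();
    for (Child__c child : Trigger.new) {
        if (child.Parent__c != null) {
            parentIds.add(child.Parent__c);
        }
    }

    Map<Id, Parent__c> parents = new Map<Id, Parent__c>(
        [SELECT Id, Region__c FROM Parent__c WHERE Id IN :parentIds]);

    // Copy the Lookup value from the parent onto each child.
    for (Child__c child : Trigger.new) {
        if (child.Parent__c != null && parents.containsKey(child.Parent__c)) {
            child.Region__c = parents.get(child.Parent__c).Region__c;
        }
    }
}

In Dynamics, the equivalent is a couple of clicks in the workflow designer.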

Other Thoughts

  • Settings/Config search in Salesforce is great, but it's hard to switch between two things, like the config page for an object (entity) and a child object (entity). Microsoft's tree structure makes it harder to find things but easier to toggle between them.
  • Microsoft’s picklist implementation is far superior, using a database value and a text label that’s swapped out at runtime. Salesforce picklist values are stored as text, and the only way to refer to picklist values in code is by their labels, so you basically can’t change picklist values once they’re set (although there is a global replace function for picklist values, but that won’t help your code.)
  • Salesforce's config screens are maddening sometimes: lists are often not sortable or searchable.
  • Microsoft’s browser compatibility is not great. I’ve tried it on Windows with Chrome, IE, Edge, and Firefox and the one that works best is… Firefox? And on Mac only Safari is supported.. poorly.
  • Microsoft’s cascading rules are more flexible than Salesforce’s Parent-Child and Lookup relationships. Salesforce doesn’t natively support N:N either.
  • Salesforce has a lot of development tools, and so far I’ve tried Sublime/MavensMate, Cloud9, and the native Developer Console. None of them are simultaneously reliable and a good development experience. Visual Studio isn’t great either, and it’s heavy and expensive, but it’s reliable, and you can actually set breakpoints and look into objects at runtime. And you don’t get [expletive]blocked by multitenant hiccups like “Admin Operation Already in Progress.”
  • Salesforce’s Custom Settings make it much easier to manage things like the email address your custom code sends to. In Microsoft you can do it with a custom entity, but why should you have to?
  • I’ll say this again: how Microsoft doesn’t have multi-select and dependent picklists by now is beyond me.
  • Support for client side scripting on Salesforce screens (layouts) is poor.

I’ll post again when I have more to ramble about. Feel free to leave a comment to clarify or correct me.

Getting Started with Dynamics Marketing – Some Lessons Learned

The connector from CRM to Dynamics Marketing (MDM) can be tricky. The definitive guide for setting it up is Jon Birnbaum’s blog found here.

Since comments on his blog are disabled, here are some things to keep in mind as you go through it:

  • You do not need to bother with the Azure Service Bus pieces unless you want to use the SDK. The writeup goes into a lot of detail about setting up your own queues but managed queues will suffice for simple use cases. Even if you need your own queues, I recommend you start with the managed ones just to keep it simple.
    • Note: If you do set up your own queues, you do have to do it via PowerShell. I tried the UI; it doesn't work.
  • The CRM integration user you create must have the MDM user role, even if it is an admin.
  • The install files for the CRM package are in the MDM installer. You have to download the on-premise connector and install it, even if you are not connecting to on-premise, just to get the ZIP file.
  • The initial sync can take a long time, even in a demo environment without a lot of data.

Another thing that’s important, and buried in a paragraph, is that you need to add the “Sync Enabled” field to the CRM form and toggle it to “yes” for any records you want to go over to MDM.

How-To: Overcome Rollup Field Limitations with Rolling Batch Processing.. Even In the Cloud

Rollup fields are great. But their filters are limited. The most common use case I can imagine is something like year to date sales. But you can’t do that with rollup fields because the filters don’t have relative dates. Want quarter-to-date sales? Sorry folks.

Here’s how to make it happen. This works.. even in the cloud. It should also work in 2011 although I have not tested it there.

In this scenario, we are using a Connection between a Product and an Account to hold quarter-to-date sales numbers. I want to regularly calculate the sum of Invoices against the Account for the given Product and save it on the Connection. I'm calling the Connection my "target entity."

Overcoming the Depth Property and Timeouts

Normally, you could create a recursive workflow that processes something, then calls itself. But you run into a problem with the Depth property; CRM will stop the process after so many iterations and consider it a runaway. So if your workflow calls itself, the 2nd time it runs the Depth property will be 2. After so many times, if Depth = x (I think 15 by default), CRM will kill the process.

The secret comes from Gonzalo Ruiz: the Depth property is cleared after 60 minutes.

The other issue is CRM’s timeouts; you need to make sure the update process doesn’t croak because it’s chugging through too many records.

So we’re going to chunk up our data into 1000 batches and run each batch asynchronously every 61 minutes. A lot of processes doing a little bit of work each. I don’t recommend this with synchronous processing.

The Approach

Here’s the process we’re going to create.

  1. Upon create, assign your target entity a random batch number. I’m using 1000 batches.
  2. An instance of a custom entity contains a batch number (1000 batch controller records for 1000 batches). A workflow fires on this custom entity record every 61 minutes.
  3. The workflow contains a custom workflow activity that updates all target records in its batch with a random number in a “trigger” field.
  4. A plugin against your target entity will listen for the trigger and fire the recalc.

[Diagram: overview of the rolling batch recalculation process]

First, create a custom entity. I’ve called mine rh_rollingcalculationtrigger. All you need on it is one integer field.

Now, on your Connection (or whatever entity you want to store the rolling calculated fields on), create two fields: rh_systemfieldrecalculationtrigger and rh_systemfieldcalculationbatch (the names here match what the code below uses).

Now create a simple plugin to set the batch number to a value between 0 and 999 when the record is created. If you have existing records, you can export them to Excel and reimport them with a random batch assignment – the Excel formula randbetween() is great for this.

protected void ExecutePostConnectionCreate(LocalPluginContext localContext)
{
	if (localContext == null)
	{
		throw new ArgumentNullException("localContext");
	}
	Random rand = new Random();
	IPluginExecutionContext context = localContext.PluginExecutionContext;
	Entity postImageEntity = (context.PostEntityImages != null && context.PostEntityImages.Contains(this.postImageAlias)) ? context.PostEntityImages[this.postImageAlias] : null;
	ITracingService trace = localContext.TracingService;
	IOrganizationService service = localContext.OrganizationService;

	// Super-simple update: stamp the new Connection with its batch number.
	Entity newConnection = new Entity("connection");
	// rand.Next(0, 1000) returns a value from 0 to 999 inclusive.
	newConnection["rh_systemfieldcalculationbatch"] = rand.Next(0, 1000);
	newConnection.Id = postImageEntity.Id;
	service.Update(newConnection);
}

(Side note: In C#, Random.Next() returns a value exclusive of the upper bound. So each record will get a value 0-999 inclusive.)

Now, we create a custom workflow activity. It takes the batch number as an input argument and calls a method named FireBatch. So when this workflow runs on an rh_rollingcalculationtrigger record with Batch ID = 5, it will call FireBatch against all target records with batch ID = 5.

In my case, I like to assemble the service, context and tracing service in the workflow activity and then call my own class.

public sealed class WorkflowActivities : CodeActivity
    {
        /// <summary>
        /// Executes the workflow activity.
        /// </summary>
        /// <param name="executionContext">The execution context.</param>
        protected override void Execute(CodeActivityContext executionContext)
        {
            // Create the tracing service
            ITracingService tracingService = executionContext.GetExtension<ITracingService>();

            if (tracingService == null)
            {
                throw new InvalidPluginExecutionException("Failed to retrieve tracing service.");
            }

            tracingService.Trace("Entered Class1.Execute(), Activity Instance Id: {0}, Workflow Instance Id: {1}",
                executionContext.ActivityInstanceId,
                executionContext.WorkflowInstanceId);

            // Create the context
            IWorkflowContext context = executionContext.GetExtension<IWorkflowContext>();

            if (context == null)
            {
                throw new InvalidPluginExecutionException("Failed to retrieve workflow context.");
            }

            tracingService.Trace("Class1.Execute(), Correlation Id: {0}, Initiating User: {1}",
                context.CorrelationId,
                context.InitiatingUserId);

            IOrganizationServiceFactory serviceFactory = executionContext.GetExtension<IOrganizationServiceFactory>();
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            try
            {
                int batchId = this.BatchNumber.Get(executionContext);
                // The ThisEntity input (the batch controller record) is available via
                // this.ThisEntity.Get(executionContext) if your calculation needs it.
                RollupCalculations.FireBatch(service, tracingService, batchId);
            }
            catch (FaultException<OrganizationServiceFault> e)
            {
                tracingService.Trace("Exception: {0}", e.ToString());

                // Handle the exception.
                throw;
            }

            tracingService.Trace("Exiting Class1.Execute(), Correlation Id: {0}", context.CorrelationId);
        }
        [Input("Batch Number")]
        public InArgument<int> BatchNumber { get; set; }
        [Input("This Config Entity")]
        [ReferenceTarget("rh_rollingcalculationtrigger")]
        public InArgument<EntityReference> ThisEntity { get; set; }
    }

FireBatch: query all records with batchID = x and update them with a random number.

// In this case, the processingSequenceNumber is going to be the value from the CRM batch controller entity.
public static void FireBatch(IOrganizationService service, ITracingService trace, int processingSequenceNumber)
        {
            // First, get all Connections with that batch ID.
            String fetch = @"
            <fetch version='1.0' output-format='xml-platform' mapping='logical' distinct='false'>
              <entity name='connection'>
                <attribute name='connectionid' />
                <order attribute='description' descending='false' />
                <filter type='and'>
                  <condition attribute='rh_systemfieldcalculationbatch' operator='eq' value='" + processingSequenceNumber + @"' />
                </filter>
              </entity>
            </fetch>";

            // Now do a bulk update of them. 
            EntityCollection result = service.RetrieveMultiple(new FetchExpression(fetch));
            trace.Trace("Processing " + result.Entities.Count + " records on batch " + processingSequenceNumber);

            ExecuteMultipleRequest multipleRequest = new ExecuteMultipleRequest()
            {
                Settings = new ExecuteMultipleSettings()
                {
                    ContinueOnError = false,
                    ReturnResponses = false
                },
                Requests = new OrganizationRequestCollection()
            };

            Random rand = new Random();
            
            int testLimit = 0;
            if (result != null && result.Entities.Count > 0)
            {
                // In this section _entity is the returned one
                foreach (Entity _entity in result.Entities)
                {
                    Guid thisGUID = ((Guid)_entity.Attributes["connectionid"]);
                    var newConnection = new Entity("connection");
                    newConnection.Id = thisGUID;
                    // Note here that we're just dropping a random number in the field. We don't care what the number is, since all it's doing is triggering the subsequent plugin.
                    newConnection["rh_systemfieldrecalculationtrigger"] = rand.Next(-2147483647, 2147483647);

                    UpdateRequest updateRequest = new UpdateRequest { Target = newConnection };
                    multipleRequest.Requests.Add(updateRequest);
                    //trace.Trace("Completed record #" + testLimit);
                    testLimit++;
                }
                service.Execute(multipleRequest);

            }
        }

Warning: use ExecuteMultiple to avoid timeouts.

Now, create a plugin against the Connection to do whatever it is you want. It should fire on change of the rh_systemfieldrecalculationtrigger field.

        protected void ExecutePostConnectionUpdate(LocalPluginContext localContext)
        {
            if (localContext == null)
            {
                throw new ArgumentNullException("localContext");
            }

            IPluginExecutionContext context = localContext.PluginExecutionContext;
            Entity postImageEntity = (context.PostEntityImages != null && context.PostEntityImages.Contains(this.postImageAlias)) ? context.PostEntityImages[this.postImageAlias] : null;
            ITracingService trace = localContext.TracingService;
            IOrganizationService service = localContext.OrganizationService;

            // In this method I do the logic I want to do against the specific record.
            RollupCalculations.SalesToDate(service, trace, postImageEntity);
        }

The final piece is to, well, get it rolling. First, create 1000 instances of rh_rollingcalculationtrigger, setting Batch IDs 0 through 999.

Remember, we can create a recursive workflow with the 60 minute workaround. I’m setting it to 61 just to be safe.

[Screenshot: the recursive workflow with a 61-minute wait condition]

Manually fire the workflow once on each of your 1000 recalculation entities. Congratulations, you have perpetual recalculation.

I recommend setting up bulk deletion jobs to remove the system job records this creates. It can be a lot.

Outlook Plugin: Server-Side Sync vs CRM For Outlook

In Microsoft Dynamics CRM 2015 Online with Exchange Online, you have the option for the Outlook client to work over “Server-Side Synchronization” or “CRM for Outlook.”

Which one should you choose?

Server-Side Sync uses server-to-server communication, while CRM for Outlook basically runs everything through the user's desktop computer. Things to keep in mind:

  1. If you use the Outlook version, and a tracked event occurs such as sending an email from the web client, it won’t be processed unless, or until, Outlook is open and connected on the user’s desktop.
  2. If you use Server-Side Sync, tracking activities in Outlook can take some time. You can adjust the sync interval in Settings -> Administration -> System Settings -> Outlook to as little as one minute. This may not be a problem, but if you are planning to use the "Convert To" or "Add Connection" functions in any meaningful fashion, you're going to have to wait for the sync to occur before you can use those buttons. They even take away the Synchronize button! Your pilot users will undoubtedly yell "why is Convert To greyed out?!?" or "why is Add Connection greyed out??"
  3. Server-Side Sync, with CRM Online 2015 Update 1, can synchronize with Exchange based on e-mail folders.

Microsoft also says Server-Side Sync can improve Outlook performance, which makes sense, but I haven’t seen such a huge difference.

Fret Not: You can have the best of both worlds.

I've found that if you set outbound e-mail to process via Server-Side Sync, and inbound email and other activities via the Outlook client, items you track in Outlook go up to the server instantaneously, enabling the Convert To and Add Connection buttons. But you can still send email from the mobile or web apps even if your laptop is closed in your bag.

You can set this globally in Settings -> Email Configuration -> Email Configuration Settings.

[Screenshot: Email Configuration Settings in Microsoft Dynamics CRM]

Outlook Plugin with CRM Online & Office 365: Tracked Items Aren’t Tracked

We recently ran across an issue when setting up CRM Online 2015 Update 1 with Office 365 Exchange Online. A user installed the Outlook plugin and could click the Track and Set Regarding functions – even select which record it should be Regarding – but the record never made it to CRM. The original symptom was that the "Convert To" function was grayed out no matter what we did.

The issue is that in some cases, the Email Server Profile defaults to a value that doesn’t work. Both Incoming Server Location and Outgoing Server Location need to be set to

https://outlook.office365.com/EWS/Exchange.asmx

But how? The field disappears almost immediately, and you can’t create a new form against this entity, nor can you create workflows.

So I forced the field to be editable using the F12 developer tools. For example, incomingserverlocation can be edited by opening the F12 inspector, finding the field's input element, and removing the part that says disabled="disabled". This opens the field up for editing.

[Screenshot: F12 inspector showing the disabled attribute on the incomingserverlocation field]

You need to do this for the three fields.

  • Autodiscover Server Location = No
  • Incoming Server Location = https://outlook.office365.com/EWS/Exchange.asmx
  • Outgoing Server Location = https://outlook.office365.com/EWS/Exchange.asmx

[Screenshot: the Email Server Profile record with both server locations set to the Exchange Online URL]

Then associate your user(s) with the profile. You then need to approve them and click the Test button.

But mine came back with this error:

Email cannot be sent because the email address of the mailbox John Smith requires an approval by an Office 365 administrator. The mailbox has been disabled for sending email and the owner of the email server profile Test 2 has been notified.

Make sure the person doing the approving and testing is both an Office 365 admin and a CRM admin.

ClickDimensions: What a bug, what a bug..

We use ClickDimensions a lot for marketing automation. It’s a pretty good tool. But today, working with their support, I came across this doozie of a bug.

I had a form that was working fine unless an option set for country was on it.

Here’s what we found – when you create a web form and map it to an option set, it will replace the substring “options” in the field name with “select”. My option set was called “new_countryoptionset” and ClickDimensions failed because internally it thought the field was called “new_countryselectet”. I recreated the field and called it “new_countrypicklist” and it worked fine.

I don’t know whether to laugh or cry. But if you are having issues with option sets in ClickDimensions keep this in mind.