The Silver Lining

A developer's view of Cloud Computing platforms & technologies.

Salesforce: Sharing Cheat Sheet

with one comment

Sharing is caring.

Sharing is complex, but necessarily so: it gives you incredibly fine-grained control over data access through its flexibility, but that flexibility requires quite a deep understanding to use properly.

There are great articles out there that describe sharing in detail e.g.

Force.com object and record level security

An Overview of Force.com Security

I don’t want to recreate what’s in those articles, instead I’m providing a short, sharp cheat sheet of the major topics you need to understand. So without further ado…

Sharing Cheat Sheet

Sharing Metadata Records

  • “ObjectNameShare” for standard objects (e.g. AccountShare)
  • “ObjectName__Share” for custom objects (e.g. Job__Share)
  • Fields: access level, record ID, user or group ID
  • Share records are not created for OWDs, role hierarchies or the “View All” or “Modify All” permissions

Implicit Sharing

  • For Accounts, Contacts, Cases and Opportunities only.
  • A platform feature, cannot be disabled.
  • Access to a parent account—If you have access to a child contact, case or opportunity record of an account, you have implicit Read Only access on that account.
  • Access to child entities—If you have access to a parent account, you may have access to the associated contact, case or opportunity child entities. Access is configured per child object when creating a new role.

Organisation-Wide Defaults (OWD)

  • For standard objects, sharing access through hierarchies is always on and cannot be disabled (it can only be disabled for custom objects)
  • Public (Read or R/W) can be seen by all users (including portal)
  • Can’t be changed for contacts if person accounts are enabled

No Relationship

  • All sharing options are available for the object

Master Detail

  • Child objects have their sharing access level and ownership dictated by their parent, and this also applies to any grandchildren. The parent’s value for “Grant access through hierarchies” is also inherited.
  • Child objects don’t have share records of their own and are shared along with the master record.
  • In fact you cannot even define sharing rules from the object detail page.

Lookup

  • Child objects can have their own sharing access level and ownership. Sharing access through hierarchies can also be disabled.

Manual Sharing

  • Removed when owner changes
  • Removed when access via OWD becomes at least as permissive as the share
  • Private Contacts (those without an Account) cannot be shared manually

Apex Managed Sharing

  • Can be used for Manual Sharing although it isn’t called Apex Managed Sharing in this context
  • Using Apex to share Standard Objects is always considered Manual Sharing i.e. Apex Managed Sharing is only really a feature for Custom Objects
  • Maintained across ownership changes
  • Requires the “Modify All Data” permission
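For custom objects the share record is just another sObject, so Apex managed sharing boils down to inserting rows into the object’s __Share companion. A minimal sketch, assuming a hypothetical custom object Job__c (and therefore a Job__Share object) with made-up record and user IDs:

```apex
// Sketch only: share one Job__c record with one user.
Job__Share jobShare = new Job__Share();
jobShare.ParentId      = jobId;   // the Job__c record being shared (hypothetical ID)
jobShare.UserOrGroupId = userId;  // the user or group to share with (hypothetical ID)
jobShare.AccessLevel   = 'Read';  // 'Read' or 'Edit'
// 'Manual' rows are removed on owner change; a custom Apex sharing
// reason (defined on the object) is maintained across ownership changes.
jobShare.RowCause = Schema.Job__Share.RowCause.Manual;
Database.SaveResult sr = Database.insert(jobShare, false);
```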

Recalculation

  • Need to create a class that implements the Database.Batchable interface
  • The recalculation is run when the OWD for the object changes
  • The OWD for the object in question must not be the most permissive access level

Choosing the Right Share Type

“Traditional” / Ownership-based Sharing Rules

  • You want to share the records that a user, group, queue or role own with another user, group or role (includes portal users with roles).

Criteria-based Sharing Rules

  • You want to share records based on values of a specific field or fields with another user, group or role (includes portal users with roles).

Apex Managed Sharing Rules

  • Your sharing requirements are batshit cray-cray. Examples include:
    • Sharing multiple records at once
    • Sharing records on object A based on criteria being met on object B
    • Criteria-based sharing using a field not supported by “Criteria-based Sharing”

Manual Sharing Rules

  • The record owner, or someone with modify all permission, wants to share an individual record with another user, group or role (includes portal users with roles)

Share Groups

  • You want to share records owned by HVP users with internal users, groups or roles (includes portal users with roles)

Sharing Sets

  • You want to “share” records with HVP users. These records need to fulfill the following criteria:
    • The object has an organization-wide sharing setting different from Public Read/Write
    • The object is available for the Customer Portal
    • Custom objects must have a lookup field to account or contact

Portals

High Volume Portals (Service Cloud Portals)

  • Include High Volume Customer Portal and Authenticated Website profiles
  • They have no roles and can’t participate in “regular” sharing rules
  • You can share their data with internal users through Share Groups
  • You can share object records where the object is a child record of the HVP user’s contact or account. This is done with Sharing Sets.
  • They can also access records that are available for the portal, and either:
    • the object’s OWD is Public Read Only or Public Read/Write, or
    • the object’s OWD is Private and they own the record
  • They can access a record if they have access to that record’s parent and the OWD is set to “Controlled by parent”
  • Cases cannot be transferred from non-HVP to HVP users
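The access rule in the bullets above can be sketched as a boolean expression; the JavaScript below is purely illustrative (made-up names, and it ignores share groups and sharing sets):

```javascript
// Illustrative only: can a high-volume portal user see this record?
function hvpCanAccess(record, user, owd, availableForPortal) {
  if (!availableForPortal) return false; // must be available for the portal
  if (owd === 'Public Read Only' || owd === 'Public Read/Write') return true;
  if (owd === 'Private' && record.ownerId === user.id) return true; // they own it
  return false;
}

console.log(hvpCanAccess({ ownerId: 'u1' }, { id: 'u1' }, 'Private', true)); // true
console.log(hvpCanAccess({ ownerId: 'u2' }, { id: 'u1' }, 'Private', true)); // false
```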

Other Portals

  • Have a role hierarchy at most 3 levels deep and can participate in regular sharing
    • Person accounts only have a single role
    • Business accounts can have 1 – 3 roles.

Large Data Volumes

  • Defer sharing rule and group membership calculation during large data loads and modifications (the deferral feature is enabled by logging a case with salesforce.com)

If you’ve got any other items you think should be in this list, let me know in the comments. Peas oat.

Written by Wes

February 20, 2013 at 12:33 pm

Salesforce: Insufficient privileges when trying to access Activity Settings

with one comment

This strange issue blocked access to certain areas of the setup menu in my production Org, and I couldn’t find a comprehensive solution so here we are.

The problem is documented most comprehensively here with problem statement as:

If you choose to show a custom logo in meeting requests, if the admin who specifies the logo specifies a document that other admins cannot access, then other admins will be locked out of the entire activity settings page.

If the file was created in the last six months you can find out which fart-face did this and have a quick chat with them. However, if the change was made more than 6 months ago you’re in a bit of a sticky situation.

The advice of the aforementioned document is to contact salesforce.com support and ask them to let you know who owns the file. However, you can do this yourself using Workbench.

First log in and then click Workbench > Settings and make sure that “Allows SOQL Parent Relationship Queries” is selected. Then click on Queries > SOQL Query.

SELECT Name, ContentType, Description, Folder.Name, Author.Name FROM Document WHERE FolderId IN ('USER_ID1', 'USER_ID2', 'etc.')

This query will fetch all the Document records in the relevant users’ private folders (a user’s private folder uses that user’s ID as its folder ID). You’re looking for a ContentType that is an image, and hopefully the document name or description will help you narrow the culprits down further. The last step is to email all those people (or get login access) and have them check their Documents!

Good luck.

Written by Wes

February 4, 2013 at 6:09 pm

Salesforce JavaScript Remoting: Using Apex and JavaScript objects to pass data from client- to server-side and vice versa

with 13 comments

I’ve spoken about how to do this at a high-level during Cloudstock London and there are hints at how it can be done but no formal documentation that I’ve found, so here we are :)

Quite simply JavaScript Remoting will transform Apex objects and classes (or collections of these types) into JavaScript objects for you. The opposite is true too but there are some rules you need to observe.

Apex Types to JavaScript Equivalents

This is the easier of the two conversions in that you don’t really have to do anything to make it happen. The code below uses a custom class that I’ve defined, but you can do the same with any sObject too. Let’s have a look at the code.

The Controller

public with sharing class RemotingObjectsController {

    /* The remoting method simply instantiates two custom types, puts
       them into a list and then returns them. */
    @RemoteAction
    public static List<CustomClass> getClassInstances(){
        List<CustomClass> classes = new List<CustomClass>();

        CustomClass me = new CustomClass('Wes');
        CustomClass you = new CustomClass('Champ');

        classes.add(me);
        classes.add(you);

        return classes;
    }

    /* My custom type */
    public class CustomClass{
        public String firstName{get;set;}

        CustomClass(String firstName){
            this.firstName = firstName;
        }
    }
}

The Visualforce

<apex:page controller="RemotingObjectsController">
  <script>
      // Will hold our converted Apex data structures
      var classInstances;

      Visualforce.remoting.Manager.invokeAction(
        '{!$RemoteAction.RemotingObjectsController.getClassInstances}',
        function(result, event) {
          // Put the results into a var for pedantry's sake
          classInstances = result;

          console.log(classInstances);

          // Assign the first element of the array to a local var
          var me = classInstances[0];

          // And now we can use the var in the "normal" JS way
          var myName = me.firstName;
          console.log(myName);
        });
  </script>
</apex:page>

The Output

Console output from the JS code.

JavaScript Types to Apex Equivalents

This is a little trickier, especially when it comes to sObjects. Note that the approach below works for both classes and sObjects.

The Visualforce Page

<apex:page controller="RemotingObjectsController">
  <script>
      /* Define a JavaScript Object that looks like an Account */
      /* If you were using custom objects the name must include the "__c" */
      function Account(){
          /* Note the field names are case-sensitive! */
          this.Id = null; /* set a value here if you need to update or delete */
          this.Name = null;
          this.Active__c = null; /* the field names must match the API names */
      }

      var acc1 = new Account();
      acc1.Name = 'Tquila';
      acc1.Active__c = 'Yes';

      var acc2 = new Account();
      acc2.Name = 'Apple';
      acc2.Active__c = 'Yes';

      var accounts = new Array(acc1, acc2);

      Visualforce.remoting.Manager.invokeAction(
        '{!$RemoteAction.RemotingObjectsController.insertAccounts}',
        accounts,
        function(result, event) {
          console.log(result);
        });
  </script>
</apex:page>

The Controller

There’s not much to the controller in this case.

public with sharing class RemotingObjectsController {

    @RemoteAction
    public static void insertAccounts(List<Account> accounts){
        insert accounts;
    }

}

Why is this cool?

Good question. If the Force.com platform didn’t do this for you then we – the developers – would need to convert our types explicitly on both the server-side and the client-side, and man-oh-man is that boring, error-prone work. Yet again the guys at salesforce.com have built in a convenience that saves us time and lets us get on with the work of building cool apps.
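To make that concrete, here’s roughly the kind of hand-rolled mapping code remoting spares you from writing (the function and field names here are made up for illustration):

```javascript
// Hand-rolled mapping that JavaScript Remoting makes unnecessary:
// turn raw server-side records into typed client-side objects...
function toClientAccount(raw) {
  return { id: raw.Id, name: raw.Name, active: raw.Active__c === 'Yes' };
}

// ...and back again before sending them to the server.
function toServerAccount(acc) {
  return { Id: acc.id, Name: acc.name, Active__c: acc.active ? 'Yes' : 'No' };
}

var raw = { Id: '001xx0000000001', Name: 'Tquila', Active__c: 'Yes' };
var client = toClientAccount(raw);
console.log(client.name);                       // 'Tquila'
console.log(toServerAccount(client).Active__c); // 'Yes'
```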

Written by Wes

June 22, 2012 at 11:06 am

Using the Heroku Shared Database with Sinatra and Active Record

with 6 comments

ActiveRecord is an amazing (mostly) database-agnostic ORM framework and so it’s a natural choice to use with non-Rails frameworks such as Sinatra. Note that I’ll be using sqlite3 locally but the Heroku Shared Database is a Postgres database so I’ll be setting my environments appropriately.

In this post I’ve assumed that you have a Sinatra app that is working locally and on Heroku.

Getting it working locally

First up you’ll need a few extra gems in your Gemfile, once again note that I’m using different databases in development, test and production environments.

source 'http://rubygems.org'

gem 'sinatra'
gem 'activerecord'
gem 'sinatra-activerecord' # excellent gem that ports ActiveRecord for Sinatra

group :development, :test do
  gem 'sqlite3'
end

group :production do
  gem 'pg' # this gem is required to use postgres on Heroku
end

Don’t forget that you’ll need to install the gems using bundler.

bundle install

And you will need to “require” the appropriate files in either your app configuration or the main app controller, the choice is yours :)

require 'sinatra/activerecord'

At this point you need to provide information that tells your app how to connect to the database.

configure :development, :test do
  set :database, 'sqlite://development.db'
end

configure :production do
  # Database connection
  db = URI.parse(ENV['DATABASE_URL'] || 'postgres://localhost/mydb')

  ActiveRecord::Base.establish_connection(
    :adapter  => db.scheme == 'postgres' ? 'postgresql' : db.scheme,
    :host     => db.host,
    :username => db.user,
    :password => db.password,
    :database => db.path[1..-1],
    :encoding => 'utf8'
  )
end

You can include this information in your Sinatra app file but I suggest putting the information into a separate configuration file. I keep mine in a file ‘/config/environments.rb’. If you do this you’ll have to include it in your Sinatra app file(s).

require './config/environments'

In order to use migrations (to set up your object model) you’ll need to create a Rakefile with the following code.

# require your app file first
require './app'
require 'sinatra/activerecord/rake'

At this point you can use the typical ActiveRecord Migration syntax to create your migration files, for example:

rake db:create_migration NAME=create_foos

This creates a migration file in ‘./db/migrate’ and this file will be used to create your database table on migration. You will also need to create a model class that acts as the “bridge” between your app and the database table – for the foos table that’s simply “class Foo < ActiveRecord::Base; end”.

class CreateFoos < ActiveRecord::Migration
  def self.up
    create_table :foos do |t|
      t.string :name
    end
  end

  def self.down
    drop_table :foos
  end
end

As with the database environment details this code can be included in your main app file, but you should put it into its own file and require that in your app instead. Once you’ve done this you can run the following to create the database tables – this is only a local operation for now.

rake db:migrate

At this point you should have a local table and method to apply any CRUD action to said table.

And now for Heroku

Before pushing your new app to heroku you’ll need to add the Shared Database addon.

heroku addons:add shared-database

Commit and push your code to Heroku after which you’ll need to rake the remote database.

heroku rake db:migrate

And that’s it. You now have ActiveRecord working locally and remotely and can develop in a consistent way. Aw yeah.

Written by Wes

April 22, 2012 at 4:09 pm

Posted in Ruby


Voodoo – A Todo list that demos the power of KnockoutJS

with 6 comments

Voodoo - A todo list

This small demo app will demonstrate the usage and power of JavaScript MVC frameworks, and in particular KnockoutJS. You can learn more about the framework through the tutorials on the KO site and in the framework documentation, so I will gloss over some of the details. My goal here is to give you a high-level sense of what’s possible. The picture alongside shows what we’re building. You can find the demo here and the full source code here.

The HTML

Strictly speaking jQuery is not required for KO to work, but you will often include it as a helper for the framework. As always you need to start with the static resource inclusions.

<script type="text/javascript" src="js/jquery-1.7.1.min.js"></script>
<script type="text/javascript" src="js/knockout-2.0.0.js"></script> 

And you’ll need a form in order to create new todo items.

<form data-bind="submit: addTask" id="create-todo">
    <input class="new-todo" data-bind="value: newTaskText" placeholder="What needs to be done?" />
</form>

For the first time you’ll notice the data-bind attribute. The framework recognises this attribute and parses the attribute value to determine what logic to apply. In this case the input element is bound to a JavaScript property called newTaskText.
Next up you need the markup that contains and displays each task. Some actions are available for each item too.

<div class="todos">
  <ul data-bind="foreach: tasks, visible: tasks().length > 0" id="todo-list">
      <li>
        <div class="todo" data-bind="css: { editing: isEditing }, event: { dblclick: startEdit }">
          <div class="display" data-bind="css: { done: isDone }">
            <input type="checkbox" class="check" data-bind="checked: isDone" />
            <div class="todo-text" data-bind="text: title"></div>
            <a href="#" class="todo-destroy" data-bind="click: $parent.removeTask">&times;</a>
          </div>
          <div class="edit">
            <form data-bind="submit: updateTask">
              <input data-bind="value: title" />
            </form>
          </div>
        </div>
      </li> 
  </ul>
</div>

Again you’ll notice that each element that is to be used in some way by KO has a data-bind attribute. Below I’ve picked out a few lines to demonstrate key functionality. The following line is an instruction to run through the collection of tasks and only display the ul element if there’s anything in the collection.

<ul data-bind="foreach: tasks, visible: tasks().length > 0" id="todo-list">

The line below is used to conditionally apply a style class and ensures that the doubleclick event is bound to the appropriate handler.

<div class="todo" data-bind="css: { editing: isEditing }, event: { dblclick: startEdit }">

And here we have an example of an input element being bound to a JavaScript object field isDone – the object structure will be shown later.

<input class="check" type="checkbox" data-bind="checked: isDone" />

Now here’s some of the magic of KO. Below are some stats based on the number of tasks in the list. If you were using jQuery or plain JavaScript you would have to track the number of elements in the list and update the stats appropriately.

You have <b data-bind="text: incompleteTasks().length">&nbsp;</b> incomplete task(s)
<span data-bind="visible: incompleteTasks().length == 0"> - it’s beer time!</span>

With KO the view is driven by the underlying object data. If the number of items in the list changes all related information is automatically updated in the view! In KO this is facilitated through concepts known as observables and dependency-tracking.
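If you’re curious how that’s possible, here’s a tiny framework-free sketch of the observable idea – not KO’s actual implementation, just the core pattern of a value that notifies its subscribers when it changes:

```javascript
// A minimal observable: a function that stores a value and notifies
// subscribers on change - the idea behind ko.observable, stripped of
// KO's automatic dependency tracking.
function observable(initial) {
  var value = initial;
  var subscribers = [];
  function accessor(newValue) {
    if (arguments.length === 0) return value; // called with no args: read
    value = newValue;                         // called with an arg: write
    subscribers.forEach(function (fn) { fn(value); });
  }
  accessor.subscribe = function (fn) { subscribers.push(fn); };
  return accessor;
}

var tasksCount = observable(0);
tasksCount.subscribe(function (n) {
  console.log('You have ' + n + ' incomplete task(s)');
});
tasksCount(3); // logs: You have 3 incomplete task(s)
```

KO builds on this by having computed values and bindings subscribe themselves automatically, which is why the view updates without any bookkeeping on your part.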

The JavaScript

KO is the first framework in some time that has had me using OOP within JavaScript, and it’s a pleasure to work with the concepts in such a paradigm! In this small app there are only 2 classes: one for tasks (fairly obvious) and another for the ViewModel, which you can consider the application class.
The Task class contains the properties and methods applicable to tasks. You’ll notice how the properties are initialised using the ko.observable() method. This is a touch more magic and it means that the values of these properties will be “watched”. If they are changed, either through the user interface or via JavaScript, then all dependent view elements and JavaScript values will be updated too.

function Task(data) {
  this.title = ko.observable(data.title);
  this.isDone = ko.observable(data.isDone);
  this.isEditing = ko.observable(data.isEditing);

  this.startEdit = function (event) {
    this.isEditing(true);
  }

  this.updateTask = function (task) {
    this.isEditing(false);
  }
}

The ViewModel class exposes the Tasks in a meaningful way and provides methods on that data. Types of data exposed here are observable arrays of tasks and properties that return the number of complete and incomplete tasks. The operations are simple add and remove functions. Right at the end of the class I’ve used jQuery to load JSON objects into the todo list.

function TaskListViewModel() {
    // Data
  var self = this;
  self.tasks = ko.observableArray([]);
  self.newTaskText = ko.observable();
  self.incompleteTasks = ko.computed(function() {
    return ko.utils.arrayFilter(self.tasks(),
    function(task) {
      return !task.isDone() && !task._destroy;
    });
  });

  self.completeTasks = ko.computed(function(){
    return ko.utils.arrayFilter(self.tasks(),
      function(task) {
        return task.isDone() && !task._destroy;
      });
  });

  // Operations
  self.addTask = function() {
      self.tasks.push(new Task({ title: this.newTaskText(), isEditing: false }));
      self.newTaskText("");
  };
  self.removeTask = function(task) { self.tasks.destroy(task) };

  self.removeCompleted = function(){
    self.tasks.destroyAll(self.completeTasks());
  };

  /* Load the initial data - 'data' is an array of task JSON
     objects (defined elsewhere, e.g. fetched with jQuery) */
  var mappedTasks = $.map(data, function(item){
    return new Task(item);
  });

  self.tasks(mappedTasks);
}

The very last line in the JavaScript code – ko.applyBindings(new TaskListViewModel()) – tells KO to apply all its magic using the ViewModel and markup we’ve written.

Summary

To me it’s amazing how little code you need to write in order to build such a neat app. And you don’t even need to track the view state at all! Hopefully this gives you the confidence to start using JavaScript MVC/MVVM frameworks because in the end it helps save you heaps of time and effort.

Written by Wes

March 23, 2012 at 5:58 pm

The rise of JavaScript and its impact on software architecture

with 4 comments

MVC and its siblings have been around for a while, and developers are comfortable bathing in the warm light of their maturity and widespread advocacy. However, a few years ago developers started doing more of their coding client-side, and as a natural consequence the lines between M, V and C became blurred, leaving many of us cold and uncomfortable when trying to explain where the architectural puzzle pieces belong.

I’m sure you’ve had a similar experience. Anyone who’s used jQuery, for example, has been in the uncomfortable situation where controller code now exists within the view, and even worse, the two are tightly coupled by virtue of jQuery selectors. To make matters more complicated, if you’ve ever used class names for application state or .data() then your model, view and controller are now more tightly bound than the figures in a Kamasutra carving.

This is not a new problem but the solution(s) are quite new to me and so I thought I’d share my experiences.

jQuery is Great. But…

Read the rest of this entry »

Written by Wes

March 18, 2012 at 6:05 pm

Salesforce: JavaScript Remoting and Managed Packages

with 16 comments

I love the crap out of JavaScript Remoting, but came across a small bug when wrapping up the code in a managed package. As many of you know, when you create a managed package it prepends your code with a unique prefix to prevent naming conflicts, e.g. a page controller called “MyController” becomes “MyPackage.MyController” where “MyPackage” is the prefix you’ve chosen for your managed package.

The bug I’ve found is caused by the fact that the prefix isn’t applied to the JavaScript that calls your Apex remoting methods, i.e. you might have an Apex method called “myMethod” which is called like so outside of a managed package environment:

MyController.myMethod(parameters, function(result, event) {
  callback(result);
}, {escape: false});

Once you package up your code however this call will no longer work, and if you look in the debugging console of your browser you’ll find an error something like: “MyController is not defined”

This is because in the managed package environment “MyController” actually doesn’t exist but is now called “MyPackage.MyController”! @greenstork and others have come up with a solution for this and it looks something like:

[Edit] One of the Salesforce guys has given me a very neat workaround:

// Check if "MyPackage" exists
if(typeof MyPackage === 'undefined'){
  // It doesn't, so create an object with that name
  window["MyPackage"] = {};

  MyPackage.MyController = MyController;
}

// All code now refers to MyPackage.MyController
MyPackage.MyController.myMethod(parameters, function(result, event) {
  callback(result);
}, {escape: false});

I’ve posted a message on the forums about this issue and Salesforce is aware and is working on it. Now that’s great customer service!

As an aside I’d love to know how they’re going to solve this issue! It’s quite complex because their compiler needs to run through all of your JavaScript code (including any libraries you might have included) and try to figure out what code is actually making remoting calls, and prefix that exclusively! This is a new problem for managed packaging because for the first time they need to work on code that isn’t necessarily 100% part of their platform. This is further complicated because you can Zip your resources. An interesting challenge indeed...

Written by Wes

February 26, 2012 at 7:32 pm

Posted in SalesForce
