In my role at Westbrook we go to great lengths to deliver innovative solutions to our customers, and within this we aim for a best-practice, low-impact delivery model. One way to deliver on this is to leverage pre-configured packages from the AppExchange, reducing the need to implement complex customisations. We work closely with a number of ISV partners to help deliver this to our customers in an efficient manner. Alongside this we also like to add to this ecosystem in a small way, whenever we see a need and a unique requirement.
We have a range of “Let’s Talk About..” apps on the AppExchange. The most popular of these has been a simple but very effective relationship tool called “Let’s Talk About The Weather!”, which plays to a typical sales-focussed requirement: “I need ways to generate conversation and an angle during sales calls”… (You can discover more about it on the AppExchange.)
Let’s Talk About The Weather has now been installed over 350 times since its release, 140 of those in the last 6 months. We have noticed an increasing trend of installs over the past 6 months, in particular from Salesforce Sales Engineers (SEs). SEs play a critical role in selling into new opportunities: they propose early solution architectures and map use-cases to customer requirements, whilst also delivering dynamic and functional demonstrations during customer workshops.
Customers love Let’s Talk About The Weather, because it gives them an opening and helps connect to their potential customers.
Sales Engineers love Let’s Talk About The Weather, because it brings an extra dimension and use case to sales demonstrations.
I keep coming back to the same thought, over and over again, year on year:
Why can only Apple design really usable technology products and services?
I won’t list them all here, I just don’t have the time, but with every new technology product that I adopt, when I need to make that decision between competing vendors, I like to do the Mum test… Consider the competing products/services/gizmos – for example, the iPhone versus an Android phone – and ask yourself: “Which one would I give to my Mum?”
The answer is always the same: the Apple! Simply because they reliably create simple, thought-through, intuitive, honest, user-orientated solutions. The definition of a good design.
A good design is always the simplest possible working solution.
I was reminded of this thought today when I came to set up my iPhone to make Apple Pay payments. (I won’t go into detail of how to do this; it’s not relevant and you don’t need a tutorial – it’s intuitive.) It all just worked, the experience was simple and I just knew how to use it. The Apple Pay Passbook interface looked just like my old leather wallet; the image below shows that, once again, Apple have the simplest possible working solution.
With NFC payments Google were first, as they often are, but it took Apple to make it work for humans and Mums… again!
Firstly, my genuine apologies are due to many past clients! It seems that all of those ERP integrations I have implemented over the years – nightly batch jobs, middleware synchronizations and roll-yer-own data syncs – were all sub-optimal solutions! I’m sorry!
Salesforce aligned with OData (the Open Data Protocol) about a year ago with its introduction of External Data Sources and External Objects. Before this, building large-scale ERP data synchronizations was the norm. More than that… it was an advanced integration, a task that I always welcomed; it often required the selection of some form of enterprise middleware platform and much detailed data-flow design. The final solutions delivered incredible alignment between Salesforce and back-office systems, allowing businesses to finally begin to align sales and operations and close the data divide that was present in most organizations between front- and back-office platforms. But it was nowhere near real time – the data was old as soon as it was updated – and in today’s faster-moving world the demands are greater!
At Westbrook I have recently been working with some key clients that had previously implemented sub-optimal processes, bringing rather more data into the Salesforce environment than is considered ideal. These back-office integrations have been replaced with standards-based OData external data access solutions.
What is OData and Salesforce Lightning Connect?
OData is an open protocol that describes the schema of a data source or database. It can be implemented over many different database types and follows a RESTful approach. From the perspective of our use, it allows access to multiple external data sources through a reliable, standards-based approach. It represents an interface to our external data; somewhere behind the OData service is an actual data source or database.
Lightning Connect is the term that describes the overall setup that includes External Data Sources and External Objects.
External Data Sources are the configuration element in Salesforce that defines the connection to an OData service; here we define any connection attributes and the endpoint of the OData interface onto the data.
External Objects are the Salesforce custom objects that are created automatically from the OData schema document. You can add additional custom fields to these if needed. These external objects follow the naming convention Object_Name__x, and although they do not store any data in Salesforce they look and feel like other custom objects that do.
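Once synced, an external object can be queried with SOQL just like a regular custom object. As a sketch (the Transaction__x object and its field names here are assumptions based on the example schema, not the actual generated names):

```apex
// External objects are queried like any custom object, but each SOQL query
// results in a live OData callout to the backend database -- no data is
// copied into Salesforce.
List<Transaction__x> recent = [
    SELECT ExternalId, Customer_ID__c, Product_Code__c, Amount__c
    FROM Transaction__x
    WHERE Customer_ID__c = 'ACC-00042'
    LIMIT 10
];
for (Transaction__x t : recent) {
    System.debug('Transaction ' + t.ExternalId + ' amount: ' + t.Amount__c);
}
```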
A large industrial company wishes to display sales order history from its ERP in Salesforce, related to its active accounts; this provides valuable information for the sales and account management teams.
Backend datastore: for this we will use a traditional relational database, a cloud-based PostgreSQL instance hosted on Amazon RDS.
OData interface: Salesforce is not the OData service provider and it’s not baked into most databases, so we need an interface; this can be something you host on your own infrastructure. I am aware of an open-source PHP OData connector, and I’d imagine others are available. For cloud simplicity here I am using the DataDirect Cloud service from Progress; it’s a paid service but takes away the cost of building, hosting and maintaining your own connector. They offer a 30-day free trial.
Data: within the SQL database we have around 1 million transactions and also the full product reference of around 1,000 products. The transactions all contain a customer account number identifier that we will use to map to the Salesforce account record.
Prerequisite: you already have access to a backend SQL data source. If you are just prototyping, spin up an Amazon RDS PostgreSQL database; note the MySQL community edition is not supported for OData through Progress, and I had trouble setting up a MSSQL instance (but that might be me… or MS!).
Set up DataDirect: sign up at progress.com/datadirect-cloud to get started. Enter the typical connection information to create your data source. Then, on the OData tab, we use the built-in tool to generate the schema JSON. Save, and we are provided with the OData service access URI, similar to this: https://service.datadirectcloud.com/api/odata/RdsPostgrSql. This service is authenticated with DataDirect Cloud credentials.
Set up the Salesforce External Data Source: Setup->Develop->External Data Sources. We create a new data source, specifying a type of Lightning Connect: OData 2.0, a service URL that is the DataDirect access URI we got earlier, and Authentication details of Named Principal with Password authentication, leaving all others as defaults (the format provided by DataDirect is AtomPub).
Generate and sync the External Objects: pressing Validate and Sync will present you with the available tables to sync. Note the “sync” part here refers to synchronizing the data schema and not the actual data, as of course no data is stored in Salesforce.
Set up user access: just like a custom object, provide object- and field-level access to the users you want through profiles or permission sets. Additionally, create a custom tab so you can quickly access the object and use list views. For my Transactions external object there are around 1.2 million transactions, and not a single row counts against your Salesforce data storage quota!
What makes it really useful is that we can apply some additional configuration to these external objects: we can add new fields or configure existing fields to model relationships.
In my example above I have changed the Product_Code field to be an “External Lookup” field, which is like a regular look-up field but pointing to another external object; this means we can drill to the product detail from the transaction level and see a transaction related list on the Product page.
Also I have changed the Customer_ID field to be an “Indirect Lookup” field. This is a really exciting new type (which I am waiting to see included as a standard field type on regular objects); it performs a dynamic lookup to another object, but using an external Id field rather than the standard sfdc Id field. Kind of like an old-fashioned primary-key/foreign-key relationship in a relational database! This is important because most back-office systems are not Salesforce-aware, and so the data will not have embedded Salesforce Ids. The dynamic part of this field is that it will look up to the related object if it finds a matching external Id, but if there is no match the data will still be available; this wouldn’t be possible with an sfdc Id relationship. It also gives a reverse lookup capability, where for example all transaction data may already be available through the integration, but the parent account is not in Salesforce. When eventually the account manager creates the account record and enters the Customer ID, all historic transactions for that account will instantly be related. Nice!
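As a sketch of what the indirect lookup gives you in code (the Transactions__r child relationship name and the My_Customer_ID__c external Id field on Account are assumptions for this example):

```apex
// With the indirect lookup in place, ERP transactions can be pulled through
// the account via a normal relationship query -- the join is made on the
// external Id value, not on a Salesforce Id embedded in the ERP data.
Account acc = [
    SELECT Id, Name,
           (SELECT ExternalId, Product_Code__c, Amount__c FROM Transactions__r)
    FROM Account
    WHERE My_Customer_ID__c = 'ACC-00042'
    LIMIT 1
];
System.debug(acc.Name + ' has ' + acc.Transactions__r.size() + ' ERP transactions');
```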
The old order of traditional business intelligence and analytics is changing, Salesforce brings the cloud, consumer web influence and fast user focused data visualisations that empower the question-answer-question analysis flow.
Analytics Cloud Pioneers
Recently I was lucky to receive 4 days of intensive training, review and product demonstration with the UK Analytics Cloud Pioneers group, led by Anil Dindigal, Director of Cloud Analytics, Salesforce. The Analytics Cloud Pioneers is a group of key specialists from the Salesforce and Business Intelligence partner communities, that have been provided early access to the Analytics Cloud to enable greater insight, product feedback and innovation.
During these sessions the group received a full review of the current capabilities of the platform along with insights into upcoming features and the overall direction of Analytics Cloud. These were long days with a large amount of detailed and complex knowledge transfer and lots of great hands-on activities; below I have distilled it down to some key takeaways.
The Analytics Cloud solution represents both a BI platform (the technology that structures and stores the data and provides the analytics engine) and an analytics app (the visualization GUI that communicates data to executives).
The App and the Platform can be used together or leveraged separately via APIs.
The App UI is fast, really fast and beautiful. The platform churns over and delivers data at speeds that are beyond any traditional BI expectation.
Analytics Cloud takes a copy of all data for analysis, and supports data from multiple sources: Salesforce data and any other data you choose to throw at it – SAP, ERP, web logs, etc.
Salesforce data can be scheduled for load through Analytics Cloud.
External data needs some other mechanism to get it loaded; usually this is an ETL tool, but manual loads and custom processes via the API are feasible too.
Analytics cloud architecture is not based on a traditional OLTP relational structure. All data is stored in json data structures as either columnless store or inverted index structures.
Many of the rules around data have changed, we like to flatten, to denormalise, we like to duplicate data and to transform it, and we like to store data at the lowest grain level; the detail row level. “Within the Analytics Cloud we should treat data like putty.”
All data values are divided into two main types, dimensions and measures. Measures – Numerical values that will be aggregated to generate reportable metrics or KPI’s. Dimensions – All other data that represents information, most commonly used to group or filter within reports.
The Analytics Cloud platform will handle large volumes of data, up to 500 million rows per instance, and reports over millions of rows still run with a fast user experience.
The query engine behind Analytics Cloud (branded SAQL) is built on technology from the Apache Pig module of the Hadoop project and is based on Pig Latin, a powerful query language designed for analysing very large data sets.
This means we can perform all sorts of complex operations that we have become used to not being able to do in salesforce reports and dashboards, such as dynamic joins, generating temporary datasets, or performing calculations on the fly.
At this moment of rapid development of the platform, implementations can be highly complex as many of the advanced capabilities can only be achieved by customizing json schemas, developing queries and making changes to the underlying dashboard scripts.
A quick demonstration
The proof is in the demonstration. The thing anyone new to Analytics Cloud needs to experience is the user interface, the speed, and the concept of just how much data is being analysed and delivered on the fly.
There are many different use-cases for positioning Analytics Cloud, from full-on global corporate KPI reporting down to data analysts or business executives doing ad-hoc data exploration. In this brief overview I drill into a data set of 2014 house sales from the United Kingdom.
The data set contains 850,000 rows of data, every operation queries against this full data set. Here I am exploring the data from the top down national view and drilling in to identify the grain level details. This exact use case fits well with business scenarios where we are looking to spot trends or outliers.
Here we see the creation of a “lens” – a single reporting component – and the clipping of multiple lenses to create a very simple dashboard, where all items on the dashboard are dynamically bound. All of this, from the user perspective, can be achieved in just a few minutes.
(*More powerful dashboards do require full specification and technical work to implement.)
Just to set some perspective.
I am not a market analyst and don’t really have the depth of exposure needed to fully communicate what will happen around the BI industry once Analytics Cloud gets some pace, but below are some simple observations:
The BI industry is quoted as being a $38 billion industry, whereas the CRM industry is quoted as being worth $36.5 billion by 2017. The new Salesforce venture into BI may surpass the scale of its current operations.
Salesforce are planning a full set of professional certifications (3+ different grades) to support the new platform, providing a new professional infrastructure for a new world of cloud BI.
Currently Salesforce is the biggest player in the CRM market with a 16% share.
I spend lots of my time connecting Salesforce to other platforms and systems; the most fun way these days is with some form of API or web-service.
Connecting system-to-system to integrate at a data level is a fairly standard use case, so I was really excited when I heard about the littleBits electronics kits and in particular their CloudBit component; which turns any littleBits input or output circuit that you have built into an internet connected device. Wow, now you can get your own gear out there on the “Internet of Things!”
In the course of my work with Westbrook, I have had a few conversations recently with clients around the concept of integrating devices with Salesforce; it’s an interesting area with a huge amount of really-quite-do-able potential.
Here I am going to put together a simple example. The requirement is to display data on a connected device that relates to a customer in Salesforce: when I view a Lead in Salesforce, I want to display the weather temperature for their city on an “Old Skool” display unit on my desk!
Luckily I have already built the app that gets my customers’ weather data (Let’s Talk About The Weather, on the AppExchange), so now the task is to send this data through to my littleBits internet-connected device.
The integration architecture will be as follows:
The key steps are:
Get yourself all the littleBits modules and set up the CloudBit as per the instructions.
Create the callout code in a helper class as per your requirements.
Call the setCloudBit method as you desire, maybe from a Visualforce page onload action as I have done, or from a trigger. Whatever you need.
Done – Your connected device is now on the Internet of things.
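As a sketch of the callout helper from the steps above: the endpoint shape follows the littleBits Cloud API as documented at the time, and DEVICE_ID and ACCESS_TOKEN are placeholders for your own device values. The endpoint also needs adding to Remote Site Settings.

```apex
public class CloudBitHelper {
    private static final String DEVICE_ID = 'your-device-id';     // placeholder
    private static final String ACCESS_TOKEN = 'your-api-token';  // placeholder

    // Fire-and-forget: @future(callout=true) lets us call out from a
    // Visualforce page action or a trigger without blocking the save.
    @future(callout=true)
    public static void setCloudBit(Integer percent, Integer durationMs) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://api-http.littlebitscloud.cc/devices/'
                        + DEVICE_ID + '/output');
        req.setMethod('POST');
        req.setHeader('Authorization', 'Bearer ' + ACCESS_TOKEN);
        req.setHeader('Content-Type', 'application/json');
        req.setBody('{"percent": ' + percent + ', "duration_ms": ' + durationMs + '}');
        HttpResponse res = new Http().send(req);
        System.debug('CloudBit response: ' + res.getStatusCode());
    }
}
```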
I am not an expert on the SOAP specification, and even less so when it relates to off-platform services, but sometimes we need to stretch into the unknown and less desirable areas of neighboring technologies… I feel much happier around JSON and REST web services!
I was integrating recently to some external web-services that are built on WCF (Windows Communication Foundation), as part of this I needed to authenticate the user to get access to the service. Often there are some methods defined in the service for this purpose or we use a standard http or oAuth approach.
However, with WCF, WSSE or WS-Security (SOAP security) is used; this passes the credentials via the SOAP header (not to be confused with the HTTP header). Unfortunately SOAP headers are not supported by the WSDL2Apex import tool, or by any other parser tool I have tried (FuseIT Explorer is a great tool for parsing WSDLs). With WSSE we pass a UsernameToken element within the SOAP header that contains our user credentials.
So if you need to integrate to a WCF service and use WSSE security from Salesforce, you will need to customise the generated Apex stub class: add classes to define the structure of the header elements, and then instantiate these from the binding class.
There are lots of discussions and similar issues on the web relating to this, but none gave me a full solution; it took me a while to piece it all together, so here are the steps I found useful:
1 – The XML structure that you need to generate will likely look like the below
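As a sketch, based on the standard WS-Security UsernameToken profile (namespace URIs and element names per the OASIS specification; the credentials are placeholders):

```xml
<soapenv:Header>
  <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
    <wsse:UsernameToken>
      <wsse:Username>myUser</wsse:Username>
      <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">myPassword</wsse:Password>
    </wsse:UsernameToken>
  </wsse:Security>
</soapenv:Header>
```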
Most people who work in or around the Salesforce platform are aware of the concept of “Governor Limits”. We often speak of these limits disparagingly, as something to be ashamed of, or a dirty secret that those within a traditional IT infrastructure environment may use against us at some point in the future. However, there are two sides to all things.
For some time now my view of these limits has been the opposite. Salesforce Limits are Powerful.
I prefer to see each limit as a capability. Each capability generally refers to the amount of stuff you can do in a single execution context, be that a button press, a trigger on save, or a scheduled batch process. So, for example, every single time a user edits, creates or saves a record, you can take advantage of these Salesforce limits.
I work a lot with web-service integration: making call-outs to external systems to get or send data. A great example of this is around the future and call-out limits. These have just been increased in Winter ’15, giving us even more power! They are as follows:
Maximum number of future methods per execution = 50
Maximum number of web-service call-outs per execution = 100
Now consider that we often choose, or need, to make call-outs asynchronously through the use of future methods. This means that every single time an event occurs we can now make 5,000 call-outs (50 × 100) to external systems to retrieve or send data. Every time a user clicks save we can send a message to 5,000 other systems! That’s powerful! Too powerful to believe, so I had to test it!
* Governor limits are runtime limits enforced by the Apex runtime engine to ensure that code does not misbehave in a multitenant infrastructure; shared resources across all customers, partners, and developers.
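A sketch of that fan-out (the endpoint is a placeholder; in a real test each future method would carry its own payload):

```apex
public class CalloutFanOut {
    // Each @future method runs in its own execution context, so it gets
    // its own allowance of 100 call-outs.
    @future(callout=true)
    public static void makeHundredCallouts() {
        for (Integer i = 0; i < 100; i++) {
            HttpRequest req = new HttpRequest();
            req.setEndpoint('https://example.com/endpoint'); // placeholder
            req.setMethod('GET');
            new Http().send(req);
        }
    }

    // Called from a trigger or page action:
    // 50 future methods x 100 call-outs each = 5,000 call-outs per event.
    public static void fanOut() {
        for (Integer i = 0; i < 50; i++) {
            makeHundredCallouts();
        }
    }
}
```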
We all know how fundamental LinkedIn has become to corporate networking. This applies to all areas of our professional lives, and is particularly powerful in getting our sales teams closer to their new prospects, leads and accounts.
LinkedIn has also become another administrative overhead, another task on the long list of things that a sales rep must do in, around and instead of selling. A number of recent reports identify that sales reps spend 59% of their time *not* selling! (Accenture, Sales Performance)
So at Westbrook we have implemented an automated LinkedIn invitation process that has proved successful: it allows the team to automatically send out LinkedIn invites to appropriate prospects and connect whilst they sleep! I personally have just returned from my summer vacation, and during that time I received daily alerts from LinkedIn telling me that a new prospect had accepted my LinkedIn request.
This specific requirement was around new leads generated from AppExchange installations, but it could just as easily be from any lead generation source or account assignment process.
“When a new Lead is created with Source “AppExchange” send out a LinkedIn invitation to the prospect client that installed the app.”
“When a Lead is assigned to a sales director, send out a LinkedIn invitation from the sales director to the prospect.”
We use Zapier to deliver on this requirement. Zapier provides a cloud trigger service which acts as the glue between around 300 cloud services or apps (e.g. Google Docs, Twitter, Basecamp, GoToMeeting, etc). These apps can be glued together with “zaps” to make them communicate. Many zaps can be built for free; however, the more commercial ones such as Salesforce or GoToMeeting are classed as “Premium” and need a payment plan. The plans start at $15, which is an insignificant amount for any commercial use.
* I do love Zapier – but it troubles me a little, as it makes it just so easy to set up “new world” integrations, app-to-app or cloud-to-cloud flows. Luckily they haven’t yet disrupted the ERP or legacy IT system integration world!
The integration architecture
The implementation steps
Get a “Lead-in” process sorted out.
Set-up Lead assignment rules.
Fork out for a Zapier account, it starts @ $15 per month, you can afford it!
Create a new Zapier Zap from Salesforce to LinkedIn for each LinkedIn user you want to generate invites for.
Trigger the zap on a new lead, filtering by owner.
Also filter the leads to those with first name, last name and email. LinkedIn requires you to “know” the person in this way to send invites. If your lead-in process doesn’t get you these details, then sort that out!
Set up the invite text to send from LinkedIn… “Hi John, you are amazing…”
Test and activate the Zap.
Get the team out selling, one more admin task has been removed!
I love public and open API data sources, they are what makes the web connected and programmable.
Sometimes I find an API that is completely open and makes my life really easy; but often, unfortunately, we need to jump through some hoops to get to the free data. Recently I have been integrating with the Yahoo Finance web services (I am not covering the service or data detail here, as there were a few hoops there too! – but this is a great resource for Yahoo Finance), and for development purposes there is no need to authenticate at all; we can make free unauthenticated calls.
However for production purposes, the limits are as follows:
Unauthenticated: up to 1,000 calls/day
Authenticated: up to 100,000 calls/day
So if you need to make more than 1,000 calls, we need to authenticate with the Yahoo OAuth process. Digging into the docs, we see that we can use “two-legged” OAuth authentication as opposed to “three-legged”. Essentially this means that we can use OAuth flows for system integration without going through the extra user authentication steps: we just need to supply our credentials, receive the oauth_token and then continue to make our requests.
Hoop 1 : Get a Yahoo API Key
First you need a Yahoo account; these days that’s not so common amongst tech types, as we all seem to prefer Gmail.
Create a “Project” here to represent your integration and generate an API key. Nothing is critical on this page, but select the free options.
You now have the Consumer Secret & Consumer Key.
Important: there seems to be a bug – the keys don’t work unless you check at least one of the “select APIs for private user access” options at the bottom of the page, even though we don’t intend to use or authenticate to user data. So go and tick some of these. Note: your Consumer Key and Secret will change every time you modify this page, so store the consumer data only once you are done.
Hoop 2: Make a callout to get the OAuth request token
We make an HttpRequest as below, with explanations of parameters where they are not obvious:
loadConsumerSettings(); // get the key/secret from custom settings
Datetime nowTime = Datetime.now();
String oauthTimestamp = String.valueOf(nowTime.getTime() / 1000); // produces a Unix timestamp like '1406105798'
String oauthNonce = oauthTimestamp + String.valueOf(getRandomInt(1, 9999)); // "number used once" -- a unique value that we don't reuse
// build the URL
url += '&oauth_signature=' + OAUTH_CONSUMER_SECRET + '%26'; // for the request-token call, the signature is the consumer secret with '&' (URL-encoded as %26) appended
Hoop 3: Extract the OAuth response and build into the API call that you need.
In the OAuth call above we pull out and store the OAUTH_TOKEN and OAUTH_TOKEN_SECRET; we then need to send these as parameters on the final Yahoo Finance calls. So any future Yahoo Finance call needs the OAuth credentials appended, and I created a utility class to add these to the callout URL for reuse.
// add on the OAuth authenticated part
Datetime nowTime = Datetime.now();
String oauthTimestamp = String.valueOf(nowTime.getTime() / 1000); // produces a Unix timestamp like '1406105798'
String oauthNonce = oauthTimestamp + String.valueOf(LTAYS_YahooOauth.getRandomInt(1, 9999)); // "number used once" -- a unique value that we don't reuse
This business is awesome! That’s the cloud, IT, business solutions, implementation, Salesforce, mobile and collaboration business, of course!
Everything is just ticking along fine, and then you spin round and realise that everything has changed! Again!
There are a number of fundamental changes I’m noticing here:
Mobile push notifications – we now have much easier access to them.
The App democracy – we can now get much more of what we need via an app or service for a few $.
Devices – these are now our personal assistants, and we need them to tell us stuff as opposed to having to check for stuff.
SO… I have been looking for an easy way to send out push notifications from Salesforce for a while. I was excited to learn of the new Push Notifications feature coming in the Salesforce Summer ’14 release, and then a little disappointed when I realised I would need to build out a native Android mobile app…
BUT… with some creative thought and the App democracy, we can now build out a process to deliver push notifications, in real time, to any device (well… iOS, Android and desktop) from a Salesforce trigger, executing in accordance with your business process logic.
This is what I did:
When we get a new AppExchange install of one of our apps, we need to send an instant mobile push notification to a group of users.
(But your requirement could just as easily be, notify all sales managers when a high value opportunity is won.)
1. Configure Leads. The Salesforce AppExchange will push out a new lead to your org if configured. Set this up!
2. Create an account on Pushover. We are using Pushover as the 3rd-party notification service; it handles the delivery well and provides a certain level of free developer pushes.
– Create an App within Pushover, give it a name and an icon, and grab the API token. That’s all!
– Install the Pushover App on your devices
– Grab the user api/id from the App settings on the device
– Add these users to the Group configuration in Pushover
3. Build a trigger on the Lead object to fire after insert, in line with the specified criteria.
4. Make an API callout to the Pushover notification service, which will handle the gubbins of interacting with Google Cloud Messaging (GCM) and the Apple Push Notification service (APNs).
The Pushover API docs cover all of this, and you can easily test the service by making a POST request through Runscope. … below is an example call from Runscope. Easy!
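As a sketch of steps 3 and 4 combined, here is a minimal helper class that a Lead after-insert trigger could call (the message endpoint and parameters follow Pushover’s public API; the app token and group key are placeholders, and the endpoint must be added to Remote Site Settings):

```apex
public class PushoverNotifier {
    private static final String APP_TOKEN = 'your-app-token';   // placeholder
    private static final String GROUP_KEY = 'your-group-key';   // placeholder

    // Called from the after-insert Lead trigger; @future(callout=true)
    // because triggers cannot make call-outs synchronously.
    @future(callout=true)
    public static void sendPush(String message) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://api.pushover.net/1/messages.json');
        req.setMethod('POST');
        // Pushover accepts form-encoded token, user/group key and message
        req.setBody('token=' + APP_TOKEN
                    + '&user=' + GROUP_KEY
                    + '&message=' + EncodingUtil.urlEncode(message, 'UTF-8'));
        HttpResponse res = new Http().send(req);
        System.debug('Pushover response: ' + res.getStatus());
    }
}
```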