Zapier is now even zappier with multi-action workflows!

Zapier really bowled me over a couple of years ago when I first stumbled across their service.

I have worked around the Salesforce platform for nearly 10 years now and much of that has been focused on complex projects that require system, process or data integrations.

When I first used Zapier I was amazed at the simplicity of their approach and how this contrasted with a traditional data integration implementation.

 

Over the years I have listened to many sales pitches from heavyweight integration middleware providers describing the possibility of integration in weeks, or days… or even hours! These sessions always left me with a knowing smirk on my face… knowing that the reality would be a 6-12 month implementation programme.

Then Zapier started talking about integration in minutes!!

The data integration industry is large, complex, established and dominated by large software vendors such as IBM, Informatica and Microsoft, and for the most part this is absolutely fine! This scale provides the stability and governance that global enterprise customers need.

Small to medium businesses – those that grew up with the cloud or were able to move much of their operations to the cloud – have, however, been caught by the mismatch between the operating and revenue models of the cloud and those of the large software vendors. Integrating Salesforce with external systems became an overly costly process for these organisations, which inhibited innovation within their businesses.

A new model for integration.

Then Zapier and some other cloud-app-trigger-action services came along and started to disrupt the data integration space. Some might say that this happened long before Zapier, as vendors began to offer cloud versions of their existing offerings, e.g. Informatica Cloud, Scribe Online, Cast Iron Cloud, etc, but I think that these services, although a huge step forward, were merely re-engineering the delivery mechanism without any huge change to the value proposition. Zapier brought affordability, simplicity and speed of implementation to data and process integration, to the extent that I no longer consider integration via Zapier as an implementation! It’s just another task!

This affordability and simplicity is borne out by the fact that I have personally been a fully signed-up, paying customer of Zapier for the past two years. I am not a business, or a Zapier implementation professional, just a person with an interest and the desire to play around. I became a paying customer because I could, and because the first pricing tier seems to operate at a consumer price point. The cost is $15 a month and it gives me up to 20 integrations that run every 15 minutes and connect to in excess of 500 endpoints. This makes Zapier a similar investment to Spotify, Amazon Prime or my Office 365 subscription: a useful, occasional, consumer-priced service for an IT professional.

A multi-action workflow architecture

Zapier have made the service massively more powerful and extended the scope of my 20 integrations. They added multi-step Zaps (multi-action workflows) for free! The original service was essentially point-to-point, a trigger-action flow, but now we can add multiple actions to a single trigger, orchestrated from the configured Zap.

The below workflow took me around 45 minutes to set up, including the time needed to create trial accounts on Cognito Forms and Xero.

The Requirement:

When a customer registers for my “Awesome Event”, we need to create the customer in Xero for invoicing purposes, and then create the customer in Salesforce in a Business Account/Contact model and append both an Opportunity and an Order to the Account with appropriate statuses.

The Workflow:

When the form is submitted through Cognito Forms, the workflow is kicked off in real time: Cognito pushes into Zapier with a webhook.

Within Xero we create a new contact with the data from the form, and we attach to this an invoice in a pending state for finance to approve.

Within Salesforce we then take the same data from the form and first search for a matching Account and Contact pair; if we don’t find them, we create them. We add an Order and an Opportunity to the Account to provide visibility of both the sales process and the service delivery process, and we set the order number in Salesforce to a concatenation of the Cognito Forms ID and the Xero invoice number so that everything is cross-referenceable.

Zapier blog

Disclaimer

I am not affiliated with or sponsored by Zapier in any way, I honestly just feel this is an awesome service. Of course if anyone from Zapier is reading this… one of those zappy free t-shirts would be lovely!


Salesforce Sales Engineers Love “Let’s Talk About The Weather!”


In my role at Westbrook we take great care to deliver innovative solutions to our customers, aiming for a best-practice, low-impact delivery model. One way to deliver on this is to leverage pre-configured packages from the AppExchange, reducing the need to implement complex customisations. We work closely with a number of ISV partners to deliver this to our customers efficiently. Alongside this we also like to add to the ecosystem in a small way whenever we see a need and a unique requirement.

We have a range of “Let’s Talk About…” apps on the AppExchange. The most popular of these has been a simple but very effective relationship tool called “Let’s Talk About The Weather!”, which plays to a typical sales-focused requirement: “I need ways to generate conversation and an angle during sales calls”… (You can discover more about it on the AppExchange.)

(Let’s Talk About The Weather install chart)

Let’s Talk About The Weather has now been installed over 350 times since its release, 140 of those in the last 6 months. We have noticed an increasing trend of installs over the past 6 months, in particular from Salesforce Sales Engineers (SEs). SEs play a critical role in selling into new opportunities: they propose early solution architectures and map use-cases to customer requirements, whilst also delivering dynamic and functional demonstrations during customer workshops.

Customers love Let’s Talk About The Weather, because it gives them an opening and helps them connect with their potential customers.

Sales Engineers love Let’s Talk About The Weather, because it brings an extra dimension and use case to sales demonstrations.

Get yours today! Get It Now


Apple Passes The Mum Test Again With ApplePay!


I keep coming back to the same thought, over and over again, year on year.

Why can only Apple design really usable technology products and services?

I won’t list them all here, I just don’t have the time, but with every new technology product that I adopt, when I need to make that decision between competing vendors, I like to do the Mum test… Consider the competing products/services/gizmos, for example the iPhone versus an Android phone, and ask yourself: “Which one would I give to my Mum?”

The answer is always the same: the Apple! Simply because they reliably create simple, thought-through, intuitive, honest, user-orientated solutions. The definition of good design.

A good design is always the simplest possible working solution.

I was reminded today of this thought when I came to set up my iPhone to make ApplePay payments. (I won’t go into detail of how to do this; it’s not relevant and you don’t need a tutorial – it’s intuitive.) It all just worked, the experience was simple and I just knew how to use it. The ApplePay Passbook interface looked just like my old leather wallet; the image below shows that, once again, Apple have the simplest possible working solution.

(ApplePay screenshot)

With NFC payments Google were first, as often they are, but it took Apple to make it work for humans and Mums… again!


OData and Lightning Connect. Real-time ERP Data in Salesforce!

Firstly, my genuine apologies are due to many past clients! It seems that all of those ERP integrations I have implemented over the years – nightly batch jobs, middleware synchronisations and roll-yer-own data syncs – were all sub-optimal solutions! I’m sorry!

Salesforce aligned with OData (the Open Data Protocol) about a year ago with its introduction of External Data Sources and External Objects. Before this, building large-scale ERP data synchronizations was the norm. More than that… it was an advanced integration, a task that I always welcomed; it often required the selection of some form of enterprise middleware platform and much detailed data-flow design. The final solutions delivered incredible alignment between Salesforce and back-office systems, allowing businesses to finally begin to align sales and operations and close the data divide that was present in most organizations between front- and back-office platforms. But it was nowhere near real time – the data was old as soon as it was loaded – and in today’s faster moving world the demands are greater!

At Westbrook I have recently been working with some key clients that had previously implemented sub-optimal processes, bringing rather more data into the Salesforce environment than is ideal. These back-office integrations have been replaced with standards-based OData external data access solutions.

What is OData and Salesforce Lightning Connect?

OData is an open protocol that describes the schema of a data source or database; it can be implemented over many different database types and takes a RESTful approach. From the perspective of our use, it allows access to multiple external data sources through a reliable, standards-based interface. It represents an interface onto our external data: somewhere behind the OData service is an actual data source or database.
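To make that concrete, here is roughly what an OData 2.0 request looks like against the DataDirect URI used later in this post. The entity and field names are illustrative placeholders; the $metadata, $filter and $top query options are part of the OData convention:

    https://service.datadirectcloud.com/api/odata/RdsPostgrSql/$metadata
    https://service.datadirectcloud.com/api/odata/RdsPostgrSql/Transactions?$filter=Customer_ID eq '10042'&$top=10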

Lightning Connect is the term that describes the overall setup that includes External Data Sources and External Objects.

External Data Sources are the configuration element in Salesforce that defines the connection to an OData service; here we define any connection attributes and the endpoint of the OData interface onto the data.

External Objects are the Salesforce custom objects that are created automatically from the OData schema document. You can add additional custom fields to these if needed. External objects have the naming convention Object_Name__x, and although they do not store any data in Salesforce, they look and feel like custom objects that do.
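Once synced, an external object can be queried just like any other object. A minimal sketch, run as anonymous Apex, assuming a hypothetical Transactions__x external object with Customer_ID__c and Amount__c columns coming from the OData schema (ExternalId and DisplayUrl are standard fields on external objects):

    // Query an external object exactly as you would a custom object.
    List<Transactions__x> recent = [
        SELECT ExternalId, DisplayUrl, Customer_ID__c, Amount__c
        FROM Transactions__x
        LIMIT 10
    ];
    for (Transactions__x t : recent) {
        // Each row is fetched from the OData service on demand - nothing is stored in Salesforce.
        System.debug(t.Customer_ID__c + ' : ' + t.Amount__c);
    }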

Scenario

A large industrial company wishes to display sales order history from its ERP in Salesforce, related to its active accounts; this provides valuable information for the sales and account management teams.

Systems Architecture

(Systems architecture diagram)

Approach

Backend datastore: for this we will use a traditional relational database, a cloud-based PostgreSQL instance hosted on Amazon RDS.

OData interface: Salesforce is not the OData service provider and OData is not baked into most databases, so we need an interface; this can be something you host on your own infrastructure. I am aware of an open-source PHP OData connector, and I’d imagine others are available. For cloud simplicity, here I am using the DataDirect Cloud service from Progress; it’s a paid service, but it takes away the cost of building, hosting and maintaining your own connector. They offer a 30-day free trial.

Data: within the SQL database we have around 1 million transactions and the full product reference of around 1,000 products; the transactions all contain a customer account number identifier that we will use to map to the Salesforce account record.

Implementation

Prerequisite: you already have access to a backend SQL data source. If you are just prototyping, spin up an Amazon RDS PostgreSQL database; note that the MySQL community edition is not supported for OData through Progress, and I had trouble setting up an MSSQL instance (but that might be me… or MS!).

Set up DataDirect: sign up at progress.com/datadirect-cloud to get started. Enter the typical connection information to create your data source. Then, on the OData tab, we use the built-in tool to generate the schema JSON. Save, and we are provided with the OData service access URI, similar to this: https://service.datadirectcloud.com/api/odata/RdsPostgrSql. This service is authenticated with DataDirect Cloud credentials.


Set up the Salesforce External Data Source: under Setup -> Develop -> External Data Sources we create a new data source, specifying a type of Lightning Connect: OData 2.0, a service URL that is the DataDirect access URI we got earlier, and Authentication details of Named Principal with Password authentication, leaving all other settings as defaults (the format provided by DataDirect is AtomPub).

Generate and sync the External Objects: pressing Validate and Sync validates the connection and lists the tables available from the OData schema, which you can then select to sync as External Objects. Note that the “sync” here refers to synchronizing the data schema and not the actual data – of course, no data is stored in Salesforce.

Set up user access: just as for a custom object, provide object- and field-level access to the users you want through profiles or permission sets. Additionally, create a custom tab so you can quickly access the object and use list views. My Transactions external object exposes around 1.2 million transactions, and not a single row counts against your Salesforce data storage quota!

What makes it really useful is that we can apply some additional configuration to these external objects: we can add new fields or configure existing fields to model relationships.

In my example above I have changed the Product_Code field to be an “External Lookup” field, which is like a regular lookup field but pointing to another external object; this means we can drill to the product detail from the transaction level and see a transaction related list on the Product page.

Also, I have changed the Customer_ID field to be an “Indirect Lookup” field. This is a really exciting new type (which I am waiting to see included as a standard field type on regular objects): it performs a dynamic lookup to another object, but using an external ID field rather than the standard Salesforce ID field – kind of like an old-fashioned primary-key/foreign-key relationship in a relational database! This is important because most back-office systems are not Salesforce aware, so the data will not have embedded Salesforce IDs. The dynamic part of this field is that it will look up to the related object if it finds a matching external ID, but if there is no match the data will still be available; this wouldn’t be possible with a Salesforce ID relationship. It also gives a reverse lookup capability: for example, all transaction data may already be available through the integration while the parent account is not yet in Salesforce. When the account manager eventually creates the account record and enters the Customer ID, all historic transactions for that account will instantly be related. Nice!
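A rough sketch of that reverse-lookup idea in anonymous Apex – the object and field names are hypothetical, assuming a custom Customer_ID__c external ID field on Account matching the Customer_ID__c column on the Transactions__x external object:

    // The back-office data carries only the customer number, never a Salesforce Id.
    Account acc = [SELECT Id, Customer_ID__c FROM Account WHERE Name = 'Acme Industrial' LIMIT 1];

    // As soon as the account record holds the matching customer number,
    // every historic transaction row is effectively related to it.
    List<Transactions__x> history = [
        SELECT ExternalId, Amount__c
        FROM Transactions__x
        WHERE Customer_ID__c = :acc.Customer_ID__c
    ];
    System.debug(history.size() + ' historic transactions for ' + acc.Customer_ID__c);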

Indirect lookups from Transactions to Account.

Video setup and demo

 

 

Some additional limits and considerations need to be understood around this feature, of course…


A New Era of Business Intelligence with Salesforce Analytics Cloud


The old order of traditional business intelligence and analytics is changing: Salesforce brings the cloud, consumer-web influence and fast, user-focused data visualisations that empower the question-answer-question analysis flow.

Analytics Cloud Pioneers

Recently I was lucky to receive 4 days of intensive training, review and product demonstration with the UK Analytics Cloud Pioneers group, led by Anil Dindigal, Director of Cloud Analytics, Salesforce. The Analytics Cloud Pioneers is a group of key specialists from the Salesforce and Business Intelligence partner communities who have been given early access to the Analytics Cloud to enable greater insight, product feedback and innovation.

During these sessions the group received a full review of the current capabilities of the platform, along with insights into upcoming features and the overall direction of Analytics Cloud. These were long days with a large amount of detailed and complex knowledge transfer and lots of great hands-on activities, distilled down here to some key takeaways.

  • The Analytics Cloud solution comprises both a BI platform (the technology that structures and stores the data and provides the analytics engine) and an Analytics App (the visualization GUI that communicates data to executives).
    The App and the Platform can be used together or leveraged separately via APIs.
  • The App UI is fast, really fast and beautiful. The platform churns over and delivers data at speeds that are beyond any traditional BI expectation.
  • Analytics Cloud takes a copy of all data for analysis, and supports data from multiple sources: Salesforce data and any other data you choose to throw at it – SAP, ERP, web logs, etc.
    Salesforce data can be scheduled for load through Analytics Cloud.
    External data needs some other mechanism to load it; usually this is an ETL tool, but manual loads and custom processes via the API are feasible too.
  • The Analytics Cloud architecture is not based on a traditional OLTP relational structure. All data is stored in JSON data structures, as either column-less stores or inverted index structures.
  • Many of the rules around data have changed: we like to flatten and denormalise, we like to duplicate and transform data, and we like to store data at the lowest grain level – the detail row level. “Within the Analytics Cloud we should treat data like putty.”
  • All data values are divided into two main types, dimensions and measures.
    Measures – numerical values that will be aggregated to generate reportable metrics or KPIs.
    Dimensions – all other data that represents information, most commonly used to group or filter within reports.
  • The Analytics Cloud platform will handle large volumes of data – up to 500 million rows per instance – and reports over millions of rows run with a fast user experience.
  • The query engine behind Analytics Cloud (branded SAQL) is built on technology from Apache Pig, part of the Hadoop project, and is based on Pig’s query language, a powerful language designed for analysing very large data sets.
    This means we can perform all sorts of complex operations that we have become used to not being able to do in Salesforce reports and dashboards, such as dynamic joins, generating temporary datasets, or performing calculations on the fly.
  • At this moment of rapid platform development, implementations can be highly complex, as many of the advanced capabilities can only be achieved by customizing JSON schemas, developing queries and making changes to the underlying dashboard scripts.

A quick demonstration

The proof is in the demonstration: the things about Analytics Cloud that anyone new to it needs to experience are the user interface, the speed, and a sense of just how much data is being analysed and delivered on the fly.

There are many different use-cases for positioning Analytics Cloud, from full-on global corporate KPI reporting down to data analysts or business executives doing ad-hoc data exploration. In this brief overview I drill into a data set of 2014 house sales from the United Kingdom.

The data set contains 850,000 rows of data, and every operation queries against this full data set. Here I am exploring the data from the top-down national view and drilling in to identify the grain-level details. This use case fits well with business scenarios where we are looking to spot trends or outliers.

Here we see the creation of a “lens” – a single reporting component – and the clipping of multiple lenses to create a very simple dashboard, where all items on the dashboard are dynamically bound. All of this, from the user perspective, can be achieved in just a few minutes.

(*More powerful dashboards do require full specification and technical work to implement.)

Just to set some perspective.

I am not a market analyst and don’t really have the depth of exposure needed to fully communicate what will happen around the BI industry once Analytics Cloud gets some pace, but below are some simple observations:

  • The BI industry is quoted as being a $38 billion industry, whereas the CRM industry is quoted as being worth $36.5 billion by 2017. The new Salesforce venture into BI may surpass the scale of its current operations.
  • Salesforce are planning a full set of professional certifications (3+ different grades) to support the new platform, providing a new professional infrastructure for a new world of cloud BI.
  • Currently Salesforce is the biggest player in the CRM market with a 16% share.
  • What will the BI Industry vendors market share look like in 5 years after the Salesforce Analytics Cloud disruption?

 

 


Connected Devices – Salesforce and the Littlebits CloudBit


I spend lots of my time connecting Salesforce to other platforms and systems; the most fun way these days is with some form of API or web-service.

Connecting system-to-system to integrate at a data level is a fairly standard use case, so I was really excited when I heard about the littleBits electronics kits and in particular their cloudBit component, which turns any littleBits input or output circuit that you have built into an internet-connected device. Wow, now you can get your own gear out there on the “Internet of Things”!

In the course of my work with Westbrook, I have had a few conversations recently with clients around the concept of integrating devices with Salesforce; it’s an interesting area with a huge amount of really-quite-doable potential.

Here I am going to put together a simple example. The requirement is to display data on a connected device that relates to a customer in Salesforce: when I view a Lead in Salesforce, I want to display the weather temperature for their city on an “Old Skool” display unit on my desk!

Luckily I have already built the app that gets my customer’s weather data (Let’s Talk About The Weather, on the AppExchange), so now the task is to send this data through to my littleBits internet-connected device.

The integration architecture will be as follows:

(Integration architecture diagram)

The key steps are:

  1. Get yourself all the littleBits modules and set up the cloudBit as per the instructions.
  2. Create the callout code in a helper class as per your requirements.
  3. Call the setCloudBit method as you desire, maybe from a Visualforce page onload action as I have done, or from a trigger. Whatever you need.
  4. Done – your connected device is now on the Internet of Things.

A brief demo of my device:

The callout code is as follows:
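The original snippet isn’t reproduced here, but a minimal sketch of the helper class might look like the following. The littleBits Cloud endpoint, headers and JSON payload are written from memory and should be checked against the current littleBits developer documentation; the device ID and access token are placeholders, and the endpoint needs adding as a Remote Site Setting.

    public with sharing class CloudBitHelper {

        // Placeholders - in real code store these in a protected Custom Setting or Named Credential.
        private static final String DEVICE_ID    = 'your_cloudbit_device_id';
        private static final String ACCESS_TOKEN = 'your_littlebits_access_token';

        // Called from a Visualforce page action or a trigger handler.
        @future(callout=true)
        public static void setCloudBit(Integer percent, Integer durationMs) {
            HttpRequest req = new HttpRequest();
            // Assumed littleBits Cloud API endpoint - verify against their docs.
            req.setEndpoint('https://api-http.littlebitscloud.cc/devices/' + DEVICE_ID + '/output');
            req.setMethod('POST');
            req.setHeader('Authorization', 'Bearer ' + ACCESS_TOKEN);
            req.setHeader('Content-Type', 'application/json');
            req.setBody('{"percent": ' + percent + ', "duration_ms": ' + durationMs + '}');

            HttpResponse res = new Http().send(req);
            System.debug('cloudBit response: ' + res.getStatusCode() + ' ' + res.getBody());
        }
    }

The Visualforce onload action might then simply map the Lead’s weather temperature onto a percentage and call CloudBitHelper.setCloudBit with it.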

 

 

 


Callouts from Salesforce – Adding Soap Headers for WSSE Security


I am not an expert on the SOAP specification, and even less so when it relates to off-platform services, but sometimes we need to stretch into the unknown and less desirable areas of neighbouring technologies… I feel much happier around JSON and REST web services!

I was recently integrating with some external web services built on WCF (Windows Communication Foundation); as part of this I needed to authenticate the user to get access to the service. Often there are methods defined in the service for this purpose, or we use a standard HTTP or OAuth approach.

However, with WCF, WSSE or WS-Security (SOAP security) is used; this passes the credentials via the SOAP header (not to be confused with the HTTP header). Unfortunately SOAP headers are not supported in the Wsdl2Apex import tool, or in any other parser tool I have tried (Fuse-IT Explorer is a great tool for parsing WSDLs). With WSSE we pass a UsernameToken element within the SOAP header that contains our user credentials.

So if you need to integrate with a WCF service and use WSSE security from Salesforce, you will need to customise the generated Apex stub class: add classes to define the structure of the header elements, and then instantiate these from the binding class.

There is lots of discussion of similar issues on the web, but none of it gave me a full solution and it took me a while to piece it all together, so here are the steps I found useful:

1 – The XML structure that you need to generate will likely look like the below
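The original markup isn’t shown here, but a standard WSSE UsernameToken header has roughly this shape (the body content is whatever your service call needs):

    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
      <soapenv:Header>
        <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
          <wsse:UsernameToken>
            <wsse:Username>my.user@example.com</wsse:Username>
            <wsse:Password>myPassword</wsse:Password>
          </wsse:UsernameToken>
        </wsse:Security>
      </soapenv:Header>
      <soapenv:Body>
        <!-- your service request elements -->
      </soapenv:Body>
    </soapenv:Envelope>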

*Note that initially you will be missing the <Header> elements; likely you will only have a self-closing <Header/> tag.

2 – Add the Apex class types needed to define your <Security> structure.

*These go into the apex class generated from the wsdl
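As a rough sketch of those types – the class names are mine, and the *_type_info serialization arrays must follow exactly the same pattern as the element classes wsdl2apex already generated in your stub, so copy their format rather than mine:

    // Added inside the wsdl2apex-generated class: models the WSSE header elements.
    public class UsernameToken_element {
        public String Username;
        public String Password;
        // Serialization metadata - mirror the array format used elsewhere in your generated stub.
        private String[] Username_type_info = new String[]{'Username', 'http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd', null, '1', '1', 'false'};
        private String[] Password_type_info = new String[]{'Password', 'http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd', null, '1', '1', 'false'};
        private String[] apex_schema_type_info = new String[]{'http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd', 'true', 'false'};
        private String[] field_order_type_info = new String[]{'Username', 'Password'};
    }

    public class Security_element {
        public UsernameToken_element UsernameToken;
        private String[] UsernameToken_type_info = new String[]{'UsernameToken', 'http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd', null, '1', '1', 'false'};
        private String[] apex_schema_type_info = new String[]{'http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd', 'true', 'false'};
        private String[] field_order_type_info = new String[]{'UsernameToken'};
    }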

3 – Add instantiation for the header into the binding class.

*Adding the _hns variable defines the namespace for the header.
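In the binding (port) class of the same generated stub it then looks something like this – the header variable must be public, and the _hns string pairs the header element name with the WSSE namespace (class names are from the sketch above):

    // Added inside the generated binding/port class.
    public Security_element Security;
    public String Security_hns = 'Security=http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd';

    // Populate the header before calling any web service method on the stub, e.g.:
    //   stub.Security = new Security_element();
    //   stub.Security.UsernameToken = new UsernameToken_element();
    //   stub.Security.UsernameToken.Username = 'my.user@example.com';
    //   stub.Security.UsernameToken.Password = 'myPassword';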

 

If you are trying the same sort of thing out of Salesforce, I hope this helps. I used the following articles in getting to my working solution, so here is some further reading if needed:

  • http://himanshooseth1.blogspot.co.uk/2014/07/how-to-add-soap-header-to-salesforce.html
  • https://developer.salesforce.com/forums?id=906F00000008yTdIAI
  • http://salesforce.stackexchange.com/questions/7587/forming-soap-header-through-apex-class-if-the-wsdl-provided-has-no-information-r
  • http://stackoverflow.com/questions/4392616/what-are-the-parameters-for-the-salesforce-webservicecallout-invoke-method

 


Pushing Salesforce webservice call-outs to the max

Most people who work in or around the Salesforce platform are aware of the concept of “Governor Limits”. We often speak of these limits disparagingly, as something to be ashamed of or a dirty secret that those within a traditional IT infrastructure environment may use against us at some point in the future. However, there are two sides to all things.

For some time now my view of these limits has been the opposite. Salesforce Limits are Powerful.

I prefer to see each limit as a capability; each capability generally refers to the amount of stuff you can do in a single execution context, be that a button press, a trigger on save, or a scheduled batch process. So, for example, every single time a user edits, creates or saves a record, you can take advantage of these Salesforce limits.

I work a lot with web-service integration: making call-outs to external systems to get or send data. A great example of this is the future method and call-out limits. These have just been increased in Winter ’15, giving us even more power! They are as follows:

  • Maximum number of future methods per execution = 50
  • Maximum number of web-service call-outs per execution = 100

Now consider that we often choose, or need, to make call-outs asynchronously through the use of future methods. Because each future method runs in its own execution context with its own call-out limit, every single time an event occurs we can now make up to 50 × 100 = 5,000 call-outs to external systems to retrieve or send data. Every time a user clicks save we can send a message to 5,000 other systems! That’s powerful! Too powerful to believe, so I had to test it!
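Here is a minimal sketch of the sort of code used to exercise it – the endpoint is a placeholder that would need a Remote Site Setting, and real code would want error handling (in practice the cumulative callout-time limit per transaction also comes into play):

    public with sharing class CalloutStressTest {

        // One save event -> up to 50 future methods -> up to 100 callouts each = 5,000 callouts.
        public static void fanOut() {
            for (Integer i = 0; i < 50; i++) {       // future method limit per execution context
                makeCallouts();
            }
        }

        @future(callout=true)
        public static void makeCallouts() {
            Http http = new Http();
            for (Integer i = 0; i < 100; i++) {      // callout limit per execution context
                HttpRequest req = new HttpRequest();
                req.setEndpoint('https://example.com/receiver');   // placeholder endpoint
                req.setMethod('POST');
                req.setBody('{"ping": ' + i + '}');
                HttpResponse res = http.send(req);
                System.debug('Callout ' + i + ' returned ' + res.getStatusCode());
            }
        }
    }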

* Governor limits are runtime limits enforced by the Apex runtime engine to ensure that code does not misbehave in a multitenant infrastructure; shared resources across all customers, partners, and developers.

See the full definition: https://developer.salesforce.com/page/Governors_in_Apex_Code

 


LinkedIn Integration – Connect to new prospects while you sleep!


 

We all know how fundamental LinkedIn has become to corporate networking. This applies to all areas of our professional lives, and it is particularly powerful in getting our sales teams closer to their new prospects, leads and accounts.

LinkedIn has also become another administrative overhead, another task on the long list of things that a sales rep must do in, around and instead of selling. A number of recent reports identify that sales reps spend 59% of their time *not* selling! (Accenture, Sales Performance)

So at Westbrook we have implemented an automated LinkedIn invitation process that has proved successful: it allows the team to automatically send out LinkedIn invites to appropriate prospects and connect whilst they sleep! I have just returned from my summer vacation, and during that time I received daily alerts from LinkedIn telling me that a new prospect had accepted my LinkedIn request.

The requirement

This specific requirement was around new leads generated from AppExchange installations, but it could just as easily be from any lead generation source or account assignment process.

“When a new Lead is created with Source “AppExchange” send out a LinkedIn invitation to the prospect client that installed the app.”

or

“When a Lead is assigned to a sales director, send out a LinkedIn invitation from the sales director to the prospect.”

Zapier

We use Zapier to deliver on this requirement. Zapier provide a cloud trigger service that acts as the glue between around 300 cloud services or apps (e.g. Google Docs, Twitter, Basecamp, GoToMeeting, etc.). These apps can be glued together with Zaps to make them communicate. Many Zaps can be built for free; however, the more commercial apps such as Salesforce or GoToMeeting are classed as “Premium” and need a payment plan. The plans start at $15, which is an insignificant amount for any commercial use.

* I do love Zapier – but it troubles me a little, as it makes it just so easy to set up “new world” integrations: app-to-app or cloud-to-cloud flows. Luckily they haven’t disrupted the ERP or legacy IT system integration world yet!

The integration architecture


(Zapier integration flow diagram)

 

The implementation steps

  1. Get a “Lead-in” process sorted out.
  2. Set-up Lead assignment rules.
  3. Fork out for a Zapier account, it starts @ $15 per month, you can afford it!
  4. Create a new Zapier Zap from Salesforce to LinkedIn for each LinkedIn user you want to generate invites for.
  5. Trigger the zap on a new lead, filtering by owner.
  6. Also filter the leads to those with first name, last name and email. LinkedIn requires you to “know” the person in this way to send invites. If your lead-in process doesn’t get you these details, then sort that out!
  7. Setup the send invite text to send from LinkedIn… “Hi John, you are amazing…”
  8. Test and activate the Zap.
  9. Get the team out selling, one more admin task has been removed!

The zap in the Zapier builder

 


Authenticating to Yahoo public APIs with OAuth

I love public and open API data sources; they are what makes the web connected and programmable.

Sometimes I find an API that is completely open, which makes my life really easy, but often, unfortunately, we need to jump through some hoops to get to the free data. Recently I have been integrating with the Yahoo Finance web services (I am not covering the service or data detail here as there were a few hoops there too, but this is a great resource for Yahoo Finance), and for development purposes there is no need to authenticate at all; we can make free unauthenticated calls.

However for production purposes, the limits are as follows:

  • Unauthenticated: up to 1,000 calls/day
  • Authenticated: up to 100,000 calls/day

So if you need to make more than 1,000 calls, you need to authenticate with Yahoo’s OAuth process. Digging into the docs, we see that we can use “two-legged” OAuth authentication as opposed to “three-legged”. Essentially this means that we can use OAuth flows for system integration without going through the extra user authentication steps: we just need to supply our credentials, receive the oauth_token and then continue to make our requests.

Hoop 1 : Get a Yahoo API Key

  • You first need a Yahoo account; these days that’s not so common amongst tech types, as we all seem to prefer Gmail.
  • Create a “Project” here to represent your integration and generate an API key. Nothing is critical on this page, but select the free options.
  • You now have the Consumer Secret & Consumer Key.
  • Important: there seems to be a bug – the keys don’t work unless you check at least one of the “select APIs for private user access” options at the bottom of the page, even though we don’t intend to use or authenticate against user data. So go and tick some of these. Note that your Consumer Key and Secret will change every time you modify this page, so store the consumer data only once you are done.

Hoop 2: Make a callout to get the OAuth request token

We make an HttpRequest as below, with explanations of the parameters where they are not obvious.
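As the original snippet isn’t shown here, this is a rough anonymous-Apex sketch of the two-legged request. The Yahoo endpoint and the PLAINTEXT-signature convention are written from memory of the docs, so verify them; the consumer key and secret are placeholders.

    // Two-legged OAuth: request a token using only the consumer key and secret.
    String consumerKey    = 'YOUR_CONSUMER_KEY';     // from your Yahoo project
    String consumerSecret = 'YOUR_CONSUMER_SECRET';

    String url = 'https://api.login.yahoo.com/oauth/v2/get_request_token'
        + '?oauth_consumer_key=' + EncodingUtil.urlEncode(consumerKey, 'UTF-8')
        + '&oauth_nonce=' + String.valueOf(Crypto.getRandomInteger())
        + '&oauth_timestamp=' + String.valueOf(DateTime.now().getTime() / 1000)
        + '&oauth_signature_method=PLAINTEXT'
        // With PLAINTEXT the signature is simply the consumer secret followed by '&'.
        + '&oauth_signature=' + EncodingUtil.urlEncode(consumerSecret + '&', 'UTF-8')
        + '&oauth_version=1.0';

    HttpRequest req = new HttpRequest();
    req.setEndpoint(url);
    req.setMethod('GET');
    HttpResponse res = new Http().send(req);
    // The response body is form-encoded: oauth_token=...&oauth_token_secret=...&...
    System.debug(res.getBody());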

Hoop 3: Extract the OAuth response and build into the API call that you need.

In the OAuth call above we pull out and store the OAUTH_TOKEN and OAUTH_TOKEN_SECRET. We then need to send these as parameters on the final Yahoo Finance calls, so any future Yahoo Finance call needs to have the OAuth credentials appended; I create a utility class to add these to the callout URL for reuse.
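A small utility along these lines (the class and method names are mine, not from the original post) keeps the token parsing and URL decoration in one place:

    public with sharing class YahooOAuthUtil {

        // Parse the form-encoded token response: oauth_token=...&oauth_token_secret=...
        public static Map<String, String> parseTokenResponse(String body) {
            Map<String, String> values = new Map<String, String>();
            for (String pair : body.split('&')) {
                List<String> kv = pair.split('=', 2);
                if (kv.size() == 2) {
                    values.put(kv[0], EncodingUtil.urlDecode(kv[1], 'UTF-8'));
                }
            }
            return values;
        }

        // Append the stored OAuth credentials to any subsequent Yahoo Finance call.
        public static String withOAuth(String url, String token, String tokenSecret,
                                       String consumerKey, String consumerSecret) {
            String sep = url.contains('?') ? '&' : '?';
            return url + sep
                + 'oauth_consumer_key=' + EncodingUtil.urlEncode(consumerKey, 'UTF-8')
                + '&oauth_token=' + EncodingUtil.urlEncode(token, 'UTF-8')
                + '&oauth_nonce=' + String.valueOf(Crypto.getRandomInteger())
                + '&oauth_timestamp=' + String.valueOf(DateTime.now().getTime() / 1000)
                + '&oauth_signature_method=PLAINTEXT'
                // PLAINTEXT signature once a token is held: consumerSecret&tokenSecret.
                + '&oauth_signature=' + EncodingUtil.urlEncode(consumerSecret + '&' + tokenSecret, 'UTF-8')
                + '&oauth_version=1.0';
        }
    }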

Runscope is a great tool for testing any callout process, allowing fast correction and familiarization with services, and it saves a lot of debugging time over going straight into Apex.

An example yahoo oauth callout dummy with parameter stubs: https://www.runscope.com/public/c8c0fac8-97d3-4526-b49e-eff8976ed502/112e55b3-a92c-4788-b077-5cddc226fd30

An example yahoo callout dummy with parameter stubs: https://www.runscope.com/public/c8c0fac8-97d3-4526-b49e-eff8976ed502/bf3869f4-6f49-498f-9901-23d85667d500

 

 
