MuleSoft: CloudHub vCore Usage Optimization

MuleSoft CONNECT 2019 wrapped up last month in North America. CONNECT is one of the premier conferences for API-led connectivity and digital transformation, and this year it brought even more content for developers, architects, and business executives across different business domains. At MuleSoft CONNECT, a plethora of market experts and business executives, including CEOs and CTOs from across the industry, discussed their MuleSoft experience and the democratization of innovation.

During the conference, I got an opportunity to talk with some business executives about their MuleSoft experiences and challenges.

One of their biggest challenges is optimizing MuleSoft vCore usage in CloudHub to keep projects within budget.

Here are a few steps you can take in your MuleSoft applications to keep vCore usage low and your project within budget.

1. API Optimization — As per MuleSoft best practices, MuleSoft suggests API-led connectivity to expose data to applications within or outside of your organization through reusable and purposeful APIs.

The APIs used in an API-led approach to connectivity fall into three categories:

  • Experience APIs
  • Process APIs 
  • System APIs

When you are working with API-led connectivity, do you really need all three layers of APIs every time?

No. It is not necessary to implement all three layers of APIs every time.

API Layers

Here are some API-layer use cases that save vCore usage and optimize API-led connectivity.

  • Experience APIs — Experience APIs are similar to Process APIs, but unlike Process APIs they are tied to a unique business context, and they project data formats, interaction timings, or protocols into a specific channel and context. These APIs shape your front-end data for different GUIs. For example, if you are building both a desktop website and a mobile website, you display data based on the user experience, so you need different APIs to serve that data. But if your application only needs the data itself, irrespective of the user experience, you can skip the Experience APIs and let the application work directly with Process APIs or System APIs. This will save some vCores and keep the project within budget.
  • Process APIs — If you are working with complex business logic that spans different departments of your organization, you can incorporate all of that business-specific data in the process layer and expose it through Process APIs. But if your APIs do not incorporate any complex business logic and most of the data is processed through System APIs, you can skip the Process APIs and expose your data directly through System APIs, as in the sketch below. In this way you can save some vCores and keep your project within budget.
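As a rough illustration, here is a minimal sketch of a System API that serves data directly over HTTP, with no Experience or Process layer in front of it. The flow name, the /customers path, the table name, and the configuration references are assumptions for the example, and the namespace declarations of a full Mule configuration file are omitted.

    <!-- Minimal sketch: a System API exposing customer data directly.
         Flow name, path, table, and config-ref values are illustrative assumptions;
         namespace declarations are omitted. -->
    <flow name="customers-system-api-flow">
      <!-- Callers hit the System API directly; no Experience/Process layer apps to host -->
      <http:listener config-ref="HTTP_Listener_config" path="/customers"/>

      <!-- Read straight from the system of record -->
      <db:select config-ref="Database_Config">
        <db:sql>SELECT id, name, email FROM customers</db:sql>
      </db:select>

      <!-- Return the records as plain JSON; no channel-specific shaping -->
      <ee:transform>
        <ee:message>
          <ee:set-payload><![CDATA[%dw 2.0
    output application/json
    ---
    payload]]></ee:set-payload>
        </ee:message>
      </ee:transform>
    </flow>

Every layer you can drop is one fewer application to host on CloudHub, which is where the vCore saving comes from.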

2. Salesforce Platform Events Integration — Salesforce integration with MuleSoft is one of the most common integration use cases. In the old days, Salesforce synced its data through polling: a poll ran a couple of times a day and synced data between different Salesforce orgs. Because this was a polling process, it was not easy to predict the volume of data flowing through the MuleSoft application during a certain period of time, so we had to go with more MuleSoft vCores to avoid running out of memory.

Salesforce introduced “Platform Events,” the Salesforce enterprise messaging platform, in June 2017. Since the introduction of Platform Events, integration between MuleSoft and Salesforce has become much easier. The Platform Events messaging service is event-based: any create or update of an object within Salesforce generates an event and sends a payload to the Salesforce messaging queue. The MuleSoft Salesforce connector reads these payloads from the queue in FIFO order to sync the data. Since the integration is event-driven, MuleSoft processes each Platform Event message as soon as it arrives, so there is never a large set of data waiting to be processed at once. With this style of integration you can go with fewer vCores and execute the project within budget.
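As a hedged sketch, a Platform Event subscription in a Mule 4 flow can look roughly like the following. The event name Order_Event__e and the configuration names are assumptions, and the exact listener operation and attribute names can vary between Salesforce connector versions, so check the connector documentation for the version you use.

    <!-- Hedged sketch: subscribing to a Platform Event channel with the Mule 4
         Salesforce connector. Event name, config refs, and the exact operation and
         attribute names are assumptions that may differ by connector version. -->
    <flow name="order-platform-event-listener-flow">
      <!-- Event-driven source: fires once per published platform event -->
      <salesforce:subscribe-channel-listener config-ref="Salesforce_Config"
                                             streamingChannel="/event/Order_Event__e"/>

      <!-- Each event carries one small payload, so a small worker is enough -->
      <ee:transform>
        <ee:message>
          <ee:set-payload><![CDATA[%dw 2.0
    output application/json
    ---
    payload]]></ee:set-payload>
        </ee:message>
      </ee:transform>

      <logger level="INFO" message="#[payload]"/>
    </flow>

Because each event is handled individually as it arrives, the worker never has to hold a full day's data in memory the way a polling sync does.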

3. Batch Process Optimization — MuleSoft allows you to process messages in batch. The Mule batch process provides a construct for asynchronously processing larger-than-memory data sets that can be split into individual records. MuleSoft batch jobs are typically used for extracting, transforming, and loading (ETL) information into a target system such as Hadoop.
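For reference, a minimal Mule 4 batch job triggered once a day might look like the sketch below. The scheduler frequency, source query, and step contents are assumptions for illustration, and namespace declarations are omitted.

    <!-- Minimal sketch of a daily ETL-style Mule 4 batch job. Frequency, query,
         and step contents are illustrative assumptions; namespaces omitted. -->
    <flow name="nightly-etl-batch-flow">
      <!-- Fires once a day; the application sits idle the rest of the time -->
      <scheduler>
        <scheduling-strategy>
          <fixed-frequency frequency="24" timeUnit="HOURS"/>
        </scheduling-strategy>
      </scheduler>

      <db:select config-ref="Database_Config">
        <db:sql>SELECT * FROM source_table</db:sql>
      </db:select>

      <batch:job jobName="nightly-etl-job">
        <batch:process-records>
          <batch:step name="transform-and-load-step">
            <!-- Transform each record before loading it into the target system -->
            <ee:transform>
              <ee:message>
                <ee:set-payload><![CDATA[%dw 2.0
    output application/json
    ---
    payload]]></ee:set-payload>
              </ee:message>
            </ee:transform>
          </batch:step>
        </batch:process-records>
        <batch:on-complete>
          <logger level="INFO" message="Nightly ETL batch complete"/>
        </batch:on-complete>
      </batch:job>
    </flow>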

MuleSoft needs a lot of memory/vCores to run large data sets through a batch process, yet these batch jobs typically run at most once or twice a day, leaving a large number of vCores idle for the rest of the day without any active usage. You can optimize vCore usage and reduce your batch-processing cost by following these two steps.

  • Reuse vCores by deploying multiple batch-process applications in turn — As you know, a batch runs at a certain time of day, once or twice. Suppose one batch application runs every midnight and another batch application runs every morning, and each takes 1 vCore. Deployed side by side, the two applications consume 2 vCores in total.

If you configure a CI/CD process such as Jenkins or CodeBuild to deploy your batch applications into CloudHub, it is very easy to manage the process so that a single vCore is reused. You can configure your CI/CD pipeline to build and deploy an application into CloudHub when you want its batch to run; once the batch is done, you undeploy that application and deploy the next batch application onto the same worker, as sketched below. In this way you keep reusing the same vCore and keep your project within budget.
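As one possible sketch, the CloudHub deployment section of the Mule Maven plugin in pom.xml can be driven from a CI/CD job (for example with mvn clean deploy -DmuleDeploy just before the batch window, followed by an undeploy afterwards). The application name, environment, worker sizing, and credential placeholders below are assumptions; check the plugin documentation for the exact settings of the version you use.

    <!-- Hedged sketch: Mule Maven plugin CloudHub deployment settings used by a
         CI/CD job to deploy a batch app just before its run and undeploy it after.
         Application name, environment, and credentials are placeholder assumptions. -->
    <plugin>
      <groupId>org.mule.tools.maven</groupId>
      <artifactId>mule-maven-plugin</artifactId>
      <version>3.8.2</version>
      <extensions>true</extensions>
      <configuration>
        <cloudHubDeployment>
          <uri>https://anypoint.mulesoft.com</uri>
          <muleVersion>4.4.0</muleVersion>
          <username>${anypoint.username}</username>
          <password>${anypoint.password}</password>
          <environment>Production</environment>
          <applicationName>midnight-batch-app</applicationName>
          <!-- One small worker: the batch apps take turns on the same capacity -->
          <workers>1</workers>
          <workerType>MICRO</workerType>
        </cloudHubDeployment>
      </configuration>
    </plugin>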

  • Deploy batch applications to an on-premises Mule server — In most use cases a batch application is simple and easy to maintain. In that case it is also easy to maintain an on-premises Mule server and deploy your batch applications there, without worrying much about vCore usage; a deployment sketch follows below.
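Assuming the same Mule Maven plugin, a standalone (on-premises) deployment section might look roughly like this; the muleHome path is a placeholder and element names may differ slightly between plugin versions.

    <!-- Hedged sketch: deploying the batch app to an on-premises (standalone) Mule
         runtime instead of CloudHub. The muleHome path is a placeholder assumption. -->
    <plugin>
      <groupId>org.mule.tools.maven</groupId>
      <artifactId>mule-maven-plugin</artifactId>
      <version>3.8.2</version>
      <extensions>true</extensions>
      <configuration>
        <standaloneDeployment>
          <!-- Local runtime that hosts the batch app; no CloudHub vCores consumed -->
          <muleHome>/opt/mule-enterprise-standalone-4.4.0</muleHome>
          <muleVersion>4.4.0</muleVersion>
        </standaloneDeployment>
      </configuration>
    </plugin>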

Link:

Salesforce Platform Events MuleSoft Integration: https://www.vanrish.com/blog/2018/10/01/mulesoft-salesforce-platform-events-integration/