Recession: Impact on Software as a Service (SaaS)

Global uncertainties continue to dominate headlines. Inflation is expected to peak at roughly 3.5% in the US and Europe by the end of 2023. To ease inflation, central banks need to dampen demand by making it more expensive for financial institutions, businesses, and households to borrow, which they do by raising interest rates. The Federal Reserve is expected to raise its benchmark rate to 4.75% – 5.0% by the end of 2023. All of these data points suggest we are heading toward a recession. The US labor market was robust last quarter, but this quarter looks less promising, and every day we hear layoff news from a different sector.

IMF inflation forecast

This inflation and layoff news is affecting the tech market. Many companies face a growth challenge: they expect to get as much as 50 percent of their revenue from new businesses and products by 2026 but are not on a path that will take them there. Current economic conditions are forcing high-growth yet unprofitable tech startups to tighten their financial belts.

Software companies are facing a few hard realities in their pursuit of growth.

Slowdown in US venture-capital-backed software startups – VCs are wary of high valuations and are demanding that companies spend less, improve profit margins, and deliver more output. Unicorn creation also slowed in Q4 2022, one of the lowest quarterly counts since the first quarter of 2020.

Depressed company valuations – Private company valuations are cooling down, and over the last four quarters public valuations have been compressing as well.

Software companies have three critical revenue streams.

  1. License / Subscription revenue – The customer pays for the right to own and use a copy of the software/hardware product, or to subscribe to and access a software platform.
  2. Support revenue – The customer pays for ongoing or premium support of the software or hardware product.
  3. Services revenue – The customer pays the software provider for specific deliverables, such as software implementation or technical training.

In the current environment all three of these revenue streams are shrinking. Companies are buying only the essential services they need to run their business. This directly reduces software revenue, which in turn drags these companies toward lower valuations.

Infrastructure Maintenance – SaaS companies provide software as a service, which means the customer does not have to purchase hardware to run the software; that cost is transferred to the SaaS provider. The provider therefore carries the continuous cost of running the software, and that cost is not going away. With inflation, this SaaS operating cost rises sharply.

API Security

APIs are a key component of digital transformation. An API is the interface to your legacy and SaaS data. The goal of an API is to facilitate the transfer and enablement of data between your systems and external users. APIs are typically reachable over public networks such as the internet so that external users can communicate with them, which exposes your data into the public domain.

Since your data is exposed to the public domain through APIs, a data breach becomes possible. APIs can break and expose sensitive personal as well as company data, and an insecure API is an easy target for attackers to gain access to your systems and network. With the rise of IoT devices, and their heavy use of APIs, APIs are now more vulnerable than ever.

According to OWASP, these are the 10 main API vulnerabilities:

  1. Broken Object Level Authorization – Endpoints that handle object identifiers create a wide attack surface for object-level access control issues (see the Java sketch after this list).
  2. Broken User Authentication – Authentication mechanisms are implemented incorrectly.
  3. Excessive Data Exposure – Developers expose all object properties without considering their individual sensitivity.
  4. Lack of Resources & Rate Limiting – APIs that do not restrict the size or number of resources a client/user can request are open to Denial of Service (DoS) attacks.
  5. Broken Function Level Authorization – Complex access control policies with different hierarchies lead to authorization flaws.
  6. Mass Assignment – Binding client-supplied input to data models without filtering properties against an allowlist.
  7. Security Misconfiguration – Insecure, missing, or misconfigured security settings.
  8. SQL Injection – Untrusted data is sent to an interpreter as part of a command or query.
  9. Improper Assets Management – APIs tend to expose more endpoints than traditional web applications, which leads to improperly exposed or undocumented APIs.
  10. Insufficient Logging & Monitoring – Without adequate logging and monitoring, you fail to detect vulnerabilities and broken integrations.
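
To make the first item concrete, below is a minimal Java sketch (hypothetical class and data, not tied to any specific framework) of an object-level authorization check: the service verifies that the authenticated caller owns the record referenced by the client-supplied identifier before returning it.

import java.util.Map;

// Sketch: object-level authorization (OWASP API1). Never trust a client-supplied
// identifier alone; always check that the caller owns the referenced object.
public class OrderService {

    // Stand-in for a data store: orderId -> ownerId
    private final Map<String, String> orderOwners;

    public OrderService(Map<String, String> orderOwners) {
        this.orderOwners = orderOwners;
    }

    public String getOrderOwner(String authenticatedUserId, String orderId) {
        String ownerId = orderOwners.get(orderId);
        if (ownerId == null) {
            throw new IllegalArgumentException("Order not found: " + orderId);
        }
        if (!ownerId.equals(authenticatedUserId)) {
            // Reject access to another user's object instead of leaking it
            throw new SecurityException("Caller does not own order " + orderId);
        }
        return ownerId;
    }

    public static void main(String[] args) {
        OrderService service = new OrderService(Map.of("o-1", "alice"));
        System.out.println(service.getOrderOwner("alice", "o-1")); // allowed
        // service.getOrderOwner("bob", "o-1") would throw SecurityException
    }
}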

How to mitigate API security risk?

  • Use secure sockets layer (SSL), transport layer security (TLS), and Hypertext Transfer Protocol Secure (HTTPS) for your APIs; these protocols provide security by encrypting data during the transfer process.
  • Apply at least Basic Auth to your API; if you want your API to be more secure, enable two-way authentication through the OAuth framework.
  • Apply authorization to each API resource, through an external identity and access management (IAM) provider, for finer control over API security.
  • Use encryption and signatures for all personal and organizational sensitive data your APIs expose.
  • Apply API throttling through an API manager to control the number of requests each user can make per API (rate limiting); a minimal sketch of this idea follows this list.
  • Implement exception-handling best practices in your APIs to hide internal server and database details and mitigate SQL injection risk.
  • Use a service mesh to manage the different layers of API management and control.
  • Audit your APIs and remove all unused APIs from your API catalog.
  • Add proper logging, monitoring, and alerting to your APIs to keep track of API activity.
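
To illustrate the throttling bullet above, here is a small, self-contained Java sketch of a fixed-window rate limiter of the kind an API gateway or API manager applies per client; the names and limits are illustrative, not a specific product's API.

import java.util.concurrent.atomic.AtomicLong;

// Sketch: fixed-window rate limiter. Each client gets `capacity` requests per
// window; requests beyond that are rejected (typically with HTTP 429).
public class RateLimiter {
    private final long capacity;
    private final long windowMillis;
    private final AtomicLong remaining;
    private volatile long windowStart;

    public RateLimiter(long capacity, long windowMillis) {
        this.capacity = capacity;
        this.windowMillis = windowMillis;
        this.remaining = new AtomicLong(capacity);
        this.windowStart = System.currentTimeMillis();
    }

    public synchronized boolean tryAcquire() {
        long now = System.currentTimeMillis();
        if (now - windowStart >= windowMillis) {
            remaining.set(capacity);   // start a new window
            windowStart = now;
        }
        if (remaining.get() > 0) {
            remaining.decrementAndGet();
            return true;               // request allowed
        }
        return false;                  // request throttled
    }

    public static void main(String[] args) {
        RateLimiter limiter = new RateLimiter(2, 1000); // 2 requests per second
        System.out.println(limiter.tryAcquire()); // true
        System.out.println(limiter.tryAcquire()); // true
        System.out.println(limiter.tryAcquire()); // false -> rate limit exceeded
    }
}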

Conclusion: APIs are a critical part of modern AI, mobile, SaaS, IoT, and web applications. API security should be a central focus of your strategies and solutions in order to mitigate these unique vulnerabilities and security risks.

MuleSoft Performance Tuning

API Performance Tuning
  1. Keep the application synchronous if possible. Synchronous flows avoid serialization/deserialization of messages sent through VM queues, do not cause context switches, and do not cause contention when messages move across thread pools.
  2. Store as little as possible in variables. Vars are serialized and deserialized every time a message crosses an endpoint, even a VM endpoint. The performance overhead is directly proportional to the size of the variables and the number of endpoints.
  3. Use Java payloads in DataWeave whenever possible. A canonical data model is recommended for projects that deal with data (mapping, transformation, etc.), and it is best to keep it as Java objects, since this is the fastest format for accessing fields, changing information, and converting to other formats.
  4. Favor DataWeave. For better performance, use DataWeave for simple data extraction from messages, and Java components combined with DataWeave for everything else.
  5. Use flow references instead of VM endpoints. To communicate between flows internally within an application, use flow references instead of VM endpoints. The VM connector, even though it is an in-memory protocol, emulates transport semantics that serialize and deserialize parts of your messages, most notably the vars. This makes it slower than a flow reference, which simply injects messages into the referenced flow with no intermediate steps. Note that in some cases the use of VM endpoints is preferred (see the chapter on reliability patterns); for example, a Mule cluster can load-balance applications that use VM endpoints by deferring execution to another available node in the cluster.
  6. Cache aggressively. Take advantage of Mule’s caching scope when making requests to external resources like web services or databases. Also consider caching reusable assets such as security tokens, ephemeral API keys, and cookies. Mule’s notification subsystem can additionally be used to “warm up” a cache when Mule starts; consider this for situations where an initial cache miss is not acceptable.
  7. Configure message processors and endpoints at the global level. Some connectors allow you to configure parameters at both the global and the endpoint/message-processor level. Place the configuration at the global level to avoid repeated initialization of resources.
  8. Avoid creating a large volume of business events. Business events incur performance overhead in Mule and in the platform when the platform’s internal event buffer overflows. Avoid default flow-level business events, and avoid a large volume of custom business events, in high-message-volume projects.
  9. Consider using message compression. For communication between Mule applications over the network, consider using Mule’s compression processors to compress/decompress message payloads before they hit the wire if the payloads are large (a plain Java sketch of the idea follows this list).
  10. Consider using VM queues instead of an external message broker. VM queues are fast and have some guaranteed-delivery semantics in a cluster. Consider using these instead of going out to an external message broker for inter-application Mule communication.
  11. Use the async scope when appropriate. If a flow performs processing that neither modifies the message nor changes how it is routed, wrap it in an async block. The processing then occurs on a different thread and avoids adding unnecessary overhead to processing the message.
  12. Use connection pooling for connectors, because the performance cost of establishing a connection to another data source, such as a database, is relatively high.
  13. Optimize logging within your Mule flows. Too much logging slows down processing, and too little makes debugging hard.
  14. Apply encryption and decryption only where your Mule application really needs it; encrypting and decrypting data is expensive.
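
The compression tip is easy to see outside of Mule as well. Here is a minimal Java sketch, using the JDK's built-in GZIP support, of compressing a large repetitive payload before it goes over the wire; Mule's compression module performs the equivalent step on message payloads.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

// Sketch: compress a payload before sending it over the network.
public class PayloadCompression {

    static byte[] gzip(String payload) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            gzip.write(payload.getBytes(StandardCharsets.UTF_8));
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        String payload = "{\"record\":\"value\"}".repeat(1000); // large, repetitive JSON
        byte[] compressed = gzip(payload);
        System.out.printf("original: %d bytes, compressed: %d bytes%n",
                payload.getBytes(StandardCharsets.UTF_8).length, compressed.length);
    }
}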

API Integration with IoT and CRM Improves Customer Service

API integration helps IoT and CRM systems enable a better customer experience.

IoT (Internet of Things) is revolutionizing our lives. Per a Gartner report, the IoT market will expand into a 58-billion-dollar opportunity by 2025. It is affecting all parts of our lives; during the pandemic we found even more uses for IoT devices to maintain social distancing.

IoT is also one of the main disruptive technologies in business, affecting every domain including healthcare, retail, automotive, and security.

IoT offers a wide range of benefits to business:

  • Enhanced productivity
  • Better customer experience
  • Cost-effectiveness

A CRM system keeps everything about your customer relationships – data, notes, metrics, and more – in one place. CRM helps small businesses take the burden off the IT management team by automating business processes, and it helps employees stay focused on critical business areas.

APIs help integrate these two otherwise unrelated systems. APIs enable them to optimize and streamline the entire business process: the API is the main communication channel for building robust processes and keeping both systems updated in real time. APIs also make it possible to build context-based applications in which IoT and CRM interact with the physical world.

Here are a few areas where IoT, with the help of APIs, helps a CRM system optimize business processes.

  1. Optimize customer service – You act on errors proactively and fix them before your customer ever notices a problem with your service or product. This helps build the customer relationship.
  2. Increase sales – With the help of the IoT and CRM systems, you find untouched opportunities and use them to increase sales.
  3. Personalize the customer experience – You analyze the data provided by the IoT and CRM systems and build user-based predictive models to deliver a personalized experience to each user.
  4. Customer retention – CRM provides customer data and relationships, while IoT data reveals customer behavior. Together they help a business personalize and target marketing for its customers.
  5. Omnichannel in-store experience – IoT and CRM help a business deliver a 360-degree omnichannel customer experience, suggesting products the customer is likely to purchase.

API integration with IoT and CRM helps businesses achieve a higher degree of personalization, targeted marketing, optimized pricing models, higher revenue, and greater customer satisfaction.

Anypoint Platform: External (OKTA) Identity Management

Anypoint Platform acts as a client provider by default, but you can also configure external client providers to authorize client applications. As an API owner, you can apply an OAuth 2.0 policy to authorize client applications that try to access your API. You need an OAuth 2.0 provider to use an OAuth 2.0 policy. You can configure more than one client provider and associate the client providers with different environments. If you configure multiple client providers after you have already created environments, you can associate the new client providers with the environment. 

MuleSoft supports client management by identity providers that implement the OpenID Connect Dynamic Client Registration open standard. MuleSoft explicitly verifies support in Anypoint Platform for Salesforce, Okta, and OpenAM v14 Dynamic Client Registration. The following table contains examples of the URLs you need to supply, depending on your provider, during registration.

URL Name              Okta Example URL                      OpenAM Example URL                    Salesforce Example URL
Base                  https://example.okta.com/oauth2/v1    https://example.com/openam/oauth2     https://example.salesforce.com/services/oauth2
Client Registration   {BASE URL}/clients                    {BASE URL}/connect/register           {BASE URL}/register
Authorize             {BASE URL}/authorize                  {BASE URL}/authorize                  {BASE URL}/authorize
Token                 {BASE URL}/token                      {BASE URL}/access_token               {BASE URL}/token
Token Introspection   {BASE URL}/introspect                 {BASE URL}/introspect                 {BASE URL}/introspect

Steps to Create External Client Provider

  • Log in to Anypoint Platform using an account that has the organization administrator role.
  • In Anypoint Platform, click Access Management.
  • In the menu on the left, click Client Providers.

  • Click Add Client Provider, and then select OpenID Connect Dynamic Client Registration.
    The Add OIDC client provider page appears.
  • After obtaining values from your identity provider’s configuration, complete the following required fields in each section:
    • Dynamic Client Registration
      • Issuer: URL that the OpenID provider asserts is its trusted issuer.
      • Client Registration URL: The URL to dynamically register client applications as a client application for your identity provider.
      • Authorization Header
        • For Okta, this value is SSWS ${api_token}, where api_token is an API token created through Okta.
        • For ForgeRock, this value is Bearer ${api_token}, where api_token is an API token created through ForgeRock.
        • For Salesforce, this value is Bearer ${api_token}, where api_token is an API token created through Salesforce.
      In Advanced Settings, you can also select:
      • Disable server certificate validation: Disables server certificate validation if your OpenID client management instance presents a self-signed certificate, or one signed by an internal certificate authority.
      • Enable client deletion in Anypoint Platform: Enables deletion of clients created with this integration.
      • Enable client deletion and updates in IdP: To use this option, you must also select the Enable client deletion in Anypoint Platform option.
    • Token Introspection Client
      • Client ID: The client ID for an existing client in your IdP capable of introspection of all tokens from all clients.
        • For Okta, this value should be a “Confidential” client.
        • For ForgeRock, this value should be a “Confidential” client.
        • For Salesforce, this value should be a “Confidential” client.
      • Client Secret: The client secret that corresponds to the client ID.
    • OpenID Connect Authorization URLs
      • Authorize URL: The URL where the user authenticates and grants OpenID Connect client applications access to the user’s identity.
      • Token URL: The URL that provides the user’s identity, encoded in a secure JSON Web Token.
      • Token Introspection URL: endpoint that returns metadata about the access token, including expiration and token active state.
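
For illustration, here is a hedged Java sketch of the dynamic client registration call that sits behind these settings: a POST to the Client Registration URL using the Okta-style SSWS authorization header described above. The endpoint, API token variable, and JSON fields are placeholders following the OpenID Connect Dynamic Client Registration (RFC 7591) style, not a verified Okta payload.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: register a client against an OIDC Dynamic Client Registration endpoint.
// URL, token, and body fields are placeholders for illustration only.
public class DynamicClientRegistration {
    public static void main(String[] args) throws Exception {
        String registrationUrl = "https://example.okta.com/oauth2/v1/clients"; // {BASE URL}/clients
        String apiToken = System.getenv("OKTA_API_TOKEN");                     // assumed environment variable

        String body = """
            {
              "client_name": "anypoint-demo-client",
              "redirect_uris": ["https://example.com/callback"],
              "grant_types": ["client_credentials"],
              "token_endpoint_auth_method": "client_secret_basic"
            }
            """;

        HttpRequest request = HttpRequest.newBuilder(URI.create(registrationUrl))
                .header("Authorization", "SSWS " + apiToken)   // Okta-style header from the steps above
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body()); // successful registration returns client_id/client_secret
    }
}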

MuleSoft 4: Using Java in DataWeave 2.0

Mule 4 introduces DataWeave 2.0 as the default expression language replacing Mule Expression Language (MEL). DataWeave 2.0 is tightly integrated with the Mule 4 runtime engine, which runs the scripts and expressions in your Mule application.

Since DataWeave 2.0 is the default expression language for Mule 4, DataWeave can be used almost anywhere within your Mule application. In some use cases, DataWeave needs to call a Java method or instantiate a Java class to execute complex Java business logic.

In my previous blog I explained the usage of Java within a MuleSoft flow. In this blog I explain the usage of Java within DataWeave 2.0.

There are two ways we can use Java within DataWeave code:

  1. Calling a Java method
  2. Instantiating a Java class

1. Calling a Java method – There is a restriction in DataWeave when calling Java: you can only call static methods via DataWeave (methods that belong to a Java class, not methods that belong to a specific instance of a class). Before calling a method, you must import its class. The DataWeave snippets below show the two ways this method can be used.

Import only the method instead of the whole class:

%dw 2.0
import java!com::vanrish::AppUtils::encode
output application/json
---
{
encode:encode("mystring" as String)
}

Import and call it in a single line:

%dw 2.0
output application/json
---
{
    encode: java!com::vanrish::AppUtils::encode("mystring" as String)
}
2. Instantiating a Java class – DataWeave allows you to instantiate a new object of any Java class, but you cannot call its instance methods through DataWeave; you can only refer to its fields (variables).

%dw 2.0
import java!com::vanrish::AppUtils
output application/json
---
{
     value: AppUtils::new().data
}

AppUtils.java

package com.vanrish;

import java.util.Base64;

/**
 * @author rajnish
 */
public class AppUtils {

	// java.util.Base64 replaces the internal sun.misc.BASE64Encoder API
	public static final Base64.Encoder encoder = Base64.getEncoder();

	// Public so the field can be read from DataWeave after AppUtils::new()
	public String data = "default data";

	/**
	 * @param plainString the plain text to encode
	 * @return the Base64-encoded string
	 */
	public static String encode(String plainString) {
		return encoder.encodeToString(plainString.getBytes());
	}

	/**
	 * @param dataStr the input value
	 * @return the stored data value
	 */
	public String getData(String dataStr) {
		data = dataStr + " : test";
		return data;
	}
}

MuleSoft 4: Using Java in Mule Flow

MuleSoft is a lightweight integration and API platform that lets you connect anything anywhere and expose your data through APIs. Mule evolved from Java and the Spring framework, and MuleSoft supports multiple languages, although all Mule modules are developed in Java.

Since Mule evolved from Java, it can use Java classes and methods directly in a Mule flow. This capability gives Mule developers the flexibility to use Java for complex business logic.

There are several ways you can use Java within Mule. The Java module provides four operations that you can use in a Mule flow:

  1. New
  2. Invoke
  3. Invoke static
  4. Validate type

To explain these operations and their use in a Mule flow, I created the Utils.java and AppUtils.java classes shown below.

1. New – AppUtils.java class instantiation is achieved by calling a constructor of the class through the MuleSoft New operation within a Mule flow.

The AppUtils Java class defines two constructors, so the Constructor property of the New operation shows two options.

New module without parameter

<java:new doc:name="Instantiate appUtils" doc:id="22ddcb7e-82ed-40f8-bc11-b779ceedd1a1"
constructor="AppUtils()" class="com.vanrish.AppUtils" target="appInst">
</java:new>

New module with parameter

<java:new doc:name="Instantiate appUtils" doc:id="22ddcb7e-82ed-40f8-bc11-b779ceedd1a1"
constructor="AppUtils(String)" class="com.vanrish.AppUtils" target="appInst">
<java:args ><![CDATA[#[{paramVal: "Hello world"}]]]></java:args>
</java:new>

In the above code, an instance of the AppUtils class is created and placed into the “appInst” target variable so that the same instance can be reused later in the Mule flow.

New module
2. Invoke – In the New operation above we instantiated the AppUtils.java class and placed the instance into the “appInst” variable. To use this instance, add the Invoke operation and call one of the methods defined in AppUtils.java. AppUtils.java defines a non-static method “generateRandomNumber” that takes a String parameter; in this example we call that method through the Invoke operation.
<java:invoke doc:name="Invoke" doc:id="9348e2cf-87fe-4ff7-958c-f430d0421702"
instance="#[vars.appInst]" class="com.vanrish.AppUtils" method="generateRandomNumber(String)">
<java:args ><![CDATA[#[{numVal: "100"}]]]></java:args>
</java:invoke>
Invoke module
3. Invoke static – The Invoke static operation enables a Mule flow to call a static Java method. This is one of the easiest ways to call a Java method in a Mule flow.

Here the Mule code calls the Java static method:

<java:invoke-static doc:name="Invoke static" doc:id="bc3e110c-d970-47ef-891e-93fb3ffb61bd" 
class="com.vanrish.AppUtils" method="encode(String)">
<java:args ><![CDATA[#[{plainString:"mystringval"}]]]></java:args>
</java:invoke-static>
Invoke-static module
4. Validate type – The Validate type operation uses Java’s instanceof check. It accepts an “Accept subtypes” parameter that indicates whether the operation should accept all subclasses of the configured class. By default acceptSubtypes="true", which means all subclasses of the main class are accepted; if it is set to acceptSubtypes="false", the operation throws an error (JAVA:WRONG_INSTANCE_CLASS) during execution when the instance is a subclass rather than the exact class.
<java:validate-type doc:name="Validate type" doc:id="288c791c-50eb-4be0-b924-56481dfdc023"
class="com.vanrish.Utils" instance="#[vars.appInst]" acceptSubtypes="false"/>
Validate-type module

Java in Mule flow diagram

java in Mule flow

Utils.java

package com.vanrish;

public class Utils{
	
}

AppUtils.java

package com.vanrish;

import java.util.Base64;
import java.util.Random;

/**
 * @author rajnish
 */
public class AppUtils extends Utils {

	// java.util.Base64 replaces the internal sun.misc.BASE64Encoder API
	public static final Base64.Encoder encoder = Base64.getEncoder();

	// Constructor without parameter
	public AppUtils() {
		System.out.println("Constructor with no parameter");
	}

	// Constructor with parameter
	public AppUtils(String paramVal) {
		System.out.println("Constructor with parameter value=" + paramVal);
	}

	/**
	 * @param numVal upper bound for the random number (may be null)
	 * @return the generated random number as a string
	 */
	public String generateRandomNumber(String numVal) {
		Random rand = new Random();
		int numNoRange = (numVal != null)
				? rand.nextInt(Integer.parseInt(numVal))
				: rand.nextInt();
		return Integer.toString(numNoRange);
	}

	/**
	 * @param plainString the plain text to encode
	 * @return the Base64-encoded string
	 */
	public static String encode(String plainString) {
		return encoder.encodeToString(plainString.getBytes());
	}
}

Mulesoft Code

<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:java="http://www.mulesoft.org/schema/mule/java" xmlns:db="http://www.mulesoft.org/schema/mule/db"
xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core" xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns="http://www.mulesoft.org/schema/mule/core"
xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd
http://www.mulesoft.org/schema/mule/hl7 http://www.mulesoft.org/schema/mule/hl7/current/mule-hl7.xsd
http://www.mulesoft.org/schema/mule/db
http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd
http://www.mulesoft.org/schema/mule/java http://www.mulesoft.org/schema/mule/java/current/mule-java.xsd">
<http:listener-config name="HTTP_Listener_config" doc:name="HTTP Listener config" doc:id="af3ce281-bf68-4c7b-83fb-52b2d2506677" >
<http:listener-connection host="0.0.0.0" port="8081" />
</http:listener-config>
<flow name="helloworldFlow" doc:id="a13826dc-67a1-4cda-8133-bc16a59ddba2" >
<http:listener doc:name="Listener" doc:id="82522c77-5c33-4003-820a-7a04b51c3001" config-ref="HTTP_Listener_config" path="helloworld"/>
<logger level="INFO" doc:name="Logger" doc:id="b40bc107-1ab5-4a3f-8f20-d7dfb63e5acb" message="entering flow"/>
<java:new doc:name="Instantiate appUtils" doc:id="c1854580-f4d4-4e5c-a34d-7ca185152d02" constructor="AppUtils(String)" class="com.vanrish.AppUtils" target="appInst" >
<java:args ><![CDATA[#[{ paramVal: "Hello world" }]]]></java:args>
</java:new>
<java:invoke doc:name="Invoke" doc:id="9348e2cf-87fe-4ff7-958c-f430d0421702" instance="#[vars.appInst]" class="com.vanrish.AppUtils" method="generateRandomNumber(String)">
<java:args ><![CDATA[#[{ numVal: null }]]]></java:args>
</java:invoke>
<java:invoke-static doc:name="Invoke static" doc:id="bc3e110c-d970-47ef-891e-93fb3ffb61bd" class="com.vanrish.AppUtils" method="encode(String)">
<java:args ><![CDATA[#[{ plainString: "mystringval" }]]]></java:args>
</java:invoke-static>
<java:validate-type doc:name="Validate type" doc:id="288c791c-50eb-4be0-b924-56481dfdc023" class="com.vanrish.Utils" instance="#[vars.appInst]"/>
<set-payload value="Success" doc:name="Set Payload" doc:id="8b6c9c0b-07c8-4e17-b649-2c24d4da8bea" />
</flow>
</mule>

MuleSoft: Cloudhub vCores Usage Optimization

MuleSoft CONNECT 2019 wrapped up last month in North America. CONNECT is one of the premier conferences for API-led connectivity and digital transformation, and this year it brought even more content for developers, architects, and business executives across different business domains. At MuleSoft CONNECT, a plethora of market experts and business executives, including industry CEOs and CTOs, discussed their MuleSoft experience and the democratization of innovation.

During these conferences, I got an opportunity to talk to some business executives about their Mulesoft experiences and challenges.

One of the biggest challenges they face is optimizing MuleSoft vCore usage in CloudHub to keep their projects within budget.

Here are a few steps you can take in a MuleSoft application to keep vCore usage low and the project within budget.

1. API Optimization — As per MuleSoft best practices, MuleSoft suggests API-led connectivity to expose data to applications within or outside of your organization through reusable and purposeful APIs.

The APIs used in an API-led approach to connectivity fall into three categories:

  • Experience APIs
  • Process APIs 
  • System APIs

When you are working on API-led connectivity, do you really need all three layers of APIs every time?

No, it is not necessary to implement all three layers of APIs every time.

API Layers

Here are some API-layer use cases that save vCore usage and optimize API-led connectivity.

  • Experience APIs — Experience APIs are similar to Process APIs, but unlike Process APIs they are tied to a specific business context, projecting data formats, interaction timings, or protocols into a specific channel and context. These APIs shape your data for different front-end GUIs. For example, a PC website and a mobile website display data differently based on the user experience, so each needs its own API to present that data. But if your application needs only the data, irrespective of user experience, you can skip the Experience API and work directly against the Process or System APIs. This saves vCores and keeps the project within budget.
  • Process APIs — If you are implementing complex business logic that spans different parts of the organization, incorporate that business-specific data in the process layer and expose it through Process APIs. But if the APIs do not incorporate any complex business logic and most of the data flows straight through the System APIs, you can skip the Process APIs and expose your data through the System APIs alone. Again, this saves vCores and keeps your project within budget.

2. Salesforce Platform Events Integration – Salesforce integration with MuleSoft is one of the most common integration use cases. In the old days, Salesforce synced data through polling: a poll ran a couple of times a day and synced data between different Salesforce orgs. Because it is a polling process, it is not easy to predict the volume of data flowing through the MuleSoft application during a given period, so teams provision higher MuleSoft vCores to avoid memory problems.

Salesforce introduced “Platform Events,” the Salesforce enterprise messaging platform, in June 2017. After the introduction of Platform Events, integrating MuleSoft and Salesforce became much easier. The Platform Events enterprise messaging service is event-based: any create or update of a tracked object within Salesforce generates an event and sends a payload to the Salesforce messaging queue. The MuleSoft Salesforce connector reads these payloads from the messaging queue in FIFO order to sync the data. Because the integration is event-based, MuleSoft processes each Platform Event message as soon as it receives it, so there is never a large set of data to process at once. With this integration you can choose a lower vCore and execute the project within budget.

3. Batch Process Optimization — MuleSoft allows you to process messages in batches. The Mule batch process provides a construct for asynchronously processing larger-than-memory data sets that can be split into individual records; it is typically used for extracting, transforming, and loading (ETL) information into a target system such as Hadoop.

MuleSoft needs a large amount of memory/vCores to run large data sets through a batch process, yet these batch jobs typically run at most once or twice a day, leaving a large number of vCores idle for the rest of the day without any active usage. You can optimize vCore usage and reduce batch processing cost by following these two steps.

  • Reuse vCores by deploying multiple batch applications onto the same capacity — Batch jobs run at certain times of day, once or twice. Suppose one batch application runs every midnight and another runs every morning, and each takes 1 vCore; deployed side by side, the two applications consume 2 vCores in total.

If you configure a CI/CD process such as Jenkins or CodeBuild to deploy your batch applications to the cloud, it is easy to reuse the vCore: configure the pipeline to build and deploy an application just before its batch window, then undeploy it when the batch is done and deploy the next batch application onto the same capacity. In this way you keep reusing the same vCore memory and keep your project within budget.

  • Deploy batch applications to an on-premises Mule server — Batch processes are simple, easy-to-maintain applications in most use cases. It is often practical to maintain an on-premises Mule server and deploy your batch applications there without worrying much about vCore usage.

Link:

Salesforce Platform Event Mulesoft Integration : https://www.vanrish.com/blog/2018/10/01/mulesoft-salesforce-platform-events-integration/

Mulesoft: FedRamp Compliance Cloud Integration for Government

For fiscal year 2019, the government estimated $45.8 billion in IT investments at major civilian agencies, to be used to acquire, develop, and implement modern technologies; 78% of this budget goes to maintaining existing IT systems. In a constantly changing IT landscape, the migration of federal on-premise technologies to the cloud is increasing every year. Federal agencies have the opportunity to save money and time by adopting innovative cloud services to meet their critical mission needs and keep up with current technology. Federal agencies are required by law to protect any federal information that is collected, maintained, processed, disseminated, or disposed of by cloud service offerings, in accordance with FedRamp requirements.

What is the Federal Risk and Authorization Management Program (FedRamp)?

FedRamp is a US government-wide program that delivers a standard approach to security assessment, authorization, and continuous monitoring for cloud products and services. The stakeholders for FedRamp are:

  1. Federal Agencies
  2. FedRamp PMO & JAB(Joint Authorization Board)
  3. Third Party Assessment Organization

FedRamp Process – There are three ways a cloud service can be proposed for FedRamp authorization:

  1. Cloud BPA — Cloud Services through FCCI BPAs
  2. Government Cloud Systems — Services must be intended for use by multiple government or government approved agencies.
  3. Agency Sponsorship — This is the most popular route for cloud service providers (CSPs) to take when working toward a FedRamp authorization. The CSP establishes a partnership with an agency, and they agree to work together toward an Authority to Operate (ATO).

MuleSoft: A FedRamp-Authorized Integration Platform

MuleSoft recently announced FedRamp authorization of Anypoint Platform. MuleSoft is one of the first integration platform companies with FedRamp authorization, enabling both on-premises and cloud integration in federal and state government. With a FedRamp-authorized Anypoint Platform, government IT teams can leverage the same core Anypoint Platform benefits in the cloud to accelerate their project delivery via reusable APIs. Anypoint Platform allows all government integration assets to be managed and monitored from a single, secure, cloud-based management console, simplifying operations and increasing IT agility.

The MuleSoft Anypoint Platform provides a FedRamp-compliant iPaaS for government organizations. Government IT integration projects deployed on Anypoint Platform within the MuleSoft Government Cloud can:

  1. Accelerate government IT project delivery by deploying sophisticated cross-cloud integration applications and creating new APIs on top of existing data sources.
  2. Improve efficiency at lower cost by allowing IT integration teams to focus on designing, deploying, and managing integrations in the cloud, and by allowing agencies to pay only for what they use.
  3. Reduce the risk of IT integration projects and increase application reliability through self-healing mechanisms that recover from problems, and through load balancing.

What is Mulesoft Government Cloud?

MuleSoft Government Cloud is a FedRamp-compliant, cloud-based deployment environment for Anypoint Platform.

  1. It is built on AWS GovCloud with FedRamp controls.
  2. Mule runtimes are configured in secure mode to support the highest encryption standards and FIPS (Federal Information Processing Standard) 140-2 hardware and software encryption compliance.
  3. It is FedRamp-compliant at the moderate impact level.
  4. It is continuously audited and monitored for security controls by third-party assessment organizations (3PAOs).

MuleSoft Government Cloud can be accessed at https://gov.anypoint.mulesoft.com/login/ . MuleSoft Government Cloud resources are available through Anypoint Exchange at https://gov.anypoint.mulesoft.com/exchange/ .

If you access the FedRamp-compliant Anypoint Platform, you are presented with an end-user agreement for consent after logging in; this is typical of FedRamp-compliant government applications.

Conclusion — For executing any government or state project that involves integration and API enablement, the FedRamp-compliant Anypoint Platform is one of the best options. It accelerates IT project delivery, improves efficiency, and reduces IT risk.

Mule 4: Consuming APIs through Mule 4 application

MuleSoft is all about API strategy and the digital transformation of your organization through APIs, whether in CloudHub or on premises. MuleSoft also provides a platform for APIs to monitor and analyze usage, control access, and protect sensitive data with security policies. APIs are at the heart of digital transformation, enabling greater speed, flexibility, and agility for any organization.

Exposing your APIs is one aspect of your digital transformation strategy, but consuming APIs is just as important. Consuming an API means an application either reads data from the API or creates/updates data through it. Most APIs are based on the HTTP/HTTPS protocol, so in Mule 4, consuming an API also starts with configuring HTTP/HTTPS.

Configuration of HTTP/HTTPS — HTTP/HTTPS configuration starts with selecting the protocol. If the API is available over HTTP, select the HTTP protocol with the default port 80, or change the port per the exposed API's documentation. If the API is available over a secured connection, select the HTTPS protocol with the default port 443. Fill in the Host field with the exposed API endpoint, without any protocol prefix, and leave the other fields at their default values.

Five authentication options are available for the API connection:

  1. None – No authentication; available to everyone.
  2. Expression – Custom or expression-based authentication.
  3. Basic authentication – Username/password authentication.
  4. Digest authentication – A scheme a web server can use to negotiate credentials, such as a username and password, with a user’s web browser.
  5. NTLM authentication – NT (New Technology) LAN Manager; Microsoft security protocols intended to provide authentication, integrity, and confidentiality to users.


If you are working with a POST/PATCH/PUT API to send data to the exposed API, set a few important parameters based on the streaming mode. If the API is exposed in streaming mode, you need to specify the content size of the stream; otherwise set the request streaming mode to “NEVER”, in which case you do not need to set the content size.

API GET Call – An API GET call implements the GET method of the API. A GET call needs parameters, and based on these parameters the application retrieves a set of data. MuleSoft provides four ways to pass these parameters or values:

  • Body
  • Headers
  • Query Parameters
  • URI Parameters

Flow for GET Method

API POST/PUT/PATCH Call –

POST – Create data

PUT/PATCH – Update data

Similar to the GET call, for the POST/PUT/PATCH methods the application sends API parameters based on the API’s requirements. Since the application is creating or updating data through the POST/PUT/PATCH call, it sends this data in the body with the appropriate content type (a plain Java sketch of such a call follows the flow diagram below).

Flow for POST/PUT/PATCH Method
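
Outside of Mule, the same kind of POST call can be sketched in plain Java to show what the http:request operation does with basic authentication and a JSON body; the host, path, credentials, and payload below are placeholders, not real endpoints or secrets.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

// Sketch: POST a JSON body to a secured API with basic authentication,
// mirroring what Mule's http:request does. All values are placeholders.
public class ApiPostExample {
    public static void main(String[] args) throws Exception {
        String credentials = Base64.getEncoder()
                .encodeToString("clientId:clientSecret".getBytes());

        HttpRequest request = HttpRequest.newBuilder(URI.create("https://api.vanrish.com/api/users"))
                .header("Authorization", "Basic " + credentials)
                .header("Content-Type", "application/json")   // body parameter sent with its content type
                .POST(HttpRequest.BodyPublishers.ofString("{\"name\":\"test user\"}"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}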

Here is the flow for the API GET call:

Implemented code

<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:db="http://www.mulesoft.org/schema/mule/db" xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core"
xmlns:http="http://www.mulesoft.org/schema/mule/http"
xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd
http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd">
 <configuration-properties doc:name="Configuration properties" doc:id="4bfab9b8-5f36-4b72-b335-35f6f7c3627e" file="mule-app.properties" />
 
<http:listener-config name="HTTP_Listener_config" doc:name="HTTP Listener config" doc:id="49f690e3-8636-45a2-a3f3-244fa170f2f0" basePath="/media" >
<http:listener-connection host="0.0.0.0" port="8081" />
</http:listener-config>
<http:request-config name="HTTPS_Request_configuration" doc:name="HTTP Request configuration" doc:id="4e8927df-e355-4dcd-9bdc-bf154ab1146a" requestStreamingMode="NEVER">
<http:request-connection host="api.vanrish.com" port="443" protocol="HTTPS" connectionIdleTimeout="50000" streamResponse="true">
<http:authentication >
<http:basic-authentication username="2xxxxxxx-xxx0-xxbx-bxxf-xxxxxxxc5" password="xxxxxxxx3dxxxxxb153dbxxxxx29cxxx"/>
</http:authentication>
</http:request-connection>
</http:request-config>
<flow name="demo-mediaFlow" doc:id="a55b1d37-b63d-4336-8d8e-5b70bc354078">
<scheduler doc:name="Scheduler" doc:id="9dc041e5-3140-47a1-a13c-39ee3ba59389" >
<scheduling-strategy >
<fixed-frequency timeUnit="SECONDS"/>
</scheduling-strategy>
</scheduler>
<logger level="INFO" doc:name="Logger" doc:id="9bc54079-8718-4dc1-a433-91081dfdabff" message="Get flow Entering ..... "/>
<http:request method="GET" doc:name="Request" doc:id="872f3240-e954-469a-b330-83aa7e40c3db" config-ref="HTTPS_Request_configuration" path="/api/users">
         <http:headers ><![CDATA[#[output application/java
---
{
"client_id" : "xxxx2-d960-4xxx-b5df-4d704ce2xxxx",
"client_secret" : "xxxd5e8663xxxf1b153db732529cxxx",
"Range" : "items=0-2000"
}]]]></http:headers>
<http:uri-params ><![CDATA[#[output application/java
---
{
test : "123"
}]]]></http:uri-params>
     
</http:request>
<logger level="INFO" doc:name="Logger" doc:id="9b733315-038b-4f4c-9c2e-15fc026f0524" message="data coming from GET API ...... total payload size --- #[sizeOf(payload)]"/>
<ee:transform doc:name="Transform Message" doc:id="383e857d-efe7-45df-ab33-b75a073080b7" >
<ee:message >
<ee:set-payload ><![CDATA[%dw 2.0
output application/json
---
payload]]></ee:set-payload>
</ee:message>
</ee:transform>
<logger level="INFO" doc:name="Logger" doc:id="769a6306-ee9a-4f51-94ad-562ad1042c5e" message="Getting data from API #[payload]"/>
</flow>
</mule>