Zero Trust API Security Architect

The cybersecurity threat landscape has changed dramatically over the last couple of years. New kinds of threats emerge every day and impact organizations' business. Infosec/security teams are constantly challenged to find the root cause of these new threats and mitigate the resulting risks.

To mitigate and overcome these constant, real-time threats and risks, the security community introduced Zero Trust Architecture (ZTA), also called Zero Trust Strategy (ZTS). ZTA is not a product or application; it is a concept and a set of practices for mitigating risk across your organization.

What is ZTA/ZTS?

Zero Trust is an information security model that denies access to applications and data by default. Threat prevention is achieved by continuously validating security configuration and posture before access to applications and data is granted or retained, across users and their associated devices. All entities are untrusted by default, least-privilege access is enforced, and comprehensive security monitoring is implemented.

Here are the basic properties of ZTA/ZTS:

  • Default deny
  • Access by policy only
  • For data, workloads, users, devices
  • Least privilege access
  • Security monitoring
  • Risk-based verification

How do APIs implement ZTA/ZTS?

API security focuses on strategies and solutions to understand and mitigate the unique vulnerabilities and security risks of Application Programming Interfaces (APIs). In API security we establish rules and processes to mitigate security risks, and these rules and processes are built around Zero Trust architecture or strategy. Here are a few basic strategies in API security for implementing ZTA:

  1. All API communications are secured regardless of network location – Ensure all communication happens over an encrypted channel (TLS) and implement a proper Cross-Origin Resource Sharing (CORS) policy. API endpoints need to be exposed over HTTPS.
  2. All API endpoints are authenticated regardless of their environments (Prod, QA, Dev) — By default, all APIs need to be authenticated and authorized using username/password, JSON Web Token (JWT), OAuth, OpenID Connect, or third-party services.
  3. All API resources are protected and restricted to all users by default — Running multiple versions of an API requires additional management resources from the API provider and expands the attack surface. As per ZTA, make sure all API versions and their resources are restricted when they are not being used. Always validate and properly sanitize data received from integrated APIs before using it.
  4. Access to API resources is determined by dynamic policy, including the client identity, application/service, and the requesting asset – Any API requires resources such as network bandwidth, CPU, memory, and storage. These resources are easy to exploit with simple API calls or multiple concurrent requests. According to Zero Trust Architecture, all APIs must implement policies such as the following (a minimal sketch of these checks appears after this list):
    • Client identity (ClientID/Client-Secret)
    • Execution timeouts (Rate limiting)
    • Maximum allowable memory
    • Maximum number of file descriptors
    • Maximum number of processes
    • Maximum upload file size
  5. Implement or configure an API monitoring posture and API alert system — API monitoring helps identify and resolve performance and security issues before they negatively impact users and degrade the user experience. The alert system notifies the operations team so risks can be mitigated quickly.
  6. Continuous API security risk assessments – Continuous risk assessments help the infosec/security team identify any security risk gaps. By conducting security risk assessments, organizations establish a baseline of cybersecurity measurements; such baselines can be referenced or compared against future results to further improve overall cyber posture and resiliency and to demonstrate progress. A free security assessment tool, VAT, is available to help mitigate security risk for your organization:

https://www.vanrish.com/secassessment/
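To make strategies 2 and 4 concrete, here is a minimal sketch, assuming a Flask application and the PyJWT library, of a deny-by-default API that authenticates every request with a JWT, enforces least-privilege path scopes, and applies a simple per-client rate limit. The endpoint name, claims, and limits are illustrative, not a definitive implementation.

```python
# Minimal Zero Trust style API sketch: deny by default, authenticate every
# request with a JWT, and apply a simple per-client rate limit.
# Assumes Flask and PyJWT are installed; names, secrets, and limits are illustrative.
import time
import jwt  # PyJWT
from flask import Flask, request, jsonify, abort

app = Flask(__name__)
JWT_SECRET = "replace-with-a-managed-secret"   # in practice, pull from a vault
RATE_LIMIT = 100                               # max requests per client per minute
_request_counts = {}                           # {client_id: [timestamps]}

@app.before_request
def enforce_zero_trust():
    # 1. Default deny: every request must present a valid token.
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        abort(401)
    try:
        claims = jwt.decode(auth[len("Bearer "):], JWT_SECRET, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        abort(401)

    # 2. Least privilege: the token must carry an explicit scope for this path.
    if request.path not in claims.get("allowed_paths", []):
        abort(403)

    # 3. Rate limiting keyed on the client identity carried in the token.
    client_id = claims.get("client_id", "unknown")
    now = time.time()
    window = [t for t in _request_counts.get(client_id, []) if now - t < 60]
    if len(window) >= RATE_LIMIT:
        abort(429)
    window.append(now)
    _request_counts[client_id] = window

@app.route("/orders/<order_id>")
def get_order(order_id):
    return jsonify({"order_id": order_id, "status": "shipped"})
```

In production, the same checks would typically be enforced at an API gateway or API management layer rather than in application code, but the logic is the same: nothing is trusted until it is verified.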

Organizations that have adopted the Zero Trust API model see trust as fundamental to creating a positive, low-friction work culture for their clients and to empowering the organization at all levels. Many of the Vanrish Technology clients we have worked with already have technologies in place that can be leveraged toward full adoption of the Zero Trust architecture model.

API Security

Modern APIs are the building blocks of integration and applications for any organization. Every day, organizations use APIs to unlock new features and enable innovation. From banking, retail, and transportation to IoT, autonomous vehicles, and smart cities, APIs are a critical part of modern mobile, SaaS, and web applications and can be found in customer-facing, partner-facing, and internal applications.

Organizations expose sensitive data, such as Personally Identifiable Information (PII), through APIs, and because of this they have increasingly become a target for attackers. As a result, organizations are concerned about their API security and compliance. API security focuses on strategies and solutions to understand and mitigate the unique vulnerabilities and security risks of Application Programming Interfaces (APIs). The Open Web Application Security Project (OWASP) 2023 list groups API threats into 10 categories:

  1. Broken Object Level Authorization (BOLA) – Object-level authorization is an access control mechanism that is usually implemented at the code level to validate that a user can only access the objects that they should have permission to access.
    Comparing the user ID of the current session (e.g. by extracting it from the JWT token) with the vulnerable ID parameter isn’t a sufficient solution to solve Broken Object Level Authorization (BOLA).

    For example, an API that lists school revenue by school name for a county, such as the endpoint /county/{schoolName}/revenues, could be a security threat.
    An attacker can simply manipulate {schoolName} in the endpoint above to pull revenue details for every school.

    To mitigate this risk, use the authorization mechanism to check whether the logged-in user has access to perform the requested action on the record, in every function that uses client-supplied input to access a record in the database (a minimal sketch appears after this list).
  2. Broken Authentication – API authentication is very vulnerable and an easy target for attackers. Attackers can gain complete control of other users’ accounts in the system, read their personal data, and perform sensitive actions on their behalf.

    API authentication flows and processes need to be well protected, and "forgot password / reset password" flows should be treated the same way as authentication mechanisms. Make sure you know all possible authentication flows to the API (mobile, web, deep links) and that each of them is properly protected.
  3. Broken Object Property Level Authorization – When authorizing a user to access an object through an API endpoint, it is very important to validate that the user has permission to access the specific object properties, or all of them.
    An API endpoint is considered vulnerable if:
    • The API endpoint exposes properties of an object that are considered sensitive and should not be read by the user.
    • The API endpoint allows a user to change, add, or delete the value of a sensitive object property that the user should not be able to access.

      When you are exposing any API endpoint, always make sure that the user has access to the object’s properties you expose and avoid using any generic methods like to_json() and to_string().
  4. Unrestricted Resource Consumption – Serving any API request requires resources such as network bandwidth, CPU, memory, and storage. These resources are limited and have costs associated with them.

    It is easy to exploit these resources by simple API calls or multiple concurrent requests. An API is vulnerable if at least one of the following limits is missing or set inappropriately.
    • Execution timeouts
    • Maximum allowable memory
    • Maximum number of file descriptors
    • Maximum number of processes
    • Maximum upload file size
    • Number of operations to perform in a single API client request (e.g. GraphQL batching)
    • Number of records per page to return in a single request-response
    • Third-party service providers’ spending limit
  5. Broken Function Level Authorization – If any administrative API flow, such as delete, update, or create, is exposed to unauthorized users, the endpoint becomes an easy target. The best way to find broken function level authorization issues is to perform a deep analysis of the authorization mechanism while keeping in mind the user hierarchy and the different roles or groups in the application, and asking the following questions:
    • Can a regular user access the administrative endpoint?
    • Can a user perform sensitive actions (e.g. creation, modification, or deletion) that they should not have access to by simply changing the HTTP method (e.g. from GET to DELETE)?
    • Can a user from Group X access a function that should be exposed only to users from Group Y, by simply guessing the endpoint URL and parameters?

      To mitigate this risk, the enforcement mechanism(s) must deny all access by default, requiring explicit grants to specific roles for access to every function.
  6. Unrestricted Access to Sensitive Business Flows — Some API endpoints are more sensitive and critical than others. It is very important to understand which API endpoints and business flows you are exposing to customers. Any restricted business flow exposed to clients can harm your business; in general the technical impact is not severe, but the business impact might hurt your company's credibility.

    For example, if your company offers a 20% discount to one customer and a 30% discount to another through an API, and the first customer discovers this variation, it will damage the company's credibility and cause revenue loss.
    The mitigation planning should be done in two layers:
    • Business – identify the business flows that might harm the business if they are excessively used.
    • Engineering – choose the right protection mechanisms to mitigate the business risk.
  7. Server-Side Request Forgery – A Server-Side Request Forgery (SSRF) vulnerability occurs when you consume remote APIs and resources without validating the remote endpoint or the user-supplied URL. SSRF enables attackers to force the application to send crafted requests to an unexpected destination, even when it is protected by a firewall. Successful exploitation might lead to internal service enumeration (e.g. port scanning), information disclosure, or bypassing of firewalls and other security mechanisms.

    The SSRF risk cannot be eliminated entirely, but you can mitigate it by isolating the resource-fetching mechanism in your network, accepting only the media types needed for a given functionality, disabling HTTP redirections, validating and sanitizing all client-supplied input data, and using a well-tested and maintained URL parser to avoid issues caused by URL parsing inconsistencies.
  8. Security Misconfiguration — A security misconfiguration vulnerability occurs when the latest patches are missing on the server or systems are outdated, Transport Layer Security (TLS) is missing, a Cross-Origin Resource Sharing (CORS) policy is missing, or error messages include stack traces or expose other sensitive information. Attackers often attempt to find unpatched flaws, common endpoints, services running with insecure default configurations, or unprotected files and directories to gain unauthorized access to or knowledge of the system. These security misconfigurations not only expose sensitive user data but also system details that can lead to full server compromise.

    Security misconfiguration risk can be mitigated through a repeatable hardening process that leads to fast and easy deployment, ensuring all communication happens over an encrypted channel (TLS), and implementing a proper Cross-Origin Resource Sharing (CORS) policy.
  9. Improper Inventory Management — It is important for organizations not only to have good understanding and visibility of their own APIs and API endpoints, but also of how those APIs store or share data with external third parties. Multiple versions of APIs need to be properly managed, secured, patched, and well documented. Hackers usually gain unauthorized access through old API versions or endpoints left running unpatched and using weaker security requirements.
    This vulnerability can be mitigated by documenting all hosted APIs across all environments (Prod and non-Prod), generating documentation automatically by adopting open standards, and avoiding the use of production data in non-production API deployments.
  10. Unsafe Consumption of APIs — An unsafe consumption of APIs vulnerability occurs when developers adopt weaker security standards for third-party APIs, for instance around input validation, sanitization, and URL redirections, or do not implement timeouts for interactions with third-party services.
    This vulnerability can be mitigated by implementing proper data validation and schema validation, ensuring all API interactions happen over secured communication channels such as TLS, and maintaining an allowlist of well-known locations that integrated APIs may redirect to rather than blindly following redirects.
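To make the BOLA mitigation in item 1 concrete, here is a minimal sketch of an object-level authorization check for the /county/{schoolName}/revenues example, assuming a Flask-style handler. The helpers `current_user_id()` and `user_can_access_school()` are hypothetical stand-ins for your real identity and access-control lookups.

```python
# Sketch of an object-level authorization check for the BOLA example above.
# The user identity comes from the validated session/JWT, never from the URL,
# and every record access is checked against that identity.
from flask import Flask, jsonify, abort

app = Flask(__name__)

def current_user_id() -> str:
    # In a real service this would be extracted from a validated JWT or session.
    return "user-123"

def user_can_access_school(user_id: str, school_name: str) -> bool:
    # Hypothetical lookup against the access-control data store.
    permitted = {"user-123": {"riverside-high"}}
    return school_name in permitted.get(user_id, set())

@app.route("/county/<school_name>/revenues")
def school_revenues(school_name):
    user_id = current_user_id()
    # Object-level check: deny unless this user is authorized for this school.
    if not user_can_access_school(user_id, school_name):
        abort(403)
    return jsonify({"school": school_name, "revenues": []})
```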

Generative AI for Public Sector: An API Opportunity

The disruptive power of AI extends to every industry, opening up unlimited possibilities for new business opportunities. It turns imagination into reality, insights into action, and possibility into discovery. Generative AI is a type of AI that produces content such as text, audio, code, videos, images, or any other content based on prompts input by the user. Generative AI models use complex computing processes like deep learning to analyze patterns from large sets of historical data to create new business opportunities.

Generative AI is one of the most promising technologies that can help the public sector improve productivity and service quality. However, it is important to ensure that the technology is used responsibly and ethically.

Generative AI can enable the public sector to improve productivity and service quality, and it has a wide range of applications there. It can be used to extract information and automate paper-based processing. It can also be used to automate repetitive and mundane tasks, enabling staff to take on higher-value work, optimize resource allocation, and enhance decision making. It can also summarize large amounts of information from different sources, such as public health data and economic indicators, to identify patterns, trends, and correlations that help governments make decisions in the public interest.

Here are a few examples of tasks that Generative AI can perform in the public sector:

  • Providing support to clients, such as chatting, responding, and delegating tasks to the correct department
  • Writing and editing documents and emails
  • Coding tasks, such as debugging and generating templates and common solutions.
  • Summarizing information.
  • Research, translation, and learning

To ensure the responsible use of GenAI tools and maintain public trust, the public sector should align with the "FASTER" principles:

  • Fair: Content should comply with human rights, accessibility, procedural fairness, and non-bias obligations.
  • Accountable: Content generated by these tools should be factual, legal, ethical, and compliant with the legal terms of use.
  • Secure: In the public sector, security is a paramount goal. Content generated by Generative AI should be appropriate for the security classification of the information, privacy and personal information should be protected, and PII compliance requirements should be met.
  • Transparent: In the government sector, it is very important that all procedures are transparent and that users know they are interacting with an AI tool.
  • Educated: It is very important to document the strengths, limitations, and responsible use of Generative AI tools, including how to create effective prompts and how to identify potential weaknesses in the outputs.
  • Relevant: Generative AI tools should support user and organizational needs, contribute to improved outcomes, and remain relevant to society and business.

While Generative AI has a wide range of benefits in the public sector, there are also some challenges associated with its use.

Here are some of these challenges:

  1. Ethical dilemmas: Generative AI can be used to create deepfakes by manipulating videos and images, which can be used to spread misinformation and create confusion among the public.
  2. Dependency on technology: Generative AI depends on up-to-date technology and underlying systems. Its safety rests on how secure your data technology is and how your data communicates with AI models.
  3. Equity and accessibility issues: Generative AI automates certain tasks, which can lead to some job displacement and, in turn, to accessibility and equity concerns.
  4. Staff resistance to change: If public-sector staff perceive Generative AI as a threat to their jobs, they may resist the shift to Generative AI processes.
  5. Project delays and failures: Generative AI projects are complex and time consuming, which can lead to delays or failures in project implementation.
  6. Regulatory issues: In the public sector, data is fragmented, which raises compliance and regulatory issues and concerns about data privacy, security, and ownership.
  7. Cybersecurity risks: AI in the public sector raises cybersecurity risks, including concerns about hacking, data breaches, and other cyber threats.

APIs help GenAI by bringing in AI models and enabling data for Generative AI. We can mitigate some of these risks by implementing an API-based approach to Generative AI in the public sector.

Here are a few public-sector Generative AI challenges that can be mitigated by an API implementation:

  • Security: According to recent findings, Generative AI makes it easier for hackers to find and exploit vulnerabilities. If your Generative AI models communicate with your organization's data through APIs, that mitigates the vulnerability risk many times over. The government sector can implement strict control of its data in a number of ways, such as MFA or API access permissions.
  • Data control: Through API implementation in Generative AI, the public sector can eliminate data leakage and data abuse. Through API governance it can monitor data usage by Generative AI models. The government sector can also implement API rate limiting or IP restrictions on any API to gain tighter control over sensitive data (a minimal sketch of such controls appears after this list).
  • Fairness and relevancy: The accuracy of a Generative AI model or LLM depends on the independence and relevancy of its data. Generative AI models in the public sector only work when they follow compliance requirements and are relevant to the use case. An API implementation helps ensure data is relevant and independent for the LLM. APIs also restrict unwanted data from reaching AI models and reduce the processing time spent cleansing unwanted data.
  • Data separation: APIs keep data separated from the Generative AI model or LLM (Large Language Model) implementation. This enables the LLM to work on different sets of data at the same time and enables faster innovation within the government sector.
  • Fast delivery: APIs enable faster delivery of Generative AI models. During LLM development you focus only on the models, not on data delivery. This enables two development streams: one team focuses only on data delivery while a second team focuses only on large language model development, which empowers teams to deliver projects faster.
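As a minimal sketch of the rate limiting and data-control ideas above (not a definitive implementation), the snippet below shows a hypothetical API layer that redacts obvious PII and applies a per-client rate limit before any prompt reaches an LLM. `call_llm` is a placeholder for whatever model API the organization actually uses, and the patterns and limits are illustrative.

```python
# Sketch of an API layer between public-sector data and an LLM:
# redact obvious PII and rate-limit clients before any prompt reaches the model.
import re
import time

PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # US SSN-like numbers
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),    # email addresses
]
RATE_LIMIT_PER_MINUTE = 30
_calls = {}  # {client_id: [timestamps]}

def redact_pii(text: str) -> str:
    for pattern in PII_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

def allow_request(client_id: str) -> bool:
    now = time.time()
    recent = [t for t in _calls.get(client_id, []) if now - t < 60]
    if len(recent) >= RATE_LIMIT_PER_MINUTE:
        return False
    recent.append(now)
    _calls[client_id] = recent
    return True

def call_llm(prompt: str) -> str:
    raise NotImplementedError("Placeholder for the organization's model API")

def governed_generate(client_id: str, prompt: str) -> str:
    # The model only ever sees redacted, rate-limited traffic.
    if not allow_request(client_id):
        raise PermissionError("Rate limit exceeded for client " + client_id)
    return call_llm(redact_pii(prompt))
```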

Public sector adoption of Generative AI is still in the early stages, but it needs to accelerate. This will enable faster delivery of public projects and AI bot assistance.

Generative AI: How APIs Make Powerful Customer Experiences

Generative AI is a bit like a child: you tell the child not to bounce a basketball inside the house, but the child goes and bounces a soccer ball inside the house instead. That was not your expectation, so the action falls outside of it. When you add more parameters to your instruction, the child is more likely to do what you actually want.

Generative AI is the same: the more context and parameters we can give it, the better the service replies, emails, and product recommendations we get from our Generative AI models.

We’re all seeing some amazing demos of generative AI these days. Models trained on the whole internet are able to hold a conversation, explain their reasoning, and perform well at a broad variety of tasks.

You've probably started to play with ChatGPT, Google Bard, or Microsoft Bing. In your company, folks are already experimenting with different ways to use data with these tools in their work.

These chat interfaces, as an initial proof of concept, are truly amazing. But it's already becoming clear that the ability to create significant business value will depend on your ability to INTEGRATE and MANAGE these systems and data.

But there are multiple barriers standing in the way of our ability to implement AI.

  • Fragmented data is hard to ingest into AI models.
  • Missing context leads to poor recommendations.
  • Lack of trust in how the LLMs will use your data.
  • Difficulty in acting on the recommendations because AI is completely detached from business processes.
  • And of course, overall security risks of accessing data across various systems.

Technology is moving fast, and the recent introduction of AI innovation is exciting, especially with the promise of increased productivity. If you look at a public source like Hugging Face, there are over 250k AI models compared to only 32 significant industry-produced machine learning models in 2022. If you pair these figures with the fact that the average enterprise has over 1000 applications, suddenly you have a lot of API integrations to account for.

Without addressing your system integration challenges, you risk deploying AI that results in generic data in, and generic insights out.

Generative AI and API ecosystem

Let's look at how APIs fit into the large language model (LLM) and generative AI space.

You can start with an LLM of your choice, such as Salesforce CodeGen or an OpenAI GPT model.

A large language model (LLM) is a deep learning algorithm that can perform a variety of natural language processing (NLP) tasks.

As you know, big models incur big costs, and LLMs are expensive.

So large language models are exposed as APIs to reduce cost. APIs are the easiest way to get data in and out of these LLMs, and they make the models available for anyone to use. These APIs also pull data from your existing systems as well as legacy systems. You then enable the APIs required for your business process and add the data context that makes sense for the business use case.

Next, you can establish control over the APIs for your LLM by applying governance and security policies using Universal API Management. In this way you can ensure that your organization is leveraging AI while remaining secure and compliant. Once your APIs are secured, you can add automation and integration flows with the APIs that communicate with your internal systems. By enabling AI data through APIs, you can push and pull data from a variety of data sources, including third-party applications, to ensure that you are using the latest data with the latest technology and building a complete 360-degree view of your customer.

APIs safely unlock generative AI capabilities through a layer of trust. Use Universal API Management to provide security and governance for AI-driven systems. Integration and automation tools also ensure the customer 360 view is kept up to date with the latest data, making powerful customer experiences possible. A minimal sketch of this context-enrichment pattern follows.
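One possible shape of this pattern is sketched below, assuming the `requests` library and hypothetical CRM and LLM endpoints: customer-360 data is pulled through a governed API and added as context to the prompt before the model is called.

```python
# Sketch of enriching an LLM prompt with customer-360 data fetched via governed APIs.
# The CRM and LLM endpoints are hypothetical placeholders.
import requests

CRM_API = "https://api.example.com/crm/customers/{customer_id}"   # hypothetical
LLM_API = "https://api.example.com/llm/generate"                  # hypothetical

def fetch_customer_360(customer_id: str, token: str) -> dict:
    # Pull the latest customer record through the managed API layer.
    resp = requests.get(
        CRM_API.format(customer_id=customer_id),
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def personalized_reply(customer_id: str, question: str, token: str) -> str:
    customer = fetch_customer_360(customer_id, token)
    # The model sees current, governed data instead of only what it was trained on.
    prompt = (
        f"Customer profile: {customer}\n"
        f"Customer question: {question}\n"
        "Write a helpful, personalized reply."
    )
    resp = requests.post(
        LLM_API,
        json={"prompt": prompt},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["text"]
```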

Why Airlines Need Digital Transformation

When the Wright brothers flew their first plane, they never imagined that after 100 years this industry would be one of the biggest and most complex. Initially, airlines were a form of luxury transportation, but nowadays flying is a necessity for most. Every day, thousands of airplanes fly and millions of passengers reach their destinations.

To streamline the whole process, all airline departments should work synchronously and efficiently. Many variables are involved in executing a single task efficiently, with very little margin for error. The airline industry is the best example of machines and humans working together in harmony, which allows tasks to be completed quickly and accurately. An airline's biggest challenge is finding ways to reduce costs while still providing quality service.

Airlines need to put the best processes in place in order to remain competitive and profitable. All processes, including operational structure, route network, fleet size, and pricing strategy, need to be digitized and transparent to compete with rivals and remain cost-effective.

Successful adoption of digital transformation is the key to success for the airline business.

Digital transformation allows airlines to enable data efficiently and securely. It also helps reduce the cost of operations and increases efficiency.

Here are a few areas where digital transformation is helping airlines work efficiently and cost-effectively.

Market & Partner Data – The COVID pandemic was a big disruption for the airline market. Tracking and monitoring real-time COVID data after the pandemic is very important for managing operations efficiently. Airlines work with their partners to get market data, events, weather, traveler influx, reviews, and external traveler information for their operations. Integration with partner data is very important to get contextual insights for airlines.

Travel Data – To run an airline efficiently, its passenger systems of record need to integrate efficiently. These systems include, but are not limited to, the Passenger Service System (PSS), Computer Reservation System (CRS), Global Distribution System (GDS), Enterprise Resource Planning (ERP), traveler profile, fare, schedule, availability, preferences, assets, and distribution with Offers & Orders (NDC).

Intent & Sentiment Data – Social media platforms are invaluable tools for airlines to stay ahead of their competitors. By leveraging sentiment and intent behavior analysis on social media platforms, airlines can better understand passenger preferences and tailor services accordingly. This helps airlines build customer loyalty and increase profits over time.

Customer & Services Data – Managing historical customer and service data helps airlines understand customer sentiment and preferences. This data includes demographic and identity data, profiles, cases, contact center history, and service interaction data. It helps airlines understand passenger preferences and provide better service.

Marketing & Loyalty – Digital transformation combines predictive analytics and human-centric design to create a more personalized experience that drives growth in loyalty, satisfaction, and incremental revenue. It also helps marketing track campaign metrics, digital footprint, experiential targeting, audience segmentation, and the digital marketplace.

Devices & Location Data – Airline operations depend on IoT sensor data, telemetry, mobile, voice, geolocation, and location-based data. This intelligence-based data is revolutionizing procurement through real-time decision making, allowing the operational team to know the exact location of goods at any given time.

These are the big impacts of digital transformation in the airline industry:

  • Quicker time to market
  • Smooth Transitioning
  • Enhanced Business Agility
  • Reduce cost
  • Innovate and drive operational excellence   

Security for Critical Data

When an organization is going through digital transformation, data security is a big concern. Digital transformation impacts every aspect of business operation and execution. The volume of data that any organization creates, manipulates, and stores digitally is growing, and that drives a greater need for data governance. Securing large volumes of data across the entire data lifecycle is the biggest challenge for any organization.

Data security is the process of protecting sensitive data from unauthorized access, corruption, or theft during the entire data lifecycle.

Here are a few steps to mitigate data risk and implement data security.

  • Event Monitoring
  • Data Detection
  • Data Encryption
  • Data Audit Trail

Event Monitoring – This activity includes preventing, mitigating, and monitoring threats to sensitive data.

  • Monitor user activity – Know who is accessing data, and from where, with real-time event streaming and a minimum of 3-6 months of event history.
  • Prevent and mitigate threats – Define and build transaction security policies using declarative conditions or code to prevent and mitigate threats.
  • Drive adoption and performance – Analyze user behavior to enable security training for the organization and find security bottlenecks to improve the user experience.
  • Event log files – Create event log files for rich visibility into your organization for security, adoption, and performance purposes (a minimal sketch of structured event logging appears below).
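As a minimal illustration of the event-monitoring idea (field names and the logging sink are assumptions, not any specific product's API), the sketch below emits a structured event for every data access so it can be streamed to a monitoring or SIEM system.

```python
# Sketch of structured event logging for data-access monitoring.
# Events would normally be streamed to a SIEM or event bus; here they go to a logger.
import json
import logging
from datetime import datetime, timezone

logger = logging.getLogger("data_access_events")
logging.basicConfig(level=logging.INFO)

def log_data_access(user_id: str, record_type: str, record_id: str, source_ip: str) -> None:
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "record_type": record_type,
        "record_id": record_id,
        "source_ip": source_ip,
    }
    logger.info(json.dumps(event))

# Example: record who read a customer profile, and from where.
log_data_access("user-123", "customer_profile", "C-98765", "203.0.113.10")
```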

Data Detection – Find and classify sensitive data quickly and mitigate data risk.

  • Monitor data classification coverage – Determine which data in your organization has been categorized versus uncategorized. Highly sensitive data needs to be secured properly. Label data appropriately to manage data security.

Data Encryption – Encrypt sensitive data at rest while preserving business functionality.

  • Encrypt data and maintain functionality – Protect data and attachments during search, lookups, transport, and storage.
  • Key management – Data encryption key management is very important for securing organizational data. It includes control and authorization of data encryption keys.
  • Policy management – Data policy management defines and manages the rules and procedures for accessing data. It requires individuals to follow specific processes to access data in storage or in transit (a minimal encryption sketch appears below).
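Here is a minimal sketch of field-level encryption at rest using the Python `cryptography` package's Fernet recipe; in practice the key would be issued and rotated by a key management service rather than generated inline.

```python
# Sketch of encrypting a sensitive field at rest with a symmetric key.
# In production the key would come from a KMS, not be generated in code.
from cryptography.fernet import Fernet

def encrypt_field(plaintext: str, key: bytes) -> bytes:
    return Fernet(key).encrypt(plaintext.encode("utf-8"))

def decrypt_field(ciphertext: bytes, key: bytes) -> str:
    return Fernet(key).decrypt(ciphertext).decode("utf-8")

key = Fernet.generate_key()                    # stand-in for a KMS-managed key
token = encrypt_field("4111-1111-1111-1111", key)
print(decrypt_field(token, key))               # recoverable only with the key
```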

Data Audit Trail – An audit trail strengthens data integrity over an extended period. This process enables compliance and provides insights.

  • Data history – Store data as long as you can use it for the audit trail, or delete it if you no longer need it.
  • Data retention policy – A data retention policy defines which data needs to be stored for audit and for how long. Based on the sensitivity of the data, you can archive it for 3-6 months or more.
  • Data insights – Create insights and dashboards for data audit transparency. This allows any organization to track compliance or data security issues.

Air India Deal: Catalyst for the Aviation Industry

The COVID-19 pandemic caused total disruption in the airline industry. The aviation sector struggled to survive, with 80 percent of flights canceled during the pandemic. The whole travel and hospitality industry struggled with the pandemic slowdown. Before the sector could recover, the Russia-Ukraine war affected the aviation industry: oil and maintenance prices increased, which reduced profit margins, and many routes were canceled or restructured. On top of that, airlines canceled thousands of flights as a massive winter storm and bitter cold swept the U.S., which directly impacted airline revenue. Remote work is also affecting the aviation industry; company travel has been reduced tremendously, and that directly impacts airlines.

This disruption directly impacted aircraft manufacturers and supporting industries. Aviation modernization was affected by these disruptions, and budget cuts are also slowing down the digital transformation of the airline industry. Aviation industry leaders expected the recovery to be very long and the industry to change forever.

But amid the economic slowdown and a pandemic-hit aviation sector, Air India's deal with Boeing and Airbus brought new hope for the aviation industry. This deal will help re-energize and rejuvenate the whole aviation industry. It will impact at least three continents and generate millions of job opportunities, helping to stop the economic slowdown and restart the economic engine. Air India is going to buy 460 aircraft from the two main aircraft manufacturers. This is the second-largest aircraft order in the history of global aviation and has been called the mother of all deals in the aircraft industry. The total list price for these deals is approximately 80 billion dollars, split between the two manufacturers: Boeing is providing 220 aircraft and Airbus will deliver 240. This mega deal is so huge that it is elevating India's image and positioning India as a bright spot in the world economy.

Now let's see how this deal is going to affect the aircraft manufacturer, airline, and airport sectors.

Aircraft Manufacturer
  1. Parts traceability for Airbus and Boeing – Boeing's aircraft parts manufacturers are spread across approximately 3 North American countries and 44 US states. Similarly, Airbus's parts manufacturers are spread across approximately 10 European countries. This will generate millions of jobs in these places, but the big challenge is managing the traceability and assembly of these parts. The materials management team picks and packs parts into kits to be delivered to the parts assembly working area in the aircraft factory. A robust system is needed for real-time visibility of these parts to enable quick collaboration and decisions.
  2. Certification compliance traceability – Each small aircraft part is a critical item for aircraft assembly. All these parts need to be certified with the Federal Aviation Administration (FAA) for the US and the European Union Aviation Safety Agency (EASA) for European countries. Tracing and managing these certifications is one of the most important activities for the aircraft industry. This system should be highly visible and audited for aircraft safety.
  3. Cross-team visibility and collaboration – A lot of people, both external and internal, work on a single aircraft, and a number of systems and processes are involved in delivering it. People, systems, and processes need to come together to deliver and maintain aircraft, and this whole collaboration needs to be highly transparent and visible to deliver aircraft efficiently.
  4. Delivery and service – Safety and documentation are very important for any aircraft delivery. All steps and processes need to be properly documented and executed by the respective teams. Any missing step or process leads to delayed aircraft delivery.
Airlines

This mega deal will generate all kinds of job opportunities within India and across the world, wherever Air India flies. The deal will also impact Air India's systems and processes.

  1. Airline route management – More aircraft means the airline needs a more robust and transparent route management system; any delay in these systems delays airline operations. Air India also needs to collaborate with partner airlines' routes for higher profit margins. These systems and processes will need more people across the world, wherever Air India flies. Digital transformation will also help make the whole system transparent and increase operational efficiency.
  2. Aircraft maintenance and parts management – Aircraft maintenance and parts management is a big challenge for airlines. As Air India buys more aircraft, it will need more transparent maintenance systems and tracking, more people, and more airline hubs. Airlines also need large hangars to maintain their aircraft.
  3. Employee management and experience – More aircraft and more routes mean more employees to manage the whole airline system. This includes recruitment, onboarding, and retention of more corporate employees, in-flight crews, and ground maintenance associates.
Airport 

An airport is a massive business with many verticals of its own. An airport is not just the spaces you see, such as the departure and arrivals halls, duty free, and security; it is a complex organization with many parties coming and going: the airport operator, retailers, airside service providers, the airlines, cargo and warehousing spaces, aircraft MRO, fueling, and authorities such as air traffic control, customs for people and cargo, security, and fire and emergency services.

Since Air India is increasing its fleet, the deal will also impact airport operations and processes and bring more opportunities to airports.

  1. Efficient passenger processing – Airport customer service, passport control, boarding/arrival, and airport gate management fall into this category. They will be heavily impacted by more flights from Air India.
  2. Airport safety, security, and health management – With this deal, more travelers will arrive at and depart from airports. To stay safe and secure, airports need more resources and more transparent systems. They also need to enable touchless solutions to build confidence in safety around COVID-19.
  3. Baggage operations – Baggage operations and management are also very important processes for airports, and they too will be heavily impacted by this new aircraft purchase.

Retail 2023: The new Trend

Over the last few years, the COVID pandemic has changed the whole retail business spectrum in ways we could never have imagined before. Exploring new and accelerated trends gives us an indication of how this evolution will continue into the new normal. The pandemic also led to the closure of countless stores and to bankruptcies. After surviving the pandemic, retail businesses are now being hit hard by inflation, and supply chains are also affected by the Russia-Ukraine war. Experts now say that the greatest risk facing global supply chains has shifted from the pandemic to the Russia-Ukraine military conflict and the geopolitical and economic uncertainties.

Amid all this news for the retail industry, customer expectations and habits have shifted. Customers expect engagement on values to go beyond the point of purchase and create moments of engagement across the full journey. Retailers have been compelled to find new ways to connect with consumers in a personalized and tailored way, in-store as well as online, to make the experience more intuitive. Retailers are taking an increasingly digital approach to connecting with customers.

Here is how retailers are moving forward to reach a wider customer base and promote their products.

  1. E-commerce technologies – During the pandemic, if your business did not have an online presence, you were quickly out of business. Retailers have therefore increased investment in e-commerce technologies and raised their budgets for digital transformation. To get ahead of the competition, they are offering a mix of digital and physical experiences ahead of their rivals. Retailers are also focusing on customer service and providing a seamless service experience across messaging, web, and mobile channels, creating a cohesive and connected customer shopping journey with e-commerce and unified data across systems.
  2. Infrastructure – Retailers are upgrading their in-store as well as online infrastructure. They are replacing traditional store signs with digital signs and screens to display ads and videos, and adding kiosks and self-checkouts within the store. This makes the shopping experience more convenient and personalized: shoppers are in and out without having to make small talk or wait in queues. Deployment of in-store technologies has doubled within a year.
  3. API-first and cloud – Retailers are focused on composable architecture. Composable architectures are key to implementing successful digital transformations and the most engaging digital experiences. 2023 will be a year in which retailers focus on entirely removing their legacy monolithic architectures. API-first and cloud-based solutions help retailers switch to new functionality without the need for significant investment and resources, reducing time and total cost of ownership to a fraction of that of legacy technologies. API-first connectivity helps customers shop anytime, anywhere, and anyhow.
  4. Customer experience – Customer experience is one of the main focuses for retailers this year, both online and in-store. Retailers are providing customers with enhanced assisted-selling experiences, and they are also targeting online customers through a distributed OMS (Order Management System), omni-channel, and remote selling. Retailers are preparing the next level of customer experience through loyalty (long-term customer relationships), native apps, and AI-based digital fitting rooms.
  5. Merchandising & supply chain – Retailers are providing real-time tracking and inventory information to their customers. They are also offering purchase incentives to loyal customers to keep them engaged with their products. Retailers are also focusing on upgrading warehouse management systems (WMS) to fulfill in-store as well as online orders.

ChatGPT: An Intro & Company Use-Case

The internet is full of buzz about the new AI-based chatbot, ChatGPT. ChatGPT reminds me of the early days of Google, and how Google changed internet search forever. We were using the Lycos search engine, but Google gave a new definition to the search engine. Similarly, I see ChatGPT trying to redefine search based on AI and AI models. It is arriving as a new disruptive technology, and suddenly Google is looking old school.

Generative Pre-trained Transformer 3 (GPT-3) from OpenAI is the main component of Jasper.ai and other cloud-based content writing, chatbot, and machine learning applications. GPT-3 was first publicly released by OpenAI on June 11, 2020. GPT-3 is built around natural language processing (NLP) tasks and "generative pretraining," which involves predicting the next token in a context of up to 2,048 tokens.

GPT-3 is a large language model (LLM). Large language models are AI tools that can read, summarize, and translate text. They can predict words and craft sentences that reflect how humans write and speak. Three popular and powerful large language models are Microsoft's Turing NLG, DeepMind's Gopher, and OpenAI's GPT-3.

ChatGPT was first publicly released by OpenAI on November 30, 2022. Initially developed as part of the GPT-3 research program, it was built on top of the powerful GPT-3.5 language model specifically to address natural language processing tasks involving customer service chat interactions.

OpenAI's ChatGPT has demonstrated the capability to perform professional tasks such as writing software code and preparing legal documents. It has also shown a remarkable ability to automate some of the skills of highly compensated knowledge workers in general. ChatGPT has immense potential for e-commerce customer experience automation, giving customers personalized shopping and fully automated 24x7 customer service on demand.

In spite of the ChatGPT buzz and its ability to write content and handle customer service on demand, I am a little cautious about using this technology for my business. I tested a few use cases in ChatGPT. It works fine for some simple use cases and problem solving, but as soon as I added a few more variables to my problem, the ChatGPT response was not correct.

Here is a screenshot of my problem and ChatGPT's solution:

For the problem shown above, ChatGPT calculated directly from the equation and responded with 5 minutes.

ChatGPT's response does not calculate a person's waiting time in the queue.

So for the question above, the right answer would be:

Average Waiting Time = Average Processing Time x Utilization / (1-Utilization).

Average Waiting Time = 5 x (5/6) / (1 – 5/6) = 25 minutes

So, the correct answer is 25 minutes waiting in line. If we add the 5 minutes at the kiosk, we obtain a total of 30 minutes.
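The same calculation, using the values from the example above, can be checked in a few lines:

```python
# Waiting-time calculation from the example above:
# waiting time = processing time * utilization / (1 - utilization)
processing_time = 5          # minutes at the kiosk
utilization = 5 / 6          # fraction of time the kiosk is busy

waiting_time = processing_time * utilization / (1 - utilization)
total_time = waiting_time + processing_time

print(waiting_time)   # 25.0 minutes waiting in line
print(total_time)     # 30.0 minutes total, including the kiosk
```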

Based on the issue above, I would like to highlight a few points to consider if your company is trying to implement a ChatGPT solution.

  1. Is the ChatGPT AI model configured for your company's use case?
  2. Do you have enough historical data to run and test AI-based ChatGPT LLM models?
  3. ChatGPT runs on a big model (an LLM). Big models incur big costs, and LLMs are expensive.
  4. Since ChatGPT runs on a big model (LLM), it needs to overcome performance constraints.

Keep an eye out for GPT-4, which may be released as early as the first half of 2023. This next generation of GPT may produce better and more realistic results.

RPA, BOTS, AI and API

In today's competitive markets, industries face many challenges in order to remain successful. These include staying ahead of the competition, understanding customers' needs and preferences, and providing a high level of service that keeps customers happy.

Here are a few challenges for today's industries:

  1. Resolve customer issues ASAP
  2. Collect and qualify customer information
  3. Easily connect to business processes
  4. Enable new business features quickly

In current business environments, 90% of organizations see increased demand for automation from business teams, and as a result 95% of IT leaders are prioritizing automation.

Automation is a critical component of digital transformation and business success. Robotic Process Automation (RPA) bots are at the forefront of this revolution, providing businesses with an automated solution to optimize their processes while improving customer experiences. RPA bots can be used in many areas such as data entry, document processing and workflow management; they automate repetitive tasks that would otherwise take up valuable resources from human employees. This automation not only increases efficiency but also reduces costs associated with manual labor, allowing companies to focus on more pressing issues like innovation or collaboration between departments. By utilizing intelligent bots powered by artificial intelligence (AI), companies can further streamline operations and provide customers with immediate feedback on requests or inquiries in real time without manual intervention from employees. Additionally, natural language processing (NLP) capabilities allow chatbots used in websites or apps to respond quickly and accurately when communicating with customers.

Using NLP, bots can decipher the specific sentences or words customers type and associate them with an intent. NLP provides insights by analyzing past chat transcripts to identify common customer utterances or phrases (such as order status, account information, password reset, logging an issue, etc.) that the bot can use to take action. The predictive model that lets bots understand intent and take action is called an intent model; it is made up of intents and utterances.

APIs, NLP, and AI are the essential components of bots. Once the NLP intent model identifies the action, the bot calls APIs, which execute the task against the backend system. If a user asks a bot for their order status and the APIs do not respond in time, the whole purpose of the bot fails. So APIs are one of the key components of bots; a minimal sketch of this flow follows.
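Here is a minimal sketch of that flow, with a toy keyword matcher standing in for a real NLP intent model and a hypothetical order-status API; the endpoint and helper names are assumptions for illustration.

```python
# Sketch of the bot flow described above: match an utterance to an intent,
# then call the backend API that fulfils that intent.
from typing import Optional

import requests

INTENT_KEYWORDS = {
    "order_status": ["order status", "where is my order", "track my order"],
    "password_reset": ["reset password", "forgot password"],
}

def detect_intent(utterance: str) -> Optional[str]:
    # Toy keyword matcher standing in for a trained NLP intent model.
    text = utterance.lower()
    for intent, phrases in INTENT_KEYWORDS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None

def handle_utterance(utterance: str, order_id: str = "") -> str:
    intent = detect_intent(utterance)
    if intent == "order_status":
        # The bot is only as useful as the API behind it: a slow or failing call
        # breaks the conversation, so a timeout is set explicitly.
        resp = requests.get(
            f"https://api.example.com/orders/{order_id}",  # hypothetical endpoint
            timeout=5,
        )
        resp.raise_for_status()
        return f"Your order is {resp.json()['status']}."
    if intent == "password_reset":
        return "I have sent a password reset link to your registered email."
    return "Sorry, I did not understand that. Could you rephrase?"
```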

APIs streamline bot tasks and automate processes for any team. Bots and APIs empower business and IT teams to collaborate with ease and break silos at every step of their automation journey, enabling end-to-end automation at scale and allowing RPA to be reused and composed securely.